Mobile, like much of display and input technology across the consumer electronics spectrum, has gotten rather “touchy”. Every modern smartphone requires touch interaction; even the display on my printer at home is touch-oriented. We accept it because it’s natural to manipulate an object with our hands. And in this idea lies the genius of the technology.
It wasn’t always this way… a keyboard was once the only way to access words (filenames) in a list (directory) to reach the information we needed, and the only way to physically touch an object in a system was to insert a floppy disk into the drive. We were distant and detached from our systems. But in the interest of progress, developers needed to draw us in, to engage our senses.
Touch is a visceral confluence of motive and dexterity. Some say that sixty percent of the nerves dedicated to the sense of touch end in the fingertips. We eat with our hands, express artistic concepts through handheld tools, and demonstrate love (and, in contrast, violence) with our hands. So it makes sense that an “item” on a display should respond, even behave, according to the way we touch it.
Now, by the standards of object-oriented programming and cutting-edge development environments, we are compelled to use gestures and combinations of touches to offer a complete experience of a system. Mobile development hinges on the idea of visceral presence in the apps we use to function daily.
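To make the idea concrete, here is a minimal sketch of how an app might interpret one of those touch combinations: classifying a single-finger touch as a tap or a directional swipe from its start and end coordinates. The function name and the movement threshold are illustrative assumptions, not any platform’s actual API.

```typescript
// Illustrative sketch: classify a one-finger touch as a tap or a
// directional swipe from its start/end coordinates. The 30px
// threshold is an arbitrary assumption, not a platform constant.
type Gesture = "tap" | "swipe-left" | "swipe-right" | "swipe-up" | "swipe-down";

function classifyTouch(
  startX: number, startY: number,
  endX: number, endY: number,
  movementThreshold = 30 // pixels of travel before we call it a swipe
): Gesture {
  const dx = endX - startX;
  const dy = endY - startY;
  // Little movement on both axes reads as a tap.
  if (Math.abs(dx) < movementThreshold && Math.abs(dy) < movementThreshold) {
    return "tap";
  }
  // The dominant axis decides the swipe direction.
  if (Math.abs(dx) > Math.abs(dy)) {
    return dx > 0 ? "swipe-right" : "swipe-left";
  }
  return dy > 0 ? "swipe-down" : "swipe-up";
}

console.log(classifyTouch(100, 100, 105, 102)); // small movement → "tap"
console.log(classifyTouch(100, 100, 200, 110)); // mostly horizontal → "swipe-right"
```

In a real app these coordinates would come from the platform’s touch events (e.g. `touchstart`/`touchend` on the web), and production gesture recognizers also weigh timing and velocity, but the core is this: the system reads our hands’ motion and makes the object on screen respond in kind.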