Cupertino (CA) - A patent filed by Apple suggests that the company is developing its own "smart" touchscreen technology for a whole range of devices, including a tablet computer. The company apparently plans a "multipoint"-capable screen that may enable the company to "digitize" the iPod and create a "human" interface for widely used software - such as Google Earth.
Touchscreens have been around for quite some time, but have never quite appealed to the mainstream computing market. The technology is limited mainly to PDAs and smartphones as well as the niche tablet PC segment; its functionality today covers the recognition of single taps on the screen with a finger or stylus and, as far as dynamics go, the recognition of handwriting and scrolling.
More than a decade after making its first touchscreen steps with the "Newton" PDA, Apple has given the technology much more thought and apparently believes that it can "facilitate human interaction with the computing system" and can be applied to simplify the use of virtually any consumer application.
Apple’s "Gestures for touch sensitive input devices" patent claim describes a touchscreen for "desktops, laptops, tablets [...] a handheld computer [and] may also correspond to a computing device, such as a cell phone, PDA, dedicated media player and consumer electronic device." The key feature of Apple’s idea is a "multipoint"-capable touchscreen: Instead of entering data sequentially, Apple envisions simultaneous input of data: "The method also includes identifying at least one multipoint gesture based on the data from the touch sensitive device," the abstract reads. Multipoint data entry can mean "multiple fingers, fingers and palms, a finger and a stylus, multiple styli and/or any combination thereof."
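To make the idea concrete, the first step of identifying a multipoint gesture might simply be routing the raw touch data by the number of simultaneous contacts. This is a minimal sketch under our own assumptions, not the method described in the filing:

```python
def classify_contacts(points):
    """Hypothetical first stage of multipoint gesture identification:
    route raw touch data by the number of simultaneous contacts."""
    if not points:
        return "none"
    if len(points) == 1:
        return "tap-or-drag"      # classic single-point input
    if len(points) == 2:
        return "pinch-or-rotate"  # two-finger gesture candidates
    return "multi-finger"         # palms, many fingers, combinations
```

A real recognizer would of course go further and track how the contacts move over time, which is where the gestures described below come in.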
So, why would we use more than one finger at the same time to input data? According to Apple, using more than one and apparently as many as four input points simultaneously allows us to control certain aspects of software more effectively. For example, "zooming may be associated with spreading a pair of fingers and rotating may be associated with rotating the pair of fingers. [...] Furthermore during zooming and rotation, the user can stop spreading their fingers so that only rotation occurs. In other words, the gesture inputs can be continuously input, either simultaneously or consecutively."
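The geometry behind a two-finger zoom-and-rotate gesture is straightforward: the zoom factor is the ratio of the distances between the fingers, and the rotation is the change in the angle of the line joining them. A minimal sketch, with hypothetical names and no claim to match Apple's actual implementation:

```python
import math

def pinch_gesture(p1_old, p2_old, p1_new, p2_new):
    """Derive zoom and rotation from two touch points sampled at
    two successive moments (a hypothetical two-finger tracker)."""
    # Zoom: ratio of the distances between the two fingers.
    d_old = math.hypot(p2_old[0] - p1_old[0], p2_old[1] - p1_old[1])
    d_new = math.hypot(p2_new[0] - p1_new[0], p2_new[1] - p1_new[1])
    scale = d_new / d_old

    # Rotation: change in the angle of the line joining the fingers.
    a_old = math.atan2(p2_old[1] - p1_old[1], p2_old[0] - p1_old[0])
    a_new = math.atan2(p2_new[1] - p1_new[1], p2_new[0] - p1_new[0])
    rotation = math.degrees(a_new - a_old)
    return scale, rotation
```

Spreading the fingers from 100 to 200 pixels apart without turning them yields a scale of 2.0 and a rotation of 0 degrees; holding the distance steady while turning yields a scale of 1.0 with a nonzero rotation, which matches the patent's point that the user can stop spreading so that only rotation occurs.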
Apple’s patent filing spends quite some time on zooming and panning when using maps, which - let us speculate once more - could indicate that Apple is working on a GPS-enabled device and most likely will enable users to have more fun with Google Earth. Moving multiple fingers around the screen while searching for a particular area on a map may in fact be easier for many users than using keyboard shortcuts or a mouse.
Not surprisingly, Apple’s touchscreen also focuses on "virtual control interfaces" such as volume knobs and switches. For example, the abstract describes a tablet PC with a "virtual scroll wheel" displayed on the screen, on which the direction and "amount of rotation of the fingers" determine whether the volume increases or decreases.
During this rotation, the touchscreen can provide not just audible, but also tactile feedback in the form of vibration. "A haptics unit of the tablet PC may provide a certain amount of vibration or other tactile feedback for each click thereby simulating an actual knob," the filing says.
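The scroll-wheel idea can be sketched as a small state machine: the finger's angle around the wheel's center is tracked, and every fixed slice of rotation becomes one "click" that steps the volume (and could trigger the haptics unit). The class name and the 15-degrees-per-click step are our own assumptions, not figures from the filing:

```python
import math

class VirtualScrollWheel:
    """Minimal sketch of a virtual scroll wheel: finger rotation
    around the wheel's center is turned into discrete clicks that
    raise or lower the volume."""

    DEGREES_PER_CLICK = 15  # assumed step size, one click per 15 degrees

    def __init__(self, center, volume=50):
        self.center = center
        self.volume = volume
        self._last_angle = None

    def _angle(self, point):
        return math.degrees(math.atan2(point[1] - self.center[1],
                                       point[0] - self.center[0]))

    def touch(self, point):
        """Feed one finger position; return the resulting volume."""
        angle = self._angle(point)
        if self._last_angle is None:
            self._last_angle = angle
        else:
            delta = angle - self._last_angle
            # Wrap the difference into (-180, 180] so the gesture
            # survives the jump at the +/-180-degree boundary.
            delta = (delta + 180) % 360 - 180
            clicks = int(delta / self.DEGREES_PER_CLICK)
            if clicks:
                self.volume = max(0, min(100, self.volume + clicks))
                self._last_angle += clicks * self.DEGREES_PER_CLICK
        return self.volume
```

Each nonzero `clicks` value is the natural place to fire the vibration pulse the filing describes, so every simulated detent is both heard and felt.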
Apple filed for the patent on 31 December 2005, which indicates that the company has been experimenting with touchscreens for quite a while. But if we keep Apple’s timing of past patent filings in mind, we would expect Apple’s smart touchscreen device not to be available before 2007.