The problem with multi-touch
Multi-touch controls are the new ‘in thing.’ Soon we’ll be interacting with our tools by touching screens in multiple places at the same time using our fingers, which means operating systems and applications will be able to respond to gestural interfaces. On the iPhone, moving two fingers together in a pinching motion makes the picture or map on screen smaller; the opposite movement makes it larger. On some computer-based multi-touch systems, the positions of your fingers at the start and finish of a gesture let you rotate as you scale up.
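A minimal sketch of how that kind of gesture reduces to numbers (the coordinates here are made up for illustration): the zoom factor is just the ratio of the finger separation at the end of the pinch to the separation at the start, and the rotation is how far the line between the fingers has turned.

```python
import math

def pinch_scale(start_a, start_b, end_a, end_b):
    """Zoom factor from a two-finger pinch: the ratio of the finger
    separation at the end of the gesture to the separation at the start.
    Less than 1.0 shrinks the image; greater than 1.0 enlarges it."""
    return math.dist(end_a, end_b) / math.dist(start_a, start_b)

def pinch_rotation(start_a, start_b, end_a, end_b):
    """Rotation (in degrees) from the same two touches: how far the line
    between the fingers has turned between start and end positions."""
    start_angle = math.atan2(start_b[1] - start_a[1], start_b[0] - start_a[0])
    end_angle = math.atan2(end_b[1] - end_a[1], end_b[0] - end_a[0])
    return math.degrees(end_angle - start_angle)

# Fingers move from 200 px apart to 100 px apart along the same line:
# the image shrinks to half size, with no rotation.
print(pinch_scale((100, 300), (300, 300), (150, 300), (250, 300)))     # 0.5
print(pinch_rotation((100, 300), (300, 300), (150, 300), (250, 300)))  # 0.0
```

Tracking both values from the same pair of touches is what lets those computer-based systems rotate and scale in a single gesture.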
Here’s a demo from January 2006 showing what a multitouch gestural interface looks like.
How does this affect the next user interface for editing? If it’s going to be based on multi-touch controls, what will that be like? Will we suffer from new forms of RSI? Will we take our hands off the keyboard to directly manipulate our pictures and sound?
The advantage of mice and graphics pen tablets is that we don’t need input surfaces as large as our screens to manipulate pointers. With a mouse, we can lift it off the desk, move it back through the air in the opposite direction, and bring it down again to keep the pointer moving. However many large screens you have, you never run out of space with a mouse.
With pen tablets, we give up this advantage in return for control of pen pressure. We need more precise hand control because the effective resolution of a small pen movement shrinks when an A5 tablet has to represent two or three thousand horizontal pixel positions across a pair of monitors. Wacom tablets can detect pen movements down to an accuracy of 2,000 dpi, but how many people have that kind of muscle control?
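The rough arithmetic behind that claim, using assumed figures (an A5 tablet with an active width of about 8.3 inches, and a pair of monitors totalling 3840 horizontal pixels):

```python
# Assumed figures: A5 paper is 210 mm wide, roughly 8.3 inches,
# and two 1920-pixel-wide monitors give 3840 horizontal pixels.
tablet_width_in = 8.3
screen_width_px = 3840

# Pen travel needed to move the pointer by one screen pixel, in inches.
inches_per_pixel = tablet_width_in / screen_width_px
print(f"{inches_per_pixel:.5f} in per pixel")  # about 0.00216 in, ~0.055 mm

# A tablet that resolves 2000 positions per inch offers this many
# distinct horizontal positions across its full width.
tablet_positions = 2000 * tablet_width_in
print(f"{tablet_positions:.0f} positions for {screen_width_px} pixels")
```

So a single screen pixel corresponds to a pen movement of roughly a twentieth of a millimetre, which is exactly the kind of muscle control most of us don’t have.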
Imagine having a pair of displays that add up to 3840 pixels wide by 1200 pixels high: a common editor’s setup. Imagine if these screens were touch screens that could detect every touch your fingers made. What would working with a system like that be like?
Unless we change the way we work with software, our arms are going to be very tired by the end of the day…
One good way to innovate is to jump to the next stage in technology and come up with new ideas there. I would say that Avid, Apple and Adobe’s current interfaces may be tapped out.
I’ve been coming up with some possible gestures and interface tools for editors. Is anyone interested? It’s worth thinking about. We might as well help out Avid or Apple or whoever’s going to come up with the interface that might beat them both…
Keyboards: 1880-1984
Mouse: 1984-2009
Multi-touch: 2009-?
I hear similar arguments all the time: that multi-touch on the desktop is just going to hurt your arms. And I will admit it would. However, I think the future of multi-touch means reworking just how we interact with computers.
Multi-touch makes complete sense immediately on mobile devices like phones and tablet PCs. As demonstrated by the many video demos floating around, it also works exceedingly well on very large wall-type screens, which are used by those with deep pockets and specific needs. However, kiosk-type applications, such as the MS Surface platform, are where I think most people are going to get a chance to use multi-touch. Even PCs like the HP TouchSmart, or those digital picture frames that are all the rage, would be ideal candidates for multi-touch.