Integrating touchscreen technology into the OS sounds revolutionary -- until you try to use it
Touch-based interfaces have captured everyone's imagination, thanks in large part to the iPhone. With Windows 7, Microsoft joins Apple in bringing touch to the desktop, baking touch capabilities into the OS itself. Whereas Apple quietly added touch to Mac OS X Leopard a couple of years back, Microsoft has hyped its Microsoft Surface technology for more than a year. Beneath this hype has been the suggestion that, with Windows 7, a touch revolution is brewing.
Or maybe not.
Two years of avid iPod Touch use has gotten me excited about the idea of touch UIs, so I was eager to try out the vaunted touch technology in Windows 7. My MacBook Pro has touch capabilities in its trackpad, but I usually run the laptop closed when working at my desk, so its touch capabilities haven't been regularly accessible. The new breed of all-in-one PCs with touch-sensitive screens from Dell and Hewlett-Packard promised to change the equation and make touch on the PC as cool and functional as touch on an iPhone.
Well, that was the theory. The truth has been a bitter disappointment. In both Windows 7 and Mac OS X Snow Leopard, the touch experience has been underwhelming.
Limited deployment is partly to blame, as -- despite marketing hype -- neither Apple nor Microsoft is making a serious effort to touchify their OSes. For Microsoft, touch seems to be a technology crush it won't admit it's fallen out of love with; for Apple, touch seems to be a key part of its non-PC strategy. (Neither Apple nor Microsoft would talk to InfoWorld about touch technology.)
Of course, Microsoft and Apple may have reason for not getting serious about touch. After all, outside of the obvious use in self-contained kiosk environments, does touch really make sense on a PC?
My early experience suggests it does not.
Here are the key concerns that make PC touch useless for most people -- and that will continue to plague any notion of a "touch revolution" on the desktop PC for years to come.
Issue 1: Touch is not omnipresent
What makes the touch interface so compelling on the iPhone and on quality copycats such as the Palm Pre is that touch gestures are a fundamental part of the operating system and the applications. Just as using a mouse is fundamental and universal in Windows and Mac OS X, touch gestures are universal on the iPhone, the Palm Pre, and so on. This means the user interfaces are designed with touch at the core, and typically work intuitively as you put your finger to the screen.
I expected the same level of universality in the new Windows 7, given Microsoft's trumpeting of its Surface research for years now, but it simply does not exist. And Mac OS X does no better despite Apple's pioneering use of touch in the mobile context.
What you get instead is the mapping of mouse functions to the touchscreen or trackpad, so that, in essence, your finger becomes a mouse. (Tapping your finger acts like clicking a mouse button.) There's nothing wrong with that approach, but you already have a mouse, so why switch to your finger? That's where the real issue of touch on today's desktops comes up: There's just not that much you can do with touch.
First off, the finger rarely makes a better mouse than a mouse -- it's harder to be precise with a finger on the touchscreens used in the Dell and HP touchscreen systems. (The Mac trackpad is a bit more precise, but still no mouse.)
Second, the gestures that Windows 7 and Mac OS X Snow Leopard support beyond basic mouse actions are few: rotate, scroll, zoom in, and zoom out. What's more, these gestures are rarely available. You can tap (click) and scroll using your fingers universally in the OS and in apps. But beyond that, touch support gets very dicey very fast.
Windows 7 does allow you to zoom in and out of folder views and assign touch shortcuts (called "flicks") for common actions such as copy, paste, and undo; these shortcuts work across the OS and applications. (Mac OS X doesn't do either.)
Of course, most of the touch capabilities are not new to Windows 7; they're simply the pen-input capabilities Windows has long had, now working via a touchscreen. But Windows 7 does add a few touch-specific gestures: It copies the iPhone's pinch and expand gestures for zooming, as well as the iPhone's rotation gesture. And it adds a unique two-finger gesture for opening a contextual menu (hold one finger on the object and tap a second finger near it).
But for touch gestures to work in applications, the software developer usually has to explicitly add touch support. And very few developers have, despite the fact that Microsoft made the touch SDK available to all developers a year ago.
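To give a sense of what that explicit support involves, here is a minimal sketch of a Win32 window procedure that opts in to Windows 7's WM_GESTURE message. Only the GetGestureInfo/CloseGestureInfoHandle plumbing comes from the Windows Touch API; the responses to each gesture are left as placeholder comments and are assumptions about how a particular app might react, not a prescribed implementation.

    #define WINVER 0x0601              // pull in the Windows 7 gesture definitions
    #include <windows.h>

    // Minimal sketch: a window procedure that handles touch gestures directly.
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        if (msg == WM_GESTURE)
        {
            GESTUREINFO gi = {0};
            gi.cbSize = sizeof(gi);

            if (GetGestureInfo((HGESTUREINFO)lParam, &gi))
            {
                BOOL handled = TRUE;
                switch (gi.dwID)
                {
                case GID_ZOOM:        /* pinch/expand: scale the view using gi.ullArguments */ break;
                case GID_ROTATE:      /* two-finger rotation: rotate the selected object */    break;
                case GID_PRESSANDTAP: /* hold one finger, tap a second: show a context menu */ break;
                default:              handled = FALSE;                                         break;
                }
                if (handled)
                {
                    CloseGestureInfoHandle((HGESTUREINFO)lParam);
                    return 0;          // tell Windows the gesture was consumed
                }
            }
        }
        // Unhandled gestures fall through to the default mouse-style behavior.
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }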
Several Adobe Creative Suite 4 apps, such as Illustrator and InDesign, support some Mac OS X gestures such as rotate -- but the Windows versions do not. In Windows, Internet Explorer 8 supports zooming via gestures; on the Mac, Safari 4 does, too. But Firefox 3.5 doesn't support gesture-based zoom on either OS. And even within IE8 and Safari, the touch gesture support doesn't extend to Flash, PDF, and other objects that may be embedded in a Web page.
In Windows 7, one touch gesture that developers can get "for free" is zooming, since Windows 7 maps the pinch and zoom touch gestures to scrollwheel zooming; that means an app that has been enabled for zooming via a mouse's scrollwheel (such as IE8) is also enabled for touch zoom. Likewise, if an app has been enabled for horizontal scrolling, it's automatically enabled to support Windows 7's new two-finger swipe gesture for horizontal scrolling.
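A rough picture of that "free" path, under the assumption (consistent with Windows 7's legacy gesture support as I understand it) that an unhandled pinch is translated into Ctrl-plus-mouse-wheel input: an app whose wheel handler already zooms picks up touch zoom with no gesture code at all. SetZoomLevel and the zoom-percent variable below are hypothetical application pieces, not part of the Windows API.

    #include <windows.h>

    // Hypothetical application zoom state and helper -- not part of the Windows API.
    static int g_zoomPercent = 100;
    static void SetZoomLevel(int percent) { g_zoomPercent = percent; /* redraw the view here */ }

    // An app that already zooms on Ctrl + mouse wheel (as IE8 does) ends up
    // handling touch zoom too, because the unhandled pinch gesture arrives
    // here as ordinary wheel input with the Ctrl key state set.
    LRESULT OnMouseWheel(HWND hwnd, WPARAM wParam, LPARAM lParam)
    {
        if (GET_KEYSTATE_WPARAM(wParam) & MK_CONTROL)
        {
            int delta = GET_WHEEL_DELTA_WPARAM(wParam);           // +120 per notch or pinch-out step
            SetZoomLevel(g_zoomPercent + (delta > 0 ? 10 : -10)); // zoom in or out by 10 percent
            return 0;                                             // handled
        }
        return DefWindowProc(hwnd, WM_MOUSEWHEEL, wParam, lParam);
    }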
But until all apps are designed to support touch gestures, and the OS makes more use of them (as the iPhone OS does), it's simply easier to stick with the mouse because you know it works everywhere.
Issue 2: PC UIs aren't finger-friendly
In using a Dell Studio One desktop and an HP TouchSmart desktop -- whose touchscreens, based on NextWindow's technology, are quite responsive -- I found another limitation to the adoption of touch technology in its current guise: The Windows UI really isn't touch-friendly. A finger is a lot bigger than a mouse pointer or a pen tip, so it's not as adept at making fine movements.
Also, on a touchscreen, your hand and arm obscure your view of where your fingertip actually is, making it hard to actually touch the intended radio button, close box, slider, or what-have-you. It doesn't help that these elements are often small. And there's no tactile feel to substitute for the lost visual feedback.
But the issues of using touch gestures go beyond the visibility and size of UI controls. The way the controls work is often not finger-friendly. Take as an example Windows 7's wireless LAN setup. It has some big buttons to select a desired network, so it's natural to just press the desired one. And sometimes that works, but often these visual buttons are really the equivalent of radio buttons -- item selectors -- and you then have to tap the Next button. That's not the kind of direct manipulation that touch assumes. When you work with something with your hands, the manipulation is direct. But most apps are designed for interaction with keyboards and mice, and aren't so direct (to prevent accidental selections and the like, since it's really easy to move a mouse unintentionally).
The result is that using touch is often an awkward process. Unlike an iPhone's apps, Windows or Mac OS X apps weren't designed for touch, and neither the OSes nor the apps are intended to adjust themselves for this input method.
Issue 3: Gesture-based computing needs a better surface
I was surprised to discover a third issue: the touch surface itself. I love using the touchscreen on my iPod Touch, but I usually did not like using the touchscreens on the Dell or HP.
The issue wasn't the screen per se, but its location. A monitor is in front of you, a good foot or two away. That means holding your hand and arm out, raised and extended. That's not comfortable for long durations. Try this: Move your mouse under your monitor, then see how long you can stand it. It also means a lot of ungainly arm movement to get to the keyboard, which few apps let you ignore. (Windows 7 does have a handwriting app that is OK to write with, but impossible to edit with. And writing more than a dozen words at a time is likely to make your fingers and arm hurt.)
There's also the issue of the parallax effect: The layer of glass above the LCD's crystals creates a slight gap between what you touch and what you see. At the edges of a screen, the distance is enough to throw off your hand-eye coordination -- a reason that so many iPhone users have trouble typing on the virtual keyboard's side keys such as Q, A, P, and L. Over time, your brain adjusts, of course.
The Mac OS's reliance on a trackpad for touch input lessens these issues. Your hand rests in a more natural location -- on a trackpad rather than the screen -- and you follow the mouse pointer to see where you are, just as you would with a mouse. And you can easily switch to the keyboard and even a second input device such as a pen or mouse. (Adesso does make a touchpad for PCs, but it doesn't use the Windows 7 gestures, relying instead on its own. I could not find any external touchpads for Macs.)
But Apple's use of a trackpad raises issues of its own. One is the difficulty of moving a relatively fat finger in a confined space, which is why Apple keeps increasing the size of its trackpads. The other is that you need to use a laptop and keep it open so that the trackpad is accessible. That's great on the road, but not at a desk. A laptop's screen is too low for most people to maintain good posture, and if you raise the laptop to raise its screen, the keyboard and trackpad placement are off. So chances are that your MacBook is closed and its trackpad inaccessible; you're using an external monitor, keyboard, and mouse instead. Sure, you can open the MacBook and use its LCD as a mirrored or extended monitor, but it's likely your desk isn't big enough to position the open MacBook, your external mouse, and your external keyboard all comfortably.
I can easily imagine a day when you'll have a mousepad that doubles as a trackpad, so you have the room to maneuver your fingers and won't have the "outstretched arm" issue. Logitech's recent creation of mice that can work on glass surfaces makes such a dual-purpose mouse/touchpad more likely.
Why touch remains a tantalizing prospect
Despite all these issues, the promise of touch remains tantalizing, as I'm reminded every day when I use my iPod Touch.
Although the utility of touch for working in a spreadsheet or word processor is questionable, there are some apps where touch makes sense, such as Google Maps or Microsoft Bing Maps. On IE8 or Safari, manipulating them by touch with zoom and scroll gestures simply feels better than using a mouse. It's natural to touch the map, as you would a physical map or globe. And if you've ever used Google Earth on an iPhone, you know how the desktop version feels like a pale imitation simply because you can't manipulate it as directly.
But there's a chicken-and-egg issue to resolve. Few apps cry out for a touch UI, so Microsoft and Apple can continue to get away with merely dabbling with touch as an occasional mouse-based substitute. It would take one or both of these OS makers to truly touchify their platforms, using common components to pull touch into a great number of apps automatically. Without a clear demand, their incentive to do so doesn't exist.
I'm hoping that Apple's long-rumored tablet based on the iPod Touch might create the demand, by bringing a touch interface to a device that is more computer-like and thus might stimulate touch UI development that would more easily translate to the desktop/laptop experience. That's a big "if," since no one outside of Apple really knows what such a device would do or, indeed, if it even exists.
But if Apple changes the game in the touch-based tablet world, that could give both the OS makers and app developers the incentive to make touch more than a skin-deep graft onto Windows or Mac OS X. My money is on Apple, not Microsoft, to be the one that gets serious about touch, if anyone does.
By Galen Gruman/InfoWorld