Remember the scene in Star Trek IV where Scotty tries to operate a Mac workstation?
I may be mis-remembering the dialogue, but it goes something like this:
McCoy: “You have to use the mouse.”
Scotty (picks up mouse and speaks into it): “Hello, computer.”
McCoy: “Just use the keyboard.”
Three things: one, I apologize for the Star Trek spring-board. Two, evidently the Mac is not as intuitive as we all thought, even to a starship engineer from the twenty-third century…
Heralded as a great leap forward, the Graphical User Interface (GUI) that came out of Xerox PARC in Palo Alto in the seventies – yes, almost forty years ago – pioneered the use of WIMPs: Windows, Icons, Mice, Pull-down Menus. Although strictly that should be Mouses, as Mice is the plural for small rodents, not computer pointing devices.
Yes, the GUI was a huge improvement on what came before. In the seventies, the Data Processing Department relied on punch-cards and paper tape. In the eighties, we got as far as green-screen terminals and mind-numbing keying of program code. Try type-setting a magazine on one. No, don’t. I did. I believe that is why I am now gray-haired. The advent of the cheap(-ish) PC brought computing to the (relatively well-off) masses. The GUI was not the catalyst, but it accelerated the take-up once the machines became powerful enough to run a GUI with some application programs atop it.
That’s the third thing. Xerox may have invented the GUI as a corporate tool, but it was the Macintosh and the PC that “liberated” it from the then-walled garden of the Unix X-windowing system. Here at last were machines for the rest of us, machines we could all use.
I may be showing both my age and my ignorance of factual computing history, but that has never stopped me in the past. I can say with some certainty that there is nothing intuitive about the current or previous generations of GUIs. None of them. We all have to be shown how to use them. The operation of mouses and menus is not innate behavior for the human primate, any more than language is – another subject I argue about with my friends in teaching. A certain well-known US TV presenter recently learned how to Tweet. First he had to be taught how to operate the Twitter website with a mouse, which he initially tried to touch to the screen. It’s true. Bright people don’t necessarily get it. I believe the number of mouse buttons affects the learning curve in inverse measure. I buried a former friend under my patio for having owned a Logitech super-mouse with seven buttons and a scroll-wheel. Not really, but I thought of it on every visit to his office.
The problem worsens over time, when one expects it to get better. The sheer multiplicity of devices, copyrighted, patented, and trademarked, each with its attendant software, also copyrighted, patented, and trademarked, is making it more difficult to be productive, not easier. Consistency would be a boon, only commerce won’t allow it. Yes, we have standards beyond individual platforms such as iOS, Windows and, thank you Hewlett-Packard, webOS. Linux has freedesktop.org. Not that you would know it between Gnome, KDE, Xfce, LXDE, Sugar, Linpus, Chrome and many other re-badged deviations. Android is splintered whilst MeeGo falters and Symbian… does whatever Symbian does in the market these days. Double-tap, pinch-to-zoom, and tap-to-focus all work in slightly different ways, and the menus of any two smart-phones are seldom the same.
Icons. These are religious works of art. The universal language of icons in computing is not universal. Nor is it a language. There are no standards and little permanence since icons are either about creative artistic interpretation or marketing and branding. Here endeth the lesson.
Yes, I do mean Tom Cruise with a data glove, waving his arms about in front of a holographic projection of data. Anyone who’s tried a data-glove and VR headset knows just what a fool they look like (hey, I got the t-shirt from VR-expo London in 1995). I’ll take it if it means no more mouses. Or labyrinths of menus. In recent months we have seen the first signs with the Kinect and the Wii motion controllers. Yes, you still look like a fool.
A term so vague it never lost common currency, because it never held common currency in the first place. I refer back to my previous point about icons. ‘User friendly’ is what we all want, despite the fact that we can’t define it and it, too, changes over time. I submit that the next generation of GUI has to be:
- flexible, accommodating all tastes, abilities, handicaps and cognitive dissonances (bingo players may take a drink now)
- consistent in learning and in operation
- layered in its inevitable complexity: simple in the common, day-to-day operations, and as complex as it needs to be to achieve the more advanced ones
- task-oriented; really, don’t just say it, do it.
What does it look like? I have no idea. It’s not OS-X Ice-Yeti or KDE 4.7. It goes beyond iOS, Android and Windows Phone 7. Just because our kids are really fast with them doesn’t mean they’re good. These have all evolved from the past, and it has to be a break with the past. It has to be designed around real people, not around the obstacles the software engineers find with the available hardware. Just imagine. AJS