Anyone who knows the history of computing knows that right now we have computers that are preposterously powerful compared to what was on offer twenty, ten or even five years ago – the technical specs of today’s machines are between a thousand and ten million times better.
Someone interested in the history of the user interface (UI) will also know that the story on that front has not been quite so dramatic. From the first popular consumer mouse in 1984 to the present day, very little has changed about how we use computers. Mouse and keyboard, windows and icons – up until about 2007, when people started to get excited about the Natural User Interface (NUI).
What NUI is depends on who you ask. People like Microsoft’s Chief Research and Strategy Officer think that NUI means doing away with the mouse and keyboard entirely; instead, we should be doing all our computing with a tap of the finger or a gesture of the hand. Then there’s Steve Jobs’ claim that if you have to include a pen in an interface, you’ve screwed up.
While this enthusiasm for futuristic interfaces and simplicity of design is commendable, it ignores the fact that the mouse and keyboard are themselves pretty cool. The speed of input on a keyboard leaves handwriting, speech recognition and neat alternatives like Dasher in the dust. The mouse is an incredibly accurate pointing device – one that may be wasted on the majority of computing tasks, but indispensable for CAD and other realms where exactness is more important than fluidity.
The problem with this over-enthusiasm to ditch the old and embrace the new and super-simplified is that it rings hollow to anyone who knows where the old fits best. If you tell a typist that speech recognition is the way of the future, they’ll know you’re wrong on that point – and will consequently be suspicious of claims in realms where they have less expertise. Promising that NUI will overturn everything in computing, rather than just the areas where our current solutions need work, threatens to turn people off the whole thing.
That’s why voices like Bill Buxton’s are so important. In a video interview this year, he said he doesn’t even like the term NUI, let alone the suggestion that it will replace everything. Take the time to watch the video; it grounds the conversation about what comes next in something a little more useful, while still being optimistic about what can happen. I like his term “Appropriate User Interface”, because first, it’s less dramatic, and second, it’s a reminder that the application – as in where an interface is applied – is more important than the amazingness of the new- or recently-fangled technology we have available.
Aside from the hilarity at 10:50, when he freaks the presenter out by standing way too close, I was really intrigued by the idea of using distance-aware displays to dynamically scale things up or down depending on how far away you are. I knocked together a Flash-based experiment to that effect. Have a look at it below!
the closer your monitor is to your webcam the better, but it’s an interesting experience having the size of an image remain the same relative to your distance from the screen! It give the impression, like Johnny Chung Lee’s experimnts, of turning the screen into a window through which you just see another world, rather than just a picture. I’m very interested in how this will feel built ino a more robust framework.
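For the curious, the core math behind the experiment is tiny. This is a rough Python sketch (my actual version was in Flash); the function name `scale_for_distance` and the reference face width of 120 pixels are my own illustrative choices, not anything standard. The only assumption is that a face detected in the webcam frame looks roughly half as wide when you stand twice as far away, so its apparent width stands in for distance.

```python
def scale_for_distance(face_width_px: float,
                       ref_face_width_px: float = 120.0,
                       ref_scale: float = 1.0) -> float:
    """Return a display scale factor that keeps an image at a roughly
    constant visual angle for the viewer.

    Apparent face width in the webcam frame is roughly inversely
    proportional to viewer distance, so scaling the image by
    ref_face_width_px / face_width_px makes it grow as you step back
    and shrink as you lean in.
    """
    if face_width_px <= 0:
        raise ValueError("face width must be positive")
    return ref_scale * ref_face_width_px / face_width_px


# At the calibration distance the face spans ~120 px: no scaling.
print(scale_for_distance(120.0))  # 1.0
# Step back so the face shrinks to 60 px: draw the image twice as big.
print(scale_for_distance(60.0))   # 2.0
```

In practice you would feed this a fresh face-width measurement every frame from whatever detector your platform offers, and smooth the output over a few frames so the image doesn’t jitter as the detection wobbles.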