You're in fast moving traffic on a busy motorway approaching a complicated junction with just seconds to get into the right lane. Your phone, sensing that now is not the moment to disturb you, diverts an incoming call straight to voicemail. Later, when you are in a more relaxed state, it plays the message back and offers to ring the caller back.
Even if you are packing an iPhone 5 or the latest Samsung, it is fair to say that your phone is still a long way from doing this. Despite the impressive array of features offered by today's handsets – including voice commands – most people still interact with their phones by pressing buttons, prodding a screen, or with the occasional swipe or pinch.
It is a similar story with computers. Take Microsoft’s new Windows 8 operating system, due to be launched later this week. Its colourful, tile-laden start screen may look startlingly different to older versions of Windows, but beneath the eye candy it's still heavily reliant on the keyboard and mouse.
In fact, with one or two notable exceptions, it is striking just how little the way we interact with computers has changed in the last few decades.
"The keyboard and mouse are certainly a hard act to follow," says George Fitzmaurice, head of user interface research for US software maker Autodesk. But, despite the apparent lack of novelty in the interfaces of most of today's mass-market devices, there are plenty of ideas in the pipeline.
Take, for instance, technology that can monitor your stress levels. One technique being developed is functional near-infrared spectroscopy (fNIRS), which monitors oxygen levels in the blood, and therefore activity, in the front of the brain. "It measures short term memory workload, which is a rough estimate of how 'busy' you are," says Robert Jacob, professor of computer science at Tufts University, near Boston, Massachusetts.
The technology is currently used in medical settings, but could one day be used to help filter phone calls, for example. Today fNIRS works via a sensor placed on your forehead, but it could also be built into baseball caps or headbands, allowing the wearer to accept only important calls. "You could tell your phone only to accept calls from your wife if you get busy beyond a certain gradation of brain activity," says Jacob. Perhaps more immediately, the technology could also help organisations assign workloads efficiently. "If a machine can tell how busy you are it can tell if you can take on an additional workload, or it could take away some of your work and assign it to someone else if you are over-stretched."
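The kind of call filtering Jacob describes can be sketched in a few lines. Everything here is hypothetical: the threshold, the priority list, and the idea of a single normalised "workload" score between 0 and 1 are illustrative stand-ins, since a real system would need a calibrated per-user estimate from the fNIRS sensor.

```python
# Hypothetical sketch of workload-based call filtering.
# Assumes an fNIRS-derived "workload" score normalised to the range 0-1.

BUSY_THRESHOLD = 0.7          # assumed cut-off; a real system would calibrate per user
PRIORITY_CALLERS = {"wife"}   # callers allowed through even when busy

def route_call(caller, workload):
    """Return 'ring' or 'voicemail' based on estimated mental workload."""
    if workload < BUSY_THRESHOLD or caller in PRIORITY_CALLERS:
        return "ring"
    return "voicemail"

print(route_call("colleague", 0.9))  # voicemail
print(route_call("wife", 0.9))      # ring
```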
Other forms of "brain-computer interface" are already being used and developed for a growing number of applications. Electroencephalography (EEG) picks up electrical signals generated by interactions between brain cells. It has long been used to diagnose comas, epilepsy and brain death in hospitals and in neuroscience research. Variations in the frequency of these signals can be used to infer different emotions and other brain states. Recent years have seen the launch of simplified EEG headsets that sell for as little as $100.
For example, a British company called Myndplay makes interactive short films, games and sports training software with which users interact via these brainwave-measuring headsets. Those who can successfully focus their minds or mentally relax sufficiently when required can influence film plots and progress to higher levels in games.
Similar technologies are increasingly being used to help the disabled. Two years ago an Austrian company called Guger Technologies released a system designed to help paralysed patients type by highlighting letters on a grid one by one until the associated EEG signal is detected and the desired letter selected. Spanish researchers have developed EEG-controlled wheelchairs and are working on using the same method to control prosthetic arms.
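The selection loop behind such a speller can be sketched simply. This is a hypothetical stand-in, not Guger Technologies' implementation: the `detector` below fakes the role of a trained EEG classifier, which in a real system would recognise the evoked response when the highlighted letter matches the one the user is concentrating on.

```python
# Minimal sketch of a highlight-until-detected letter speller.
# All names are hypothetical; a real system classifies evoked EEG
# responses rather than using a stand-in detector function.

GRID = "ABCDEFGHIJKLMNOPQRSTUVWXYZ_"

def spell_letter(detector):
    """Highlight each letter in turn until the detector reports a response."""
    while True:
        for letter in GRID:
            # In a real system, highlighting this letter on screen evokes
            # a measurable EEG response if it is the intended one.
            if detector(letter):
                return letter

def fake_detector_for(target):
    """Stand-in for an EEG classifier: 'responds' only to the target letter."""
    return lambda letter: letter == target

print(spell_letter(fake_detector_for("H")))  # H
```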