From Wired News: “Brain Scanners, Fingercams Take Computer Interfaces Beyond Multitouch,” by Priya Ganapati.
The first step toward a revolution in how humans interact with computers has begun with multitouch displays, screens that respond to the touch of more than one finger at a time. The future of human-computer interfaces may go further, measuring neurotransmitters to help translate thoughts into computing actions and drawing on face detection, eye tracking, speech recognition, and haptics technology that uses the sense of touch to communicate with the user. “Computing of today is primarily designed for seated individuals doing office work in the developed world,” says Scott Klemmer, a co-director of Stanford University’s Human-Computer Interaction Group. “If you flip any one of those bits–look at mobile users, or users outside of the developed world, or social computing instead of individual computing–then the future is wide open.”
At Drexel University’s RePlay Lab, researchers are trying to measure the level of neurotransmitters in a subject’s brain to create games in which thought alone controls gameplay. The lab created a 3-D game called Lazybrains that connects a neuro-monitoring device to a gaming engine. The system uses a functional near-infrared imaging device, which shines infrared light into a user’s forehead and records the amount of light transmitted back, revealing changes in the amount of oxygen in the user’s blood. Concentrating sends more oxygen to the frontal lobe, so a gamer’s level of concentration can be used to manipulate the height of platforms in the game.
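To make that mapping concrete, here is a minimal sketch of how a concentration-driven control loop along these lines might look, assuming the fNIR sensor exposes periodic oxygenation readings. The function names (read_oxygenation, set_platform_height) and the oxygenation range are illustrative assumptions, not the actual Lazybrains code.

```python
# Hypothetical sketch of a Lazybrains-style control loop: map a stream of
# blood-oxygenation readings from an fNIR sensor onto the height of a game
# platform. The sensor call (read_oxygenation) and the game-engine hook
# (set_platform_height) are stand-ins, not the real Drexel APIs.

import random
import time

MIN_HEIGHT, MAX_HEIGHT = 0.0, 10.0   # platform height range in game units (assumed)
BASELINE, PEAK = 0.60, 0.80          # assumed oxygenation range: relaxed vs. concentrating


def read_oxygenation() -> float:
    """Stand-in for the fNIR device driver: returns a relative oxygenation reading."""
    return random.uniform(BASELINE, PEAK)


def set_platform_height(height: float) -> None:
    """Stand-in for the game-engine call that moves the platform."""
    print(f"platform height -> {height:.2f}")


def oxygenation_to_height(oxy: float) -> float:
    """Linearly map oxygenation (a proxy for concentration) onto platform height."""
    fraction = (oxy - BASELINE) / (PEAK - BASELINE)
    fraction = max(0.0, min(1.0, fraction))   # clamp to the valid range
    return MIN_HEIGHT + fraction * (MAX_HEIGHT - MIN_HEIGHT)


if __name__ == "__main__":
    # Poll the sensor a few times per second and push the result into the game.
    for _ in range(5):
        set_platform_height(oxygenation_to_height(read_oxygenation()))
        time.sleep(0.2)
```

In a real system the raw signal would also be smoothed and calibrated per user, but the core idea is the same: harder concentration raises blood oxygenation in the frontal lobe, which the game reads as a higher platform.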
Advancements in human-computer interaction will also come from users looking to improve their personal experience by hacking, mashing up, and modifying devices, Klemmer says.
The keyboard and the mouse aren’t going to disappear completely. For word processing, the keyboard remains the most efficient method of input, say researchers.