MIT Media Lab researchers call it the AlterEgo headset.
As bizarre as it sounds, you can now type on a computer without moving your fingers at all, simply by thinking about what you want to type.
MIT Media Lab researchers have invented a ground-breaking computer interface that can translate your internal speech onto the screen with 90 percent accuracy. The device has been named AlterEgo, says MIT researcher Arnav Kapur, who adds that using it feels like having superpowers.
Kapur led the project, assisted by Pattie Maes and Shreyas Kapur, an undergraduate electrical engineering student.
“The motivation for this was to build an IA device — an intelligence-augmentation device. Our idea was: Could we have a computing platform that’s more internal, that melds human and machine in some ways and that feels like an internal extension of our own cognition?” explained Kapur.
Maes, Kapur’s thesis advisor and a professor of media arts and sciences, notes that in this age it is almost impossible to get through the day without our cellphones, yet using such devices is quite disruptive.
“If I want to look something up that’s relevant to a conversation I’m having, I have to find my phone and type in the passcode and open an app and type in some search keyword. So, my students and I have for a very long time been experimenting with new form factors and new types of experience that enable people to still benefit from all the wonderful knowledge and services that these devices give us, but do it in a way that lets them remain in the present,” asserts Maes.
AlterEgo transcribes words that the user verbalizes internally, without speaking them aloud. It is a complete, silent computing system that lets the user pose questions and receive answers without being detected. Such a system could, for instance, be used to silently report an opponent’s moves in a chess game and discreetly receive computer-suggested responses.
This seemingly impossible idea was turned into reality by combining a wearable device with a computing system. Electrodes on the wearable rest against the user’s jaw and face to pick up neuromuscular signals. These signals are triggered by the process of internal verbalization: as soon as the user deliberately says words in their head, subtle muscle activations begin.
These movements are invisible to the human eye, but a machine-learning system can detect the signals and correlate them with specific words. The device’s sensors pick up signals from seven key areas along the jawbone, chin, and cheek. After processing them, the system can recognize words and even talk back.
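The pipeline the article describes, signals from seven sensor sites mapped to specific words by a machine-learning model, can be illustrated with a toy sketch. Everything below is a hypothetical stand-in: the word list, the synthetic "signatures," and the nearest-centroid classifier are illustrative assumptions, not the researchers' actual model or data.

```python
# Toy illustration of mapping multi-channel neuromuscular signals to words.
# This is NOT AlterEgo's real model; it only shows the general idea of
# correlating a 7-channel signal with a known vocabulary.
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS = 7                      # one feature per electrode site (jaw, chin, cheek)
WORDS = ["yes", "no", "up", "down"] # hypothetical mini-vocabulary

# Pretend each word produces a characteristic 7-channel signature.
signatures = {w: rng.normal(size=N_CHANNELS) for w in WORDS}

def make_sample(word, noise=0.1):
    """Simulate one noisy 7-channel measurement of an internally spoken word."""
    return signatures[word] + rng.normal(scale=noise, size=N_CHANNELS)

# "Training": average many noisy samples per word into a centroid.
centroids = {w: np.mean([make_sample(w) for _ in range(50)], axis=0)
             for w in WORDS}

def classify(sample):
    """Map a 7-channel measurement to the nearest known word."""
    return min(centroids, key=lambda w: np.linalg.norm(sample - centroids[w]))

print(classify(make_sample("yes")))  # recognizes the simulated word "yes"
```

A real system would replace the synthetic signatures with recorded electrode data and the nearest-centroid rule with a trained neural classifier, but the shape of the problem, many channels in, one vocabulary word out, is the same.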
The device also includes a pair of bone-conduction headphones, which transmit sound as vibrations through the facial bones to the inner ear. Because the headphones don’t block the ear canal, the system can relay information to the user without interrupting the flow of a conversation or otherwise disturbing the auditory experience. The headset doesn’t require wake phrases like “OK Google” or “Hey Siri” to communicate with smart devices; it quietly interprets whatever the user is internally verbalizing.
MIT is not alone in the quest to develop Matrix-style computer interfaces; other outfits, including Neuralink, are pursuing the same basic idea of giving users augmented mental capabilities. At the moment, AlterEgo can recognize the digits 0 through 9 and has a vocabulary of about 100 words.
The details of AlterEgo are described in a paper presented at the Association for Computing Machinery’s Intelligent User Interfaces (IUI) conference.
Image credit: Depositphotos