Renowned physicist Stephen Hawking has long relied on technology to help him connect with the outside world despite the degenerative motor neuron disease he has battled for the past 50 years.
Although Hawking’s condition has deteriorated over time, a highly respected computer scientist indicated at last week’s International Consumer Electronics Show (CES) that he and his team may be close to a breakthrough that could boost the rate at which the physicist communicates, a rate that has fallen to a mere one word per minute in recent years.
For the past decade Hawking has used a voluntary twitch of his cheek muscle to compose words and sentences one letter at a time; the text is then voiced by a speech-generation device connected to his computer.
Each twitch stops a cursor that continuously scans text on a screen facing the scientist.
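The mechanics of this single-switch scanning scheme can be sketched in a few lines. This is a toy illustration of the general technique, not Hawking’s actual software; the alphabet ordering, dwell time and tick-based timing are all invented for the example.

```python
# A minimal sketch of single-switch scanning: a cursor steps through the
# alphabet at a fixed pace, and one switch event (here, a cheek twitch)
# selects whatever letter is currently highlighted. All names and timings
# are illustrative, not the real system's.
import string

ALPHABET = string.ascii_uppercase + " "

def scanned_letter(switch_tick, dwell_ticks=3):
    """Return the letter highlighted when the switch fires.

    The cursor advances one letter every `dwell_ticks` clock ticks;
    `switch_tick` is the tick at which the user's twitch is detected.
    """
    index = (switch_tick // dwell_ticks) % len(ALPHABET)
    return ALPHABET[index]

# With a dwell of 3 ticks per letter, a twitch on tick 7 lands on
# the third letter of the alphabet.
assert scanned_letter(7) == "C"
```

The cost of the scheme is visible in the arithmetic: reaching a letter late in the scan order takes many dwell periods, which is why a single input yields such a low word rate.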
At CES, Intel chief technology officer Justin Rattner noted that Hawking can in fact make a number of other facial expressions that might also be used to speed up the rate at which the physicist conveys his thoughts.
Even providing Hawking with two inputs would give him the ability to communicate using Morse code, “which would be a great improvement,” said Rattner, who is also director of Intel Labs.
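Two reliable inputs are all Morse code requires: one mapped to dot, the other to dash. The sketch below uses the standard international Morse table to show how such two-input entry decodes to text; it illustrates the idea Rattner raises, not any actual Intel interface.

```python
# A sketch of two-input text entry via Morse code: one input produces a
# dot, the other a dash. The table is standard international Morse;
# nothing here reflects Intel's real interface.
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode(signals):
    """Decode a space-separated string of dot/dash groups, one per letter."""
    return "".join(MORSE[code] for code in signals.split())

assert decode(".... ..") == "HI"
```

Because common letters have short codes (E is a single dot, T a single dash), Morse entry needs far fewer input events per word than stepping a cursor through the whole alphabet.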
Since the late 1990s Intel has supplied Hawking with technology to help the scientist express himself. The latest chapter in their work together began in late 2011, when Hawking reached out to Gordon Moore, informing the Intel co-founder and father of Moore’s law that his ability to compose text was slowing and inquiring whether Intel could help.
Rattner met with Hawking early last year around the time of the latter’s 70th birthday celebration in Cambridge, where the Intel CTO was one of the speakers. After meeting with Hawking, Rattner said he wondered whether his company’s processor technology could restore the scientist’s ability to communicate at five words per minute, or even increase that rate to 10.
“Up to now, these technologies didn't work well enough to satisfy someone like Stephen, who wants to produce a lot of information,” Rattner said.
Intel is now working on a system that can use Hawking’s cheek twitch as well as mouth and eyebrow movements to provide signals to his computer. “We've built a new, character-driven interface in modern terms that includes a better word predictor,” Rattner said.
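A word predictor of the kind Rattner mentions can be illustrated with a few lines of frequency-ranked prefix completion. This is an assumption about the general technique only; the vocabulary and usage counts below are made up, and Intel’s predictor is surely more sophisticated.

```python
# A toy word predictor: rank candidate completions of a typed prefix by
# how often each word has been used before. Vocabulary and counts are
# invented for illustration.
from collections import Counter

usage = Counter({"universe": 12, "university": 3, "under": 7, "black": 9})

def predict(prefix, k=2):
    """Return the k most frequently used words starting with `prefix`."""
    candidates = [(n, w) for w, n in usage.items() if w.startswith(prefix)]
    return [w for n, w in sorted(candidates, reverse=True)[:k]]

assert predict("un") == ["universe", "under"]
```

Each accepted prediction replaces several letter selections, which is where the hoped-for jump from one word per minute toward five or ten would come from.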
The company is also exploring the use of facial-recognition software to create a new user interface for Hawking that would be quicker than selecting individual letters or words.
Hawking’s current setup includes a tablet PC with a forward-facing Webcam that he can use to place Skype calls. A black box beneath his wheelchair contains an audio amplifier, voltage regulators and a USB hardware key that receives the input from an infrared sensor on Hawking’s eyeglasses, which detects changes in light as he twitches his cheek.
A hardware voice synthesizer sits in another black box on the back of the chair and receives commands from the computer via a USB-based serial port.
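The eyeglass sensor’s job reduces to spotting a sharp change in reflected infrared light when the cheek moves. A minimal threshold detector conveys the idea; the sample values and cutoff are invented, and the real firmware is not public.

```python
# An illustrative detector for the kind of signal the infrared sensor
# produces: a cheek twitch appears as a sudden drop (and later recovery)
# in the reflected-light level. Threshold and samples are invented.
def detect_twitches(samples, threshold=20):
    """Return indices where the light level jumps by more than `threshold`
    between consecutive samples."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > threshold]

light = [100, 101, 99, 60, 62, 100, 98]   # one twitch: dip, then recovery
assert detect_twitches(light) == [3, 5]   # both edges of the dip
```

In practice such a detector would also need debouncing, so that the two edges of a single twitch register as one switch event rather than two.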
Intel’s work with Hawking is part of the company’s broader research into smart gadgets as well as assistive technologies for the elderly. The key to advancing smart devices, which have plateaued over the past five or six years, is context awareness, Rattner said. Devices will really get to know us the way a friend would, understanding how our facial expressions reflect our mood, he added.
Intel’s plan for identifying personal context requires a combination of hardware sensors—camera, accelerometer, microphone, thermometer and others—with software that can check one’s personal calendar, social networks and Internet browsing habits, to name a few.
“We use this [information] to reason your current context and what's important at any given time [and deliver] pervasive assistance,” Rattner said. One approach to “pervasive assistance” is the Magic Carpet, a rug that Intel and GE developed with embedded sensors and accelerometers that can record a person’s normal routine and even their gait, sounding an alert when deviations are detected.
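Deviation alerting of the sort the Magic Carpet performs can be sketched as a comparison against a learned baseline. The statistical rule and cutoff below are assumptions for illustration; neither Intel nor GE has described the rug’s actual algorithm.

```python
# A sketch of routine-deviation alerting in the spirit of the Magic
# Carpet: record a baseline of normal step intervals, then flag any
# reading that falls far outside it. The z-score rule and cutoff are
# illustrative, not GE's or Intel's method.
from statistics import mean, stdev

def deviates(baseline, reading, k=3.0):
    """Flag `reading` if it lies more than k standard deviations
    from the mean of the recorded baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(reading - mu) > k * sigma

steps = [0.98, 1.02, 1.00, 0.97, 1.03]     # normal step intervals (seconds)
assert not deviates(steps, 1.01)           # within the usual gait
assert deviates(steps, 1.60)               # markedly slower gait: alert
```

The same pattern generalizes to the other sensors Rattner lists: learn what normal looks like, then surface only the exceptions.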
Such assistance will anticipate our needs, letting us know when we are supposed to be at an appointment and even reminding us to carry enough cash when running certain errands, according to Rattner, who added, “We’ll be emotionally connected with our devices in a few years.”