Steve Gleason doesn’t take “no” for an answer. And neither does the Ability Hackathon: Eye Gaze team, one of more than 2,200 participating in this week’s //oneweek hackathon.
The Eye Gaze team had set out to do a project to help Gleason, a former pro football player who is living with amyotrophic lateral sclerosis (ALS). Gleason’s foundation aims to raise awareness about ALS, as well as to give others living with it the “leading edge technology, equipment and services” they need.
Gleason uses his Surface Pro to speak with the help of eye-tracking technology from Tobii. He asked the Eye Gaze team to come up with a way for him to turn the Surface Pro on and off using his eyes, because he can’t do that now. Turning the device on or off means asking someone else for help, and he’d rather not; he already depends on the help of others for many things.
Gleason had other requests for the Eye Gaze team as well. He wanted predictive text to be better, so that he could reduce the time it takes him to talk and speak more naturally with family and friends. And there was a growing concern: he was losing his ability to drive his own wheelchair, and he wanted to know if the team could figure out a way for him to do that using his eyes.
The team, like Gleason, decided to go all in. Despite a few meetings that ended in frustration, some initial fits and starts, and considerable sleep deprivation, they kept at it, because they believed in what they were doing. And they came up with solutions.
“It was a true One Microsoft effort; this could never have happened without a hackathon,” said Matthew Mack, one of the leaders of the Eye Gaze team. The “cross-discipline” team, he said, includes researchers, engineers, program managers, designers and media professionals from Windows, Microsoft Research, Kinect, Operating Systems, Customer Service and Support, and Applications & Services.
They learned, he said, that “as individual parts, we were good. Together, we were so much better.”
To help Gleason “drive” his wheelchair, the team combined a Kinect sensor, Microsoft robotics research and eye-tracking technology, creating a user interface on a Surface Pro 3 that navigates the wheelchair and safely maneuvers around objects it detects.
The team needed a test wheelchair, and the Permobil wheelchair company gladly provided one on the spot, Mack said.
“Not only did they loan us the wheelchair, but they allowed us to take the wheelchair to pieces, to retrofit it,” he said.
For the eye-controlled Surface on/off feature, the team made it possible with a firmware edit. They also made strides in improving predictive text.
“The entire team has brought themselves to work every day for the last four, five days, to make this a reality,” Mack said. “They’ve been energized to do it. When you think of the human connection here, that we’re able to set someone free from the bounds of being moved around by someone else, when all other motion is gone, except the movement of their eyes, it’s a hugely powerful story.”
Gleason, who traveled from his home in New Orleans to the Microsoft campus to work with the team, is thrilled, Mack said.
“His message on Wednesday to us was: ‘Keep this work rolling.’ It feels like we’re so close to really being able to deliver something. It’s not just a dream now, it’s something that we could actually give him.”
Microsoft News Center Staff