New England Machine Learning Accessibility Hackathon Designs for Inclusion

| MSNE Staff

At Microsoft, we know that technology doesn’t work until it works for everyone. Our AI for Accessibility initiative puts that belief into practice, amplifying human capability through AI-driven technologies and solutions that empower everyone to access the world. As a follow-up to our New England Machine Learning Day conference in May, we were excited to organize a New England Machine Learning Accessibility Hackathon.

This year’s hackathon had one goal in mind: to create solutions that promote accessibility and inclusion. Attendees brought a wide range of experience from both the accessibility and technology sectors, including strong representation from within the disability community, and took a user-centered approach to building accessible technology that makes a true impact.

This week, we were excited to join Microsoft Research to host the New England Machine Learning Accessibility Hackathon, an opportunity for local technologists to work together in teams to build technology for accessibility and inclusion. We're thankful for all the solutions built — these teams have us excited for an inclusive, accessible future. #MicrosoftLife

Posted by Microsoft New England on Thursday, June 14, 2018

“Inclusion is not about adding something on after you’ve finished building a product,” explained Manohar Swaminathan of Microsoft Research India. “Accessibility [in tech] has been mostly like that… that’s the wrong approach. Inclusivity starts by saying, ‘when I design, I take into account to make sure I get in as many people as possible.’ It’s a very different mindset right at the conceptual stage, taking [it] all the way through to execution.”

Under the guidance of mentors Anastasiya Belyaeva (MIT Institute for Data, Systems, and Society), Manohar Swaminathan (Microsoft Research India), Adam Kalai (Microsoft Research New England), and Bill Thies (Microsoft Research New England), attendees formed teams to tackle some of the most pressing issues in accessibility head-on. Team projects included:

  • American Sign Language: Fact or Opinion Quiz. Led by Danielle Bragg, University of Washington/Microsoft Research, and Dr. Naomi Caselli, Boston University. The ability to distinguish between facts and opinions is an important skill taught in K-12 education. Exercises used in schools are all in English, but English is not the primary language of the Deaf community; American Sign Language (ASL) is. Help us build a tool entirely in ASL that quizzes students on whether content is fact or opinion. The system will both display content in signed ASL and evaluate answers signed to a camera.
  • American Sign Language: Scattergories. Led by Danielle Bragg, University of Washington/Microsoft Research, and Dr. Naomi Caselli, Boston University. Sign language translation lags far behind spoken language translation, in large part due to a lack of proper training data. Help us build an online American Sign Language (ASL) Scattergories game to collect a large, labeled corpus of signs executed by diverse signers and boost translation efforts.
  • Aphasia Augmented Language Interface. Led by Kristin Williams, Carnegie Mellon University. A tool designed to facilitate word finding when needed, without disrupting the conversation.
  • Augmented Screen Reader. Led by Kalli Retzepi, MIT. A screen reader that uses audio and vibration for input and output, starting from a single website and building an easy-to-navigate, interactive semantic map of its contents (see the first sketch after this list).
  • Data Analytics Tool. For parents and therapists using Pathfinder Health Innovations, a platform that tracks multi-year behavior and skill acquisition for children in autism therapy and special education.
  • Neurodiversity Social Chatbot. Led by Joel Salinas, Harvard Medical School/MGH, and Dr. Jordi Albo-Canals, NTT Data/Tufts University. How do we learn to relate with another person? How do we communicate so we both feel heard, honored, and respected for who we are? How, despite so many barriers, can we connect better? We all struggle with these questions, but for some, they feel unanswerable and insurmountable. While there is still no replacement for all the benefits of face-to-face interaction with others, we can begin to overcome this challenge through the thoughtful application of machine learning to make face-to-face connections easier and better. As featured in this New York Times Modern Love essay, Gus, a 13-year-old on the autism spectrum, learned how to connect better with other people on his own terms with some unexpected help: Siri. Yes, Siri on his iPhone.
  • Seeing AI App. Led by Rob Gallo, Microsoft Accessibility Engineer, and Saqib Shaikh, Seeing AI Tech Lead. Improving UPC barcode identification, particularly on non-flat surfaces (see the second sketch after this list).
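
To give a flavor of the screen reader idea, here is a minimal sketch of what one "semantic mapping" pass over a page could look like. This is our illustration, not the team's code: it assumes Python with the requests and beautifulsoup4 packages, and the landmark tags, labels, and flat outline are all assumed design choices.

```python
# Minimal sketch: build a simple "semantic map" of one web page that an
# audio/vibration interface could step through. Assumes the requests and
# beautifulsoup4 packages; tag choices and output format are illustrative.
import requests
from bs4 import BeautifulSoup

# Landmark-style elements a reader might want to jump between (our choice).
LANDMARKS = ["h1", "h2", "h3", "nav", "main", "form", "table"]

def semantic_map(url):
    """Return (tag, label) pairs for the page's landmark elements, in order."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    outline = []
    for el in soup.find_all(LANDMARKS):
        # Use the element's visible text (truncated) as a spoken label.
        label = el.get_text(" ", strip=True)[:60] or el.name
        outline.append((el.name, label))
    return outline

if __name__ == "__main__":
    for tag, label in semantic_map("https://example.com"):
        print(f"{tag:>5}: {label}")
```

A real prototype would surface these entries as audio cues and vibration patterns rather than printed lines, but an ordered outline like this is the kind of interactive structure the project description points to.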

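Similarly, for the Seeing AI item, here is a minimal sketch of off-the-shelf UPC decoding, assuming Python with the opencv-python and pyzbar packages. The adaptive-threshold step is one illustrative way to cope with the uneven lighting that curved, non-flat packaging introduces; it is not the actual Seeing AI pipeline.

```python
# Minimal sketch: decode UPC/EAN barcodes from a photo. Assumes the
# opencv-python and pyzbar packages; not the actual Seeing AI pipeline.
import cv2
from pyzbar.pyzbar import decode

def read_barcodes(path):
    """Return (symbology, digits) pairs found in the image at `path`."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # Adaptive thresholding flattens the uneven lighting that curved
    # (non-flat) packaging tends to introduce; one illustrative tactic.
    binary = cv2.adaptiveThreshold(
        gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C, cv2.THRESH_BINARY, 51, 10)
    # Try the cleaned-up frame first, then fall back to the raw one.
    results = decode(binary) or decode(gray)
    return [(r.type, r.data.decode("ascii")) for r in results]

if __name__ == "__main__":
    print(read_barcodes("package_photo.jpg"))
```
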
After a day of hard work, collaboration, and solution-building, we were thrilled to announce the hackathon winners! First place went to the ASL Scattergories translation database, led by Danielle Bragg. Second place went to the Seeing AI UPC label reader, led by Rob Gallo, Microsoft accessibility engineer, and Saqib Shaikh, Seeing AI tech lead. The People’s Choice Award went to the ASL Fact or Opinion learning app, led by Dr. Naomi Caselli, Boston University.

We can’t wait to see where our teams take the technology they built during the hackathon!

Thank you to our judges, listed below:

  • Meryl Alper, Northeastern University
  • Daniel Hannon, Tufts University
  • Elaine Harris, MIT Hack for Inclusion organizer
  • Jamie MacLennan, Microsoft Azure ML
  • Paul Medeiros, President of Easter Seals MA
  • Jaya Narain, MIT ATHack Cofounder
  • Ognjen Rudovic, MIT Media Lab
  • D. Sculley, Google Brain