Technological progress benefits people across society, but it can play a special role in supporting people with disabilities. At the intersection of artificial intelligence (AI) and inclusive design lies a sweet spot where intelligent machines can enable more people to live independent lives.
This sweet spot is what Saqib Shaikh, a Microsoft software engineer who is blind, is focused on. In 2014, Saqib participated in Microsoft’s first company-wide hackathon, developing an idea for using AI to empower people with visual impairments. That idea eventually evolved into Seeing AI. Launched in 2017, this “talking camera” smartphone app uses AI to describe people, text, and objects, giving people with visual impairments a new way to understand the world around them. It has already helped users with more than 20 million tasks.
In our latest #TechTalk, Saqib discusses how his hackathon idea became a Microsoft product and transformed from a pet project into his full-time day job. He also discusses the importance of listening to and taking on board user feedback to improve the app. For instance, when users asked for Seeing AI to be available in languages other than English, Microsoft responded. As of today, Seeing AI is also available in Spanish, French, German, Dutch, and Japanese. Saqib also addresses the responsibility of technology companies to ensure that AI systems are developed using fair and representative training data that reflect not only racial and gender diversity, but also the experiences of people with disabilities. He is a firm believer in the need to adopt policies that mitigate the negative impacts of AI, but notes that, given the speed at which AI technology develops, regulation should focus on desired outcomes rather than on specific technologies.
Check out the Microsoft Stories feature to learn more about the new language support, and watch the full #TechTalk with Saqib Shaikh here: