
Microsoft’s mission: AI for every developer

Vulcan Steel makes about 3,000 deliveries of steel a day to businesses throughout New Zealand and Australia – which means that each day, its employees need to use their training to figure out how to safely get large, heavy and unwieldy pieces of steel off its trucks and into the hands of a very diverse group of customers.

“It’s an awkward product to transport, and it’s difficult to design out all of the risks,” said James Wells, the company’s chief information officer. “So essentially what that means for us is one of the key requirements or skills for us to keep people safe is around education.”

For years, Vulcan Steel did what most companies do – it educated its employees about safety before sending them into the field, and then it provided additional training as needed when someone reported an accident or near miss.

Now, it’s using artificial intelligence to try to prevent accidents and near misses before they happen. The company recently started using Microsoft Cognitive Services’ Custom Vision tools to evaluate camera footage from its trucks for actions that could be risky or lead to an accident.

The computer vision tools can do what the human eye can’t reasonably do – sift through thousands of pieces of footage a day to look for potential risks – freeing up the company’s workers to review just the small subset of footage that has been flagged as a possible concern.
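Vulcan Steel hasn’t published its code, but a minimal sketch of a flagging pipeline like the one described – classify each camera frame with a published Custom Vision model, then keep only the frames scoring above a threshold for human review – might look like the following. The endpoint, project ID, published iteration name, tag name and threshold are all illustrative placeholders, not details from the company’s system.

```python
# Hypothetical sketch: send truck-camera frames to a published Custom Vision
# classifier and flag the ones the model scores as risky.
# All identifiers below are illustrative placeholders.
import requests

ENDPOINT = "https://example.cognitiveservices.azure.com"   # placeholder
PROJECT_ID = "00000000-0000-0000-0000-000000000000"        # placeholder
ITERATION = "riskyUnloadingModel"                          # placeholder
PREDICTION_KEY = "<prediction-key>"
THRESHOLD = 0.7  # only frames above this score go to a human reviewer

def flag_risky_frames(frame_paths):
    url = (f"{ENDPOINT}/customvision/v3.0/Prediction/{PROJECT_ID}"
           f"/classify/iterations/{ITERATION}/image")
    headers = {
        "Prediction-Key": PREDICTION_KEY,
        "Content-Type": "application/octet-stream",
    }
    flagged = []
    for path in frame_paths:
        with open(path, "rb") as f:
            resp = requests.post(url, headers=headers, data=f.read())
        resp.raise_for_status()
        for prediction in resp.json().get("predictions", []):
            # "risky" is a placeholder tag name for the trained classifier
            if prediction["tagName"] == "risky" and prediction["probability"] >= THRESHOLD:
                flagged.append((path, prediction["probability"]))
                break
    return flagged  # the small subset handed to human reviewers
```

A production pipeline would also batch requests and handle rate limits, but the core of the workflow is a single HTTP call per frame followed by a threshold check.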

That, in turn, is allowing Vulcan Steel to focus its education efforts on what it sees as the most worrisome or risky scenarios. Wells said accidents were already exceedingly rare, so the goal is to build the company’s culture of safety.

“What we’re hoping is we will measure the number of education discussions that take place as a result,” Wells said. “From our point of view, if we add an additional number of safety discussions to our organization, there’s not really any negative that can come of that.”

Vulcan Steel doesn’t have a large staff of developers, and it certainly doesn’t have a team of AI experts. Wells said the development of this AI-based system was basically the work of one enthusiastic .NET developer who saw the potential for how AI could help the business.

Vulcan Steel employees need to manage many variables when loading and delivering steel orders.

“It’s impressed me, how little detailed experience it’s taken to get to where we are,” Wells said.

At Microsoft’s Build developer conference in Seattle this week, the company is unveiling a series of new and updated tools that are part of its effort to help all developers do what the developer at Vulcan Steel did – incorporate AI into new or existing processes, whether or not they have a background in the fast-emerging field.

Joseph Sirosh, corporate vice president in charge of Microsoft’s cloud AI platforms, said the company’s expanding roster of AI tools for developers comes as businesses are clamoring to add capabilities like speech or image recognition to their applications as a way to stand out from the competition, operate more efficiently and better serve customers.

“What we are seeing is a huge hockey stick in the adoption of AI among all the major applications, and the apps are now starting to be differentiated based on these capabilities,” Sirosh said.

At Build, the company’s announcements will include a number of new capabilities for Microsoft Cognitive Services, which allow developers with little or no AI expertise to add things like speech, language and search capabilities to their applications. The announcements include a unified speech service that covers everything from improved text-to-speech capabilities to custom voice recognition, as well as computer vision advances that provide new capabilities for identifying objects, extracting information from images and performing visual searches.
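The article itself doesn’t include code, but as a rough illustration of the kind of image capability being described, here is a minimal sketch of calling the Computer Vision analyze endpoint to pull detected objects and a caption out of an image. The region, API version, key and parameters are assumptions to be checked against the current Cognitive Services documentation.

```python
# Hypothetical sketch: ask the Computer Vision "analyze" endpoint for objects
# and a description of an image. Region, key and API version are placeholders.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # placeholder region
SUBSCRIPTION_KEY = "<subscription-key>"

def describe_image(image_url):
    analyze_url = f"{ENDPOINT}/vision/v2.0/analyze"
    params = {"visualFeatures": "Objects,Description"}
    headers = {"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
               "Content-Type": "application/json"}
    resp = requests.post(analyze_url, params=params, headers=headers,
                         json={"url": image_url})
    resp.raise_for_status()
    analysis = resp.json()
    objects = [o["object"] for o in analysis.get("objects", [])]
    captions = [c["text"] for c in analysis.get("description", {}).get("captions", [])]
    return objects, captions
```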

Microsoft also will showcase updates to its conversational AI tools, including a public preview of Bot Builder SDK v4, and the capability to do things like add personas to bots, learn conversational patterns and extract questions and answers from documents.
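As a hedged illustration of the question-and-answer side of those conversational tools, the sketch below queries a published QnA Maker knowledge base – one common way to surface answers extracted from documents through a bot. The runtime host, knowledge base ID and endpoint key are placeholders, not details from any of the products announced.

```python
# Hypothetical sketch: ask a published QnA Maker knowledge base a question and
# return the top answer. All identifiers are illustrative placeholders.
import requests

RUNTIME_HOST = "https://example.azurewebsites.net"      # placeholder
KB_ID = "00000000-0000-0000-0000-000000000000"          # placeholder
ENDPOINT_KEY = "<endpoint-key>"

def ask_knowledge_base(question):
    url = f"{RUNTIME_HOST}/qnamaker/knowledgebases/{KB_ID}/generateAnswer"
    headers = {"Authorization": f"EndpointKey {ENDPOINT_KEY}",
               "Content-Type": "application/json"}
    resp = requests.post(url, headers=headers, json={"question": question, "top": 1})
    resp.raise_for_status()
    answers = resp.json().get("answers", [])
    return answers[0]["answer"] if answers else None
```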

In addition, Microsoft is announcing support for deploying Microsoft Cognitive Services on the edge – that is, on a device such as a vehicle or camera that doesn’t have a constant connection to a network or the cloud. Microsoft also said it was making updates to Azure Machine Learning, which provides lifecycle support for AI development, training and deployment, and it announced ML.NET, a framework Microsoft already uses internally that makes it easier to incorporate AI into a .NET developer’s existing workflow.

Sirosh said he hopes the Build conference will give developers practical guidance for adding AI tools and learning AI skills – and also get them excited about what they can do with those capabilities.

“I hope we spark imagination, and I hope we show them ways to make that imagination come true through the APIs we have,” Sirosh said. “And then I hope they get inspired to use more advanced machine learning to build their own custom APIs in the cloud.”

Preventing blindness

Jonathan Stevenson, the chief strategy and information officer for Intelligent Retinal Imaging Systems, likes to say that his company’s goal is simple: to end preventable blindness.

To do that, the company simplifies the process of testing people with diabetes for a condition called diabetic retinopathy, which leads to blindness unless caught early but is often relatively symptom-free until severe visual impairment occurs. The company provides primary care doctors with a comprehensive diagnostic platform – a medical device, software and services that capture retinal images – allowing them to perform the diabetic eye exam during a regular check-up. The images are then sent to an expert for review via a secure, cloud-based network.

Stevenson said that the screening program can help primary care doctors better identify which patients are at risk for vision loss, even if the patients aren’t showing any symptoms. Those patients can then be referred to an ophthalmologist or eye specialist for treatment.

Recently, the company, which goes by IRIS, started using Microsoft’s AI tools to increase the likelihood that eye disease can be caught earlier.

Stevenson said they are using an algorithm to scan the same patient images the eye doctors review, acting as a backstop for anything a doctor may have missed. The company also is using machine learning tools to analyze anonymized patient data to better predict what factors might put a person at risk for retinopathy. That can help them better identify and flag the patients who should be getting more regular testing or check-ups with an eye care provider.

The company also is developing an AI tool to verify that right and left eye images are tagged correctly – a seemingly small thing, but one that can save a lot of trouble down the road.

Stevenson said IRIS’s goal is not to replace physicians at any step in the process, but rather to augment the work doctors are doing by catching the occasional human error and providing helpful additional context.

“Artificial intelligence for IRIS is there not only to provide an assist for a physician,” he said. “It’s also to allow providers to make more rapid and better-informed decisions.”

Like the team at Vulcan Steel, IRIS began this project with very little AI expertise – in fact, Stevenson said, the lead programmer had previously worked only in Linux.

“Everything we’ve done in AI was done by one person with me backing him up,” Stevenson said. “If we didn’t have the tools that Microsoft provided, there’s no way we would have done that.”

Irene Gómez oversees development and deployment of Telefónica’s Aura personal digital assistant.

Meet Aura


Unlike Vulcan Steel or IRIS, Telefónica has been using AI for years, but mainly for internal data analysis and other business purposes.

“By using data and machine learning we were able to reduce costs and make better internal decisions,” said Irene Gómez, who oversees development and deployment of the company’s Aura personal digital assistant. “That’s all very nice but what about the customer? We wanted to show our customers that they are in control of their data and give them tools to get the best customer experience possible out of it.”

So Gómez and a team came together, using Microsoft Cognitive Services and Azure Bot Service, to build a digital assistant that can help customers with everything from scheduling a television recording to paying a bill.

Aura has been available for only a couple of months, and Gómez said the team is already seeing that customers have high expectations for digital assistants. They want the assistant to quickly gain new knowledge about the latest programming on television. They expect Aura to understand them regardless of what words they use, rather than requiring them to use specific phrases to get what they want. And they want Aura to be relatively personalized – for example, since Telefónica has operations in several Spanish-speaking countries, the team programmed the assistant to use and understand the accents, grammar and language variations of each country. The AI tools are helping with all of those things.
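Telefónica hasn’t described Aura’s internals, but one simple way to handle per-country language variation is to keep a separate language-understanding (LUIS) app per market and route each utterance to the model trained for that locale. The sketch below assumes that approach; the region, key and app IDs are placeholders, not details of Aura.

```python
# Hypothetical sketch: route utterances to per-country LUIS apps so regional
# vocabulary and grammar are handled by a model trained for that market.
# Region, key and app IDs are illustrative placeholders.
import requests

LUIS_REGION = "westeurope"                    # placeholder
LUIS_KEY = "<subscription-key>"
APPS_BY_COUNTRY = {                           # placeholder app IDs
    "ES": "11111111-1111-1111-1111-111111111111",
    "AR": "22222222-2222-2222-2222-222222222222",
    "PE": "33333333-3333-3333-3333-333333333333",
}

def top_intent(country_code, utterance):
    app_id = APPS_BY_COUNTRY[country_code]
    url = f"https://{LUIS_REGION}.api.cognitive.microsoft.com/luis/v2.0/apps/{app_id}"
    params = {"subscription-key": LUIS_KEY, "q": utterance}
    resp = requests.get(url, params=params)
    resp.raise_for_status()
    result = resp.json()
    return result["topScoringIntent"]["intent"], result["topScoringIntent"]["score"]
```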

Gómez said the project is unique in that they are using cutting-edge technology to develop a cutting-edge digital assistant. It’s exciting and it’s impactful, and it also requires the team to be resilient and able to deal with the idea that things don’t always work out as planned.

“Technology is evolving at the same time you are developing your product, and so if you want to work in a project like Aura you have to have an open mind,” Gómez said.

Top image: Vulcan Steel employee Bains Gurpreet tightens the load on product going out for delivery. 
