In the aftermath of an earthquake, a snakelike robot that can crawl through rubble and tight air pockets is able to access places that no person could — or should — be able to go.

The Sarcos Guardian S, a small robotic visual inspection platform, is designed for exactly those scenarios: searching for cracks in industrial pipelines, finding people trapped in unstable buildings, sensing whether hazardous gases at an accident site could pose a safety risk to first responders.

Today, the robot is controlled by someone working at a safe distance, who sees the scene through its cameras and guides it with the equivalent of a video game joystick. Now, Microsoft and Sarcos are collaborating to add intelligent capabilities to the Guardian S that would allow it to navigate more autonomously — freeing the operator to focus on more important decisions.

The idea of automating industrial work with robots isn’t new. Robotic arms move products along an assembly line, machines turn hunks of metal into finished parts, a car shifts gears without driver input.

But that’s a far cry from systems that are actually autonomous — ones that are capable of sensing their surroundings and knowing what to do when confronted with unfamiliar situations. Instead of performing specific tasks repeatedly without variation, these autonomous systems can dynamically respond to changing environments to solve a difficult problem. They also have vast potential to augment how people do their jobs or to perform work that is unsafe or cost-prohibitive for people to do.

Microsoft is building an end-to-end toolchain to help make it easier for every developer and every organization to create autonomous systems for their own scenarios — whether that’s a robot that can help in life-threatening situations, a drone that can inspect remote equipment or systems that help reduce downtime in a factory by autonomously calibrating equipment.

At the Microsoft Build 2019 developers conference, the company announced the platform’s first component: a limited preview program for developers to work with its experts to build intelligent agents using Microsoft AI and Azure tools that can autonomously run physical systems.

A journey from automated to autonomous systems

When people think of autonomous systems, many go straight to the vision of the fully autonomous car that drives itself while you sit in the back seat and read a book, said Mark Hammond, Microsoft general manager for Business AI.

Microsoft’s vision is to help other types of companies — from smart building and energy companies to industrial manufacturers — take incremental steps toward autonomy in their own industries.

“Machines have been progressing on a path from being completely manual, to having a fixed automated function, to becoming intelligent enough to deal with real-world situations themselves,” Hammond said. “We want to help accelerate that journey without requiring our customers to have an army of AI experts.”

Microsoft’s autonomous systems platform tackles that complexity with an approach called machine teaching. It relies on the knowledge of a developer or subject matter expert — someone who may not have a background in AI but understands how to steer a drill or keep the airflow in an office building at safe levels — to break a large problem into smaller chunks.

Microsoft’s platform also enables non-AI experts to establish and tweak the reward system, which is key to arriving at a solution that truly works. And it selects and configures the algorithms for the task, eliminating the need for machine learning experts to custom-build solutions.
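To make the reward-system idea concrete, here is a minimal sketch of the kind of reward function a subject matter expert — not a machine learning specialist — might write for the office-airflow example above. All names and thresholds are hypothetical, not part of Microsoft’s platform:

```python
def airflow_reward(co2_ppm: float, energy_kw: float) -> float:
    """Reward keeping CO2 near a safe target while penalizing energy use.

    Hypothetical thresholds chosen for illustration only.
    """
    TARGET_CO2 = 800.0   # ppm, assumed comfort/safety target
    MAX_CO2 = 1500.0     # ppm, above this the state is unacceptable

    if co2_ppm > MAX_CO2:
        return -10.0  # hard penalty for unsafe air quality

    # Score is 1.0 at or below the target, falling to 0.0 at the maximum ...
    air_quality = max(0.0, min(1.0, 1.0 - (co2_ppm - TARGET_CO2) / (MAX_CO2 - TARGET_CO2)))
    # ... minus a small cost proportional to fan energy draw.
    return air_quality - 0.1 * energy_kw
```

The domain knowledge — what counts as safe air, and how to trade it off against energy cost — lives entirely in this small function, which is why a building engineer can write or tweak it without touching the learning algorithm itself.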

Running simulation at scale in Azure

Because no company can afford to let a robot or an intelligent control system make millions of mistakes in a real-world factory, wind farm or highway as it learns, reinforcement learning algorithms need to practice in a simulated environment that can replicate the thousands or millions of real-world scenarios they might encounter.
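A toy example illustrates why simulation matters. The tabular Q-learning agent below makes thousands of exploratory "mistakes" in a simulated one-dimensional corridor before settling on a good policy — trial and error that would be far too costly on real hardware. This is a generic reinforcement learning sketch, not code from Microsoft’s platform:

```python
import random

N_STATES = 5          # corridor cells; reaching the last cell is the goal
ACTIONS = [-1, +1]    # step left or step right

def train(episodes: int = 2000, alpha: float = 0.5, gamma: float = 0.9,
          epsilon: float = 0.1) -> list:
    """Tabular Q-learning with epsilon-greedy exploration."""
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # Explore with probability epsilon, otherwise act greedily.
            if random.random() < epsilon:
                a = random.randrange(2)
            else:
                a = max(range(2), key=lambda i: q[s][i])
            s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else 0.0  # reward only at the goal
            # Standard Q-learning update.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

random.seed(0)
q = train()
```

After training, the greedy policy in every non-goal state is "move right" — learned entirely from simulated experience, with every wrong turn costing nothing but compute.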

The Microsoft toolchain also includes AirSim, an open source simulation platform originally developed by Microsoft researchers to train drones, self-driving cars and other robots in high-fidelity simulated environments. Alternatively, the team can work with customers to train autonomous systems using existing industry-specific simulators.

AirSim also allows developers to train different AI and control tools to solve different parts of more complex problems. In helping develop autonomous forklifts for Toyota Material Handling, for instance, researchers broke the task down into sub-concepts that are simpler to learn and debug: navigating to the load, aligning with the pallet, picking it up, detecting other people and forklifts, delivering the pallet, returning to the charging station.
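One way to picture how separately trained sub-concepts like those above fit together is as a simple coordinator that runs each skill in sequence. The sketch below is purely illustrative — each "skill" is a stub standing in for a trained model, and all names are hypothetical:

```python
from typing import Callable, Dict

# Each function stands in for a separately trained sub-concept.
# In a real system these would be learned policies, not stubs.

def navigate_to_load(state: Dict) -> Dict:
    state["at_load"] = True
    return state

def align_with_pallet(state: Dict) -> Dict:
    state["aligned"] = True
    return state

def pick_up_pallet(state: Dict) -> Dict:
    state["carrying"] = True
    return state

def deliver_pallet(state: Dict) -> Dict:
    state["carrying"] = False
    state["delivered"] = True
    return state

# Coordinator: run the learned skills in order.
PIPELINE: "list[Callable[[Dict], Dict]]" = [
    navigate_to_load, align_with_pallet, pick_up_pallet, deliver_pallet,
]

def run_task(state: Dict) -> Dict:
    for skill in PIPELINE:
        state = skill(state)
    return state
```

The appeal of this decomposition is that each stage can be trained, tested and debugged in isolation, then swapped out without retraining the whole pipeline.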

“We are working to provide a comprehensive platform for customers who want to build intelligent autonomous systems, covering development, operation and end-to-end lifecycle management,” Hammond said.