Since the inception of Azure, we have been focused on delivering a true hybrid cloud where applications spanning public cloud and on-premises datacenters are built and run consistently. As organizations are now building applications that span the intelligent cloud and intelligent edge, the same approach is needed. Fundamentally, the principles and technology needed for developing hybrid cloud applications are the same as those for intelligent cloud and intelligent edge applications. Azure’s longstanding leadership in hybrid cloud gives developers unique know-how for building modern applications that span the edge and the cloud.
Next week at Microsoft Build, more than 6,000 developers will join us in Seattle to experience the latest advancements in dev tools and cloud services. Today, to help usher in Build, I’m excited to share some of the new Azure innovations that we will be showcasing at Build that enable developers to build this new generation of hybrid applications with greater productivity and success.
To begin, we’re announcing several new AI services and capabilities that make it easier for developers to build AI-powered applications. Furthering our commitment to building the most productive AI platform, we’re delivering key new innovations in Azure Machine Learning that simplify building, training and deploying machine learning models at scale. These include new automated machine learning advancements and an intuitive UI that make developing high-quality models easier; a new visual machine learning interface that provides a zero-code model creation and deployment experience using drag-and-drop capabilities; and new machine learning notebooks for a rich, code-first development experience. Furthermore, new MLOps (DevOps for machine learning) capabilities with Azure DevOps integration provide developers with reproducibility, auditability and automation across the end-to-end machine learning lifecycle. To enable extremely low-latency and cost-effective inferencing, we are also announcing the general availability of hardware-accelerated models that run on FPGAs, as well as ONNX Runtime support for NVIDIA TensorRT and Intel nGraph for high-speed inferencing on NVIDIA and Intel chipsets.
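As a rough illustration of that inferencing path, the sketch below uses the open-source ONNX Runtime Python package to load a model and request the TensorRT execution provider, falling back to CUDA or CPU where it is unavailable. The model file name, input name and input shape are placeholders for illustration, not part of the announcement.

```python
# Minimal ONNX Runtime inferencing sketch (hypothetical model and input shape).
import numpy as np
import onnxruntime as ort

# Prefer the NVIDIA TensorRT execution provider when present; ONNX Runtime
# falls back to the next provider in the list otherwise.
session = ort.InferenceSession(
    "model.onnx",  # placeholder path to an exported ONNX model
    providers=["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumed image-like shape

outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```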
Azure Cognitive Services give connected devices, bots and apps the ability to see, hear, respond, translate, reason and more. Azure is the only public cloud that enables these Cognitive Services to be containerized to run on-premises, in the cloud and at the edge. Today we’re giving developers even more ways to create “smart” devices and services with a new Cognitive Services category called “Decision” that delivers specific recommendations to users to enable informed and efficient decision-making. Azure Cognitive Services such as Content Moderator, the recently announced Anomaly Detector and a new preview service called Personalizer, which uses reinforcement learning to provide each user with a relevant experience to drive engagement, will be part of this new category. We’re also delivering several new services in public preview, including Ink Recognizer for embedding digital ink recognition capabilities; Form Recognizer for automating data entry by extracting text, key-value pairs and tables from documents; and a new conversation transcription capability in Speech Service, which transcribes meeting conversations in real time so participants can fully engage in the discussion, know who said what and when, and quickly follow up on next steps.
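To make the “Decision” idea concrete, here is a minimal sketch that sends a small time series to the Anomaly Detector REST endpoint. The endpoint host, key and data values are placeholders, and the request shape is a sketch of the public API rather than anything specific announced here.

```python
# Hypothetical Anomaly Detector call; endpoint, key and series are placeholders.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder resource
key = "<subscription-key>"

body = {
    "granularity": "daily",
    "series": [
        {
            "timestamp": f"2019-04-{day:02d}T00:00:00Z",
            "value": 100.0 if day == 10 else 10.0 + day * 0.1,  # one deliberate spike
        }
        for day in range(1, 15)  # the API expects at least 12 points
    ],
}

resp = requests.post(
    f"{endpoint}/anomalydetector/v1.0/timeseries/entire/detect",
    headers={"Ocp-Apim-Subscription-Key": key},
    json=body,
)
resp.raise_for_status()
print(resp.json().get("isAnomaly"))  # per-point anomaly flags
```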
We are also bringing AI to Azure Search with the general availability of the cognitive search capability, enabling customers to apply Cognitive Services algorithms to extract new insights from their structured and unstructured content. In addition, we are previewing a new capability that lets developers store the AI insights gained from cognitive search, making it easier to create knowledge-rich experiences leveraging Power BI visualizations or machine learning models. You can read more about today’s Azure AI announcements here.
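For context, an index enriched by cognitive search is queried the same way as any other Azure Search index. The sketch below issues a generic query over the REST interface; the service name, index name, field names and API version are assumptions, not details from the announcement.

```python
# Hypothetical query against an Azure Search index enriched by cognitive search.
import requests

service = "<your-search-service>"   # placeholder service name
index = "documents"                 # placeholder index name
api_key = "<query-key>"

resp = requests.post(
    f"https://{service}.search.windows.net/indexes/{index}/docs/search",
    params={"api-version": "2019-05-06"},  # assumed API version
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={"search": "contract renewal terms", "top": 5},
)
resp.raise_for_status()
for doc in resp.json()["value"]:
    # Field names depend on the index definition; these are placeholders.
    print(doc.get("metadata_storage_name"), doc.get("@search.score"))
```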
Intelligent cloud and intelligent edge applications have evolved from primarily low-compute IoT devices working with the cloud to powerful compute at the edge, which requires a new modern hybrid application approach. A key aspect of enabling this is supporting the spectrum of edge compute and data needs. SQL Server and Azure SQL Database are the leading data engines for enterprise workloads on-premises and in the cloud, respectively, and today we are bringing these powerful data and analysis capabilities to the edge with the Azure SQL Database Edge preview. Azure SQL Database Edge runs on ARM processors and provides capabilities such as data streaming and time-series data handling, along with in-database machine learning and graph support. And because Azure SQL Database Edge shares the same programming surface area as Azure SQL Database and SQL Server, you can easily take your applications to the edge without having to learn new tools and languages, preserving consistency in application management and security control. This consistency in the database programming and control plane across cloud and edge is essential to running a secure, well-managed hybrid application.
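A brief sketch of what that shared surface area means in practice: assuming a SQL endpoint reachable at a placeholder address, the same driver, connection string pattern and T-SQL work whether the target is SQL Server, Azure SQL Database or SQL Database Edge. The table and columns below are hypothetical.

```python
# The same T-SQL and ODBC driver work against SQL Server, Azure SQL Database or
# Azure SQL Database Edge; only the server address in the connection string changes.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=edge-device.local,1433;"   # placeholder: could equally be a cloud server
    "DATABASE=telemetry;UID=sa;PWD=<password>"
)

cursor = conn.cursor()
# An aggregation over a hypothetical sensor table that could run identically
# at the edge or in the cloud.
cursor.execute(
    "SELECT sensor_id, AVG(reading) AS avg_reading "
    "FROM dbo.SensorReadings "
    "WHERE reading_time > DATEADD(hour, -1, SYSUTCDATETIME()) "
    "GROUP BY sensor_id"
)
for sensor_id, avg_reading in cursor.fetchall():
    print(sensor_id, avg_reading)
conn.close()
```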
In addition, we’re announcing IoT Plug and Play, a new open modeling language to connect IoT devices to the cloud seamlessly, enabling developers to navigate one of the biggest challenges they face — deploying IoT solutions at scale. Previously, software had to be written specifically for the connected device it supported, limiting the scale of IoT deployments. IoT Plug and Play provides developers with a faster way to build IoT devices and will provide customers with a large ecosystem of partner-certified devices that can work with any IoT solution.
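As a rough sketch of the device side, the snippet below uses the azure-iot-device Python SDK to connect and send a telemetry message. The connection string, model identifier and payload are placeholders, and the model_id parameter reflects how recent SDK versions let a device announce an IoT Plug and Play model, which is an assumption here rather than something stated above.

```python
# Hypothetical device-side telemetry sketch using the azure-iot-device SDK.
import json
from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "<device-connection-string>"   # placeholder
MODEL_ID = "dtmi:com:example:Thermostat;1"          # hypothetical model identifier

# Recent SDK versions allow the device to declare its model when it connects.
client = IoTHubDeviceClient.create_from_connection_string(
    CONNECTION_STRING, model_id=MODEL_ID
)
client.connect()

msg = Message(json.dumps({"temperature": 21.5}))   # placeholder telemetry payload
msg.content_type = "application/json"
msg.content_encoding = "utf-8"
client.send_message(msg)

client.disconnect()
```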
Perhaps one of the best examples of a cloud and edge application is mixed reality, which pairs a mixed-reality device with cloud services to create entirely new experiences. We have barely scratched the surface of what is possible with mixed-reality development. Now we’re making it easier to create applications for HoloLens 2 with the HoloLens 2 Development Edition, which starts at $3,500 or as low as $99 a month. It gives the community of mixed-reality developers access to solutions that help them build and run mixed-reality experiences across a range of devices, along with Azure credits and three-month free trials of Unity Pro and the Unity PiXYZ Plugin for CAD data. Unreal Engine 4 support for streaming and native platform integration for HoloLens 2 will be available by the end of May, enabling developers to create high-quality, photo-realistic renders and immersive augmented-reality and virtual-reality experiences for architecture, product design and manufacturing. Read more about our IoT and intelligent edge announcements here.
Blockchain is showing potential across many industries to manage complex workflows and logistics. Last year we announced Azure Blockchain Workbench, which gave developers a simple UI to model blockchain applications on a preconfigured Azure-supported network. Today we are doubling down on our investments to empower blockchain developers by announcing Azure Blockchain Service, which simplifies the formation, management and governance of consortium blockchain networks, allowing businesses to focus on workflow logic and app development. Azure Blockchain Service deploys a fully managed consortium network and offers built-in governance for common management tasks, such as adding new members, setting permissions and authenticating user applications. The service is already gaining enterprise traction: we also announced this week that J.P. Morgan’s Ethereum platform, Quorum, is the first ledger available in Azure Blockchain Service, enabling Microsoft and J.P. Morgan to offer the first enterprise-grade Ethereum stack to the industry. Read more about today’s blockchain announcements here.
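For orientation, application code talks to a consortium member node over a standard Ethereum JSON-RPC endpoint. The sketch below uses web3.py against a placeholder endpoint URL to check connectivity and read the latest block; the URL format and everything else about the endpoint are assumptions for illustration.

```python
# Hypothetical connectivity check against a consortium member's RPC endpoint.
from web3 import Web3

RPC_URL = "https://<member-node>.blockchain.azure.com:3200/<access-key>"  # placeholder

w3 = Web3(Web3.HTTPProvider(RPC_URL))
if w3.is_connected():
    latest = w3.eth.get_block("latest")
    print("Connected; latest block number:", latest.number)
else:
    print("Could not reach the consortium node")
```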
Today’s announcements give developers cutting-edge tools to create the next generation of hybrid applications spanning the cloud and edge. We have even more to share on Monday at the start of Build. Be sure to tune in to Satya’s keynote at 8:30 a.m. PT on Monday, May 6, here, and we look forward to seeing what you go build!