The invention of differential privacy was ahead of its time. The technology, pioneered by Microsoft researchers 15 years ago, makes it possible to extract useful insights from datasets while safeguarding the privacy of the individuals included in the data. What was needed to realize its full potential? The marriage of cloud computing and artificial intelligence (AI), which allows for the sharing and analysis of huge amounts of data while ensuring that individual privacy is protected.
Over the past year, Microsoft collaborated with the OpenDP Initiative, led by Harvard’s Institute for Quantitative Social Science (IQSS) and School of Engineering and Applied Sciences (SEAS), and together we launched the open-source differential privacy platform, SmartNoise. We’re excited about the results and the learnings we’ve collected to date. My colleague Sarah Bird recently wrote about those learnings and how Microsoft is adopting differential privacy into some of our products. Differential privacy has also become a powerful new tool in Microsoft’s privacy and security ecosystem. Externally, we’re working with partners to explore how differential privacy applies in real-world scenarios, and today we’re launching the SmartNoise Early Adopter Acceleration Program to attract more partners.
Differential privacy within Microsoft’s security and privacy landscape
Today, data is the fuel that drives innovation. However, legitimate security and privacy concerns restrict the ability to fully unlock the power of data. That’s understandable when you consider that data is often the most valuable asset an organization or an individual has. Microsoft is developing a range of new technologies, including Azure Confidential Computing, homomorphic encryption, secure multi-party computation and federated learning, to provide stronger protections and eliminate many types of threats. Each of these technologies is a valuable addition to our portfolio because no single technology solves every type of problem. By using them together, however, we are able to build solutions with unprecedented levels of privacy and security.
Encrypting data at rest and in transit is now industry standard. Azure Confidential Computing extends this protection to data in use – that is, during computation – in a secure hardware environment. This reduces your exposure to threats like malware, insider attacks and malicious or negligent administrators.
By adding differential privacy to our suite of security and privacy technologies, Microsoft is providing another step in this journey. Differential privacy ensures that the result of a computation is safe to share or use. When data is released with differential privacy applied, it carries the guarantee that no individual in the dataset can be re-identified. SmartNoise provides organizations with additional confidence in fields like financial services and health care, where securing highly sensitive data and protecting privacy are both necessities.
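To make the idea concrete, here is a minimal illustrative sketch of the classic Laplace mechanism that underlies many differentially private releases. This is a teaching example, not the SmartNoise API itself: noise calibrated to the query’s sensitivity and a chosen epsilon is added to the true answer, so that any one person’s presence or absence barely changes the released result.

```python
import numpy as np

def dp_count(data, predicate, epsilon):
    """Differentially private count via the Laplace mechanism.

    A count query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so noise drawn from
    Laplace(scale = 1 / epsilon) suffices for epsilon-DP.
    """
    true_count = sum(1 for row in data if predicate(row))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical toy dataset: ages of six individuals.
ages = [34, 29, 62, 45, 51, 38]

# Noisy count of people over 40 (true answer is 3).
print(dp_count(ages, lambda a: a > 40, epsilon=1.0))
```

Smaller epsilon values mean more noise and stronger privacy; larger values mean more accuracy and weaker privacy. Production systems like SmartNoise add much more on top of this, such as sensitivity analysis and budget accounting.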
With innovations like SmartNoise and Azure Confidential Computing, Microsoft is providing the tools and technology to keep individuals’ data secure and private throughout its life cycle, from initial collection all the way through to the intelligence it delivers.
Differential privacy in practice
In addition to Harvard’s IQSS and SEAS, Microsoft is also working with several partners to explore the potential for differential privacy.
One of our thought leaders and partners is Educational Results Partnership (ERP), a nonprofit organization that applies data science to improve student outcomes and career readiness throughout the educational system. ERP has accumulated the largest database in the US on student achievement from kindergarten to students’ entry into the labor market. Their mission is to use actionable data to close equity gaps in education and the labor market by improving academic and workforce outcomes for students in traditionally disenfranchised communities and populations.
Dr. Jim Lanich, ERP’s president and CEO said, “ERP’s data-informed approach relies on collecting data from educational and government institutions throughout the United States. We’re excited to be partnering with Microsoft on the development of a differential privacy application that will allow organizations to look deeper into their data while strengthening privacy protections for students and individuals. The ability to draw more meaningful insights from the data will lead to action that can improve outcomes and close equity gaps.”
Another key partner is Humana, a health care company whose goal is to improve the health of their millions of members by delivering simple and easy health care experiences that lead to differentiated health outcomes. To achieve their goal, Humana is investing in data, analytics and digital health technologies to share data across all parties delivering care.
Slawek Kierner, Humana’s SVP of Enterprise Data and Analytics said, “Collaboration is key in tackling the challenges in health care. Having tools that can protect the privacy of individuals while preserving the underlying information is key. At Humana, we are exploring how differential privacy can enable us to share data with partners like researchers, community organizations, and academics to better serve our members while protecting their privacy.”
In addition, Microsoft is partnering with the Open Data Institute on an Education Open Data Challenge to generate innovative solutions to close the digital divide and improve learning outcomes in K-12 education. Among other resources, participants in the challenge will receive access to Microsoft’s US broadband usage data with differential privacy applied to protect individuals’ privacy. This dataset was initially created to help the FCC and policymakers bridge the digital divide. By opening up the data further via differential privacy, we are enabling a whole new use case to help solve some of the world’s educational challenges. We encourage those interested to register. You can find more information here.
SmartNoise Early Adopter Acceleration Program
We’re excited about the progress we’ve made in just a little over a year through our collaboration with Harvard and the OpenDP initiative. Our partners leveraging SmartNoise and differential privacy have taught us a great deal about how SmartNoise can advance the sharing of data and insights. But there is more work to be done, and we are looking for additional partners to help with this effort.
We are introducing the SmartNoise Early Adopter Acceleration Program to support usage and adoption of SmartNoise and OpenDP. This collaboration program with the SmartNoise team aims to accelerate the adoption of differential privacy in solutions today that will open data and offer insights to benefit society.
If you have a project that would benefit from using differential privacy, we invite you to apply. We will accept applications through February 1, 2021. Selected applicants will be notified by March 1, 2021.
Selected teams will receive technical and conceptual support for incorporating SmartNoise and differential privacy into their solutions. Collaboration activities include:
- SmartNoise and OpenDP technical assistance and guidance
- Differential privacy methodology reviews
- Guidance and feedback on privacy budgets, setting parameters and managing epsilon
- Design and architecture reviews and consultation
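Privacy budgets and epsilon management, mentioned in the list above, can be sketched with a simple accountant. This is a hypothetical helper for illustration, not part of SmartNoise: under basic sequential composition, the epsilons of successive releases add up, and a release is refused once the total budget would be exceeded.

```python
class PrivacyBudget:
    """Tracks cumulative privacy loss under basic sequential
    composition: the total epsilon of k releases is the sum of
    their individual epsilons."""

    def __init__(self, total_epsilon):
        self.total = total_epsilon
        self.spent = 0.0

    def spend(self, epsilon):
        """Charge a release against the budget, or refuse it."""
        if self.spent + epsilon > self.total:
            raise RuntimeError("privacy budget exhausted")
        self.spent += epsilon

    def remaining(self):
        return self.total - self.spent

# Hypothetical usage: a total budget of epsilon = 1.0 split
# across several queries.
budget = PrivacyBudget(total_epsilon=1.0)
budget.spend(0.25)   # first query
budget.spend(0.25)   # second query
print(budget.remaining())  # → 0.5
```

Real accountants are more sophisticated (advanced composition, per-user budgets), but the core discipline is the same: every answer released consumes part of a finite epsilon, and deciding how to allocate it is one of the design questions the program helps teams work through.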
And, since there is more work to do, there will be more progress and learnings to share. Microsoft is proud to be part of this first-of-its-kind open-source differential privacy platform with Harvard’s IQSS and SEAS and the OpenDP community, and we are committed to engaging with developers, researchers and companies as this project moves forward. If you already incorporate differential privacy into your work, we welcome your thoughts or feedback about SmartNoise on GitHub.