A privilege of working at Microsoft is being able to glimpse into the future of information technology and envision ways that society can reap the considerable benefits of Big Data: the collection, management, and analysis of data on a massive scale. But this privilege also comes with responsibilities, including an obligation to help ensure strong information privacy protections. Getting this balance right is crucial not only for Microsoft and our peers, but also for policymakers, regulators, industry, educators, and, most importantly, individuals.
About two months ago, I wrote about a series of discussions that we convened to advance a global conversation aimed at generating shared ideas and new thinking in support of alternative approaches to privacy protection. Today, I am happy to share a summary report of these discussions, written by Fred Cate, Distinguished Professor and C. Ben Dutton Professor of Law, Maurer School of Law, Indiana University, and Viktor Mayer-Schönberger, Professor of Internet Governance and Regulation, Oxford Internet Institute, University of Oxford.
Across all six meetings, there was a widely shared sense that “notice and consent” has become, or is perceived as having become, the dominant means of privacy protection. There was equally broad agreement that privacy frameworks relying heavily on individual “notice and consent” are neither sustainable in the face of dramatic increases in the volume and velocity of information flows, nor desirable given the burden they place on individuals. Participants generally agreed that new approaches to privacy protection must shift responsibility away from individuals and toward the organizations that use data, focusing on which uses of that data are permitted and on accountability for responsible data stewardship rather than mere compliance.
If data privacy frameworks are to shift their focus to the use of data, we must first agree on a definition of data use. Such a model will need to address the “harms” or “impacts” of data use, which should include not only physical and financial injury but also broader concepts such as reputational or social harm. It should likewise cover and define outcomes and impacts that create value, such as economic and societal improvements. Defining uses, harms, and related concepts clearly and concretely will be essential to making any new data protection model operational, and will help provide guidelines for enforceable organizational accountability.
Another area for further exploration is evolving privacy protection principles for the era of Big Data. Draft principles presented as a discussion aid at the Microsoft Global Privacy Summit prompted valuable dialogue and generated important feedback. A next step will be to collaboratively improve and refine these evolved principles with public and private sector stakeholders.
With an increased global focus on privacy and a growing need for new legislative frameworks, we know the time is right to advance these discussions. Thanks to those who have taken part so far, and we look forward to providing updates on our progress.