Online child exploitation is a horrific crime that requires a whole-of-society approach. Microsoft has a long-standing commitment to child online protection. First and foremost, as a technology company, we have a responsibility to create software, devices and services that have safety features built in from the outset. We leverage technology across our services to detect, disrupt and report illegal content, including child sexual exploitation. And we innovate and invest in tools, technology and partnerships to support the global fight needed to address online child sexual exploitation.
In furtherance of those commitments, today Microsoft is sharing a grooming detection technique, code-named “Project Artemis,” by which online predators attempting to lure children for sexual purposes can be detected, addressed and reported. Developed in collaboration with The Meet Group, Roblox, Kik and Thorn, this technique builds on Microsoft-patented technology and will be made freely available via Thorn to qualified online service companies that offer a chat function. Thorn is a nonprofit that builds technology to defend children from sexual abuse.
The development of this new technique began in November 2018 at a Microsoft “360 Cross-Industry Hackathon,” which was co-sponsored by the WePROTECT Global Alliance in conjunction with the Child Dignity Alliance. These “360” hackathons are multifaceted, focusing not just on technology and engineering but also on legal and policy considerations and on operational implementation. Today’s announcement marks the technical and engineering progress made over the last 14 months by a cross-industry v-team from Microsoft, The Meet Group, Roblox, Kik, Thorn and others to help identify potential instances of child online grooming for sexual purposes and to operationalize an effective response. The teams were led by Dr. Hany Farid, a leading academic who, in 2009, partnered with Microsoft and Dartmouth College on the development of PhotoDNA, a free tool that has assisted in the detection, disruption and reporting of millions of child sexual exploitation images and is used by more than 150 companies and organizations around the world.
Building on the Microsoft patent, the technique is applied to historical text-based chat conversations. It evaluates and “rates” conversation characteristics and assigns an overall probability rating. This rating can then be used as a threshold, set by each company implementing the technique, that determines when a flagged conversation should be sent to human moderators for review. Human moderators would then be able to identify imminent threats for referral to law enforcement and to report incidents of suspected child sexual exploitation to the National Center for Missing and Exploited Children (NCMEC). NCMEC, along with ECPAT International, INHOPE and the Internet Watch Foundation (IWF), provided valuable feedback throughout the collaborative process.
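The announcement does not disclose how the rating model works internally. As a rough illustration of the triage flow described above (score each historical conversation, compare against a company-set threshold, and queue matches for human review), here is a minimal sketch in Python. The `risk_score` keyword heuristic, the `Conversation` type and all names are hypothetical stand-ins, not the actual technique:

```python
from dataclasses import dataclass


@dataclass
class Conversation:
    id: str
    messages: list[str]


def risk_score(conversation: Conversation) -> float:
    """Hypothetical stand-in for the rating model.

    Returns a probability-like score in [0.0, 1.0]. A trivial keyword
    heuristic is used purely for illustration; the real technique
    evaluates many conversation characteristics.
    """
    indicators = {"secret", "how old", "don't tell", "alone"}
    hits = sum(
        any(ind in msg.lower() for ind in indicators)
        for msg in conversation.messages
    )
    return min(1.0, hits / max(len(conversation.messages), 1))


def triage(conversations: list[Conversation], threshold: float) -> list[str]:
    """Return IDs of conversations whose score meets the company-set
    threshold, i.e. those to be queued for human moderator review."""
    return [c.id for c in conversations if risk_score(c) >= threshold]


# Each adopting service chooses its own threshold.
convos = [
    Conversation("a1", ["hey, how old are you?", "keep this a secret"]),
    Conversation("b2", ["did you finish the raid?", "nice, gg"]),
]
flagged = triage(convos, threshold=0.5)  # → ["a1"]
```

The key design point the announcement makes is that the threshold is not fixed by the tool: each service tunes it to its own moderation capacity and risk tolerance, and flagged conversations always go to human reviewers rather than triggering automated action.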
Beginning on January 10, 2020, licensing and adoption of the technique will be handled by Thorn. Companies and services wanting to test and adopt the technique can contact Thorn directly at email@example.com. Microsoft has been leveraging the technique in programs on our Xbox platform for several years and is exploring its use in chat services, including Skype.
“Project Artemis” is a significant step forward, but it is by no means a panacea. Child sexual exploitation and abuse online, and the detection of online child grooming, are weighty problems. But we are not deterred by their complexity. On the contrary, we are making the tool available now to invite further contributions and engagement from other technology companies and organizations, with the goal of continuous improvement and refinement.
At Microsoft, we embrace a multi-stakeholder model to combat online child exploitation that includes survivors and their advocates, government, tech companies and civil society working together. Combating online child exploitation should and must be a universal call to action.