Expert: Public and policymakers need to pay more attention to child abuse online

Jun 11, 2015   |   Jacqueline Beauchere

Combating the online availability of child sexual abuse material (CSAM) continues to pose a significant challenge worldwide, and a lack of understanding among policymakers and the global public further complicates the issue, a leading expert said.

Speaking at a recent joint law enforcement/INHOPE conference on prevention and awareness of CSAM, Ernie Allen enumerated a number of key challenges, including an absence of public and policymaker awareness. Allen, the former president and CEO of the U.S. National Center for Missing and Exploited Children (NCMEC), delivered the event keynote, highlighting the explosion of online child exploitation and the need for multi-stakeholder cooperation – among governments, law enforcement agencies, technology companies and the NGO community – to address it.

“The victims in these photos suffer real harm,” Allen said, noting that NCMEC processed an astounding 24 million images and videos in 2014, a 48,000-percent increase since 2002, when the organization first created its Child Victim Identification Program (CVIP) and accompanying image database. “In 84 percent of the identified images from CVIP, there was physical penetration” by perpetrators, he added.

Of the thousands of victims that NCMEC identified last year, 75 percent were prepubescent (largely ages 12 and younger), including 10 percent who were infants and toddlers, Allen said. Reports made last year to INHOPE’s 52 international hotlines and helplines, of which NCMEC is one, paint a slightly grimmer picture: 79 percent of the victims were prepubescent, including 7 percent identified as infants and toddlers.

And, “we know there is dramatic underreporting,” Allen continued. Estimates show reporting of child sexual abuse in general has climbed to approximately one in three cases, a 20-year high. Yet, when that abuse is chronicled in a photo or video, reporting drops to near zero. “The children are harmed when they are abused, and that harm is compounded when the abuser captures the abuse and makes it permanent by taking photos and distributing them on the Internet,” he added.

Still, now more than ever, along with the gravity of the problem come fresh attention and new initiatives, poised to further curb the online availability and distribution of CSAM. Among them is #WePROTECT Children Online, started by U.K. Prime Minister David Cameron and advanced among more than 50 countries at the first #WePROTECT Summit last December. Allen chairs the #WePROTECT International Advisory Board (IAB), formed earlier this year. The IAB aims to grow cooperation and commitment among a multitude of private- and public-sector actors to help combat the problem. I have the privilege of representing the technology industry on the board, which has four other members: a representative each from INHOPE’s hotlines and helplines, the NGO community, global law enforcement and the U.S. government. The second #WePROTECT Summit is scheduled for November in the UAE.

For Microsoft’s part, working to combat CSAM online has been a priority for more than a decade. This includes our work to develop PhotoDNA – a technology that helps companies and organizations find and remove from the Internet some of the worst known images of child sexual abuse and exploitation. Microsoft uses PhotoDNA on our own services, including OneDrive, and we license the technology for free to more than 60 other international organizations. Earlier this year, Microsoft also created the free PhotoDNA Cloud Service for qualified customers, available via Azure Marketplace as another means of preventing the spread of this heinous imagery.
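PhotoDNA’s actual algorithm is proprietary and is not reproduced here, but the general approach it represents – deriving a compact, robust fingerprint from an image and comparing it against a database of fingerprints of known abuse imagery – can be sketched with a toy "average hash." Everything below (the hash function, the distance threshold, the sample data) is an illustrative assumption, not Microsoft’s implementation:

```python
# Illustrative sketch only: PhotoDNA's real algorithm is proprietary.
# A toy "average hash" demonstrates the hash-and-match idea: fingerprint
# an image, then flag it if the fingerprint is within a small Hamming
# distance of any fingerprint in a database of known images.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints) to an int.

    Each bit is 1 if the corresponding pixel is brighter than the
    image's mean brightness, so small edits barely change the hash.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(pixels, known_hashes, max_distance=2):
    """Return True if the image is 'near' any known fingerprint."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# Tiny 2x4 "images": img_b is img_a with one pixel altered, the kind of
# small change a robust hash is designed to see through.
img_a = [[10, 200, 10, 200], [200, 10, 200, 10]]
img_b = [[10, 200, 10, 200], [200, 10, 200, 200]]

known = [average_hash(img_a)]
print(matches_known(img_b, known))  # near-duplicate still matches
```

The key design point this illustrates is robustness: unlike a cryptographic hash, where changing one pixel produces an unrelated digest, a perceptual fingerprint lets services detect re-encoded or slightly edited copies of known images without ever redistributing the images themselves.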

More can certainly be done. And, initiatives like #WePROTECT are a strong model for bringing together like-minded thinkers and actors, who – together – can undoubtedly make a difference.

To learn more about staying safer online in general, visit our website and check out our educational resources at the Microsoft YouthSpark Hub.
