We’ve all seen various online tools for reporting instances of cyberbullying, harassment or other forms of digital abuse to technology companies. But how many of us have experienced or witnessed cruel or malicious treatment online, or stumbled upon inappropriate content, and actually made use of these resources?
On Nov. 5-7, the International Bullying Prevention Association is holding its 14th annual meeting in Nashville, Tennessee, where a Microsoft representative will participate. It’s also a fitting time to remind young people and adults of the many ways they can report inappropriate behavior and illegal or harmful content to us on our hosted consumer services like OneDrive, Outlook.com, Xbox Live and Skype.
In addition to specific in-product or in-service links to report abuse or concerns, Microsoft has a series of topic-specific web forms available to report non-consensual pornography (“revenge porn”), terrorist content and hate speech. These issues, as well as bullying, harassment and other inappropriate conduct, are all violations of Microsoft’s Code of Conduct as detailed in the Microsoft Services Agreement. Conversely, if consumers feel their content was removed or their account was closed in error, they can complete this form to request reinstatement.
Teen Council for Digital Good recognizes the importance of reporting
We raised the topic of “abuse reporting” with members of our inaugural Council for Digital Good this past summer. The council is a group of 15 teens from 12 U.S. states, chosen as part of a pilot program for young people. Since being selected in April, the teens have been sharing their thoughts and perspectives about life online, and learning about and championing Microsoft’s work in digital civility: promoting safer and healthier online interactions by leading with empathy and showing respect for differing views and opinions, among other things.
At our first council summit in August, we were discussing digital life generally when the conversation turned to the importance of reporting online abuse. No fewer than three specific mentions of this phrase – a common one used by consumer tech companies, civil society groups, governments and others – were met with blank stares and glazed eyes from the council members, until one teen exclaimed, “Your report matters!” This comment from Christina, a teen council member from Georgia, prompted further discussion and clearly struck a chord with several others.
In fact, in completing their on-site assignments – each teen drafted an individual, multi-point “manifesto” for life online as part of their summit work – at least half of the council members noted the importance of reporting cruel, abusive and inappropriate content and conduct to technology service providers. Through our summit conversations with the teens, we learned why the phrase “report abuse” wasn’t resonating with them: although they themselves or those in their social circle may have been called unkind names, sparked or witnessed “drama,” or even been bullied online, in their minds the interactions didn’t rise to the level of what they would consider “abuse.” This is one reason they were disinclined to report such instances to online companies. Talk about an “aha” moment!
Reporting protects individual consumers and the ecosystem
“It’s essential that we all take some responsibility in reporting bad content when we see it – even if we’re just quickly scrolling through our social media feeds,” says 16-year-old Christina. “Too often, people feel their report won’t have an impact because they assume someone already made one or someone else eventually will. If we all have that mindset, then of course no action can be taken by technology companies to remove harmful content. We need to make people realize that everyone’s reports really do matter. In the long run, they’ll help make the internet a much more positive space.”
Thanks to Christina and the other council members, we’re rethinking and simplifying the names we give to the resources that enable customers to report bad content and bad actors. Indeed, Microsoft has a business interest in protecting our customers and the integrity of our services by removing illegal and harmful content and addressing prohibited conduct. And customer reporting plays an important role in achieving those aims. Depending on the egregiousness of the offense, different Microsoft consumer services take different enforcement actions. Distributing illegal content or committing other grave violations results in account closure.
To learn more about reporting illegal and harmful content or inappropriate conduct on Microsoft-hosted consumer services such as Xbox Live, Outlook.com, OneDrive and Skype, see our dedicated web pages and forms. To read more about Microsoft’s Council for Digital Good and digital civility, visit our website, review our resources, “like” us on Facebook and follow us on Twitter.
At the time of writing this post, Jacqueline Beauchere’s title was Chief Online Safety Officer.
Tags: Online Safety