Microsoft Releases Chat Scanner to Detect Child Sex Predators

Online grooming is a bigger problem than human moderators can handle.  

An attendee types on a Cyrillic laptop computer keyboard. (Photographer: Andrey Rudakov/Bloomberg)

(Bloomberg) -- Microsoft Corp. will share a tool it’s been using on its Xbox gaming service to scan online text chats and detect adults seeking to groom and exploit children for sexual purposes. 

Codenamed Project Artemis, the technique combs through historical messages and looks for indicative patterns and characteristics before assigning a probability rating. That can then be used by companies to decide which conversations on their platforms should get a closer look by a human moderator, wrote Courtney Gregoire, Microsoft’s chief digital safety officer, in a blog post.
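Gregoire's description amounts to a score-and-triage pipeline: rate each conversation, then route only the riskiest ones to a human moderator. The sketch below, in Python, illustrates that flow; the keyword patterns, function names, scoring rule and threshold are invented for illustration, since Microsoft has not published Project Artemis's actual model or interface.

```python
# Hypothetical sketch of the triage workflow described above: score each
# conversation, then queue only the highest-risk ones for human review.
# The pattern list, scoring rule, and threshold are illustrative stand-ins,
# not Microsoft's Project Artemis model, which relies on learned patterns
# rather than simple keyword matching.
from dataclasses import dataclass


@dataclass
class Conversation:
    conversation_id: str
    messages: list[str]


# Toy indicators standing in for the "patterns and characteristics"
# the real system is said to look for.
RISK_PATTERNS = ("how old are you", "don't tell your parents", "send a photo")


def grooming_probability(convo: Conversation) -> float:
    """Return a crude 0-1 risk score based on how many indicators appear."""
    text = " ".join(convo.messages).lower()
    hits = sum(1 for pattern in RISK_PATTERNS if pattern in text)
    return min(1.0, hits / len(RISK_PATTERNS))


def flag_for_review(convos: list[Conversation], threshold: float = 0.5) -> list[str]:
    """Return IDs of conversations a human moderator should inspect."""
    return [c.conversation_id for c in convos
            if grooming_probability(c) >= threshold]


if __name__ == "__main__":
    sample = [
        Conversation("c1", ["great match!", "gg"]),
        Conversation("c2", ["how old are you?", "don't tell your parents we talked"]),
    ]
    print(flag_for_review(sample))  # -> ['c2']
```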

Tech companies are grappling with how to stem a rising tide of child pornography and exploitation online as images and nefarious texts overwhelm moderators and private chat apps make detection tougher. Companies in the industry reported 45 million online images of child sexual abuse in 2018, a record high, the New York Times reported in September. Adult predators use built-in chat functions on popular video games and private messaging apps to groom children and solicit nude photos, sometimes by posing as kids themselves.

Microsoft’s so-called grooming detection technique promises to help rein in that behavior in text chats, but it does not address voice chat in multiplayer games such as Fortnite, another avenue for child sex predators.

The project started at a November 2018 hackathon, co-sponsored with two child welfare groups, that looked not just at new technology ideas but also at legal and policy issues. Since then, Microsoft has been developing the tool in collaboration with the companies behind the online video game Roblox and the messaging app Kik; The Meet Group, which owns social meeting apps such as MeetMe and Skout; and Thorn, a non-profit organization co-founded by actors Ashton Kutcher and Demi Moore to fight child sex abuse.

The team was led by Dartmouth College Computer Science Professor Hany Farid, who previously worked with Microsoft to build PhotoDNA, a tool that’s been used by 150 companies and organizations to find and report images of child sexual exploitation. Farid has written in opposition to the proliferation of end-to-end encryption in social and private messaging services, arguing that it makes detecting and preventing child abuse more difficult.

Starting Jan. 10, Thorn will handle licensing of Project Artemis, which is built on Microsoft patents and available for free to qualifying online services, which can sign up for it by emailing antigrooming@thorn.org. Microsoft said it is already using the technique for Xbox chats and is looking at doing the same for Skype.

To contact the editor responsible for this story: Vlad Savov at vsavov5@bloomberg.net

©2020 Bloomberg L.P.