Microsoft has built a new tool that aims to safeguard children from online strangers who befriend them and win their trust. But the system isn’t perfect.
When a child sexual predator comes across a potential victim online, very rarely will they try to begin a sexual relationship with them immediately. Typically, the predator will engage in a method called “grooming”, which involves befriending and establishing trust with the potential victim in order to exploit that trust down the track.
In an effort to combat child exploitation, Microsoft has spent the last 14 months developing a new tool to detect sexual predators attempting to “groom” children online. Named Project Artemis, the tool is designed to analyse text chat for suspicious patterns of communication.
In collaboration with Kik, Roblox, The Meet Group and Thorn, Microsoft has built on its patented technology, which has been in place on the Xbox platform for several years. The product of the collaboration is freely available, via Thorn – a non-profit that builds technology to defend children from sexual abuse – to any online service that offers a chat function.
Microsoft hasn’t disclosed the specific words or patterns the tool aims to identify, since doing so would allow predators to adjust their behaviour and avoid detection. What it has said is that Project Artemis evaluates and “rates” conversation characteristics, assigning each conversation a probability rating. Services implementing the tool can set a rating threshold above which flagged conversations are reviewed by a human moderator.
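The threshold-and-escalation flow described above can be sketched in a few lines. Everything here is illustrative: the conversation scores, the `flag_for_review` function, and the threshold value are assumptions for the sake of the example, not Project Artemis internals, since Microsoft has not published how the rating is computed.

```python
# Hypothetical sketch of a threshold-based moderation pipeline like the one
# described above. The scoring model is assumed to exist elsewhere; here each
# conversation simply carries a precomputed probability-style rating.

from dataclasses import dataclass


@dataclass
class Conversation:
    id: str
    risk_score: float  # probability-like rating in [0, 1] from some classifier


def flag_for_review(conversations, threshold):
    """Return conversations whose rating meets or exceeds the threshold.

    Each service sets its own threshold; flagged conversations are then
    escalated to a human moderator rather than acted on automatically.
    """
    return [c for c in conversations if c.risk_score >= threshold]


# Example usage with made-up scores:
chats = [
    Conversation("a1", 0.12),
    Conversation("b2", 0.87),
    Conversation("c3", 0.64),
]
flagged = flag_for_review(chats, threshold=0.8)
print([c.id for c in flagged])  # only high-rated conversations reach a human
```

Raising the threshold trades recall for moderator workload: fewer conversations are flagged, but more risky ones may slip through, which is one reason the tool leaves the choice to each implementing service.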
The technology isn’t perfect. False positives will be a flaw, since automated systems still struggle to understand context as well as a human can. The system also assumes users consent to their private communications being read, which is not always the case. Unencrypted sensitive information can also be viewed by the moderator.
Microsoft itself admits the project is no panacea. However, if companies adopt Project Artemis and the tool proves at all effective, it is a step in the right direction, and at least some children are likely to be made safer, making the time and effort put into it worthwhile.
Microsoft has invited contributions and engagement from other tech companies and organisations with the aim of continuous improvement and refinement.