Expand existing solutions to new countries
Most existing solutions are only used in a small number of areas, even though they could be useful in most countries.
Detect CSAM on encrypted devices
Nearly half of surveyed police officers reported that encryption is the biggest challenge they face in child sexual abuse investigations.
NetClean Report 2019
Digital Post Crime Solutions
This area currently has very few tools and projects, indicating it might be low-hanging fruit that could bring significant improvements.
Detect self-production of live-streaming
Voluntary and induced (through grooming or sexual extortion) self-produced live-streamed child sexual abuse were both reported to be common types of live-streamed material in investigations.
NetClean Report 2019
Automatic in-chat grooming detection
There has been an increase in grooming cases on social media and gaming platforms, and there is a relative lack of tools and projects aimed at detecting it.
NetClean Report 2019
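As a minimal illustration of the in-chat detection idea, the sketch below scores a sliding window of recent chat messages with a simple text classifier. The training examples, labels, and window size are hypothetical; a real system would train on a large labelled corpus and use a far stronger model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical, tiny training set; a real system would use a large labelled corpus.
train_texts = [
    "what school do you go to, don't tell your parents we talk",
    "good game, want to queue again tomorrow?",
]
train_labels = [1, 0]  # 1 = grooming indicators, 0 = benign

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(train_texts, train_labels)

def grooming_score(messages, window=10):
    """Score the most recent `window` chat messages; returns a probability."""
    text = " ".join(messages[-window:])
    return model.predict_proba([text])[0][1]
```

A platform could run such a score on every active conversation and surface high-scoring ones to human moderators rather than acting on the score alone.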
Therapy Chatbot for Abuse Survivors
A therapy chatbot could anonymously help children cope with trauma, and could be a first step towards talking to a parent or a therapist.
Advanced Network Analysis
Advanced network analysis using deep learning could flag suspicious accounts and prevent grooming attempts on child-friendly platforms (social media, gaming sites).
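One simple network signal such an analysis could build on is accounts that initiate contact with unusually many minors. The sketch below computes only that one feature; the tuple format and threshold are assumptions, and a real system would feed many such features into a trained graph model instead of a fixed cut-off.

```python
from collections import defaultdict

def flag_mass_contacters(messages, min_minor_contacts=20):
    """messages: iterable of (sender_id, recipient_id, recipient_is_minor) tuples.
    Returns account ids that initiate contact with unusually many distinct minors."""
    contacted = defaultdict(set)
    for sender, recipient, recipient_is_minor in messages:
        if recipient_is_minor:
            contacted[sender].add(recipient)
    return [acc for acc, minors in contacted.items() if len(minors) >= min_minor_contacts]
```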
Automatic flagging of CSAM
Automatic flagging of CSAM on relevant platforms "at its source" helps remove material before it is circulated.
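A common way to flag known material at upload time is to match file hashes against a database maintained by hotlines or clearinghouses. The sketch below uses a plain SHA-256 digest only to illustrate the flow; deployed systems use perceptual hashes (such as PhotoDNA) that survive re-encoding and cropping.

```python
import hashlib

KNOWN_HASHES = set()  # in practice supplied by a hotline or clearinghouse database

def check_upload(file_bytes):
    """Decide what to do with an uploaded file before it becomes visible to anyone."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        return "block_and_report"  # never circulated; escalated to reviewers/authorities
    return "allow"
```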
Advanced Visual Recognition of Child Sexual Abuse
Computer vision AI with advanced age, object, voice, location, and facial recognition capabilities to gather more context on each image, enabling faster and more accurate victim and perpetrator identification.
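The value of this idea is in fusing several recognition signals into one context record per image for investigators. The sketch below only shows that fusion step; the age, object, and face-index models are placeholders for whatever recognition components an implementation would actually use.

```python
from dataclasses import dataclass, field

@dataclass
class ImageContext:
    estimated_ages: list = field(default_factory=list)   # one estimate per detected person
    objects: list = field(default_factory=list)          # e.g. scene or clothing labels
    matched_faces: list = field(default_factory=list)    # hits against known victims/offenders

def build_context(image, age_model, object_model, face_index):
    """age_model and object_model are callables, face_index exposes lookup();
    all three are hypothetical placeholders for real recognition models."""
    return ImageContext(
        estimated_ages=age_model(image),
        objects=object_model(image),
        matched_faces=face_index.lookup(image),
    )
```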
Automatic risk analysis of text
Language processing AI for advanced risk detection, analyzing the text of any given piece of content to assess abuse risk "pre-facto", before it is published.
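A "pre-facto" check naturally takes the form of risk-based routing before publication. In the sketch below, `risk_model` stands in for any text classifier returning a probability in [0, 1], and the two thresholds are hypothetical tuning parameters.

```python
def route_text(text, risk_model, block_at=0.9, review_at=0.5):
    """risk_model: any text classifier returning an abuse-risk probability in [0, 1]."""
    score = risk_model(text)
    if score >= block_at:
        return "block"         # stopped before publication
    if score >= review_at:
        return "human_review"  # queued for a moderator
    return "publish"
```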
Video Clustering for Identification
Data-rich pattern recognition AI for advanced grouping and prioritization tools, using deep learning to cluster videos with the same voice, location and child’s face.
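One way to realise this grouping is to concatenate voice, face, and scene embeddings per video and cluster them. The sketch below assumes hypothetical embedding callables and uses scikit-learn's DBSCAN as a stand-in for whatever clustering method a real tool would choose.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_videos(videos, voice_emb, face_emb, scene_emb, eps=0.5):
    """videos: list of decoded videos; *_emb: callables returning 1-D feature vectors."""
    features = np.stack([
        np.concatenate([voice_emb(v), face_emb(v), scene_emb(v)])
        for v in videos
    ])
    # Videos in the same cluster likely share a victim, room, or device,
    # so investigators can review and prioritise them as one group.
    return DBSCAN(eps=eps, min_samples=2, metric="cosine").fit_predict(features)
```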
On-device Content Blocking
On-device content blocking AI built into mobile hardware for real-time, dynamic filtering and blocking.
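The core of on-device blocking is a local classifier sitting between decoding and display. The sketch below shows only that gate; `local_model` and the threshold are placeholders for a small hardware-accelerated model and its tuned operating point.

```python
def render_if_safe(image_bytes, local_model, threshold=0.8):
    """local_model: a small on-device classifier returning an abuse probability."""
    if local_model(image_bytes) >= threshold:
        return None          # blocked: never displayed or stored
    return image_bytes       # passed through to the renderer
```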
Therapy Chatbot for Perpetrators
Chatbots where perpetrators can anonymously get help coping with their desires. The chatbot can also refer them to anonymous hotlines.
AI text content moderators
AIs that can flag and remove content as it is created, based on the policies of specific online communities.
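A minimal sketch of this per-community moderation hook is below, assuming a hypothetical policy table and a classifier that returns a policy-violation probability; child-oriented communities could simply get stricter thresholds.

```python
# Hypothetical per-community policies: stricter thresholds for child-oriented spaces.
POLICIES = {
    "kids_gaming_forum": {"remove_at": 0.6},
    "general_forum": {"remove_at": 0.9},
}

def moderate(post_text, community, classifier):
    """classifier returns a policy-violation probability in [0, 1]."""
    if classifier(post_text) >= POLICIES[community]["remove_at"]:
        return "remove_and_flag"  # removed at creation time, logged for review
    return "keep"
```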