| Area | Short description | Reason for Being Key Area | Crime Phase | Technology | Purpose |
Most existing solutions are deployed in only a small number of countries, even though they could be useful in most.
Nearly half of the surveyed police officers reported that encryption is the biggest challenge they face in child sexual abuse investigations, according to the NetClean Report 2019.
Social media companies are sometimes so slow to provide information requested by the police that the evidence is gone by the time it arrives.
This area currently has very few tools/projects, suggesting it may be low-hanging fruit where modest effort could bring great improvements.
Both voluntary and induced (through grooming or sexual extortion) self-produced live-streamed child sexual abuse were reported to be common types of live-streamed material in investigations (NetClean Report 2019).
There has been an increase in grooming cases on social media and gaming platforms, and a relative lack of tools/projects aimed at detecting it.
A therapy chatbot could anonymously help children cope with trauma, and could be a first step towards talking to a parent or a therapist.
Advanced network analysis using deep learning flags suspicious accounts and prevents grooming attempts on child-friendly platforms (social media, gaming sites)
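In the simplest case, the network analysis described above can be a heuristic over the messaging graph. A minimal sketch (the account names, thresholds, and `suspicious` helper are all hypothetical; a production system would learn such features with a deep graph model rather than hand-code them):

```python
def suspicious(account, messaged, minors):
    """Flag an account that contacts many minor accounts with almost
    no reciprocated contact -- a crude stand-in for learned features.

    messaged: dict mapping each account to the set of accounts it messaged.
    minors:   set of accounts known to belong to minors.
    """
    contacts = messaged.get(account, set())
    minor_contacts = contacts & minors
    # Contacts that also messaged this account back.
    reciprocated = {c for c in contacts if account in messaged.get(c, set())}
    # Illustrative thresholds: many one-way contacts aimed at minors.
    return len(minor_contacts) >= 5 and len(reciprocated) <= 1
```

A learned model would replace the two hand-picked thresholds with features trained on labeled grooming cases, but the graph quantities (out-degree toward minors, reciprocity) are the same kind of signal.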
Automatic flagging of CSAM on relevant platforms “at its source” helps remove material before it circulates.
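“At its source” flagging is typically hash matching against a database of known material. A minimal sketch, assuming a hypothetical hash set; real deployments use perceptual hashes (such as PhotoDNA) so that near-duplicates also match, rather than the cryptographic SHA-256 used here for illustration:

```python
import hashlib

# Hypothetical database of hashes of known, already-verified material.
# Built here from a placeholder item purely so the sketch is runnable.
KNOWN_HASHES = {hashlib.sha256(b"example-known-item").hexdigest()}

def flag_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes exactly match a known hash."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES
```

The design choice of hashing (rather than classifying) means the platform never needs to store or inspect the material itself, only opaque digests supplied by a trusted clearinghouse.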
Computer vision AI with advanced age, object, voice, location and facial recognition capabilities to gather more context on an image for more accurate and quicker victim and perpetrator identification.
Language-processing AI for advanced risk detection, analyzing text to assess the abuse risk of any given piece of content “pre-facto”.
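A heavily simplified sketch of text-based risk scoring, using a hypothetical keyword lexicon and threshold in place of the trained language model such a tool would actually use:

```python
# Hypothetical risk lexicon: term -> weight. A production system would
# use a trained classifier, not keyword weights.
RISK_TERMS = {"secret": 2, "don't tell": 3, "meet alone": 3, "your age": 1}

def risk_score(message: str) -> int:
    """Sum the weights of all risk terms found in a message."""
    text = message.lower()
    return sum(w for term, w in RISK_TERMS.items() if term in text)

def is_high_risk(message: str, threshold: int = 3) -> bool:
    """Flag messages whose total score reaches the (illustrative) threshold."""
    return risk_score(message) >= threshold
```

The “pre-facto” property comes from scoring the conversation as it happens, before any material is produced or shared, so moderators can intervene early.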
Data-rich pattern recognition AI for advanced grouping and prioritization tools, using deep learning to cluster videos with the same voice, location and child’s face.
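The grouping step above can be sketched as greedy clustering of embedding vectors (voice, face, or location embeddings produced upstream) by cosine similarity; the threshold and the single-pass strategy here are illustrative assumptions, not a specific tool's algorithm:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cluster(embeddings, threshold=0.9):
    """Greedy single-pass clustering: assign each vector to the first
    cluster whose representative is similar enough, else open a new one."""
    clusters = []  # list of (representative_vector, [member indices])
    for i, emb in enumerate(embeddings):
        for rep, members in clusters:
            if cosine(emb, rep) >= threshold:
                members.append(i)
                break
        else:
            clusters.append((emb, [i]))
    return [members for _, members in clusters]
```

Clusters that share a voice, location, or face then let investigators prioritize one series of material instead of triaging every video independently.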
On-device content blocking AI built into mobile hardware for real-time, dynamic filtering and blocking.
Chatbots where perpetrators can anonymously get help coping with their desires; the chatbot can also recommend anonymous hotlines.
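A minimal rule-based sketch of such a chatbot; a real deployment would use a dialogue model developed with clinical oversight, and the hotline mention below is a placeholder, not a real service:

```python
# Keyword-triggered responses; entirely illustrative.
RESPONSES = [
    (("urge", "desire", "thoughts"),
     "It takes courage to talk about this. Would you like the number "
     "of an anonymous hotline?"),
    (("yes", "hotline"),
     "You can reach the [placeholder hotline] anonymously, 24/7."),
]
DEFAULT = "I'm here to listen. Can you tell me more?"

def reply(message: str) -> str:
    """Return the first response whose keywords appear in the message."""
    text = message.lower()
    for keywords, response in RESPONSES:
        if any(k in text for k in keywords):
            return response
    return DEFAULT
```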
AI that can detect whether sites or social media pages are secretly marketing CSAM disguised as legal content, for example using images of clothed children as advertising.
AIs that, based on a specific online community's policies, can flag and remove content as it is created.
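Policy-driven moderation of this kind can be modeled as a small rule engine; the rules below are hypothetical examples of community policies, not taken from any real platform:

```python
import re
from dataclasses import dataclass

@dataclass
class PolicyRule:
    name: str
    pattern: str  # regex describing disallowed content
    action: str   # "flag" for human review, or "remove" immediately

# Hypothetical policy set; each community would supply its own rules.
RULES = [
    PolicyRule("no_contact_info", r"\b\d{3}[- ]?\d{3}[- ]?\d{4}\b", "remove"),
    PolicyRule("no_offsite_links", r"https?://", "flag"),
]

def moderate(post: str):
    """Return the (rule name, action) pairs triggered by a post."""
    return [(r.name, r.action) for r in RULES if re.search(r.pattern, post)]
```

An AI-backed version would replace the regex patterns with learned classifiers per policy, but the flag/remove dispatch structure stays the same.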