Stella Polaris Knowledge Center

Data Analysis Development Areas
Expand existing solutions to new countries

Most existing solutions are used in only a small number of areas, although they could be useful in most countries.

https://respect.international/wp-content/uploads/2019/11/AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf

Detect CSAM on encrypted devices

Nearly half of the surveyed police officers reported that encryption is the biggest challenge they face in child sexual abuse investigations, according to the NetClean Report 2019.

⚖️Prosecution
💻Device extraction/search
🧑‍⚖️Perpetrator prosecution

NetClean Report 2019

Speed up cooperation with social media platforms

Social media companies are sometimes so slow to provide information requested by the police that the evidence is gone by the time they respond.

⚖️Prosecution
🧑‍⚖️Perpetrator prosecution

NetClean Report 2019

Digital Post Crime Solutions

This area currently has very few tools/projects, suggesting it might be low-hanging fruit where new efforts could bring great improvements.

❤️‍🩹Post-crime efforts
Detect self-produced live-streaming

Voluntary and induced (through grooming or sexual extortion) self-produced live-streamed child sexual abuse were both reported to be common types of live-streamed material in investigations (NetClean Report 2019).

🚨Detection
🚨Grooming detection/prevention

NetClean Report 2019

Automatic in-chat grooming detection

There has been an increase in grooming cases on social media and gaming platforms, and a relative lack of tools/projects aimed at detecting it.

🚨Detection
🚨Grooming detection/prevention

NetClean Report 2019

Therapy Chatbot for Abuse Survivors

A therapy chatbot could anonymously help children cope with trauma, and could be a first step towards talking to a parent or a therapist.

❤️‍🩹Post-crime efforts
🔠Text analysis/processing

None

Advanced Network Analysis

Advanced network analysis using deep learning flags suspicious accounts and prevents grooming attempts on child-friendly platforms (social media, gaming sites).

🔍Prevention
📊Data analysis and management
🚨Grooming detection/prevention

https://respect.international/wp-content/uploads/2019/11/AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf
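The idea above can be illustrated with a toy heuristic. A production system would apply deep learning to the full interaction graph, but even a simple fan-out rule shows the shape of the approach: accounts that initiate contact with unusually many minor accounts get flagged for review. The account names, contact lists, and threshold below are invented for illustration.

```python
def flag_suspicious_accounts(contacts, minor_accounts, max_fanout=3):
    """Toy heuristic: flag non-minor accounts that contact an unusually
    large number of distinct minor accounts.

    contacts       -- dict mapping account id -> list of contacted account ids
    minor_accounts -- set of account ids known/estimated to belong to minors
    max_fanout     -- illustrative threshold; a real system would learn this
    """
    flagged = []
    for account, contacted in contacts.items():
        minors_contacted = {c for c in contacted if c in minor_accounts}
        if account not in minor_accounts and len(minors_contacted) > max_fanout:
            flagged.append(account)
    return flagged
```

In practice this single-feature rule would produce many false positives; the deep-learning approach described in the Bracket Foundation report combines many such graph signals.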

Automatic flagging of CSAM

Automatic flagging of CSAM on relevant platforms “at its source” helps remove material before it is circulated.

🚨Detection
🖼️Image and video detection/classification
🛑Consumption of child abuse material prevention

https://respect.international/wp-content/uploads/2019/11/AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf
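As a sketch of what flagging “at the source” can look like technically: known material is typically detected by comparing a perceptual hash of uploaded media against a database of hashes of verified material, tolerating small edits via a Hamming-distance threshold. The average hash below is a toy, not a robust perceptual hash like PhotoDNA or PDQ; it only illustrates the hash-and-compare matching step, and the distance threshold is invented.

```python
def average_hash(pixels):
    """Toy 64-bit average hash of an 8x8 grayscale grid (list of 8 rows
    of 8 brightness values): each bit is 1 if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_known_match(candidate_hash, known_hashes, max_distance=10):
    """Flag if the candidate is within max_distance bits of any known hash."""
    return any(hamming_distance(candidate_hash, h) <= max_distance
               for h in known_hashes)
```

A small edit to an image flips only a few hash bits, so it still matches, while unrelated images land far away in Hamming distance.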

Advanced Visual Recognition of Child Sexual Abuse

Computer vision AI with advanced age, object, voice, location and facial recognition capabilities to gather more context on images, enabling more accurate and quicker victim and perpetrator identification.

🚨Detection
⚖️Prosecution
🖼️Image and video detection/classification
🛑Consumption of child abuse material prevention
🕵️‍♀️Perpetrator investigation

https://respect.international/wp-content/uploads/2019/11/AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf

Automatic risk analysis of text

Language processing AI for advanced risk detection, analyzing text to assess abuse risk “pre-facto” of any given piece of content.

🔍Prevention
🔠Text analysis/processing
🚨Grooming detection/prevention

https://respect.international/wp-content/uploads/2019/11/AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf
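To make the text-risk idea concrete, here is a minimal sketch of a pattern-based risk scorer. The patterns, weights, and threshold are all invented for illustration; a production system would use a trained language model rather than a keyword list, precisely because groomers adapt their wording.

```python
import re

# Hypothetical risk patterns with illustrative weights (NOT a real ruleset).
RISK_PATTERNS = {
    r"\bhow old are you\b": 1.0,
    r"\bdon'?t tell (your )?(mom|dad|parents)\b": 3.0,
    r"\bsend (me )?a (photo|pic|picture)\b": 2.0,
    r"\bour (little )?secret\b": 3.0,
}

def risk_score(text):
    """Sum the weights of all risk patterns found in the text."""
    text = text.lower()
    return sum(w for pat, w in RISK_PATTERNS.items() if re.search(pat, text))

def assess(text, threshold=3.0):
    """Route a message for human review if its risk score crosses the threshold."""
    return "review" if risk_score(text) >= threshold else "ok"
```

The point of “pre-facto” analysis is that scoring happens on the message itself, before any abuse has occurred, so high-risk conversations can be escalated early.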

Video Clustering for Identification

Data-rich pattern recognition AI for advanced grouping and prioritization tools, using deep learning to cluster videos with the same voice, location and child’s face.

⚖️Prosecution
🔎Facial/object detection
🆔Identification

https://respect.international/wp-content/uploads/2019/11/AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf
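The clustering step itself can be sketched simply: once a deep-learning model has turned each video’s voice, location and face signals into an embedding vector, videos are grouped by embedding similarity. The greedy single-pass clustering and the similarity threshold below are illustrative simplifications; real pipelines use more robust clustering over learned embeddings.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def cluster_videos(embeddings, threshold=0.9):
    """Greedy clustering: join the first cluster whose representative
    embedding is similar enough, otherwise start a new cluster.

    embeddings -- dict mapping video id -> embedding vector
    """
    clusters = []  # list of (representative_embedding, member_ids)
    for video_id, emb in embeddings.items():
        for rep, members in clusters:
            if cosine_similarity(emb, rep) >= threshold:
                members.append(video_id)
                break
        else:
            clusters.append((emb, [video_id]))
    return [members for _, members in clusters]
```

Grouping by shared voice, location or face lets investigators prioritize clusters likely to involve the same victim, which is the prioritization benefit the report describes.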

On-device Content Blocking

On-device content blocking AI built into mobile hardware for real-time, dynamic filtering and blocking.

🔍Prevention
🖼️Image and video detection/classification
🛑Consumption of child abuse material prevention

https://respect.international/wp-content/uploads/2019/11/AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf

Therapy Chatbot for Perpetrators

Chatbots where perpetrators can anonymously get help coping with their desires. The chatbot could also recommend anonymous hotlines to the perpetrator.

🔍Prevention
🔠Text analysis/processing
🛑Consumption of child abuse material prevention

None

Hidden CSAM marketing detection

AI that can detect whether sites or social media pages are covertly marketing CSAM disguised as legal content, for example by using images of clothed children to market CSAM.

🔍Prevention
🔠Text analysis/processing
🛑Consumption of child abuse material prevention

AI text content moderators

AIs that, based on a specific online community's policies, can flag and remove content as it is created.

🔍Prevention
🔠Text analysis/processing
🛑Consumption of child abuse material prevention

https://venturebeat.com/2021/08/25/how-to-rebuild-trust-in-web-3-0/

👉 www.childhood.org

Footer

Childhood
LinkedIn
My AI
Facebook

Blasieholmstorg 8, 111 48 Stockholm

+46(0)8-551 175 00

info@childhood.org