
Stella Polaris Knowledge Center

Artificial intelligence-produced child sexual abuse material: Insights from Dark Web forum posts
Anglia Ruskin University
Report
Tags: Child Sexual Abuse Material (CSAM), Generative AI

Stella Polaris Summit 2024 Summary
World Childhood Foundation
Other publication

Generative AI - A New Threat for Online Child Sexual Exploitation and Abuse
Bracket Foundation and UNICRI Centre for AI and Robotics
Report
Tags: Prevention, Artificial intelligence, Generative AI, Child Sexual Abuse Material (CSAM)

Global Threat Assessment 2023
WeProtect Global Alliance
Report
Tags: Child Sexual Abuse Material (CSAM), Prevention

Childhood Activity Report
World Childhood Foundation
Report
Tags: Activity report /Verksamhetsberättelse, Sweden-focused

Child Sexual Abuse on the Internet: Report on the Analysis of Technological Factors that Affect the Creation and Sharing of Child Sexual Abuse Material on the Internet.
BI Norwegian Business School, Norwegian University of Science and Technology
Report
Tags: Internet, Policy, Detection, prosecution, Post-crime efforts, Criminal investigation

Guarding the Guardians: Automated Analysis of Online Child Sexual Abuse
Research (peer reviewed)
Tags: Criminal investigation, Natural Language Processing

Fine-Tuning Llama 2 Large Language Models for Detecting Online Sexual Predatory Chats and Abusive Texts
Cornell University
Research (peer reviewed)
Tags: Machine learning, Natural Language Processing

Black and Latinx Primary caregiver considerations for developing and implementing a machine learning-based model for detecting child abuse and neglect with implications for racial bias reduction: A qualitative study
University of Pennsylvania, Columbia University
Report
Tags: Prevention, Machine learning

BS-SC Model: A Novel Method for Predicting Child Abuse Using Borderline-SMOTE Enabled Stacking Classifier
B. S. Abdur Rahman Crescent Institute of Science and Technology
Research (peer reviewed)
Tags: Prevention, Detection, Clustering/Classification

Applications of artificial intelligence in predicting the risk of child abuse: A literature review
King Faisal Specialist Hospital and Research Centre, Princess Nora bint Abdul Rahman University, Mississippi State University
Research (peer reviewed)
Tags: Detection, Child-focused, Neural networks

Determining Child Sexual Abuse Posts based on Artificial Intelligence
Technological University Dublin
Report
Tags: Child Sexual Abuse Material (CSAM), Clustering/Classification

Identifying Online Child Sexual Texts in Dark Web through Machine Learning and Deep Learning Algorithms
Technological University Dublin
Report
Tags: Child Sexual Abuse Material (CSAM), Machine learning, Deep Learning

Discovering Child Sexual Abuse Material Creator's Behaviors and Preferences on the Dark Web
Technological University Dublin
Research (peer reviewed)
Tags: Child Sexual Abuse Material (CSAM), Perpetrators, Clustering/Classification

Generative AI: New Attack Vector for Trust & Safety
Activefence
Report
Tags: Artificial intelligence, Child Sexual Abuse Material (CSAM), Generative AI

Child predator abuse of generative AI
Activefence
Report
Tags: Generative AI, Child Sexual Abuse Material (CSAM), Artificial intelligence

How AI is being abused to create child sexual abuse imagery
Internet Watch Foundation
Report
Tags: Artificial intelligence, Perpetrators, Prevention, Child Sexual Abuse Material (CSAM), Generative AI

…det var en känsla av att han befann sig i rummet bredvid
World Childhood Foundation
Report
Tags: Sweden-focused, prosecution, Internet

Sperm hunting on optical microscope slides for forensic analysis with deep convolutional networks – a feasibility study
Zurich Institute of Forensic Medicine
Research (peer reviewed)
Tags: Perpetrators

Developing automated methods to detect and match face and voice biometrics in child sexual abuse videos
San Jose State University
Research (peer reviewed)
Tags: Child Sexual Abuse Material (CSAM), Perpetrators

Child Sexual Abuse Material Online: The Perspective of Online Investigators on Training and Support
Griffith University
Research (peer reviewed)
Tags: Child Sexual Abuse Material (CSAM), prosecution, Perpetrators

Child Online Safety Toolkit
5Rights Foundation
Report
Tags: Prevention, Neural networks

Child Abuse Risk Prediction and Prevention Framework using AI and Dark Web
Adhiyamaan College of Engineering
Research (peer reviewed)
Tags: Child Sexual Abuse Material (CSAM), Prevention, Neural networks

Applying Artificial Intelligence for Age Estimation in Digital Forensic Investigations
University of Warwick
Research (peer reviewed)
Tags: Artificial intelligence, Neural networks

Online Grooming Detection on Social Media Platforms
Norwegian University of Science and Technology
Research (peer reviewed)
Tags: Detection, Clustering/Classification

Hand-Based Person Identification Using Global and Part-Aware Deep Feature Representation Learning
Lancaster University
Research (peer reviewed)
Tags: Artificial intelligence, prosecution, Child Sexual Abuse Material (CSAM), Machine learning

Out of the Shadows Index
The Economist Intelligence Unit
Report
Tags: Prevention, Child Sexual Abuse Material (CSAM)

Developing machine learning-based models to help identify child abuse and neglect: key ethical challenges and recommended solutions
Columbia University
Research (peer reviewed)
Tags: Ethics, Detection, Prevention

ReDirection survey report
Suojellaan Lapsia
Report
Tags: Perpetrators

Unga sex och internet #MeToo
Stiftelsen Allmänna Barnahus
Report
Tags: Prevention, Child Sexual Abuse Material (CSAM), Child-focused

Internetrelaterade sexuella övergrepp mot barn
Riksrevisionen
Report
Tags: Prevention, Child Sexual Abuse Material (CSAM), prosecution

How SARs fight online child sexual abuse by flagging financial crime
Napier
Other publication
Tags: Prevention, Child Sexual Abuse Material (CSAM), prosecution

Predicting Prolific Live Streaming of Child Sexual Abuse
Australian Institute of Criminology
Research (peer reviewed)
Tags: Financial transactions, Artificial intelligence, Supervised learning, Machine learning, Criminal investigation, Child Sexual Abuse Material (CSAM)

Automated Biometric Identification using Dorsal Hand Images and Convolutional Neural Networks
Auckland University of Technology
Research (peer reviewed)
Tags: Neural networks, Perpetrators

Multimodal Virtual Avatars for Investigative Interviews with Children
Oslo Metropolitan University
Research (peer reviewed)
Tags: Criminal investigation, prosecution

Early Detection of Sexual Predators in Chats
Humboldt-Universität zu Berlin
Research (peer reviewed)
Tags: Natural Language Processing, Clustering/Classification

When “Sweetie” is not so Sweet: Artificial Intelligence and its Implications for Child Pornography
Report
Tags: Child Sexual Abuse Material (CSAM), prosecution

Keeping Children Safe Online With Limited Resources: Analyzing What is Seen and Heard
Singidunum University
Research (peer reviewed)
Tags: Prevention, Clustering/Classification, Neural networks

Suspicious activity detection using deep learning in secure assisted living IoT environments
Nalla Malla Engineering College, Galgotias University, Vellore Institute of Technology
Research (peer reviewed)
Tags: Child Sexual Abuse Material (CSAM), Neural networks

Prediction of the development of depression and post-traumatic stress disorder in sexually abused children using a random forest classifier
Uskudar University Medical Faculty, Istanbul, Turkey
Research (peer reviewed)
Tags: Clustering/Classification, Machine learning, Post-crime efforts, Child-focused, Post-traumatic stress (PTS)

NetClean Report 2020 - Covid-19 Impact
NetClean
Report
Tags: Prevention, Child Sexual Abuse Material (CSAM)

Illegal Online Sexual Behavior During the COVID-19 Pandemic
Karolinska Institutet
Research (peer reviewed)
Tags: Prevention, Perpetrators, Child Sexual Abuse Material (CSAM), Financial transactions

Sexuella övergrepp via nätet lika allvarliga som IRL
Göteborgs Universitet
Research (peer reviewed)
Tags: Child-focused, Child Sexual Abuse Material (CSAM), Prevention

Summary Paper on Online Child Sexual Exploitation
ECPAT
Report
Tags: Prevention, Child Sexual Abuse Material (CSAM), Perpetrators

Curbing the Surge in Online Child Abuse
European Parliament
Report
Tags: Prevention, Neural networks

Action to End child Sexual Abuse and Exploitation
Unicef
Report
Tags: Child Sexual Abuse Material (CSAM), Prevention

Brief version: Action to End child Sexual Abuse and Exploitation
Unicef
Report
Tags: Child Sexual Abuse Material (CSAM), Prevention

An Integrative Review of Historical Technology and Countermeasure Usage Trends in Online Child Sexual Exploitation Material Offender
University of Edinburgh and George Mason University
Research (peer reviewed)
Tags: Child Sexual Abuse Material (CSAM), Perpetrators, Criminal investigation, Prevention

How Facial Recognition Is Helping Fight Child Sexual Abuse
Griffeye
Other publication
Tags: Child Sexual Abuse Material (CSAM), Criminal investigation, Machine learning, Artificial intelligence

Can Artificial Intelligence Achieve Human-Level Performance? A Pilot Study of Childhood Sexual Abuse Detection in Self-Figure Drawings
University of Haifa
Research (peer reviewed)
Tags: Artificial intelligence, Perpetrators, Criminal investigation, Supervised learning, Neural networks, Machine learning, Child-focused, Child Sexual Abuse Material (CSAM)

Estimation of the Development of Depression and PTSD in Children Exposed to Sexual Abuse and Development of Decision Support Systems by Using Artificial Intelligence
Inonu University
Research (peer reviewed)
Tags: Child-focused, Post-traumatic stress (PTS), Post-crime efforts

AI Roundtable 2019 Summary
World Childhood Foundation
Report
Tags: Prevention, Artificial intelligence

Artificial Intelligence: combating online sexual abuse of children
Bracket Foundation
Report
Tags: Prevention, Neural networks, Child Sexual Abuse Material (CSAM)

Out of the Shadows Whitepaper
The Economist Intelligence Unit
Report
Tags: Prevention, Child Sexual Abuse Material (CSAM)

What Works to Prevent Sexual Violence Against Children
Together for girls
Report
Tags: Child Sexual Abuse Material (CSAM), Prevention, Financial transactions

Child Online Safety: Minimizing the Risk of Violence, Abuse and Exploitation Online.
ITU/UNESCO Broadband Commission for Sustainable Development
Report
Tags: Prevention, Perpetrators

A Practitioner Survey Exploring the Value of Forensic Tools, AI, Filtering, & Safer Presentation for Investigating Child Sexual Abuse Material (CSAM)
University of New Haven / Digital Forensic Research Workshop
Research (peer reviewed)
Tags: Artificial intelligence, Prevention, Neural networks

Child Abuse and Domestic Abuse: Content and Feature Analysis from Social Media Disclosures
Victoria University and University of Melbourne
Research (peer reviewed)
Tags: Artificial intelligence, Machine learning, Prevention, Child Sexual Abuse Material (CSAM)

Therabot: An Adaptive Therapeutic Support Robot
Institute of Electrical and Electronics Engineers (IEEE) and Mississippi State University
Research (peer reviewed)
Tags: Post-crime efforts, Child-focused, Robotics, Post-traumatic stress (PTS)

Offender strategies for engaging children in online sexual activity
Department of Psychology, University of Gothenburg
Research (peer reviewed)
Tags: Prevention, Child Sexual Abuse Material (CSAM)

Study on the Effects of New Information Technologies on the Abuse and Exploitation of Children
UNODC
Report
Tags: Prevention, Child Sexual Abuse Material (CSAM)

Predictive Risk Modelling to Prevent Child Maltreatment and Other Adverse Outcomes for Service Users
University of Queensland
Research (peer reviewed)
Tags: Prevention, Neural networks

Estimation of the Development of Depression and PTSD in Children Exposed to Sexual Abuse and Development of Decision Support Systems by Using Artificial Intelligence
Inonu University
The report describes an experiment in which algorithms were able to predict whether a child had PTSD or major depression, with 99.2% accuracy, based solely on data about the person and the assault. This meant that no additional information was required beyond what is usually collected in child sexual abuse cases. The results indicate that it is possible to use AI early after an abuse has occurred to predict whether PTSD or major depression is likely to develop.
2020
Tags: Child-focused, Post-traumatic stress (PTS), Post-crime efforts
Research (peer reviewed)

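The prediction setup described above (a classifier trained on routinely collected case data) can be sketched in miniature. The following toy sketch is not from the study: all feature names, thresholds, and cases are invented. It trains single-feature decision stumps and lets them vote, loosely in the spirit of tree-ensemble classifiers:

```python
from collections import Counter

# Toy sketch only: invented features and data. Illustrates predicting a binary
# outcome (e.g. "develops PTSD/major depression") from case features using a
# small ensemble of one-feature decision stumps.

def train_stump(rows, labels, feature):
    """Find the threshold on one feature that best separates the labels."""
    best_acc, best_t = -1.0, 0.0
    for t in sorted({r[feature] for r in rows}):
        preds = [1 if r[feature] >= t else 0 for r in rows]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_acc, best_t = acc, t
    return feature, best_t

def predict(stumps, row):
    """Majority vote of the stumps for one case."""
    votes = Counter(1 if row[f] >= t else 0 for f, t in stumps)
    return votes.most_common(1)[0][0]

# Invented training cases: the kind of data a caseworker might already record.
cases = [
    {"age": 6, "severity": 3, "repeat": 1},
    {"age": 14, "severity": 1, "repeat": 0},
    {"age": 9, "severity": 2, "repeat": 1},
    {"age": 16, "severity": 0, "repeat": 0},
]
outcomes = [1, 0, 1, 0]  # 1 = developed PTSD/major depression

stumps = [train_stump(cases, outcomes, f) for f in ("severity", "repeat")]
print(predict(stumps, {"age": 8, "severity": 3, "repeat": 1}))  # → 1
```

The point of the sketch is the pipeline shape (case features in, risk label out), not the model; the study's actual algorithms and accuracy figures come from far richer data.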
Multimodal Virtual Avatars for Investigative Interviews with Children
Oslo Metropolitan University
The report describes how virtual child avatars can be used to train police officers to interview children about sexual abuse. The interviews can also be used to generate synthetic interview data.
2021
Tags: Criminal investigation, prosecution
Research (peer reviewed)

Automated Biometric Identification using Dorsal Hand Images and Convolutional Neural Networks
Auckland University of Technology
The researchers investigate how well AI can identify people based solely on the backs of their hands. They achieve accuracies of over 99.9%, indicating that such tools can be used to identify perpetrators in child sexual abuse material. Similar projects have been conducted at Lancaster University.
2021
Tags: Neural networks, Perpetrators
Research (peer reviewed)

Developing automated methods to detect and match face and voice biometrics in child sexual abuse videos
San Jose State University
The report examines how biometric (facial) search combined with voice can be used to identify abuse survivors and perpetrators by searching for matches in other child sexual abuse material videos.
2022
Tags: Child Sexual Abuse Material (CSAM), Perpetrators
Research (peer reviewed)

Child Sexual Abuse Material Online: The Perspective of Online Investigators on Training and Support
Griffith University
A study of what police officers believe they need to counter child sexual abuse material more effectively. The main forms of training and support that investigators said organisations should provide were communication, training on different types of perpetrators, and psychological support.
2022
Tags: Child Sexual Abuse Material (CSAM), prosecution, Perpetrators
Research (peer reviewed)

When “Sweetie” is not so Sweet: Artificial Intelligence and its Implications for Child Pornography
The report describes how computer-generated child abuse material is becoming increasingly common, including through deepfake technology. The author argues that the US law 18 U.S. Code § 2256 should be revised to prohibit computer-generated abuse material as well.
2021
Tags: Child Sexual Abuse Material (CSAM), prosecution
Report

Early Detection of Sexual Predators in Chats
Humboldt-Universität zu Berlin
A report on how deep learning transformer models can classify grooming attempts. The authors created a dataset that was later used by Viktor Bowallius and David Eklund in the report Grooming detection of chat segments using transformer models, where an F1 score of 0.98 was achieved.
2021
Tags: Natural Language Processing, Clustering/Classification
Research (peer reviewed)

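For reference, the F1 score cited above is the harmonic mean of precision and recall over binary predictions. A minimal stdlib sketch of the metric (the toy labels below are invented, not from either report):

```python
def f1_score(y_true, y_pred):
    """F1 for binary labels (1 = grooming detected): harmonic mean of precision and recall."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Invented toy labels for four chat segments: one hit, one miss, one false alarm.
print(f1_score([1, 1, 0, 0], [1, 0, 0, 1]))  # → 0.5
```

An F1 of 0.98 therefore means the classifier almost never misses a grooming segment and almost never flags a benign one.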
Keeping Children Safe Online With Limited Resources: Analyzing What is Seen and Heard
Singidunum University
The article looks at how AI can analyse what is shown on a mobile device's screen and played through its audio output in order to detect bullying, pornography and sexual harassment. Unlike previous experiments, this AI sees all activity as the user sees it, rather than only processing text or images extracted from the screen. The model achieves an average accuracy of 88% when classifying texts, for example when classifying sexism and racism, and 95% accuracy in detecting pornography.
2021
Tags: Prevention, Clustering/Classification, Neural networks
Research (peer reviewed)

Suspicious activity detection using deep learning in secure assisted living IoT environments
Nalla Malla Engineering College, Galgotias University, Vellore Institute of Technology
The report evaluates how well an AI can detect child sexual abuse via surveillance cameras.
2021
Tags: Child Sexual Abuse Material (CSAM), Neural networks
Research (peer reviewed)

Child Abuse Risk Prediction and Prevention Framework using AI and Dark Web
Adhiyamaan College of Engineering
A report on a downloadable AI that analyses browser history to assess the risk that a child may be sexually exploited. The report does not provide concrete figures on how well the tool works.
2022
Tags: Child Sexual Abuse Material (CSAM), Prevention, Neural networks
Research (peer reviewed)

Applying Artificial Intelligence for Age Estimation in Digital Forensic Investigations
University of Warwick
AI is tested for estimating children's ages. It worked reasonably well for ages 10-20, with a mean error of about 2.5 years, but less well for children aged 0-10.
2022
Tags: Artificial intelligence, Neural networks
Research (peer reviewed)

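The "mean error of about 2.5 years" cited above is a mean absolute error over age predictions. Computed, with invented example numbers, it looks like this:

```python
def mean_absolute_error(true_ages, predicted_ages):
    """Average absolute gap between true and estimated ages, in years."""
    return sum(abs(t - p) for t, p in zip(true_ages, predicted_ages)) / len(true_ages)

# Invented example: three subjects whose ages are estimated from images.
print(mean_absolute_error([11, 15, 19], [13, 14, 16]))  # → 2.0
```

So a mean error of 2.5 years means an estimate of "14" typically corresponds to a true age somewhere between roughly 11.5 and 16.5.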
Predictive Risk Modelling to Prevent Child Maltreatment and Other Adverse Outcomes for Service Users
University of Queensland
Analysis of how well AI can predict the risk of child maltreatment (Predictive Risk Modelling, PRM), based on such a tool used in New Zealand.
2015
Tags: Prevention, Neural networks
Research (peer reviewed)

Brief version: Action to End child Sexual Abuse and Exploitation
Unicef
Report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.
2020
Tags: Child Sexual Abuse Material (CSAM), Prevention
Report

Action to End child Sexual Abuse and Exploitation
Unicef
In-depth report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.
2020
Tags: Child Sexual Abuse Material (CSAM), Prevention
Report

How SARs fight online child sexual abuse by flagging financial crime
Napier
Article on how children are sexually exploited online and how this is linked to financial crime.
2021
Tags: Prevention, Child Sexual Abuse Material (CSAM), prosecution
Other publication

Curbing the Surge in Online Child Abuse
European Parliament
Briefing on the development of online sexual abuse and what is being done to combat it.
2020
Tags: Prevention, Neural networks
Report

Offender strategies for engaging children in online sexual activity
Department of Psychology, University of Gothenburg
This study aims to describe online offenders' interactions with actual children when inciting them to engage in online sexual activity.
2017
Tags: Prevention, Child Sexual Abuse Material (CSAM)
Research (peer reviewed)

Child Online Safety: Minimizing the Risk of Violence, Abuse and Exploitation Online.
ITU/UNESCO Broadband Commission for Sustainable Development
The report describes causes and effects of the risks that children are exposed to online. It also proposes specific solutions that can prevent these risks. Child online safety is the global goal to be achieved through multistakeholder cooperation.
2019
Tags: Prevention, Perpetrators
Report

Summary Paper on Online Child Sexual Exploitation
ECPAT
The summary describes key manifestations of sexual exploitation of children (SEC), including the exploitation of children in prostitution, the sale and trafficking of children for sexual purposes, online child sexual exploitation (OCSE), the sexual exploitation of children in travel and tourism (SECTT) and some forms of child, early and forced marriages (CEFM).
2020
Tags: Prevention, Child Sexual Abuse Material (CSAM), Perpetrators
Report

Study on the Effects of New Information Technologies on the Abuse and Exploitation of Children
UNODC
2015
Tags: Prevention, Child Sexual Abuse Material (CSAM)
Report

Childhood Activity Report
World Childhood Foundation
About Childhood's work to prevent and stop child sexual abuse.
2023
Tags: Activity report /Verksamhetsberättelse, Sweden-focused
Report

Internetrelaterade sexuella övergrepp mot barn
Riksrevisionen
Review of police and prosecutors' work against internet-related sexual abuse of children.
2021
Tags: Prevention, Child Sexual Abuse Material (CSAM), prosecution
Report

Unga sex och internet #MeToo
Stiftelsen Allmänna Barnahus
A report on secondary school students' experiences of sexual abuse and sexual exploitation in Sweden 2020/2021.
2021
Tags: Prevention, Child Sexual Abuse Material (CSAM), Child-focused
Report

Sexuella övergrepp via nätet lika allvarliga som IRL
Göteborgs Universitet
Article on research into online sexual abuse and how it affects children.
2020
Tags: Child-focused, Child Sexual Abuse Material (CSAM), Prevention
Research (peer reviewed)

Illegal Online Sexual Behavior During the COVID-19 Pandemic
Karolinska Institutet
Scientific report on a study at Karolinska Institutet in which an online-based CBT (cognitive behavioural therapy) programme is offered to people who produce, view and distribute child sexual abuse material on the darknet. The study has received funding and capacity support from Childhood.
2020
Tags: Prevention, Perpetrators, Child Sexual Abuse Material (CSAM), Financial transactions
Research (peer reviewed)

ReDirection survey report
Suojellaan Lapsia
The report is based on 8,484 survey responses from users of abuse material on the dark web and provides unique insight into the behavioural patterns of perpetrators.
2021
Tags: Perpetrators
Report

NetClean Report 2020 - Covid-19 Impact
NetClean
A report providing an overview of child sexual abuse, with a focus on the internet and the impact of Covid-19.
2020
Tags: Prevention, Child Sexual Abuse Material (CSAM)
Report

Child Online Safety Toolkit
5Rights Foundation
Compilation on how policies on children's online safety can be developed.
2022
Tags: Prevention, Neural networks
Report

Global Threat Assessment 2023
WeProtect Global Alliance
Summary by the WeProtect Global Alliance of global trends related to child sexual abuse and exploitation.
2023
Tags: Child Sexual Abuse Material (CSAM), Prevention
Report

What Works to Prevent Sexual Violence Against Children
Together for girls
Report discussing practical and cost-effective solutions to break the cycle of sexual violence against children.
2019
Tags: Child Sexual Abuse Material (CSAM), Prevention, Financial transactions
Report

Artificial Intelligence: combating online sexual abuse of children
Bracket Foundation
A survey of what efforts currently exist in AI to counter and prevent child sexual abuse, and what AI could be used for.
2019
Tags: Prevention, Neural networks, Child Sexual Abuse Material (CSAM)
Report

Out of the Shadows Whitepaper
The Economist Intelligence Unit
Statistics and baselines for 40 countries on child sexual abuse.
2019
Tags: Prevention, Child Sexual Abuse Material (CSAM)
Report

AI Roundtable 2019 Summary
World Childhood Foundation
The idea for Stella Polaris emerged from a roundtable meeting in autumn 2019 that brought together leading experts in AI and child safety.
2019
Tags: Prevention, Artificial intelligence
Report

Artificial intelligence-produced child sexual abuse material: Insights from Dark Web forum posts
Anglia Ruskin University
A study by Anglia Ruskin University's IPPPRI highlights the growing demand for AI-generated child sexual abuse material (CSAM) on the dark web. Researchers analyzed dark web forums, finding offenders actively using AI to create and share CSAM, teaching themselves through online guides and collaborating with others. The report reveals that offenders are reusing existing material to develop AI-generated content and anticipate technological advances that will simplify the process. The study underscores the urgent need to understand these practices to prevent and combat AI-facilitated exploitation.
2024
Tags: Child Sexual Abuse Material (CSAM), Generative AI
Report

Stella Polaris Summit 2024 Summary
World Childhood Foundation
In June 2024, World Childhood Foundation invited stakeholders from child rights, AI, academia, the private sector and cybersecurity, among others, to a one-day conference on using AI to combat child sexual abuse.
2024
Other publication

Generative AI - A New Threat for Online Child Sexual Exploitation and Abuse
Generative AI - A New Threat for Online Child Sexual Exploitation and Abuse

Bracket Foundation and UNICRI Centre for AI and Robotics

This report explores the current forms of AI-generated CSAM in the form of images, video and text. Pulling together perspectives and data from law enforcement, tech companies, civil society, and caregivers, this report aims to provide a comprehensive overview of the escalating danger and suggest potential mitigation strategies.

2024
Prevention, Artificial intelligence, Generative AI, Child Sexual Abuse Material (CSAM)
Report
Out of the Shadows Index

The Economist Intelligence Unit

Statistics and baselines for 40 countries on child sexual abuse.

2022
Prevention, Child Sexual Abuse Material (CSAM)
Report
Guarding the Guardians: Automated Analysis of Online Child Sexual Abuse

This paper presents an automated tool designed to analyze child sexual abuse reports. By automating the analysis of abuse complaints and categorizing the reports along three dimensions (Subject, Degree of Criminality, and Damage), the tool significantly reduces the risk of exposure to harmful content.

2023
Criminal investigation, Natural Language Processing
Research (peer reviewed)
Fine-Tuning Llama 2 Large Language Models for Detecting Online Sexual Predatory Chats and Abusive Texts

Cornell University

This paper proposes an approach to detecting online sexual predatory chats and abusive language using the open-source pretrained Llama 2 7B-parameter model, recently released by Meta GenAI. We fine-tune the LLM using datasets of different sizes, imbalance degrees, and languages (i.e., English, Roman Urdu and Urdu). Thanks to the power of LLMs, our approach is generic and automated, without the manual search for a synergy between feature extraction and classifier design steps that conventional methods in this domain require. Experimental results show a strong performance of the proposed approach, which performs proficiently and consistently across three distinct datasets with five sets of experiments. This study's outcomes indicate that the proposed method can be implemented in real-world applications (even with non-English languages) for flagging sexual predators, offensive or toxic content, hate speech, and discriminatory language in online discussions and comments, helping maintain respectful online communities.

2023
Machine learning, Natural Language Processing
Research (peer reviewed)
Developing machine learning-based models to help identify child abuse and neglect: key ethical challenges and recommended solutions

Columbia University

This article applied a phenomenological approach to discuss and provide recommendations for key ethical issues related to the development and evaluation of machine learning-based risk models: (1) biases in the data; (2) clinical documentation system design issues; (3) lack of a centralized evidence base for child abuse and neglect; (4) lack of a "gold standard" in assessment and diagnosis of child abuse and neglect; (5) challenges in evaluating risk prediction performance; (6) challenges in testing predictive models in practice; and (7) challenges in presenting machine learning-based predictions to clinicians and patients.

2022
Ethics, Detection, Prevention
Research (peer reviewed)
Black and Latinx Primary caregiver considerations for developing and implementing a machine learning-based model for detecting child abuse and neglect with implications for racial bias reduction: A qualitative study

University of Pennsylvania, Columbia University

This study elicited Black and Latinx primary caregivers' viewpoints regarding child abuse and neglect while living in underserved communities to highlight considerations for designing an ML-based model for detecting child abuse and neglect in emergency departments (EDs) with implications for racial bias reduction and future interventions.

2023
Prevention, Machine learning
Report
BS-SC Model: A Novel Method for Predicting Child Abuse Using Borderline-SMOTE Enabled Stacking Classifier

B. S. Abdur Rahman Crescent Institute of Science and Technology

For a long time, legal entities have developed and used crime prediction methodologies. The techniques are frequently updated based on crime evaluations and responses from scientific communities. There is a need to develop type-based crime prediction methodologies that can be used to address issues at the subgroup level. Child maltreatment is not adequately addressed because children are voiceless. As a result, this study investigated the possibility of developing a model for predicting child abuse. Various exploratory analysis methods were used to examine child abuse events in the city of Chicago. The data set was balanced using the Borderline-SMOTE technique, and a stacking classifier was then employed to ensemble multiple algorithms to predict various types of child abuse. The proposed approach successfully predicted crime types with 93% accuracy, precision, recall, and F1-score; the corresponding AUC value was 0.989.
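The pipeline summarized above (class balancing followed by a stacked ensemble) can be sketched with scikit-learn on synthetic, imbalanced data. The paper uses Borderline-SMOTE (provided by the imbalanced-learn package); to keep this sketch self-contained, plain random oversampling stands in for it, and the dataset and model parameters below are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample

# Imbalanced synthetic data standing in for the incident records.
X, y = make_classification(n_samples=600, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Stand-in for Borderline-SMOTE: naive random oversampling of the minority
# class (the paper's method synthesizes new borderline samples instead).
minority = y_tr == 1
X_min, y_min = resample(X_tr[minority], y_tr[minority],
                        n_samples=int((~minority).sum()), random_state=0)
X_bal = np.vstack([X_tr[~minority], X_min])
y_bal = np.concatenate([y_tr[~minority], y_min])

# Stacking ensemble: base learners feed a logistic-regression meta-learner.
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("dt", DecisionTreeClassifier(random_state=0))],
    final_estimator=LogisticRegression())
stack.fit(X_bal, y_bal)
print("test accuracy:", stack.score(X_te, y_te))
```

In practice, balancing only the training split (as above) avoids leaking oversampled copies into the evaluation set.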

2023
Prevention, Detection, Clustering/Classification
Research (peer reviewed)
Applications of artificial intelligence in predicting the risk of child abuse: A literature review

King Faisal Specialist Hospital and Research Centre, Princess Nora bint Abdul Rahman University, Mississippi State University

Child abuse is a major problem in most of the developing and developed countries. Medical practitioners and law enforcement authorities have often tried to tackle the problem using several conventional approaches. Nevertheless, there are other modern methods to screen, detect, and predict child abuse using artificial intelligence (AI). Therefore, this article aimed to critically review the currently available AI tools including data mining, computer-aided drawing systems, self-drawing tools, and neural networks used in child abuse screening.

2023
Detection, Child-focused, Neural networks
Research (peer reviewed)
Determining Child Sexual Abuse Posts based on Artificial Intelligence

Technological University Dublin

This paper proposes a CSAM detection intelligence algorithm based on natural language processing and machine learning techniques [2]. The CSAM detection model is not only used to remove CSAM from online platforms, but can also help determine perpetrator behaviours, provide evidence, and extract new knowledge for hotlines, child agencies, education programs and policy makers.

2023
Child Sexual Abuse Material (CSAM), Clustering/Classification
Report
Identifying Online Child Sexual Texts in Dark Web through Machine Learning and Deep Learning Algorithms

Technological University Dublin

In this paper, we propose a novel model based on artificial intelligence algorithms to automatically detect CSA text messages in dark web forums. Our algorithms have achieved impressive results in detecting CSAM on the dark web, with a recall rate of 89%, a precision rate of 92.3% and an accuracy rate of 87.6%. Moreover, the algorithms can predict the classification of a post in just 1 microsecond to 0.3 milliseconds on a standard laptop. This makes it possible to integrate our model into social network sites or edge devices for real-time CSAM detection.
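The general shape of such a pipeline (text features plus a classifier, evaluated by recall, precision and accuracy) can be sketched with scikit-learn. The corpus, features and model below are innocuous placeholder assumptions for illustration, not the authors' system or data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.pipeline import make_pipeline

# Placeholder corpus: 1 = post that should be flagged, 0 = benign.
texts = [
    "offer trade private files contact me",
    "selling restricted content message for link",
    "new private forum link inside join now",
    "secret archive available trade only",
    "great weather today going for a hike",
    "anyone have tips for baking sourdough bread",
    "the football match last night was amazing",
    "recommendations for a good sci-fi novel",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

# TF-IDF features feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

preds = model.predict(texts)  # in practice, evaluate on held-out data
print("accuracy:", accuracy_score(labels, preds))
print("precision:", precision_score(labels, preds))
print("recall:", recall_score(labels, preds))
```

The trade-off the paper reports (precision above recall) is tuned in practice via the classifier's decision threshold rather than the default 0.5 cut-off.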

2023
Child Sexual Abuse Material (CSAM), Machine learning, Deep Learning
Report
Discovering Child Sexual Abuse Material Creators' Behaviors and Preferences on the Dark Web

Technological University Dublin

The research team proposed a CSAM detection intelligence system. The system uses a manually labelled dataset to train, evaluate and select an efficient CSAM classification model. By identifying CSAM creators and victims through CSAM posts on the dark web, the researchers proceed to analyze the material with a classifier, visualizing and uncovering information concerning the behaviors of CSAM creators and victims.

2023
Child Sexual Abuse Material (CSAM), Perpetrators, Clustering/Classification
Research (peer reviewed)
Generative AI: New Attack Vector for Trust & Safety

Activefence

This report aims to help Trust & Safety teams better understand the challenges presented by the GenAI ecosystem, by providing:
• A primer on GenAI functionality and its impact on Trust & Safety;
• Examples of GenAI exploitation, resulting in the mass creation of dangerous content across a range of abuse areas;
• A review and analysis of emerging trends in child predator activity to create GenAI-produced CSAM, and the dissemination of new exploitative methodologies that enable additional threat actors to engage in online harm at scale;
• A regulatory overview examining platform accountability for GenAI-produced malicious content.

2023
Artificial intelligence, Child Sexual Abuse Material (CSAM), Generative AI
Report
Child predator abuse of generative AI

Activefence

This report provides an initial assessment of the threat landscape emanating from online child predator communities as they focus attention on the production of child sexual abuse material (CSAM) at scale. This report offers a sample of:
• Trending behavioral developments within child predator communities as they assess GenAI;
• A review of the generation of GenAI image-based CSAM and the methods threat actors employ to produce it;
• A brief on the generation of GenAI text-based CSAM and the methods employed to produce it;
• AI models in circulation that allow specific types of CSAM to be produced;
• The legal context of GenAI CSAM.

2023
Generative AI, Child Sexual Abuse Material (CSAM), Artificial intelligence
Report
How AI is being abused to create child sexual abuse imagery

Internet Watch Foundation

In 2023, the Internet Watch Foundation (IWF) investigated its first reports of child sexual abuse material (CSAM) generated by artificial intelligence (AI). In total, 20,254 AI-generated images were found to have been posted to one dark web CSAM forum in a one-month period. Of these, 11,108 images were selected for assessment by IWF analysts. The report details the growing issue of AI-generated CSAM and recommends actions to be taken by government, law enforcement and tech companies.

2023
Artificial intelligence, Perpetrators, Prevention, Child Sexual Abuse Material (CSAM), Generative AI
Report
Prediction of the development of depression and post-traumatic stress disorder in sexually abused children using a random forest classifier

Uskudar University Medical Faculty, Istanbul, Turkey

Depression and post-traumatic stress disorder (PTSD) are among the most common psychiatric disorders observed in children and adolescents exposed to sexual abuse. The present study aimed to investigate, using machine learning techniques, the effects of many factors on the development of psychiatric disorders, such as the characteristics of the child, the abuse, and the abuser, the child's family type, and the role of social support.
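The study's core technique, a random forest classifier trained on case features, can be sketched with scikit-learn. The synthetic features below are stand-in assumptions, not the study's actual variables or data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for tabular case features (e.g. age, family type,
# support score); 1 = disorder developed, 0 = not. Purely illustrative.
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print("mean CV accuracy:", scores.mean())

# Fit on all data to inspect which features drive the prediction.
clf.fit(X, y)
print("top 3 features by importance:",
      clf.feature_importances_.argsort()[::-1][:3])
```

Feature importances are one reason random forests suit this kind of study: they indicate which factors contribute most to the predicted outcome.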

2021
Clustering/Classification, Machine learning, Post-crime efforts, Child-focused, Post-traumatic stress (PTS)
Research (peer reviewed)
Hand-Based Person Identification Using Global and Part-Aware Deep Feature Representation Learning

Lancaster University

The aim of this research is to identify biometric traits in dorsal hand images, which are the most commonly documented aspect of perpetrators in child sexual abuse imagery. In this work, the researchers propose hand-based person identification by learning both global and local deep feature representations. Using a Global and Part-Aware Network (GPA-Net), the researchers created global and local branches on the conv-layer for learning robust, discriminative global and part-level features. Similar research has been conducted at Auckland University.

2022
Artificial intelligence, prosecution, Child Sexual Abuse Material (CSAM), Machine learning
Research (peer reviewed)
Online Grooming Detection on Social Media Platforms

Norwegian University of Science and Technology

The aim of this research is to provide techniques that increase children's security on online chat platforms. The research project divides the online grooming detection problem into several subproblems, including author profiling, predatory conversation detection, predator identification, and data limitation issues. The present article presents a literature review of available data sets and grooming detection techniques.

2022
Detection, Clustering/Classification
Research (peer reviewed)
Child Sexual Abuse on the Internet: Report on the Analysis of Technological Factors that Affect the Creation and Sharing of Child Sexual Abuse Material on the Internet.

BI Norwegian Business School, Norwegian University of Science and Technology

This literature review details how the development of digital technology and services impacts both the possibilities for accessing and sharing child sexual abuse material (CSAM) and the possibilities for perpetrators to establish contact. The report underlines the need for collaboration between the private sector, civil society and law enforcement as a way of making bespoke technology available and sharing relevant information and intelligence.

2023
Internet, Policy, Detection, prosecution, Post-crime efforts, Criminal investigation
Report
…det var en känsla av att han befann sig i rummet bredvid

World Childhood Foundation

The focus of the present report is to provide insight into grooming and sexual extortion of children in Sweden. The report presents judgments from court cases decided in 2020 and 2021 concerning children who have been subjected to internet-related sexual offenses. Judgments are public documents produced by the court and contain, for example, evidence and the court's reasoning in a specific case.

2022
Sweden-focused, prosecution, Internet
Report
Child Abuse and Domestic Abuse: Content and Feature Analysis from Social Media Disclosures

Victoria University and University of Melbourne

As the increasing volume of abuse-related posts shared on social media is of interest to the public health sector and family welfare organisations monitoring public health, this study aims to identify such posts and differentiate between child abuse and domestic abuse. Researchers first analysed psycholinguistic, textual and somatic features in social media posts disclosing child abuse and domestic abuse in order to find out what characterises such posts, and then deployed machine learning classifiers to examine the extracted features' predictive power. The abuse-related posts had higher proportions of features such as anxiety, anger, sadness, sexual health, and death, and carried a large amount of negative emotion.

2018
Artificial intelligence, Machine learning, Prevention, Child Sexual Abuse Material (CSAM)
Research (peer reviewed)
Therabot: An Adaptive Therapeutic Support Robot

Institute of Electrical and Electronics Engineers (IEEE) and Mississippi State University

Therabot is a socially assistive robot designed to provide therapeutic support at home and in counselling settings. As it is specifically designed for people living with post-traumatic stress disorder (PTSD), it is suitable for children who have been subjected to sexual abuse. The robot resembles an animal and was developed through an iterative design process in which both therapists and trauma survivors were consulted. Through touch sensing, Therabot can deduce the user's stress level and provide support accordingly. The researchers plan to develop the robot further by integrating AI, allowing it to adapt and customise its interactions to the preferences of each user.

2018
Post-crime efforts, Child-focused, Robotics, Post-traumatic stress (PTS)
Research (peer reviewed)
Predicting Prolific Live Streaming of Child Sexual Abuse

Australian Institute of Criminology

Report by the Australian Institute of Criminology that analyses child sexual abuse (CSA) and financial transactions through machine learning in order to identify characteristics of offenders who live stream CSA in high volumes. The analysis showed that factors such as frequency and monetary value are important and have implications for identifying these crimes among financial transaction data. Furthermore, offenders did not appear to have engaged in violent offending, but rather had a criminal history of low-harm offences.

2021
Financial transactions, Artificial intelligence, Supervised learning, Machine learning, Criminal investigation, Child Sexual Abuse Material (CSAM)
Research (peer reviewed)
An Integrative Review of Historical Technology and Countermeasure Usage Trends in Online Child Sexual Exploitation Material Offenders

University of Edinburgh and George Mason University

A systematic literature review of the technologies that offenders of online child sexual exploitation material (CSEM) make use of. The literature review shows that offenders tend not to be 'early adopters' of new technologies, but rather continue to use trusted technologies even after higher-functioning options are introduced. In addition to technologies utilised to access CSEM, offenders also employ countermeasures in order to avoid detection, for example encryption and anonymous browsers, such as the Tor browser, that protect the user's identity and physical location. The researchers found that only a few offenders encrypt manually. With encryption built into technologies and the ability to use the Tor browser to visit traditional (non-dark) websites, much of the prior research into countermeasures is dated and may not be indicative of current behaviours.

2020
Child Sexual Abuse Material (CSAM), Perpetrators, Criminal investigation, Prevention
Research (peer reviewed)
How Facial Recognition Is Helping Fight Child Sexual Abuse

Griffeye

The article explores how facial recognition systems using machine learning can flag material depicting victims or criminals known by law enforcement. The system can also filter and group images that belong to the same case, which makes police officers’ work of going through child sexual abuse material (CSAM) more efficient as they do not need to jump in blindly without knowledge of what could be found or if there are any linking factors. Facial recognition systems have improved significantly in the past few years, especially when applied in uncontrolled circumstances, for example when a person’s face is seen from the side or in motion. Moreover, the systems have also become better at identifying and matching faces of children at different ages, which was almost impossible for the technology a few years ago. Today, systems designed specifically for CSAM exist and their impact has been transformative for the police forces embracing them.

2020
Child Sexual Abuse Material (CSAM), Criminal investigation, Machine learning, Artificial intelligence
Other publication
Sperm hunting on optical microscope slides for forensic analysis with deep convolutional networks – a feasibility study

Zurich Institute of Forensic Medicine

Finding sperm cells under an optical microscope is a task which is time-consuming and difficult for a human being. This study shows how convolutional neural networks can be used to speed up the process. Two networks were tested based on the VGG19 architecture with a resulting accuracy of over 90%. Human oversight is still necessary to rule out false positives. The oversight is aided by a simple visual guide that can be provided to the overseeing experts which helps determine the accuracy of any given result.

2022
Perpetrators
Research (peer reviewed)
A Practitioner Survey Exploring the Value of Forensic Tools, AI, Filtering, & Safer Presentation for Investigating Child Sexual Abuse Material (CSAM)

University of New Haven / Digital Forensic Research Workshop

This survey investigates what value those investigating CSAM ascribe to the different tools and technologies they use in their work. Effective tools are crucial not only for detection but also for reducing the potential harm of being exposed to such material over long periods of time. The survey found that filtering technologies are considered more important than safe viewing technologies, and that false positives are a bigger problem than false negatives. As far as resources are concerned, the field still lacks personnel, time, and money. Furthermore, it was found that practitioners are still not up to date on data science and AI, something that should be improved in order to deal with the large amounts of data they face. The biggest need practitioners have that AI can help with is tools that automatically detect child nudity, age, and skin tone.

2019
Artificial intelligence, Prevention, Neural networks
Research (peer reviewed)
Can Artificial Intelligence Achieve Human-Level Performance? A Pilot Study of Childhood Sexual Abuse Detection in Self-Figure Drawings

University of Haifa

Delayed disclosure of childhood sexual abuse can range from one year to disclosure in adulthood, to no disclosure at all. Against this background, the ‘Draw-A-Person’ intervention has been developed by psychologists in order to detect indicators of sexual abuse in children’s self-portraits. In the present study, a convolutional neural network (CNN) was deployed to detect such indicators through image analysis. While human experts outperformed the CNN, the system still demonstrated high accuracy, suggesting that CNNs, when further developed, have potential to detect child sexual abuse.

2020
Artificial intelligence, Perpetrators, Criminal investigation, Supervised learning, Neural networks, Machine learning, Child-focused, Child Sexual Abuse Material (CSAM)
Research (peer reviewed)
www.childhood.org

Childhood
linkedin
My AI
Facebook

Blasieholmstorg 8, 111 48 Stockholm

+46(0)8-551 175 00

info@childhood.org