
Investigation

All reports

Name | Organization | Description | File | Tags | Year | Type | Link | Status | Self reference
The idea for Stella Polaris emerged from a roundtable meeting in autumn 2019 that brought together leading experts in AI and child safety.
Prevention, Artificial intelligence
2019
Report
A survey of what efforts currently exist in AI to counter and prevent child sexual abuse, and what AI could be used for.
Prevention, Neural networks, Child Sexual Abuse Material (CSAM)
2019
Report
Statistics and baselines for 40 countries on child sexual abuse.
Prevention, Child Sexual Abuse Material (CSAM)
2019
Report
Report discussing practical and cost-effective solutions to break the cycle of sexual violence against children.
Child Sexual Abuse Material (CSAM), Prevention, Financial transactions
2019
Report
Summary by the WeProtect Global Alliance of global trends related to child sexual abuse and exploitation.
Child Sexual Abuse Material (CSAM), Prevention
2021
Report
A report providing an overview of child sexual abuse, with a focus on the internet and the impact of COVID-19.
Prevention, Child Sexual Abuse Material (CSAM)
2020
Report
The report is based on 8,484 survey responses from users of abuse material on the dark web and provides unique insight into the behavioural patterns of perpetrators.
Perpetrators
2021
Report
Scientific report on a study at Karolinska Institutet where an online-based CBT (cognitive behavioural therapy) programme is offered to people who produce, view and distribute child sexual abuse material on the darknet. The study has received funding and capacity support from Childhood.
Prevention, Perpetrators, Child Sexual Abuse Material (CSAM), Financial transactions
2020
Research (peer reviewed)
Article on research into online sexual abuse and how it affects children.
Child-focused, Child Sexual Abuse Material (CSAM), Prevention
2020
Research (peer reviewed)
A report on secondary school students' experiences of sexual abuse and sexual exploitation in Sweden 2020/2021.
Prevention, Child Sexual Abuse Material (CSAM), Child-focused
2021
Report
Review of police and prosecutors' work against internet-related sexual abuse of children.
Prevention, Child Sexual Abuse Material (CSAM), Prosecution
2021
Report
About Childhood's work to prevent and stop child sexual abuse.
Activity report / Verksamhetsberättelse, Sweden-focused
2022
Report
Prevention, Child Sexual Abuse Material (CSAM)
2015
Report
The summary describes key manifestations of sexual exploitation of children (SEC), which includes the exploitation of children in prostitution, the sale and trafficking of children for sexual purposes, online child sexual exploitation (OCSE), the sexual exploitation of children in travel and tourism (SECTT) and some forms of child, early and forced marriages (CEFM).
Prevention, Child Sexual Abuse Material (CSAM), Perpetrators
2020
Report
The report describes causes and effects of the risks that children are exposed to online. It also proposes specific solutions that can prevent the risks. Child online safety is the global goal to be achieved through multistakeholder cooperation.
Prevention, Perpetrators
2019
Report
This study aims to describe online offenders' interactions with actual children when inciting them to engage in online sexual activity.
Prevention, Child Sexual Abuse Material (CSAM)
2017
Research (peer reviewed)
Briefing on the development of online sexual abuse, and what is being done to combat it.
Prevention, Neural networks
2020
Report
Article on the sexual exploitation of children online and how it is linked to financial crimes.
Prevention, Child Sexual Abuse Material (CSAM), Prosecution
2021
Other publication
In-depth report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.
Child Sexual Abuse Material (CSAM), Prevention
2020
Report
Report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.
Child Sexual Abuse Material (CSAM), Prevention
2020
Report
This paper proposes a CSAM detection intelligence algorithm based on natural language processing and machine learning techniques. The CSAM detection model can be used not only to remove CSAM from online platforms, but also to help determine perpetrator behaviours, provide evidence, and extract new knowledge for hotlines, child agencies, education programs and policy makers.
Child Sexual Abuse Material (CSAM), Clustering/Classification
2023
Report
In this paper, we propose a novel model based on artificial intelligence algorithms to automatically detect CSA text messages in dark web forums. Our algorithms have achieved impressive results in detecting CSAM on the dark web, with a recall rate of 89%, a precision rate of 92.3% and an accuracy rate of 87.6%. Moreover, the algorithms can predict the classification of a post in just 1 microsecond and 0.3 milliseconds on a standard laptop. This makes it possible to integrate our model into social network sites or edge devices for real-time CSAM detection.
Child Sexual Abuse Material (CSAM), Machine learning, Deep Learning
2023
Report
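The recall, precision and accuracy figures quoted in the entry above are standard classification metrics derived from a confusion matrix. A minimal sketch of how they are computed — the counts below are invented for illustration, not the paper's data:

```python
def classification_metrics(tp, fp, fn, tn):
    """Compute precision, recall and accuracy from confusion-matrix counts.

    tp/fp/fn/tn: true positives, false positives, false negatives, true negatives.
    """
    precision = tp / (tp + fp)          # share of flagged posts that were truly CSAM
    recall = tp / (tp + fn)             # share of true CSAM posts that were caught
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy

# Hypothetical counts for a CSAM-post classifier on a labelled test set
p, r, a = classification_metrics(tp=89, fp=11, fn=11, tn=89)
print(f"precision={p:.2f} recall={r:.2f} accuracy={a:.2f}")  # → 0.89 each
```

Note that recall and precision can diverge sharply on imbalanced data, which is why the paper reports all three rather than accuracy alone.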
The research team proposed a CSAM detection intelligence system. The system uses a manually labelled dataset to train, evaluate and select an efficient CSAM classification model. After identifying CSAM creators and victims through CSAM posts on the dark web, the team analyses the material with a classifier, visualising and uncovering information about the behaviours of CSAM creators and victims.
Child Sexual Abuse Material (CSAM), Perpetrators, Clustering/Classification
2023
Research (peer reviewed)
This report aims to help Trust & Safety teams better understand the challenges presented by the GenAI ecosystem, by providing:
• A primer on GenAI functionality and its impact on Trust & Safety;
• Examples of GenAI exploitation, resulting in the mass creation of dangerous content across a range of abuse areas;
• A review and analysis of emerging trends in child predator activity to create GenAI-produced CSAM, and the dissemination of new exploitative methodologies that enable additional threat actors to engage in online harm at scale;
• A regulatory overview examining platform accountability for GenAI-produced malicious content.
Artificial intelligence, Child Sexual Abuse Material (CSAM), Generative AI
2023
Report
This report provides an initial assessment of the threat landscape emanating from online child predator communities as they focus attention on the production of child sexual abuse material (CSAM) at scale. This report offers a sample of:
• Trending behavioral developments within child predator communities as they assess GenAI;
• A review of GenAI image-based CSAM and the methods threat actors employ to produce it;
• A brief on GenAI text-based CSAM and the methods employed to produce it;
• AI models in circulation that allow specific types of CSAM to be produced;
• The legal context of GenAI CSAM.
Generative AI, Child Sexual Abuse Material (CSAM), Artificial intelligence
2023
Report
This report sheds light on the complex role that technology plays in the creation and sharing of online sexual abuse materials and the critical role of technology in creating counter solutions to detect and prevent abuse.
Internet, Child Sexual Abuse Material (CSAM), Prevention
2023
Research (peer reviewed)
In 2023, the Internet Watch Foundation (IWF) has been investigating its first reports of child sexual abuse material (CSAM) generated by artificial intelligence (AI). In total, 20,254 AI-generated images were found to have been posted to one dark web CSAM forum in a one-month period. Of these, 11,108 images were selected for assessment by IWF analysts. The report details the growing issue of AI generated CSAM and recommends actions to be taken by government, law enforcement and tech companies.
Artificial intelligence, Perpetrators, Prevention, Child Sexual Abuse Material (CSAM), Generative AI
2023
Report
Depression and post-traumatic stress disorder (PTSD) are among the most common psychiatric disorders observed in children and adolescents exposed to sexual abuse. The present study aimed to investigate the effects of many factors such as the characteristics of a child, abuse, and the abuser, family type of the child, and the role of social support in the development of psychiatric disorders using machine learning techniques.
Clustering/Classification, Machine learning, Post-crime efforts, Child-focused, Post-traumatic stress (PTS)
2021
Research (peer reviewed)
The aim of this research is to identify biometric traits in dorsal hand images, which are the most commonly documented aspect of perpetrators in child sexual abuse imagery. In this work, the researchers propose hand-based person identification by learning both global and local deep feature representations. Using a Global and Part-Aware Network (GPA-Net), the researchers created global and local branches on the conv-layer for learning robust, discriminative global and part-level features. Similar research has been conducted at Auckland University.
Artificial intelligence, Prosecution, Child Sexual Abuse Material (CSAM), Machine learning
2022
Research (peer reviewed)
The aim of this research is to provide techniques that increase children’s security on online chat platforms. The research project divides the online grooming detection problem into several subproblems, including author profiling, predatory conversation detection, predatory identification, and data limitations issues. The present article presents a literature review of available data sets and grooming detection techniques.
Detection, Clustering/Classification
2022
Research (peer reviewed)
This literature review details how the development of digital technology and services impacts both the possibilities for accessing and sharing child sexual abuse material (CSAM) and the possibilities for perpetrators to establish contact. The report underlines the need for collaboration between the private sector, civil society and law enforcement as a way of making bespoke technology available and sharing relevant information and intelligence.
Internet, Policy, Detection, Prosecution, Post-crime efforts, Criminal investigation
2023
Report
The focus of the present report is to provide insight into grooming and sexual extortion of children in Sweden. The report presents judgments from court cases decided in 2020 and 2021 concerning children who have been subjected to internet-related sexual offences. Judgments are public documents produced by the court and contain, for example, evidence and the court's reasoning in a specific case.
Sweden-focused, Prosecution, Internet
2022
Report
As the increasing volume of abuse-related posts shared on social media is of interest for the public health sector and family welfare organisations monitoring public health, this study aims to identify such posts and differentiate between child abuse and domestic abuse. Researchers first analysed psycholinguistic, textual and somatic features in social media posts disclosing child abuse and domestic abuse in order to find out what characterises such posts, and then deployed machine learning classifiers to examine the extracted features' predictive power. The abuse-related posts had higher proportions of features such as anxiety, anger, sadness, sexual health, and death, and carried a lot of negative emotion.
Artificial intelligence, Machine learning, Prevention, Child Sexual Abuse Material (CSAM)
2018
Research (peer reviewed)
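The feature-based approach in the entry above — measuring the proportion of emotion-category words in a post and feeding those proportions to a classifier — can be sketched in a few lines. This is a minimal illustration only: the tiny lexicons and example text below are invented stand-ins for the LIWC-style categories such studies typically use.

```python
# Tiny illustrative emotion lexicons (invented; real studies use
# validated word lists with thousands of entries per category).
LEXICONS = {
    "anger":   {"angry", "furious", "hate"},
    "sadness": {"sad", "crying", "miserable"},
    "anxiety": {"afraid", "worried", "scared"},
}

def emotion_proportions(text):
    """Return the share of tokens that fall in each emotion category."""
    tokens = text.lower().split()
    n = len(tokens) or 1  # avoid division by zero on empty input
    return {cat: sum(t in words for t in tokens) / n
            for cat, words in LEXICONS.items()}

# Each post becomes a small numeric feature vector that a downstream
# classifier (e.g. logistic regression) can be trained on.
feats = emotion_proportions("I am scared and sad every day")
print(feats)
```

Posts disclosing abuse would, per the study's finding, show elevated proportions in categories like anxiety, anger and sadness relative to ordinary posts.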
Therabot is a socially assistive robot designed to provide therapeutic support at home and in counselling settings. As it is specifically designed for those living with post-traumatic stress disorder (PTSD) it is suitable for children who have been subjected to sexual abuse. The robot resembles an animal and was developed through an iterative design process; both therapists and trauma survivors were consulted. Through touch sensing Therabot can deduce the user’s stress level and provide support accordingly. The researchers plan on developing the robot further by integrating AI in order to allow the robot to adapt and customise its interactions to the preferences of each user.
Post-crime efforts, Child-focused, Robotics, Post-traumatic stress (PTS)
2018
Research (peer reviewed)
Report by the Australian Institute of Criminology that analyses child sexual abuse (CSA) and financial transactions through machine learning in order to identify characteristics of offenders who live stream CSA in high volumes. The analysis showed that factors such as frequency and monetary value are important and have implications for identifying these crimes among financial transaction data. Furthermore, offenders did not appear to have engaged in violent offending, but rather had a criminal history of low-harm offences.
Financial transactions, Artificial intelligence, Supervised learning, Machine learning, Criminal investigation, Child Sexual Abuse Material (CSAM)
2021
Research (peer reviewed)
A systematic literature review of the technologies that offenders of online child sexual exploitation material (CSEM) make use of. The literature review shows that offenders tend not to be 'early adopters' of new technologies, but rather continue to use trusted technologies even after higher-functioning options are introduced. In addition to technologies utilised to access CSEM, offenders also employ countermeasures in order to avoid detection, for example encryption and anonymous browsers, such as the Tor browser, that protect the user's identity and physical location. The researchers found that only a few offenders encrypt manually. With encryption built into technologies and the ability to use the Tor browser to visit traditional (non-dark) websites, much of the prior research into countermeasures is dated and may not be indicative of current behaviours.
Child Sexual Abuse Material (CSAM), Perpetrators, Criminal investigation, Prevention
2020
Research (peer reviewed)
The article explores how facial recognition systems using machine learning can flag material depicting victims or criminals known by law enforcement. The system can also filter and group images that belong to the same case, which makes police officers’ work of going through child sexual abuse material (CSAM) more efficient as they do not need to jump in blindly without knowledge of what could be found or if there are any linking factors. Facial recognition systems have improved significantly in the past few years, especially when applied in uncontrolled circumstances, for example when a person’s face is seen from the side or in motion. Moreover, the systems have also become better at identifying and matching faces of children at different ages, which was almost impossible for the technology a few years ago. Today, systems designed specifically for CSAM exist and their impact has been transformative for the police forces embracing them.
Child Sexual Abuse Material (CSAM), Criminal investigation, Machine learning, Artificial intelligence
2020
Other publication
Finding sperm cells under an optical microscope is a task which is time-consuming and difficult for a human being. This study shows how convolutional neural networks can be used to speed up the process. Two networks were tested based on the VGG19 architecture with a resulting accuracy of over 90%. Human oversight is still necessary to rule out false positives. The oversight is aided by a simple visual guide that can be provided to the overseeing experts which helps determine the accuracy of any given result.
Perpetrators
2022
Research (peer reviewed)
This survey investigates what value those investigating CSAM ascribe to the different tools and technologies they use in their work. Effective tools are crucial not only for detection but also for reducing the potential harm of being exposed to such material over long periods of time. The survey found that filtering technologies are more important than safe viewing technologies and that false positives are a bigger problem than false negatives. As far as resources are concerned there is still a lack of personnel, time, and money in the field. Furthermore, it was found that practitioners are still not up-to-date on data science and AI; something which should be improved in order to deal with the large amount of data that they face. The biggest need practitioners have which AI can help with is tools that automatically detect child nudity, age, and skin tones.
Artificial intelligence, Prevention, Neural networks
2019
Research (peer reviewed)
Delayed disclosure of childhood sexual abuse can range from one year to disclosure in adulthood, to no disclosure at all. Against this background, the ‘Draw-A-Person’ intervention has been developed by psychologists in order to detect indicators of sexual abuse in children’s self-portraits. In the present study, a convolutional neural network (CNN) was deployed to detect such indicators through image analysis. While human experts outperformed the CNN, the system still demonstrated high accuracy, suggesting that CNNs, when further developed, have potential to detect child sexual abuse.
Artificial intelligence, Perpetrators, Criminal investigation, Supervised learning, Neural networks, Machine learning, Child-focused, Child Sexual Abuse Material (CSAM)
2020
Research (peer reviewed)
The report examines how biometric (facial) search combined with voice can be used to identify abuse survivors and perpetrators, by searching for matches in other child sexual abuse material videos.
Child Sexual Abuse Material (CSAM), Perpetrators
2022
Research (peer reviewed)
The researchers are investigating how well AI can be applied to identify people based solely on the backs of their hands. They achieve accuracies of over 99.9%, indicating that such tools can be used to identify perpetrators in child sexual abuse material. Similar projects have been conducted at Lancaster University.
Neural networks, Perpetrators
2021
Research (peer reviewed)
A study of what police officers believe they need in order to counter child sexual abuse material more effectively. The main forms of training and support that respondents said organisations should provide included communication, training on different types of perpetrators, and psychological support.
Child Sexual Abuse Material (CSAM), Prosecution, Perpetrators
2022
Research (peer reviewed)
The report describes how virtual avatars of children can be used to train police officers to interview children about sexual abuse. Furthermore, the interviews can be used to create synthetic data, avoiding the need for interviews with real children.
Criminal investigation, Prosecution
2021
Research (peer reviewed)
A report on how deep learning transformer models can classify grooming attempts. The authors created a dataset that was then used by Viktor Bowallius and David Eklund in the report Grooming detection of chat segments using transformer models, where an F1 score of 0.98 was achieved.
Natural Language Processing, Clustering/Classification
2021
Research (peer reviewed)
The report describes an experiment in which algorithms were able to predict whether a child had PTSD or major depression, with 99.2% accuracy, based solely on data about the person and the assault. This meant that no additional information was required beyond what is usually collected in child sexual abuse cases. The results indicate that it is possible to use AI early after an abuse has occurred to predict whether PTSD or major depression is likely to develop.
Child-focused, Post-traumatic stress (PTS), Post-crime efforts
2020
Research (peer reviewed)
Compilation of how policies for children's online safety can be developed.
Prevention, Neural networks
2022
Report
The report describes how computer-generated child abuse material is becoming increasingly common, including through deep fake technology. The author argues that the US law 18 U.S. Code § 2256 should be revised to prohibit computer-generated abuse material as well.
Child Sexual Abuse Material (CSAM), Prosecution
2021
Report
The article looks at how AI can analyse activity on mobile screens and audio ports to detect bullying, pornography and sexual harassment. Unlike previous experiments, this AI sees all activity as the user sees it, rather than only input in the form of texts or images retrieved from the screen and then processed. The model achieves an average accuracy of 88% when classifying texts, such as classifying sexism and racism. Furthermore, the model achieves 95% accuracy in detecting pornography.
Prevention, Clustering/Classification, Neural networks
2021
Research (peer reviewed)
The report evaluates how well an AI can detect child sexual abuse via surveillance cameras.
Child Sexual Abuse Material (CSAM), Neural networks
2021
Research (peer reviewed)
A report on a downloadable AI tool that analyses browser history to assess the risk that a child may be sexually exploited. The report does not provide concrete figures on how well the tool works.
Child Sexual Abuse Material (CSAM), Prevention, Neural networks
2022
Research (peer reviewed)
AI is tested for classifying children's ages. It worked reasonably well for ages 10-20, with a mean error of about 2.5 years, but less well for children aged 0-10.
Artificial intelligence, Neural networks
2022
Research (peer reviewed)
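The "mean error of about 2.5 years" quoted in the age-classification entry above is most likely the mean absolute error (MAE) between predicted and true ages. A minimal sketch with invented numbers (not the study's data):

```python
def mean_absolute_error(y_true, y_pred):
    """Average absolute difference between true and predicted ages, in years."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical predictions for five subjects in the 10-20 age range
true_ages = [10, 12, 15, 18, 20]
pred_ages = [13, 11, 17, 15, 21]
print(mean_absolute_error(true_ages, pred_ages))  # → 2.0
```

An MAE of 2.5 years means the model's age estimate is, on average, two and a half years off, in either direction.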
Analysis of how well AI can predict the risk of child maltreatment (Predictive Risk Modelling, PRM), based on such a tool used in New Zealand.
Prevention, Neural networks
2015
Research (peer reviewed)