BI Norwegian Business School, Norwegian University of Science and Technology
This literature review details how the development of digital technology and services affects both the possibilities for accessing and sharing child sexual abuse material (CSAM) and the possibilities for perpetrators to establish contact. The report underlines the need for collaboration between the private sector, civil society and law enforcement as a way of making bespoke technology available and sharing relevant information and intelligence.
Inonu University
The report describes an experiment in which algorithms predicted, with 99.2% accuracy, whether a child had PTSD or major depression, based solely on data about the child and the assault. This means that no information was required beyond what is usually collected in child sexual abuse cases. The results indicate that AI could be used soon after an assault to predict whether PTSD or major depression is likely to develop.
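The report does not reproduce its model code; purely as a hedged illustration of the general setup, the sketch below trains a standard tabular classifier on routinely collected case variables. The file name, feature columns and choice of random forest are assumptions, not the study's actual pipeline.

```python
# Hypothetical sketch: predicting later PTSD/major depression from routinely
# collected case data. File name, columns and model choice are illustrative only.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("case_data.csv")            # hypothetical: one row per case
X = df.drop(columns=["developed_ptsd"])      # predictors: data about the child and the assault
y = df["developed_ptsd"]                     # label: later PTSD diagnosis (0/1)

model = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```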
Oslo Metropolitan University
The report describes how virtual avatars of children can be used to train police officers to interview children about sexual abuse. Furthermore, such interviews can be used to create synthetic data based on interviews with children.
Auckland University of Technology
The researchers investigate how well AI can identify people based solely on the backs of their hands. They achieve accuracies of over 99.9%, indicating that such tools could be used to identify perpetrators in child sexual abuse material.
San Jose State University
The report examines how biometric (facial) search combined with voice can be used to identify abuse survivors and perpetrators, by searching for matches in other child sexual abuse material videos.
Griffith University
A study of what police officers believe they need in order to counter child sexual abuse material more effectively. The main forms of training and support that respondents felt organisations should provide were communication, training on different types of perpetrators, and psychological support.
The report describes how computer-generated child abuse material is becoming increasingly common, including through deep fake technology. The author argues that the US law 18 U.S. Code § 2256 should be revised to prohibit computer-generated abuse material as well.
Humboldt-Universität zu Berlin
A report on how deep learning transformer models can classify grooming attempts. The authors created a dataset that was subsequently used by Viktor Bowallius and David Eklund in the report Grooming detection of chat segments using transformer models, where an F1 score of 0.98 was achieved.
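Neither report's code is shown here; as a minimal sketch of how a fine-tuned transformer classifier is typically applied to chat segments, the example below uses the Hugging Face transformers pipeline API. The checkpoint path and labels are placeholders, not the authors' model.

```python
# Minimal sketch of transformer-based grooming detection on chat segments.
# "path/to/grooming-classifier" stands in for a checkpoint fine-tuned on
# labelled chat data; it is a placeholder, not the authors' published model.
from transformers import pipeline

classifier = pipeline("text-classification", model="path/to/grooming-classifier")

segment = "example chat segment to score"
print(classifier(segment))   # e.g. [{'label': 'grooming', 'score': 0.97}]
```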
Singidunum University
The article looks at how AI can analyse activity on mobile screens and audio ports to detect bullying, pornography and sexual harassment. Unlike previous experiments, this AI sees all activity as the user sees it, rather than only processing text or images extracted from the screen. The model achieves an average accuracy of 88% when classifying texts, for example as sexist or racist, and 95% accuracy in detecting pornography.
The report evaluates how well an AI can detect child sexual abuse via surveillance cameras.
Adhiyamaan College of Engineering
A report on a downloadable AI that analyses browser history to assess the risk that a child may be sexually exploited. The report does not provide concrete figures on how well the tool works.
University of Warwick
AI is tested for estimating the age of children. It worked reasonably well for ages 10-20, with a mean error of about 2.5 years, but less well for children aged 0-10.
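The paper's own architecture is not reproduced here; as an assumed sketch of the task framing, age estimation is often treated as regression on top of a pretrained image backbone, with the reported mean error corresponding to mean absolute error in years. The backbone choice and loss below are illustrative only.

```python
# Illustrative sketch: age estimation as image regression (not the study's model).
import torch
import torch.nn as nn
from torchvision import models

# Pretrained backbone with its classification head replaced by one regression output.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, 1)   # predicted age in years

criterion = nn.L1Loss()   # mean absolute error, i.e. mean error in years

def age_loss(images: torch.Tensor, true_ages: torch.Tensor) -> torch.Tensor:
    """Loss for one batch of face crops with known ages."""
    predicted_ages = backbone(images).squeeze(1)
    return criterion(predicted_ages, true_ages)
```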
Analysis of how well AI can predict the risk of child maltreatment (Predictive Risk Modelling, PRM), based on such a tool used in New Zealand.
Report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.
In-depth report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.
Article on the sexual exploitation of children online and how it is linked to financial crimes.
Briefing on the development of online sexual abuse, and what is being done to combat it.
Department of Psychology, University of Gothenburg
This study aims to describe online offenders' interactions with actual children when inciting them to engage in online sexual activity.
ITU/UNESCO Broadband Commission for Sustainable Development
The report describes the causes and effects of the risks that children are exposed to online. It also proposes specific solutions that can prevent these risks, with child online safety framed as a global goal to be achieved through multi-stakeholder cooperation.
The summary describes key manifestations of sexual exploitation of children (SEC), which includes the exploitation of children in prostitution, the sale and trafficking of children for sexual purposes, online child sexual exploitation (OCSE), the sexual exploitation of children in travel and tourism (SECTT) and some forms of child, early and forced marriages (CEFM).
About Childhood's work to prevent and stop child sexual abuse.
Review of police and prosecutors' work against internet-related sexual abuse of children.
A report on secondary school students' experiences of sexual abuse and sexual exploitation in Sweden 2020/2021.
Göteborgs Universitet
Article on research into online sexual abuse and how it affects children.
Scientific report on a study at Karolinska Institutet in which an online CBT (cognitive behavioural therapy) programme is offered to people who produce, view and distribute child sexual abuse material on the darknet. The study has received funding and capacity support from Childhood.
The report is based on 8,484 survey responses from users of abuse material on the dark web and provides unique insight into the behavioural patterns of perpetrators.
A report providing an overview of child sexual abuse, with a focus on the internet and the impact of COVID-19.
Compilation on how policies on children's online safety can be developed.
Summary from the WeProtect Global Alliance of global trends related to child sexual abuse and exploitation.
Report discussing practical and cost-effective solutions to break the cycle of sexual violence against children.
A survey of what efforts currently exist in AI to counter and prevent child sexual abuse, and what AI could be used for.
The Economist Intelligence Unit
Statistics and baselines for 40 countries on child sexual abuse.
The idea for Stella Polaris emerged from a roundtable meeting in autumn 2019 that brought together leading experts in AI and child safety.
World Childhood Foundation
The focus of this report is to provide insight into grooming and the sexual extortion of children in Sweden. The report presents judgments from court cases decided in 2020 and 2021 concerning children subjected to internet-related sexual offences. Judgments are public documents produced by the court and contain, for example, the evidence and the court's reasoning in a specific case.
Institute of Electrical and Electronics Engineers (IEEE) and Mississippi State University
Therabot is a socially assistive robot designed to provide therapeutic support at home and in counselling settings. As it is specifically designed for people living with post-traumatic stress disorder (PTSD), it is suitable for children who have been subjected to sexual abuse. The robot resembles an animal and was developed through an iterative design process in which both therapists and trauma survivors were consulted. Through touch sensing, Therabot can deduce the user’s stress level and provide support accordingly. The researchers plan to develop the robot further by integrating AI so that it can adapt and customise its interactions to the preferences of each user.
Australian Institute of Criminology
Report by the Australian Institute of Criminology that analyses child sexual abuse (CSA) and financial transactions through machine learning in order to identify characteristics of offenders who live stream CSA in high volumes. The analysis showed that factors such as frequency and monetary value are important and have implications for identifying these crimes among financial transaction data. Furthermore, offenders did not appear to have engaged in violent offending, but rather had a criminal history of low-harm offences.
University of Edinburgh and George Mason University
A systematic literature review of the technologies that offenders of online child sexual exploitation material (CSEM) make use of. The review shows that offenders tend not to be ‘early adopters’ of new technologies, but rather continue to use trusted technologies even after higher-functioning options are introduced. In addition to the technologies used to access CSEM, offenders also employ countermeasures to avoid detection, for example encryption and anonymous browsers, such as the Tor browser, that protect the user’s identity and physical location. The researchers found that only a few offenders encrypt manually. With encryption built into technologies and the ability to use the Tor browser to visit traditional (non-dark-web) websites, much of the prior research into countermeasures is dated and may not be indicative of current behaviours.
Griffeye
The article explores how facial recognition systems using machine learning can flag material depicting victims or criminals known by law enforcement. The system can also filter and group images that belong to the same case, which makes police officers’ work of going through child sexual abuse material (CSAM) more efficient as they do not need to jump in blindly without knowledge of what could be found or if there are any linking factors. Facial recognition systems have improved significantly in the past few years, especially when applied in uncontrolled circumstances, for example when a person’s face is seen from the side or in motion. Moreover, the systems have also become better at identifying and matching faces of children at different ages, which was almost impossible for the technology a few years ago. Today, systems designed specifically for CSAM exist and their impact has been transformative for the police forces embracing them.
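Griffeye's system itself is proprietary and not described in code; as a generic illustration of the underlying idea, face images are mapped to embedding vectors by a recognition model and then grouped by similarity so that material depicting the same person ends up in the same cluster. The embedding dimension, threshold and random placeholder vectors below are assumptions.

```python
# Generic illustration of grouping images by face similarity (not Griffeye's system).
# Assumes embeddings have already been extracted by a face-recognition model;
# random vectors stand in for them here.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import normalize

rng = np.random.default_rng(0)
embeddings = normalize(rng.normal(size=(200, 128)))   # placeholder: one 128-d vector per detected face

# Faces whose embeddings lie within the distance threshold are grouped as the same identity.
clustering = DBSCAN(eps=0.6, min_samples=2).fit(embeddings)
n_identities = len(set(clustering.labels_)) - (1 if -1 in clustering.labels_ else 0)
print("Identity clusters found:", n_identities)
```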
Zurich Institute of Forensic Medicine
Finding sperm cells under an optical microscope is a task which is time-consuming and difficult for a human being. This study shows how convolutional neural networks can be used to speed up the process. Two networks were tested based on the VGG19 architecture with a resulting accuracy of over 90%. Human oversight is still necessary to rule out false positives. The oversight is aided by a simple visual guide that can be provided to the overseeing experts which helps determine the accuracy of any given result.
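The study's training code is not reproduced here; as a hedged sketch of the VGG19-based setup it describes, a pretrained VGG19 can have its final layer replaced with a two-class output (sperm cell present or absent). Which layers are frozen and all other details below are assumptions.

```python
# Hedged sketch of a VGG19-based binary classifier (sperm cell present / absent);
# the frozen layers and head size are assumptions, not the paper's exact setup.
import torch.nn as nn
from torchvision import models

model = models.vgg19(weights=models.VGG19_Weights.DEFAULT)

# Swap the 1000-class ImageNet output for a two-class head.
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)

# Optionally freeze the convolutional features and fine-tune only the classifier.
for param in model.features.parameters():
    param.requires_grad = False
```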
University of New Haven / Digital Forensic Research Workshop
This survey investigates what value those investigating CSAM ascribe to the different tools and technologies they use in their work. Effective tools are crucial not only for detection but also for reducing the potential harm of being exposed to such material over long periods of time. The survey found that filtering technologies are more important than safe viewing technologies and that false positives are a bigger problem than false negatives. As far as resources are concerned there is still a lack of personnel, time, and money in the field. Furthermore, it was found that practitioners are still not up-to-date on data science and AI; something which should be improved in order to deal with the large amount of data that they face. The biggest need practitioners have which AI can help with is tools that automatically detect child nudity, age, and skin tones.
University of Haifa
Delayed disclosure of childhood sexual abuse can range from one year to disclosure in adulthood, to no disclosure at all. Against this background, the ‘Draw-A-Person’ intervention has been developed by psychologists in order to detect indicators of sexual abuse in children’s self-portraits. In the present study, a convolutional neural network (CNN) was deployed to detect such indicators through image analysis. While human experts outperformed the CNN, the system still demonstrated high accuracy, suggesting that CNNs, when further developed, have potential to detect child sexual abuse.