Stella Polaris Knowledge Center

All reports

Name
Organization
Description
File
Tags
Year
Type
Link
Status
Self reference
AI Roundtable 2019 Summary

World Childhood Foundation

The idea for Stella Polaris emerged from a roundtable meeting in autumn 2019 that brought together leading experts in AI and child safety.

AI Roundtable 2019 Summary.pdf
Prevention, Artificial intelligence
2019
Report
AI Roundtable 2019 Summary
Artificial Intelligence: combating online sexual abuse of children

Bracket Foundation

A survey of the AI efforts that currently exist to counter and prevent child sexual abuse, and of what AI could be used for.

AI-Combating-online-sexual-abuse-of-children-Bracket-Foundation-2019.pdf
Prevention, Neural networks, Child Sexual Abuse Material (CSAM)
2019
Report
Artificial Intelligence: combating online sexual abuse of children
Out of the Shadows Whitepaper

The Economist Intelligence Unit

Statistics and baselines for 40 countries on child sexual abuse.

Out-the-Shadows-Whitepaper.pdf
Prevention, Child Sexual Abuse Material (CSAM)
2019
Report
Out of the Shadows Whitepaper
What Works to Prevent Sexual Violence Against Children

Together for Girls

Report discussing practical and cost-effective solutions to break the cycle of sexual violence against children.

What-Works-to-Prevent-Sexual-Violence-Against-Children-Evidence-Review.pdf
Child Sexual Abuse Material (CSAM), Prevention, Financial transactions
2019
Report
What Works to Prevent Sexual Violence Against Children
Global Threat Assessment 2023

WeProtect Global Alliance

Summary by the WeProtect Global Alliance of global trends related to child sexual abuse and exploitation.

https://www.weprotect.org/wp-content/uploads/Global-Threat-Assessment-2023-English.pdf
Child Sexual Abuse Material (CSAM), Prevention
2023
Report
Global Threat Assessment 2023
NetClean Report 2020 - Covid-19 Impact

NetClean

A report providing an overview of child sexual abuse, with a focus on the internet and the impact of COVID-19.

NetClean Report 2020.pdf
Prevention, Child Sexual Abuse Material (CSAM)
2020
Report
NetClean Report 2020 - Covid-19 Impact
ReDirection survey report

Suojellaan Lapsia

The report is based on 8,484 survey responses from users of abuse material on the dark web and provides unique insight into the behavioural patterns of perpetrators.

ReDirection Survey Report.pdf
Perpetrators
2021
Report
ReDirection survey report
Illegal Online Sexual Behavior During the COVID-19 Pandemic

Karolinska Institutet

Scientific report on a study at Karolinska Institutet in which an online-based CBT (cognitive behavioural therapy) programme is offered to people who produce, view and distribute child sexual abuse material on the darknet. The study has received funding and capacity support from Childhood.

Illegal Online Sexual Behavior During the COVID-19 Pandemic.pdf
Prevention, Perpetrators, Child Sexual Abuse Material (CSAM), Financial transactions
2020
Research (peer reviewed)
Illegal Online Sexual Behavior During the COVID-19 Pandemic
Sexuella övergrepp via nätet lika allvarliga som IRL

Göteborgs Universitet

Article on research into online sexual abuse and how it affects children.

https://www.forskning.se/2020/07/07/sexuella-overgrepp-via-natet-lika-allvarliga-som-irl/#
Child-focused, Child Sexual Abuse Material (CSAM), Prevention
2020
Research (peer reviewed)
Sexuella övergrepp via nätet lika allvarliga som IRL
Unga sex och internet #MeToo

Stiftelsen Allmänna Barnhuset

A report on secondary school students' experiences of sexual abuse and sexual exploitation in Sweden 2020/2021.

Unga Sex Och Internet Efter MeToo 2021.pdf
Prevention, Child Sexual Abuse Material (CSAM), Child-focused
2021
Report
Unga sex och internet #MeToo
Internetrelaterade sexuella övergrepp mot barn

Riksrevisionen

Review of police and prosecutors' work against internet-related sexual abuse of children.

Internetrelaterade sexuella övergrepp mot barn.pdf
Prevention, Child Sexual Abuse Material (CSAM), prosecution
2021
Report
Internetrelaterade sexuella övergrepp mot barn
Childhood Activity Report

World Childhood Foundation

About Childhood's work to prevent and stop child sexual abuse.

World Childhood Foundation Activity Report 2022 ENG.pdf, World Childhood Foundation Activity Report 2023 ENG.pdf
Activity report/Verksamhetsberättelse, Sweden-focused
2023
Report
Childhood Activity Report
Study on the Effects of New Information Technologies on the Abuse and Exploitation of Children

UNODC

Study_on_the_Effects.pdf
Prevention, Child Sexual Abuse Material (CSAM)
2015
Report
Study on the Effects of New Information Technologies on the Abuse and Exploitation of Children
Summary Paper on Online Child Sexual Exploitation

ECPAT

The summary describes key manifestations of sexual exploitation of children (SEC), which includes the exploitation of children in prostitution, the sale and trafficking of children for sexual purposes, online child sexual exploitation (OCSE), the sexual exploitation of children in travel and tourism (SECTT) and some forms of child, early and forced marriages (CEFM).

ECPAT-Summary-paper-on-Online-Child-Sexual-Exploitation-2020.pdf
Prevention, Child Sexual Abuse Material (CSAM), Perpetrators
2020
Report
Summary Paper on Online Child Sexual Exploitation
Child Online Safety: Minimizing the Risk of Violence, Abuse and Exploitation Online.

ITU/UNESCO Broadband Commission for Sustainable Development

The report describes causes and effects of the risks that children are exposed to online. It also proposes specific solutions that can prevent the risks. Child online safety is the global goal to be achieved through multistakeholder cooperation.

childonlinesafety_report.pdf
Prevention, Perpetrators
2019
Report
Child Online Safety: Minimizing the Risk of Violence, Abuse and Exploitation Online.
Offender strategies for engaging children in online sexual activity

Department of Psychology, University of Gothenburg

This study aims to describe online offenders' interactions with actual children when inciting them to engage in online sexual activity.

Offender strategies.pdf
Prevention, Child Sexual Abuse Material (CSAM)
2017
Research (peer reviewed)
Offender strategies for engaging children in online sexual activity
Curbing the Surge in Online Child Abuse

European Parliament

Briefing on the development of online sexual abuse, and what is being done to combat it.

Curbing the surge in online child abuse.pdf
Prevention, Neural networks
2020
Report
Curbing the Surge in Online Child Abuse
How SARs fight online child sexual abuse by flagging financial crime

Napier

Article on the online sexual exploitation of children and how it is linked to financial crime.

Prevention, Child Sexual Abuse Material (CSAM), prosecution
2021
Other publication
www.napier.ai
How SARs fight online child sexual abuse by flagging financial crime
Action to End Child Sexual Abuse and Exploitation

UNICEF

In-depth report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.

CSAE-Report-v2.pdf
Child Sexual Abuse Material (CSAM), Prevention
2020
Report
Action to End Child Sexual Abuse and Exploitation
Brief version: Action to End Child Sexual Abuse and Exploitation

UNICEF

Report on the extent, nature and consequences of child sexual abuse and exploitation in different contexts. It describes evidence of effective interventions and strategies to prevent and respond to child sexual abuse and exploitation.

CSAE-Brief-v3.pdf
Child Sexual Abuse Material (CSAM), Prevention
2020
Report
Brief version: Action to End Child Sexual Abuse and Exploitation
Artificial intelligence-produced child sexual abuse material: Insights from Dark Web forum posts

Anglia Ruskin University

A study by Anglia Ruskin University’s IPPPRI highlights the growing demand for AI-generated child sexual abuse material (CSAM) on the dark web. Researchers analyzed dark web forums, finding offenders actively using AI to create and share CSAM, teaching themselves through online guides and collaborating with others. The report reveals that offenders are reusing existing material to develop AI-generated content and anticipate advancements in technology to simplify the process. The study underscores the urgent need to understand these practices to prevent and combat AI-facilitated exploitation.

IPPPRI-Insight-No-1-AI-CSAM (2).pdf
Child Sexual Abuse Material (CSAM), Generative AI
2024
Report
www.aru.ac.uk
Stella Polaris Summit 2024 Summary

World Childhood Foundation

In June 2024, World Childhood Foundation invited stakeholders within child rights, AI, academia, the private sector and cybersecurity, among others, to a one-day conference on using AI to combat child sexual abuse.

2024 Stella Polaris Summit - Summary.pdf
2024
Other publication
Generative AI - A New Threat for Online Child Sexual Exploitation and Abuse

Bracket Foundation and UNICRI Centre for AI and Robotics

This report explores the current forms of AI-generated CSAM in the form of images, video and text. Pulling together perspectives and data from law enforcement, tech companies, civil society, and caregivers, this report aims to provide a comprehensive overview of the escalating danger and suggest potential mitigation strategies.

2024_GenAI_New_Threat_for_Child_Abuse_Final.pdf
Prevention, Artificial intelligence, Generative AI, Child Sexual Abuse Material (CSAM)
2024
Report
cdn.website-editor.net
Out of the Shadows Index

The Economist Intelligence Unit

Statistics and baselines for 40 countries on child sexual abuse.

Out-the-shadows-index-2022.pdf
Prevention, Child Sexual Abuse Material (CSAM)
2022
Report
cdn.outoftheshadows.global
Out of the Shadows Index
Guarding the Guardians: Automated Analysis of Online Child Sexual Abuse

This paper presents an automated tool designed to analyze reports of child sexual abuse. By automating the analysis of abuse complaints, the tool significantly reduces the risk of exposure to harmful content, categorizing the reports along three dimensions: Subject, Degree of Criminality, and Damage.

2308.03880v2.pdf
Criminal investigation, Natural Language Processing
2023
Research (peer reviewed)
arxiv.org
Fine-Tuning Llama 2 Large Language Models for Detecting Online Sexual Predatory Chats and Abusive Texts

Cornell University

This paper proposes an approach to detection of online sexual predatory chats and abusive language using the open-source pretrained Llama 2 7B-parameter model, recently released by Meta GenAI. We fine-tune the LLM using datasets with different sizes, imbalance degrees, and languages (i.e., English, Roman Urdu and Urdu). Based on the power of LLMs, our approach is generic and automated without a manual search for a synergy between feature extraction and classifier design steps like conventional methods in this domain. Experimental results show a strong performance of the proposed approach, which performs proficiently and consistently across three distinct datasets with five sets of experiments. This study's outcomes indicate that the proposed method can be implemented in real-world applications (even with non-English languages) for flagging sexual predators, offensive or toxic content, hate speech, and discriminatory language in online discussions and comments to maintain respectful internet or digital communities.

2308.14683v1.pdf
Machine learning, Natural Language Processing
2023
Research (peer reviewed)
arxiv.org
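As a rough illustration of the recipe this paper describes (parameter-efficient fine-tuning of an open LLM for chat classification), the sketch below uses Hugging Face Transformers with a LoRA adapter. The model ID, dataset files, column names and hyperparameters are assumptions for illustration only, not the paper's actual setup, and access to the Llama 2 checkpoints is gated.

```python
# Hedged sketch, not the paper's implementation: LoRA fine-tuning of a Llama-2-style
# model for binary "predatory / not predatory" chat classification.
from datasets import load_dataset
from peft import LoraConfig, TaskType, get_peft_model
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_id = "meta-llama/Llama-2-7b-hf"  # assumed checkpoint; requires access approval
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)
model.config.pad_token_id = tokenizer.pad_token_id
# Attach a small LoRA adapter so only a fraction of the parameters are trained.
model = get_peft_model(model, LoraConfig(task_type=TaskType.SEQ_CLS, r=16, lora_alpha=32))

# Hypothetical CSV files with "text" and "label" columns (e.g. PAN12-style chat data).
dataset = load_dataset("csv", data_files={"train": "chats_train.csv", "test": "chats_test.csv"})
dataset = dataset.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                      batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama2-predator-clf",
                           per_device_train_batch_size=4,
                           num_train_epochs=1,
                           learning_rate=2e-4),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,
)
trainer.train()
```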
Developing machine learning-based models to help identify child abuse and neglect: key ethical challenges and recommended solutions

Columbia University

This article applied a phenomenological approach to discuss and provide recommendations for key ethical issues related to the development and evaluation of machine learning-based risk models: (1) biases in the data; (2) clinical documentation system design issues; (3) lack of a centralized evidence base for child abuse and neglect; (4) lack of a "gold standard" in assessment and diagnosis of child abuse and neglect; (5) challenges in evaluation of risk prediction performance; (6) challenges in testing predictive models in practice; and (7) challenges in presentation of machine learning-based prediction to clinicians and patients.

Ethics, Detection, Prevention
2022
Research (peer reviewed)
www.ncbi.nlm.nih.gov
Black and Latinx Primary caregiver considerations for developing and implementing a machine learning-based model for detecting child abuse and neglect with implications for racial bias reduction: A qualitative study

University of Pennsylvania, Columbia University

This study elicited Black and Latinx primary caregivers' viewpoints regarding child abuse and neglect while living in underserved communities to highlight considerations for designing an ML-based model for detecting child abuse and neglect in emergency departments (EDs) with implications for racial bias reduction and future interventions.

formative-2023-1-e40194.pdf
Prevention, Machine learning
2023
Report
formative.jmir.org
BS-SC Model: A Novel Method for Predicting Child Abuse Using Borderline-SMOTE Enabled Stacking Classifier

B. S. Abdur Rahman Crescent Institute of Science and Technology

For a long time, legal entities have developed and used crime prediction methodologies. The techniques are frequently updated based on crime evaluations and responses from scientific communities. There is a need to develop type-based crime prediction methodologies that can be used to address issues at the subgroup level. Child maltreatment is not adequately addressed because children are voiceless. As a result, the possibility of developing a model for predicting child abuse was investigated in this study. Various exploratory analysis methods were used to examine child abuse events in the city of Chicago. The data set was balanced using the Borderline-SMOTE technique, and a stacking classifier was then employed to ensemble multiple algorithms to predict various types of child abuse. The proposed approach predicted crime types with 93% accuracy, precision, recall, and F1-score; the corresponding AUC value was 0.989.

TSP_CSSE_34910.pdf
Prevention, Detection, Clustering/Classification
2023
Research (peer reviewed)
www.techscience.com
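The following is a minimal Python sketch of the general Borderline-SMOTE plus stacking-classifier pattern this paper describes, using imbalanced-learn and scikit-learn on synthetic placeholder data; the features, base learners and settings are illustrative assumptions, not the study's configuration.

```python
# Minimal sketch of the Borderline-SMOTE + stacking-classifier idea described above.
import numpy as np
from imblearn.over_sampling import BorderlineSMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import (GradientBoostingClassifier, RandomForestClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Placeholder for a labelled, imbalanced incident dataset (e.g. abuse-type labels).
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Rebalance only the training split with Borderline-SMOTE.
X_res, y_res = BorderlineSMOTE(random_state=0).fit_resample(X_train, y_train)

# Ensemble several base learners behind a logistic-regression meta-learner.
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("gb", GradientBoostingClassifier(random_state=0))],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_res, y_res)
print(classification_report(y_test, stack.predict(X_test)))
```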
Applications of artificial intelligence in predicting the risk of child abuse: A literature review

King Faisal Specialist Hospital and Research Centre Princess Nora bint Abdul Rahman University Mississippi State University

Child abuse is a major problem in most of the developing and developed countries. Medical practitioners and law enforcement authorities have often tried to tackle the problem using several conventional approaches. Nevertheless, there are other modern methods to screen, detect, and predict child abuse using artificial intelligence (AI). Therefore, this article aimed to critically review the currently available AI tools including data mining, computer-aided drawing systems, self-drawing tools, and neural networks used in child abuse screening.

applications_of_artificial_intelligence_in.1.pdf
Detection, Child-focused, Neural networks
2023
Research (peer reviewed)
journals.lww.com
Determining Child Sexual Abuse Posts based on Artificial Intelligence

Technological University Dublin

This paper proposes a CSAM detection intelligence algorithm based on natural language processing and machine learning techniques. The CSAM detection model is not only used to remove CSAM from online platforms, but can also help determine perpetrator behaviours, provide evidence, and extract new knowledge for hotlines, child agencies, education programmes and policy makers.

Determining Child Sexual Abuse Posts based on Artificial Intellig.pdf
Child Sexual Abuse Material (CSAM), Clustering/Classification
2023
Report
arrow.tudublin.ie
Identifying Online Child Sexual Texts in Dark Web through Machine Learning and Deep Learning Algorithms

Technological University Dublin

In this paper, we propose a novel model based on artificial intelligence algorithms to automatically detect CSA text messages in dark web forums. Our algorithms have achieved impressive results in detecting CSAM on the dark web, with a recall rate of 89%, a precision rate of 92.3% and an accuracy rate of 87.6%. Moreover, the algorithms can predict the classification of a post in just 1 microsecond and 0.3 milliseconds on standard laptop hardware. This makes it possible to integrate our model into social network sites or edge devices for real-time CSAM detection.

Identifying Online Child Sexual Texts in Dark Web through Machine.pdf
Child Sexual Abuse Material (CSAM), Machine learning, Deep Learning
2023
Report
arrow.tudublin.ie
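For orientation, the sketch below shows a generic text-classification baseline (TF-IDF features with a linear classifier) and a rough per-post latency check, in the same spirit as the detection and timing results reported above. The data file, column names and model choice are assumptions; this is not the authors' model.

```python
# Illustrative baseline: TF-IDF + logistic regression for flagging CSA-related posts,
# with a rough per-post inference-latency measurement on CPU.
import time
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_fscore_support
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("forum_posts_labelled.csv")          # hypothetical: columns "text", "label"
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], stratify=df["label"], random_state=0)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=3),
                    LogisticRegression(max_iter=1000, class_weight="balanced"))
clf.fit(X_train, y_train)

precision, recall, f1, _ = precision_recall_fscore_support(
    y_test, clf.predict(X_test), average="binary")
print(f"precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")

# Rough per-post inference latency.
n = min(1000, len(X_test))
start = time.perf_counter()
clf.predict(X_test.iloc[:n])
print(f"~{(time.perf_counter() - start) / n * 1000:.3f} ms per post")
```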
Discovering Child Sexual Abuse Material Creators' Behaviors and Preferences on the Dark Web

Technological University Dublin

The research team proposed a CSAM detection intelligence system. The system uses a manually labelled dataset to train, evaluate and select an efficient CSAM classification model. By identifying CSAM creators and victims through CSAM posts on the dark web, the researchers then analyze the material with a classifier, visualizing and uncovering information about the behaviors of CSAM creators and victims.

Discovering Child Sexual Abuse Material Creators_ Behaviors and P.pdf
Child Sexual Abuse Material (CSAM), Perpetrators, Clustering/Classification
2023
Research (peer reviewed)
arrow.tudublin.ie
Generative AI: New Attack Vector for Trust & Safety

Activefence

This report aims to help Trust & Safety teams better understand the challenges presented by the GenAI ecosystem, by providing: • A primer on GenAI functionality and its impact on Trust & Safety; • Examples of GenAI exploitation, resulting in the mass creation of dangerous content across a range of abuse areas; • A review and analysis of emerging trends in child predator activity to create GenAI-produced CSAM, and the dissemination of new exploitative methodologies that enable additional threat actors to engage in online harm at scale; • A regulatory overview examining platform accountability for GenAI-produced malicious content.

Generative AI The New Attack Vector for Trust and Safety.pdf
Artificial intelligence, Child Sexual Abuse Material (CSAM), Generative AI
2023
Report
www.activefence.com
Child predator abuse of generative AI

Activefence

This report provides an initial assessment of the threat landscape emanating from online child predator communities as they focus attention on the production of child sexual abuse material (CSAM) at scale. This report offers a sample of: • Trending behavioral developments within child predator communities as they assess GenAI; • A review of the generation of GenAI image-based CSAM and the methods employed by threat actors to produce it; • A brief on the generation of GenAI text-based CSAM and the methods employed to produce it; • AI models that are in circulation to allow specific types of CSAM to be produced; • The legal context of GenAI CSAM.

Child Predator Abuse of Generative AI-1.pdf
Generative AI, Child Sexual Abuse Material (CSAM), Artificial intelligence
2023
Report
www.activefence.com
How AI is being abused to create child sexual abuse imagery

Internet Watch Foundation

In 2023, the Internet Watch Foundation (IWF) investigated its first reports of child sexual abuse material (CSAM) generated by artificial intelligence (AI). In total, 20,254 AI-generated images were found to have been posted to one dark web CSAM forum in a one-month period. Of these, 11,108 images were selected for assessment by IWF analysts. The report details the growing issue of AI-generated CSAM and recommends actions to be taken by government, law enforcement and tech companies.

https://www.iwf.org.uk/media/q4zll2ya/iwf-ai-csam-report_public-oct23v1.pdf
Artificial intelligence, Perpetrators, Prevention, Child Sexual Abuse Material (CSAM), Generative AI
2023
Report
www.iwf.org.uk
Prediction of the development of depression and post-traumatic stress disorder in sexually abused children using a random forest classifier

Uskudar University Medical Faculty, Istanbul, Turkey

Depression and post-traumatic stress disorder (PTSD) are among the most common psychiatric disorders observed in children and adolescents exposed to sexual abuse. The present study aimed to investigate the effects of many factors such as the characteristics of a child, abuse, and the abuser, family type of the child, and the role of social support in the development of psychiatric disorders using machine learning techniques.

Clustering/Classification, Machine learning, Post-crime efforts, Child-focused, Post-traumatic stress (PTS)
2021
Research (peer reviewed)
www.sciencedirect.com
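The sketch below illustrates the general technique named in this entry, a random-forest classifier trained on tabular case features to predict a psychiatric outcome. The input file, column names and hyperparameters are placeholders, not the study's variables.

```python
# Minimal sketch: random forest on tabular case features predicting a PTSD/depression label.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("case_features.csv")                    # assumed tabular case data
X = pd.get_dummies(df.drop(columns=["ptsd_diagnosis"]))  # one-hot encode categorical features
y = df["ptsd_diagnosis"]

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)
model = RandomForestClassifier(n_estimators=500, class_weight="balanced", random_state=42)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

# Feature importances hint at which case characteristics drive the prediction.
print(pd.Series(model.feature_importances_, index=X.columns)
        .sort_values(ascending=False).head(10))
```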
Hand-Based Person Identification Using Global and Part-Aware Deep Feature Representation Learning

Lancaster University

The aim of this research is to identify biometric traits in dorsal hand images, which are the most commonly documented aspect of perpetrators in child sexual abuse imagery. In this work, the researchers propose hand-based person identification by learning both global and local deep feature representations. Using a Global and Part-Aware Network (GPA-Net), the researchers created global and local branches on the conv-layer for learning robust, discriminative global and part-level features. Similar research has been conducted at Auckland University of Technology.

https://wp.lancs.ac.uk/h-unique/files/2022/11/Baisa22Hand.pdf
Artificial intelligence, prosecution, Child Sexual Abuse Material (CSAM), Machine learning
2022
Research (peer reviewed)
wp.lancs.ac.uk
Online Grooming Detection on Social Media Platforms

Norwegian University of Science and Technology

The aim of this research is to provide techniques that increase children’s security on online chat platforms. The research project divides the online grooming detection problem into several subproblems, including author profiling, predatory conversation detection, predatory identification, and data limitations issues. The present article presents a literature review of available data sets and grooming detection techniques.

Detection, Clustering/Classification
2022
Research (peer reviewed)
www.sciencedirect.com
Child Sexual Abuse on the Internet: Report on the Analysis of Technological Factors that Affect the Creation and Sharing of Child Sexual Abuse Material on the Internet.

BI Norwegian Business School, Norwegian University of Science and Technology

This literature review details how the development of digital technology and services affects both the possibilities for accessing and sharing child sexual abuse material (CSAM) and the possibilities for perpetrators to establish contact with children. The report underlines the need for collaboration between the private sector, civil society and law enforcement as a way of making bespoke technology available and sharing relevant information and intelligence.

Internet, Policy, Detection, prosecution, Post-crime efforts, Criminal investigation
2023
Report
biopen.bi.no
…det var en känsla av att han befann sig i rummet bredvid

World Childhood Foundation

The focus of the present report is to provide an insight into grooming and sexual extortion of children in Sweden. The report presents judgments from court cases decided in 2020 and 2021 concerning children who have been subjected to internet-related sexual offenses. Judgments are public documents produced by the court and contain, for example, evidence and the court's reasoning in a specific case.

Childhood Stella Polaris Rapport 2022 Internetrelaterade sexuella övergrepp mot barn.pdf
Sweden-focused, prosecution, Internet
2022
Report
Child Abuse and Domestic Abuse: Content and Feature Analysis from Social Media Disclosures

Victoria University and University of Melbourne

As the increasing volume of abuse-related posts shared on social media is of interest to the public health sector and family welfare organisations monitoring public health, this study aims to identify such posts and differentiate between child abuse and domestic abuse. Researchers first analysed psycholinguistic, textual and somatic features in social media posts disclosing child abuse and domestic abuse in order to find out what characterises such posts, and then deployed machine learning classifiers to examine the extracted features' predictive power. The abuse-related posts had higher proportions of features such as anxiety, anger, sadness, sexual health, and death, and carried a lot of negative emotion.

Artificial intelligence, Machine learning, Prevention, Child Sexual Abuse Material (CSAM)
2018
Research (peer reviewed)
link.springer.com
Therabot: An Adaptive Therapeutic Support Robot

Institute of Electrical and Electronics Engineers (IEEE) and Mississippi State University

Therabot is a socially assistive robot designed to provide therapeutic support at home and in counselling settings. As it is specifically designed for those living with post-traumatic stress disorder (PTSD) it is suitable for children who have been subjected to sexual abuse. The robot resembles an animal and was developed through an iterative design process; both therapists and trauma survivors were consulted. Through touch sensing Therabot can deduce the user’s stress level and provide support accordingly. The researchers plan on developing the robot further by integrating AI in order to allow the robot to adapt and customise its interactions to the preferences of each user.

Post-crime efforts, Child-focused, Robotics, Post-traumatic stress (PTS)
2018
Research (peer reviewed)
ieeexplore.ieee.org
Predicting Prolific Live Streaming of Child Sexual Abuse

Australian Institute of Criminology

Report by the Australian Institute of Criminology that analyses child sexual abuse (CSA) and financial transactions through machine learning in order to identify characteristics of offenders who live stream CSA in high volumes. The analysis showed that factors such as frequency and monetary value are important and have implications for identifying these crimes among financial transaction data. Furthermore, offenders did not appear to have engaged in violent offending, but rather had a criminal history of low-harm offences.

Financial transactions, Artificial intelligence, Supervised learning, Machine learning, Criminal investigation, Child Sexual Abuse Material (CSAM)
2021
Research (peer reviewed)
search.informit.org
An Integrative Review of Historical Technology and Countermeasure Usage Trends in Online Child Sexual Exploitation Material Offender

University of Edinburgh and George Mason University

A systematic literature review of the technologies that offenders make use of to access online child sexual exploitation material (CSEM). The literature review shows that offenders tend not to be ‘early adopters’ of new technologies, but rather continue to use trusted technologies even after higher-functioning options are introduced. In addition to technologies used to access CSEM, offenders also employ countermeasures to avoid detection, for example encryption and anonymous browsers such as Tor that protect the user’s identity and physical location. The researchers found that only a few offenders encrypt manually. With encryption built into technologies and the ability to use the Tor browser to visit traditional (non-dark) websites, much of the prior research into countermeasures is dated and may not be indicative of current behaviours.

Child Sexual Abuse Material (CSAM), Perpetrators, Criminal investigation, Prevention
2020
Research (peer reviewed)
www.sciencedirect.com
How Facial Recognition Is Helping Fight Child Sexual Abuse

Griffeye

The article explores how facial recognition systems using machine learning can flag material depicting victims or criminals known by law enforcement. The system can also filter and group images that belong to the same case, which makes police officers’ work of going through child sexual abuse material (CSAM) more efficient as they do not need to jump in blindly without knowledge of what could be found or if there are any linking factors. Facial recognition systems have improved significantly in the past few years, especially when applied in uncontrolled circumstances, for example when a person’s face is seen from the side or in motion. Moreover, the systems have also become better at identifying and matching faces of children at different ages, which was almost impossible for the technology a few years ago. Today, systems designed specifically for CSAM exist and their impact has been transformative for the police forces embracing them.

Child Sexual Abuse Material (CSAM), Criminal investigation, Machine learning, Artificial intelligence
2020
Other publication
www.sciencedirect.com
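As a hedged illustration of the underlying technique (face embeddings compared against a distance threshold to group images that likely show the same person), the sketch below uses the open-source face_recognition library as a stand-in. It is not Griffeye's implementation; the image folder, threshold and grouping logic are assumptions.

```python
# Sketch: group images by comparing face embeddings with a distance threshold.
import glob
import face_recognition
import numpy as np

THRESHOLD = 0.6  # typical default distance threshold for this library (assumption)
groups = []      # each group: {"encoding": reference vector, "files": [...]}

for path in glob.glob("case_images/*.jpg"):             # hypothetical image folder
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        continue                                        # no face found in this image
    enc = encodings[0]
    # Assign the image to the first existing group whose reference face is close enough.
    for group in groups:
        if np.linalg.norm(group["encoding"] - enc) < THRESHOLD:
            group["files"].append(path)
            break
    else:
        groups.append({"encoding": enc, "files": [path]})

for i, group in enumerate(groups):
    print(f"identity cluster {i}: {len(group['files'])} images")
```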
Sperm hunting on optical microscope slides for forensic analysis with deep convolutional networks – a feasibility study

Zurich Institute of Forensic Medicine

Finding sperm cells under an optical microscope is a task which is time-consuming and difficult for a human being. This study shows how convolutional neural networks can be used to speed up the process. Two networks were tested based on the VGG19 architecture with a resulting accuracy of over 90%. Human oversight is still necessary to rule out false positives. The oversight is aided by a simple visual guide that can be provided to the overseeing experts which helps determine the accuracy of any given result.

PIIS1872497321001393.pdf
Perpetrators
2022
Research (peer reviewed)
www.fsigenetics.com
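The following sketch outlines the kind of VGG19-based transfer-learning setup the study describes, with a small binary head for classifying microscope patches. The directory layout, image size and training settings are illustrative assumptions rather than the study's exact configuration.

```python
# Sketch: frozen VGG19 backbone + binary head for "sperm cell / no sperm cell" patches.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG19

IMG_SIZE = (224, 224)

train_ds = tf.keras.utils.image_dataset_from_directory(
    "microscope_patches/train", image_size=IMG_SIZE, batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "microscope_patches/val", image_size=IMG_SIZE, batch_size=32, label_mode="binary")

base = VGG19(weights="imagenet", include_top=False, input_shape=IMG_SIZE + (3,))
base.trainable = False                                   # freeze pretrained features first

model = models.Sequential([
    layers.Rescaling(1.0 / 255),
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(1, activation="sigmoid"),               # probability the patch contains sperm cells
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)
```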
A Practitioner Survey Exploring the Value of Forensic Tools, AI, Filtering, & Safer Presentation for Investigating Child Sexual Abuse Material (CSAM)

University of New Haven / Digital Forensic Research Workshop

This survey investigates what value those investigating CSAM ascribe to the different tools and technologies they use in their work. Effective tools are crucial not only for detection but also for reducing the potential harm of being exposed to such material over long periods of time. The survey found that filtering technologies are more important than safe viewing technologies and that false positives are a bigger problem than false negatives. As far as resources are concerned there is still a lack of personnel, time, and money in the field. Furthermore, it was found that practitioners are still not up-to-date on data science and AI; something which should be improved in order to deal with the large amount of data that they face. The biggest need practitioners have which AI can help with is tools that automatically detect child nudity, age, and skin tones.

A Practitioner Survey Exploring the Value of Forensic Tools, AI, Filtering, & Safer Presentation for Investigating Child Sexual Abuse Material (CSAM).pdf
Artificial intelligence, Prevention, Neural networks
2019
Research (peer reviewed)
www.sciencedirect.com
Can Artificial Intelligence Achieve Human-Level Performance? A Pilot Study of Childhood Sexual Abuse Detection in Self-Figure Drawings

University of Haifa

Delayed disclosure of childhood sexual abuse can range from one year to disclosure in adulthood, to no disclosure at all. Against this background, the ‘Draw-A-Person’ intervention has been developed by psychologists in order to detect indicators of sexual abuse in children’s self-portraits. In the present study, a convolutional neural network (CNN) was deployed to detect such indicators through image analysis. While human experts outperformed the CNN, the system still demonstrated high accuracy, suggesting that CNNs, when further developed, have potential to detect child sexual abuse.

Artificial intelligence, Perpetrators, Criminal investigation, Supervised learning, Neural networks, Machine learning, Child-focused, Child Sexual Abuse Material (CSAM)
2020
Research (peer reviewed)
www.sciencedirect.com
Developing automated methods to detect and match face and voice biometrics in child sexual abuse videos

San Jose State University

The report examines how biometric (facial) search combined with voice can be used to identify abuse survivors and perpetrators, by searching for matches in other child sexual abuse material videos.

Child Sexual Abuse Material (CSAM), Perpetrators
2022
Research (peer reviewed)
search.informit.org
Developing automated methods to detect and match face and voice biometrics in child sexual abuse videos
Automated Biometric Identification using Dorsal Hand Images and Convolutional Neural Networks

Auckland University of Technology

The researchers are investigating how well AI can be applied to identify people based solely on the backs of their hands. They achieve accuracies of over 99.9%, indicating that such tools can be used to identify perpetrators in child sexual abuse material. Similar projects have been conducted at Lancaster University.

Automated Biometric Identification using Dorsal Hand Images and Convolutional Neural Networks.pdf
Neural networks, Perpetrators
2021
Research (peer reviewed)
iopscience.iop.org
Automated Biometric Identification using Dorsal Hand Images and Convolutional Neural Networks
Child Sexual Abuse Material Online: The Perspective of Online Investigators on Training and Support

Griffith University

A study of what police officers believe they need in order to counter child sexual abuse material more effectively. The main training and support that respondents felt organisations should provide included communication, training on different types of perpetrators, and psychological support.

The Perspective of Online Investigators on Training and Support.pdf
Child Sexual Abuse Material (CSAM), prosecution, Perpetrators
2022
Research (peer reviewed)
Child Sexual Abuse Material Online: The Perspective of Online Investigators on Training and Support
Multimodal Virtual Avatars for Investigative Interviews with Children

Oslo Metropolitan University

The report describes how virtual avatars of children can be used to train police officers to interview children about sexual abuse. Furthermore, the interviews can be used to create synthetic interview data.

Criminal investigation, prosecution
2021
Research (peer reviewed)
dl.acm.org
Multimodal Virtual Avatars for Investigative Interviews with Children
Early Detection of Sexual Predators in Chats

Humboldt-Universität zu Berlin

A report on how deep learning transformer models can classify grooming attempts. The authors of the report created a dataset that was then used by Viktor Bowallius and David Eklund in the report Grooming detection of chat segments using transformer models, where an F1 score of 0.98 was achieved.

Early_detection_of_sexual_predators_in_chat.pdf
Natural Language Processing, Clustering/Classification
2021
Research (peer reviewed)
aclanthology.org
Early Detection of Sexual Predators in Chats
Estimation of the Development of Depression and PTSD in Children Exposed to Sexual Abuse and Development of Decision Support Systems by Using Artificial Intelligence

Inonu University

The report describes an experiment in which algorithms were able to predict whether a child had PTSD or major depression, with 99.2% accuracy, based solely on data about the person and the assault. This meant that no additional information was required beyond what is usually collected in child sexual abuse cases. The results indicate that it is possible to use AI early after an abuse has occurred to predict whether PTSD or major depression is likely to develop.

Child-focused, Post-traumatic stress (PTS), Post-crime efforts
2020
Research (peer reviewed)
www.tandfonline.com
Estimation of the Development of Depression and PTSD in Children Exposed to Sexual Abuse and Development of Decision Support Systems by Using Artificial Intelligence
Child Online Safety Toolkit

5Rights Foundation

A compilation of guidance on how policies for children's online safety can be developed.

Prevention, Neural networks
2022
Report
childonlinesafetytoolkit.org
Child Online Safety Toolkit
When “Sweetie” is not so Sweet: Artificial Intelligence and its Implications for Child Pornography

The report describes how computer-generated child abuse material is becoming increasingly common, including through deep fake technology. The author argues that the US law 18 U.S. Code § 2256 should be revised to prohibit computer-generated abuse material as well.

Child Sexual Abuse Material (CSAM), prosecution
2021
Report
onlinelibrary.wiley.com
When “Sweetie” is not so Sweet: Artificial Intelligence and its Implications for Child Pornography
Keeping Children Safe Online With Limited Resources: Analyzing What is Seen and Heard

Singidunum University

The article looks at how AI can analyse activity on mobile screens and audio ports to detect bullying, pornography and sexual harassment. Unlike previous experiments, this AI sees all activity as the user sees it, rather than only processing text or images retrieved from the screen. The model achieves an average accuracy of 88% when classifying texts, for example detecting sexism and racism, and 95% accuracy in detecting pornography.

Keeping_Children_Safe_Online_With_Limited_Resources_Analyzing_What_is_Seen_and_Heard.pdf
Prevention, Clustering/Classification, Neural networks
2021
Research (peer reviewed)
Keeping Children Safe Online With Limited Resources: Analyzing What is Seen and Heard
Suspicious activity detection using deep learning in secure assisted living IoT environments

Nalla Malla Engineering College, Galgotias University, Vellore Institute of Technology

The report evaluates how well an AI can detect child sexual abuse via surveillance cameras.

Child Sexual Abuse Material (CSAM), Neural networks
2021
Research (peer reviewed)
link.springer.com
Suspicious activity detection using deep learning in secure assisted living IoT environments
Child Abuse Risk Prediction and Prevention Framework using AI and Dark Web

Adhiyamaan College of Engineering

A report on a downloadable AI that analyses browser history to assess the risk that a child may be sexually exploited. The report does not provide concrete figures on how well the tool works.

2648.pdf
Child Sexual Abuse Material (CSAM), Prevention, Neural networks
2022
Research (peer reviewed)
Child Abuse Risk Prediction and Prevention Framework using AI and Dark Web
Applying Artificial Intelligence for Age Estimation in Digital Forensic Investigations

University of Warwick

AI is tested for classifying the age of children. It worked reasonably well for ages 10-20, with a mean error of about 2.5 years, but less well for children aged 0-10.

2201.03045.pdf
Artificial intelligence, Neural networks
2022
Research (peer reviewed)
Applying Artificial Intelligence for Age Estimation in Digital Forensic Investigations
Predictive Risk Modelling to Prevent Child Maltreatment and Other Adverse Outcomes for Service Users

University of Queensland

Analysis of how well AI can predict the risk of child maltreatment (Predictive Risk Modelling, PRM), based on such a tool used in New Zealand.

Prevention, Neural networks
2015
Research (peer reviewed)
academic.oup.com
Predictive Risk Modelling to Prevent Child Maltreatment and Other Adverse Outcomes for Service Users