
Machine Learning

All reports

Name · Organization · Description · File · Tags · Year · Type · Link · Status · Self reference

Technological University Dublin

In this paper, we propose a novel model based on artificial intelligence algorithms to automatically detect CSA text messages in dark web forums. Our algorithms achieved strong results in detecting CSAM on the dark web, with a recall of 89%, a precision of 92.3% and an accuracy of 87.6%. Moreover, the algorithms can classify a post in between 1 microsecond and 0.3 milliseconds on a standard laptop. This makes it possible to integrate our model into social network sites or edge devices for real-time CSAM detection.
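The recall, precision and accuracy figures quoted above are standard confusion-matrix metrics; a minimal sketch of how they are computed (function name ours, not the paper's code):

```python
def classification_metrics(tp, fp, fn, tn):
    """Recall, precision and accuracy from confusion-matrix counts."""
    recall = tp / (tp + fn)                      # share of positive posts found
    precision = tp / (tp + fp)                   # share of flagged posts that are positive
    accuracy = (tp + tn) / (tp + fp + fn + tn)   # share of all posts classified correctly
    return recall, precision, accuracy
```

For example, `classification_metrics(8, 2, 2, 8)` yields a recall, precision and accuracy of 0.8 each.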

Child Sexual Abuse Material (CSAM) · Machine learning · Deep Learning
2023
Report

Uskudar University Medical Faculty, Istanbul, Turkey

Depression and post-traumatic stress disorder (PTSD) are among the most common psychiatric disorders observed in children and adolescents exposed to sexual abuse. The present study aimed to investigate, using machine learning techniques, the effects of factors such as the characteristics of the child, the abuse, and the abuser, the child's family type, and the role of social support in the development of psychiatric disorders.

Clustering/Classification · Machine learning · Post-crime efforts · Child-focused · Post-traumatic stress (PTS)
2021
Research (peer reviewed)

Lancaster University

The aim of this research is to identify biometric traits in dorsal hand images, which are the most commonly documented aspect of perpetrators in child sexual abuse imagery. The researchers propose hand-based person identification that learns both global and local deep feature representations. Using a Global and Part-Aware Network (GPA-Net), they built global and local branches on top of the convolutional layers to learn robust, discriminative global and part-level features. Similar research has been conducted at Auckland University.
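The global-plus-part idea described above can be illustrated with a minimal numpy sketch: pool the whole feature map into one global descriptor, pool horizontal stripes into part-level descriptors, and concatenate. This is an assumption-laden simplification of GPA-Net, not the study's code.

```python
import numpy as np

def global_and_part_features(feature_map, n_parts=3):
    """Concatenate a global descriptor with part-level descriptors.

    feature_map: (C, H, W) array, e.g. the output of a conv backbone.
    """
    global_feat = feature_map.mean(axis=(1, 2))           # (C,) global average pool
    parts = np.array_split(feature_map, n_parts, axis=1)  # horizontal stripes along H
    part_feats = [p.mean(axis=(1, 2)) for p in parts]     # n_parts descriptors of (C,)
    return np.concatenate([global_feat] + part_feats)     # (C * (1 + n_parts),)
```

In the real network each branch would carry its own learned layers and loss; here the pooling alone shows how global and part-level evidence are combined into one identification feature vector.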

Artificial intelligence · Prosecution · Child Sexual Abuse Material (CSAM) · Machine learning
2022
Research (peer reviewed)

Victoria University and University of Melbourne

As the growing volume of abuse-related posts shared on social media is of interest to the public health sector and to family welfare organisations monitoring public health, this study aims to identify such posts and to differentiate between child abuse and domestic abuse. The researchers first analysed psycholinguistic, textual and somatic features in social media posts disclosing child abuse and domestic abuse in order to find out what characterises such posts, and then deployed machine learning classifiers to examine the extracted features’ predictive power. The abuse-related posts showed higher proportions of features such as anxiety, anger, sadness, sexual health, and death, and carried strongly negative emotion.
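Psycholinguistic features of the kind described above are typically computed as the proportion of a post's tokens that fall into lexical categories such as anger or sadness; a minimal sketch, with hypothetical lexicons (the study's actual feature set is not reproduced here):

```python
def lexicon_proportions(post, lexicons):
    """Proportion of tokens in `post` belonging to each lexical category."""
    tokens = post.lower().split()
    total = len(tokens) or 1  # guard against empty posts
    return {name: sum(tok in words for tok in tokens) / total
            for name, words in lexicons.items()}
```

For example, with `lexicons = {"anger": {"angry"}, "sadness": {"sad"}}`, a six-token post containing "angry" and "sad" once each scores 1/6 on both categories; these proportions would then feed a downstream classifier.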

Artificial intelligence · Machine learning · Prevention · Child Sexual Abuse Material (CSAM)
2018
Research (peer reviewed)

Australian Institute of Criminology

Report by the Australian Institute of Criminology that analyses child sexual abuse (CSA) and financial transactions through machine learning in order to identify characteristics of offenders who live stream CSA in high volumes. The analysis showed that factors such as frequency and monetary value are important and have implications for identifying these crimes among financial transaction data. Furthermore, offenders did not appear to have engaged in violent offending, but rather had a criminal history of low-harm offences.
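The frequency and monetary-value factors the report highlights correspond to simple per-account aggregates over transaction records; a minimal sketch of such feature extraction (data layout and names are ours, not the report's):

```python
def transaction_features(transactions):
    """Per-account frequency and monetary-value features.

    transactions: iterable of (account_id, amount) pairs.
    """
    feats = {}
    for account, amount in transactions:
        f = feats.setdefault(account, {"count": 0, "total": 0.0})
        f["count"] += 1        # transaction frequency
        f["total"] += amount   # total monetary value
    for f in feats.values():
        f["mean"] = f["total"] / f["count"]
    return feats
```

Features like these would then be fed to a supervised classifier to flag high-volume live-streaming offenders among ordinary transaction data.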

Financial transactions · Artificial intelligence · Supervised learning · Machine learning · Criminal investigation · Child Sexual Abuse Material (CSAM)
2021
Research (peer reviewed)

Griffeye

The article explores how facial recognition systems using machine learning can flag material depicting victims or criminals known by law enforcement. The system can also filter and group images that belong to the same case, which makes police officers’ work of going through child sexual abuse material (CSAM) more efficient as they do not need to jump in blindly without knowledge of what could be found or if there are any linking factors. Facial recognition systems have improved significantly in the past few years, especially when applied in uncontrolled circumstances, for example when a person’s face is seen from the side or in motion. Moreover, the systems have also become better at identifying and matching faces of children at different ages, which was almost impossible for the technology a few years ago. Today, systems designed specifically for CSAM exist and their impact has been transformative for the police forces embracing them.

Child Sexual Abuse Material (CSAM) · Criminal investigation · Machine learning · Artificial intelligence
2020
Other publication

University of Haifa

Delayed disclosure of childhood sexual abuse can range from one year, to disclosure in adulthood, to no disclosure at all. Against this background, psychologists developed the ‘Draw-A-Person’ intervention to detect indicators of sexual abuse in children’s self-portraits. In the present study, a convolutional neural network (CNN) was deployed to detect such indicators through image analysis. While human experts outperformed the CNN, the system still demonstrated high accuracy, suggesting that CNNs, with further development, have the potential to help detect child sexual abuse.
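The core operation a CNN applies to a drawing is a learned 2-D convolution that slides a small kernel over the image; a minimal numpy sketch of that single operation (not the study's model, which stacks many such layers with learned kernels):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D cross-correlation, the building block of a CNN layer."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # weighted sum of the kernel-sized window at (i, j)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```

Stacking such convolutions with nonlinearities and learned kernels lets the network respond to local strokes and shapes in a self-portrait, which is how indicator detection through image analysis is realised.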

Artificial intelligence · Perpetrators · Criminal investigation · Supervised learning · Neural networks · Machine learning · Child-focused · Child Sexual Abuse Material (CSAM)
2020
Research (peer reviewed)