| Organization | Description | Tags | Year | Publication Type |
|---|---|---|---|---|
| Australian Institute of Criminology | Report by the Australian Institute of Criminology that uses machine learning to analyse financial transactions in order to identify characteristics of offenders who live-stream child sexual abuse (CSA) in high volumes. The analysis showed that factors such as transaction frequency and monetary value are important and have implications for identifying these crimes within financial transaction data. Furthermore, the offenders did not appear to have engaged in violent offending, but rather had criminal histories of low-harm offences. | Financial transactions, Artificial intelligence, Supervised learning, Machine learning, Criminal investigation, Child Sexual Abuse Material (CSAM) | 2021 | Research (peer reviewed) |
| Humboldt-Universität zu Berlin | A report on how deep learning transformer models can classify grooming attempts. The authors created a dataset that was later used by Viktor Bowallius and David Eklund in the report Grooming detection of chat segments using transformer models, where an F1 score of 0.98 was achieved (a minimal classification sketch follows the table). | Natural Language Processing, Clustering/Classification | 2021 | Research (peer reviewed) |
| Singidunum University | The article examines how AI can analyse activity on the screens and audio ports of mobile devices to detect bullying, pornography and sexual harassment. Unlike previous experiments, this AI observes all activity as the user sees it, rather than only processing text or images extracted from the screen. The model achieves an average accuracy of 88% when classifying text, for example identifying sexism and racism, and 95% accuracy when detecting pornography. | Prevention, Clustering/Classification, Neural networks | 2021 | Research (peer reviewed) |
| | The report evaluates how well an AI can detect child sexual abuse via surveillance cameras. | Child Sexual Abuse Material (CSAM), Neural networks | 2021 | Research (peer reviewed) |
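
The Humboldt-Universität zu Berlin entry above describes classifying grooming attempts in chat segments with transformer models. The sketch below is a minimal, hedged illustration of that general approach using the Hugging Face `transformers` library; the checkpoint name, the two-label scheme and the `classify_segment` helper are illustrative assumptions, not the models, data or thresholds used in the cited reports, and the classification head would need fine-tuning on a labelled dataset before its scores are meaningful.

```python
# Minimal sketch: scoring a chat segment with a transformer sequence classifier.
# The checkpoint and labels are placeholders, not those used in the cited reports.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "distilbert-base-uncased"  # assumed base checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=2: index 0 = "benign", index 1 = "grooming attempt" (assumed label scheme).
# The classification head is freshly initialised here and would need fine-tuning
# on a labelled grooming dataset before the probabilities carry any meaning.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()


def classify_segment(segment: str) -> float:
    """Return the model's probability that a chat segment is a grooming attempt."""
    inputs = tokenizer(segment, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()


# Example call on a synthetic chat segment.
print(classify_segment("hey, how old are you? do your parents check your phone?"))
```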