This survey investigates the value that practitioners who investigate child sexual abuse material (CSAM) ascribe to the tools and technologies they use in their work. Effective tools are crucial not only for detection but also for reducing the potential harm of prolonged exposure to such material. The survey found that practitioners consider filtering technologies more important than safe-viewing technologies, and that false positives are a bigger problem than false negatives. In terms of resources, the field still lacks personnel, time, and money. The survey also found that practitioners are not up to date on data science and AI, a gap that should be closed so they can cope with the large volumes of data they face. The practitioners' greatest need that AI can address is tools that automatically detect child nudity and estimate age and skin tone.