Purpose
Child abuse material removal and blocking
Technology
Image and video detection/classification
Crime Phase
Prevention
Description
The Content Safety API sorts through large volumes of images and prioritizes the most likely child sexual abuse material (CSAM) for human review. The classifier can target content that has not previously been confirmed as CSAM, i.e., new material that would not be caught by hash-matching against databases of known images.
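The following minimal Python sketch illustrates the triage pattern described above: score each image with a classifier and surface the highest-scoring items to moderators first. The endpoint URL, API key, request payload, and priority_score response field are all hypothetical placeholders, not the actual Content Safety API interface, which Google provides to qualifying organizations on application.

```python
# Hypothetical sketch of classifier-based triage. The endpoint, key,
# payload shape, and "priority_score" field are assumptions for
# illustration only; they are NOT the real Content Safety API.
import base64
from typing import List, Tuple

import requests

API_ENDPOINT = "https://example.com/v1/classifyImage"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"  # placeholder credential


def score_image(image_path: str) -> float:
    """Send one image to the (hypothetical) classifier and return its score."""
    with open(image_path, "rb") as f:
        payload = {"image": base64.b64encode(f.read()).decode("ascii")}
    resp = requests.post(API_ENDPOINT, params={"key": API_KEY},
                         json=payload, timeout=30)
    resp.raise_for_status()
    # Assumed response shape: {"priority_score": <float in [0, 1]>}
    return resp.json()["priority_score"]


def triage(image_paths: List[str]) -> List[Tuple[str, float]]:
    """Sort images highest-score first so moderators review the most
    likely CSAM candidates before anything else in the queue."""
    scored = [(path, score_image(path)) for path in image_paths]
    return sorted(scored, key=lambda item: item[1], reverse=True)


if __name__ == "__main__":
    for path, score in triage(["upload1.jpg", "upload2.jpg", "upload3.jpg"]):
        print(f"{score:.3f}  {path}")
```

The key design point is that the classifier does not make removal decisions itself; it orders the review queue so that limited human moderation capacity is spent on the highest-risk content first.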
Organization
Google
Target group/intended user
Content moderators
Status
In use
Website
https://blog.google/around-the-globe/google-europe/using-ai-help-organizations-detect-and-report-child-sexual-abuse-material-online/
Country of Origin
USA, Global, Switzerland
Date tool added to database
Sep 10, 2021 8:36 AM
Other tags
Visual AI
