Content Safety API

Purpose
πŸ–ΌοΈChild abuse material removal and blocking
Technology
πŸ–ΌοΈImage and video detection/classification
Crime Phase
πŸ”Prevention
Description

The Content Safety API triages large volumes of images and prioritizes the content most likely to be child sexual abuse material (CSAM) for human review. The classifier can target content that has not been previously confirmed as CSAM.
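The triage pattern described above can be sketched as a priority queue over classifier scores. This is an illustrative sketch only, not Google's actual API: the function names, score fields, and data shapes are all assumptions.

```python
# Hypothetical sketch of score-based triage: images are scored by a
# classifier, and the highest-scoring ones surface first for human review.
# All identifiers here are illustrative, not part of the real Content Safety API.
from dataclasses import dataclass, field
import heapq


@dataclass(order=True)
class ReviewItem:
    priority: float                      # negated score, so a min-heap pops highest scores first
    image_id: str = field(compare=False) # excluded from ordering comparisons


def triage(scored_images, review_capacity):
    """Return up to `review_capacity` image IDs, ordered from the
    highest to the lowest classifier score."""
    heap = [ReviewItem(-score, image_id) for image_id, score in scored_images]
    heapq.heapify(heap)
    return [heapq.heappop(heap).image_id
            for _ in range(min(review_capacity, len(heap)))]


# Example: moderators review the two highest-scoring images first.
queue = triage([("img_a", 0.12), ("img_b", 0.97), ("img_c", 0.55)], 2)
# queue == ["img_b", "img_c"]
```

The point of the pattern is that reviewer time, not classification, is the scarce resource: scoring everything but reviewing only the top of the queue lets a small moderation team focus on the content most likely to require action.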

Organization

Google

Target group/intended user
Content moderators
Status
In use
Website
https://blog.google/around-the-globe/google-europe/using-ai-help-organizations-detect-and-report-child-sexual-abuse-material-online/
Country of Origin
🇺🇸 USA, 🌍 Global, 🇨🇭 Switzerland
Date tool added to database
Sep 10, 2021 8:36 AM
Other tags
Visual AI