Content Safety API

Purpose
Child abuse material removal and blocking
Technology
Image and video detection/classification
Crime Phase
Prevention
Description

The Content Safety API triages large volumes of images and prioritizes the most likely child sexual abuse material (CSAM) for human review. The classifier can also flag content that has not previously been confirmed as CSAM.
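The actual Content Safety API is access-restricted and its interface is not documented here, but the triage pattern the description outlines (score each item with a classifier, then surface the highest-risk items first for human review) can be sketched generically. Everything below — the `ReviewItem` type, the `prioritize` helper, and the threshold value — is hypothetical illustration, not Google's API:

```python
from dataclasses import dataclass

@dataclass
class ReviewItem:
    item_id: str
    score: float  # hypothetical classifier confidence, 0.0 to 1.0

def prioritize(items: list[ReviewItem], threshold: float = 0.5) -> list[ReviewItem]:
    """Keep items at or above the threshold, highest score first,
    so moderators see the most likely matches at the top of the queue."""
    flagged = [item for item in items if item.score >= threshold]
    return sorted(flagged, key=lambda item: item.score, reverse=True)

# Example: three items scored by a classifier; only two cross the threshold.
queue = prioritize([
    ReviewItem("img-001", 0.91),
    ReviewItem("img-002", 0.12),
    ReviewItem("img-003", 0.67),
])
print([item.item_id for item in queue])  # → ['img-001', 'img-003']
```

The point of the pattern is reviewer efficiency: the classifier does not make removal decisions itself, it orders the human review queue.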

Organization

Google

Target group/intended user
Content moderators
Status
In use
Website
https://blog.google/around-the-globe/google-europe/using-ai-help-organizations-detect-and-report-child-sexual-abuse-material-online/
Country of Origin
🇺🇸 USA, 🌍 Global, 🇨🇭 Switzerland
Date tool added to database
Sep 10, 2021 8:36 AM
Other tags
Visual AI