Content Safety API

Description

The Content Safety API sorts through large volumes of images and prioritizes the most likely child sexual abuse material (CSAM) content for human review. The classifier can target content that has not been previously confirmed as CSAM.
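
The sketch below illustrates the prioritization workflow described above: each image is scored by a classifier and the review queue is ordered from highest to lowest priority. It is a minimal illustration only; Google has not published the API's endpoint, request fields, or response shape here, so every name in the example (URL, "priority_score", parameters) is an assumption, not the actual interface.

```python
# Illustrative sketch only: the endpoint, field names, and response shape
# below are assumptions used to show the prioritization workflow, not the
# real Content Safety API interface.
import requests

API_URL = "https://example.googleapis.com/contentsafety/v1:classify"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"  # placeholder credential


def prioritize_for_review(image_paths):
    """Score each image with the (hypothetical) classifier endpoint and
    return the paths ordered from highest to lowest priority, so that
    moderators see the most likely CSAM content first."""
    scored = []
    for path in image_paths:
        with open(path, "rb") as f:
            resp = requests.post(
                API_URL,
                params={"key": API_KEY},
                files={"image": f},
                timeout=30,
            )
        resp.raise_for_status()
        # Assumed response field: a priority score between 0 and 1.
        scored.append((resp.json().get("priority_score", 0.0), path))
    # Highest-priority content goes to the front of the review queue.
    scored.sort(reverse=True)
    return [path for _, path in scored]


if __name__ == "__main__":
    queue = prioritize_for_review(["img_001.jpg", "img_002.jpg"])
    print(queue)
```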

Organization

Google

Target group/intended user
Content moderators
Status
In use
Date tool added to database
Sep 10, 2021 8:36 AM
Other tags
Visual AI