Vigil AI


Vigil AI automatically detects, evaluates, and categorizes child sexual abuse imagery. The system can determine the severity of the sexual act depicted in an image (using either the legacy UK 1-5 SAP Category Scale or the current UK Categories A-C). The tool is available as part of Qumodo Classify, via a Cloud API, or via a standalone API for tools such as Griffeye Analyze. The tool scales linearly and can categorize millions of images per hour.
Target group/intended user
Law enforcement
In use
Country of Origin
Date tool added to database
Sep 12, 2021 9:22 AM
Other tags
Visual AI