🖼️ Child abuse material removal and blocking


Technology · Organization · Tool/project · Country of Origin · Target group/intended user · Crime Phase · Website · Description

Cybertip Canada

Internet Service Providers

Cleanfeed Canada blocks customer access to known non-Canadian websites hosting child pornography, to reduce Canadians' exposure to child abuse images and to create a disincentive for those who access and distribute child pornography.

Cybertip Canada

Law enforcement

Project Arachnid is a platform for reducing the availability of child sexual abuse material (CSAM). It crawls the internet for known CSAM using digital fingerprints, and it provides an API that lets industry members improve and accelerate the detection of this harmful material on their own networks.

Cybertip Canada

Social media, Search engine companies

Rather than waiting for Project Arachnid to detect material and send a notice, industry can use Shield by Project Arachnid to quickly detect known CSAM on their service, which will, in turn, speed up its removal. Industry members that do not wish to interact directly with Shield by Project Arachnid can register their service/domain with the Canadian Centre to have any notices sent directly to them instead of being sent to their hosting provider. Other industries, such as filtering providers, can download real-time lists of URLs that are believed to currently contain CSAM for filtering purposes.
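As a rough illustration of how a filtering provider might use such a URL list, the sketch below checks requests against a downloaded blocklist. It is a minimal sketch only: the file name, the one-URL-per-line format, and the normalisation step are assumptions for illustration, not Shield by Project Arachnid's actual interface.

# Minimal sketch: filtering against a downloaded URL blocklist (format assumed to be
# one URL per line). Real integrations would refresh the list continuously and match
# URLs far more robustly than this exact-string lookup.

def normalise(url):
    """Lower-case and strip the scheme so http/https variants match the same entry."""
    url = url.strip().lower()
    for prefix in ("https://", "http://"):
        if url.startswith(prefix):
            url = url[len(prefix):]
    return url

def load_blocklist(path):
    """Load a plain-text URL list into a set of normalised entries."""
    with open(path, encoding="utf-8") as f:
        return {normalise(line) for line in f if line.strip()}

def is_blocked(url, blocklist):
    """Return True if the URL (ignoring scheme and case) appears on the list."""
    return normalise(url) in blocklist

blocklist = load_blocklist("shield_url_list.txt")   # hypothetical file name
print(is_blocked("http://example.org/some/path", blocklist))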

Dartmouth College

Social media, Law enforcement

eGlyph is a robust-hashing algorithm that can identify previously tagged images, video, and audio. eGlyph extends PhotoDNA, which operates only on images.
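PhotoDNA and eGlyph themselves are proprietary, so the sketch below only illustrates the general idea of robust hashing with a toy difference hash: unlike a cryptographic hash, its output changes little when an image is resized or re-encoded, so near-duplicates can still be matched. This is not the PhotoDNA or eGlyph algorithm, and it assumes the Pillow library is available.

# Toy "difference hash" to illustrate robust/perceptual hashing (not PhotoDNA or eGlyph).
# A cryptographic hash changes completely after re-encoding; this hash mostly does not.
from PIL import Image

def dhash(image_path, hash_size=8):
    """Shrink to (hash_size+1) x hash_size grayscale and compare adjacent pixels."""
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests near-duplicate images."""
    return bin(h1 ^ h2).count("1")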

EOKM and Web-IQ

HashCheckServer is a tool that lets hosting providers prevent the uploading of known child sexual exploitation material. Hosting parties need to be part of the solution, as they provide one of the gateways to publishing CSAM. EOKM therefore took the initiative to help hosting parties prevent the uploading of known CSAM and to identify existing material on their servers so that it can be deleted, and asked Web-IQ to develop software that allows hosting parties to check whether an image appears in the police's database of known CSAM.
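As a very rough sketch of the hash-check idea (not Web-IQ's actual HashCheckServer interface), the snippet below computes a cryptographic hash of an uploaded file and looks it up in a set of known hashes. The local set stands in for the police-managed database, which in reality would be queried through an authenticated service, and real systems also use perceptual hashes so that altered copies still match.

# Minimal sketch: check an upload against a set of known hashes before accepting it.
import hashlib

def sha1_of_file(path, chunk_size=1 << 20):
    """Stream the file in chunks so large uploads need not fit in memory."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Placeholder entries; in practice these come from the police-managed database.
known_hashes = {"da39a3ee5e6b4b0d3255bfef95601890afd80709"}

def should_block_upload(path):
    """Return True if the upload's SHA1 matches a known entry."""
    return sha1_of_file(path) in known_hashes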

Facebook

Facebook products

Camera Vision is a machine learning tool that identifies images that contain both nudity and a child.

Google

Content moderators

The Content Safety API sorts through many images and prioritizes the most likely child sexual abuse material (CSAM) content for review. The classifier can target content that has not previously been confirmed as CSAM.

Google/YouTube

Social media

CSAI Match is used to surface potential child sexual abuse images (CSAI) for YouTube uploads as well as partner submissions.

Krunam

Social media

Krunam provides breakthrough technology that identifies and classifies previously unknown CSAM images and video at scale. Its CSAM classifier protects online communities, employees, brand reputations, and, not least, millions of children around the world.

Microsoft

Law enforcement

PhotoDNA for Video builds on PhotoDNA technology, bringing all the benefits of PhotoDNA to the video environment.

Microsoft

Law enforcement

PhotoDNA Cloud Service enables businesses to protect their assets, interests, and customers by automatically detecting and reporting the distribution of child sexual exploitation and abuse images.

NetClean

Corporations and Organizations in general

NetClean ProTective is a technical solution used to protect business mobile devices by blocking access to URLs known to contain child sexual abuse material (CSAM). The uniquely combined and continually updated URL list, from some of the world’s primary sources, makes it an effective protective solution.

Qumodo

Law enforcement

Vigil AI automatically detects, evaluates, and categorizes child sexual abuse imagery. The system can determine the severity of the sexual act in the image (using the legacy UK 1-5 Category SAP scale or the current UK Categories A-C). The tool is available as part of Qumodo Classify, via a Cloud API, or via a standalone API for tools such as Griffeye Analyze. It scales linearly and can categorize millions of images per hour.

Qumodo

Law enforcement, Content moderators

Qumodo Classify is a software product that dramatically speeds up image categorization for content moderation and law enforcement purposes while reducing the psychological impact on users, saving both time and money. The tool has been specially designed around the latest AI technology, as well as 21 core psychology principles, to form an effective human/AI team. Besides reducing the psychological impact on the user, the tool is designed to prevent human bias from influencing the system and allows the software to learn new abilities over time, further reducing the time needed to manually classify large amounts of data. When paired with the Vigil AI Classifier (vigil.ai), it becomes a powerful tool in the fight against child sexual exploitation and abuse (CSEA) content.

Thorn

Content moderators, Social media

Safer is a child sexual abuse material (CSAM) detection, review, removal, and reporting pipeline. It gives small and medium-sized companies the same CSAM-fighting tools as the largest ones.

Thorn

Law enforcement

The programme is developing artificial intelligence classifiers to automate the detection of CSAM, creating a global standard for labeling, and connecting and organizing the world's data to help identify victims faster.

Two Hat

Law enforcement

Cease.AI for Law Enforcement helps reduce investigator workloads and mental stress by sorting, flagging, and removing non-CSAM, allowing investigators to focus their efforts on new child abuse images. Investigators upload case images, run their hash lists to eliminate known material, then let the AI identify, suggest a label for, and prioritize images that contain previously uncatalogued CSAM.

Videntifier Technologies

Law enforcement

Videntifier™ Forensic provides large-scale video identification, allowing databases of videos to be searched. It radically improves the forensic video identification process by providing law enforcement agencies with a robust, fast, and easy-to-use video identification system. Using this service, a single mouse click is sufficient to automatically scan an entire storage device and classify all videos.

Web-IQ

Law enforcement

Web-IQ has developed a serverless CSA detection AI that scans websites, hashes images, and compares the hashes to known CSA images. If there is a match, an automatic Notice of Takedown is sent to the website's hosting provider.

ZiuZ Forensic, Web-IQ, Timelex, DFKI, INHOPE

Law enforcement

The AviaTor project is funded by the European Union and is currently in its final phase. The project started in 2019 and will end in 2024. It aims to build a prioritisation tool/database for law enforcement processing NCMEC reports (also known as industry reports or CyberTipline referrals). The AviaTor database provides law enforcement with the tooling to prioritise these reports. AviaTor stands for Augmented Visual Intelligence and Targeted Online Research, meaning that the database uses visual intelligence as well as OSINT and hash matching to de-duplicate and prioritise reports. AviaTor is currently used by 19 national law enforcement agencies.

International Association of Internet Hotlines (INHOPE)

Corporations and Organizations in general, Private individuals

The ESCAPE project, funded by End Violence Against Children (EVAC), represents INHOPE's vision to put an ecosystem in place ensuring that every industry stakeholder and each member of the public around the globe has the option and awareness to report child sexual abuse material (CSAM) encountered online. The project is developing automation and intelligence tools for faster classification of CSAM reports, so that CSAM can be removed as soon as possible, preventing further circulation and re-victimisation of children.

Internet Watch Foundation (IWF)

Internet Service Providers

The Internet Watch Foundation's URL List is a database of known child sexual abuse material (CSAM) sites to assist with blocking and filtering.

Internet Watch Foundation (IWF)

Internet Service Providers

The Internet Watch Foundation's Hash List is a hash database of known child sexual abuse material (CSAM) available for service providers to automatically match known images before they appear on a service and remove illegal images already on services.

Internet Watch Foundation (IWF) and NetSweeper

Internet Service Providers, Law enforcement

The Internet Watch Foundation and Netsweeper Hash List is a collaboration between the two organisations to block and filter child sexual abuse material (CSAM) on the internet. There is no primary market; Netsweeper takes hashes and sends suspected URLs to the IWF.

Tech Matters

Private individuals, Children

A customisable, open-source contact centre platform that allows children and youth to reach out to helplines via voice, SMS, webchat, WhatsApp, and Facebook Messenger if they come across CSAM.

The National Center for Missing & Exploited Children (NCMEC)

Social media, Content moderators

NCMEC hosts the NGO hash sharing platform, which includes MD5 and SHA1 hash values, as well as pDNA signatures, of images and videos tagged as child sexual abuse material by NCMEC, the Internet Watch Foundation, and the Canadian Centre for Child Protection. These hashes and signatures are made available to ESPs and other industry partners for free, for use in their voluntary initiatives to detect, report, and remove CSAM files on their services and platforms.

As part of its Survivor Services program, NCMEC provides an exploitative hash sharing list, which includes MD5 and SHA1 hash values, as well as pDNA signatures, of images and videos tagged as "exploitative" by NCMEC. These files, which are not illegal under federal guidelines, depict, support, and/or promote the exploitation of victims who are seen in associated CSAM files. These hashes and signatures are made available to ESPs and other industry partners for use in their voluntary initiatives to detect, report, and remove files in support of survivors' privacy and protection.

NCMEC also hosts the Industry Hash Sharing Platform, which includes MD5 and SHA1 hash values, as well as pDNA signatures, of images and videos tagged as child sexual abuse material by industry members themselves. These hashes and signatures are made available by ESPs to other ESPs and industry partners for use in their voluntary initiatives to detect, report, and remove CSAM files.

NCMEC's Hash Sharing Platform is the infrastructure and technical system that hosts the NGO CSAM, Industry, and NCMEC Exploitative hash lists. This technical system supports the file information, hashes and signatures, and file tags that allow for these robust sharing initiatives.
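To make the shared records concrete, the sketch below shows one plausible shape for an entry on such a hash list: a file's MD5 and SHA1 digests, a tag distinguishing CSAM from "exploitative" content, and the contributing organisation. The field names and structure are assumptions for illustration, not NCMEC's actual schema, and pDNA (PhotoDNA) signatures are omitted because they require licensed tooling.

# Hypothetical record shape for a hash-sharing entry (not NCMEC's actual schema).
from dataclasses import dataclass

@dataclass(frozen=True)
class HashEntry:
    md5: str           # 32 hex characters
    sha1: str          # 40 hex characters
    tag: str           # e.g. "csam" or "exploitative", set by the contributing list
    contributor: str   # e.g. "NCMEC", "IWF", "C3P"

def build_index(entries):
    """Index entries by both digest types so a match on either hash is found quickly."""
    index = {}
    for entry in entries:
        index[entry.md5] = entry
        index[entry.sha1] = entry
    return index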

🖼️ Child abuse material removal and blocking · 🛑 Consumption of child abuse material prevention · 🆔 Identification · 🚨 Grooming detection/prevention · 🗃️ Data management and analysis · 🕵️‍♀️ Perpetrator investigation · 🧑‍⚖️ Perpetrator prosecution · 🤝 Agency/organization collaboration