Organization | Target group/intended user | Description |
---|---|---|
Global Emancipation Network | Law enforcement, child sexual abuse investigators | Global Emancipation Network is excited to announce the latest counter-trafficking tool in our arsenal: Artemis. We teamed up with our friends at Splunk, Accenture, and Graphistry to produce a first-of-its-kind trafficking content classifier. Artemis is a proactive, automated solution for counter-human-trafficking stakeholders to increase efficiency in investigations and disruptions by identifying high-risk establishments, individuals, and content. Our initial pilot focused on the illicit massage industry, categorizing massage businesses based on customer activity, staffing, location, services offered, imagery, and disciplinary actions. Using this data and advanced analytics, we created risk scores and tiers for targeted action. |
Paliscope | Law enforcement | YOSE is an AI-driven search engine that lets you instantly track down intelligence within any file type, even from the largest, most unstructured stockpiles of locally stored data. |
Stop the Traffik | Private individuals | Human trafficking is happening in plain sight all around us. Have you ever seen something that looks out of place? Does uncertainty stop you from speaking up? Every human trafficking story, no matter how long or short, whether current or historic, is important and relevant. It may be the missing piece of the puzzle, contributing to a larger and more accurate picture of human trafficking, which can then inform efforts to combat it. Technology places the power to prevent into the hands of anyone who holds a mobile device. The STOP APP can be downloaded by anyone, anywhere in the world, who has access to a smartphone. The app is anonymous, confidential and secure. It is available in seven languages and allows you to report suspicious activity quickly by sending text-based messages and uploading photos and videos. |
Thomson Reuters, Mekong Club | Financial institutions | The Toolkit is designed to help financial institutions fight human trafficking using data. Tailored for use in the Asia Pacific region, it includes a set of 'red flags' - potential indicators of modern slavery - linked to suspicious patterns in financial transactions, customer data and behaviour. It also includes contextual information and case studies to widen the financial services sector's understanding of this multi-faceted crime. |
Thorn | Law enforcement | Spotlight is a web-based application that helps law enforcement prioritize leads in their sex trafficking investigations. There are over 150,000 escort ads posted daily; Spotlight takes this massive amount of data and turns it into an asset for law enforcement. To date, Spotlight has helped law enforcement identify an average of 8 children per day and has saved up to 60% of critical investigative time. |
Global Emancipation Network | Law enforcement, child sexual abuse investigators | Global Emancipation Network grants organizations with valid counter-trafficking missions access to Minerva, where they can host and explore millions of trafficking-related data records and use customized search, alerting, geolocation and other platform capabilities. Using Splunk Enterprise and third-party integrated technologies, Minerva protects case-sensitive information and monitors usage patterns to help keep user information private, safe and secure. Leading public, private and nonprofit organizations have already been accepted to the platform as early users. Minerva is equipped with data-processing capabilities to extract and organize information from a variety of data sources. Capabilities include: advertisement analysis, which analyzes advertisements from the deep and open web, where most trafficking cases originate, and extracts data such as user, location, account and other identifying information; image processing tools, which process images of victims to reduce the time users spend analyzing photographs and manually linking them to advertisements (Minerva integrates image analysis tools to tag photographs with characteristics to expedite database search, and reverse image search to identify similar images); text analysis tools and natural language processing, which extract text in images from advertisements and flag correlations with missing persons reports and other valuable information; and a multi-tenant system informing trend analysis, which allows all Minerva users to store their information securely on the same database at the same time, enabling secure, multi-agency collaboration on shared investigations. |
Exchange Initiative | Private individuals | TraffickCam allows anyone with a smartphone to fight sex trafficking when they travel by uploading photos of hotel rooms to a law enforcement database. Photos uploaded to the free TraffickCam app are added to an enormous database of hotel room images. Federal, state and local law enforcement securely submit photos of hotel rooms used in the advertisement of sex trafficking victims to TraffickCam. Features such as patterns in the carpeting, furniture, room accessories and window views are matched against the database of traveler images to provide law enforcement with a list of potential hotels where the photo may have been taken, to identify the location of victims. |
Cybertip Canada | Law enforcement | Project Arachnid is a platform for reducing the availability of child sexual abuse material (CSAM). It crawls the internet for known CSAM using digital fingerprints, and provides an API for industry members to improve and accelerate the detection of this harmful material on their own networks. |
Dartmouth College | Social media, law enforcement | eGlyph is a robust-hashing algorithm that can identify previously tagged images, video, and audio. eGlyph extends PhotoDNA, which operates only on images. |
Facebook | Facebook products | Camera Vision is a machine learning tool that identifies images that contain both nudity and a child. |
Interpol, Inspectoratul General AL Politiei, ZiuZ Visual Intelligence, NCIS Norway | Law enforcement | Project CPORT is funded by the European Union. It is a 2-year project that started in January 2023. The CPORT portal allows law enforcement agencies to access ICCAM - a database used by national hotlines to process and exchange reports of child sexual abuse material (CSAM) submitted by the public worldwide. The ICCAM platform is hosted by INTERPOL and is an important channel into INTERPOL's international child sexual exploitation database and its list of known domains disseminating severe abuse material (IWOL). |
Microsoft | Law enforcement | PhotoDNA for Video builds on PhotoDNA technology, bringing all the benefits of PhotoDNA to the video environment. |
Microsoft | Law enforcement | PhotoDNA Cloud Service enables businesses to protect their assets, interests, and customers by automatically detecting and reporting the distribution of child sexual exploitation and abuse images. |
NetClean | Corporations and organizations in general | NetClean ProActive works similarly to an anti-virus program, but instead of detecting a virus, ProActive detects images and videos that law enforcement agencies have classified as child sexual abuse material. |
Qumodo | Law enforcement | Vigil AI automatically detects, evaluates and categorizes child sexual abuse imagery. The system is capable of determining the severity of the sexual act in the image (using the legacy UK 1-5 category SAP scale or the current UK categories A-C). The tool is available as part of Qumodo Classify, via a cloud API, or via a standalone API for tools such as Griffeye Analyze. The tool scales linearly and can categorize millions of images per hour. |
Qumodo | Law enforcement, content moderators | Qumodo Classify is a software product which dramatically speeds up image categorization for content moderation for law enforcement purposes while reducing the psychological impact on the users, saving both time and money. The tool has been specially designed around the latest AI technology, as well as 21 core psychology principles, to form an effective human/AI team. As well as reducing the psychological impact on the user, the tool is designed to prevent human bias from influencing the system and enables the software to learn new abilities over time, further reducing the time needed to manually classify large amounts of data. When paired with the Vigil AI Classifier (vigil.ai), it becomes a powerful tool in the fight against child sexual exploitation and abuse (CSEA) content. |
Thorn | Law enforcement | The programme is developing artificial intelligence classifiers to automate the detection of CSAM, creating a global standard for labeling, connecting, and organizing the world's data to help identify victims faster. |
Thorn | Content moderators, social media | Safer is a child sexual abuse material (CSAM) detection, review, removal and reporting pipeline. It gives small and medium-sized companies the same CSAM-fighting tools as the largest ones. |
Two Hat | Law enforcement | Cease.AI for Law Enforcement helps reduce investigator workloads and mental stress by sorting, flagging and removing non-CSAM, allowing investigators to focus their efforts on new child abuse images. Investigators upload case images and run their hash lists to eliminate known material, then let the AI identify, suggest a label for, and prioritize images that contain previously uncatalogued CSAM. |
Web-IQ | Law enforcement | Web-IQ has developed a serverless CSA-detection AI that scans websites, hashes images, and compares the hashes to known CSA images. If there is a match, an automatic notice of takedown is sent to the host of the website's server. |
Aiba AI | Social media, children | Amanda is a proactive safety platform designed to protect children in social environments by detecting grooming and accurately profiling age and gender. Amanda uses keystroke dynamics and metadata from chats to determine the gender and age group of users. By analyzing writing style, Amanda can identify whether users are who they claim to be. Amanda also analyzes messages using methods such as stylometry, behavioral patterns, text, and image analysis to continuously assess the risk of the conversation. This analysis results in a continuous risk score that enables real-time prioritization of potentially dangerous conversations. |
Freedom Signal | Law enforcement | Freedom Signal is an online platform that helps advocates develop ongoing relationships with potential victims through texting. We enable direct service organizations to send targeted text-based outreach to potential victims identified through web scraping. When a potential victim replies, advocates can build trust with vulnerable populations in acute crisis. Freedom Signal's outreach technology was designed by software engineers and survivors of online sex trafficking to address the specific needs of this population, while ensuring a safe, direct channel of communication. |
Population Foundation of India | Children | Equips adolescents with tools, information, and resources to identify and report online CSEA in India, using an artificially intelligent chatbot. |
Swansea University | Law enforcement | Tools based on integrated AI and linguistics that enable law enforcement to spot online grooming content in real time. The project will impart specialist knowledge through a learning portal and chatbot to strengthen professionals' abilities to shield children from online grooming. |
International Association of Internet Hotlines (INHOPE) | Corporations and organizations in general, private individuals | The ESCAPE project, funded by End Violence Against Children (EVAC), represents INHOPE's vision to put an ecosystem in place ensuring that every industry stakeholder and each member of the public around the globe has the option and awareness to report child sexual abuse material (CSAM) encountered online. The project is developing automation and intelligence tools for faster classification of reported CSAM, so that it can be removed as soon as possible, preventing further circulation and re-victimisation of children. |
Internet Watch Foundation (IWF) | Internet service providers | The Internet Watch Foundation's Hash List is a hash database of known child sexual abuse material (CSAM) that lets service providers automatically match known images before they appear on a service and remove illegal images already on their services. |
Internet Watch Foundation (IWF) and Netsweeper | Internet service providers, law enforcement | The Internet Watch Foundation and Netsweeper Hash List is a collaboration between the two organisations to block and filter child sexual abuse material (CSAM) on the internet. There is no primary market: Netsweeper takes hashes and sends suspected URLs to the IWF. |
Tech Matters | Private individuals, children | A customisable, open-source contact centre platform that allows children and youth to reach out to helplines via voice, SMS, webchat, WhatsApp, and Facebook Messenger if they come across CSAM. |
The National Center for Missing & Exploited Children (NCMEC) | Social media, content moderators | NCMEC hosts the NGO hash sharing platform, which includes MD5 and SHA1 hash values, as well as PhotoDNA (pDNA) signatures, of images and videos tagged as child sexual abuse material by NCMEC, the Internet Watch Foundation and the Canadian Centre for Child Protection. These hashes and signatures are made available to ESPs and other industry partners for free, for use in their voluntary initiatives to detect, report and remove CSAM files on their services and platforms. As part of our Survivor Services program, NCMEC provides an exploitative hash sharing list, which includes MD5 and SHA1 hash values, as well as pDNA signatures, of images and videos tagged as "exploitative" by NCMEC. These files, which are not illegal under federal guidelines, depict, support, and/or promote the exploitation of the victims seen in associated CSAM files. These hashes and signatures are made available to ESPs and other industry partners for use in their voluntary initiatives to detect, report and remove files in support of survivors' privacy and protection. NCMEC also hosts the Industry Hash Sharing Platform, which includes MD5 and SHA1 hash values, as well as pDNA signatures, of images and videos tagged as child sexual abuse material by industry themselves. These hashes and signatures are made available by ESPs to other ESPs and industry partners for use in their voluntary initiatives to detect, report and remove CSAM files. NCMEC's Hash Sharing Platform is the infrastructure and technical system which hosts the NGO CSAM, Industry and NCMEC Exploitative hash lists; it stores the file information, hashes, signatures, and file tags that make these robust sharing initiatives possible. |
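
Several tools in the table (Project Arachnid, Web-IQ's detection AI, the IWF Hash List, NCMEC's hash sharing platforms) rely on the same basic mechanism: compute a digest of each file and check it against a shared list of known-CSAM hashes. The sketch below illustrates that exact-match step in Python; the hash value shown is a placeholder (the MD5 of an empty file), not an entry from any real list, and the function names are illustrative, not from any of the products above.

```python
import hashlib

# Placeholder stand-in for a distributed known-hash list (real lists are
# provided to vetted partners by organizations such as IWF and NCMEC).
KNOWN_MD5_HASHES = {
    "d41d8cd98f00b204e9800998ecf8427e",  # placeholder: MD5 of an empty file
}


def md5_of_file_bytes(data: bytes) -> str:
    """Return the hex MD5 digest of a file's raw bytes."""
    return hashlib.md5(data).hexdigest()


def matches_known_list(data: bytes, known_hashes: set) -> bool:
    """Exact-match check: True if the file's digest appears in the list."""
    return md5_of_file_bytes(data) in known_hashes
```

Cryptographic digests such as MD5 and SHA1 only catch byte-identical copies; that is why several rows also mention perceptual signatures such as PhotoDNA, which can match images that have been re-encoded or slightly altered.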