Google is partnering with StopNCII, a UK-based non-profit, to combat the spread of non-consensual intimate imagery (NCII), also known as revenge porn. Google will integrate StopNCII's hash-matching technology into its search engine, using digital fingerprints of images to keep known NCII out of search results. Victims can create a hash of their images through StopNCII's website; the hash is generated on their own device, and the photo itself never leaves it. The organisation then shares the hash with participating platforms.
If a matching image is uploaded to a partner platform, it can be removed automatically or even blocked before it goes live. The hash-matching approach covers only known images, however; it cannot catch newly created AI-generated content, audio, or text. Google already has measures in place for NCII, including a takedown-request system, and the collaboration with StopNCII aims to streamline the removal process and reduce the burden on victims.
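The flow described above can be sketched in a few lines. This is an illustrative sketch only: the function names and sample data are hypothetical, and it uses a cryptographic hash (which matches exact copies) for simplicity, whereas a production system like StopNCII's would use perceptual hashing so that resized or re-encoded copies also match. The key property shown is that only the fingerprint is shared, never the image itself.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of an image locally, on the user's device.

    Illustrative only: SHA-256 matches exact byte-for-byte copies;
    real deployments are understood to use perceptual hashes that
    tolerate resizing and re-encoding.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes the clearinghouse shares with participating platforms.
# Hypothetical data: the image bytes below stand in for a real photo.
blocked_hashes = {fingerprint(b"example-flagged-image")}

def should_block(uploaded: bytes) -> bool:
    """Platform-side check at upload time: match against known hashes."""
    return fingerprint(uploaded) in blocked_hashes

print(should_block(b"example-flagged-image"))  # True: known image is caught
print(should_block(b"unrelated-photo"))        # False: unknown content passes
```

This also makes the stated limitation concrete: a brand-new AI-generated image has no entry in the shared hash set, so a lookup like this cannot flag it.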
StopNCII partners with other tech companies, including Meta, Reddit, Pornhub, OnlyFans, Snap, Microsoft, and X. Google's product manager, Griffin Hunt, stated that more needs to be done to reduce the burden on those affected by NCII. If users are over 18 and have photos they want to flag proactively, they can create a case on StopNCII's website.




