Google Fights Non-Consensual Images

18 September 2025

Google is partnering with StopNCII, a UK-based non-profit, to combat the spread of non-consensual intimate imagery (NCII), also known as revenge porn. Google will integrate StopNCII's hash-matching technology into its search engine, which uses digital fingerprints (hashes) of images to keep matching content from appearing in search results. Users can create a hash of their images on StopNCII's website; the photo itself never leaves their device. The organisation then shares the hash with participating platforms.

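To illustrate the idea, here is a minimal sketch of hash-based matching. It is not StopNCII's actual implementation: real systems use perceptual hashes that still match after resizing or re-encoding, whereas the SHA-256 fingerprint below only matches byte-identical files. It simply shows that only a hash, never the photo, needs to be shared.

```python
import hashlib

# Illustrative sketch only. Production NCII matching relies on perceptual
# hashes that survive resizing or re-encoding; SHA-256 here just stands in
# for the principle that a fingerprint, not the image, leaves the device.

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of the image locally, on the user's device."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_blocked(upload_bytes: bytes, shared_hashes: set[str]) -> bool:
    """A participating platform checks an upload's hash against the shared list."""
    return fingerprint(upload_bytes) in shared_hashes

# The user hashes a photo locally; only the hash is shared with platforms.
original = b"...raw image bytes..."
shared_hashes = {fingerprint(original)}

# A later upload of the same file produces the same hash and can be blocked
# before it goes live, or removed if already published.
print(is_blocked(original, shared_hashes))             # True  -> block / remove
print(is_blocked(b"some other image", shared_hashes))  # False -> unaffected
```
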
If a flagged image is uploaded to a partner platform, it can be removed automatically or blocked before it ever goes live. StopNCII's system only covers known images; it does not detect AI-generated content, audio, or text. Google already has its own measures for NCII, including a takedown-request system, and the collaboration with StopNCII aims to streamline removals and reduce the burden on victims.

StopNCII also partners with other tech companies, including Meta, Reddit, Pornhub, OnlyFans, Snap, Microsoft, and X. Google's product manager, Griffin Hunt, said that more needs to be done to reduce the burden on those affected by NCII. Users over 18 who want to flag photos proactively can create a case on StopNCII's website.
