Microsoft has announced a collaboration with StopNCII to proactively remove non-consensual intimate images and videos from its Bing search engine using digital hashes generated by users.

In March, Microsoft shared its PhotoDNA technology with StopNCII, enabling users to create digital hashes without the actual media ever leaving their devices. The technology generates a unique digital fingerprint of a harmful image, allowing copies of it to be identified and removed across various platforms.
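PhotoDNA itself is proprietary and its algorithm is not public, but the general idea of hash-based matching can be sketched with a toy "average hash": the image is reduced to a short bit string (the fingerprint), and platforms compare fingerprints rather than the images themselves. Everything below, including the pixel values, is illustrative only and not how PhotoDNA actually works.

```python
# Toy perceptual hash, for illustration only. PhotoDNA is proprietary;
# this "average hash" merely demonstrates the concept: an image becomes
# a compact fingerprint, and only fingerprints are ever shared.

def average_hash(pixels):
    """Return a bit-string fingerprint: 1 if a pixel is >= the mean, else 0."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p >= mean else "0" for p in pixels)

def hamming_distance(a, b):
    """Count differing bits; small distances suggest near-duplicate images."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical grayscale pixel values for an image and a lightly edited copy.
original = [12, 200, 34, 180, 90, 150, 60, 210]
slightly_edited = [14, 198, 30, 182, 95, 149, 58, 205]

h1 = average_hash(original)
h2 = average_hash(slightly_edited)

# Near-duplicates produce identical or very close hashes, so a platform
# can match a reported image without ever receiving the image itself.
print(h1, h2, hamming_distance(h1, h2))  # → 01010101 01010101 0
```

The key property this illustrates is why the media never has to leave the user's device: only the short, irreversible fingerprint is uploaded to the StopNCII database for matching.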

As part of this effort, Microsoft has integrated StopNCII's hash database into Bing, enabling the removal of matching intimate images from its search index. By the end of August, Microsoft had taken action on more than 268,000 images flagged through the system.


With the rise of artificial intelligence, there has also been an increase in AI-generated deepfake nude images created from non-intimate photos shared online. Despite being fake, these images can cause significant distress to the individuals depicted. AI-generated content is difficult to match against existing PhotoDNA hashes, however, so such images currently require manual reporting to be removed from search engines.

Those impacted by real or synthetic intimate images can use Microsoft's "Report a Concern" page to request removal from Bing. While Google has not yet joined this initiative, it offers its own process for removing such content from its search results.
