
Microsoft Partners with StopNCII to Combat Synthetic Nude Images on Bing


The advancement of generative AI tools has given rise to a disturbing problem on the internet: the proliferation of synthetic nude images that closely resemble real people. The issue, commonly known as deepfake porn, has compounded the harm faced by victims of revenge porn. Microsoft has now taken a major step toward addressing it by partnering with StopNCII, an organization that helps revenge porn victims remove explicit images from platforms across the web.

StopNCII lets victims create a digital fingerprint, or “hash,” of an explicit image directly on their own device, so the image itself never has to be uploaded anywhere. Those hashes are then shared with StopNCII’s partners, including Facebook, Instagram, Threads, TikTok, Snapchat, Reddit, Pornhub, OnlyFans, and now Microsoft’s Bing. By matching content against these fingerprints, the platforms can scrub copies of the flagged images from their search results and user feeds.
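To make the mechanism concrete, here is a minimal sketch in Python of how an image fingerprint and a shared hash list might fit together. It uses a simple average hash purely for illustration; StopNCII’s actual hashing algorithm and partner APIs are not described here, so the function names and values below are hypothetical.

from PIL import Image

def average_hash(path, hash_size=8):
    # Downscale to 8x8 grayscale and mark each pixel as above or below
    # the mean brightness, producing a 64-bit fingerprint as hex.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return f"{int(bits, 2):016x}"

# Hashes submitted by victims through a service like StopNCII; the
# platform receives only these fingerprints, never the images themselves.
blocked_hashes = {"f0e1d2c3b4a59687"}  # hypothetical example value

def is_blocked(path):
    return average_hash(path) in blocked_hashes

Real systems rely on perceptual hashes that are far more robust to cropping and re-encoding, but the flow is the same: compute a fingerprint, check it against the shared list, and suppress or review any match.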

In a blog post, Microsoft revealed that during a pilot of StopNCII’s database, it had taken action on 268,000 explicit images surfacing in Bing’s image search by the end of August. The move underscores Microsoft’s commitment to combating revenge porn and protecting victims. The company acknowledged that its previous approach, which relied solely on user reporting, did not adequately address the scale and risk of explicit imagery spreading via search engines.
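At search time, a hash list like this could be applied as a filter over candidate results rather than waiting for user reports. The sketch below is hypothetical and not Bing’s implementation; it simply extends the fingerprint idea above with a Hamming-distance check so that near-duplicate copies of a reported image are also caught.

def hamming_distance(h1, h2):
    # Number of differing bits between two hex-encoded 64-bit hashes.
    return bin(int(h1, 16) ^ int(h2, 16)).count("1")

def filter_results(result_hashes, blocked_hashes, max_distance=4):
    # Drop any candidate result whose fingerprint sits within a small
    # Hamming distance of a reported hash (an approximate match).
    return [
        h for h in result_hashes
        if all(hamming_distance(h, b) > max_distance for b in blocked_hashes)
    ]

The threshold is illustrative: a tighter distance reduces false positives, while a looser one catches more re-encoded or lightly edited copies.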

While Microsoft’s partnership with StopNCII is commendable, the larger question remains: what about Google? Despite offering its own tools to report and remove explicit images from search results, Google has faced criticism for not partnering with StopNCII. According to a Wired investigation, former employees and victims have called out Google for its failure to take more decisive action against deepfake porn. In South Korea alone, Google users have reported 170,000 search and YouTube links related to unwanted sexual content since 2020.

The prevalence of AI deepfake nude images is alarming, and it isn’t just adults who are affected. “Undressing” sites have become a nightmare for high school students across the United States. Unfortunately, there is no federal law specifically targeting AI deepfake porn to hold perpetrators accountable, leaving enforcement to a patchwork of state and local laws.

However, some progress is being made on this front. San Francisco prosecutors announced a lawsuit in August aimed at taking down 16 of the most prominent “undressing” sites. Wired has created a tracker for deepfake porn laws, which shows that 23 American states have passed laws addressing nonconsensual deepfakes, while nine have rejected proposed laws.

In conclusion, Microsoft’s partnership with StopNCII is a significant step towards combating the spread of synthetic nude images on the internet. By using digital fingerprints to remove explicit images from Bing’s search results, Microsoft is actively working to protect victims of revenge porn. However, the issue remains prevalent on other platforms, particularly Google, which has faced criticism for its inaction. As the problem of deepfake porn persists, it is crucial for both tech companies and lawmakers to continue developing comprehensive solutions to protect individuals from the harmful effects of synthetic nude images.