Google announces steps to combat nonconsensual sexually explicit deepfakes

The announcement comes after mounting scrutiny of Google over its role in perpetuating the spread of deepfakes

After a staggering increase in the number of fake pornographic videos and images uploaded online in the last several years, Google on Wednesday announced new measures to assist victims and reduce the prominence of deepfakes in top search results.

The search engine also committed to deranking websites that frequently host nonconsensual sexually explicit fake videos, known as deepfakes, meaning those sites may appear lower in search results. Deepfakes are misleading fake media, increasingly created with artificial-intelligence tools.

Nonconsensual sexually explicit deepfakes often “swap” a victim’s face onto the body of a person in a pre-existing pornographic video. Generative AI tools have also been used to create fake but realistic sexually explicit images that depict real people, or “undress” real photos to make victims appear nude. The practice overwhelmingly affects women and girls, both public figures and, increasingly, girls in middle and high schools around the world.

In 2023, more nonconsensual sexually explicit deepfakes were posted online than in all previous years combined. Google and other search engines have directed traffic to websites where deepfake creators profit, and have shown links to deepfake videos and deepfake images in top search results. Google has also included links to tools used to create nonconsensual sexually explicit deepfakes in top results.

In its announcement Wednesday, Google said it will aim to filter explicit content from similar searches after victims successfully request the removal of explicit nonconsensual fake imagery through an online form. Currently, victims have to flag each URL containing the imagery. 

Google also said that it will scan for and remove duplicates of nonconsensual sexually explicit deepfake images from search results after images are successfully flagged and taken down. 

U.S. & World

Stories that affect your life across the U.S. and around the world.

Woman told House ethics committee she saw Gaetz have sex with minor, her lawyer says

Iran said it won't try to kill Trump: U.S. official

“These efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them popping up in the future,” Google’s announcement said.

Google is not proactively scanning for new deepfakes, and will remove them only after a victim successfully flags them.

NBC News previously reported that when safe-search tools are turned off, results for queries like “deepfakes” and “fake nudes” would surface the material in top results, above relevant news articles about the growing trend. 

Now, Google said it aims to rank relevant news articles above deepfakes, including when someone is searching for a person’s name and the word “deepfakes.” 

“With these changes, people can read about the impact deepfakes are having on society, rather than see pages with actual non-consensual fake images,” the announcement said. 

Google also said it will demote, in its search results, websites associated with a high number of deepfake removal requests. One of the most prominent websites for nonconsensual sexually explicit deepfakes, which ranks highly in some Google search results, has used a variety of tactics to monetize the material.

“This approach has worked well for other types of harmful content, and our testing shows that it will be a valuable way to reduce fake explicit content in search results,” Google said.

The announcement follows pressure from lawmakers to address the issue. In June, Senate Judiciary Chair Dick Durbin, D-Ill., sent a letter to Google’s CEO asking for details on how it plans to combat deepfakes. Federal legislation introduced by Durbin would allow nonconsensual sexually explicit deepfake victims to sue perpetrators; it passed the Senate last week and is awaiting a vote in the House.

This story first appeared on NBCNews.com.

Copyright NBC News