Politics has become even more complicated for women leaders, with the rise of generative AI posing a far greater risk to female politicians than to their male counterparts, according to a report released this week.
The analysis, published by disinformation think tank the American Sunlight Project (ASP) via The 19th on Dec. 11, uncovered more than 35,000 pieces of digitally altered nonconsensual intimate imagery (NCII) depicting 26 members of Congress. Broken down, ASP's numbers outline a stark reality for victims of NCII: of the 26 lawmakers depicted in those thousands of AI-generated images, 25 were women. Only one was a man.
Across the study, women members of Congress were 70 times more likely than men to be targeted by nonconsensual synthetic images, and roughly 1 in 6 congresswomen (about 16 percent) have been victims of nonconsensual intimate imagery.
Leaders across the political aisle have attempted to address the rise of both NCII and synthetic AI-generated images, but have been slow to reach consensus. In January, a bipartisan group of senators introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act), intended to criminalize the spread of nonconsensual, sexualized "digital forgeries" made with AI. The Senate passed it in July, but the bill is still awaiting consideration in the House.
Last week, the Senate passed the Take It Down Act, introduced by Senator Ted Cruz in June. The bill similarly criminalizes publishing digitally manipulated deepfakes online, and it also imposes penalties on companies that fail to remove such content within 48 hours of it being reported.
But the gendered reality of AI-boosted imagery can't be overstated, especially as women leaders navigate a precarious online environment that puts them at greater risk of sexual abuse. In August, the Center for Countering Digital Hate published a study on the rise of online hate and negative engagement on the social profiles of women politicians. In an analysis of the Instagram profiles of 10 female incumbents, one in 25 comments was "highly likely" to be toxic, and Instagram failed to act on 93 percent of reported abusive comments targeting female politicians.
“We need to kind of reckon with this new environment and the fact that the internet has opened up so many of these harms that are disproportionately targeting women and marginalized communities,” said American Sunlight Project founder and author Nina Jankowicz. “My hope here is that the members are pushed into action when they recognize not only that it’s affecting American women, but it’s affecting them. It’s affecting their own colleagues. And this is happening simply because they are in the public eye.”
ASP informed the offices of the affected politicians, alerting them to the AI-generated NCII. Nearly all of the images were removed following the notice, although the organization did not receive any responses.
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.