After a historic presidential debate filled with talk of eating pets, Taylor Swift closed the night with a bang. Arguably the most powerful figure in American pop culture, the singer-songwriter chose debate night to announce on Instagram that she plans to vote for Kamala Harris in the presidential election.
Swift’s endorsement is monumental. She has enough political clout to push tens of thousands of Americans to register to vote simply by sharing a link. But more strikingly, she also used her announcement to voice her concerns about AI fakes.
Swift wrote on Instagram: “I recently learned that an AI of ‘me’ falsely endorsing Donald Trump’s presidential bid was posted on his site. It really brought up my fears about AI and the dangers of spreading misinformation. It led me to the conclusion that I need to be very transparent about my actual plans for this election as a voter. The easiest way to combat misinformation is with the truth.”
Swift’s statement felt more personal because she wrote about her own experience of being deepfaked into showing support for a candidate she doesn’t actually plan to vote for.
“Her statement, in my opinion, was very well thought out and written in a very compelling way, but the AI piece gives her a personal point of view that not everyone else would have on this election and what the candidates are doing,” Linda Bloss-Baum, a professor in American University’s Business and Entertainment program, told Tech News.
Celebrities, especially those as prominent as Swift, are particularly vulnerable to deepfakes: there are enough photos and videos of them online to produce highly convincing AI fakes.
“One of the things I’m seeing a lot of in my practice right now is the rise of AI impersonators across the board trying to get sponsorships,” Noah Downs, an intellectual property and entertainment attorney, told Tech News in August. These fake AI sponsorships have become so common that even “Shark Tank” had to run a PSA to warn fans about the prevalence of scams impersonating the show’s investors.
As for Swift, the artist has been the subject of viral, non-consensual AI-generated pornography, sparking debates among lawmakers looking to legislate against this harmful byproduct of generative AI.
“Unfortunately, this happens all the time with ordinary people whose name, image and likeness have been falsified by artificial intelligence products,” Bloss-Baum said.
But when celebrities like Swift are targeted, lawmakers tend to pay more attention.
“As a long-time lobbyist in the entertainment industry, I can tell you that you get more attention when you go to Capitol Hill with celebrities telling their stories,” she said.
When deepfakes play a role in the election for one of the most powerful offices in world politics, the stakes are considerably higher than an uncanny valley version of Lori Greiner selling dietary supplements. But as Election Day approaches, the United States has little to no legislative ability to deter the spread of this disinformation on social media, where more voters than ever get their information.
“Unfortunately, AI is playing a bigger role in this election, just because of the proliferation of technology,” Bloss-Baum said. “We’ve been subject to robocalls in the past, but now the technology has gotten so good that you can actually fake them in such a way that the people receiving them won’t necessarily know it’s not the candidate.”
Bloss-Baum said that since Swift is a Tennessee resident, she could sue former President Trump under the ELVIS Act. However, since the law is so new, there is little legal precedent. Still, Bloss-Baum believes that both consumers and celebrities will have more power to fight back if federal legislation passes. She sees the bipartisan NO FAKES Act as particularly promising, but it’s unlikely there will be any significant legislative changes before the election in early November.
“I’m sure campaigns are using AI for positive things, like data collection and analysis, but we need to be careful that AI doesn’t misrepresent candidates,” Bloss-Baum said.