On Sunday, former President Donald Trump posted a collection of memes on Truth Social, the platform owned by his media company, that make it appear as though Taylor Swift and her fans are coming out in support of his candidacy. But as new legislation takes effect, these images could have deeper implications for the use of AI-generated imagery in political campaigns, especially when that imagery misrepresents a celebrity’s likeness.
“One of the things I’m seeing a lot of in my practice right now is the rise of AI impersonators across the board for endorsements,” Noah Downs, an IP and entertainment lawyer, told TechCrunch, on the condition that his comments not be taken as legal advice. These fake AI endorsements have become so widespread that even “Shark Tank” had to publish a PSA warning fans about the prevalence of scams impersonating the show’s investors.
In one of the images Trump posted, hordes of young women wear matching “Swifties for Trump” t-shirts. While there is indeed political diversity among Swift’s large fan base, the images appear to be AI-generated; in fact, they come from a satirical post on X.
Another meme that Trump posted is a screenshot from X that depicts Taylor Swift as Uncle Sam, declaring, “Taylor wants you to vote for Donald Trump.”
Though the pop icon has not yet commented on the 2024 U.S. presidential election, she came out in support of the Biden-Harris campaign in 2020 and publicly disparaged Trump at the time. Some fans speculated that Swift had subtly endorsed Harris in an Instagram post this month, though this was not the case.
As one of the most dominant figures in pop culture, Swift has been subject to her fair share of deepfakes. When non-consensual, explicit AI images depicting Swift went viral on X this year, some lawmakers responded by introducing new bills aiming to protect against deepfakes. Even White House Press Secretary Karine Jean-Pierre called on Congress to do something.
Eight months later, the landscape of legal protections against misleading synthetic media already looks different. In Tennessee, where Swift’s corporate representation is based, Governor Bill Lee signed the trailblazing ELVIS Act into law in March, carving out explicit protections for artists against unauthorized AI imitations of their voice and likeness.
“This legislation was passed with bipartisan support, because everyone appears to recognize the problems that AI and misuse of AI tools can present to the public,” Downs said.
But since the ELVIS Act is so new, there isn’t precedent for how it could be used to protect artists. Much of the legislation’s language focuses specifically on AI-generated audio that can mimic an artist’s voice, like the viral Drake song that turned out to be fake.
“I do think that this is going to be a long-term issue that the ELVIS Act is very prescient in taking care of, but we need to have more robust national legislation about it,” Downs said. The only reason the ELVIS Act may even be in play is Swift’s connection to the state, where she has business and real estate holdings.
Avi D. Kelin, a partner at PEM Law whose practice focuses on political law, is not optimistic that the ELVIS Act could apply here, since the law appears more concerned with audio-based impersonation than with imagery. Instead, he wonders whether this could become a federal election integrity concern in the future.
“The larger question is whether the Federal Election Commission, which has jurisdiction over political communications, will get involved,” Kelin told TechCrunch. But he said the FEC doesn’t seem likely to roll out new guidelines on AI-generated political communications this election cycle.
The Federal Communications Commission (FCC), however, announced it is moving forward with plans to enact new AI transparency requirements on TV and radio advertisements. But that doesn’t apply to social media posts by politicians running for government office, and social media remains a key component of campaign communications. Meanwhile, research from the Center for Countering Digital Hate (CCDH), a British non-profit focused on online extremism, showed that the volume of AI-generated disinformation increased an average of 130% per month on X over the last year.
These disingenuous endorsements matter so much because Swift’s support is perhaps the most coveted celebrity backing a politician can get. Her cultural influence is so vast that her support of a candidate could tip the scales in a tight race: according to Morning Consult, over half of adults in the U.S. consider themselves Taylor Swift fans, while 16% identify as avid fans. Those numbers are staggering given that only about two-thirds of eligible Americans voted in the 2020 election.
“The [ELVIS Act] is brand new, and the exact parameters will need to be developed by the courts,” said Kelin. “This would certainly be an interesting test case!”
Source: TechCrunch