AI experimentation is high risk, high reward for low-profile political campaigns


Adrian Perkins was running for reelection as the mayor of Shreveport, Louisiana, when he was surprised by a harsh campaign hit piece.

The satirical TV commercial, paid for by a rival political action committee, used artificial intelligence to depict Perkins as a high school student who had been called into the principal’s office. Instead of scolding him for cheating on a test or getting into a fight, the principal blasted Perkins for failing to keep communities safe and create jobs.

The video superimposed Perkins’ face onto the body of an actor playing him. Although the ad was labeled as being created with “deep learning computer technology,” Perkins said it was powerful and resonated with voters. He didn’t have enough money or campaign staff to counteract it, and thinks it was one of many reasons he lost the 2022 race. A representative for the group behind the ad did not respond to a request for comment.

“One hundred percent the deepfake ad affected our campaign because we were a down-ballot, less resourced place,” said Perkins, a Democrat. “You had to pick and choose where you put your efforts.”

Adrian Perkins sits for a portrait in his office in Chicago, Thursday, June 13, 2024. (AP Photo/Nam Y. Huh)

While such attacks are staples of the rough-and-tumble of political campaigning, the ad targeting Perkins was notable: It’s believed to be one of the first examples of an AI deepfake deployed in a political race in the U.S. It also foreshadowed a dilemma facing candidates in scores of state and local races this year as generative AI has become more widespread and easier to use.

The technology — which can do everything from streamlining mundane campaign tasks to creating fake images, video or audio — already has been deployed in some national races around the country and has spread far more widely in elections across the globe. Despite its power as a tool to mislead, efforts to regulate it have been piecemeal or delayed, a gap that could have the greatest impact on lower-profile races down the ballot.

Artificial intelligence is a double-edged sword for candidates running such campaigns. Inexpensive, user-friendly AI models can help them save money and time on some of their day-to-day tasks. But they often don’t have the staff or expertise to combat AI-generated falsehoods, adding to fears that an eleventh-hour deepfake could fool enough voters to tilt races decided by narrow margins.

“AI-enabled threats affect close races and low-profile contests where slight shifts matter and where there are often fewer resources correcting misleading stories,” said Josh Lawson, director of AI and democracy for the Aspen Institute.


National safeguards lacking

Some local candidates already have faced criticism for deploying AI in misleading ways, from a Republican state senate candidate in Tennessee who used an AI headshot to make himself look slimmer and younger to Philadelphia’s Democratic sheriff, whose reelection campaign promoted fake news stories generated by ChatGPT.

One challenge in separating fact from fiction is the decline of local news outlets, which in many places has meant far less coverage of candidates running for state and local office, especially reporting that digs into candidates’ backgrounds and how their campaigns operate. The lack of familiarity with candidates could make voters more open to believing fake information, said U.S. Sen. Mark Warner of Virginia.

The Democrat, who has worked extensively on AI-related legislation as chair of the Senate Intelligence Committee, said AI-generated misinformation is easier to spot and combat in high-profile races because they are under greater scrutiny. When an AI-generated robocall impersonated President Joe Biden to discourage voters from going to the polls in the New Hampshire primary this year, it was quickly reported in the media and investigated, resulting in serious consequences for the players behind it.

More than a third of states have passed laws regulating artificial intelligence in politics, and legislation aimed specifically at fighting election-related deepfakes has received bipartisan support in each state where it has passed, according to the nonprofit consumer advocacy group Public Citizen.

But Congress has yet to act, despite several bipartisan groups of lawmakers proposing such legislation.

“Congress is pathetic,” said Warner, who said he was pessimistic about Congress passing any legislation protecting elections from AI interference this year.

Travis Brimm, executive director of the Democratic Association of Secretaries of State, called the specter of AI misinformation in down-ballot races an evolving issue in which people are “still working to figure out the best way forward.”

“This is a real challenge, and that’s why you’ve seen Democratic secretaries jump to address it and pass real legislation with real penalties around the abuse of AI,” Brimm said.

A spokesperson for the Republican Secretaries of State Committee did not respond to the AP’s request for comment.

How do you regulate integrity?

While experts and lawmakers worry about how generative AI attacks could skew an election, some candidates for state or local office said AI tools have proven invaluable to their campaigns. The technology encompasses computer systems, software or processes that can emulate aspects of human work and cognition.

Glenn Cook, a Republican running for a state legislative seat in southeastern Georgia, is less well-known and has much less campaign cash than the incumbent he is facing in a runoff election on Tuesday. So, he has invested in a digital consultant who creates much of his campaign’s content using inexpensive, publicly available generative AI models.

Glenn Cook, a Republican campaigning for a seat in the Georgia state House, speaks with a driver in Kingsland, Ga., Tuesday, June 11, 2024. (AP Photo/Gary McCullough)

On his website, AI-generated articles are peppered with AI-generated images of community members smiling and chatting, none of whom actually exist. AI-generated podcast episodes use a cloned version of his voice to narrate his policy positions.

Cook said he reviews everything before it is made public. The savings — in both time and money — have let him knock on more doors in the district and attend more in-person campaign events.

“My wife and I did 4,500 doors down here,” he said. “It frees you up to do a lot.”

Cook’s opponent, Republican state Rep. Steven Sainz, said he thinks Cook “hides behind what amounts to a robot instead of authentically communicating his opinions to voters.”

“I’m not running on artificially generated promises, but real-world results,” Sainz said, adding that he isn’t using AI in his own campaign.

Georgia Republican state Rep. Steven Sainz stands for a portrait in Kingsland, Ga., Tuesday, June 11, 2024. (AP Photo/Gary McCullough)
A campaign sign for Georgia Republican state Rep. Steven Sainz is displayed in Kingsland, Ga., Tuesday, June 11, 2024. (AP Photo/Gary McCullough)

Republican voters in the district weren’t sure what to make of the use of AI in the race, but said they cared most about the candidates’ values and outreach on the campaign trail. Patricia Rowell, a retiree who plans to vote for Cook, said she likes that he’s been in her community three or four times while campaigning, while Mike Perry, a self-employed Sainz supporter, said he’s felt more personal contact from Sainz.

He said the expanded use of AI in politics is inevitable, but wondered how voters would be able to differentiate between what’s true and what’s not.

“It’s free speech, you know, and I don’t want to discourage free speech, but it comes down to the integrity of the people putting it out,” he said. “And I don’t know how you regulate integrity. It’s pretty tough.”

Local campaigns are vulnerable

Digital firms that market AI models for political campaigns told the AP most of the AI use in local campaigns so far is minimal and designed to boost efficiency for tedious tasks, such as analyzing survey data or drafting social media copy that meets a certain word limit.

Political consultants are increasingly dabbling with AI tools to see what works, according to a new report from a team led by researchers at the University of Texas at Austin. More than 20 political operatives from across the ideological spectrum told researchers they were experimenting with generative AI models in this year’s campaigns, even though they also feared that less scrupulous actors might be doing the same.

“Local-level elections will be so much more challenging because people will be attacking,” said Zelly Martin, the report’s lead author and a senior research fellow at the university’s Center for Media Engagement. “And what recourse do they have to fight back, as opposed to Biden and Trump who have many more resources to fend off attacks?”

There are immense differences in staffing, money and expertise between down-ballot campaigns — for state legislator, mayor, school board or any other local position — and races for federal office. Where a local campaign might have just a handful of staffers, competitive U.S. House and Senate campaigns may have dozens, and presidential operations can balloon to the thousands by the end of the campaign.

The campaigns for Biden and former President Donald Trump are both experimenting with AI to enhance fundraising and voter outreach efforts. Mia Ehrenberg, a spokesperson for the Biden campaign, said they also have a plan to debunk AI-generated misinformation. A Trump campaign spokesperson did not respond to the AP’s questions about their plans for handling AI-generated misinformation.

Adrian Perkins stands for a portrait in his office in Chicago, Thursday, June 13, 2024. (AP Photo/Nam Y. Huh)

Perkins, the former Shreveport mayor, had a small team that decided to ignore the attack and keep campaigning when the deepfake of him being hauled into the principal’s office hit local TV. He said he viewed the deepfake ad against him as a typical dirty trick at the time, but the rise of AI in just two years since his campaign has made him realize the technology’s power as a tool to mislead voters.

“In politics, people are always going to push the envelope a bit to be effective,” he said. “We had no idea how significant it would be.”

___

Burke reported from San Francisco, Merica from Washington and Swenson from New York.

___

This story is part of an Associated Press series, “The AI Campaign,” exploring the influence of artificial intelligence in the 2024 election cycle.

___

The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy, and from the Omidyar Network to support coverage of artificial intelligence and its impact on society. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.
