Exclusive: US intelligence spotted Chinese, Iranian deepfakes in 2020 aimed at influencing US voters

National Security Agency (NSA) headquarters in Fort Meade, Maryland. NSA/Handout/Reuters

CNN —

Operatives working for the Chinese and Iranian governments prepared fake, AI-generated content as part of a campaign to influence US voters in the closing weeks of the 2020 election campaign, current and former US officials briefed on the intelligence told CNN.

The Chinese and Iranian operatives never disseminated the deepfake audio or video publicly, but the previously unreported intelligence demonstrates concerns US officials had four years ago about the willingness of foreign powers to amplify false information about the voting process.

The National Security Agency collected the intelligence that gave US officials insights into China and Iran’s capabilities in producing deepfakes, one of the sources said.

Now, with deepfake audio and video much easier to produce and the presidential election just six months away, US officials have grown more concerned over how a foreign influence campaign might exploit artificial intelligence to mislead voters.

At an exercise in the White House Situation Room last December in preparation for the 2024 election, senior US officials wrestled with how to respond to a scenario where Chinese operatives create a fake AI-generated video depicting a Senate candidate destroying ballots, as CNN has previously reported.

At a briefing last week, FBI officials warned that AI increases the ability of foreign states to spread election disinformation.

It’s unclear what the deepfakes prepared by Chinese and Iranian operatives in 2020 depicted, or why they were never deployed during that election, the sources said.

At the time, some US officials who reviewed the intelligence were unimpressed, believing it showed China and Iran lacked the capability to deploy deepfakes in a way that would seriously impact the 2020 presidential election, a former senior US official told CNN.

“The technology has to be good; I don’t think it was that good,” the former official said. “Secondly, you have to have a risk appetite. China, no. Iran, probably yes.”

The sources pointed to no evidence of coordination between the two countries.

Keeping an eye on adversaries

The NSA has continued to collect intelligence on foreign adversaries developing deepfakes and on the potential threat they pose to US elections, the former senior official added, noting that the technology has advanced dramatically over the last four years. In 2020, for example, there was no easy-to-use large language model like ChatGPT.

The NSA declined to comment.

US officials have maintained a high level of visibility into the AI and deepfake advancements made by countries including China, Iran and Russia since the 2020 election. But putting that intelligence to use inside the US remains a challenge, the former official said.

Ukrainian servicemen fire an M777 howitzer toward Russian troops near a front line in the Donetsk region, Ukraine, on May 1, 2024. FBI officials are concerned that President Joe Biden's backing of Ukraine may lead Russia to take more risks in interfering in the 2024 presidential election. Valentyn Ogirenko/Reuters

“The question becomes how quickly can we spot an anomaly and then share that rapidly within the United States,” the former official told CNN. “Are we winning the race against a series of adversaries that might operate within the US? That’s the challenge.”

The threat of deepfakes and foreign influence is poised to come up in a Senate Intelligence Committee hearing on Wednesday, when lawmakers will get a rare opportunity to publicly interrogate the director of national intelligence and other senior officials on foreign threats to elections.

“Other adversarial nations know that it is relatively easy and, frankly, cheap to try to interfere in our election,” Sen. Mark Warner, a Democrat who chairs the Senate Intelligence Committee, told CNN’s John Berman Wednesday morning. “I think we should expect China, Russia, Iran, potentially other nation-states to try to both either cyberattack our infrastructure or, more likely, spread misinformation to try to pit Americans against Americans.”

While they didn’t deploy their deepfakes in 2020, Iranian government operatives did undertake a brazen attempt that year to influence voters by imitating the far-right Proud Boys group and disseminating a video purporting to show the hack of a US voter registration database, according to US prosecutors.

“The fact that the Iranians pulled the Proud Boys crap but didn’t try deep fakes was either a lack of faith in the capabilities or a sign of no clear internal guidance,” one person familiar with the intelligence told CNN.

Lost in translation

For foreign influence operations to be effective, they also need to resonate with the American public, something China has struggled with, the former senior US official said.

“I think it’s clearly a cultural piece,” the former official said. “They really have a very difficult understanding of the issues that are divisive or necessarily how to play to those issues, where the Russians do not.”

Generative AI, or AI used to create video, audio, imagery or text, has made foreign influence actors more efficient in creating content, but “there is no evidence that it has made them or their campaigns any more effective,” said Lee Foster, an expert in tracking foreign influence operations online.

“Generative AI has so far not helped actors resolve the main bottleneck they face: distribution,” said Foster, who is a co-founder of AI security firm Aspect Labs. “Actors have rarely struggled with creating content. Getting it in front of the right eyeballs at a meaningful scale has been and continues to be the sticking point, one that AI so far has not helped them overcome.”

Foster and other experts have cautioned against exaggerating the impact of foreign influence operations, including those that use AI, because doing so benefits the propagandists themselves.

Disinformation in the US

But the US remains fertile ground for conspiracy theories, whether domestic or foreign in origin.

Nearly 70% of Republicans and Republican-leaners said that President Joe Biden’s 2020 election win was not legitimate, according to a CNN poll released in August.

And positive views of many government institutions are “at historic lows,” with just 16% of the public saying they trust the federal government always or most of the time, according to a Pew Research Center survey released in September.

“Americans, for whatever reason, are a lot more willing to believe crazy conspiracy theories and a lot less willing to accept, as truth, things coming from the federal government,” Warner said on CNN Wednesday.

The 2024 US election will present new opportunities for foreign influence operations. US military aid to Ukraine is essentially on the line, with Democrats largely backing Biden’s support for Ukraine and some leading Republicans, including former President Donald Trump, increasingly backing away from foreign aid.

FBI officials are concerned that the war in Ukraine, and US support for Kyiv, might be an “animating event for the Russians” in terms of conducting interference or influence operations aimed at the US election, a senior FBI official told reporters last week.

This story has been updated with additional information.





