CNN —
In an assessment released Monday, the Office of the Director of National Intelligence said artificial intelligence is helping to “improve” rather than “innovate” Russian and Iranian influence operations aimed at the November U.S. elections.
“(U.S. intelligence agencies) view AI as a facilitator of malign influence and do not yet view it as a revolutionary influence tool,” an ODNI official told reporters.
The new U.S. assessment contrasts with some media and industry hype about AI-related threats, but the technology remains a top concern for U.S. intelligence agencies monitoring threats to the presidential election.
The risk that foreign AI-generated content poses to U.S. elections depends on whether foreign operatives can overcome limitations built into many AI tools, develop their own advanced AI models or “strategically target and disseminate” AI-generated content, the official said. “Foreign actors are lagging in these three areas.”
U.S. officials say foreign agents are using AI to overcome language barriers and spread disinformation to target American voters.
For example, Iran is using AI to generate Spanish-language content about immigration, which the Iranian government views as a politically divisive issue in the United States, ODNI officials said. Iran-linked operatives also use AI to target voters across the political spectrum on divisive issues like the Israel-Gaza conflict, the officials said. U.S. authorities believe the Iranian government is trying to undermine the candidacy of former President Donald Trump.
ODNI officials said Russia is the most prolific foreign producer of AI content related to the U.S. presidential election, and that the AI-generated videos, photos, text, audio and other content are consistent with Moscow’s efforts to boost Trump’s candidacy and undermine Vice President Kamala Harris’ campaign.
Meanwhile, China is using AI “to amplify divisive political issues in the United States,” but is not trying to influence the outcome of any particular U.S. election, the assessment says.
Foreign agents have also relied on old-fashioned influence techniques during this election, such as staging videos with actors rather than generating them with AI.
ODNI officials said U.S. intelligence agencies believe Russian agents fabricated a video that circulated on X earlier this month falsely claiming Harris paralyzed a young girl in a 2011 hit-and-run crash. Russia spread the story through websites posing as local San Francisco media outlets, Microsoft researchers said.
Another Russian-made video, which has been viewed at least 1.5 million times on X, purports to show Harris supporters attacking attendees of a Donald Trump rally, according to Microsoft.
U.S. intelligence agencies warned in July that Russia was using “covert social media” to sway public opinion and undermine support for Ukraine in battleground states.
“Russia is generally a much more sophisticated actor in terms of influence and has a better understanding of how U.S. elections work, where to target, which states to target,” the ODNI official said.
This is not the first U.S. election in which a foreign power has considered deploying AI capabilities.
CNN previously reported that operatives working for the Chinese and Iranian governments prepared fake AI-generated content as part of a campaign to influence U.S. voters in the final weeks of the 2020 election, but chose not to disseminate it. Some U.S. officials who reviewed the material at the time were unimpressed, concluding that China and Iran lacked the capacity to deploy deepfakes that could seriously affect the 2020 presidential election.