
Presidential election a prime target for foreign disinformation, intelligence officials say

U.S. officials who track disinformation campaigns say they're issuing more warnings to political candidates, government officials and others targeted by foreign groups as America’s adversaries seek to influence the 2024 election

By DAVID KLEPPER, Associated Press
Published: June 12, 2024, 7:13pm

WASHINGTON (AP) — U.S. officials who track disinformation campaigns say they have issued more warnings to political candidates, government leaders and others targeted by foreign groups in recent months as America’s adversaries seek to influence the outcome of the 2024 election.

Without giving specifics, an official from the Office of the Director of National Intelligence said Wednesday that the number is higher, at least in part, because “presidential elections draw more attention from our adversaries.”

The increase in notifications to targeted individuals, which began last fall, could also reflect a growing threat or the government’s improved detection capabilities, or both, said the official, who was one of several to brief reporters on condition of anonymity under ground rules set by the office of the director.

Lawmakers from both parties have voiced worries about the nation’s preparedness for foreign disinformation during the presidential election and the corrosive impact it has on voter confidence and trust in democratic institutions. They also have questioned whether the federal government is up to the task of issuing timely and effective warnings to voters when nations like Russia and China use disinformation to try to shape American politics.

Influence operations can include false or exaggerated claims and propaganda designed to mislead voters about specific candidates, issues or races. They can also include social media posts or other digital content that seeks to suppress the vote through intimidation or by giving voters false information about election procedures.

Officials say the list of nations launching such campaigns includes familiar foes like Russia, China and Iran as well as a growing number of second-tier players like Cuba. They also noted indications that some nations allied with the U.S. could mount their own efforts to influence voters.

Russia was the top threat, one of the officials said, noting that its main objectives are degrading public support for Ukraine and eroding confidence in American democracy in general.

China is considered to be more cautious about its online disinformation campaigns and more concerned than Russia about potential blowback from the U.S., officials said. Iran is seen as a “chaos agent” that is more likely to experiment with online techniques to stoke voter anger and even violence.

Officials would not specify how many private warnings they have issued to candidates, political organizations or local election offices. Such warnings are delivered after an interagency panel of intelligence officials concludes that an influence operation could impact the outcome of an election or prevent certain groups from voting.

The notifications are only given when officials can attribute the operation to foreign sources, allowing the person or group that was targeted to “take a more defensive stance,” an official said.

The office within the intelligence community that leads the work, the Foreign Malign Influence Center, has no jurisdiction over domestic groups. The officials who briefed reporters Wednesday said they work to avoid any appearance of policing Americans’ speech or playing favorites when it comes to candidates.

Intelligence officials have issued only one public warning so far — in 2020 when groups linked to Iran sent emails to Democratic voters in an apparent effort to intimidate them into voting for Donald Trump.

Powerful artificial intelligence programs that allow the rapid creation of images, audio and video pose a growing problem, as adversaries look to use the technology to create lifelike fakes that could easily mislead voters.

The use of AI has already popped up ahead of elections in India, Mexico, Moldova, Slovakia and Bangladesh, and in the U.S., where some voters in New Hampshire received an AI robocall that mimicked the voice of President Joe Biden.

AI deepfakes used by U.S. adversaries remain a top threat, officials said.
