In the analog days of the 1970s, long before hackers, trolls and edgelords, an audiocassette company came up with an advertising slogan that posed a trick question: “Is it live or is it Memorex?” The message toyed with reality, suggesting there was no difference in sound quality between a live performance and music recorded on tape.
Fast forward to our age of metaverse lies and deceptions, and one might ask similar questions about what’s real and what’s not: Is President Joe Biden on a robocall telling Democrats not to vote? Is Donald Trump chumming it up with Black men on a porch? Is the U.S. going to war with Russia? Fact and fiction appear interchangeable in an election year when AI-generated content is targeting voters in ways that were once unimaginable.
American politics is accustomed to chicanery — opponents of Thomas Jefferson warned the public in 1800 that he would burn their Bibles if elected — but artificial intelligence is bending reality into a video game world of avatars and deepfakes designed to sow confusion and chaos. The ability of AI programs to produce and scale disinformation with swiftness and breadth is a weapon for lone-wolf provocateurs and intelligence agencies in Russia, China and North Korea alike.
“Truth itself will be hard to decipher. Powerful, easy-to-access new tools will be available to candidates, conspiracy theorists, foreign states, and online trolls who want to deceive voters and undermine trust in our elections,” said Drew Liebert, director of the California Initiative for Technology and Democracy, or CITED, which seeks legislation to limit disinformation. “Imagine a fake robocall [from] Gov. Newsom goes out to millions of Californians on the eve of election day telling them that their voting location has changed.”