The following editorial originally appeared in The Seattle Times:
Taylor Swift may be the best-known victim whose images have been manipulated with AI into pornography. But creators of such nude “deepfakes” have spread this vile and frightening new form of online abuse across the nation. Washington lawmakers enacted protections earlier this year, but Congress needs to act.
Those targeted — predominantly women and teenage girls — have little recourse in many parts of the country. Even in Swift’s case, one such image was viewed 47 million times on X, the site formerly known as Twitter, before the platform removed it, according to The Guardian.
Chaired by Sen. Maria Cantwell, D-Wash., the Senate Commerce Committee is considering legislation that would make such deepfakes a federal crime and assure victims that the images can be removed quickly from the internet. Congress should move swiftly to pass the bill.
Washington is among at least 14 states that impose penalties for AI-generated deepfakes. Earlier this year, Caroline Mullet, daughter of state Sen. Mark Mullet, bravely testified about this deeply disturbing and increasingly common form of abuse: A classmate had taken photos of girls at a homecoming dance, used an AI app to digitally manipulate them to appear nude, and then circulated the fakes. Lawmakers voted unanimously to treat possession of such images on par with child pornography under state law and to create a way for victims depicted in them to sue creators and publishers in court.