

Other Papers Say: Criminalize ‘deepfake’ nudes

By The Seattle Times
Published: July 29, 2024, 6:01am

The following editorial originally appeared in The Seattle Times:

Taylor Swift may be the best-known victim whose images have been manipulated with AI into pornography. But creators of such nude “deepfakes” have spread this vile and frightening new form of online abuse across the nation. Washington lawmakers enacted protections earlier this year, but Congress needs to act.

Those targeted — predominantly women and teenage girls — have little recourse in many parts of the country. Even in Swift’s case, one such image circulated on X, the site formerly known as Twitter, 47 million times before that website removed it, according to The Guardian.

Chaired by Sen. Maria Cantwell, D-Wash., the Senate Commerce Committee is considering legislation that would make such deepfakes a federal crime and give victims assurance they can be removed quickly from the internet. Congress should act swiftly to enact the bill.

Washington is among at least 14 states that penalize AI-generated deepfakes. Earlier this year, Caroline Mullet, daughter of state Sen. Mark Mullet, bravely testified about this deeply disturbing and increasingly common trend: A classmate had taken photos of girls at homecoming, digitally manipulated them with an AI app to make them appear nude, and circulated the results. Lawmakers voted unanimously to treat those images on par with child pornography under state possession laws and to give the victims depicted a way to sue creators and publishers in court.

But the internet does not stop at state lines. Criminalizing the behavior across all 50 states and the U.S. territories is the only way to ensure uniform protection for everyone who falls victim to this humiliating new form of online exploitation. Under legislation being considered in Congress, known as the TAKE IT DOWN Act, publishers would also have a duty to remove such images or face penalties from the Federal Trade Commission.

The act would do two things. First, it would make creating AI-generated fake nudes punishable by prison time — two years if the victim is an adult, three if the victim is a minor. Second, it would require publishers — whether a small website or a massive social media company — to remove such imagery within 48 hours of being contacted by the victim.

Cantwell has a chance to advance the bill through the Senate Commerce Committee. The senator is an outspoken champion of digital privacy protections for Americans and has said she supports the bill.

The Seattle Times Editorial Board also backs comprehensive digital privacy protections in legislation Cantwell introduced alongside U.S. Rep. Cathy McMorris Rodgers, R-Spokane, earlier this year.

Independent of that legislation, TAKE IT DOWN is also sorely needed. Everyone, from Taylor Swift to teenagers growing up in an age when AI can create such damaging content, deserves that much.
