The Washington State Public Disclosure Commission held its regular monthly meeting Wednesday at the Hilton Vancouver Washington as part of an ongoing effort to do more outreach throughout the state.
In June, the commission met in Spokane, and it plans to meet at locations outside of Olympia at least twice a year moving forward, according to spokeswoman Natalie Johnson.
“It’s an effort to bring the (Public Disclosure Commission) to more people and to try to communicate better what it is that we do,” Johnson said.
On Wednesday, the commission heard presentations on artificial intelligence and synthetic media — also known as “deepfakes” — and the impact the burgeoning technology could have on elections. Additionally, the group discussed how to improve public understanding of money in politics.
The commission was created in 1972 with the passage of Initiative 276 to provide public access to accurate information about the financing of political campaigns, lobbyist expenditures, and the financial affairs of public officials and candidates. The commission also enforces Washington’s disclosure and campaign finance laws.
It comprises five residents appointed by the governor to five-year terms. The annual agency budget is $5.2 million appropriated from the state’s general fund.
The PDC’s 30 employees receive reports and make them available to the public. Each year, the agency receives some 6,500 personal financial statements and more than 90,000 reports from candidates, political committees, lobbyists and lobbyist employers.
Staff members also monitor compliance, conduct investigations, and develop online filing applications and data management systems.
“A big part of our mission is just making sure that the money in politics is available for people to understand,” Director Peter Frey Lavallee said.
As part of its outreach, the commission has met with auditors across the state to learn more about election administration issues.
Clark County Auditor Greg Kimsey told the commission he is concerned about AI, deepfakes and misinformation. However, he said his biggest concern is declining trust in election processes.
“The biggest challenge that we face is there continues to be a number of people in our county, state and country who continue to not believe that the election results reflect the will of voters,” Kimsey said. “It’s ramping up again. It all began with Donald Trump in 2016 when he said he would only trust an election that he won. And it’s still with us. It doesn’t seem to matter how many facts or how much information you have, it doesn’t change the view of those who hold that belief. I look to you for an answer to that. Please, help me. That’s really the biggest issue I’m here to talk about.”
Monthly meeting
During Wednesday’s meeting, the commission considered draft rules for two pieces of legislation.
In 2023, the state Legislature passed Senate Bill 5152, which creates a private cause of action against the sponsor of political advertising using synthetic media, also known as a deepfake, unless a disclaimer is included. The bill also requires the Public Disclosure Commission to draft rules in furtherance of the law, which is enforced through civil lawsuits.
Additionally, the meeting included a presentation of draft rules related to the passage this year of House Bill 2032 — sponsored by Rep. Greg Cheney, R-Battle Ground — which requires political yard signs to include information about who paid for the printing of the signs.
The commission did not adopt the draft rules Wednesday. A public hearing on the draft rules is slated for May 23.
Wednesday’s meeting also included presentations on deepfakes and synthetic media in political advertising from Jevin West, founding director of the University of Washington’s Center for an Informed Public, and Oren Etzioni, founder of TrueMedia.org, a nonprofit fighting political deepfakes.
“The thing that worries me most is when something is not AI generated, but it’s a real event and no one responds because the public has become desensitized to fake video, audio and images,” West said.
TrueMedia.org is a free website that analyzes images, audio and video to assess whether they’re likely to be deepfakes, generated by AI or manipulated by AI.
“The idea is, particularly as we come to the election season, this will be useful for a broad variety of folks,” Etzioni said.
Lucas Hansen, co-founder of CivAI, a nonprofit that disseminates information about AI capabilities and dangers, provided a demonstration of how easy it is to produce deepfakes. Within minutes, Hansen created a fake image of Gov. Jay Inslee being booked into a jail, along with fake audio of Inslee’s voice and an AI-generated article — seemingly from The New York Times — documenting Inslee’s “crimes.”
Then, in seconds, he translated the article into Russian.
“This is not a technology happening tomorrow,” Hansen said. “This is happening right now. That’s what we’re trying to communicate to people.”
He said as the technology becomes more available, it will become easier for bad actors to produce content about not just high-ranking officials such as Inslee, but local leaders, as well — even school board members.