The two top officials of the Federal Election Commission are split over whether broadcast radio and television political advertisements should be required to disclose whether content is generated by artificial intelligence (AI).
FEC vice chair Ellen Weintraub supported a May proposal by US Federal Communications Commission (FCC) Chairwoman Jessica Rosenworcel, who asked the commission to advance a proposed rule that would require disclosure of AI content in both candidate and issue advertisements.
FEC chair Sean Cooksey criticised the plan, which would require disclosure but would not prohibit AI-generated content within political ads.
There is growing concern in Washington that AI-generated content could mislead voters in the November presidential and congressional elections.
The FCC said AI will probably play a substantial role in 2024 political ads.
Rosenworcel singled out the potential for misleading “deepfakes” or “altered images, videos, or audio recordings that depict people doing or saying things they did not actually do or say.”
“It’s about disclosure,” Rosenworcel said Thursday, noting that the FCC has required disclosure since the 1930s and has ample legal authority to do so. “We have decades of experience with doing this.”
Weintraub said in a letter to Rosenworcel that the “public would benefit from greater transparency as to when AI-generated content is being used in political advertisements.”
She said it would be beneficial for both the FEC and the FCC to pursue regulatory efforts in this area. “It’s time to act,” Weintraub said.
But Cooksey said mandatory disclosures would “directly conflict with existing law and regulations, and sow chaos among political campaigns for the upcoming election.”
The rule would require on-air and written disclosures and would cover cable operators as well as satellite TV and radio providers.
The FCC does not have authority to regulate internet or social media ads or streaming services. The agency has already taken steps to combat misleading use of AI in political robocalls.
Republican FCC Commissioner Brendan Carr criticised the proposal, saying the “FCC can only muddy the waters. AI-generated political ads that run on broadcast TV will come with a government-mandated disclaimer but the exact same or similar ad that runs on a streaming service or social media site will not?”
Electoral AI content drew attention in January after a fake robocall imitating President Joe Biden sought to dissuade people from voting for him in New Hampshire’s Democratic primary election, prompting the state to file charges against a Democratic political consultant behind the calls.