FCC pursues new rules for AI in political ads, but changes may not take effect before the election

FILE - People are reflected in a window of a hotel at the Davos Promenade in Davos, Switzerland, Jan. 15, 2024. The Federal Communications Commission has advanced a proposal that would require political advertisers to disclose their use of artificial intelligence in broadcast television and radio ads. But it's unclear whether new regulations may be in place before the November presidential election. (AP Photo/Markus Schreiber, File) (Markus Schreiber, Copyright 2024 The Associated Press. All rights reserved)

NEW YORK – The Federal Communications Commission has advanced a proposal that would require political advertisers to disclose their use of artificial intelligence in broadcast television and radio ads, though it is unclear whether new regulations will be in place before the November presidential election.

The proposed rules announced Thursday could add a layer of transparency in political campaigning that some tech watchdogs have called for to help inform voters about lifelike and misleading AI-generated media in ads.

“There’s too much potential for AI to manipulate voices and images in political advertising to do nothing,” the agency's chairwoman, Democrat Jessica Rosenworcel, said Thursday in a news release. “If a candidate or issue campaign used AI to create an ad, the public has a right to know.”

But the FCC’s action is part of a federal turf war over the regulation of AI in politics. The move has faced pushback from the chairman of the Federal Election Commission, who previously accused the FCC of stepping on his own agency’s authority and has warned of a possible legal challenge.

Political candidates and parties in the United States and around the world already have experimented with rapidly advancing generative AI tools. Some have voluntarily disclosed their use of the technology, while others have weaponized it to mislead voters.

The FCC is proposing to require broadcasters to ask political advertisers whether their content was created using AI tools, such as text-to-image generators or voice-cloning software. The agency also aims to require broadcasters to make an on-air announcement when AI-generated content is used in a political ad and to include a notice disclosing the use of AI in their online political files.

The commission acknowledges it would not have authority over streaming, leaving the growing political advertising industry on digital and streaming platforms unregulated at the federal level.

After the commission's 3-2 vote, the proposal will move into a 30-day public comment period, followed by a 15-day reply period. Commissioners are then expected to finalize and pass a rule. It is unclear whether there is time for it to go into effect before a presidential election that is just over three months away.

Jonathan Uriarte, a spokesperson for Rosenworcel, said the chairwoman “intends to follow the regulatory process but she has been clear that the time to act is now.”

After Rosenworcel announced her proposed rule in May, FEC Chairman Sean Cooksey, a Republican, sent her a letter cautioning her against the move.

“I am concerned that parts of your proposal would fall within the exclusive jurisdiction” of the FEC and would “directly conflict with existing law and regulations, and sow chaos among political campaigns for the upcoming election,” he wrote.

If the FCC moves forward, it could create “irreconcilable conflicts” between the agencies that may end up in federal court, he said in the letter.

A Republican commissioner at the FCC, Brendan Carr, has agreed with Cooksey and voted against the proposal. In a statement Thursday, Carr argued the move was illegal and problematic so close to a presidential election, with the regulations likely to take effect after early voting has already begun in many places.

“Far from promoting transparency, the FCC’s proposed rules would mire voters in confusion, create a patchwork of inconsistent rules, and encourage monied, partisan interests to weaponize the law for electoral advantage,” Carr wrote.

But the FEC's vice chair, Democrat Ellen Weintraub, has supported the proposal, saying in a June letter to Rosenworcel that “no one agency currently has the jurisdiction or capacity to address every aspect of this large and complicated issue.”

Cooksey said in a statement Thursday that the FCC should “abandon this misguided proposal.”

“Every American should be disturbed that the Democrat-controlled FCC is pushing ahead with its radical plan to change the rules on political ads mere weeks before the general election,” he said. “Not only would these vague rules intrude on the Federal Election Commission's jurisdiction, but they would sow chaos among political campaigns and confuse voters before they head to the polls.”

The FCC maintains it has authority to regulate on the issue under the 1934 Communications Act and the Bipartisan Campaign Reform Act.

Robert Weissman, co-president of the advocacy group Public Citizen, said he supports the FCC's proposed rule as the U.S. is “barreling toward elections which may be distorted, or even decided, by political deepfakes.”

Rep. Joseph Morelle of New York, the top Democrat on the House Administration Committee, commended the FCC, saying in an emailed statement that “it is vital that our federal agencies work to ensure that voters are able to discern fact from fiction.”

Congress has not passed laws directing the agencies on how they should regulate AI in politics. Some Republican senators have circulated legislation intending to block the Democratic-led FCC from issuing its new rules. Meanwhile, the FEC is considering its own petition on regulating deepfakes in political ads.

In the absence of federal action, more than one-third of states have created their own laws regulating the use of AI in campaigns and elections, according to the National Conference of State Legislatures.

In February, the FCC ruled that robocalls containing AI-generated voices are illegal, a step that empowered the commission to fine companies that use AI voices in their calls or block the service providers that carry them.

___

The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. See more about AP’s democracy initiative here. The AP is solely responsible for all content.