FCC Releases NPRM on AI Disclosure Requirements for Political Ads
On July 25, 2024, the Federal Communications Commission (FCC or Commission) released a Notice of Proposed Rulemaking (NPRM) that proposes rules to require all radio and television broadcast stations that air political ads to disclose on-air if there is “AI-generated content” in the ad. The release follows Chairwoman Jessica Rosenworcel’s announcement on May 22, 2024, that she had circulated a draft NPRM on this matter.
Comments on this NPRM must be submitted within 30 days of publication in the Federal Register, and reply comments will be due 45 days after publication. While any new rules will likely not go into effect until after this November’s election, those wishing to have input on the use of artificial intelligence (AI) in issue advocacy campaigns or political ads for future elections should consider submitting comments.
Below we provide a high-level summary of the NPRM and highlight some of the questions it poses.
Overview of Proposed Rules
The FCC proposes to require both radio and television broadcasters to (1) ask the person or entity seeking to air a political ad whether the ad contains any “AI-generated content” prior to its broadcast; and (2) include disclosures when there is such content in a political ad. AI-generated content is defined as “an image, audio, or video that has been generated using computational technology or other machine-based system that depicts an individual’s appearance, speech, or conduct, or an event, circumstance, or situation, including, in particular, AI-generated voices that sound like human voices, and AI-generated actors that appear to be human actors.” The NPRM requests comment on the definition of “AI-generated content” and solicits alternative definitions.
The NPRM includes questions about its proposed inquiry requirement, including: (i) whether the inquiry requirement would be expected to identify all political ads that use AI-generated content; (ii) how stations would make this inquiry; and (iii) what additional obligations stations would have to identify AI-generated content if the person or entity requesting airtime fails to respond. Additionally, the FCC asks what obligations a station should have if a third party indicates that a political ad contains AI-generated content.
The FCC proposes that the disclosure would be made on-air using standardized language, and the NPRM asks for comment on whether the disclosure should be aural, in text, or both. The FCC also proposes to require broadcasters to document the use of AI-generated content in the broadcasters’ political file.
In addition to addressing political ads aired by local television stations, the NPRM includes questions about how the requirement should apply to political ads embedded in network or syndicated programming, including whether broadcasters should have an obligation to document the use of AI in network political ads in the public file.
The FCC also proposes to extend the disclosure requirement to cable operators, Direct Broadcast Satellite providers, Satellite Digital Audio Radio Service licensees engaged in origination programming, and holders of permits under Section 325(c) to transmit programming to a foreign broadcast station.
The Commission states that it is not proposing to ban or restrict the use of AI-generated content, but the agency asks both about its authority to impose a disclosure requirement under its general rulemaking authority and about potential First Amendment concerns.
The Big Picture
The FCC has been eager to regulate AI in different contexts. Last May, the Commission articulated its intent to adopt rules governing the use of AI and machine learning to collect and analyze data on non-federal spectrum usage. More recently, on July 17, 2024, the FCC released a draft Notice of Proposed Rulemaking that would, among other things, propose new rules to regulate AI-generated calls and texts and seek comment on technologies used to detect and mitigate illegal and unwanted robocalls that use AI.
The Chairwoman’s efforts to open a rulemaking in the middle of the 2024 campaign drew strong dissents from Commissioners Carr and Simington, who issued multi-page statements: (1) contending that the NPRM was being hastily pushed through, creating confusion for both broadcasters and voters; (2) arguing that the proposed approach would create inconsistency between broadcast and online platforms (which would not be subject to these obligations); and (3) claiming the Commission lacks sufficient statutory authority to impose the proposed requirements.
***
Wiley’s Artificial Intelligence Practice counsels clients on AI compliance, risk management, and regulatory and policy approaches, and we engage with key government stakeholders in this quickly moving area. The firm’s Election Law & Government Ethics Practice provides incisive and sophisticated legal counsel on all aspects of political law including campaign finance, lobbying, government ethics, and elections. The Media Practice provides regulatory and transactional counsel to radio and television broadcasters, as well as content creators and distributors, news organizations, financial institutions and investors, and equipment manufacturers. Please reach out to a member of our team with any questions.