119-s1396


Content Origin Protection and Integrity from Edited and Deepfaked Media Act of 2025

Introduced:
Apr 9, 2025
Policy Area:
Science, Technology, Communications

Bill Statistics

Actions: 2
Cosponsors: 2
Summaries: 0
Subjects: 1
Text Versions: 1
Full Text: Yes


Latest Action

Apr 9, 2025
Read twice and referred to the Committee on Commerce, Science, and Transportation.

Actions (2)

Read twice and referred to the Committee on Commerce, Science, and Transportation.
Type: IntroReferral | Source: Senate
Apr 9, 2025
Introduced in Senate
Type: IntroReferral | Source: Library of Congress | Code: 10000
Apr 9, 2025

Subjects (1)

Science, Technology, Communications (Policy Area)

Cosponsors (2)

Mrs. Blackburn
Mr. Heinrich

Text Versions (1)

Introduced in Senate

Apr 9, 2025

Full Bill Text

Length: 19,946 characters | Version: Introduced in Senate | Version Date: Apr 9, 2025 | Last Updated: Nov 15, 2025 6:22 AM
[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[S. 1396 Introduced in Senate (IS)]

<DOC>

119th CONGRESS
1st Session
S. 1396

To require transparency with respect to content and content provenance
information, to protect artistic content, and for other purposes.

_______________________________________________________________________

IN THE SENATE OF THE UNITED STATES

April 9, 2025

Ms. Cantwell (for herself, Mrs. Blackburn, and Mr. Heinrich) introduced
the following bill; which was read twice and referred to the Committee
on Commerce, Science, and Transportation

_______________________________________________________________________

A BILL

To require transparency with respect to content and content provenance
information, to protect artistic content, and for other purposes.

Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1.

This Act may be cited as the ``Content Origin Protection and
Integrity from Edited and Deepfaked Media Act of 2025''.
SEC. 2.

It is the sense of Congress that--

(1) there is a lack of--
(A) visibility into how artificial intelligence
systems work;
(B) transparency regarding the information used to
train such systems; and
(C) consensus-based standards and practices to
guide the development and deployment of such systems;

(2) it is becoming increasingly difficult to assess the
nature, origins, and authenticity of digital content that has
been generated or modified algorithmically;

(3) these deficiencies negatively impact the public and,
particularly, the journalists, publishers, broadcasters, and
artists whose content is used to train these systems and is
manipulated to produce synthetic content and synthetically-
modified content that competes unfairly in the digital
marketplace with covered content; and

(4) the development and adoption of consensus-based
standards would mitigate these impacts, catalyze innovation in
this nascent industry, and put the United States in a position
to lead the development of artificial intelligence systems
moving forward.
SEC. 3.

In this title:

(1) Artificial intelligence.--The term ``artificial
intelligence'' has the meaning given the term in
section 5002 of the National Artificial Intelligence
Initiative Act of 2020 (15 U.S.C. 9401).

(2) Artificial intelligence blue-teaming.--The term
``artificial intelligence blue-teaming'' means an effort to
conduct operational vulnerability evaluations and provide
mitigation techniques to entities who have a need for an
independent technical review of the security posture of an
artificial intelligence system.

(3) Artificial intelligence red-teaming.--The term
``artificial intelligence red-teaming'' means structured
adversarial testing efforts of an artificial intelligence
system to identify risks, flaws, and vulnerabilities of the
artificial intelligence system, such as harmful outputs from
the system, unforeseen or undesirable system behaviors,
limitations, or potential risks associated with the misuse of
the system.

(4) Content provenance information.--The term ``content
provenance information'' means state-of-the-art, machine-
readable information documenting the origin and history of a
piece of digital content, such as an image, a video, audio, or
text.

(5) Covered content.--The term ``covered content'' means a
digital representation, such as text, an image, or audio or
video content, of any work of authorship described in
section 102 of title 17, United States Code.

(6) Covered platform.--The term ``covered platform'' means
a website, internet application, or mobile application
available to users in the United States, including a social
networking site, video sharing service, search engine, or
content aggregation service available to users in the United
States, that either--
(A) generates at least $50,000,000 in annual
revenue; or
(B) had at least 25,000,000 monthly active users
for not fewer than 3 of the 12 months immediately
preceding any conduct by the covered platform in
violation of this Act.

(7) Deepfake.--The term ``deepfake'' means synthetic
content or synthetically-modified content that--
(A) appears authentic to a reasonable person; and
(B) creates a false understanding or impression.

(8) Director.--The term ``Director'' means the Under
Secretary of Commerce for Intellectual Property and Director of
the United States Patent and Trademark Office.

(9) Synthetic content.--The term ``synthetic content''
means information, including works of human authorship such as
images, videos, audio clips, and text, that has been wholly
generated by algorithms, including by artificial intelligence.

(10) Synthetically-modified content.--The term
``synthetically-modified content'' means information, including
works of human authorship such as images, videos, audio clips,
and text, that has been significantly modified by algorithms,
including by artificial intelligence.

(11) Under secretary.--The term ``Under Secretary'' means
the Under Secretary of Commerce for Standards and Technology.

(12) Watermarking.--The term ``watermarking'' means the act
of embedding information that is intended to be difficult to
remove into an output, including an output such as text, an
image, an audio, a video, software code, or any other digital
content or data, for the purposes of verifying the authenticity
of the output or the identity or characteristics of its
provenance, modifications, or conveyance.
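
[Illustrative note, not part of the bill text] Paragraphs (4) and (12) above describe machine-readable provenance metadata that documents a piece of content's origin and history and is intended to be difficult to strip from the content. The short Python sketch below is one hypothetical way such a record could be represented; every field name, and the use of a SHA-256 hash to tie the record to a specific file, are assumptions made for illustration only and are not specified by the bill or by any particular standard.

    # Hypothetical sketch of machine-readable content provenance information.
    # Field names and structure are illustrative assumptions, not defined by S. 1396.
    import hashlib
    import json

    def make_provenance_record(content: bytes, creator: str, tool: str, is_synthetic: bool) -> str:
        """Build a provenance record tied to the content by its SHA-256 hash."""
        record = {
            "content_sha256": hashlib.sha256(content).hexdigest(),  # binds record to this exact content
            "creator": creator,                                     # asserted origin of the content
            "generating_tool": tool,                                # tool that produced or modified it
            "synthetic": is_synthetic,                              # marks synthetic or synthetically-modified content
            "history": ["created"],                                 # later edits would append entries here
        }
        return json.dumps(record, sort_keys=True)                   # machine-readable serialization

    if __name__ == "__main__":
        image_bytes = b"...raw image bytes..."
        print(make_provenance_record(image_bytes, "Example News", "example-generator", True))
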
SEC. 4. STANDARDS FOR CONTENT PROVENANCE INFORMATION AND DETECTION OF
SYNTHETIC CONTENT AND SYNTHETICALLY-MODIFIED CONTENT.

(a) In General.--The Under Secretary shall establish a public-
private partnership to facilitate the development of standards
regarding content provenance information technologies and the detection
of synthetic content and synthetically-modified content, including with
respect to the following:

(1) Facilitating the development of guidelines and
voluntary, consensus-based standards and best practices for
watermarking, content provenance information, synthetic content
and synthetically-modified content detection, including for
images, audio, video, text, and multimodal content, the use of
data to train artificial intelligence systems, and such other
matters relating to transparency of synthetic media as the
Under Secretary considers appropriate.

(2) Facilitating the development of guidelines, metrics,
and practices to evaluate and assess tools to detect and label
synthetic content, synthetically-modified content, and non-
synthetic content, including artificial intelligence red-
teaming and artificial intelligence blue-teaming.

(3) Establishing grand challenges and prizes in
coordination with the Defense Advanced Research Projects Agency
and the National Science Foundation to detect and label
synthetic content, synthetically-modified content, and non-
synthetic content and to develop cybersecurity and other
countermeasures to defend against tampering with detection
tools, watermarks, or content provenance information.

(b) Consultation.--In developing the standards described in
subsection (a), the Under Secretary shall consult with the Register of
Copyrights and the Director.
SEC. 5. RESEARCH, DEVELOPMENT, AND PUBLIC EDUCATION REGARDING SYNTHETIC
CONTENT AND SYNTHETICALLY-MODIFIED CONTENT.

(a) Research and Development.--The Under Secretary shall carry out
a research program to enable advances in measurement science,
standards, and testing relating to the robustness and efficacy of--

(1) technologies for synthetic content and synthetically-
modified content detection, watermarking, and content
provenance information; and

(2) cybersecurity protections and other countermeasures
used to prevent tampering with such technologies.

(b) Public Education Campaigns Regarding Synthetic Content.--Not
later than 1 year after the date of enactment of this Act, the Under
Secretary shall, in consultation with the Register of Copyrights and
the Director, carry out a public education campaign regarding synthetic
content and synthetically-modified content (including deepfakes),
watermarking, and content provenance information.
SEC. 6. PROHIBITED ACTS.

(a) Content Provenance Information.--

(1) Synthetic content and synthetically-modified content.--
Beginning on the date that is 2 years after the date of
enactment of this Act, any person who, for a commercial
purpose, makes available in interstate commerce a tool used for
the primary purpose of creating synthetic content or
synthetically-modified content shall--
(A) taking into consideration the content
provenance information standards established under
section 4, provide users of such tool with the ability
to include content provenance information that
indicates the piece of digital content is synthetic
content or synthetically-modified content for any
synthetic content or synthetically-modified content
created by the tool; and
(B) in the event a user opts to include content
provenance information under subparagraph (A),
establish, to the extent technically feasible,
reasonable security measures to ensure that such
content provenance information is machine-readable and
not easily removed, altered, or separated from the
underlying content.

(2) Covered content.--Beginning on the date that is 2 years
after the date of enactment of this Act, any person who, for a
commercial purpose, makes available in interstate commerce a
tool used for the primary purpose of creating or substantially
modifying covered content shall--
(A) taking into consideration the content
provenance information standards established under
section 4, provide users of such tool with the ability
to include content provenance information for any
covered content created or significantly modified by
the tool; and
(B) in the event a user opts to include content
provenance information under subparagraph (A),
establish, to the extent technically feasible,
reasonable security measures to ensure that such
content provenance information is machine-readable and
not easily removed, altered, or separated from the
underlying content.
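
[Illustrative note, not part of the bill text] Subparagraph (B) of paragraphs (1) and (2) above calls for reasonable security measures, to the extent technically feasible, so that provenance information stays machine-readable and is not easily removed, altered, or separated from the underlying content. One common way to approach that goal is to cryptographically bind the record to the content so that any removal or alteration becomes detectable. The Python sketch below shows that idea using an HMAC over the content hash and the record; the key handling, field names, and scheme are assumptions for illustration only, not requirements of the bill.

    # Hypothetical sketch: tamper-evident binding of a provenance record to content.
    # The signing scheme and key management here are illustrative assumptions only.
    import hashlib
    import hmac
    import json

    SIGNING_KEY = b"example-tool-signing-key"  # a real tool would protect this key (e.g., in an HSM)

    def bind_provenance(content: bytes, record: dict) -> dict:
        """Return the record plus a tag binding it to this exact content."""
        payload = hashlib.sha256(content).hexdigest() + json.dumps(record, sort_keys=True)
        tag = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return {"record": record, "binding_tag": tag}

    def verify_binding(content: bytes, bound: dict) -> bool:
        """Detect whether the record was altered or detached from the content."""
        payload = hashlib.sha256(content).hexdigest() + json.dumps(bound["record"], sort_keys=True)
        expected = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, bound["binding_tag"])

Under these assumptions, verification fails if the record is edited, attached to different content, or stripped and reattached without the key, which is the kind of resistance to removal or alteration the subparagraph contemplates.
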

(b) Removal of Content Provenance Information.--

(1) In general.--It shall be unlawful for any person to
knowingly remove, alter, tamper with, or disable content
provenance information in furtherance of an unfair or deceptive
act or practice in or affecting commerce.

(2) Covered platforms.--
(A) In general.--Subject to subparagraph (B), it
shall be unlawful for a covered platform to remove,
alter, tamper with, or disable content provenance
information or to separate the content provenance
information from the content so that the content
provenance information cannot be accessed by users of
the platform.
(B) Exception for security research.--A covered
platform shall not be liable for a violation of
subparagraph (A) if such covered platform removes,
alters, tampers with, or disables content provenance
information for a purpose necessary, proportionate, and
limited to perform research to enhance the security of
the covered platform.
(c) Prohibition on Non-Consensual Use of Covered Content That Has
Attached or Associated Content Provenance Information.--It shall be
unlawful for any person, for a commercial purpose, to knowingly use any
covered content that has content provenance information that is
attached to or associated with such covered content or covered content
from which the person knows or should know that content provenance
information has been removed or separated in violation of subsection
(b), in order to train a system that uses artificial intelligence or an
algorithm or to generate synthetic content or synthetically-modified
content unless such person obtains the express, informed consent of the
person who owns the covered content, and complies with any terms of use
pertaining to the use of such content, including terms regarding
compensation for such use, as required by the owner of copyright in
such content.
SEC. 7.

(a) Enforcement by the Commission.--

(1) Unfair or deceptive acts or practices.--A violation of
this Act or a regulation promulgated under this Act shall be
treated as a violation of a rule defining an unfair or
deceptive act or practice prescribed under
section 18(a)(1)(B) of the Federal Trade Commission Act
(15 U.S.C. 57a(a)(1)(B)).

(2) Powers of the commission.--
(A) In general.--The Commission shall enforce this
Act in the same manner, by the same means, and with the
same jurisdiction, powers, and duties as though all
applicable terms and provisions of the Federal Trade
Commission Act (15 U.S.C. 41 et seq.) were incorporated
into and made a part of this title.
(B) Privileges and immunities.--Any person who
violates this Act, or a regulation promulgated under
this Act shall be subject to the penalties and entitled
to the privileges and immunities provided in the
Federal Trade Commission Act (15 U.S.C. 41 et seq.).
(C) Authority preserved.--Nothing in this Act shall
be construed to limit the authority of the Commission
under any other provision of law.

(b) Enforcement by States.--

(1) In general.--In any case in which the attorney general
of a State has reason to believe that an interest of the
residents of the State has been or is threatened or adversely
affected by the engagement of any person in a practice that
violates this Act, the attorney general of the State may, as
parens patriae, bring a civil action on behalf of the residents
of the State in an appropriate district court of the United
States to--
(A) enjoin further violation of this Act by such
person;
(B) compel compliance with this Act;
(C) obtain damages, restitution, or other
compensation on behalf of such residents; and
(D) obtain such other relief as the court may
consider to be appropriate.

(2) Rights of the commission.--
(A) Notice to the commission.--
(i) In general.--Except as provided in
clause (iii), the attorney general of a State
shall notify the Commission in writing that the
attorney general intends to bring a civil
action under paragraph (1) before initiating
the civil action.
(ii) Contents.--The notification required
by clause (i) with respect to a civil action
shall include a copy of the complaint to be
filed to initiate the civil action.
(iii) Exception.--If it is not feasible for
the attorney general of a State to provide the
notification required by clause (i) before
initiating a civil action under paragraph (1),
the attorney general shall notify the
Commission immediately upon instituting the
civil action.
(B) Intervention by the commission.--The Commission
may--
(i) intervene in any civil action brought
by the attorney general of a State under
paragraph (1); and
(ii) upon intervening--
(I) be heard on all matters arising
in the civil action; and
(II) file petitions for appeal of a
decision in the civil action.

(3) Investigatory powers.--Nothing in this subsection may
be construed to prevent the attorney general of a State from
exercising the powers conferred on the attorney general by the
laws of the State to conduct investigations, to administer
oaths or affirmations, or to compel the attendance of witnesses
or the production of documentary or other evidence.

(4) Action by the commission.--If the Commission institutes
a civil action or an administrative action with respect to a
violation of this Act, the attorney general of a State may not,
during the pendency of such action, bring a civil action under
paragraph (1) against any defendant named in the complaint of
the Commission for the violation with respect to which the
Commission instituted such action.

(5) Venue; service of process.--
(A) Venue.--Any action brought under paragraph (1)
may be brought in--
(i) the district court of the United States
that meets applicable requirements relating to
venue under section 1391 of title 28, United
States Code; or
(ii) another court of competent
jurisdiction.
(B) Service of process.--In an action brought under
paragraph (1), process may be served in any district in
which the defendant--
(i) is an inhabitant; or
(ii) may be found.

(6) Actions by other state officials.--
(A) In general.--In addition to civil actions
brought by attorneys general under paragraph (1), any
other officer of a State who is authorized by the State
to do so may bring a civil action under paragraph (1),
subject to the same requirements and limitations that
apply under this subsection to civil actions brought by
attorneys general.
(B) Savings provision.--Nothing in this subsection
may be construed to prohibit an authorized official of
a State from initiating or continuing any proceeding in
a court of the State for a violation of any civil or
criminal law of the State.

(7) Damages.--If a person brings a civil action for a
violation of this Act pursuant to subsection (c) and receives
any monetary damages, the court shall reduce the amount of any
damages awarded under this subsection by the amount of monetary
damages awarded to such person.
(c) Enforcement by Private Parties and Government Entities.--

(1) In general.--Any person who owns covered content that
has content provenance information that is attached to or
associated with such covered content may bring a civil action
in a court of competent jurisdiction against--
(A) any person or covered platform for removing,
altering, tampering with, or disabling such content
provenance information in violation of subsection (b)(1)
or (b)(2) of section 6; and
(B) any person for using such covered content in
violation of section 6(c).

(2) Relief.--In a civil action brought under paragraph (1)
in which the plaintiff prevails, the court may award the
plaintiff declaratory or injunctive relief, compensatory
damages, and reasonable litigation expenses, including a
reasonable attorney's fee.

(3) Statute of limitations.--An action for a violation of
this Act brought under this subsection may be commenced not
later than 4 years after the date upon which the plaintiff
discovers or should have discovered the facts giving rise to
such violation.
SEC. 8.

This Act does not impair or in any way alter the rights of
copyright owners under any other applicable law.
SEC. 9.

If any provision of this Act, or an amendment made by this Act, is
determined to be unenforceable or invalid, the remaining provisions of
this Act and the amendments made by this Act shall not be affected.
<all>