GUARD Act (S. 3062, 119th Congress)

Introduced: Oct 28, 2025

Bill Statistics

Actions: 2
Cosponsors: 6
Summaries: 0
Subjects: 0
Text Versions: 1
Full Text: Yes

Latest Action

Oct 28, 2025
Read twice and referred to the Committee on the Judiciary.

Actions (2)

Read twice and referred to the Committee on the Judiciary.
Type: IntroReferral | Source: Senate
Oct 28, 2025
Introduced in Senate
Type: IntroReferral | Source: Library of Congress | Code: 10000
Oct 28, 2025

Cosponsors (6)

Text Versions (1)

Introduced in Senate

Oct 28, 2025

Full Bill Text

Length: 14,434 characters | Version: Introduced in Senate | Version Date: Oct 28, 2025 | Last Updated: Nov 15, 2025 6:03 AM
[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[S. 3062 Introduced in Senate (IS)]

<DOC>

119th CONGRESS
1st Session
S. 3062

To require artificial intelligence chatbots to implement age
verification measures and make certain disclosures, and for other
purposes.

_______________________________________________________________________

IN THE SENATE OF THE UNITED STATES

October 28, 2025

Mr. Hawley (for himself, Mr. Blumenthal, Mrs. Britt, Mr. Warner, Mr.
Murphy, and Mr. Kelly) introduced the following bill; which was read
twice and referred to the Committee on the Judiciary

_______________________________________________________________________

A BILL

To require artificial intelligence chatbots to implement age
verification measures and make certain disclosures, and for other
purposes.

Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1.

This Act may be cited as the ``Guidelines for User Age-verification
and Responsible Dialogue Act of 2025'' or the ``GUARD Act''.
SEC. 2.

Congress finds the following:

(1) Artificial intelligence chatbots are increasingly being
deployed on social media platforms and in consumer applications
used by minors.

(2) These chatbots can generate and disseminate harmful or
sexually explicit content to children.

(3) These chatbots can manipulate emotions and influence
behavior in ways that exploit the developmental vulnerabilities
of minors.

(4) The widespread availability of such chatbots exposes
children to physical and psychological safety risks, including
grooming, addiction, self-harm, and harm to others.

(5) Protecting children from artificial intelligence
chatbots that simulate human interaction without accountability
is a compelling governmental interest.
SEC. 3.

In this Act:

(1) AI companion.--The term ``AI companion'' means an
artificial intelligence chatbot that--
(A) provides adaptive, human-like responses to user
inputs; and
(B) is designed to encourage or facilitate the
simulation of interpersonal or emotional interaction,
friendship, companionship, or therapeutic
communication.

(2) Artificial intelligence chatbot.--The term ``artificial
intelligence chatbot''--
(A) means any interactive computer service or
software application that--
(i) produces new expressive content or
responses not fully predetermined by the
developer or operator of the service or
application; and
(ii) accepts open-ended natural-language or
multimodal user input and produces adaptive or
context-responsive output; and
(B) does not include an interactive computer
service or software application--
(i) the responses of which are limited to
contextualized replies; and
(ii) that is unable to respond on a range
of topics outside of a narrow specified
purpose.

(3) Covered entity.--The term ``covered entity'' means any
person who owns, operates, or otherwise makes available an
artificial intelligence chatbot to individuals in the United
States.

(4) Minor.--The term ``minor'' means any individual who has
not attained 18 years of age.

(5) Reasonable age verification measure.--The term
``reasonable age verification measure'' means a method that is
authenticated to relate to a user of an artificial intelligence
chatbot, such as--
(A) a government-issued identification; or
(B) any other commercially reasonable method that
can reliably and accurately--
(i) determine whether a user is an adult;
and
(ii) prevent access by minors to AI
companions, as required by section 6.

(6) Reasonable age verification process.--The term
``reasonable age verification process'' means an age
verification process employed by a covered entity that--
(A) uses one or more reasonable age verification
measures in order to verify the age of a user of an
artificial intelligence chatbot owned, operated, or
otherwise made available by the covered entity;
(B) provides that requiring a user to confirm that
the user is not a minor, or to insert the user's birth
date, is not sufficient to constitute a reasonable age
verification measure;
(C) ensures that each user is subjected to each
reasonable age verification measure used by the covered
entity as part of the age verification process; and
(D) does not base verification of a user's age on
factors such as whether the user shares an Internet
Protocol address, hardware identifier, or other
technical indicator with another user determined to not
be a minor.
SEC. 4.

(a) In General.--Part I of title 18, United States Code, is amended
by inserting after chapter 5 the following:

``CHAPTER 6--ARTIFICIAL INTELLIGENCE

``Sec.
``91. Artificial intelligence chatbots.
``Sec. 91. Artificial intelligence chatbots

``(a) Definitions.--In this section:

``(1) Artificial intelligence chatbot.--The term `artificial
intelligence chatbot'--
``(A) means any interactive computer service or
software application that--
``(i) produces new expressive content or
responses not fully predetermined by the
developer or operator of the service or
application; and
``(ii) accepts open-ended natural-language or
multimodal user input and produces adaptive or
context-responsive output; and
``(B) does not include an interactive computer
service or software application--
``(i) the responses of which are limited to
contextualized replies; and
``(ii) that is unable to respond on a range
of topics outside of a narrow specified
purpose.

``(2) Minor.--The term `minor' means any individual who has
not attained 18 years of age.

``(3) Sexually explicit conduct.--The term `sexually explicit
conduct' has the meaning given the term in section 2256.

``(b) Solicitation of Minors.--
``(1) Offense.--It shall be unlawful to design, develop, or
make available an artificial intelligence chatbot, knowing or
with reckless disregard for the fact that the artificial
intelligence chatbot poses a risk of soliciting, encouraging,
or inducing minors to--
``(A) engage in, describe, or simulate sexually
explicit conduct; or
``(B) create or transmit any visual depiction of
sexually explicit conduct, including any visual
depiction described in section 1466A(a).
``(2) Penalty.--Any person who violates paragraph (1) shall
be fined not more than $100,000 per offense.
``(c) Promotion of Physical Violence.--
``(1) Offense.--It shall be unlawful to design, develop, or
make available an artificial intelligence chatbot, knowing or
with reckless disregard for the fact that the artificial
intelligence chatbot encourages, promotes, or coerces suicide,
non-suicidal self-injury, or imminent physical or sexual
violence.
``(2) Penalty.--Any person who violates paragraph (1) shall
be fined not more than $100,000 per offense.''.

(b) Technical and Conforming Amendment.--The table of chapters for
part I of title 18, United States Code, is amended by inserting after
the item relating to chapter 5 the following:

``6. Artificial intelligence................................ 91''.
SEC. 5.

(a) Creation of User Accounts.--A covered entity shall require each
individual accessing an artificial intelligence chatbot to make a user
account in order to use or otherwise interact with such chatbot.

(b) Age Verification.--

(1) Age verification of existing accounts.--With respect to
each user account of an artificial intelligence chatbot that
exists as of the effective date of this Act, a covered entity
shall--
(A) on such date, freeze any such account;
(B) in order to restore the functionality of such
account, require that the user provide age data that is
verifiable using a reasonable age verification process,
subject to paragraph (4); and
(C) using such age data, classify each user as a
minor or an adult.

(2) Age verification of new accounts.--At the time an
individual creates a new user account to use or interact with
an artificial intelligence chatbot, a covered entity shall--
(A) request age data from the individual;
(B) verify the individual's age using a reasonable
age verification process, subject to paragraph (4); and
(C) using such age data, classify each user as a
minor or an adult.

(3) Periodic age verification.--A covered entity shall
periodically review previously verified user accounts using a
reasonable age verification process, subject to paragraph (4),
to ensure compliance with this Act.

(4) Use of third parties.--For purposes of paragraphs (1)(B),
(2)(B), and (3), a covered entity may contract with a
third party to employ reasonable age verification measures as
part of the covered entity's reasonable age verification
process, but the use of such a third party shall not relieve
the covered entity of its obligations under this Act or from
liability under this Act.

(5) Age verification measure data security.--A covered
entity--
(A) shall establish, implement, and maintain
reasonable data security to--
(i) limit collection of personal data to
that which is minimally necessary to verify a
user's age or maintain compliance with this
Act; and
(ii) protect such age verification data
against unauthorized access;
(B) shall protect such age verification data
against unauthorized access;
(C) shall protect the integrity and confidentiality
of such data by only transmitting such data using
industry-standard encryption protocols;
(D) shall retain such data for no longer than is
reasonably necessary to verify a user's age or maintain
compliance with this Act; and
(E) may not share with, transfer to, or sell to,
any other entity such data.
(c) Required Disclosures for Artificial Intelligence Chatbots.--

(1) Disclosure of non-human status.--Each artificial
intelligence chatbot made available to users shall--
(A) at the initiation of each conversation with a
user and at 30-minute intervals, clearly and
conspicuously disclose to the user that the chatbot is
an artificial intelligence system and not a human
being; and
(B) be programmed to ensure that the chatbot does
not claim to be a human being or otherwise respond
deceptively when asked by a user if the chatbot is a
human being.

(2) Disclosure regarding non-professional status.--
(A) In general.--An artificial intelligence chatbot
may not represent, directly or indirectly, that the
chatbot is a licensed professional, including a
therapist, physician, lawyer, financial advisor, or
other professional.
(B) Other limitations.--Each artificial
intelligence chatbot made available to users shall, at
the initiation of each conversation with a user and at
reasonably regular intervals, clearly and conspicuously
disclose to the user that--
(i) the chatbot does not provide medical,
legal, financial, or psychological services;
and
(ii) users of the chatbot should consult a
licensed professional for such advice.
SEC. 6.

If the age verification process described in section 5(b) determines
that an individual is a minor, a covered entity shall prohibit the
minor from accessing or using any AI companion owned, operated, or
otherwise made available by the covered entity.
SEC. 7.

(a) In General.--In the case of a violation of section 5 or 6, or a
regulation promulgated thereunder, the Attorney General may bring a
civil action in an appropriate district court of the United States to--

(1) enjoin the violation;

(2) enforce compliance with section 5 or 6, or the
regulation promulgated thereunder; or

(3) obtain civil penalties under subsection (c) of this
section, restitution, and other appropriate relief.

(b) Attorney General Powers.--

(1) Investigatory powers.--For the purpose of conducting
investigations or bringing enforcement actions under this
section, the Attorney General may issue subpoenas, administer
oaths, and compel the production of documents or testimony.

(2) Rulemaking.--The Attorney General may promulgate any
regulations necessary to carry out this Act.
(c) Civil Penalties.--

(1) In general.--Any person who violates section 5 or 6, or
a regulation promulgated thereunder, shall be subject to a
civil penalty not to exceed $100,000 for each violation.

(2) Separate violations.--Each violation described in
paragraph (1) shall be considered a separate violation.
(d) State Enforcement.--In any case in which the attorney general
of a State has reason to believe that an interest of the residents of
that State has been or is threatened or adversely affected by the
engagement of any covered entity in a violation of this Act or a
regulation promulgated thereunder, the State, as parens patriae, may
bring a civil action on behalf of the residents of the State in a
district court of the United States or a State court of appropriate
jurisdiction to obtain injunctive relief.

(e) Relationship to State Laws.--Nothing in this Act or an
amendment made by this Act, or any regulation promulgated thereunder,
shall be construed to prohibit or otherwise affect the enforcement of
any State law or regulation that is at least as protective of users of
artificial intelligence chatbots as this Act and the amendments made by
this Act, and the regulations promulgated thereunder.
SEC. 8.

This Act and the amendments made by this Act shall take effect on
the date that is 180 days after the date of enactment of this Act.
<all>
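
Implementation note (not part of the bill text): the disclosure timing in
section 5(c)(1)(A) is concrete enough to sketch in code. The minimal Python
example below shows one way a covered entity might surface the non-human
disclosure at the start of a conversation and again at 30-minute intervals.
The class name, the generate_reply callable, and the disclosure wording are
illustrative assumptions, not requirements drawn from the bill.

from datetime import datetime, timedelta

# Interval named in section 5(c)(1)(A) of the bill.
DISCLOSURE_INTERVAL = timedelta(minutes=30)

# Hypothetical wording; the bill requires only that the disclosure be
# clear and conspicuous.
DISCLOSURE = (
    "Notice: you are interacting with an artificial intelligence "
    "system, not a human being."
)


class DisclosingChatSession:
    """Wraps an arbitrary reply generator and prepends the non-human
    disclosure at the start of a conversation and again whenever 30
    minutes have passed since the last disclosure."""

    def __init__(self, generate_reply):
        # generate_reply: any callable mapping user text to model text
        # (hypothetical interface; the bill does not prescribe one).
        self._generate_reply = generate_reply
        self._last_disclosed = None

    def respond(self, user_message: str) -> str:
        reply = self._generate_reply(user_message)
        now = datetime.now()
        if (self._last_disclosed is None
                or now - self._last_disclosed >= DISCLOSURE_INTERVAL):
            self._last_disclosed = now
            return f"{DISCLOSURE}\n\n{reply}"
        return reply

A real deployment would also have to satisfy section 5(c)(1)(B) (never
claiming to be human when asked) and the professional-status disclosures in
section 5(c)(2), which depend on the behavior of the underlying model rather
than on message timing.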