Introduced:
Feb 26, 2025
Policy Area:
Commerce
Congress.gov:
Bill Statistics
3
Actions
25
Cosponsors
1
Summaries
1
Subjects
1
Text Versions
Yes
Full Text
Latest Action
Feb 26, 2025
Referred to the House Committee on Energy and Commerce.
Summaries (1)
Introduced in House - Feb 26, 2025
<p><strong>Shielding Children's Retinas from Egregious Exposure on the Net Act or the SCREEN Act</strong><br/><br/>This bill establishes age-verification requirements for commercial interactive computer services (e.g., websites) that make available content that is harmful to minors (e.g., content that appeals to the prurient interest in nudity or sex, is obscene, or is child pornography).<br/><br/>Specifically, the bill requires such services to adopt and utilize technology verification measures to ensure that (1) users of the service are not minors, and (2) minors are prevented from accessing any content on the service that is harmful to minors.<br/><br/>Additionally, such services must (1) use the technology to verify a user's age; (2) publish the verification process that the service uses; and (3) subject users' Internet Protocol (IP) addresses, including known virtual proxy network (VPN) IP addresses, to the technology verification measures, unless the service determines a user is not located within the United States.<br/><br/>Covered services also must implement data security measures to protect information about individuals collected through the verification process.<br/><br/>The Federal Trade Commission must conduct regular audits of such services, issue guidance, and otherwise enforce the requirements of this bill.</p>
Actions (3)
Referred to the House Committee on Energy and Commerce.
Type: IntroReferral
| Source: House floor actions
| Code: H11100
Feb 26, 2025
Introduced in House
Type: IntroReferral
| Source: Library of Congress
| Code: Intro-H
Feb 26, 2025
Introduced in House
Type: IntroReferral
| Source: Library of Congress
| Code: 1000
Feb 26, 2025
Subjects (1)
Commerce
(Policy Area)
Cosponsors (20 of 25)
(R-FL) Sep 3, 2025
(R-SC) Sep 2, 2025
(R-AL) Sep 2, 2025
(R-SC) Sep 2, 2025
(R-IN) Aug 26, 2025
(R-TX) Jul 16, 2025
(R-NC) Jun 9, 2025
(R-TN) Apr 3, 2025
(R-TX) Feb 27, 2025
(R-MD) Feb 27, 2025
(R-TN) Feb 27, 2025
(R-CA) Feb 26, 2025
(R-NJ) Feb 26, 2025
(R-GA) Feb 26, 2025
(R-TN) Feb 26, 2025
(R-AL) Feb 26, 2025
(R-UT) Feb 26, 2025
(R-AZ) Feb 26, 2025
(R-OK) Feb 26, 2025
(R-TX) Feb 26, 2025
Showing latest 20 cosponsors
Full Bill Text
Length: 17,574 characters
Version: Introduced in House
Version Date: Feb 26, 2025
Last Updated: Nov 15, 2025 2:05 AM
[Congressional Bills 119th Congress]
[From the U.S. Government Publishing Office]
[H.R. 1623 Introduced in House (IH)]
<DOC>
119th CONGRESS
1st Session
H. R. 1623
To require certain interactive computer services to adopt and operate
technology verification measures to ensure that users of the platform
are not minors, and for other purposes.
_______________________________________________________________________
IN THE HOUSE OF REPRESENTATIVES
February 26, 2025
Mrs. Miller of Illinois (for herself, Mr. Van Drew, Mr. Brecheen, Mr.
LaMalfa, Mr. Austin Scott of Georgia, Mr. Kennedy of Utah, Mr. Crane,
Mr. Aderholt, Mr. Babin, and Mr. Rose) introduced the following bill;
which was referred to the Committee on Energy and Commerce
_______________________________________________________________________
A BILL
To require certain interactive computer services to adopt and operate
technology verification measures to ensure that users of the platform
are not minors, and for other purposes.
Be it enacted by the Senate and House of Representatives of the
United States of America in Congress assembled,
SECTION 1.
This Act may be cited as the ``Shielding Children's Retinas from
Egregious Exposure on the Net Act'' or the ``SCREEN Act''.
SEC. 2.
(a) Findings.--Congress finds the following:
(1) Over the 3 decades preceding the date of enactment of
this Act, Congress has passed several bills to protect minors
from access to online pornographic content, including title V
of the Telecommunications Act of 1996 (Public Law 104-104)
(commonly known as the ``Communications Decency Act''),
section 231 of the Communications Act of 1934 (47 U.S.C. 231) (commonly
known as the ``Child Online Protection Act''), and the
Children's Internet Protection Act (title XVII of division B of
Public Law 106-554).
(2) With the exception of the Children's Internet
Protection Act (title XVII of division B of Public Law 106-
554), the Supreme Court of the United States has struck down
the previous efforts of Congress to shield children from
pornographic content, finding that such legislation constituted
a ``compelling government interest'' but that it was not the
least restrictive means to achieve such interest. In Ashcroft
v. ACLU, 542 U.S. 656 (2004), the Court even suggested at the
time that ``blocking and filtering software'' could conceivably
be a ``primary alternative'' to the requirements passed by
Congress.
(3) In the nearly 2 decades since the Supreme Court of the
United States suggested the use of ``blocking and filtering
software'', such technology has proven to be ineffective in
protecting minors from accessing online pornographic content.
The Kaiser Family Foundation has found that filters do not work
on 1 in 10 pornography sites accessed intentionally and 1 in 3
pornography sites that are accessed unintentionally. Further,
it has been proven that children are able to bypass ``blocking
and filtering'' software by employing strategic searches or
measures to bypass the software completely.
(4) Additionally, Pew Research has revealed studies showing
that only 39 percent of parents use blocking or filtering
software for their minor's online activities, meaning that 61
percent of children only have restrictions on their internet
access when they are at school or at a library.
(5) 17 States have now recognized pornography as a public
health hazard that leads to a broad range of individual harms,
societal harms, and public health impacts.
(6) It is estimated that 80 percent of minors between the
ages of 12 to 17 have been exposed to pornography, with 54
percent of teenagers seeking it out. The internet is the most
common source for minors to access pornography with
pornographic websites receiving more web traffic in the United
States than Twitter, Netflix, Pinterest, and LinkedIn combined.
(7) Exposure to online pornography has created unique
psychological effects for minors, including anxiety, addiction,
low self-esteem, body image disorders, an increase in
problematic sexual activity at younger ages, and an increased
desire among minors to engage in risky sexual behavior.
(8) The Supreme Court of the United States has recognized
on multiple occasions that Congress has a ``compelling
government interest'' to protect the physical and psychological
well-being of minors, which includes shielding them from
``indecent'' content that may not necessarily be considered
``obscene'' by adult standards.
(9) Because ``blocking and filtering software'' has not
produced the results envisioned nearly 2 decades ago, it is
necessary for Congress to pursue alternative policies to enable
the protection of the physical and psychological well-being of
minors.
(10) The evolution of our technology has now enabled the
use of age verification technology that is cost efficient, not
unduly burdensome, and can be operated narrowly in a manner
that ensures only adults have access to a website's online
pornographic content.
(b) Sense of Congress.--It is the sense of Congress that--
(1) shielding minors from access to online pornographic
content is a compelling government interest that protects the
physical and psychological well-being of minors; and
(2) requiring interactive computer services that are in the
business of creating, hosting, or making available pornographic
content to enact technological measures that shield minors from
accessing pornographic content on their platforms is the least
restrictive means for Congress to achieve its compelling
government interest.
SEC. 3.
In this Act:
(1) Child pornography; minor.--The terms ``child
pornography'' and ``minor'' have the meanings given those terms
in
section 2256 of title 18, United States Code.
(2) Commission.--The term ``Commission'' means the Federal
Trade Commission.
(3) Covered platform.--The term ``covered platform''--
(A) means an entity--
(i) that is an interactive computer
service;
(ii) that--
(I) is engaged in interstate or
foreign commerce; or
(II) purposefully avails itself of
the United States market or a portion
thereof; and
(iii) for which it is in the regular course
of the trade or business of the entity to
create, host, or make available content that
meets the definition of harmful to minors under
paragraph (4) and that is provided by the
entity, a user, or other information content
provider, with the objective of earning a
profit; and
(B) includes an entity described in subparagraph
(A) regardless of whether--
(i) the entity earns a profit on the
activities described in subparagraph (A)(iii); or
(ii) creating, hosting, or making available
content that meets the definition of harmful to
minors under paragraph (4) is the sole source
of income or principal business of the entity.
(4) Harmful to minors.--The term ``harmful to minors'',
with respect to a picture, image, graphic image file, film,
videotape, or other visual depiction, means that the picture,
image, graphic image file, film, videotape, or other
depiction--
(A)(i) taken as a whole and with respect to minors,
appeals to the prurient interest in nudity, sex, or
excretion;
(ii) depicts, describes, or represents, in a
patently offensive way with respect to what is suitable
for minors, an actual or simulated sexual act or sexual
contact, actual or simulated normal or perverted sexual
acts, or lewd exhibition of the genitals; and
(iii) taken as a whole, lacks serious, literary,
artistic, political, or scientific value as to minors;
(B) is obscene; or
(C) is child pornography.
(5) Information content provider; interactive computer
service.--The terms ``information content provider'' and
``interactive computer service'' have the meanings given those
terms in section 230(f) of the Communications Act of 1934 (47
U.S.C. 230(f)).
(6) Sexual act; sexual contact.--The terms ``sexual act''
and ``sexual contact'' have the meanings given those terms in
section 2246 of title 18, United States Code.
(7) Technology verification measure.--The term ``technology
verification measure'' means technology that--
(A) employs a system or process to determine
whether it is more likely than not that a user of a
covered platform is a minor; and
(B) prevents access by minors to any content on a
covered platform.
(8) Technology verification measure data.--The term
``technology verification measure data'' means information
that--
(A) identifies, is linked to, or is reasonably
linkable to an individual or a device that identifies,
is linked to, or is reasonably linkable to an
individual;
(B) is collected or processed for the purpose of
fulfilling a request by an individual to access any
content on a covered platform; and
(C) is collected and processed solely for the
purpose of utilizing a technology verification measure
and meeting the obligations imposed under this Act.
SEC. 4.
(a) Covered Platform Requirements.--Beginning on the date that is 1
year after the date of enactment of this Act, a covered platform shall
adopt and utilize technology verification measures on the platform to
ensure that--
(1) users of the covered platform are not minors; and
(2) minors are prevented from accessing any content on the
covered platform that is harmful to minors.
(b) Requirements for Age Verification Measures.--In order to comply
with the requirement of subsection (a), the technology verification
measures adopted and utilized by a covered platform shall do the
following:
(1) Use a technology verification measure in order to
verify a user's age.
(2) Provide that requiring a user to confirm that the user
is not a minor shall not be sufficient to satisfy the
requirement of subsection (a).
(3) Make publicly available the verification process that
the covered platform is employing to comply with the
requirements under this Act.
(4) Subject the Internet Protocol (IP) addresses, including
known virtual proxy network IP addresses, of all users of a
covered platform to the technology verification measure
described in paragraph (1) unless the covered platform
determines based on available technology that a user is not
located within the United States.
(c) Choice of Verification Measures.--A covered platform may choose
the specific technology verification measures to employ for purposes of
complying with subsection (a), provided that the technology
verification measure employed by the covered platform meets the
requirements of subsection (b) and prohibits a minor from accessing the
platform or any information on the platform that is obscene, child
pornography, or harmful to minors.
(d) Use of Third Parties.--A covered platform may contract with a
third party to employ technology verification measures for purposes of
complying with subsection (a) but the use of such a third party shall
not relieve the covered platform of its obligations under this Act or
from liability under this Act.
(e) Rule of Construction.--Nothing in this section shall be
construed to require a covered platform to submit to the Commission any
information that identifies, is linked to, or is reasonably linkable to
a user of the covered platform or a device that identifies, is linked
to, or is reasonably linkable to a user of the covered platform.
(f) Technology Verification Measure Data Security.--A covered
platform shall--
(1) establish, implement, and maintain reasonable data
security to--
(A) protect the confidentiality, integrity, and
accessibility of technology verification measure data
collected by the covered platform or a third party
employed by the covered platform; and
(B) protect such technology verification measure
data against unauthorized access; and
(2) retain the technology verification measure data for no
longer than is reasonably necessary to utilize a technology
verification measure or what is minimally necessary to
demonstrate compliance with the obligations under this Act.
SEC. 5.
In enforcing the requirements under section 4, the Commission shall
consult with the following individuals, including with respect to the
applicable standards and metrics for making a determination on whether
a user of a covered platform is not a minor:
(1) Individuals with experience in computer science and
software engineering.
(2) Individuals with experience in--
(A) advocating for online child safety; or
(B) providing services to minors who have been
victimized by online child exploitation.
(3) Individuals with experience in consumer protection and
online privacy.
(4) Individuals who supply technology verification measure
products or have expertise in technology verification measure
solutions.
(5) Individuals with experience in data security and
cryptography.
SEC. 6.
(a) In General.--The Commission shall--
(1) conduct regular audits of covered platforms to ensure
compliance with the requirements of section 4;
(2) make public the terms and processes for the audits
conducted under paragraph (1), including the processes for any
third party conducting an audit on behalf of the Commission;
(3) establish a process for each covered platform to submit
only such documents or other materials as are necessary for the
Commission to ensure full compliance with the requirements of
section 4 when conducting audits under this section; and
(4) prescribe the appropriate documents, materials, or
other measures required to demonstrate full compliance with the
requirements of section 4.
(b) Guidance.--
(1) In general.--Not later than 180 days after the date of
enactment of this Act, the Commission shall issue guidance to
assist covered platforms in complying with the requirements of
section 4.
(2) Limitations on guidance.--No guidance issued by the
Commission with respect to this Act shall confer any rights on
any person, State, or locality, nor shall operate to bind the
Commission or any person to the approach recommended in such
guidance. In any enforcement action brought pursuant to this
Act, the Commission shall allege a specific violation of a
provision of this Act. The Commission may not base an
enforcement action on, or execute a consent order based on,
practices that are alleged to be inconsistent with any such
guidelines, unless the practices allegedly violate a provision
of this Act.
SEC. 7.
(a) Unfair or Deceptive Act or Practice.--A violation of section 4
shall be treated as a violation of a rule defining an unfair or
deceptive act or practice under section 18(a)(1)(B) of the Federal
Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).
(b) Powers of the Commission.--
(1) In general.--The Commission shall enforce section 4 in
the same manner, by the same means, and with the same
jurisdiction, powers, and duties as though all applicable terms
and provisions of the Federal Trade Commission Act (15 U.S.C.
41 et seq.) were incorporated into and made a part of this
title.
(2) Privileges and immunities.--Any person who violates
section 4 shall be subject to the penalties and entitled to the
privileges and immunities provided in the Federal Trade
Commission Act (15 U.S.C. 41 et seq.).
(3) Authority preserved.--Nothing in this Act shall be
construed to limit the authority of the Commission under any
other provision of law.
SEC. 8.
Not later than 2 years after the date on which covered platforms
are required to comply with the requirement of section 4(a), the
Comptroller General of the United States shall submit to Congress a
report that includes--
(1) an analysis of the effectiveness of the technology
verification measures required under such section;
(2) an analysis of rates of compliance with such section
among covered platforms;
(3) an analysis of the data security measures used by
covered platforms in the age verification process;
(4) an analysis of the behavioral, economic, psychological,
and societal effects of implementing technology verification
measures;
(5) recommendations to the Commission on improving
enforcement of section 4(a), if any; and
(6) recommendations to Congress on potential legislative
improvements to this Act, if any.
SEC. 9.
If any provision of this Act, or the application of such a
provision to any person or circumstance, is held to be
unconstitutional, the remaining provisions of this Act, and the
application of such provisions to any other person or circumstance,
shall not be affected thereby.
<all>