WASHINGTON – Today, U.S. Senators Jim Risch (R-Idaho), Richard Blumenthal (D-Conn.), and Marsha Blackburn (R-Tenn.) introduced the Kids Online Safety Act—comprehensive bipartisan legislation to protect children online and hold Big Tech accountable.
“America’s youth are our most vulnerable population, yet big tech and social media’s current policies are jeopardizing their mental and physical well-being,” said Risch. “The Kids Online Safety Act will empower parents to have a greater role in protecting their kids online while holding tech accountable.”
“Our bill provides specific tools to stop Big Tech companies from driving toxic content at kids and to hold them accountable for putting profits over safety,” said Blumenthal. “Record levels of hopelessness and despair—a national teen mental health crisis—have been fueled by black box algorithms featuring eating disorders, bullying, suicidal thoughts, and more. Kids and parents want to take back control over their online lives. They are demanding safeguards, means to disconnect, and a duty of care for social media. Our bill has strong bipartisan momentum. And it has growing support from young people who’ve seen Big Tech’s destruction, parents who’ve lost children, mental health experts, and public interest advocates. It’s an idea whose time has come.”
“Over the last two years, Senator Blumenthal and I have met with countless parents, psychologists, and pediatricians who are all in agreement that children are suffering at the hands of online platforms,” said Blackburn. “Big Tech has proven to be incapable of appropriately protecting our children, and it’s time for Congress to step in. The bipartisan Kids Online Safety Act not only requires social media companies to make their platforms safer by default, but it provides parents with the tools they need to protect their children online. I thank Senator Blumenthal for his continued partnership on this critical issue and urge my colleagues to join us in the fight to protect our children online.”
The Kids Online Safety Act provides young people and parents with the tools, safeguards, and transparency they need to protect against online harms. The bill requires social media platforms to put the well-being of children first, ensuring an environment that is safe by default. The legislation requires independent audits by experts and academic researchers to ensure that social media platforms are taking meaningful steps to address risks to kids.
The Kids Online Safety Act has been cosponsored by U.S. Senators Shelley Moore Capito (R-W.Va.), Ben Ray Luján (D-N.M.), Bill Cassidy (R-La.), Tammy Baldwin (D-Wis.), Joni Ernst (R-Iowa), Amy Klobuchar (D-Minn.), Steve Daines (R-Mont.), Gary Peters (D-Mich.), Marco Rubio (R-Fla.), John Hickenlooper (D-Colo.), Dan Sullivan (R-Alaska), Chris Murphy (D-Conn.), Todd Young (R-Ind.), Chris Coons (D-Del.), Chuck Grassley (R-Iowa), Brian Schatz (D-Hawaii), Lindsey Graham (R-S.C.), Mark Warner (D-Va.), Roger Marshall (R-Kan.), Peter Welch (D-Vt.), Cindy Hyde-Smith (R-Miss.), Maggie Hassan (D-N.H.), Markwayne Mullin (R-Okla.), Dick Durbin (D-Ill.), Sheldon Whitehouse (D-R.I.), and Katie Britt (R-Ala.). More cosponsors may be added during today’s session.
The Kids Online Safety Act is supported by hundreds of advocacy and technology groups, including Common Sense Media, American Psychological Association, American Academy of Pediatrics, American Compass, Eating Disorders Coalition, Fairplay, Mental Health America, and Digital Progress Institute.
The Kids Online Safety Act:
- Requires that social media platforms provide minors with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations. Platforms would be required to enable the strongest settings by default;
- Gives parents new controls to help support their children and identify harmful behaviors, and provides parents and children with a dedicated channel to report harms to kids to the platform;
- Creates a responsibility for social media platforms to prevent and mitigate harms to minors, such as promotion of suicide, eating disorders, substance abuse, sexual exploitation, and products that are unlawful for minors (e.g., gambling and alcohol);
- Requires social media platforms to undergo an annual independent audit that assesses the risks to minors, the platform’s compliance with this legislation, and whether the platform is taking meaningful steps to prevent those harms; and
- Provides academic and public interest organizations with access to critical datasets from social media platforms to foster research regarding harms to the safety and well-being of minors.
# # #