Senators Introduce Bill to Ban AI Companions for Minors Over Mental Health Fears

In brief

  • The bill targets AI chatbots and companions marketed to minors.
  • Data has shown widespread teen use of AI for emotional support and relationships.
  • Critics say companies have failed to protect young users from manipulation and harm.

A bipartisan group of U.S. senators on Tuesday introduced a bill to restrict how artificial intelligence models can interact with children, warning that AI companions pose serious risks to minors’ mental health and emotional well-being.

The legislation, called the GUARD Act, would ban AI companions for minors, require chatbots to clearly identify themselves as non-human, and create new criminal penalties for companies whose products aimed at minors solicit or generate sexual content.
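
To make the requirements concrete, here is a minimal, purely illustrative sketch of how a chat service might gate a companion bot along the lines the bill describes (age gating plus a recurring non-human disclosure). Every name, type, and interval below is hypothetical and is not drawn from the bill text:

```python
from dataclasses import dataclass

# The disclosure text and the 10-message interval are arbitrary choices
# for illustration, not requirements taken from the GUARD Act.
AI_DISCLOSURE = "Reminder: you are chatting with an AI, not a human."

@dataclass
class Session:
    user_age: int | None    # result of a hypothetical age-verification step
    message_count: int = 0

def respond(session: Session, model_reply: str) -> str:
    """Gate a companion bot: refuse minors and unverified users,
    and periodically repeat a non-human disclosure."""
    if session.user_age is None or session.user_age < 18:
        return "This companion service is not available to minors."
    session.message_count += 1
    if session.message_count % 10 == 1:
        return f"{AI_DISCLOSURE}\n\n{model_reply}"
    return model_reply
```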

“In their race to the bottom, AI companies are pushing treacherous chatbots at kids and looking away when their products cause sexual abuse, or coerce them into self-harm or suicide,” said Sen. Richard Blumenthal (D-Conn.), one of the bill’s co-sponsors, in a statement.

“Our legislation imposes strict safeguards against exploitative or manipulative AI, backed by tough enforcement with criminal and civil penalties,” he added. “Big Tech has betrayed any claim that we should trust companies to do the right thing on their own when they consistently put profit first ahead of child safety.”

The scale of the issue is sobering. A July survey by Common Sense Media found that 72% of teens have used AI companions, and more than half use them at least a few times a month. About one in three said they use AI for social or romantic interaction, emotional support, or conversation practice, and many reported that chats with AI felt as meaningful as those with real friends. A similar share said they had turned to AI companions instead of humans to discuss serious or personal issues.

Concerns have deepened as lawsuits mount against major AI companies over their products’ alleged roles in teen self-harm and suicide. Among them, the parents of 16-year-old Adam Raine—who discussed suicide with ChatGPT before taking his life—have filed a wrongful death lawsuit against OpenAI.

The company drew criticism for its legal response, which included requests for the attendee list and eulogies from the teen’s memorial. Lawyers for the family called the company’s conduct “intentional harassment.”

“AI is moving faster than any technology we’ve dealt with, and we’re already seeing its impact on behavior, belief, and emotional health,” Shady El Damaty, co-founder of Holonym and a digital rights advocate, told Decrypt.

“This is starting to look more like the nuclear arms race than the iPhone era. We’re talking about tech that can shift how people think, that needs to be treated with serious, global accountability.”

El Damaty added that user rights are essential to keeping people safe. “If you build tools that affect how people live and think, you’re responsible for how those tools are used,” he said.

The issue extends beyond minors. This week OpenAI disclosed that 1.2 million users discuss suicide with ChatGPT every week, about 0.15% of its weekly active users. Nearly half a million display explicit or implicit suicidal intent, another 560,000 show signs of psychosis or mania each week, and over a million users exhibit heightened emotional attachment to the chatbot, according to company data.
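
As a quick sanity check on those figures (assuming they are exact rather than rounded), the disclosed count and percentage together imply a weekly user base of roughly 800 million:

```python
# 1.2 million weekly users flagged for suicide-related conversations,
# reported as 0.15% of all users, implies ~800 million weekly users.
flagged_users = 1_200_000
share = 0.0015
print(f"{flagged_users / share:,.0f}")  # -> 800,000,000
```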

Forums have also sprung up on Reddit and other platforms for users who say they are in romantic relationships with AI bots. In these groups, users describe their relationships with AI “boyfriends” and “girlfriends,” and share AI-generated images of themselves and their “partners.”

In response to growing scrutiny, OpenAI this month formed an Expert Council on Well-Being and AI, made up of academics and nonprofit leaders, to help guide how its products handle mental health interactions. The move came alongside CEO Sam Altman’s announcement that the company will begin relaxing restrictions on adult content in December.

Source: https://decrypt.co/346624/senators-introduce-bill-ban-ai-companions-minors-mental-health-fears
