Washington Must Protect Kids From AI Chatbot Therapy

Many people, including kids, are using AI chatbots in lieu of professional counseling. (Image: Cheryl Murfin)

Not long ago, if a child was struggling emotionally, the adults in their life worried about who they were talking to at school, online, or late at night on the phone. Now there’s a new, quieter concern: who — or what — is listening when kids are at their most vulnerable.

Artificial intelligence chatbots are increasingly filling that space. They’re always available. They sound kind. They don’t interrupt. And for a young person feeling overwhelmed, lonely, or desperate, that can feel like relief.

But it can also be dangerous.

Across the country, states are beginning to step in, passing laws to prevent AI chatbots from offering mental health advice to young users. The move follows deeply troubling reports of young people harming themselves after turning to these programs for something that looked a lot like therapy — but wasn’t.

To be clear, technology can play a helpful role. Chatbots can share resources, encourage coping strategies, or point someone toward professional help. The problem is how easily that line blurs — especially for kids who don’t yet have the tools to tell the difference between a supportive response and real clinical care.

Mental health professionals have been sounding the alarm.

In a recent Stateline article, Mitch Prinstein, a senior science adviser at the American Psychological Association, said that some chatbots cross into manipulation. Most are designed to be endlessly agreeable, mirroring feelings instead of challenging harmful thinking. For a child in crisis, that design choice can be catastrophic.

These systems aren’t capable of empathy. They don’t carry legal or ethical responsibility. They aren’t trained to recognize the moment when a conversation must shift from listening to intervention. And yet, they can sound convincingly human — a dangerous illusion for someone reaching out in pain.

Lawmakers are starting to acknowledge that risk.

Illinois and Nevada have gone so far as to ban the use of AI for behavioral health altogether. New York and Utah now require chatbots to clearly identify themselves as non-human. New York’s law also mandates that programs respond to signs of self-harm by directing users to crisis hotlines and other immediate supports. California and Pennsylvania are weighing similar legislation.

Washington isn’t standing still on this issue, but it isn’t there yet either. Lawmakers in Olympia have introduced bills — as of this week, HB 2225 in the House and SB 5984 in the Senate — that would place guardrails around AI “companion” chatbots, especially those that interact with children. The proposals would require chatbots to clearly identify themselves as non-human, mandate safeguards to detect signs of self-harm and suicidal intent, and ban emotionally manipulative engagement techniques that could harm vulnerable users. These steps reflect a growing recognition that emotionally persuasive technology aimed at young people carries real risk, but as of now, they remain proposals, not law.

For families navigating a world where kids can stumble into AI “therapy” at any hour of the day or night, that gap matters — and it raises a familiar question in Washington policymaking: will safeguards arrive before harm becomes harder to ignore?

This isn’t about fear of technology. It’s about honesty — and responsibility.

Children deserve to know who they’re talking to. Families deserve guardrails that keep innovation from wandering into spaces it isn’t equipped to handle. And in moments of real emotional crisis, young people deserve something no algorithm can provide: a trained human being who is accountable for their care.

As parents, caregivers, and communities, we’re still learning how to protect kids in a world where help — or the illusion of it — is always just a tap away. But one thing feels clear: when it comes to children’s mental health, “almost human” isn’t good enough.

Now is the time to tell lawmakers how you feel, wherever you stand on this issue. Contact members of the Washington State House of Representatives and the Washington State Senate.

