BILL ALERT: HB 2225 Builds the On-Ramps to Digital ID Through AI Regulation
Washington’s HB 2225 and its companion bill, SB 5984, target “AI companion chatbots”: AI systems designed to simulate ongoing, emotionally responsive relationships with users. Supporters frame the legislation as a child-safety and suicide-prevention measure. However, the structure of the bill raises serious concerns about privacy, parental authority, and unchecked government power. This is a pattern we are seeing across the country as lawmakers attempt to regulate technology that has already outpaced them.

Register your CON (opposed) position for the legislative record here: CSI

HB 2225, legislation requested by Governor Bob Ferguson, imposes legal obligations based on a user’s age and mental state, while offering no realistic way for platforms to determine either with certainty. To avoid liability, companies are pushed toward collecting more personal data, monitoring conversations more aggressively, and retaining records to prove compliance. That surveillance isn’t explicitly mandated, but it is strongly incentivized.

HB 2225 also requires AI chatbot operators to develop and publicly disclose protocols for responding to self-harm and suicidal ideation. In practice, that means AI systems must continuously analyze private conversations for mental-health signals and intervene when certain language appears. This effectively normalizes psychological monitoring inside tools marketed as private, supportive, and non-judgmental.

The bill’s language is notably vague. Terms like “emotional reliance,” “reasonable safeguards,” and “prolonging engagement” are undefined and subjective. That vagueness doesn’t limit state power; it actually expands it. Enforcement is routed through Washington’s Consumer Protection Act, giving the Attorney General broad discretion to investigate, penalize, and shut down companies after the fact, often in response to complaints or political pressure.


While HB 2225 does not explicitly require digital ID or age verification, it quietly creates demand for both. When the state holds companies liable for failing to protect minors but provides no reliable way to identify minors, the inevitable “solution” is stronger identity verification. That is how digital ID systems are typically introduced: not by mandate, but as a compliance fix to an impossible standard.

HB 2225 isn’t just about AI chatbots. It’s about whether lawmakers can regulate subjective human states—age, vulnerability, emotional distress—and then delegate enforcement to platforms while expanding state power through vague rules and selective enforcement. That tradeoff deserves far more scrutiny than this bill has received.

Comparing HB 2225 to Congress’ GUARD Act: Different Roads, Same Destination

The GUARD Act, sponsored by Senator Josh Hawley (R-MO), takes a blunt federal approach: banning minors from AI companion chatbots outright and requiring government ID or biometric verification for everyone. HB 2225 takes a softer, state-level approach: imposing liability tied to age and mental state, enforcing it through the Attorney General, and leaving platforms to figure out how to comply.

But the destination is strikingly similar.

Both approaches undermine parental authority, treat access to modern tools as inherently suspect, and push companies toward more data collection and identity verification. The difference is that the GUARD Act announces the surveillance upfront, while HB 2225 builds the on-ramps quietly, through vague standards, liability pressure, and enforcement discretion.

In practice, HB 2225 may prove more durable. By avoiding explicit ID mandates, it sidesteps immediate constitutional challenges while still nudging platforms toward the same infrastructure: monitoring, profiling, and eventually identity verification in the name of “safety.”

HB 2225 should not be viewed in isolation. It is part of a growing wave of state and federal legislation attempting to regulate emerging technologies by shifting responsibility from parents and individuals to platforms and the government. Whether it’s Washington’s approach through liability and Attorney General enforcement, or Congress’s more explicit push toward identity-based access controls, the pattern is the same: vague safety standards, expanding regulatory authority, and increasing pressure for surveillance and verification as the default solution.

These proposals raise serious questions about privacy, parental rights, and the long-term consequences of treating access to modern tools as something to be permissioned by the state. As similar bills move across the country and at the federal level, we will be tracking them closely, providing updates as they advance through the legislative process, and alerting you to opportunities to speak up and make your voices heard. Because once these systems are built, they are rarely rolled back.


Support Our Work

If this bill concerns you, you’re not alone. Our team works every day to track legislation, break down complex policy, and alert citizens before it’s too late to act. This kind of work takes time, research, and resources—and we rely on supporters like you to keep it going.

If you value having a watchdog in Olympia who will actually read the bills, expose the risks, and equip citizens with the truth, please consider making a donation today. Every contribution, large or small, helps us continue bringing truth and transparency to Washington citizens.

Stand with us.

Conservative Ladies of Washington
