Weaponizing Kindness: When “Protect the Kids” Laws Expand Arizona's Government Control
The question facing lawmakers is not whether children deserve protection, but how to provide it without expanding surveillance, data collection, or government authority.
“We have spent, you have spent, so much time telling people how horrible government K-12 is. Now you’re asking that same government to help you verify your children’s age before they can get on. That mentality of going to government to pass bills to do what you should be doing as parents is a lot of how we got here.”
-Alvin Lui | Courage is a Habit
It’s a beautiful Saturday in Arizona, the kind of day that reminds you why people love living here. But as the sunshine fills the desert, the growing hustle and bustle inside the Capitol signals the legislative session is racing toward its final stretch.
In just a few short weeks, a handful of privacy bills could land on the desk of Governor Katie Hobbs for signature. Many of them sound reassuring on paper: proposals framed around protecting children and keeping families safe online. Those are goals most people would instinctively support. But sometimes the policies wrapped in the language of safety come with trade-offs deserving a closer look.
I agree with the principle behind many of these bills… but the devil is in the details of data retention, data sharing, and Fourth Amendment protections.
As I was thinking about how to approach this topic, I came across a post this morning from Alvin Lui of Courage Is a Habit. In a video he admits he’d rather not have made, Lui explains that speaking against popular ideas, especially those framed around protecting kids, can be uncomfortable.
I understand that feeling. On the surface, proposals like age verification for social media seem reasonable. As Lui put it, “of course, that sounds like a pretty good idea.” But he argues that the way these policies are often presented can amount to a kind of bait-and-switch, one that shifts responsibility away from families and toward government mandates.
“We already have age verification. And it goes something like this.
Son goes up to father. ‘Dad, I would like a cell phone. I would like to get on TikTok.’
Dad goes, ‘No, son. You’re 11 years old.’
There you go.”
Lui argues that parenting, not government mandates, should be the first line of defense in the digital age.
One of the first proposals worth examining is a bill aimed at addressing harmful online interactions involving minors.
Arizona House Bill 2665 — “Cade’s Law: If You See Something, Say Something”
HB 2665 would expand Arizona’s manslaughter statute to include certain online communications directed at minors. Under the bill, anyone 18 or older could face manslaughter charges for intentionally sending a direct communication — verbally, in writing, or electronically — encouraging or advising a minor to die by suicide, if the minor later uses that communication to take their own life.
The bill defines “direct communication” as messages specifically addressed to the minor through platforms such as social media posts, text messages, or other online communications. General discussions about suicide or mental health that are not specifically directed at the minor would not be included under the law.
Supporters say the measure is intended to ensure that adults who intentionally exploit vulnerable minors online are held accountable for the consequences of those actions… a goal few would disagree with.
HB 2665 was heard on the House floor for Third Reading on February 25th. The family of 16-year-old Cade Keller, who died by suicide in March 2022, was present in the gallery during the debate.
Rep. Nancy Gutierrez (D-LD18) voted no, saying she agrees there should be consequences for adults who encourage suicide but expressed concern that the law could result in a 15-year-old being prosecuted as an adult for manslaughter.
Rep. Alex Kolodin (R-LD3) voted yes, explaining that the bill does not create a new crime but clarifies how existing law applies. “It is the underlying law that criminalizes the urging of a young person to take their own life. Not this bill,” Kolodin said. “What this bill does is clarify that that underlying law applies even if the communication was electronic in nature.”
Rep. Jeff Weninger (R-LD13) also spoke in favor of the bill, noting that he sponsored the original underlying law. Weninger referenced a 16-year-old Chandler High School valedictorian who died by suicide in 2019. “It was a lot of the same thing,” he said, describing how the student had reportedly been encouraged online by an adult to take his life.
Rep. Anna Abeytia (D-LD24) voted no, sharing her own experience with mental health struggles. She said she was concerned about criminalizing conversations surrounding suicide and mental health, particularly when young adults may still be navigating those issues themselves. “Kids could be going to their peers in high school. Yes, they are legally adults, but do they understand the capacity of that?”
Rep. Kevin Volk (D-LD17) supported the bill but acknowledged concerns raised during the debate, noting that lawmakers want to avoid situations where peer-to-peer conversations among young people could be unintentionally swept into criminal liability.
The bill ultimately passed the House Third Reading on a vote of 45–9–6.
For clarity, I asked child-privacy advocate Julie Barrett, founder of Conservative Ladies of America, to review the bill. Barrett says HB 2665 raises concerns about how broadly the law could be interpreted.
Barrett is actively involved in the national debate over age-verification legislation, traveling to state capitols to testify before lawmakers and speak at committee hearings about the potential privacy, parental-rights, and data-collection concerns raised by these proposals.
In an exclusive blog post, Barrett explains that the proposal creates a new manslaughter offense if an adult intentionally sends a “directed communication” encouraging a minor to die by suicide, knowing the minor intends to do so.
But Barrett argues the definition of “directed communication” could pull in posts, tags, or messages that are later interpreted as targeting a minor, raising questions about how the law might be applied in practice.
“Digital communication is messy. Tone is misread. Sarcasm is taken literally. Posts are screenshotted, shared, and stripped of context. A comment meant for a general audience can be reframed as targeted. And when the stakes involve a child in crisis, the pressure to assign blame is immense.
The bill requires intent and knowledge, which is a meaningful guardrail. But even with those standards, the ambiguity around what counts as “encouragement” or “advice” opens the door to subjective interpretation.”
Barrett explains the risk is selective enforcement, where the law could become a tool in personal, political, or ideological conflicts — including custody disputes, online conflicts between adults and teens, ideological targeting, or pressure on platforms to over-moderate content.
HB 2665 is scheduled for a hearing in the Senate Committee on Judiciary and Elections on Wednesday, March 18, 2026.
Arizona House Bill 2311 — Children and Conversational AI
HB 2311 proposes new requirements for online platforms that host certain types of artificial-intelligence-generated content. The bill would require platforms to implement age-verification systems designed to prevent minors from accessing specified AI services. In practice, this could require users to continuously verify their age before accessing certain platforms or tools.
The bill is scheduled to receive a hearing on Tuesday, March 17th before the Arizona Senate Committee on Appropriations, Transportation and Technology.
Supporters argue the measure is intended to protect minors from harmful or inappropriate AI-generated content. Technology giant Google testified in support of the proposal during an Arizona legislative hearing.
Critics, however, argue the bill raises significant privacy concerns. Barrett described HB 2311 as an “AI Child Safety Bill” that could effectively require digital identification for all users in order to verify age.
During House debate, Rep. Alex Kolodin urged the Senate to reevaluate the legislation, arguing the proposal contains no explicit provisions addressing data privacy, limits on data collection, retention standards, biometric identification safeguards, or restrictions on third-party data sharing.
In reviewing the bill, Barrett points to what she sees as practical and structural concerns with how age-verification systems would operate.
On my Jen & Friends podcast, she argues that proposals requiring digital age verification often depend on linking a parent’s account or identity to a child’s device or account. According to Barrett, the legislation does not clearly specify how those parent-child relationships would be verified, raising questions about how platforms would confirm that the adult providing authorization is actually the child’s parent or legal guardian.
She also notes potential complications for families where parents and children do not share the same last name, arguing that the mechanics of account linking and verification remain largely undefined in the legislation.
Arizona House Bill 2920 — App Store Accountability
HB 2920 would require online service providers to request and verify the age category of anyone creating an account. If the user is determined to be a minor, the bill instructs providers to require the minor’s account to be linked to a parent account. Similar proposals in other states are often referred to as the “App Store Accountability Act.” The bill has not seen movement since January and currently has no upcoming hearings scheduled. But we’re keeping a close eye on it.
Arizona House Bill 2133 — “Protect Act”
While some states, including Arizona, require age verification for viewers accessing adult websites, federal law focuses only on verifying the age of performers appearing in sexually explicit content.
HB 2133 would require commercial websites that publish or distribute sexual material to verify the age and consent of individuals depicted in that material. The state bill requires anyone who uploads such material to confirm that the person depicted was at least 18 years old at the time the material was created and that consent verification methods were used. Platforms would be required to maintain verification records for at least seven years. Entities that fail to obtain verification could face civil liability for damages.
Supporters say the bill is intended to prevent the distribution of non-consensual or underage sexual content. During committee discussion, however, lawmakers and industry observers raised questions about how the required records would be stored, shared, and accessed.
Senator Analise Ortiz (D-LD24) asked bill sponsor Representative Nick Kupper (R-LD25) who would ultimately hold the verification database.
Rep. Kupper responded that the system would follow existing models used in the adult content industry under federal law, where companies maintain their own records.
Journalist and documentary filmmaker Mike Stabile of the Free Speech Coalition also raised concerns about the record-keeping provisions.
“Our concern here is the record keeping... we need to share those records with state and local governments. If there is an investigation, as to is this person under age or was this done without consent, we have to share that information. We cannot delete it. We also have to share it with banks and credit card processors. Wanna make sure that illegal content isn’t being sold on our sites.”
Mike Stabile | Director of Public Policy | Free Speech Coalition
While supporting age and consent verification requirements, he warned that the bill’s mandate to retain records and share them during investigations could create legal complications. HB 2133 states that, upon request, the Attorney General may inspect the records, and the law would allow the Attorney General or individuals depicted in material without consent to bring civil actions for enforcement. Stabile noted that accessing records without a warrant could raise potential Fourth Amendment questions in criminal investigations.
Federal law already requires adult content producers to verify performers’ ages under 18 U.S.C. §2257, but HB 2133 would create a separate state-level system requiring verification records to be kept for seven years and made available for inspection by the Arizona Attorney General.
The bill passed committee on a 4–3 party-line vote. We’ll watch for future hearings.
Speaking of the Fourth Amendment…
As debates around surveillance and digital privacy continue, the discussion is not limited to social media or online platforms. Questions about constitutional protections are also surfacing in legislation involving automated license plate readers.
This week, the Arizona Police Association issued a public statement regarding Arizona Senate Bill 1111, which addresses the use of automatic license plate readers and traffic camera data. In the statement, the association described the bill as an important step toward addressing both privacy concerns raised by citizens and operational concerns raised by law enforcement agencies.
The group acknowledged that license plate reader technology can be a valuable investigative tool but stated that lawmakers have not yet found the proper balance between protecting the public’s privacy and establishing clear guardrails for law enforcement’s use of traffic camera systems.
SB 1111 is not currently scheduled for its next hearing.
Jen’s Two Cents
Many of the age-verification bills moving through the Arizona Legislature are rooted in real tragedies and legitimate concerns.
Protecting children from harm, exploitation, and manipulation online is a goal most people share.
But laws written in response to emotional and urgent problems can sometimes carry consequences that extend far beyond their original intent.
Earlier today, Alvin Lui urged parents to reconsider the growing push for government-mandated solutions to online safety.
“Stop asking government and big tech to do what you’re supposed to do,” he said, arguing that many parents have spent years criticizing government-run K-12 systems while now asking that same government to verify their children’s ages online.
Lui says the same reasoning is often used by schools to justify social and emotional learning programs.
Is it even possible to protect children without creating systems that expand surveillance, data collection, or government authority?
The real challenge isn’t whether children deserve protection. It’s ensuring the solutions we create today don’t reshape privacy and personal freedom for everyone tomorrow.