Senate’s ‘Conditional’ AI Moratorium Is a Trojan Horse

Tucked inside the Senate’s 940-page reconciliation package is a controversial provision that would discourage states from enforcing almost any law related to artificial intelligence or algorithms for 10 years.
The moratorium on state AI laws has undergone multiple revisions to survive the Senate’s arcane rules governing reconciliation legislation. Under the final language, states would be subject to the moratorium if they take even one dollar from a newly created $500 million state AI infrastructure fund as part of the Broadband Equity, Access, and Deployment (BEAD) Program.
Rebranded as a “temporary pause” on state AI laws, the moratorium is a Trojan horse that could tie states’ hands for years and jeopardize their rural broadband funding. Even worse, the new funding pool linked to the moratorium could be leveraged to induce small rural states to roll back restrictions on AI-enabled child sexual exploitation and loosen guardrails on Big Tech.
Proponents stress that states are subject to the moratorium “if and only if” they apply for new AI infrastructure dollars appropriated under (b)(5)(A). But there’s a major catch.
Once a state accepts even one dollar of those funds, the U.S. Department of Commerce would be authorized to claw back any BEAD awards already obligated to the state if it doesn’t comply with the moratorium. That could put at risk much of the $42.5 billion in BEAD grants already pledged to state broadband projects.
As Charlie Bullock with the Institute for Law & AI noted, the new language would “clearly allow for deobligation of all of a state’s BEAD funding rather than just the new $500 million tranche.”
Further, it’s not clear that a state could opt back out if it later decided that the moratorium’s sweeping scope proved too onerous or dangerous to its residents.
Like Hotel California, under the terms of the moratorium-linked BEAD funds, “you can check out but you can’t leave.”
If states were coaxed into taking even one dollar of the newly appropriated pool, they would be bound by the sweeping moratorium for 10 years, likely without the possibility of exit.
That’s because BEAD awards impose contractual requirements on states—limiting their sovereignty. Once funds are obligated, states are bound by whatever strings come with them. What’s more, those terms are not necessarily reversible, even if the funds are repaid.
While federal grants almost always come with strings attached, requirements should directly relate to the specific projects or entities involved in carrying out the programs.
For example, BEAD was notoriously bogged down in unnecessary DEI mandates. But those mandates—as horrendous as they were—did not attempt to impose field pre-emption on states.
Barring states from discharging their essential duty to protect the rights and interests of their citizens—as this measure seeks to do—far exceeds the permissible scope of a federal grant program.
The BEAD-linked moratorium would effectively serve as a one-way ratchet to lock states out of enforcing any laws related to “automated decision systems” or AI for the next decade.
Imposing a sweeping preemption of this kind is a bad idea in general. But it’s especially dangerous for the thousands of women and children increasingly victimized by AI-enabled sexual exploitation and abuse.
The National Center on Sexual Exploitation (NCOSE) noted just days ago that AI “nudifying apps” are being used to virtually undress girls and women, predators are using AI to generate child sexual abuse material (CSAM), and traffickers are leveraging AI tools to “groom and sextort minors at scale.”
States have a critical role in ensuring that AI diffusion is accompanied by appropriate guardrails to prevent these and other abuses. And they are rising to the challenge. According to the anti-trafficking organization Enough Abuse, 37 states have criminalized AI-generated or “modified” CSAM.
But if states become subject to the moratorium, these and other essential safeguards for kids would be wiped away. Millions of children would be placed at risk of sexual exploitation by groomers and pedophiles using AI with bad intent.
At bottom, the BEAD-linked moratorium risks becoming a tool for inducing states to remove guardrails on Big Tech and strip protections against AI-enabled sexual exploitation of women and children.
Congress shouldn’t be in the business of asking states to betray our children for thirty pieces of silver. Or in this case, a few million dollars.
If Congress wants to get AI policy right and protect the most vulnerable, it needs to fundamentally rethink the moratorium.
The post Senate’s ‘Conditional’ AI Moratorium Is a Trojan Horse appeared first on The Daily Signal.