Trump's new AI Action Plan reveals our digital manifest destiny


It arrived in July under the kind of blandly aspirational title that Washington favors for its grand designs: America’s AI Action Plan. The document, which lays out more than 90 federal actions, speaks of securing leadership, winning a global “AI race,” and ushering in a “new golden age.” One imagines the interagency meetings, the careful calibration of phrases meant to signal both urgency and control. It uses the peculiar dialect of American power, a blend of boosterism and threat assessment, and tells a story not just about a technology, but about the country that produced it.
At its heart, the plan is a declaration of faith, a very American conviction that the future, however unnerving, can be engineered. The document is laced with a sort of technological patriotism, the belief that American ingenuity, if properly unleashed and funded, is the presumptive solution to any problem, including the ones it creates itself. The rhetoric is that of a race, a competition we are destined to win. One is meant to be reminded of other races, other moments when the national project was fused with a technological imperative. The Apollo program, with its clean narrative arc of presidential challenge and triumphant splashdown, is the obvious touchstone.
The plan’s talk of a “roadmap to victory” is Kennedy’s moonshot rhetoric retooled for the age of algorithms. But the echoes are older, deeper. They resonate with the hum of the first power lines stretching across the Tennessee Valley, with the clatter of the transcontinental railroad, with the foundational belief in a frontier to be conquered. The AI frontier, the plan suggests, is simply the latest iteration of manifest destiny, a digital territory to be settled and civilized according to American norms.
The plan refracts the national character through policy. There is the profound distrust of centralized control, a legacy of the country’s founding arguments. The strategy frames the government’s role as that of an “enabler,” not a commander. The private sector will “drive AI innovation.” The government will clear the way, removing “red tape and onerous regulation,” while also suggesting that federal funds might flow more freely to states with a more permissive regulatory climate. It is a philosophy of governance as groundskeeping: tend the soil, remove the weeds, and let a thousand private-sector flowers bloom.
This is the American way, a stark contrast to the European impulse to regulate first and ask questions later, or the Chinese model of state-directed, top-down command.
This impulse extends even to the vexing question of truth, a concept that has become distressingly fluid. The plan insists that AI models must be “free from ideological bias.” It directs federal agencies to shun AI systems that engage in social engineering or censorship. One could see this as a noble commitment to objectivity. One could also see it as a maneuver in the country’s raging culture war, embedding a particular vision of neutrality into the machines themselves. The plan calls for scrubbing terms like “misinformation” and “diversity, equity, and inclusion” from official AI risk frameworks, quietly acknowledging that the machines are not just calculating, but inheriting, our arguments.
The concern is palpable: that AI, in its immense power to sort and present information, might become an Orwellian tool. The plan’s promise to avoid that future is an attempt to reassure a public deeply suspicious of the selective amplification or suppression of particular voices.
Beneath the policy directives lies a familiar foundation of steel and concrete, or rather, silicon and fiber optics. The second pillar of the plan, “Build American AI Infrastructure,” is a 21st-century update to the great nation-building projects of the past. Its ambition is breathtaking. To power the immense computational thirst of AI, the plan calls for a wholesale modernization of the energy grid, even urging the revival of nuclear power. It seeks to accelerate the construction of semiconductor fabs and data centers, those anonymous, humming cathedrals of the digital age, by streamlining environmental reviews. The message is clear: The AI revolution will not be stalled by paperwork.
Just as the Industrial Revolution demanded coal and the automotive age demanded highways, the AI age demands an enormous supply of electricity and processing power. And it needs people. The plan recognizes a coming shortage of electricians and HVAC technicians, the blue-collar workforce required to build and maintain the physical shell of this new intelligence. This is a telling detail, a reminder that even the most ethereal technology rests on a bedrock of manual labor.
The final pillar extends this project globally, recasting diplomacy as a form of technological export. The plan advocates for a “full AI technology stack” to be pushed to allies, a Marshall Plan for the digital age. By exporting American hardware, software, and standards, the U.S. aims to create an ecosystem, a sphere of influence. The logic is one of interdependence: Nations running on American AI will be more amenable to American norms. This is techno-diplomacy, a great-power competition played out in server farms and source code. It is a strategy of pre-emption, an attempt to ensure the world’s operating system is written in a familiar language, before a rival power can install its own. The plan is a testament to the enduring belief that American leadership is allied with American technology, that to export one is to secure the other.
It is a vision of a world made predictable through the careful management of a powerful new tool. And it is a wager, a very American wager, that we can shape our tools before they shape us.