This obscure Civil War-era figure gave us a paradoxical warning. Do we have time to heed it today?

In 1865, the economist William Stanley Jevons looked at the industrializing world and noted a distinct, counterintuitive rhythm to the smoke rising over England. The assumption of the time, a naive view that persists with a certain obstinacy, was that improving the efficiency of coal use would conserve coal. Jevons observed precisely the opposite. As steam engines became more efficient, coal became cheaper to use, and demand for it did not decline; it skyrocketed.
This phenomenon, later called the Jevons Effect, suggests a fundamental truth of economics that we seem determined to forget: When a resource becomes easier and cheaper to consume, and demand for it is elastic, we do not consume less of it. We often consume a great deal more. We find new ways to burn it. We expand the definition of what is possible, not to rest, but to fill the newly available capacity with ever more work.
We are standing at the precipice of another such moment, perhaps the most significant since the steam engine. The age of AI is upon us, bringing with it efficiencies that promise to do for knowledge work what mechanization did for physical labor. The rhetoric surrounding this shift is familiar. We are told that AI will free us from drudgery, that it will automate the contract reviews, the basic coding, the marketing copy, and leave us with a surplus of time.
Lesson learned?
We have heard this song before. In 1930, John Maynard Keynes predicted that by 2030 technological progress would reduce the workweek to 15 hours. He imagined a world in which productivity was so high that we would opt for leisure. As we survey the frenetic landscape of the American workplace in 2026, we can see that he was largely wrong. We did not take our gains in time; we took them in goods and services.
The history of computing serves as a prologue to the current AI moment. When mainframes were scarce and costly, they were tools for the Fortune 500. As costs fell and efficiency rose, through the minicomputer era to the ubiquitous personal computer, we did not declare the problem of computing “solved.” We adopted roughly 100 times more computers with each step change in affordability. The cloud era erased barriers to entry, and suddenly a local shop could access software capabilities that, in the 1970s, were the exclusive province of massive conglomerates.
When high-level programming languages replaced the tedium of low-level coding, programmers did not write less code. They wrote much more, tackling problems that would have been previously deemed infeasible. Today, despite the existence of open-source libraries and cloud platforms that automate vast swaths of development, there are more software engineers than ever before. Efficiency simply allowed software to infiltrate every domain of life.
Now we have LLMs and coding agents. These tools lower the “cost of trying” still further. A task that once required a team — analyzing customer data with advanced models or building a prototype application — can now be attempted by a lone entrepreneur.
From production to orchestration
Consider Boris Cherny, the engineer who created Claude Code and used it to submit 259 pull requests in a single month, altering 78,000 lines of code. Every single line was written by Claude Code. This is not a story of labor reduction; it is a story of a single human scaling his output to match that of a large team. The barrier to initiating a software project or a marketing campaign is falling, and in response, companies are green-lighting projects they would have previously shelved.
The result is not a workforce at rest. Instead, we see a shift in which the human role evolves from producer to orchestrator. We are becoming “gardeners,” cultivating and pruning fleets of AI agents. The span of control for a single worker increases, one person supervising what five or 10 might have done previously, but those displaced workers do not vanish into leisure. They move on to supervise their own agents on different projects, expanding the frontier of what is built. This is the Jevons Effect in full force. The “latent demand” for knowledge work is proving to be enormous.
In the United States, where the cultural ethos tilts toward growth and innovation, this tendency to convert efficiency into more work is acute. Marketing employment, for example, grew fivefold over the last 50 years, precisely during the era when tools like Photoshop and Google Ads made the job in some ways easier. Efficiency turned marketing from a niche activity into a requirement for every business, spawning sub-disciplines that didn’t exist a generation ago.
Why bother?
The danger, of course, lies in the lack of distinction between “can” and “should.” An enduring lesson of the Jevons Effect is that efficiency does not confer wisdom. Technology can tell us how to execute a task faster but cannot say whether the task is worth doing. As roles transition into oversight, reviewing, and coordinating the outputs of AI, we must still ask if those outputs are solving meaningful problems. The crucial factor is human judgment. When more things are possible, the burden falls on us to decide what goals are actually worth the time.
We are not heading toward a 15-hour workweek. We are heading toward a world of expanding projects, in which efficiency lowers the cost of work and raises the amount we choose to do. The coal is cheaper, the fire is hotter, and we are shoveling as fast as we can.