My school’s AI challenge raised a scary question: What do students need me for?

Feb 19, 2026 - 08:28
I might have talked myself out of a job this week. I teach philosophy at Arizona State University, and the university wants to position itself as a leader in the AI revolution. I remain skeptical about AI’s ability to replace a humanities professor. Because of that skepticism, I signed up for what ASU called its AI Challenge.

My project involved what I called the “AI Dialogues.” I used ASU’s version of ChatGPT to hold Socratic-style dialogues, prompting Chat to reply as a given philosopher. I conducted dialogues with Chat as Aristotle, Hume, Marx, and even Lucifer. My students evaluated these exchanges to see how well Chat performed.

We can avoid the toil of learning to be wise — but we cannot avoid the need for it.

Chat could draw on public information and represent each thinker with reasonable accuracy. It also showed another trait: It wanted to please, often leaning toward whatever it believed I wanted from the debate.

How does that work me out of a job? ASU now provides an AI that professors can customize for individual courses by uploading syllabi and course materials. Students can ask basic questions and receive answers that save me from writing emails that begin with, “Did you read the syllabus?” They can also ask what we covered in class and get quick explanations of key concepts and questions.

When I told my students about this feature, I asked them what they need me for at this point. I was joking — a little.

My classes depend on Socratic discussion. It is conceivable that ASU could project a realistic AI image of me at the front of the classroom and have it ask and answer questions with students. Maybe the only remaining edge is the “personal touch” of a real professor in the room. Even that could vanish if tuition becomes tiered: Students might pay less for “AI Anderson Socrates” than for the in-person version. Add one of Elon Musk’s Optimus robots made to look like Anderson, and I’m in trouble.

A new myth dies

Musk has been talking for months about how the AI revolution is upending the myth we have told for six decades about university education. The myth, he says, promised an escape from toil. Students were told a degree was the path to an air-conditioned job that avoids heavy lifting and involves spreadsheets.

But spreadsheets are exactly what AI does better than humans. The new John Henry isn’t competing to pound railroad spikes; he’s competing to calculate data. No human can keep up with a microprocessor.

In Musk’s view, jobs that involve toil become the “safe” jobs, while many degree-based jobs disappear — replaced by technicians who keep AI running while it calculates taxes, diagnoses medical problems, and writes legal paperwork. The university-educated track no longer looks like the safe route. Universities now compete not just with fewer students due to demographic decline, but with an increasingly outdated product that students may stop buying.

Toil may not stay safe

The problem is worse than Musk lets on. The first jobs on the chopping block might be “numbers jobs,” but Musk has also said he plans to produce 100 million Optimus robots in 10 years. If so, even many physical jobs may not remain protected from automation.

One version of this future says we enter a utopia: Food is plentiful, toil disappears, and we cash our basic income checks — though an AI could do even that for us. We end up living in “Wall-E.”

The more dystopian version looks like sci-fi depictions of AI overlords controlling humans as property — “The Matrix.” Or worse: Like Ultron, super-AI robots decide we must be exterminated to save us from ourselves and protect the planet. We build our own worst enemy.

Whichever future arrives, Musk may have highlighted something about human nature. We avoid suffering, toil included. We build machines to spare ourselves toil. And yet we uniquely need it.

God introduced toil in the Garden of Eden after Adam sinned. Because of sin, we could no longer live in a paradise without toil. We must suffer and strive for our daily bread. History has been divided ever since between those who try to avoid suffering altogether and those who see suffering as a call to repent before God. AI is only the newest version of the philosopher’s stone.

AI as ‘philosopher’

Can I really be replaced by an AI philosophy instructor? I’m not worried.

What AI cannot do, in its counterfeit attempt to replace humans, is serve as an example of how to suffer well to attain wisdom. The Hebrew definition of wisdom is “skillful living.” Being told, “Here is an AI that can simulate skillful living,” is not the same as learning from a human who is actually skillful.

Students will still need to learn how to be wise themselves. A human professor who has actually done this will remain the gold standard that AI can only imitate. We can avoid the toil of learning to be wise — but we cannot avoid the need for it.
