What "Battlestar Galactica" Teaches Us About AI in Education
We built an AI backdoor into classrooms with 30 years of ed-tech hype and handed every student the keys.
A German instructor at Cornell University recently required students to use manual typewriters in class. The students — with smartphones in their pockets and ChatGPT at home — struggled with pinky strength and the absence of a delete key. Yet they completed their exam entirely without assistance.
The solution to AI cheating, it turned out, was 1950s office equipment.
And here’s the thing: it’s working. One student reported being “forced to actually think about the problem on my own” — as if this were a revelation. Students collaborated more. They slowed down. Without the dopamine drip of notifications, they noticed each other.
This is where we are now: professors buying typewriters from thrift stores while administrators insist AI will “personalize learning.” We spent two decades shoving technology into classrooms — and the solution to the problems that technology created is to remove it entirely.
The analog comeback isn’t limited to higher education either. In December 2025, McPherson Middle School in Kansas took back all 480 student Chromebooks. Principal Inge Esping had already banned cellphones, but students still found ways to be distracted — YouTube, video games, bullying through school Gmail. “This technology can be a tool,” she told the New York Times. “It is not the answer to education.”
The pattern is national. Cornell’s biomedical engineering program requires “oral defenses” — students must explain their work face-to-face. UPenn pairs oral exams with written papers. NYU uses AI voice agents to conduct remote oral exams, “fighting fire with fire.” Faculty no longer trust written assignments to demonstrate actual thinking. We built educational systems that made thinking optional — and now we’re surprised when students opt out.
Twenty-five years ago, a teacher applying for jobs met interviewers disappointed she didn’t tout her PowerPoint skills. The business world was “light years ahead.” Schools needed to “catch up.”
Teachers understandably pushed back against this obvious false equivalence. Classrooms aren’t workplaces, or at least not exclusively. They asked whether every classroom needed tablets, whether every assignment needed to be digital, whether “blended learning” was anything more than a buzzword. They were called “afraid of change,” “out of touch,” “resisting the future.”
A generation of students later, we have our answer: the skeptics were right. The technology that was supposed to enhance education instead gave students permission to skip the hard work of thinking. The “personalized learning” pitch became a two-tier system where “haves” get human teachers and “have-nots” get AI proxies.
In Battlestar Galactica, humanity created the Cylons — intelligent machines that rebelled. When they finally attacked, they exploited a backdoor in the Command Navigation Program, a software update installed across the entire Colonial Fleet. Because every ship was on the network, every modern fighter and battlestar went dark instantly. Only ships with primitive avionics survived. To keep it from ever happening again, humanity banned networked computers entirely.
We spent 20 years “networking” education. Every student got a Chromebook. Every assignment went through Canvas or Google Classroom. Every lesson plan was supposed to be “enhanced” by technology. When AI arrived, it didn’t need a backdoor. We built the backdoor ourselves — and handed every student the keys.
Faculty no longer trust written assignments because they can’t tell whether students did the thinking. But more fundamentally: students are losing the experience of thinking itself — the struggle, the false starts, the revision process that builds actual understanding. A student at Cornell noted being “forced to actually think about the problem on my own” — a revelation that should worry anyone who teaches.
But there’s another loss: physical stamina. Professors at Northwestern University have reverted to handwritten blue book exams, but students who never write by hand don’t have the endurance to do it well. We outsourced thinking and atrophied the muscles — literal and cognitive — required to perform it.
The irony is thick: we built an educational system that made thinking optional, then expressed shock when students struggled to do it on command. The typewriter exercise didn’t teach German; it taught students what their own minds felt like without a machine doing the work.
AI was supposed to “personalize” learning. Every student would get individualized instruction, adaptive curricula, infinite patience. The sales pitch wrote itself: technology would democratize the one-on-one tutor that only wealthy families could afford.
But Allison Pugh’s research shows that “connective labor” — the human element of teaching — degrades when technology gets between teacher and student. The result isn’t personalized learning; it’s a two-tier system. Students with resources get human teachers who know them. Students without get AI proxies that process them.
And the “personalization” pitch ignores the environmental costs: water consumption, energy use, emissions, e-waste. Teaching students to use AI “ethically” asks them to ignore that the technology itself has ethical costs built into its infrastructure.
ChatGPT will write your lesson plans, grade your papers, give feedback to students. The promise was that AI would free teachers from drudgery and let them focus on “what matters.”
A METR study found developers using AI tools took 19% MORE time to complete their work, not less. Learning Management Systems — sold as time-savers — added layers to tasks instead of reducing them. The technology that was supposed to streamline education instead created more administrative overhead.
But there’s a deeper cost. Teachers who skip the mental struggle of lesson planning don’t develop the skills they need to adapt, improvise, respond to students in real time. The time saved isn’t saved — it’s borrowed from the development of actual expertise.
The argument against analog measures is predictable: students are already using AI. It’s here to stay. We have to teach them to use it ethically. The only way forward is adaptation.
We heard the same argument about cellphones. We were told it was our job to teach “responsible phone use.” A decade later, cellphone bans are sweeping the nation — because “teaching responsible use” didn’t work. Sometimes the solution is not to adapt to the problem. You collect the Chromebooks and bring out the typewriters.
Sometimes you look students in the eye and ask them to explain what they wrote. As one Cornell student put it: “It’s a lot harder to look people in the eyes and say out loud, ‘I don’t know this.’”
Catherine Mong, a freshman who struggled with the typewriter because of a broken wrist, didn’t complain. She told the AP she’d probably hang one on her wall. She told all her friends about the experience. “I did a German test on a typewriter!” — as if this were a discovery worth sharing.
When students say they were “forced to actually think,” they’re revealing what the technology had cost them — not just knowledge, but the experience of thinking itself. The slow, difficult, unglamorous work of building an idea from scratch. The thing we used to call education.
We spent 20 years insisting technology was the answer. Now professors are buying typewriters from thrift shops, middle schools are collecting Chromebooks, and students are discovering what “return” means. Sometimes the machines that can’t think are the ones that let humans remember how.



