Future Fiction: When AI tutors adapt like teammates, not textbooks

Published: September 2, 2025
Category: Essays | News

This is part of a new ICT series projecting into the future via fictional narratives. This story is rooted in the very real PAL3 initiative—USC ICT’s Personal Assistant for Life-Long Learning. Led by Dr. Benjamin Nye, Director of the Learning Sciences Lab, and Dr. William Swartout, Chief Science Officer, PAL3 transforms mobile learning into something adaptive, personalized, and resilient. This tale draws on that research to imagine what tomorrow’s training could look like. Learn more about the project here: PAL3: Personal Assistant for Life-Long Learning.

Jared hadn’t enlisted for the PowerPoint slides.

He’d seen them already—during basic, at orientation, in briefings so soporific they might’ve been designed to test endurance. So when he was told to report to Army University for “AI Upskilling,” his expectations were grim: dim lights, passive instruction, and instructors who spoke in acronyms and directives.

Instead, the sergeant handed him a QR code and said, “Download PAL3. You’ll need it.”

Jared complied, out of habit more than curiosity. But as the app loaded on his phone, he noticed something odd: it didn’t open with a list of static lessons or videos. It asked him questions—about his past assignments, technical strengths, areas of hesitation, career goals. It responded in real time. The tone was clipped, professional, but unmistakably adaptive.

The next morning, the classroom surprised him again. No front-facing desks. No central projector. Groups of soldiers worked in clusters, heads bent over devices, debating scenarios, troubleshooting simulations. The room felt alive—cooperative rather than hierarchical.

He took a seat. PAL3 booted up automatically, welcoming him back and recommending a set of modules tailored to his background: signal operations, procedural AI, and human-machine teaming.

He was skeptical. Then he opened the first scenario.


It wasn’t a test. It was a dilemma.

A convoy in unfamiliar terrain had lost contact with its forward element. An unmanned aerial system had identified anomalies—patterns in heat signatures that might indicate concealed IEDs. The question: could the AI’s risk assessment be trusted enough to halt the mission?

PAL3 walked him through it—not by explaining doctrine, but by forcing engagement. He had to interpret data outputs, question AI model assumptions, and reconcile recommendations with real-world uncertainty. The system didn’t let him advance with shallow answers. It prompted him to examine his own reasoning, offering counterfactuals and historical parallels.

By the end of the session, he wasn’t just reviewing tactical responses—he was analyzing the architecture behind the AI’s decisions. He saw how sensor noise affected output reliability, how model retraining was necessary for novel environments, and how failure to understand system limitations could end lives.

Jared closed the session with a note of surprise: he wasn’t bored. He was alert. Not because he feared failure, but because for the first time, training felt consequential.


PAL3 tracked his progress—not just in terms of content mastered, but in rate of learning, attention decay, and forgotten concepts. When he revisited a module days later, it adjusted to account for skill fade, calibrating questions to reinforce fragile knowledge.

There were no lectures. Instead, the app integrated short guides, targeted quizzes, dialogue-based tutors, and problem simulations. PAL3 adapted these in real time based on Jared’s behavior, identifying which learning strategies worked best for him. He could review procedures offline, during downtime, in the barracks, or in transit. The entire platform was built for mobility and resilience.

Outside the classroom, he started hearing others talk about it—not as a requirement, but as a resource. Soldiers on different tracks compared modules, shared progress on leaderboards, and even challenged each other in scenario walkthroughs. The tone was competitive, but collegial. The training had achieved something rare: voluntary engagement.


Deployment came quickly.

His unit was sent to support humanitarian operations in Japan following a major typhoon. Flooded infrastructure, overwhelmed local services, and unstable conditions made for a complex operating environment. Communications were intermittent. Jared’s team was responsible for re-establishing signal lines and coordinating with allied personnel across scattered relief sites.

In the third week, a critical relay failed in a low-access zone. A signal technician was injured during recovery, and Jared found himself as the most technically proficient person on site. The system was unfamiliar, improvised—a legacy configuration he hadn’t trained on. He had minutes to restore comms, or risk a gap in coordination that could jeopardize evacuation routes.

He opened PAL3.

It launched instantly, even without a data connection. The app assessed the situation based on his last updates, reviewed his known strengths, and retrieved a streamlined diagnostic procedure for the exact system. It didn’t overwhelm him with every manual. It curated. It highlighted failure modes he was least familiar with. It reminded him, precisely, of what he had started to forget.

He worked steadily. Each step confirmed, each anomaly flagged. When he hit a configuration he hadn’t seen before, PAL3 simulated possible interpretations based on partial schematics and prompted verification. Within 12 minutes, the line was restored.

The evac proceeded.

Later, when he sat in the transport with his boots off and adrenaline fading, Jared realized something: it hadn’t just been a technical save. PAL3 had made the moment survivable by making it navigable. And it had done so not with a monologue or a manual, but with a memory—his own, extended and restored.


Months later, back on base, Jared reviewed his PAL3 dashboard. The app had recorded not just completions, but interactions—decisions made under pressure, modules consulted voluntarily, reflections noted. It offered him new learning pathways based on that experience, including cross-training in resilience leadership and AI-aided logistics.

What surprised him most was the quiet accuracy of its trajectory mapping. PAL3 understood not just where he was, but where he could go. It wasn’t replacing a commanding officer or a career counselor. But it was becoming something else: a guide that remembered more than he could, that never got tired, that adapted even when he wasn’t asking it to.

He submitted a request for additional AI systems coursework, this time through a PAL3 module slated for public release under AI-UP. He also signed up to mentor incoming trainees—offering to annotate scenarios with real-world insights, to build the next layer of adaptive learning for others.


When a new cohort arrived—fresh uniforms, fresh skepticism—he watched them hesitate at the classroom entrance, expecting screens and speeches. Instead, they found clusters of activity, devices already booted up, and instructors moving not to command, but to listen.

Jared didn’t make a speech. He simply said: “Bring your questions to PAL3. Then bring your questions back to us.”

And with that, he handed them the room, and let the next cycle begin.
