Over the last 18 months, I’ve been spending multiple hours a day immersed in generative AI tools, exploring their capabilities, limitations and potential impact. In my role at the University of Michigan, I’m continually thinking about how these and other tools can and should shape our colleagues’ work and the communities we serve. But perhaps more profoundly, this sustained exploration has led me to reflect on what it truly means to be human—at work, in relationships and throughout life.
As we embrace these technologies, we must also consider the experiences we need to discover and maintain our connections—and our humanity. In a world increasingly shaped by AI, I find myself asking: What are the experiences that define us, and how do they influence the relationships we build, both professionally and personally?
The concept of “off-loading” has become central to my thinking. In simple terms, off-loading is the act of delegating to AI tasks that we would otherwise do ourselves. As AI systems advance, we’re increasingly confronted with a question: Which tasks should we off-load to AI? And as we delegate, we also face the possibility of what some call delegation remorse—the regret that comes from realizing we’ve let go of something essential. In a world that feels as though it’s moving at an increasingly unsteady pace, the allure of a quick fix to capture our most precious resource—time—is undeniably intoxicating. But in seeking that quick fix, are we trading away something far more valuable?
I love movies. Whether Oscar-worthy or barely tolerable, there’s nothing quite like getting lost in a story and finding ways to layer it back onto your own life.
Consider the rarely bundled quartet of It’s a Wonderful Life, The Family Man, Groundhog Day and Click. Each film presents a protagonist with a magical shortcut—a chance to bypass life’s challenges and fast-track their way to a better future, whether in Bedford Falls or Punxsutawney. From George Bailey’s glimpse into a world without him to Phil Connors’s seemingly endless loop, these characters are confronted with paths that let them skip the painful, mundane parts of life—only to find that in bypassing struggle, they miss out on something profoundly important.
In real life, we’re rarely given such blatant choices, but with AI, we might find ourselves unwittingly fast-tracking through experiences that, while uncomfortable, are essential to our growth. These stories remind us that skipping life’s struggles often comes at the expense of what makes us human. As we face the prospect of generative AI in our own lives, we must ask: Does extreme off-loading elevate us by granting freedom, or does it risk eroding the very experiences that shape our humanity? In this era of unprecedented possibility, where will we draw the line between convenience and connection?
Lights, camera, activate off-loading!
The Promise of Off-Loading
There is no doubt that off-loading to AI has significant positive implications. Imagine members of a university community—faculty, staff and administrators—leveraging AI to automate administrative tasks, freeing them to focus on student engagement, research, strategic planning or creative endeavors. We’re already seeing parallels to the so-called 10X engineer, but in many roles: Educators, researchers and support staff who harness AI can amplify their work, going both broad and deep.
In higher education, off-loading can also facilitate an interdisciplinary approach. With AI handling complex data analysis or repetitive administrative responsibilities, a researcher might expand into unfamiliar fields, a staff member might optimize support services or an administrator might explore new strategic initiatives. This capacity to transcend traditional role boundaries suggests that we can move beyond the false dichotomy of generalist versus specialist. Those who master AI tools can be both, deepening their expertise while broadening their reach across disciplines and roles.
Beyond this, we’re already seeing how AI-driven off-loading can streamline curriculum development, enhance personalized learning experiences, provide real-time insights into student progress and even open up opportunities for global collaboration by breaking down language and logistical barriers. And, as has already become cliché, this is the worst AI will ever be. So, what’s next? As we continue integrating AI, will our evolving roles bring us closer to our colleagues and communities, or will they create an unfamiliar distance?
The Perils of Off-Loading
But I’m worried about a darker side, too. In our relentless quest to off-load, often motivated by the pressures inherent to work and life, we might instinctively pass off the tasks we find tedious or uncomfortable—tasks that are often crucial to our growth and our connections. For example, if university staff members begin to delegate key aspects of student support or advising to AI, they may miss the subtle cues that reveal deeper needs, the kind of insights that build genuine understanding. Similarly, if faculty off-load all grading and feedback, they could overlook nuances in student responses that lead to more personalized teaching. These moments, which might seem routine, are opportunities to foster empathy, insight and a richer connection to our work and the people we interact with.
Moreover, off-loading could reshape our roles—and our relationships—in ways we don’t yet fully understand. In a university setting, all members of the community learn through experience, grappling with challenges that build the skills we pass on to others and forming bonds along the way. Off-loading might disrupt this cycle: If we haven’t engaged in these fundamental experiences ourselves, can we truly understand the journeys of those who follow or offer them the support they need? If we off-load these seemingly minor tasks, are we also off-loading the empathy that comes from shared challenges and the insights we gain from direct engagement?
In the long term, this shift could weaken essential mentorship and support structures across academia, ultimately impacting the depth and quality of our connections, both in the professional sphere and beyond. Off-loading can free us, allowing us to pursue new opportunities, but it also risks isolating us from the experiences that build resilience and perspective. As professionals and as individuals, where will we draw the line?
Reflecting on the Trade-Offs
The implications of off-loading will resonate throughout our learning journey, no matter where we stand on that path. Higher education professionals, learners and leaders alike will need to weigh the benefits of productivity against the risk of becoming disconnected from the meaningful, experiential aspects of work. The rarely bundled quartet of Stewart, Cage, Murray and Sandler reminds us that while skipping the hard parts can bring temporary relief, it often leads to a deeper sense of loss.
I know I made decisions differently at age 15, 25, 35 and 45—choices shaped by the sum of my experiences, both rewarding and challenging, energizing and mundane. These experiences informed my understanding of the world, and without them, I wouldn’t be who I am today. If we lean into extreme off-loading, will we really be equipped to make better choices? At what point in a partnership with AI, whether subtly enhanced or unrecognizably transformed, are we even making choices at all? As AI becomes more capable, will we retain the agency to shape our journeys, or will our choices become reflections of algorithms rather than authentic expressions of ourselves?
As we adopt these tools, let’s pause to ask: What are we gaining, and what might we be missing? We should approach this new era thoughtfully, weighing the time we save against the experiences we surrender. After all, the immortal Ferris Bueller, who seemed to have at least a few things figured out early, reminded us, “Life moves pretty fast. If you don’t stop and look around once in a while, you could miss it.” In the end, perhaps the most important question we face isn’t just about what we off-load, but about whom we become as a result. Will we emerge more connected to our purpose, or will we become strangers to our own experiences?
In a world where we can hand over more and more to AI, we must choose wisely. It’s up to us to ensure that, even as we move faster, we don’t lose touch with the experiences that make our work—and our lives—meaningful.
James DeVaney is the associate vice provost for academic innovation and the founding executive director of the Center for Academic Innovation at the University of Michigan.