Agency in the Age of AI
In a TED Talk from several years ago, Ethan Hawke reflects on creativity and the courage of self-expression. His point is simple but enduring: to express yourself honestly, you must first know yourself. Creativity, he suggests, isn’t about polish or performance. It’s about allowing yourself to be fully present…even foolish at times, as Allen Ginsberg embraced in unguarded expression, in service of something true.
What stayed with me wasn’t just the idea of creativity but the urgency beneath it. Our time is short. The pull of habit is powerful, so powerful that many of us forget to ask whether we’re doing what matters to us: what we love, rather than what’s merely familiar, efficient, or rewarded. Somehow, we struggle to give ourselves permission to be imaginative; maybe we’re suspicious of our own talents and of how vital creativity is.
Creative expression looks and feels different for each person.
I see this clearly in my own marriage. My husband leads with numbers, making sense of the world through spreadsheets, systems, and structure. I lead with words, seeking to make sense of our shared humanity and how we connect to move forward. We each express our individuality in different ways; neither is more valid, and both are expressions of agency.
This, I believe, is why the current anxiety about AI feels so charged and contentious. When tools become powerful enough to accelerate expression, they don’t erase agency; they reveal whether it was ever practiced at all. They expose where authorship is strong and where it’s fragile. That’s what makes this moment uncomfortable…not the technology itself, but the mirror it holds up.
Hawke’s point, distilled below, gets to the heart of that discomfort:
Perfection belittles the experience of life. Creativity comes from nature and from being who you are, imperfectly. It’s what makes you, and it’s the energy you bring to what you make. You lose that connection if AI perfects and consumes your creativity and its output.
I agree. In fact, the observation is essential. But the real tension isn’t about perfection; it’s about agency, our capacity to choose, direct our attention, and take responsibility for what we create. It’s the difference between knowing who you are and outsourcing that knowledge.
Creativity has never been endangered by tools. What’s endangered is our relationship to authorship, to the quiet, practiced act of showing up with intention, judgment, and responsibility. Agency isn’t something we either have or don’t have. To me, it’s more like a muscle that strengthens with use and weakens with neglect.
When agency atrophies, reliance creeps in, not just on AI but also on systems, templates, validation, and consensus. We begin outsourcing judgment instead of exercising it. In that state, any powerful tool feels threatening, not because it replaces creativity but because it exposes how rarely we’ve claimed it as our own.
AI doesn’t generate energy; it reflects it.
I see this play out in my own creative process. I often use AI as a kind of think tank—a place to test language, organize ideas, and challenge my assumptions. But it only works because I bring the raw material first: the questions, the experiences, the unease. The thinking still has to be mine. The agency has to be exercised before anything useful can happen. What emerges is collaboration, not replacement—and certainly not authorship by proxy.
This is why framing the future as humans versus AI misses the point. The anxiety around AI isn’t really about the technology. It’s about agency, about whether we will continue to choose and author in the presence of tools that can simulate both. Every generation has felt a version of this unease about radio, television, airplanes, and the internet, each innovation mistaken at first for a threat to something essential.
The future isn’t humans versus AI. It’s humans who can think versus humans who outsource thinking. AI can polish language, organize thought, and even amplify ideas, but it cannot supply presence. It cannot originate hunger, doubt, longing, or lived friction. Those things still belong to humans. When they’re absent, the work feels sterile, not from refinement but from disengagement. Not because a tool touched it, but because the human didn’t fully inhabit the creative process.
In my (perhaps optimistic) view, the loudest resistance to AI often comes not from those deeply rooted in their creative agency but from fear of displacement. Tools feel dangerous when authorship feels fragile. For those who have exercised their voice and know how they think, choose, and create, tools will remain what they have always been: instruments, not replacements.
Human agency is practiced, not possessed.
When creativity feels flattened or sterile, it isn’t because AI touched it. It’s because the human didn’t bring themselves fully into the room.