The Messy Middle: Thinking With AI and What's at Stake
Staying in relationship with our own intelligence as everything accelerates
I often notice that the ideas I value most don’t arrive when I sit down and tell myself it’s time to think. They tend to arrive sideways, when I’m walking, washing dishes, or taking a shower: moments when I relax, let my attention wander, and my mind isn’t trying to be productive.
It’s the feeling that something is happening behind the scenes, where threads are finding each other and gathering into form. Connections are still forming, not quite ready yet. And then, suddenly, it clicks.
Over time, I’ve learned to trust this type of intelligence that doesn’t respond well to pressure. It needs space and time. Room to move. And when it finally surfaces, it often feels less like something I made happen and more like a discovery that found its way through me.
That phase where thoughts are forming but nothing is clear yet is what I call the messy middle. Many of us recognize it from experience, but it isn’t how most of us have been trained to think.
There’s a moment in one of the Star Trek films where Spock, having traveled back in time, hands Scotty an equation Scotty himself will develop in the distant future. Years of research collapse all at once. Scotty is amazed. Grateful. And aware that time has just been rearranged.
He now has knowledge far ahead of its time.
It’s easy to focus on what was lost there. Scotty didn’t live through the journey of discovery. He didn’t spend years grappling with the problem, making mistakes, or learning through trial and error. The struggle was skipped.
But there’s another side to it.
With that equation in hand, Scotty can now build things from a new baseline that would never have been possible otherwise. Acceleration erases process, yes, but it also opens new territory. Humans have always done this: passing knowledge forward so the next person doesn’t have to start from scratch. Language, writing, and printing did this. The internet did this.
AI is doing it again, but much faster.
So the question isn’t whether acceleration is good or bad. The question is what happens within us when things start moving faster than we can integrate them.
In living systems, intelligence isn’t about optimization. It comes from responding to what’s happening. Nothing is rushing to get it right. Life changes as it needs to.
Creative work follows a similar rhythm. Ideas don’t usually arrive on demand. They arrive when there’s more space. This is why you can have some of your best ideas in the shower. The combination of warm water and no one expecting anything from you allows the nervous system to settle. Your mind lets go a bit. Something somewhere gets the breathing room to connect.
There really isn’t anything particularly special about the shower. It’s more about the conditions it creates.
The messy middle lives in that space between not knowing and knowing, when things feel vague, unresolved, or half-formed. It can be uncomfortable. It can feel unproductive or frustrating. I know someone who absolutely hates it. There can be a strong desire to collapse this space into certainty. And yet it’s where a lot of our creative intelligence seems to take shape.
This is how my own creative process often works. Things gather while I’m living my life. Riding the train or vacuuming the living room. Ideas picked up along the way. A phrase here. A feeling there. Fragments gather long before anything feels whole.
That middle matters because it builds something internal.
When we stay with uncertainty instead of rushing to resolve it, we develop discernment. A capacity to feel our way through complexity, with patience. When we struggle with an idea, rewrite it, sit with it longer than we’d like, the understanding becomes ours in a deeper way. It settles into the body.
Without that process, knowledge can lose meaning. We can say things without really inhabiting them. We can rush ahead without feeling grounded in what we’re doing. Plenty of output can be generated without much felt continuity underneath.
This doesn’t happen in isolation. How we think is influenced by the environments we’re in.
Somewhere along the way, not knowing started to look like a problem.
In school, we’re rewarded more for having the right answers than for being curious. At work, for deliverables more than reflection. Speed, clarity, and certainty are treated as signs of competence. Hesitation gets labeled as inefficiency. Ambiguity becomes something to eliminate.
The messy middle, by contrast, looks sloppy. Risky. Hard to justify.
So we design systems to avoid it.
We compress timelines. Streamline the work. We look for tools that promise resolution on demand. And now, with AI, we can often skip the messy middle altogether. We get the answer, the summary, and the articulation within seconds, without having to live inside the question.
This is where a subtle erosion begins.
I’ve been hearing from friends who teach young people that a lot of their students are not excited about AI. They’re more uneasy about it than anything else. And the fear doesn’t seem to be focused on whether jobs will disappear. It’s more about the concern that they could lose the ability to think for themselves. They’d rather feel their way toward understanding instead of having it handed to them fully formed.
Emerging research reflects something similar. In a study conducted by researchers at MIT’s Media Lab, participants writing essays with generative AI showed lower neural engagement than those who wrote without these tools, especially in areas tied to memory, planning, and creativity. Those in the AI-assisted group were also less able to remember or recall what they had written, and independent reviewers described the AI-supported essays as lacking individuality and originality.
These findings don’t necessarily prove that AI diminishes thinking (the study itself is small and not yet peer-reviewed) but they do suggest that relying on AI can change how our minds engage with complex tasks. If that’s true, we may be undermining the very conditions that make original thinking possible. That possibility echoes what many educators are witnessing: speed and efficiency don’t guarantee depth or meaning. When systems move faster than our biological and emotional pace, they can bypass the conditions that allow understanding to take root.
When technology starts moving faster than we do, we don’t magically catch up. Our attention scatters, and our nervous systems stay on high alert.
Which raises an interesting question.
Does AI have a messy middle?
In a technical sense, it does. Models are trained through enormous cycles of iteration and feedback, where patterns emerge and errors are corrected, and performance gradually improves. But none of this is felt in the way it is for humans. It doesn’t involve confusion or doubt, or waiting and wondering when understanding will arrive.
Human emergence is different. It’s lived. It moves through sensation, emotion, memory, and meaning. It’s slow because it unfolds through experience.
And yet, an interesting shift happens when the two meet.
When AI is used as a shortcut, it can collapse the process. But when it’s used more as a thinking partner rather than an answer machine, the messy middle can shift back to where it belongs.
Here’s what I mean. Imagine a student working on a paper about a topic they don’t fully understand yet. They could ask AI to write the essay and get something that sounds decent enough in seconds. Or they could do something else entirely.
They might start by writing out what confuses them about the subject. What doesn’t make sense, where their thinking gets tangled. Then ask the AI: What questions am I actually trying to answer here? This helps to clarify the question rather than give an answer. Or they could write a messy first draft themselves and ask the AI to show them where their argument is strongest, or where they’re circling around an idea they haven’t articulated yet.
In this version, the AI doesn’t replace the struggle. It makes the struggle visible. It externalizes the process so the student can see their own thinking more clearly, sit with the confusion longer, follow it further. The integration still happens in them. The discernment is still theirs. But the AI helps hold the complexity open instead of collapsing it too soon.
This is often how I work with these tools. I use AI less for answers and more as a way to stay in conversation with an idea and see my own thinking from new angles. I ask it to challenge me.
The difference is subtle but crucial. In one version, you’re outsourcing the messy middle. In the other, you’re using AI to create more room for it, instead of rushing to resolve the discomfort of not knowing.
The human still has to live inside the question. The AI can help hold the door open.
The risk is that we stop noticing when we’ve handed over the part of the process where we actually become the human who understands.
We’ve already seen what happens when self-organizing systems run without care. Market capitalism is a kind of emergent intelligence. It responds to signals and feedback loops. But when growth becomes the only measure, balance gets lost.
AI amplifies the logic we feed into it.
If we build systems that value speed above all else, they’ll keep compressing the spaces where integration happens. If we build systems that value the process and not just the result, new possibilities emerge.
Acceleration doesn’t have to mean disconnection. But it does require awareness.
Nothing essential about being human is vanishing overnight.
But something is being tested.
Can we stay in relationship with our own unfolding as everything speeds up? Can we protect spaces where not knowing is allowed? Can we let ideas arrive on walks, in the shower, in the pauses between tasks, rather than trying to force them?
For me, the question isn’t whether to use AI.
It’s whether we can use it without losing ourselves in the process, and whether we can remember that the messy middle isn’t a flaw to engineer away.
It’s where understanding forms.
And if we lose the ability to stay with it, we don’t just lose creativity.
We lose our way of becoming.
This essay is a continuation of an earlier reflection on AI, time, and acceleration, which you can read here: The Future Reaching Back: What Star Trek Can Teach Us About AI

