Trying to remember something that doesn’t exist: How AI excavates images from static
- Kathryn Courtney
- Feb 18
- 3 min read
Updated: Mar 24
It was that one time, remember? It was…where was it again? We were in Monaco. It was spring…no July. Or was that a movie? No, it was definitely July, and we had these sandwiches. I swear I’ve never had a better sandwich. And that wine…what was it again? I can almost see it…I can taste it. Wait, no, it was April…
It feels like trying to assemble a reel of our own experiences from the static of an old TV set. You see little hints of color or shape, and you grab onto them like a bobbing boat promising to take you ashore, but then the memory retreats back into the static. You’ve lost it. Sometimes, the picture resolves, and you find yourself transported to the past, living that April in Monaco with the best sandwich ever. And sometimes it remains a blur of maybe spring, maybe summer, maybe not your life at all but just a scene from To Catch a Thief.
We aren’t sure how human recall works exactly, but we instinctively feel ourselves piecing together incomplete cues or fragmented bits, progressively reconstructing memories from noisy static.
There’s no absolute answer—our recall is variable. We need prompts, cues, and context. And what we build as a memory may resolve into glassy clarity or stay fuzzy, and it may be different every time.
This is—not coincidentally—how Midjourney and its robot pals create AI-generated images, using a technique called a cascaded diffusion model. And it’s a little bit bonkers.
It starts with thermodynamics.
Weirdly, the ideal starting canvas for generating images is static. Not a blank page but maximal randomness. Full noise. Infinite possibilities.
To get a big, yummy bite of static, we need look no further than the principles of thermodynamics.
The Second Law of Thermodynamics tells us that in an isolated system, entropy—the measure of randomness and disorder—increases over time. And if our goal is to extend randomness to the extreme of absolute noise (and it is!), the probability math of scattered stuff will get us there.
Hold my beer and watch this.
If you toss pennies and measure the probability of 50% of the pennies landing heads up vs 100% of the pennies landing heads up, it looks like this:
With 4 pennies: an outcome of 50% heads up is 6x more likely than an outcome of 100% (because there are six ways for 2 pennies to land heads up and only one way for 4 pennies to land heads up)
With 10 pennies: 50% is 252 times more likely than 100%
With 50 pennies: 50% is 126,000,000,000,000 times more likely
With 1 billion pennies: Ain’t nobody got time for that.
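Those ratios come straight from binomial counting—C(n, k) ways to get exactly k heads out of n tosses. A quick sketch with Python’s standard-library `math.comb` reproduces them:

```python
from math import comb

# C(n, k) counts the ways exactly k of n pennies can land heads up.
for n in (4, 10, 50):
    half_heads = comb(n, n // 2)  # ways to land exactly 50% heads up
    all_heads = comb(n, n)        # ways to land 100% heads up (always 1)
    print(f"{n} pennies: 50% heads is {half_heads // all_heads:,}x more likely")
```

Running it prints 6, 252, and 126,410,606,437,752—the last one rounding to the 126 trillion above.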
Now imagine if we’re tossing out pixels instead of coins, and they can potentially land anywhere on the canvas (not just in the two positions of heads or tails). That’s a mess of pixels and an unfathomable number of potential positions—the odds of any predictable scattering become effectively nothing. We’re nearing maximal randomness, creating a starting canvas that is the opposite of a blank page. We’re in a chaotic state of full potential, different every time.
Here’s where it gets weird: AI goes hunting for the image you’ve requested from this canvas of pixel chaos. It’s essentially tasked with finding the hedgehog in a bunny costume you need for your mom’s birthday card, floating in those pixels, waiting to be found. Then, through a series of refining processes, it strips away everything that doesn’t belong, leaving only the requested image.
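That stripping-away loop can be sketched in a few lines of plain Python. This is a toy, not a real diffusion model: the “denoiser” here is a stand-in that already knows the target (a real model learns to predict the noise from training data), and all the names and numbers are illustrative. But it shows the shape of the process—start from pure static, peel a little noise away each step.

```python
import random

target = [0.2, 0.8, 0.5, 0.9]             # the "image" we want (4 pixels)
x = [random.random() for _ in target]      # starting canvas: pure noise,
                                           # different every run

for step in range(50):
    # stand-in "denoiser": estimate the noise still present in x
    predicted_noise = [xi - ti for xi, ti in zip(x, target)]
    # strip away a fraction of that estimated noise
    x = [xi - 0.1 * n for xi, n in zip(x, predicted_noise)]

# after enough steps, x has been refined close to the target
```

Because each run begins from different static, the refining path is different every time—which, in a real model, is why the result is too.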

Ever wonder why you can’t generate the same hedgehog in a bunny costume image twice, no matter how desperately you labor over it? Well, there you go. Because it isn’t actually creating the image at all. It’s excavating the image from an effectively infinite pile of possibilities. No matter how clear your instructions are, the chances of landing on the same image twice are vanishingly small.
I feel you, robot
Just like when you or I try to remember a fading memory, AI is reaching into its noisy static to piece together a memory of something from cues and context—it just happens to be something that doesn’t exist.

“I saw the angel in the marble and carved until I set him free” – Michelangelo
This is an endlessly rich subject, persistently incomprehensible, and deliciously delightful in its parallels to the mysterious experience of living in a human brain. I could tell you more weird stuff or…
Read more—it’s wildly fun: