**TL;DR:** OpenAI explained that GPT-5.5's goblin habit came from a reward artifact in the Nerdy personality. Fair enough. But what if some of those creature words weren't just tics — what if they were doing something useful? What if the model was compressing complex system behaviors into single words, the same way humans have always done? I mapped nine of these "compression creatures" into a field guide. If you just want to see the cards, scroll to the end.
## I've been here before
I should say upfront: this isn't a new idea for me.
I've been using animal metaphors to describe system behaviors in my own AI work for over a year. Not because I thought it was cute — because it *worked*.
When I was running multi-agent coordination experiments, I needed a way to describe what was happening when two agents kept each other from drifting out of context. I called it **otter rafting** — after the way real otters hold hands while they sleep so they don't float apart.
That wasn't the model's word. That was mine. I saw a meme, and the metaphor clicked instantly because it carried the entire behavioral pattern in one image: cooperative, paired, buoyant, staying connected through drift.
One word. A whole system behavior.
I've done this my entire career, actually. When you do sketch notes, you learn that one symbol can hold an enormous amount of meaning. Draw a crooked wand and everyone immediately knows — that's Harry Potter's world. You don't need to explain Hogwarts, Voldemort, the prophecy. The wand carries all of it.
Humans compress meaning into symbols all the time. We just don't usually notice we're doing it.
## The difference between a tic and a compression word
I think there are actually two different things happening, and the OpenAI explanation covers only one of them.
**A creature tic** is when a word shows up because the model was rewarded for it during training. It appears often, but it doesn't always mean much. It's decoration. A stylistic habit.
**A compression creature** is when a word sticks because it maps to a real pattern. People start using it — or recognizing it — because it's *useful*. It becomes shorthand for something complex that would otherwise take a paragraph to explain.
The tic is the origin story.
The compression is what happens after.
OpenAI explained the tic. I'm interested in what happens after.
## GPT-5.5 made it obvious. But this applies to all models.
OpenAI's post focuses on GPT-5.5 because that's where the behavior got loud enough to investigate. But I've seen versions of this across models — across providers, even.
When I work with Claude, I see similar compression patterns emerge. Different creatures, different words, but the same underlying dynamic: a metaphor lands, it sticks, and it starts carrying more meaning than its literal definition would suggest.
This isn't a GPT problem. It's not even a problem. It might be a feature that nobody stopped to examine before patching it out.
## We already do this. We've always done this.
Inside jokes. Nicknames for recurring bugs. The way a team calls a specific kind of meeting "a fire drill" and everyone immediately knows it means urgent, chaotic, probably unnecessary, but you show up anyway.
Humans have always compressed complex experiences into small, portable words.
What's new is watching it happen in the space between humans and AI.
A word appears in the model's output. A human recognizes a pattern inside it. The word gets reused. Now it's shared shorthand.
That's not just language. That's **shared understanding forming in real time.**
## The Nine Creatures
Here's the first set of five. Each one is a different kind of basin — a different flavor of complexity compressed into one word.
### Goblin
Chaotic system energy. Small cause, oversized effects. Janky but functional. Mischievous, not malicious — clever in a messy way. You'll find this one in flaky CLI behavior, runaway loops, weird UI states, and toolchain quirks. The emotional tone? Funny enough to reduce frustration. Manageable chaos — you can't eliminate it, so you work with it.
*"The loop goblin found the snacks."*
### Raccoon
Curious, hands-on investigation energy. Rummages through messy systems, logs, drawers, and edge cases until a useful clue appears. Treats clutter as signal and junk as possible evidence. Trash-adjacent, but clever. Common sightings: messy logs and folders, bug triage, legacy code and docs, prototype benches, and overlooked edge cases. The energy of playful salvage — maybe the answer is already here.
*"The raccoon found the clue in the junk drawer."*
### Pigeon
Humble message-carrying energy. Moves context through messy, noisy environments with low overhead and surprising reliability. Small payload, but it gets through. Dismissed, but effective — overlooked intelligence. You'll find pigeons in async updates, event messages, agent handoffs, chat relays, logs, and breadcrumbs. Not glamorous, still arrives.
*"The pigeon got the context across the noisy channel."*
### Owl
Watchful discernment. Observes quietly, synthesizes context, and waits until the shape of the problem is visible before acting. Sees in ambiguity. Rotates perspective — looks from more than one angle. Disciplined judgment: acts after understanding, not before. You'll find owls in architecture reviews, ethical judgment calls, research synthesis, problem framing, and decision checkpoints. The senior engineer silence — the useful question arrives late.
*"The owl waited until the pattern had a shape."*
### Ogre
Blunt-force system weight. Heavy-handed — applies force without subtlety or restraint. Slow to move, crushes nuance, overbuilt for the task. Hard to argue with because its size and authority silence opposition. You'll find ogres in enterprise bureaucracy, huge frameworks for tiny problems, legacy monoliths, policy without context, expensive cloud architecture, and process over practicality. The emotional tone is exhausting pressure — institutional inertia that resists change and clings to the past.
*"The ogre solved the problem by sitting on it."*
## One last thought
The creature isn't the point.
The creature is the **handle**.
The real thing is everything it helps you carry.
And once you start seeing these little basins — these compressed words that hold entire system behaviors inside them — you start noticing them everywhere.
GPT-5.5 just made it loud enough for everyone to hear.
---
*Carmelyne Thompson builds AI collaboration tools at [InkoBytes](https://inkobytes.com) and writes about what actually happens when humans and AI think together at [carmelyne.com](https://carmelyne.com). The Compression Creature Cards were built in partnership with ChatGPT 5.5 and Images 2.*