All Systems, All the Time
May 4, 2026
Designers have been building systems for years. Component libraries, color palettes, rules for how the product should look. Those systems got handed to engineers, who filled in the gaps when the spec came up short.
The pattern is old. What’s new is how far it’s spreading.
Today systems are everything. Your brand voice, your sales outreach, your support responses, your hiring, your marketing, your product behavior. Every function that used to live in someone’s head is being extracted, codified, and handed — not to other humans, but to machines.
This is new. And it changes the job.
For twenty years, I’ve been handing off designs to engineers. Here’s the screen. Here’s how it behaves. Here are the edge cases — well, most of them. The rest we figured out together.
That handoff was a conversation. Imperfect, iterative, human. The engineer would ping me: “What happens when there’s no data?” I’d sketch something on the fly. We’d go back and forth.
I spent four years at Uber watching the app change out from under users in real time. Same screen, different experience, depending on supply, surge, traffic, time of day. At Lime, the same story, amplified across cities and vehicles and regulations. I thought I understood what it meant to design for a system rather than a screen.
Then I spent two years building an AI-native product from scratch, and realized I had no idea.
Here’s what changed.
In the old world, the system was under the hood. Users saw the app. Engineers interpreted intent and covered for whatever the design spec left out. My “system” was really a set of artifacts — Figma libraries, spec docs, brand PDFs — good enough to get the idea across, refined through conversation.
In an AI-first product, the system is the product. The model takes what you gave it and applies it at scale, instantly, without judgment. No clarifying questions. No taste. No calling you to ask what you meant.
The model doesn’t know what you meant. It only knows what you said.
Leave a gap, the AI fills it. Leave ambiguity, the AI resolves it — sometimes badly. The seams that used to be invisible are now exposed, generation after generation, to every user.
What surprised me isn’t that design is becoming more systematic. It’s that every function is.
Marketing is a system the AI applies. Support is a system the AI executes. Sales is a system the AI runs. Product is a system the AI generates within.
The loop is the same everywhere:
- Define the system — make what’s in your head legible to the model.
- Assess the output.
- Adjust the system.
- Repeat.
That’s the work now.
It sounds mechanical. It isn’t.
Making your thinking legible to a machine forces clarity. It exposes gaps. You can’t hand-wave. You can’t say “you know what I mean.” You can’t hide behind intuition or experience or taste — unless you can articulate what those actually are in terms a model can apply.
The AI is a mirror. It reflects exactly what you gave it.
If your brand guidelines are vague, the AI produces incoherent outputs. If your design system has gaps, the AI invents something to fill them. If your strategy is confused, the AI exposes the confusion at scale.
The companies that do this well become more coherent, not less. More rigorous. More honest.
The companies that don’t do this well produce slop at scale and wonder why.
Which brings me to the part that doesn’t get talked about enough: someone still has to own it.
A system legible to a machine is not a system on autopilot. The AI will execute whatever you gave it, at speed, across millions of interactions. If the system is wrong, the failure is everywhere at once. There’s no individual engineer in the loop to catch it.
So every system now needs a named human behind it — someone accountable for whether it’s actually producing what it should. Not a committee. Not a policy. A person who reviews the outputs, notices when they drift, and adjusts the rules.
Legible to AI. Monitored by humans. Owned by a person.
That’s the shape of the job.
For designers specifically, it means this:
Build the system that sets the product up to be consistently great. Recognize when it isn’t. Adjust the system until it is.
That’s always been true, in some sense. But the leverage is different now. One system decision influences millions of outputs. And if the system is thin, the cracks show up everywhere, instantly.
The designers who thrive won’t be the best pixel-pushers. They’ll be the ones who think in systems, define taste explicitly, and build rules that make AI output feel crafted rather than generated. Editors, not authors. Architects, not builders.
But the same shift is happening to everyone else. PMs, marketers, operators, executives. Every function is being asked the same question:
Can you make what’s in your head legible enough that a machine can execute it — and stay on the hook when it does?
If yes, you gain leverage. The kind where a small team does what used to take an army.
If no, you become a bottleneck.
For most of history, expertise lived in people’s heads. You learned by apprenticeship, by osmosis, by years of reps. The knowledge was tacit. Hard to transfer.
AI is forcing that knowledge to become explicit. Codified. Transferable — not to other humans, but to machines.
It turns out a lot of what we thought was intuition was really just pattern recognition we never bothered to write down. When you’re forced to write it down, you sometimes discover you understand it better than you thought.
Sometimes you discover you didn’t understand it at all.
Both are useful.
I wrote last year that systems beat goals. I still believe that. But I’d update it for 2026: the people who build real leverage in the next decade will be the ones who can define a system clearly enough that a machine can run it — and stay honest enough to own what it produces.
Everything is a system now.
The question is whose head it came from, and who’s on the hook for it.