Essay · January 23, 2026

Implicit State Is the Most Expensive Dependency

Implicit state feels like flexibility at small scale, but compounds into fragility over time. This piece explores how systems quietly accumulate hidden dependencies on memory, and what happens when those dependencies break.

Most systems don’t fail where we expect them to.


They don’t usually collapse because of a bad algorithm, an inadequate data structure, or a missing feature. They fail quietly, over time, in places no one thought to model—until the system no longer knows how to operate without the people who built it.


At that point, the dependency graph becomes visible. And the most expensive dependency turns out not to be a service, a library, or a vendor.

It’s memory.


In small systems, implicit state feels like flexibility. Decisions are made in conversation. Context lives in inboxes, chats, and heads. Everyone “just knows” how things work. When something goes wrong, someone remembers what was agreed to and fixes it. The system appears resilient because humans are compensating for what the system doesn’t hold.


This works—right up until it doesn’t.


The failure mode is subtle. Nothing crashes. Nothing alerts. Instead, every question starts to require reconstruction. Why was this value chosen? Which version was approved? Did this actually ship, or was it superseded? The answers exist, but only as residue: half-remembered conversations, email threads with branching replies, comments without timestamps.


What follows looks less like debugging and more like forensics.


Engineers are familiar with this pattern, even if they don’t always name it. A service that depends on undocumented behavior. A pipeline that works only if steps are performed in the right order by the right person. A “temporary” workaround that quietly becomes permanent. Over time, the system accumulates hidden assumptions—state that exists, but nowhere the system can see.
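The ordering-dependent pipeline is the easiest of these to make concrete. A minimal sketch (the step names `load` and `transform` are hypothetical, chosen only for illustration): in the implicit version, correct ordering lives in someone's head; in the explicit version, the pipeline carries its own progress and a missed step fails loudly instead of leaving residue to reconstruct later.

```python
# Implicit state: nothing records or enforces that load() must run
# before transform(). The ordering lives only in the operator's head.
def load():
    global rows
    rows = ["a", "b"]

def transform():
    # Silently assumes load() already ran; fails obscurely otherwise.
    return [r.upper() for r in rows]

# Explicit state: the pipeline tracks which steps have completed,
# so a violated ordering assumption surfaces immediately and by name.
class Pipeline:
    def __init__(self):
        self.completed = set()

    def run(self, name, fn, requires=()):
        missing = set(requires) - self.completed
        if missing:
            raise RuntimeError(f"step {name!r} requires {sorted(missing)}")
        result = fn()
        self.completed.add(name)
        return result
```

The point isn't the ceremony; it's that the dependency between steps now exists somewhere the system can see, rather than only in the memory of whoever ran it last.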


Implicit state is attractive because it lowers upfront cost. You don’t have to formalize decisions. You don’t have to encode transitions. You can move quickly and rely on shared understanding. But the cost isn’t eliminated. It’s deferred—and it compounds.


Eventually, the system reaches a point where continuity depends on specific people being present. When they leave, the system doesn’t just lose capacity; it loses coherence. The work hasn’t disappeared, but the logic that held it together has.


At that moment, flexibility reveals its true price.


The problem isn’t that humans are unreliable. It’s that human memory is being used as infrastructure. Memory drifts. It compresses. It substitutes narrative for exactness. When systems rely on it, they inherit all of those properties.


A healthier posture doesn’t try to eliminate judgment or improvisation. It simply refuses to let them remain invisible. When a decision matters, it becomes explicit. When an action has consequences, those consequences are triggered automatically. State is allowed to land somewhere durable, rather than hovering indefinitely in conversation.


This is the difference between a system that reacts to failure and one that prevents it quietly.


In systems with explicit state, order doesn’t have to be reconstructed after something breaks. The past is already present in the structure. Transitions are legible. Disagreements can reference artifacts instead of memories. People can rotate out without taking the system’s logic with them.
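One way to picture "the past is already present in the structure" is a decision whose transitions are recorded as artifacts. This is only a sketch under assumed names (`Decision`, `transition` are illustrative, not any particular tool's API), but it shows the shape: the question "why was this value chosen?" is answered by the record itself, not by whoever happens to remember.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Decision:
    """A decision whose state lands somewhere durable: every
    transition is appended with a reason and a timestamp, so
    disagreements can reference records instead of memories."""
    subject: str
    history: list = field(default_factory=list)

    def transition(self, new_state: str, reason: str):
        self.history.append({
            "state": new_state,
            "reason": reason,
            "at": datetime.now(timezone.utc).isoformat(),
        })

    @property
    def state(self) -> str:
        # The current state is just the last recorded transition.
        return self.history[-1]["state"] if self.history else "proposed"
```

Used like `d = Decision("retry budget"); d.transition("approved", "agreed in design review")`, the structure answers both "which version was approved?" and "why?" without anyone reconstructing a conversation.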


None of this is exciting. It doesn’t look like innovation. It looks like boring reliability—the kind that goes unnoticed precisely because it works.

But boring systems age well.


As systems grow, the cost of implicit state grows faster than anything else. Faster than compute. Faster than storage. Faster than bandwidth. Because it taxes the most limited resource in any organization: attention. Every time someone has to remember what the system should already know, you’re paying interest on deferred structure.


Most engineers eventually feel this, even if they can’t always point to a single cause. The system becomes harder to reason about. Changes feel riskier than they should. Progress slows not because the work is hard, but because certainty is scarce.


At that point, the question isn’t how to optimize further.


It’s whether the system was ever allowed to know what it was doing.


Once you see implicit state for what it is—not flexibility, but unpriced dependency—it becomes hard to justify letting it accumulate. Not because engineers need more discipline, but because systems deserve to be allowed to carry their own memory.


When they don’t, someone always ends up doing it for them.