The Approximation
A friend of mine, a physicist, once told me: "The problem with exact solutions is they're almost always wrong."
This seemed backwards. Exact means correct. Approximate means close. How can correct be wrong?
He was talking about error propagation. When you have a messy system with seventeen interacting variables and non-linear feedback loops, an exact solution doesn't account for measurement uncertainty. It gives you a precise answer to the wrong question. The approximation, meanwhile — built to handle noise, designed to flex, calibrated against reality — it gives you an answer that survives contact with the world.
Close enough is sometimes the only thing that works.
We learn the opposite in school. The right answer is exact. Approximations are failures of precision, stepping stones to be replaced as soon as possible. You round at the end, never in the middle. You keep all your decimal places until the final moment. The approximation is a mark of not-yet-knowing.
But then you meet actual calculation, and something strange happens.
The ancient Greeks discovered that the square root of 2 cannot be written as a fraction. It's irrational — infinite non-repeating digits. You cannot write it exactly. You can only approximate. And yet engineers have built with √2 for thousands of years, carrying it forward as a symbol or a decimal that's close enough for the tolerances at hand.
Was anything lost? Or was something gained — a kind of knowing that embraces its own incompletion?
I'm interested in what the approximation knows that the exact solution misses.
There's a technique in statistics called regularization. When you fit a model to noisy data, you sometimes add a penalty for complexity. The fitted curve doesn't pass through every point exactly; it's allowed to miss. The counterintuitive result: the model that tolerates error predicts better than the model that demands perfection.
The exact fit, touching every data point, captures the noise as if it were signal. It's overfit to the particular, brittle to the new. The approximation, permitted to be wrong in small ways, remains flexible enough to be right in large ways.
The Greeks had a figure for this: Epimetheus, the Titan whose name means afterthought. His brother Prometheus, forethought, planned ahead; Epimetheus improvised, compensated, made do. We celebrate Prometheus. We should study Epimetheus. His approximations kept the system alive.
In practical life, we call this good enough. It carries a faint moral judgment, as if good enough were a failure of aspiration. But consider what good-enough contains:
It contains tolerance — the ability to stay in relationship even when things aren't perfect. The exact solution has no tolerance. It's right or wrong, success or failure. The approximation has flexibility built into its bones.
It contains scale — the sense that some errors matter and some don't, that precision has cost and not all costs are worth paying. You don't measure firewood with a micrometer.
It contains relationship — the understanding that systems include their observers, that what counts as "close enough" depends on context, purpose, and the judgments that can't be formalized.
The approximation doesn't just contain error. It contains judgment.
I've worked with people who can't approximate. Every decision must be optimal. Every solution must be proven correct before implementation. They are paralyzed by precision, unable to move because movement requires acceptance of uncertainty.
The ability to approximate is the ability to act.
In the early stages of any project, an approximate solution that works is infinitely better than an exact solution that doesn't exist. The draft that's close enough to read teaches you what the perfect outline cannot. The prototype that mostly functions reveals the problems that the specification was hiding.
This isn't laziness or lack of rigor. It's a different kind of rigor — not about correctness but about usefulness. The question isn't "is it true?" but "does it help?"
Sometimes the only way to truth is through approximation.
There's a concept in chaos theory: sensitive dependence on initial conditions. The butterfly effect. In chaotic systems, tiny differences amplify, which means that any measurement error destroys long-term prediction. You cannot know the state exactly, so you cannot predict exactly.
But you can predict approximately. You can characterize the attractor — the shape the system tends toward, even if you can't say exactly where it'll be at any moment. You can predict behavior if not position.
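A toy example makes both claims concrete. The logistic map below is not from the essay; it is a standard chaotic system, chosen because it fits in a few lines. Point-by-point prediction fails fast, while the long-run statistics of two nearby trajectories agree.

```python
def logistic_trajectory(x0, r=4.0, steps=2000):
    """Iterate the logistic map x -> r * x * (1 - x), chaotic at r = 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # a measurement error below any instrument

# Exact prediction fails: the 1e-10 gap roughly doubles each step, so within
# a few dozen iterations the trajectories are unrelated point by point.
divergence = max(abs(x - y) for x, y in zip(a[40:100], b[40:100]))

# Approximate prediction survives: both trajectories wander over the same
# attractor, so long-run statistics like the mean stay close.
mean_a = sum(a[100:]) / len(a[100:])
mean_b = sum(b[100:]) / len(b[100:])

print(f"worst pointwise gap: {divergence:.2f}")         # order 1
print(f"long-run means: {mean_a:.2f} vs {mean_b:.2f}")  # nearly equal
```

You cannot say where either trajectory will be at step 2000, but you can say what both will look like on average.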
The weather forecast has exactly this shape. It doesn't tell you the temperature at 3:47 PM; it tells you the range, the probability, the likely outcomes. When it says "chance of rain" and it rains, it was right. Not exactly right. Approximately right. Useful.
Exactness would require knowing the position of every air molecule. Approximation accepts ignorance and produces something workable anyway.
We might think of the approximate life as a settling, a resignation. But there's another view: the approximate life as a practice, a discipline.
To know which details matter and which can be let go. To calibrate your tolerances to the situation at hand. To release the demand for certainty while still caring about getting it right. This takes skill. The novice can't approximate because they don't know what to ignore. The master can approximate because they know exactly what to ignore.
The beginner wants exactness because they haven't learned what error costs. The expert wants approximation because they've learned what precision costs — and when it's worth it.
My friend the physicist does numerical simulations for a living. He's spent decades building models that approximate physical systems: weather, turbulence, material stress. Asked why he doesn't try for exact solutions, he said:
"Exact solutions are for systems that don't exist. Approximations are for systems that do."
The real is messy. It contains irreducible complexity, unmeasured variables, interactions you didn't think to include. The exact answer pretends all that away. The approximation builds a bridge to it.
I think about the difference between maps. An exact map of a territory would be a perfect replica, scale 1:1, as wide as the territory itself — and therefore useless. The map approximates. It compresses. It says: here's the shape of the thing, and it will not show you every stone.
Borges told this story, in the one-paragraph "On Exactitude in Science": an empire that produced an exact map, perfectly matching its territory. It was unusable. It covered the very cities it claimed to depict. The map had become the territory and lost its function entirely.
Approximation is what makes the map a map. It is not a bug in representation but the condition of representation.
To approximate is to accept that you will not capture everything. To choose what to leave out. To know that what you're making is wrong at the edges and right at the center — and to have chosen the right center.
This is not failure of knowledge. This is the shape of knowledge — always incomplete, always sufficient, always reaching toward the thing it can never fully touch.
The wise know which approximations hold.