Naive Interventionism: Why Doing Nothing Is Often Right

Quintus Fabius Maximus was insulted by the Roman Senate.

It was 217 BC. Rome was at war with Hannibal, one of history's greatest military commanders. Fabius was given command of the Roman forces. His strategy: avoid pitched battle. Harass Hannibal's supply lines. Delay. Wait. Don't engage.

The Senate hated this. They wanted action, victory, engagement. Fabius's strategy of non-engagement looked like cowardice. They called him "Cunctator", the Delayer, as an insult.

So Rome replaced him with generals who wanted to fight.

These generals promptly engaged Hannibal head-on at the Battle of Cannae in 216 BC. It was among the worst defeats in Roman military history: ancient sources put the Roman dead at somewhere between 50,000 and 70,000 in a single afternoon.

Rome returned to Fabius and his strategy. His "procrastination" turned out to be correct: an asymmetric approach to an opponent who could not be beaten head-on. Overextended, his supply lines harassed and his forces worn down, Hannibal eventually withdrew.

Doing nothing was the winning move.


The Compulsion to Act

Naive interventionism is the compulsion to "do something" in response to any problem, without accounting for the costs of action.

It's driven by a deep psychological need to feel in control. When something goes wrong, the natural response is to take action. The alternative — waiting, observing, allowing systems to self-correct — feels passive, irresponsible, cowardly.

But waiting often works better.

Systems are frequently self-correcting. Small problems resolve on their own. Interventions interrupt this correction and often make things worse. But the intervention is visible, while the cost of the intervention is invisible.


The Noise Problem

Here's where naive interventionism becomes measurable: in information consumption.

At daily observation frequency, financial and economic data is approximately 95% noise. At hourly frequency, it's roughly 99.5% noise. Most of what you're observing is random fluctuation, not signal.

Yet financial professionals check prices throughout the day. They react to every movement. Each price movement generates an emotional reaction. Each emotional reaction increases the probability of a decision. Each decision incurs transaction costs, tax consequences, and often moves the portfolio further from the rational position.

The investor who checks their portfolio quarterly is not less informed than the one who checks it hourly. They're more protected from acting on noise.

The heuristic is simple: reduce the frequency of observation, reduce the probability of naive intervention.

This applies everywhere. The person who checks their email once at the end of the day gets more important work done than the person who monitors email continuously. The person who reads the news once a week understands current events better than the person who consumes news hourly. More frequent observation increases exposure to noise that mimics signal well enough to trigger reaction.
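The arithmetic behind the frequency heuristic can be sketched with a quick calculation. Under a simplified (and standard) assumption that returns are Gaussian with a fixed annual drift and volatility, the probability that any single observation reflects the underlying trend rather than random fluctuation shrinks as the observation window shrinks. The 15% drift and 10% volatility figures below are illustrative assumptions, not numbers from the text:

```python
from math import erf, sqrt

def p_positive(mu_annual: float, sigma_annual: float, horizon_years: float) -> float:
    """Probability that a single check over the given horizon shows a gain,
    assuming returns are Gaussian with annual drift mu and volatility sigma.
    The z-score scales with sqrt(horizon): shorter windows mean weaker signal."""
    z = (mu_annual * horizon_years) / (sigma_annual * sqrt(horizon_years))
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at z

# Hypothetical portfolio: 15% annual drift, 10% annual volatility.
horizons = [("yearly", 1.0), ("monthly", 1 / 12),
            ("daily", 1 / 252), ("hourly", 1 / (252 * 8))]
for label, t in horizons:
    print(f"{label:>8}: {p_positive(0.15, 0.10, t):.1%} chance a check shows a gain")
```

At a yearly horizon the check is mostly signal (well above 90% chance of showing the true upward drift), while at an hourly horizon it is barely better than a coin flip: nearly everything the hourly checker sees is noise.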


The Back Pain Example

Here's a concrete medical example: back pain.

The majority of acute back pain resolves on its own within 6-12 weeks with conservative management — rest, mild movement, time. The body's self-correction mechanism is robust.

Yet back surgery is one of the most commonly performed elective procedures in the United States. The surgery has meaningful failure rates, significant recovery time, and for many conditions produces outcomes no better than watchful waiting.

Why? Because waiting is invisible. Doing nothing feels irresponsible to both the doctor and the patient. The surgeon is intervening. The patient is getting treatment. Progress is being made.

But the correct response to most back pain is non-intervention. Let the self-correction mechanism work. This is naive interventionism avoided — difficult because it requires restraint and confidence in systems that are already in motion.


Wu-Wei: The Ancient Wisdom

The concept isn't new. Daoist philosophers in China understood it millennia ago: wu-wei, often translated as "non-action" or "action through non-action."

It doesn't mean literal inaction. It means action that is calibrated to what the system needs, not to the intervention we want to execute. It's the idea that sometimes the correct response to a problem is to understand the system well enough to see that intervention will backfire.

The Romans understood it too. The phrase "festina lente", "make haste slowly", reportedly a favorite of the emperor Augustus, captured the idea that hurried, undirected action wastes energy.

Modern culture has lost this wisdom. We're drowning in data, feedback, metrics. Every signal triggers a reaction. We intervene constantly, assuming more control is always better.

It's not.


The Stoic Version

Seneca, the Stoic philosopher, practiced a form of this, the premeditatio malorum or premeditation of evils: he would periodically prepare for loss before loss occurred. Not morbidly, but as a mental exercise. This reduced his anxiety about potential losses and freed him from the compulsion to constantly intervene to prevent them.

The practice: acknowledge that the worst case might happen, mentally accept it, and then make deliberate decisions about what to do — rather than decisions made from panic or the compulsion to avoid discomfort.

This is naive interventionism avoided through philosophical preparation rather than data analysis.


What To Intervene In

The argument isn't against all intervention. Some interventions are justified:

Specific reduction of genuine fragility: Banning smoking, traffic speed limits, building codes — these interventions reduce fragility in measurable ways because the harm is concrete and the intervention is specific.

Early detection systems: Some forms of monitoring are justified when they catch problems early in low-cost ways. Smoke detectors, blood pressure checks, financial position monitoring. The threshold is: does this catch problems before they become catastrophic?

Allowing small losses to accumulate and resolve: Sometimes the correct intervention is the decision to allow the system to lose small, distributed amounts rather than trying to prevent them and accumulating large, concentrated risk.

The key question: Does this intervention reduce fragility, or does it store fragility?

If the intervention allows a system to become more volatile, more imbalanced, more susceptible to larger shocks, then it's naive interventionism even if it works in the short term.


The Heuristic

Here's a practical rule: Before you act, ask what will happen if you don't.

If the answer is "the system will self-correct," then don't act. If the answer is "catastrophic failure," then act decisively.

Everything else is in between. And the in-between is where naive interventionism lives — small problems that would resolve on their own, made worse by someone unable to tolerate the discomfort of waiting.

Fabius understood this. He waited while others wanted action. History validated his patience.