Trusting yourself is dangerous, but trusting experts can be worse.

Sometimes your instincts are wrong.

A car mirror is a tool designed to let you see what’s behind you while driving. It saves you from changing lanes into a collision with a semi. A mirror is an invaluable tool for keeping you aware and safe on the road, and its use seems instinctively simple. Unfortunately, your instinct can be deadly wrong.


The warning engraved in passenger-side car mirrors in the USA, India, Canada, Saudi Arabia, and Nepal reads “Objects in mirror are closer than they appear”. This optical imperfection is by design: a convex mirror gives you a wider field of view, crucial for getting a sense of what the lane to your right holds. When used properly, the mirror gives you enough information to move over safely and avoid an accident.

But what if you aren’t familiar with the convex nature of the mirror? If you are an inexperienced driver who is unfamiliar with its shrinking effect, you run the risk of cutting someone off and causing an accident. What’s worse, you get no warning, except for an easy-to-miss engraving, and that only in a handful of countries. If you grew up in Europe, Latin America, or most of Asia, the first clue that you are making a driving mistake may be the barrage of honking and expletives coming from behind after you unwittingly cut someone off. Maybe you find out only after someone’s engine makes itself comfortable in what used to be your car’s trunk.

The danger here is that at first glance, there’s nothing to pay attention to. It’s a mirror, it reflects light, you can see what’s behind you. The slight curvature of the mirror is imperceptible without a close examination. The apparent simplicity of the object and its use lulls you into a false sense of security. This kind of intuitiveness makes you feel safe as you merge into the hood of a speeding truck hauling highly explosive fuel.

300,000 merging accidents occur every year … 50,000 of these crashes are fatal

– Dulaney, Lauer, Thomas Law Firm

The Dunning-Kruger effect is usually somewhere close by when one observes a professional doing their job. From the ditch-digger to the landscape painter, some jobs just seem a lot simpler until you do them yourself. That little voice that whispers to clients from hell “hey, your 15-year-old kid could probably make a logo in MS Paint or whatever” is exactly what Dunning-Kruger sounds like. Even when considering what is very obviously a skilled discipline or profession, people tend to assume they can do it with no practice, at least until they get their hands dirty and fail a few times.

In cases where it’s not obvious that there is something to learn at all, potential disaster lurks. Just behind that intuitive perception of safety in familiarity, there be dragons. It’s just a mirror, after all; what could go wrong?

Revenge of Homo Habilis

“intuition is nothing more and nothing less than recognition”

– Daniel Kahneman

The trouble with intuition is that it’s a result of pattern-based learning. We get a sense of a person from having met hundreds of people throughout our lives and forming some sort of heuristic map. When trying to actually evaluate something without a heuristic map (either for the sake of objective impartiality, or because we aren’t familiar with the subject and lack such a map), we need tools.

Artisanal, hand-crafted tools.

Pilots follow checklists because, if they don’t, the experience of having done the same thing hundreds of times can create a false sense of security. That’s how you find yourself trying to land using your passenger’s baggage as the landing gear. A checklist is a cognitive tool. It keeps you from being inept.

The many physical tools that we, as a species, have produced to help us in our work and play are immediately obvious as such. A rake is obviously a tool, and so is a ball or an airplane. They’re tangible objects.

Intangible mental tools are harder to recognize: they are “insights” or “beliefs” or “education”. They range from something as simple as “smile back” to something as complex as differential equations, but they are necessary. And here is where Dunning-Kruger kicks in. When seeing a master working with a cognitive tool, it’s difficult to tell that there are tools in use at all.


Part of the recognition that complex cognitive tools are at work comes from degrees and distinctions, the trappings of a profession. The doctor, with the recognizable white coat and a framed diploma on the wall, exudes confidence, privy to the complex tricks and tools that enable diagnosis and healing. A doctor in a t-shirt and khaki shorts may be no less effective but is less likely to be listened to. Except, the doctor wearing a white coat might actually be more effective, and even better at their job, so long as they wear the coat.

Trust me, I got the coat AND the Crocs.

With sets of cognitive tools arranged into disciplines, the deference to certifications and recognitions makes sense. The honors bestowed after a significant period of education, certified by bodies whose interest lies directly in ensuring the public’s trust, are markers that the individual accredited to practice is, in fact, in possession of the tools necessary to practice.

The respect that comes with the office extends even as far as making peace with the unfavorable outcomes of treatment. If a patient dies under a doctor’s care, the family of the deceased tends to accept that there was nothing more to be done. The incidence of malpractice suits has been steadily declining for the past 10 years, and blame for medical errors tends to fall on nurses rather than doctors. The more education and the loftier the title, the more trust is given to the practitioner.

This is not universally true, of course, and in places like China, the threat of an angry family attacking doctors when the treatment goes awry is real and frequent. Of course, if you have ever been to a Chinese hospital, you may be surprised at the casualness of the experience, as thousands of people are pumped through daily. The barrier that pomp and circumstance provide to doctors in other places is almost entirely lacking. There is a distinct absence of faith-like reverence towards doctors, who themselves often act more like mechanics than exalted healers.

A traditional medicine practitioner, however, does seem to have an aura of magic around them. They touch and prod, listening in silence, an air of learned concentration surrounding them as they locate the various pulses of Qi. When a master solemnly declares you to be too “heaty” (or, rather, suffering from “hyperactivity of fire due to Yin deficiency“) before prescribing an herbal tea or a Qigong exercise, there seems to be little room for doubt or negotiation. The TCM master gets the recognition and respect for their suite of cognitive tools and their use.

The Paul Krugman Effect

Medicine is arguably a technical subject with a well-studied set of rules, refined over centuries, and with a feedback cycle fast enough that one can tell whether approach A or B works. Physical sciences, at their heart, are based on falsifiability and verification. But the same structure of deference and trappings of status has been imparted to decidedly non-scientific, near-scientific, and even pseudo-scientific disciplines.

Enshrining economics (a favorite whipping boy for hard-science snobs) in the same trappings of status as theoretical physics is silly. So is calling it a “science“. Not that the study of economics is not useful or worthwhile, but conflating the opinion of a Ph.D. in economics with the opinion of a physicist or a physician on matters relevant to their fields is … lazy.

Paul Krugman is a great example of the sheer overconfidence that the modern world imparts to someone with a degree. The Nobel-winning scholar and opinionated columnist has plenty of detractors, including himself. His undergrad textbooks, as far as I remember, were hard-to-follow tripe couched in a cloak of self-derived authority. Then again, maybe I’m just not intellectually sophisticated enough to follow the brilliant mind of Paul, a flaw I seem to share with the actual economy.


But any discussion of Paul Krugman, or of the economy (may it grow and bless us with Margaritas), becomes political. It crosses from concerns of correctness, the inquiry into “what is the causal relationship between A and B”, into a discussion of “who deserves share A and B”.

The practitioners of the discipline, of this set of cognitive tools, are inevitably corrupted by self-interest. Even the ones that are not are tainted by the possibility of corruption. Without “marking to market”, without getting feedback from some objective and impartial source, systems of cognitive tools cannot be dialed in precisely enough to be actually useful in the field. Superstitions and irrational, even potentially dangerous, beliefs can propagate for centuries if unchecked by the harsh reality of experience. Even medicine, until relatively recently, was a competition to see what new places we could find to apply leeches.

Economics works on time scales that do not lend themselves to easy causality chains, and the subject matter of the field is self-aware, and therefore self-adjusting, without ever really touching that harsh limit of reality. Some practitioners get lucky, as do some presidents, enjoying the fruits of a natural business-cycle boom while lauding themselves the entire time as economic geniuses.

In other words, without feedback, your tools might very well be broken. And the worst kind of broken, too. If a tool gives you nothing but failures when applied, it’s easy to see that it is broken and change your approach. If your tools give you a 50% chance of success, they are unlikely to be efficiently discarded in favor of a new approach. Some measure of success can get you invested in a set of fundamentally flawed or irrelevant tools, leading to overconfidence, and ruin.
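The 50-percent trap is easy to demonstrate with a short simulation. This is only a sketch: the `evaluate_tool` helper and the trial counts are illustrative assumptions, not anything from the text.

```python
import random

random.seed(42)  # reproducible illustration

def evaluate_tool(success_rate: float, trials: int) -> float:
    """Observed success fraction for a decision 'tool' whose
    underlying success probability is success_rate."""
    wins = sum(random.random() < success_rate for _ in range(trials))
    return wins / trials

# A fundamentally broken tool: underneath, it is a coin flip (50%).
# After only a handful of decisions, the observed rate can swing
# high enough to look like the tool is working...
short_run = evaluate_tool(0.5, trials=10)

# ...while only a long, honestly recorded history of outcomes
# exposes it as pure chance.
long_run = evaluate_tool(0.5, trials=100_000)

print(f"after 10 decisions:      {short_run:.0%} observed success")
print(f"after 100,000 decisions: {long_run:.1%} observed success")
```

A tool that failed every time would be discarded after a week; one that succeeds half the time can keep its owner invested for years, because the short run never looks conclusively bad.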

This is the flaw with a lot of “soft” science, as well as with every inevitably broke Vegas gambler with a “system”. Not knowing how the side-view mirror works, using it poorly, or having a warped mirror can actually get you killed.

So What?

  • Make your thinking explicit. A lot of decisions in business (and life) get made on a gut feeling, a hunch. Going with a hunch is, in fact, using a heuristic, a toolset. Without making these toolsets explicit, and acknowledging them at the very least to yourself, you are bound to introduce un-vetted assumptions and thinking driven by herd mentality or momentum. Sound reasoning and explicit assumptions behind a decision make it easier to find flaws and, if the decision was the wrong one, to actually refine the cognitive toolset for better efficacy in the future. Thinking through a problem can’t prevent you from making mistakes, or from making the wrong choice even for the right reasons. Making your assumptions explicit will, however, help you find out over time whether your process works, which is a lot better than guessing blindly through implicit heuristics or just trusting someone else to make the decision for you.
  • Seek feedback. The faster your cycle between making a decision and getting hard data for the results, the faster you can calibrate your decision-making toolset to actually work.
  • Take responsibility for your decisions. Expert opinion is valuable, but so are your own personal experience, insight, and assessment. Relying entirely on what the experts say, especially if the expert has no skin in the game, is a recipe for disaster. If you are doing business, becoming an expert, even through painful trial and error, is more useful in creating a long-term sustainable operational model than trying to “buy” expertise. Eventually, Paul Krugman will be wrong, and so will the herds of sycophants that parrot him. An executive’s job is to make decisions, so if you’re in control, the buck has to stop with you.