I'm going to use the sunk cost fallacy as an example.
You're working in your garden, and you've dug a hole in the wrong place. It's straightforward to recognize that you shouldn't let the existence of the hole impact what you do next: the value of the hole is 0. Yes, you don't want to dig another hole; yes, you feel some loss aversion. It's the sunk cost fallacy, of course. Fortunately, I think most of us are inoculated against this particular flavor of irrational thinking: you know the correct decision is to fill up the hole and dig a new one, in the right place.
That said, I don't think most of us really understand how far-reaching the second-order effects of this sort of thing can be. For the sunk cost fallacy, in particular, that might go something like this:
You like the hole, so you unthinkingly come up with reasons to give the hole some value: maybe the hole is okay here. Maybe you can just plant different plants.
You like the hole, so you start to quietly contort your original objective: maybe this is a better place for the hole after all, given some modified constraints. Your mind is quietly weaving an imagined world where the hole is legitimately placed.
The value calculation thus becomes ambiguous. The value of the current hole is now nonzero, and through some impressive mental gymnastics, you've convinced yourself that the incremental value of digging a new hole is negligible.
I'm using the sunk cost fallacy in particular because I tend to come across it quite often in startup land, but this sort of second-order metastasis extends to any logical fallacy. We are at once highly rational and highly irrational beings, and as such, we can be quite exceptional at convincing ourselves that our lies are the truth.
How bad thinking turns into other things
It's interesting to scrutinize what's happening here in our brains: sunk cost imbues bias within us, and bias has this curious tendency to ramify. It starts as a simple overvaluation of a particular position (in the case of sunk cost, the position we've invested in thus far), but it'll seep into the way we construct the arguments for and against this or other paths, into our attitude when making decisions in that regard, into the color of our words as we discuss this with others. Specifically, I think these effects can be broadly grouped into two categories:
Bad arguments.
Bias tends to masquerade as bad arguments. You come up with reasons why the wrong decision makes sense, and it just makes the web of decision-making bigger and bigger; worse, it's around a decision that doesn't even warrant so much consideration.
Destabilizing arguments.
And then, sometimes bias just manifests as feelings. I believe in the value of subjective arguments, of course, but when used as vehicles for bias, they can be highly destabilizing to a discussion. If something "feels bad", it can swing the balance of an argument by any arbitrary magnitude. (Side note: this is why I typically try to convert qualitative measures to quantitative ones: if the bad feeling can be translated to "increased cognitive load and 1 lost day of focus", it becomes measurable and comparable.)
The crux of the problem is that there are always plausible reasons to reject rational arguments. You can always perform mental gymnastics to convince yourself that the hole is in the right place. Hell, that the hole is in a better place than it was before.
Why this is so bad: you waste time making bad decisions
This hornet's nest of arguments leads to two terrible things: wasted mindshare and bad decisions. You're disputing decisions that shouldn't be reconsidered, and you're increasing the likelihood of making the wrong choice; you've entered into some sort of inverse Monty Hall reality.1 In my clearly fictitious hole example, the correct choice is not so difficult to discern. But this can be one hell of a web to disentangle, particularly when, in real-world situations, multiple people are involved and decisions were never clear-cut.
One of the most dangerous places in which this can transpire is startup land. When you're building a company, your entire life is navigating a complex, multi-dimensional idea maze: you have an infinity of possible choices every day, and the objective is not only to make good decisions, but to choose which decisions are even worth making, in the hope that they lead you to some modicum of success. It's hard enough to navigate that decision space, but it's even harder if you let bias quietly creep into the conversation.
In essence, irrational thinking tends to build up traps in rational processes, and as bad thinking metastasizes, you'll find you're dealing with a lowered probability of finding the right path through. I've always recognized intellectual honesty and mindful self-awareness as critical to improving one's ability to make good decisions, but all this is what can happen if you let their counterpart, unscrutinized bias, run entirely unchecked. You end up with a decision-making process that's filibustered with bad arguments until it produces bad decisions.
1. E.g. you want to pick a goat instead of the car.