Causal Complexity and the Fallacy of the Single Cause

Causal complexity means that in almost any situation there are many causes behind a single event, including a good deal of randomness and unknown factors. Getting to grips with causal complexity is indispensable for effective strategic and analytic thinking, and especially for our ability to diagnose a problem, to understand which factors are responsible for it, and to see what stands in the way of possible solutions. Yet our minds are inclined to look for and settle on a single cause, neglecting any causal complexity. This cognitive bias is known as the fallacy of the single cause, sometimes also called causal reductionism or causal oversimplification.

Causal Complexity

Almost any situation is driven by multiple causes, and such causal complexity underlies even situations or events that appear to be fundamentally clear-cut. There is always a mixture of factors bringing about an event. The great Russian writer Leo Tolstoy beautifully illustrated this point in his classic novel War and Peace (1869), showing that there is a complex of causes even behind an apple falling from a tree, or behind Napoleon's invasion of Russia in 1812 and his subsequent retreat:

When an apple has ripened and falls, why does it fall? Because of its attraction to the earth, because its stalk withers, because it is dried by the sun, because it grows heavier, because the wind shakes it, or because the boy standing below wants to eat it?
Nothing is the cause. All this is only the coincidence of conditions in which all vital organic and elemental events occur. And the botanist who finds that the apple falls because the cellular tissue decays and so forth is equally right with the child who stands under the tree and says the apple fell because he wanted to eat it and prayed for it. Equally right or wrong is he who says that Napoleon went to Moscow because he wanted to, and perished because [Czar] Alexander desired his destruction, and he who says that an undermined hill weighing a million tons fell because the last navvy struck it for the last time with his mattock. In historic events the so-called great men are labels giving names to events, and like labels they have but the smallest connection with the event itself.[1]

The Fallacy of the Single Cause

The fallacy of the single cause assumes that an outcome is brought about by a single cause, ignoring any causal complexity. This fallacy is particularly easy to notice in the media. Political pundits will ask what caused the most recent crisis in the Middle East. Economists will argue about the cause of increasing poverty in the Third World. Business analysts will ask what has made Apple so successful. Such discussions imply that there is a single cause responsible for the outcome. Many so-called strategy experts are equally prone to this oversimplification bias, often suggesting that the sole cause of a successful political or marketing campaign is the leader's strategic intent or some similar single factor.

Wired for Causal Oversimplification

The realization that people fall for causal oversimplification is not new. In Leviathan (1651), Thomas Hobbes pointed out that people usually perceive only immediate causes:

Ignorance of remote causes disposeth men to attribute all events to the causes immediate and instrumental; for these are all they perceive.[2]

Evolutionary psychology sheds some light on this bias, suggesting that our minds are wired to represent the factors we can manipulate and to discount those we cannot, even when the latter are more important. Using Tolstoy's example of the falling apple, evolutionary psychologist John Tooby explains:

. . . our minds evolved to represent situations in a way that highlighted the element in the nexus [i.e. causal connection] that we could manipulate to bring about a favored outcome. Elements in the situation that remained stable and that we could not change (like gravity or human nature) were left out of our representation of causes. Similarly, variable factors in the nexus (like the wind blowing) that we could not control, but that predicted an outcome (the apple falling), were also useful to represent as causes, in order to prepare ourselves to exploit opportunities or avoid dangers. So the reality of the causal nexus is cognitively ignored in favor of the cartoon of single causes. While useful for a forager, this machinery impoverishes our scientific understanding . . . [3]

Of course, such evolutionary wiring not only impoverishes our scientific understanding but also undermines our strategic thinking and problem solving.

The basic takeaway, then, is to resist causal oversimplification and to try to see the full causal complexity. In practice, it is useful to rely on mental models such as tension systems, and on analytic techniques built on them, such as force field analysis, sketched below.
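Force field analysis, originally developed by Kurt Lewin, makes this causal plurality explicit: instead of hunting for the single cause, the analyst lists the driving forces pushing a situation toward a change and the restraining forces holding it back, rates each one, and weighs the two sides against each other. The following minimal Python sketch illustrates only the bookkeeping of such an analysis; the particular forces and their scores are hypothetical placeholders that an analyst would supply for a real problem.

from dataclasses import dataclass

@dataclass
class Force:
    name: str
    strength: int  # analyst's rating, e.g. 1 (weak) to 5 (strong)

# Hypothetical driving forces: factors pushing toward the proposed change.
driving = [
    Force("customer demand for the new product", 4),
    Force("competitor pressure", 3),
    Force("management support", 2),
]

# Hypothetical restraining forces: factors resisting the proposed change.
restraining = [
    Force("retraining costs", 3),
    Force("staff skepticism", 4),
    Force("regulatory uncertainty", 2),
]

def total(forces):
    # Sum the analyst's ratings for one side of the tension system.
    return sum(f.strength for f in forces)

if __name__ == "__main__":
    d, r = total(driving), total(restraining)
    print(f"Driving forces:     {d}")
    print(f"Restraining forces: {r}")
    print("Net tendency:",
          "toward change" if d > r else "against change" if r > d else "balanced")

The point is not the arithmetic, which is trivial, but the discipline: the template forces you to enumerate several causes on each side of the tension before drawing any conclusion, which directly counteracts the pull toward a single cause.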

Notes

[1] Leo Tolstoy, War and Peace (1869), Book IX, Chapter 1.

[2] Thomas Hobbes, Leviathan (1651), Chapter XI.

[3] John Tooby, "Nexus Causality, Moral Warfare and Misattribution Arbitrage".