“If anyone can refute me — show me I’m making a mistake or looking at things from the wrong perspective — I’ll gladly change. It’s the truth I’m after, and the truth never harmed anyone. What harms us is to persist in self-deceit and ignorance.” — Marcus Aurelius in Meditations, Book 6:21 (translated by Gregory Hays, The Modern Library, New York)
On foresight and the role of doubt
During a recent exploration into the possible futures of a consulting firm, I was, again, struck by the lack of doubt people display, including the firm’s leadership team. Not knowing, or being wrong, seems to be regarded as a vulnerability, or even as weak leadership. To me, however, as a ‘perpetual doubter’ in constant search of alternative points of view and opinions, this isn’t merely counterintuitive; I also find it counterproductive, especially when dealing with foresight, as it prevents inquisitiveness and exploration.
Doubt is inherently human, and a precondition for inquiry and learning. It allows us to question ourselves and the world around us, and promotes a healthy debate about controversial issues. What people deny today, based on belief, may be undeniable tomorrow. So, it’s best to keep an open mind.
“It belongs to every large nature, when it is not under the immediate power of some strong unquestioning emotion, to suspect itself, and doubt the truth of its own impressions, conscious of possibilities beyond its own horizon.” — George Eliot
In How foresight creates unforeseen futures: the role of doubting, published in 2004 in Futures (Volume 36, Issue 2, pages 253–266), Deborah Blackman and Steven Henderson define foresight as a mental model about the future. They argue that genuine attempts to refute, rather than confirm, these mental models will lead to lower expectations and certainty, thereby ‘opening’ the organisation and enabling our mental models about the future to become more accurate. Or, to put it simply: doubt increases our chances of getting it right.
The success of processes of organisational foresight depends on the collective interpretation of information and the choice of appropriate actions. When faced with a lack of clarity, the prevailing idea is that gathering more information will directly result in better learning, and thus more accurate foresight. This would imply that the future is sufficiently determined by what can be seen in the present to be accurately known and understood. Although this may be true in special cases, most of the time the relationship between the present and the future is shaped by the mental models used to guide our actions.
Furthermore, Blackman and Henderson are concerned about the freedom with which ‘new’ information is allowed to flow into the system. Even ‘open’ organisations — those that permit information to enter, and can be changed and affected by such information — are, in reality, far less open, because all information is subject to filtering. What gets through is often determined by the knowledge already available within the system. Often, only information that supports the prevailing worldview will be encouraged.
Blackman and Henderson don’t claim that the mental models, and hence the foresight, will be incorrect or inaccurate. What they suggest, however, is that the process of creating and sustaining these foresights is a direct contributor to the kind of future that the firm faces.
Often, when organisations are confronted with information that differs from their existing mental models, they question its source, accuracy or relevance. A second line of defence is to treat the deviating information as an exception. This is what the philosopher of science Karl Popper called a ‘conventionalist stratagem’ — a twist to rescue one’s theory.
“To doubt is sometimes seen as a problem as, especially in the case of global scepticism, the actors can be left unsure of any certainties and, therefore, of how to act. However, without doubt of some nature, knowledge will not be tested and inaccuracy will stay within the system (unless the mental model was actually true). A problem when developing new mental models is that what is currently knowledge is sometimes treated as infallible truth.”
The problems Blackman and Henderson identify with developing organisational foresight arise because learning processes begin from existing mental models and objectives in the present, and check for deviations in the future that can only be detected by information from that same present. Instead, we should apply Popper’s logic of scientific discovery and start by doubting our existing mental models; not because of events creating a potential difference, but as a matter of discipline. They call this ‘double loop doubting.’
“The process begins by doubting the existing mental model, and devising checks whereby it could be falsified. Should the test fail to falsify existing mental models, it may be taken as supporting evidence and used to think about the future; that is, developing foresight. Should the test suggest the model is wrong, existing ideas should be rejected, and be, at best, one source for new conjectures about what is the true state of affairs.
“A mental model that cannot be falsified by a test should be rejected. It may be that the mental model is insufficiently developed, tacit, or rendered too plastic by conventionalist stratagems, so that it is impossible to think of circumstances where it can be shown to be false. Further, in other situations, the mental model may have significant political loading that makes doubting a form of subversion or dissent. No doubt these problems could be eased by falsification processes and a reframing of the mental models, providing that goals and visions are not strongly shared.
“Mental models relating to the future and foresight will need testing until they pass pertinent tests although, naturally, the process is never complete. Thus organisational foresight will be based upon a pragmatic acceptance of working hypotheses and conjectures that can be falsified, rather than a stock of beliefs, in the form of shared mental models, that are accepted as true.
“The practical implications of this for effective foresight are that organisations need to become less self-referential, which will undermine the chances of experiencing a chaotic or tragic future. This will be possible by having less focus upon sharing and dialogue and more upon challenging and doubting. Further, doubting, testing and removing rules of thumb can create uncertainty and differences, which increase the opportunity for heroic futures [see note below]. These processes need to be iterative and ongoing in order to ensure that new explanations do not develop erroneously — heroic futures require a constant move into the unknown. Managers should be wary of consensus: instead of indicating certainty, it may only herald closure.”
This process of ‘double loop doubting’ will ‘no doubt’ lead to less complete and more contingent mental models about the future — working guesses rather than clearly articulated visions. These limited versions of foresight may appear to offer fewer ‘facts,’ but they also create fewer false certainties about the future, offering a far greater possibility for what Blackman and Henderson call ‘heroic futures.’
“We can’t live in a state of perpetual doubt, so we make up the best story possible and we live as if the story were true.” — Daniel Kahneman
[note] In their paper, Blackman and Henderson define four types of futures: the imperious, tragic, heroic, and chaotic (pages 256–257). Heroic futures are created by actors who take action in advance of others, based on belief in their own foresight. They don’t depend upon more or better information; it is, rather, the interpretation and the actions taken that create novelty. Henry Ford’s vision of an affordable motor car would be one such example.