Adapted with permission from Agility: How to Navigate the Unknown and Seize Opportunity in a World of Disruption by Leo M. Tilman and Charles Jacoby, 2019, Missionday, Arlington, VA. Copyright © 2019 by Leo M. Tilman and Charles Jacoby.
Some years ago, a leading financial firm suffered significant losses in its value investing division. Value investing requires a great deal of conviction because it focuses on discovering assets that are, in the eyes of the investor, undervalued. So, naturally, if a portfolio manager already likes an asset and its price then declines, the asset becomes that much more attractive. In the case of our client, the internal analyses looked comprehensive and persuasive in arguing this exact point on multiple occasions – all while showing a complete disregard for disconfirming evidence. Data supportive of prior views was highlighted, while troubling new developments were deemphasized. As asset prices continued to decline, portfolio managers doubled down several times, until the losses were so significant that the company had to close out the positions.
During the post-mortem, one senior executive told us that this experience would have been inconceivable when the company was smaller: a significant price drop would have triggered the fight for risk intelligence. A formal and vigorous discussion involving senior executives and external resources would take place, with opposing views presented and carefully evaluated. This process disintegrated during the company’s explosive growth. Executives became distracted and boundaries of initiative became blurred: key decisions could be made without a rigorous debate – as long as their supporting arguments were coherent and well-documented.
Fostering The Pursuit of Truth: We are deeply convinced that in addition to specific capabilities and processes, agility requires a special organizational and cultural environment—a key aspect of which is what we call “The Forum of Truth.”
For an organization to operate as a Forum of Truth, it must legitimize doubt and dissent and prioritize reason over formal authority. The expectation of a free exchange of ideas; patient and respectful deliberation; and mutual learning and discovery must be fully and forcefully articulated, from the top all the way down. This must include the mandate to treat honest mistakes as opportunities for learning and improvement, not vehicles for humiliation or admonishment. The culture has to promote the understanding that “universal truths” and “objective reality” are extremely tricky, idealized concepts. It must also emphasize that because of differing goals, experiences, and risk equations, the same information may mean very different things to any of us vis-à-vis our colleagues, adversaries, and other players in our competitive ecosystem.[i]
The only way these norms will actually prevail throughout an organization is if they are exemplified by the senior leaders and institutionalized through processes, performance metrics, and incentives. The organization’s situational awareness and decision-making – and ultimately, agility – will be decimated if senior leaders suppress dissent or treat it as disloyalty; if they punish the messengers of bad news; if they exhibit willful ignorance; or if they use diffusion of responsibility or plausible deniability as excuses for inaction.
In many of the cases we’ve written about earlier, from the Korean War and Hurricane Katrina to the Fukushima Nuclear Disaster and MF Global, warning signals were ignored and risk equations were misjudged precisely because the disconfirming evidence was either intentionally suppressed or subconsciously dismissed. Similarly, the risk management failures that doomed the Challenger and Columbia shuttles did not stem solely from a lack of competence. The NASA culture – which discouraged dissent by ostracizing those who voiced concerns and placing the burden of proof squarely on them – played an equally significant role.[ii] In the wake of the disasters, NASA changed its approach: whenever concerns and warnings emerged, they were taken seriously, and teams were assigned to analyze them and offer recommendations.[iii] This change improved risk intelligence, decision making, and organizational cohesion.
Even when organizations declare the primacy of truth and create processes for vetting decisions, expressly for the purpose of giving them tough hearings, people often don’t voice opinions and counterarguments, or ask questions that could lead to important additional investigation. This is only partly due to everyone’s familiarity with hierarchical management that punishes dissent, in ways both explicit and implicit. Another reason why people often keep their potential input to themselves is their deep-seated concern with what Darwin called “the praise and blame” of their colleagues.[iv] This fear of reputational damage is at the core of groupthink, which, again, we have all witnessed – with one colleague after another agreeing with a plan or analysis that we could clearly see was invalid, or at least incomplete, and which we expected most of them could see as such too. That is why it’s critical that senior leaders position the pursuit of truth – and everyone’s active engagement in it – as both a job requirement and a social norm.
The challenges presented by this complex mix of behavioral biases, leadership practices, and organizational cultures are amplified by an additional powerful force that must be preemptively dealt with. Due to what is referred to in psychological research as human context dependency, the specific circumstances under which we are exposed to information, the way in which it is presented, and many other factors of the context in which we are making decisions have a huge effect on both the process of deliberation and its outcome.[v] The choice architecture – the ways in which information and possible solutions are presented for discussion – must therefore be carefully evaluated, especially when pivotal strategic, investment, and organizational decisions are made. Consider the following two examples:
- When Verizon and Sprint set out to reduce costs a number of years ago, the companies utilized different choice architectures. At Verizon, lines of business were instructed to propose cuts relative to prior years’ budgets. In contrast, at Sprint the unit heads were directed to start with a clean sheet and create a budget based on the expected benefits of expenditures and their role in advancing the firm’s strategy.[vi] In our experience, these alternative approaches usually lead to very different outcomes.
- As discussed earlier in the book, complex organizations are commonly exposed to large numbers of individual risks. Unless these risks are properly aggregated, boards and leadership teams end up judging the firm’s risk appetite and its alignment with goals and resources based on dozens of disparate line items. As we have witnessed on multiple occasions, very different conclusions may be reached based on the order in which risks are listed and the language in which vulnerabilities, consequences, and rare events are described. A previously “unacceptably high” risk profile may be deemed totally acceptable at the next board meeting if risks happen to be rearranged or framed differently – even with the best of intentions.
The organization’s ability to function as a Forum of Truth becomes especially important when it operates in environments dominated by powerful (and often erroneous) popular narratives. As economist Robert Shiller observed, many economic and financial phenomena – such as recessions, asset bubbles, or financial crises – can be linked to viral “epidemics of ideas” deliberately used by some people and unwittingly adopted by others.[vii] Such narratives, which are “mixtures of fact and emotion” based on “varying degrees of truth,” can become juggernauts and must be detected and proactively debunked. In the buildup to the Great Recession, for example, the dominant narrative that “housing prices across the United States never decline simultaneously” created an over-reliance on geographical diversification as a risk mitigator. This narrative – exploited by some and not questioned by others – permeated a wide range of investment products, corporate practices, and credit ratings, setting in motion a lethal vicious cycle.
When a concerted fight for risk intelligence takes place in an environment of trust that prioritizes evidence, deliberation, and honesty, organizations become positioned for agility. They examine the evidence from multiple perspectives; detect invalid assumptions and potential landmines; and challenge themselves about what they know and don’t know. This enables them to gain situational awareness, effectively assess environmental changes and risk equations, and shape good responses. From this perspective, agility applies to organizations and investment processes in very similar fashion—all of which is especially relevant to value investors.
[i] Of course, as opposed to honest mistakes, violations of the norms must be consistently and swiftly punished. This includes attempts to humiliate or demean colleagues under the guise of honesty and transparency—a sure way to decimate the Forum of Truth, engagement, and trust.
[ii] We owe this observation to Maj. General John Barry, who served on the Columbia accident investigation.
[iii] A number of multidisciplinary approaches have proven useful in this regard, such as the “red team/blue team” approach to war games originally conceived by the national security community (http://bit.ly/2HJwMm4), as well as so-called premortems – formal exercises that imagine an initiative has failed and then perform a typical postmortem (http://bit.ly/2KdBdHx).
[iv] Charles Darwin, The Descent of Man, and Selection in Relation to Sex (London: John Murray, 1871). As a noteworthy case in point, the executive team of a leading US company that we worked with was shocked to discover that the firm’s lack of innovation stemmed not from employees’ anxiety about undermining their bonuses or promotions if ideas didn’t work, but rather from a fear of “losing face” in front of their superiors and colleagues, http://bit.ly/2I5nAra.
[v] Kahneman, Thinking, Fast and Slow, pp. 413 and 225.
[vi] Verizon presentation at the Goldman Sachs investor conference (2017).
[vii] http://bit.ly/2Wefarg. Interestingly, Shiller has also argued that the new “post-truth” culture dominated by modern information technology and social media may be even more susceptible to non-factual narratives. Narrative psychology is also related to the psychological concept of framing, as exemplified by the work of Thaler, Kahneman, and Tversky.