Preachers, prosecutors, politicians and scientists
Jonathan Greenwold
When debating business ideas we often fall into one of three categories. Jonathan Greenwold looks at how to identify these and pivot towards a more evidence-based, analytical approach.
Imagine this scenario - you’ve just presented your new revenue-raising idea to your management team. It’s been a long, slightly fractious discussion, and it hasn’t gone your way - the team isn’t supporting your idea. So frustrating – you did your best to sell it, but the CFO shot it down and the others fell into line with her.
Afterwards, you reflect on the meeting – where did it go wrong, and how can you do things better next time?
This is where some concepts developed by Wharton organisational psychologist Adam Grant can help.1
When promoting your idea, you were being a Preacher - arguing your point of view based on a set of prior beliefs.
It looks like the CFO was in Prosecutor mode - calling out the flaws in your reasoning, marshalling arguments to prove you wrong and win her case.
The others were Politicians - currying favour to try and win approval from colleagues.
The good news is that there’s a better way – one where the group acts more like Scientists. Scientists discard their prior assumptions, beliefs and self-interest in favour of a more data-driven and open-minded approach, treating work as a series of experiments: hypotheses tested through a disciplined process of building, measuring and learning.
Making decisions - the scientific way
The benefits of this more humble, scientific approach were demonstrated in a 2017 Italian research study. In the study, 116 start-ups were trained to seek market feedback on their ideas and then divided into two groups. One group was instructed to apply a scientific approach – testing the mechanisms affecting their products’ performance and predicting future net revenues. The second, control group used the more usual informal rules of thumb, impressions and intuitions.
The group using the more scientific method performed significantly better:
- They pivoted more often, updating their business approach in light of new information
- They felt more confident in making big decisions
- They earned more revenue
Superforecasting the future
We can also see the success of this approach in the superforecasting method pioneered by University of Pennsylvania professor Philip Tetlock.2 Superforecasters treat their ideas as “perpetual beta – a computer program that is not intended to be released in a final version, but will instead be used, analysed, and improved without end.” They are willing to continually revise their understanding, assumptions and expectations in an open-minded way as new information arrives.
Superforecasters have made accurate forecasts about the coronavirus pandemic, the Brexit vote and Donald Trump’s unexpected victory in the 2016 Republican primary. Contrast this with the approach of a typical pundit - usually a Preacher with an overarching worldview which they apply more or less consistently to each scenario they encounter. The Preacher’s approach gives you a ready response to each new situation but is less effective than challenging yourself to revise your opinions in light of new information.
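This habit of revising a forecast as evidence arrives can be sketched numerically. Bayes’ rule is one common formalisation of “updating in light of new information” – the sketch below uses invented numbers and is an illustration of the general idea, not a procedure Tetlock prescribes:

```python
def bayes_update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Return the updated (posterior) probability after seeing one piece of evidence."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Hypothetical example: you believe an event is 40% likely, then observe
# evidence that is twice as likely to occur if the event is true (0.6 vs 0.3).
p = 0.40
p = bayes_update(p, likelihood_if_true=0.6, likelihood_if_false=0.3)
print(round(p, 3))  # → 0.571: the forecast moves up, but not all the way to certainty
```

The point of the arithmetic is the superforecaster’s temperament: each new fact nudges the estimate proportionally to how diagnostic it is, rather than flipping the forecaster between certainty and denial.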
Harnessing the crowd within
If you embrace the scientific method and move away from, say, being a Prosecutor or Preacher, this allows you to acknowledge the fundamental variability or “noisiness” of human judgement and potentially even benefit from it.
An increasing body of psychology research has found that you can use noise to good effect – harnessing the wisdom of the “crowd within”. In a seminal 2008 study, participants were asked difficult general knowledge questions, all of which required an estimate of a percentage, e.g. what percentage of the world’s airports are in the USA?3 Participants were then unexpectedly asked the same questions again, either immediately or three weeks later.
Those who answered twice in quick succession produced very similar answers; those who answered three weeks apart produced very different ones. When each person’s two answers were averaged, this not only cancelled out the variability, it also dramatically increased the quality of the answers – particularly compared with the group that answered in quick succession. The message is that if we take a more scientific approach and acknowledge our own fallibility and noisiness, we’ll improve the accuracy and consistency of our judgments.
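A small simulation illustrates why averaging helps. The numbers below are invented (not the study’s data): each guess is modelled as the truth, plus a personal bias, plus independent random noise – averaging two guesses cancels some of the noise but not the bias:

```python
import random

random.seed(42)

TRUE_VALUE = 30.0  # hypothetical true percentage for a quiz question

def guess(bias: float, noise_sd: float) -> float:
    """One estimate: the truth plus a personal bias plus random noise."""
    return TRUE_VALUE + bias + random.gauss(0, noise_sd)

n = 10_000
first_err = avg_err = 0.0
for _ in range(n):
    bias = random.gauss(0, 5)   # each simulated person's systematic bias
    g1 = guess(bias, 10)        # first answer
    g2 = guess(bias, 10)        # second answer weeks later (independent noise)
    first_err += abs(g1 - TRUE_VALUE)
    avg_err += abs((g1 + g2) / 2 - TRUE_VALUE)

print(f"mean error of first guess: {first_err / n:.2f}")
print(f"mean error of the average: {avg_err / n:.2f}")
```

Running this shows the averaged answer has a noticeably smaller mean error than either single answer – the “crowd within” at work. A longer gap between answers helps because it makes the two noise terms more independent.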
There is a parallel here with artificial intelligence – arguably there is nothing intrinsically “intelligent” about AI. The secret of AI’s success is its ability to learn from past mistakes and to do so at an incredible scale.
On the other hand, if we play the part of the Preacher, Prosecutor or Politician then we are less likely to succeed.
The AMX laboratory
When we first built the AMXConnect Store, AMX’s online marketplace for financial services, we did so on a minimum viable product basis – treating the project as an experiment. For example, we hypothesised that sellers would be the first adopters rather than buyers, so we focused on that side of the market. We limited the initial version of the Store to a basic directory listing rather than building out complex functionality at the outset. This has proved fruitful so far, with over 50 service providers offering 145+ services. Our next hypothesis was that asset managers would likely be our first group of buyers. This represented a departure for AMX, as in the past most of the buyers of our services have been fund investors.
We are also running an experiment in hybrid working – ‘AMXNext’. We have given up a permanent office and instead rent office space for in-person meetings for the management team once a week and the wider team once a fortnight, without making a long-term commitment. Prior to this, we were working completely remotely during the pandemic but received feedback that the team wanted more in-person interaction. Our experiment is still ongoing. We are surveying colleagues to get a data-driven view of their experiences to help us improve how we collaborate.
A toolkit for better decision making
The next time you find yourself debating with colleagues, perhaps ask yourself whether you’re doing so as a Preacher, Prosecutor, Politician or Scientist. Are the discussions framed as one person’s view being “right” and others’ being “wrong”, or as an open-minded exploration of the possible that is almost certainly going to have to be revised in the future?
This isn’t easy – current neuroscience tells us that our brains are wired to make predictions about the world based on our past experiences rather than objectively evaluating the information we receive. As the neuroscientist Lisa Feldman Barrett has written, “what you see, hear, smell and taste … are completely constructed in your head”. Your brain issues predictions and checks them against the sense data coming in from the world and your body. Unfortunately, because these predictions are based on your past experiences, they are often wrong when you encounter something new. Even worse, because it takes energy to rigorously check our predictions, the brain often runs on autopilot and doesn’t do a great job of correcting prediction errors.4
An example – when in 2015 the US Justice Department analysed eight years of shootings by Philadelphia police officers, its report contained two sobering statistics: 15% of those shot were unarmed, and in half of these cases an officer had misidentified a “nonthreatening object (e.g. a cellphone) or movement (e.g. tugging at the waistband)” as a weapon.
What the science is saying here is that in the heat of the moment, given police officers’ past experience they actually see a weapon when none is present. The brain is wired for prediction rather than reacting to and processing reality as it really happens.5
Hopefully the stakes in your boardroom won’t be as high as they are for the Philadelphia police, but the point stands – it’s likely that a lot of us get a lot of things wrong a lot of the time, especially in stressful, time-pressured situations. So if we can pull together our various independent views and embrace a willingness to update those views as we gain more information, then we are more likely to separate the signal from the noise.
1 Think Again: The Power of Knowing What You Don’t Know by Adam Grant
2 Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner
3 Measuring the Crowd Within, Psychological Science, 2008, by Edward Vul and Harold Pashler
4 Seven and a Half Lessons About the Brain by Lisa Feldman Barrett
5 Applying the Theory of Constructed Emotion to Police Decision Making, Frontiers in Psychology, 11 September 2019, by Joseph Fridman, Lisa Feldman Barrett and others