Why your decisions may not be as rational as you believe


If you believe your decisions are rational, you’re probably not listening to reason. Hidden biases and faulty thinking unbalance the decision-making process without us being aware of their influence.

Who shot JFK? Was it Lee Harvey Oswald, alone in the Texas School Book Depository? Was it the Cubans? The Soviets? The CIA? Or could it have been Lyndon Johnson, who succeeded JFK and therefore had the most to gain? Since Kennedy’s assassination in 1963, a small industry has grown up dedicated to unravelling the circumstances of his death. According to ABC, in the last 50 years more than 2,000 books have been written on the subject. Each amasses an argument and presents the evidence to support its case. Or, more accurately, each begins with a conclusion and then seeks the evidence to support it.

We are Pleased to Confirm

This is an example of confirmation bias, and it illustrates a startling flaw in our reasoning. A large proportion of our decisions are based not on a clear reading of the facts, but on hidden biases and dodgy reasoning. And if you think that only applies to our private decisions, think again. Decision-making in business can be just as irrational, and is subject to a whole range of factors that distort our better judgement.

In his book Thinking, Fast and Slow, the Nobel Prize-winning behavioural psychologist Professor Daniel Kahneman notes that we have two modes of thought. One is logical, analytical and unhurried, and we’re aware of its gears grinding away whenever we approach a complex problem. The other is fast, intuitive and hidden from us. Bad decisions and the biases that drive them are a result of us thinking fast when we need to think slow.

Here’s a quick example. Participants in a study were asked to choose between two treatments for 600 people suffering from a deadly disease. Treatment A, which was positively framed as ‘saving 200 lives’, was dramatically favoured over treatment B, which was negatively framed as ‘leaving 400 to die’. Read that sentence again and it’s obvious that both treatments generated exactly the same result. It’s simply that A puts the good news first.

It’s not just external factors – how information is presented and by whom – that skew the decision-making process. ‘Heuristics’ is the name for the mental short-cuts we take when we need to make a decision quickly: rules of thumb, general principles, the sense that this kind of thing usually works like that.

Misapplied heuristics, however, cause us to give undue weight to the wrong information. Ever seen Crimewatch on TV and then double-locked the front door, convinced there’s a vicious gang roaming your neighbourhood? That’s the ‘availability heuristic’ at work: granting special prominence to a thought on the basis of something you’ve recently encountered, regardless of its actual relevance.

In an example from Kahneman’s own work, ‘Steve’ is described as a shy, quiet, withdrawn kind of guy, very concerned with detail. Is he likely to be a farmer, a salesman, a pilot, a librarian or a physician? According to the ‘representativeness heuristic’, we might well decide Steve is a librarian because he conforms to a stereotyped notion of what librarians are like. It’s a fast but shallow form of decision-making with serious implications, particularly when applied to race or gender.

When Time Isn’t Money

Underpinning many of our everyday decisions is a subconscious calculation of current expediency versus future gain. Or, to put it another way: immediate reward beats delayed gratification (almost) every time. It’s one of the reasons smokers carry on smoking and dieters reach for the cake.

This is hyperbolic discounting, and the discount refers not to money but to time, which becomes harder to imagine the further away it is. Even when we understand the long-term implications, we’re still more than capable of writing off the future; procrastination – understanding the options and then deciding not to do what’s important right now – explicitly shifts the burden of responsibility to tomorrow. By which point, the chances are, whatever unenticing task we’re avoiding will have got even bigger and less appealing.

One of the most famous studies of short- versus long-term decision-making is the Marshmallow Test, a study conducted at Stanford in the late 1960s and early 1970s and followed up over subsequent decades. In the initial test a group of four-year-olds were offered one marshmallow now, or two in 15 minutes’ time. The original researchers described the kids’ agony as they grappled with gratification and deferral. A third took the marshmallow right away. But what transpired as the study’s participants were tracked over the following years was that the kids who’d waited for their second marshmallow generally fared better in life. They had greater self-control and stronger motivation. By valuing the future, their decision-making in the present improved.

It sounds obvious, yet for many of us a bird (or a marshmallow) in the hand will always be worth two in the bush. Hyperbolic discounting means we favour a smaller, guaranteed reward now rather than a bigger (and, we may feel, notional) one in the future. Research indicates that most adults, if offered £50 today or £100 in a year, will opt for the lower sum now. Apply that principle to pension savings, health insurance or borrowing, and we’re making decisions we could live to regret.
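The trade-off above can be sketched numerically. In a common model of hyperbolic discounting, a reward A received after a delay D is valued at A / (1 + kD), where k is a person-specific impatience parameter; the value of k used below is an arbitrary assumption for illustration, not a figure from the research.

```python
# A minimal sketch of hyperbolic discounting: the felt value of a
# reward shrinks the further away it is, and shrinks faster for
# more impatient choosers (higher k). The k values are assumptions.

def discounted_value(amount, delay_years, k):
    """Hyperbolic discount: amount / (1 + k * delay)."""
    return amount / (1 + k * delay_years)

# £50 today needs no discounting; £100 in a year does.
now = discounted_value(50, 0, k=1.2)       # 50.0
later = discounted_value(100, 1, k=1.2)    # 100 / 2.2 ≈ 45.5

print(now > later)  # True: the smaller, immediate reward 'feels' bigger

# A more patient chooser (smaller k) would wait for the £100.
print(discounted_value(100, 1, k=0.1) > 50)  # True: 100 / 1.1 ≈ 90.9
```

With a steep enough discount curve, the guaranteed £50 today outweighs £100 in a year – exactly the pattern the research describes.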

How Confidence Tricks

In 2012 Kodak, a brand so synonymous with photography that for 50 years we’d measured our lives in ‘Kodak moments’, filed for bankruptcy. It wasn’t because the company hadn’t kept up with technology. On the contrary: in the 1970s Kodak developed the first digital camera and pioneered mobile communications. Had history taken a different course we might now all be carrying k-Phones. So what irrational decision-making process led to the downfall of the world’s biggest photographic brand?

After the company built a prototype digital camera in 1975, executives became worried that it posed a threat to their core business: traditional cameras and film, of which Kodak then had a huge market share. “That’s cute,” is how engineer Steven Sasson described the company’s reaction to filmless photography. “But don’t tell anyone about it.” Despite extensive market research, Kodak failed – or refused – to grasp the significance of its own invention.

Kodak’s disastrous decision is partly explained by loss aversion – decision-making on the grounds of what stands to be lost, rather than looking at potential gain.

Hindsight, of course, is a wonderful thing. (It’s also another bias: now that something’s actually happened, I realise I always knew it would.) But at the time of Kodak’s collapse, commentators also pointed to prolonged overconfidence in its leaders, grounded in the company’s earlier success.

That’s nothing new. In 1876, Western Union had a monopoly on telegraph communication. Alexander Graham Bell, inventor of the telephone, offered Western Union the patent for his invention for $100,000. Supremely confident that the telegraph wouldn’t be toppled, Western Union’s president reportedly replied: “After careful consideration of your invention, while it is a very interesting novelty, we have come to the conclusion that it has no commercial possibilities. What use could this company make of an electrical toy?”

Is this Decision Taken?

Overconfidence of this sort is, itself, a bias. Western Union couldn’t envisage a time when they weren’t going to be right. But, as any gambler probably won’t be able to tell you, there’s no such thing as being on a roll. At each throw of the dice, the odds reset. The conviction that past frequency affects future outcome is called the gambler’s fallacy, and it’s evident in the behaviour of investors who liquidate their stock precisely because its value keeps on rising and they fear an imminent crash. (If enough of a company’s investors do the same, eventually they’ll be proved right.)
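The fallacy is easy to check by simulation: a fair die has no memory, so the chance of rolling a six immediately after a run of three sixes is still one in six. The roll count and seed below are arbitrary choices for this sketch.

```python
# A small simulation of the gambler's fallacy: does a six become less
# likely after a streak of sixes? (Spoiler: no - each roll is independent.)
import random

random.seed(0)          # fixed seed so the sketch is repeatable
rolls = 200_000
history = []
after_streak = []       # outcomes that immediately follow three sixes in a row

for _ in range(rolls):
    r = random.randint(1, 6)
    if history[-3:] == [6, 6, 6]:
        after_streak.append(r)
    history.append(r)

freq = after_streak.count(6) / len(after_streak)
print(round(freq, 3))   # close to 1/6, streak or no streak
```

However long the streak, the frequency of a six on the next roll stays near 1/6 – which is exactly why "due for a crash" is not, on its own, a reason to sell.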

So who can we trust to make smart decisions? Before we stop and ask an expert, let’s consider illusory superiority, a bias which prompts people to overestimate their competence – and perhaps to take personal credit for teamwork. That doesn’t just make them annoying to be around. From a managerial perspective it raises the likelihood of inappropriate risk-taking and lowers the chances of investment in proper research that might end up proving them wrong. The Dunning–Kruger effect ramps this up a notch and describes the tendency of the least skilled to overrate their ability most. “Ignorance,” as Charles Darwin noted, “more frequently begets confidence than does knowledge.”

In his 2005 book Blink, Malcolm Gladwell introduces the notion of ‘thin-slicing’: making fast decisions based on limited information or experience. Thin-slicing, for Gladwell, is the power of the glance. Great basketball players, he says, have ‘court sense’ – an intuitive awareness of what’s happening around them. So do successful military leaders, reading the combat zone. But unless you really are a practised expert for whom certain forms of decision-making have become second nature, there’s no guarantee those instincts are going to be right, and the door remains open to other, less helpful, biases.

If, after all that, you think none of this applies to your own decision-making process, it may be because our subconscious delights in creating cover stories which enable us to justify bad choices to ourselves. One more slice of cake. It’s fine. Not only do I deserve it after a day like that, I’ll be setting the alarm for the gym tomorrow. Actually, I think I’ll have two. This is Kahneman’s fast-track mode of thinking telling the slow mode to get off its back. When we want to defend a biased decision to ourselves we can be such extraordinarily creative thinkers that we may wonder whether we really ‘make’ certain decisions at all.

In her 2010 book The Art of Choosing, psychologist Sheena Iyengar acknowledges that while we can’t be free of biases, we can at least train ourselves to recognise them. She suggests reflecting on heuristics and looking for evidence to disconfirm assumptions. “Though you won’t always be able to engage in extensive reflection before you make a choice, it’s worth your while to reconsider the choice later on. You may not be able to change it, but if you can discover that you made an error, you can avoid repeating the mistake in the future.” Still believe you’re a rational decision-maker?

It could be because you’re biased.
