CPD: The airline food enigma – The way clients think, and how 9 out of 10 advisers respond


‘Sunk cost thinking’: the idea that, because we have already spent or invested money to this point, switching means we somehow lose that value – a bias that explains everything from a reluctance to switch insurers to our tendency to eat airline food even if we aren’t hungry.

The world depicted in many economics textbooks is populated by completely informed consumers who value choice and make perfectly rational decisions. Such a world is, of course, mythical. In the real world, decisions are an outcome of a complex interplay between our thinking styles and the way information is ‘framed’ or presented to us.

The exhausting nature of decision making sees us rely on shortcuts and other techniques which are often undermined by emotions, unconscious biases and poorly chosen reference points. As a result, decisions are often irrational and frequently suboptimal. This is true of both simple, everyday choices (what to eat for lunch, what to watch on TV) and complex, life-changing decisions about career, relationships, and our finances.

As a financial adviser, your relationship with your clients is predicated on trust, and on their understanding and appreciating the value of your advice. From the very beginning of this relationship, your client is required to make many important, complex decisions. Understanding the science behind the way clients think, how they are likely to process information, and how they make choices will allow you to tailor your approach and help them make more informed decisions. More informed decisions can in turn lead to more satisfied clients and longer, more fruitful adviser/client relationships.

“Nothing in life is as important as you think it is, while you are thinking about it”
Daniel Kahneman, “Thinking, Fast and Slow”[1].

One simple puzzle – two ways of thinking

Consider for one moment, a simple puzzle:

A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost? 

Faced with this puzzle, the majority of people leap to the fast, intuitive answer – 10 cents. As a number, 10c ‘feels right’, doesn’t it? Unfortunately, it’s wrong.

(The correct answer, which most people would be able to work out if they thought about it a bit more, can be found at the end of this article).

This simple example eloquently demonstrates how our brain thinks in two different ways – automatic and deliberate, or ‘fast’ and ‘slow’.

It was Nobel Prize winner Daniel Kahneman’s 2011 bestselling book[1], “Thinking, Fast and Slow”, which popularised the idea that the mind’s processes are divided into two distinct systems:

System 1 is the brain’s fast, automatic, intuitive approach. System 1 covers the mental activities that some would call ‘instinctive’: those we are born with, and those that become instinctive through prolonged exposure.

System 2 is the mind’s slower, analytical mode, where reason dominates. System 2 is usually activated when we do something that does not come naturally and requires some sort of conscious mental exertion.

Table 1 summarises the main characteristics of the two styles.

Table 1: The two styles of thinking
System 1 – fast: automatic, intuitive, effortless; innate or learned through prolonged exposure
System 2 – slow: deliberate, analytical, effortful; engaged when a task requires conscious mental exertion

Decisions can be exhausting, so we look for shortcuts

Scientists estimate the human body sends 11 million bits per second to the brain for processing, yet the conscious mind seems to be able to process only 50 bits per second[3].

That’s a ratio of more than 200,000 to one, and it means that if we tried to consciously process all that information, our minds would crash, like a website with too much traffic.

The only way we can cope is to use mental shortcuts – heuristics – to make decisions amongst this deluge of data. These shortcuts look for patterns, and make associations and quick inferences based on the limited information the brain can digest. In effect, our brains “connect the dots”. But sometimes, in connecting the dots, we get it wrong (remember the bat and ball example?).

Common shortcuts and their hidden traps

Availability

The availability heuristic is a mental shortcut that relies on immediate examples that come to mind when evaluating a specific topic or decision.

Under this shortcut, people tend to place more importance on the most recent – most available – information they have received. It is easy to see how this may impact the decisions we make around financial matters, such as insurance. Rather than rationally assessing risks and probabilities, your client is more likely to be influenced by an event they have become aware of through the news, or because a friend or family member has been affected. This is why we see sales of home insurance increase after a flood, and people seek life insurance after the death or illness of someone they know. The actual risk of suffering that event hasn’t changed; it’s just that they are more aware of it[4].

The downside of this effect is that, as the memory of the event fades, the value placed on that insurance diminishes[5] in the client’s mind. After a few claim-free years, people may start to question the value of that cover. In such situations, advisers should seek to increase the ‘availability’ of the risk by drawing on news and examples (e.g. claims stories) that remind people of the risks and their consequences.

We overvalue the present and excessively discount the future

Our ability to project forward and imagine ourselves in the future is very limited, which explains our tendency to excessively[6] (hyperbolically) discount the value of things in the future.
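
The term ‘hyperbolic’ has a precise meaning here. As a minimal sketch, behavioural economists often model the subjective present value V of an amount A received after a delay D as (the discount rate k used in the example below is purely illustrative, not an empirical estimate):

$$ V = \frac{A}{1 + kD} $$

With an illustrative k = 1 per year, a $1,000 benefit one year away feels like only $500 today, and the same benefit ten years away feels like roughly $91 – perceived value falls away very steeply as the delay grows.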

Related to the idea of instant gratification, in a financial context this could look like customers focussing on the first-year insurance premium, rather than comparing costs over the full life of the policy.

Another bias at play in this example is overconfidence about what the future will look like, including our future income and the state of our health. Overconfidence about health has long been a barrier to consumers recognising the need for, and value of, life insurance; data and tools which make it easier for people to understand the real risks can be useful in such circumstances.

We do judge a book by its cover

The halo effect is a cognitive bias whereby positive impressions of people, brands, and products in one area positively influence our feelings in another. Research shows that physical attractiveness plays a significant role in how people perceive others, even when it comes to judging traits that have nothing to do with looks[7]. This means, for instance, that people rate attractive people as having a better personality and as being more knowledgeable than unattractive people.

Whilst our ability to change our physical appearance is obviously limited, this bias reinforces the importance of the way you present yourself and your office. A clean, airy and well-lit office, staffed by well-groomed, positive people, creates a certain impression about the quality of your advice, even though from a rational perspective there is no direct link.

We prefer avoiding losses to making gains

Loss aversion[8] is a cognitive bias that describes how, for individuals, the pain of losing is psychologically about twice as powerful as the pleasure of gaining. Simply put, it’s better not to lose $20 than to find $20.
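
This asymmetry is formalised in Kahneman and Tversky’s prospect theory[8]. A simplified sketch of their value function follows; the parameter values are representative estimates from the later literature, not figures from this article:

$$ v(x) = \begin{cases} x^{\alpha} & \text{for gains } (x \ge 0) \\ -\lambda(-x)^{\alpha} & \text{for losses } (x < 0) \end{cases} $$

Here α < 1 (often estimated around 0.88) captures diminishing sensitivity, and the loss-aversion coefficient λ ≈ 2 means a $20 loss is felt roughly twice as keenly as a $20 gain.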

In our everyday lives, loss aversion is especially common when individuals deal with financial decisions and marketing. An individual is less likely to buy a stock if it’s seen as risky with the potential for a loss of money, even though the reward potential is high.

Additionally, marketing campaigns such as trial periods and rebates take advantage of an individual’s tendency to opt into a presumed free service. As buyers incorporate that service or product into their lives, they become more likely to purchase it, because they want to avoid the loss they will feel once they give it up. This tends to happen because scaling back, whether on software trials, expensive cars, or bigger houses, is an emotionally challenging decision.

We like the status quo (not the band)

Related to our fear of making losses is our preference for the status quo. Status quo bias[9] is evident when people prefer things to stay the same by doing nothing or by sticking with a decision made previously.

One psychological explanation for this bias is ‘sunk cost thinking’: the idea that we have already spent or invested money to that point, and that switching means we somehow lose the value of that investment[10]. This explains everything from a reluctance to switch insurers, to sticking with a stock which is tanking, to our tendency to eat airline food even if we aren’t hungry.

Don’t fence me in

Reactance bias[11] is the tendency to do something different from what someone wants you to do, because you perceive they are trying to stifle your freedom of choice. So strong is this bias that it can lead us to adopt extreme contrary positions, no matter how irrational, and it encourages the use of ‘reverse psychology’ by those trying to influence certain types of behaviour (for example, parents, or unscrupulous salespeople). From an adviser’s perspective, it is vital that clients feel their freedom of choice about any decision isn’t being restricted, and that they remain ‘in control’. This bias explains why the most effective salespeople are often the ones who don’t apply any pressure to buy.

Social proof

We look to others for behavioural guidance when we are unsure, in an unclear, unfamiliar, or ambiguous situation. The greater the number of people who find an idea correct, the more a given individual will perceive the idea to be correct. This is the social proof heuristic, and in a way it outsources some of the decision-making process to others, relying on their work and research. Social proof explains everything from the value of trusted referrals, to our preference for established, familiar brands (if everyone else is buying that brand then it must be good). Our tendency to follow the crowd – not wanting to stand out – is also explained by social proof and our need to conform. Experiments[12] have shown this can influence everything from our uptake of recycling to our paying of bills on time.

For a financial adviser, a clear example of how to leverage the power of social proof is through the use of client testimonials, especially those from clients who are seen as ‘typical’ or ‘representative’, rather than those with unique or unattainable circumstances.

In a life insurance context, the social proof heuristic also explains how the way we ask health-related questions can impact disclosure rates.

People generally don’t like to admit how much they drink or smoke, and tests have shown that framing these questions in a way that normalises or reduces the stigma around a particular habit encourages more accurate disclosure. For example, asking ‘when was the last time you smoked?’ was found[13] to result in a much higher rate of honest disclosure than a binary ‘do you smoke, yes/no?’.

Choice is a difficult balance

One of the most difficult balances to get right is knowing how many choices are optimal in any given circumstance. There are two competing psychological phenomena at play here: on the one hand, our desire to avoid mentally taxing comparisons between different options suggests less is better; on the other, as discussed above (see ‘Don’t fence me in’), we don’t like to feel that our freedom of choice is being taken away from us.

There is a significant body of research showing the many circumstances in which reducing choices actually improves decision-making outcomes. One famous study[14] found that supermarket customers offered 6 choices of jam were 10 times more likely to make a purchase than those offered 24 different flavours!

Particularly relevant for financial advisers is the study[15] published in the Journal of Consumer Psychology in 2015, which analysed a total of 99 ‘choice studies’ and specifically looked at those cases in which reducing choices helped to boost sales.

The researchers found four scenarios where less choice actually improved outcomes:

  1. When people want to make a quick and easy choice;
  2. When the product is complex (so fewer choices help the consumer make a decision);
  3. When it’s difficult to compare alternatives; and
  4. When consumers don’t have clear preferences.

Points 2, 3 and 4 seem particularly pertinent to financial products, where product complexity is often high and consumer knowledge of providers is generally low.

Anchors weigh us down

When making a decision, people look for reference points or ‘anchors’ that they are familiar with – and then make adjustments around that point. But often we rely too heavily on one piece of information to make a decision. Anchors[16] affect our decision making in a variety of different contexts, and the anchors themselves can vary (e.g., we can anchor to prices, similar products, extremes, rules of thumb, and even to completely random numbers, albeit subconsciously).

In a stock market context, for example, if the share market has returned around 8% per annum for a number of years, investors may be heavily anchored to that level of return as their reference point. This may set up unrealistic expectations for future returns. Similarly, in an insurance context a customer may have a particular reference point in mind for premium cost and/or sum insured. Exploring what number your clients might already be anchored to as their ‘reference number’ can greatly assist in managing expectations, and in finding out whether this is genuinely their ‘budget’ or more an arbitrary reference point they have decided on.

A picture paints a thousand words, but the way it’s framed also matters

Financial products and information can often be complex and confusing, and understandably people struggle to make sense of them, losing concentration and switching off. Sometimes the sheer amount of information is itself the barrier to action, leaving people overwhelmed. It is therefore important to present information to clients in ways that require minimal effort to understand – tables, charts, and plain-language explainer videos. These promote ‘cognitive ease’ and let people absorb information using System 1 thinking.

The way that information is ‘framed’ is also important.

The same information, framed differently, can lead to very different decision outcomes. Sausages advertised as 95% fat free sound much more appealing than those advertised as 5% fat. Similarly, communication about life insurance which is framed around the sum insured – rather than the annual premium – is more likely to elicit the perception of cover as an investment rather than a cost.

People can make very different decisions depending on how numerical information is presented to them. For example, there’s a surprisingly big difference between how we react to probabilities expressed as percentages (say, 10%) and how we respond to odds expressed as frequencies[17] (one person out of every 10). Percentages are abstract and hard to imagine, meaning people often make perceptual mistakes when interpreting them. Natural frequencies, on the other hand, are much easier to imagine, particularly for less numerate people.

As psychologist Paul Slovic[18] explains: “If you tell people there’s a 1 in 10 chance of winning or losing, they think ‘Well, who’s the one?!’ They’ll actually visualise a person.” This suggests that framing a piece of information as a natural frequency could mean people overweight it and see the benefits (or losses) as larger than they actually are.

Conclusion

Cognitive biases are inherent in all of us (not just your clients) and have a huge impact on the way we understand, interact with, and receive information and advice.

Being aware that your clients’ decisions and choices can often be based on emotions and gut feel (‘System 1’ thinking), rather than logic, helps you to deliver more effective advice and communications. Similarly, understanding that certain biases can interfere with your clients achieving their goals in the long term will equip you to develop strategies and plans that they will be able to stick to.

P.S. The ball costs $0.05: the bat costs $1.05, and $1.05 + $0.05 = $1.10.
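
For those who want the System 2 working, let b be the price of the ball, so the bat costs b + $1.00:

$$ b + (b + 1.00) = 1.10 \;\Rightarrow\; 2b = 0.10 \;\Rightarrow\; b = 0.05 $$

The intuitive 10-cent answer fails this check: a bat costing $1 more would then be $1.10, making the total $1.20.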

 

Take the quiz to earn 0.5 CPD hour:

 

———-

Notes:
1. ‘Thinking, Fast and Slow’, Daniel Kahneman, Penguin, 2011.
2. ‘Introduction to Applied Behavioral Science’, Cetinok E & Sagara N, Ipsos Loyalty Presentation, New York, April 2017.
3. https://www.britannica.com/science/information-theory/Physiology accessed August 2020.
4. ‘Leveraging Behavioural Science in Insurance: A Systematic Review’, Raghuram A., University of Pennsylvania, Scholarly Commons, August 2019.
5. ‘Behavioral Science and Insurance: Part One, how can behavioral science help solve our industry’s greatest challenges?’ Matt Battersby, RGA Knowledge Centre, rga.com, December 2018.
6. ‘How Customers Really Think About Insurance’, Laughlin P., Insurance Thoughtleadership.com, December 2014.
7. ‘Halo effects and the attractiveness premium in perceptions of political expertise’, Palmer C. L., & Peterson R.D., American Politics Research, 44(2), August 2015.
8. ‘Prospect Theory. An Analysis of Decision Making Under Risk’, Kahneman, D., & Tversky, A., published in Econometrica, Volume 47, March 1979.
9. ‘Status Quo Bias in Decision Making’, Samuelson W., & Zeckhauser R.J., Journal of Risk and Uncertainty, February 1988.
10. ‘The Hidden Traps in Decision Making,’ Hammond J.S., Keeney R.L., & Raiffa H., Harvard Business Review, September-October 1998.
11. ‘Reactance or Rationalization? Predicting Public Responses to Government Policy’, Proudfoot D., & Kay A.C., Policy Insights from the Behavioral and Brain Sciences, Sage Journals, October 2014.
12. ‘East: Four simple ways to apply behavioural insights’, Service O., Hallsworth M., et al, The Behavioural Insights Team, https://www.bi.team/publications/east-four-simple-ways-to-apply-behavioural-insights/
13. ‘Behavioral Economics, Disclosure Gaps, and Customer Journeys in Life and Health Insurance’, Matt Battersby, RGA Knowledge Centre, rga.com, May 2019.
14. ‘When Choice is Demotivating: Can One Desire Too Much of a Good Thing?’, Iyengar S.S., & Lepper M.R., Journal of Personality and Social Psychology, 79(6), 2000.
15. ‘Choice overload: a conceptual review and meta-analysis’, Chernev A., Bockenholt U., & Goodman J, Journal of Consumer Psychology, 2015.
16. ‘Harnessing behavioural finance for better client outcomes’, Daniels M., Zurich Risk Pulse newsletter, December 2019.
17. ‘Violence Risk Assessment and Risk Communication: The Effects of Using Actual Cases, Providing Instruction, and Employing Probability Versus Frequency Formats’, Slovic P. et al, Law and Human Behavior, July 2000.
18. Ibid
