Of all the myths about the brain, the 10% myth is the most common: the idea that we only use 10% of the human brain’s capacity. The remainder, potent and humming in the background, remains untapped – and all we have to do is access it and the floodgates will open: hello new mental powers, hello top performance. It’s alluring because it excuses our frailties and lack of achievement – and offers an instant remedy. No wonder it has been co-opted by many companies over the years to help sell their products.
It’s a myth that misleads us. The truth is our brains are defined by their efficiency. Once you’ve heard a psychologist explain the complex, overlapping processes required for a simple act like catching a tennis ball thrown from the other side of the room, our day-to-day existence navigating the world seems little short of a miracle.
Why does understanding the brain matter for marketers?
Firstly, the more we know about how our brains process the world around us, the biases we are subject to and how we make decisions, the better we can explain how people behave – and the more we can influence them.
Secondly, the stuff we create – brands, campaigns, packaging – has to live in the real world. The world is a hostile, unforgiving ecosystem for marketing ideas. If our assumptions about how people notice, make sense of, then act on the world around them are wrong, then we’re wasting time, money and effort.
What follows is a list of first principles about human behaviour taken from my recent reading. It’s a marketer’s summary so if in doubt – I’d advise going to the original sources.
- Why people always have an answer to your questions
- Why few readily admit to being wrong
- Why believing is seeing
- People are cognitive misers
- Under- and over-accounting for emotions in decision-making
- Acting, then working out why afterwards
1. Why people always have an answer to your questions
The theorising instinct (Schulz)
People like to explain things, even when they don’t know the answer. Schulz explains this as a quirk of human cognition that has evolved over millennia. Evolutionary theory tells us our sex drive is adaptive: without the biological imperative to reproduce we’d go extinct pretty quickly. As we evolved, however, we developed a drive to explain the world that was just as crucial to our survival. Prehistoric man had to work out that shaking a tree led to fruit falling, that certain shaped fruits were nourishing and others poisonous, and that a rustle in the bush could well be a hungry lion. In order to stay alive we had to create explanations. This ‘evolutionary urgency of theorising’ has persisted, and now means humans are incapable of not making explanations. We do so consciously and unconsciously – like when you meet someone for the first time and suddenly think “you don’t look like I expected” despite not having consciously thought about it before.
This is best illustrated by an experiment conducted in a department store by Nisbett and Wilson (1977). They offered four varieties of tights and asked customers which they preferred and why. What customers didn’t know was that all four were identical. Despite this, everyone they asked stated a preference and a reason (colour, texture, etc.). When the experimenters revealed that the tights were identical many customers refused to believe them: indeed, they argued they could detect a difference – and stuck to their original preference.
Remember the theorising instinct when:
– Asking consumers questions about why they buy
– Speaking to experts – have the confidence to question their arguments
– You find yourself pontificating without foundation (maybe one’s just for me)
2. Why few readily admit to being wrong (Schulz; Tavris & Aronson)
Market research involves trying to explain other people’s attitudes and behaviours, and to predict what they’ll do next. Observe, ruminate, hypothesise, discuss.
As people every single day we explain ourselves to ourselves, consciously and unconsciously. We think about what we’ve done, how we feel and how we might change.
Cognitive dissonance is the uncomfortable feeling you get that results from holding two contradictory beliefs. An example would be “smoking kills you” and “I smoke 20 a day”. This tension or ‘dissonance’ produces mental discomfort which we try to reduce. To do so we’ll either change our mind/behaviour (e.g. give up smoking) or convince ourselves and others that the belief isn’t so important (e.g. smoking helps me lose weight and de-stress). As an example, keep an ear out the next time someone you know makes a big ticket purchase they can’t really afford (e.g. fast car). Cognitive dissonance predicts that to assuage the incompatible beliefs of “I’m on a middling income” and “it’s costing me £500 a month” the individual will try to convince himself and others not only about the benefits of said vehicle but its damned necessity at every opportunity.
First described in 1956 by Leon Festinger, the theory challenged how we think about why people do things, going beyond the rational decision-maker of economic theory (“homo economicus”) and the view that people are motivated primarily by reward.
Both Schulz and Tavris & Aronson argue persuasively that we are so emotionally invested in our beliefs that we are unwilling or unable to recognise them as anything other than the truth.
For example, there’s an entire raft of cognitive biases hardwired into how we perceive the world, which slant the evidence we pick up. Bandura describes the “positive illusions” which guide our thinking, including:
– Overly positive self evaluation (e.g. I’m a better than average driver)
– Exaggerated perceptions of mastery/control (e.g. I’m sure I can convince her of my point of view)
– Unrealistic optimism for the future (e.g. They’d definitely make others in my Department redundant before me)
These biases help us maintain a positive view of ourselves, and good mental health. Our identity relies on this self-affirmation.
Taking this further, Tavris & Aronson describe a “totalitarian ego” which protects us from the pain and embarrassment of actions we have taken that are inconsistent with our core self-image. Most people share an impulse to justify themselves and avoid taking responsibility for actions that turn out to be harmful, stupid or immoral. When confronted with proof that we are wrong, we commonly don’t change our view or course of action but stick to it even more tenaciously. Self-justification is often unconscious. People lie to themselves, persuading themselves that their course of action was justified.
Schulz and Tavris & Aronson give the example of cult members following a leader who wrongly predicted the end of the world. When the allotted date came and went, the world didn’t end and their entire worldview was disproved, yet they didn’t accept they had got it wrong. To do so would make them feel ridiculous; they’d sold their worldly goods and said goodbye to all non-cult members. Instead they went on to create justifications in order to protect their self-esteem – for example, “it was meant to be this way: God rewarded us by saving humanity at the last minute”.
Another learning from cognitive dissonance theory is that if people voluntarily go through pain, discomfort or embarrassment to get something, they will be happier with that something than if it came to them easily (Tavris & Aronson). Aronson’s experiments with initiation ceremonies showed that severe initiations increase members’ liking for the group they gain access to. The mental arithmetic is thus: I tried so hard to get in – it must have been worth it. We explain ourselves to ourselves.
Recent evidence suggests it also works the other way around – a kind of virtuous circle of benevolence. When someone does a good deed on a whim or by chance, they will come to see the beneficiary of their generosity in a warmer light (Tavris & Aronson). The thought that they did a favour contrasts with the feelings they might previously have had about him: “Why would I have done that for an idiot? He can’t be as much of an idiot as I thought he was. He’s probably a good person.”
SO! Compare this complex world of reality – individuals with “totalitarian egos” rose-tinting reality – with the typical market research interview. Me asking the middle-income fast car buyer simply to state his attitudes about said fast car will only get me so far (“It’s so practical, I can do 37mpg, I commute 2 hours each day so it’s great being in a car I love so much”, etc.). I’ve got to get to the truth by other means: picking up on non-verbal cues, looking at actual behaviour, and using projective techniques that take HIM and HIS SELF ESTEEM out of the situation, so he has a chance of being objective.
Asking any person to criticise something they are psychologically invested in is fairly pointless – though we do it anyway. The trick is spotting the things people are psychologically invested in. Knowing about cognitive dissonance can give us more confidence in our conclusions.
Remember cognitive dissonance when:
– Asking someone to justify anything, especially a purchase decision
– Observing someone justify themselves spontaneously – the vigour of their argument may be founded on a need to convince themselves
3. Believing is seeing – your beliefs blinker you to reality (Tavris & Aronson)
We all have an unconscious tendency to notice evidence that supports our beliefs and ignore that which contradicts it: confirmation bias.
Taking this further, our attitudes are part of our identities. Certainty feels good. Evidence shows that challenging others’ beliefs makes them more likely to dig their heels in – not change their minds.
Recent experiments in neuroscience have shown that biases in thinking are built into the very way we process information (Tavris & Aronson). fMRI scans were taken of partisan subjects while they discussed information that either supported or contradicted their favoured political candidate. When contradictory information was discussed, the areas of the brain associated with reasoning virtually shut down; when supporting information was discussed, the areas associated with emotion lit up. Made-up minds are hard to change. As discussed above, unconscious self-justification is how we protect self-esteem.
The implications of this? Attitude change is difficult, and people assess the evidence in front of them through the lens of their prior beliefs. People can be blind to what’s in front of their very eyes.
Schulz gives a good example of this in her book. In the Western world until the 16th century the stars in the heavens were thought to be fixed and unchanging. After the Copernican revolution (when Copernicus worked out that the planets orbit the sun, and their cycles could therefore be predicted), astronomers started to observe changes they had missed for centuries. New stars that had been there all along ‘appeared’. Contrast this with China, which sat under identical skies but a different ideology: Chinese astronomers began recording such astronomical phenomena 500 years earlier. Western astronomers didn’t merely refuse to believe the counter-evidence – they literally failed to see it. A case of selective perception informed by belief.
Remember confirmation bias and believing is seeing when:
– Making grand claims about how communications will change attitudes
– Deciding when it’s a good use of your time to engage another in debate
Being Wrong, Kathryn Schulz
Mistakes Were Made (But Not by Me), Carol Tavris & Elliot Aronson
The Brain: A Secret History, BBC Four