How You Really Make Decisions

Transcript

0:00:04 > 0:00:08We like to think that as a species, we are pretty smart.

0:00:11 > 0:00:15We like to think we are wise, rational creatures.

0:00:15 > 0:00:18I think we all like to think of ourselves as Mr Spock to some degree.

0:00:18 > 0:00:20You know, we make rational, conscious decisions.

0:00:21 > 0:00:22But we may have to think again.

0:00:24 > 0:00:28It's mostly delusion, and we should just wake up to that fact.

0:00:31 > 0:00:32In every decision you make,

0:00:32 > 0:00:36there's a battle in your mind between intuition and logic.

0:00:38 > 0:00:41It's a conflict that plays out in every aspect of your life.

0:00:43 > 0:00:44What you eat.

0:00:45 > 0:00:46What you believe.

0:00:47 > 0:00:49Who you fall in love with.

0:00:50 > 0:00:54And most powerfully, in decisions you make about money.

0:00:54 > 0:00:58The moment money enters the picture, the rules change.

0:00:58 > 0:01:00Scientists now have a new way

0:01:00 > 0:01:03to understand this battle in your mind,

0:01:03 > 0:01:07how it shapes the decisions you take, what you believe...

0:01:09 > 0:01:11And how it has transformed our understanding

0:01:11 > 0:01:13of human nature itself.

0:01:26 > 0:01:27TAXI HORN BLARES

0:01:34 > 0:01:38Sitting in the back of this New York cab is Professor Danny Kahneman.

0:01:42 > 0:01:46He's regarded as one of the most influential psychologists alive today.

0:01:48 > 0:01:50Over the last 40 years,

0:01:50 > 0:01:52he's developed some extraordinary insights

0:01:52 > 0:01:54into the way we make decisions.

0:01:58 > 0:02:02I think it can't hurt to have a realistic view of human nature

0:02:02 > 0:02:03and of how the mind works.

0:02:07 > 0:02:09His insights come largely from puzzles.

0:02:13 > 0:02:17Take, for instance, the curious puzzle of New York cab drivers

0:02:17 > 0:02:20and their highly illogical working habits.

0:02:23 > 0:02:26Business varies according to the weather.

0:02:27 > 0:02:29On rainy days, everyone wants a cab.

0:02:30 > 0:02:34But on sunny days, like today, fares are hard to find.

0:02:34 > 0:02:39Logically, they should spend a lot of time driving on rainy days,

0:02:39 > 0:02:43because it's very easy to find passengers on rainy days,

0:02:43 > 0:02:46and if they are going to take leisure, it should be on sunny days

0:02:46 > 0:02:50but it turns out this is not what many of them do.

0:02:52 > 0:02:57Many do the opposite, working long hours on slow, sunny days

0:02:57 > 0:02:59and knocking off early when it's rainy and busy.

0:03:01 > 0:03:05Instead of thinking logically, the cabbies are driven by an urge

0:03:05 > 0:03:10to earn a set amount of cash each day, come rain or shine.

0:03:11 > 0:03:13Once they hit that target, they go home.

0:03:17 > 0:03:19They view being below the target as a loss

0:03:19 > 0:03:22and being above the target as a gain, and they care

0:03:22 > 0:03:26more about preventing the loss than about achieving the gain.

0:03:28 > 0:03:31So when they reach their goal on a rainy day, they stop...

0:03:35 > 0:03:37..which really doesn't make sense.

0:03:37 > 0:03:39If they were trying to maximise their income,

0:03:39 > 0:03:42they would take their leisure on sunny days

0:03:42 > 0:03:44and they would drive all day on rainy days.
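
To make the contrast concrete, here is a minimal sketch in Python of the comparison Kahneman describes. All the numbers (hourly rates, the daily target, the weather mix) are illustrative assumptions, not figures from the programme:

    import random

    # Assumed hourly earnings: fares are easier to find in the rain.
    RAINY_RATE, SUNNY_RATE = 30.0, 15.0  # pounds per hour (illustrative)
    DAILY_TARGET = 150.0                 # the cabbie's fixed daily goal (illustrative)
    MAX_HOURS = 12

    def hours_for_target(rate):
        # Target-income rule: drive until the daily target is hit, then go home.
        return min(MAX_HOURS, DAILY_TARGET / rate)

    random.seed(1)
    days = [random.choice(["rainy", "sunny"]) for _ in range(20)]
    rates = [RAINY_RATE if d == "rainy" else SUNNY_RATE for d in days]

    # Strategy 1: hit the target every day, come rain or shine.
    target_hours = [hours_for_target(r) for r in rates]
    target_income = sum(h * r for h, r in zip(target_hours, rates))

    # Strategy 2: work the same total hours, but shift them onto rainy days.
    total_hours = sum(target_hours)
    rainy_hours = min(total_hours, days.count("rainy") * MAX_HOURS)
    max_income = rainy_hours * RAINY_RATE + (total_hours - rainy_hours) * SUNNY_RATE

    print(f"target-income rule: £{target_income:.0f}")
    print(f"same hours, rain-first: £{max_income:.0f}")

With these assumed numbers, the rain-first driver earns substantially more for exactly the same hours worked, which is the narrator's point.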

0:03:49 > 0:03:51It was this kind of glitch in thinking

0:03:51 > 0:03:54that Kahneman realised could reveal something profound

0:03:54 > 0:03:56about the inner workings of the mind.

0:03:58 > 0:04:00Anyone want to take part in an experiment?

0:04:00 > 0:04:03And he began to devise a series of puzzles and questions

0:04:03 > 0:04:06which have become classic psychological tests.

0:04:10 > 0:04:13- It's a simple experiment. - It's a very attractive game...

0:04:13 > 0:04:14Don't worry, sir. Nothing strenuous.

0:04:14 > 0:04:17..Posing problems where you can recognise in yourself

0:04:17 > 0:04:21that your intuition is going the wrong way.

0:04:21 > 0:04:25The type of puzzle where the answer that intuitively springs to mind,

0:04:25 > 0:04:27and that seems obvious, is, in fact, wrong.

0:04:30 > 0:04:34Here is one that I think works on just about everybody.

0:04:34 > 0:04:36I want you to imagine a guy called Steve.

0:04:36 > 0:04:42You tell people that Steve, you know, is a meek and tidy soul

0:04:42 > 0:04:48with a passion for detail and very little interest in people.

0:04:48 > 0:04:49He's got a good eye for detail.

0:04:49 > 0:04:52And then you tell people he was drawn at random

0:04:52 > 0:04:54from a census of the American population.

0:04:54 > 0:04:59What's the probability that he is a farmer or a librarian?

0:04:59 > 0:05:01So do you think it's more likely

0:05:01 > 0:05:03that Steve's going to end up working as a librarian or a farmer?

0:05:03 > 0:05:06What's he more likely to be?

0:05:06 > 0:05:07Maybe a librarian.

0:05:07 > 0:05:09Librarian.

0:05:09 > 0:05:10Probably a librarian.

0:05:10 > 0:05:11A librarian.

0:05:11 > 0:05:15Immediately, you know, the thought pops to mind

0:05:15 > 0:05:20that it's a librarian, because he resembled the prototype of librarian.

0:05:20 > 0:05:22Probably a librarian.

0:05:22 > 0:05:25In fact, that's probably the wrong answer,

0:05:25 > 0:05:28because, at least in the United States,

0:05:28 > 0:05:32there are 20 times as many male farmers as male librarians.

0:05:32 > 0:05:33- Librarian.- Librarian.

0:05:33 > 0:05:37So there are probably more meek and tidy souls you know

0:05:37 > 0:05:40who are farmers than meek and tidy souls who are librarians.
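
The Steve puzzle is a base-rate problem, and slow System 2 could settle it with one application of Bayes' rule. In this minimal sketch, the 20:1 ratio of farmers to librarians is the figure quoted above; the two trait probabilities are assumptions, deliberately chosen to favour the librarian as strongly as intuition does:

    # Bayes' rule for the Steve puzzle. The base rates come from the
    # 20:1 figure quoted above; the likelihoods are assumptions.
    p_librarian = 1 / 21           # 20 male farmers per male librarian
    p_farmer = 20 / 21
    p_meek_given_librarian = 0.9   # assumed: description fits most librarians
    p_meek_given_farmer = 0.1      # assumed: it fits only a few farmers

    p_meek = (p_meek_given_librarian * p_librarian
              + p_meek_given_farmer * p_farmer)
    p_librarian_given_meek = p_meek_given_librarian * p_librarian / p_meek

    print(f"P(librarian | meek and tidy) = {p_librarian_given_meek:.2f}")  # ~0.31

Even when the description is nine times more diagnostic of librarians, the base rate wins: Steve is still more likely to be a farmer.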

0:05:45 > 0:05:49This type of puzzle seemed to reveal a discrepancy

0:05:49 > 0:05:51between intuition and logic.

0:05:51 > 0:05:52Another example is...

0:05:52 > 0:05:55Imagine a dictionary. I'm going to pull a word out of it at random.

0:05:55 > 0:05:56Which is more likely,

0:05:56 > 0:06:02that a word that you pick out at random has the letter

0:06:02 > 0:06:07R in the first position or has the letter R in the third position?

0:06:07 > 0:06:09- Erm, start with the letter R.- OK.

0:06:10 > 0:06:12People think the first position,

0:06:12 > 0:06:15because it's easy to think of examples.

0:06:15 > 0:06:16- Start with it.- First.

0:06:17 > 0:06:20In fact, there are nearly three times as many words with

0:06:20 > 0:06:24R as the third letter, than words that begin with R,

0:06:24 > 0:06:27but that's not what our intuition tells us.

0:06:27 > 0:06:31So we have examples like that. Like, many of them.
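
The intuition here substitutes an easy question ("how readily do examples come to mind?") for the hard one ("which is more frequent?"). The frequency claim itself is easy to check against any machine-readable word list; a minimal sketch, where the dictionary path is an assumption (substitute whatever word list is to hand):

    # Count words with R first versus R third. The path is an assumption;
    # any plain-text word list with one word per line will do.
    with open("/usr/share/dict/words") as f:
        words = [w.strip().lower() for w in f if len(w.strip()) >= 3]

    r_first = sum(w.startswith("r") for w in words)
    r_third = sum(w[2] == "r" for w in words)
    print(f"R first: {r_first}, R third: {r_third}, "
          f"ratio: {r_third / r_first:.1f}")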

0:06:34 > 0:06:38Kahneman's interest in human error was first sparked in the 1970s

0:06:38 > 0:06:41when he and his colleague, Amos Tversky,

0:06:41 > 0:06:43began looking at their own mistakes.

0:06:45 > 0:06:47It was all in ourselves.

0:06:47 > 0:06:50That is, all the mistakes that we studied were mistakes

0:06:50 > 0:06:52that we were prone to make.

0:06:52 > 0:06:55In my hand here, I've got £100.

0:06:55 > 0:06:59Kahneman and Tversky found a treasure trove of these puzzles.

0:06:59 > 0:07:00Which would you prefer?

0:07:00 > 0:07:03They unveiled a catalogue of human error.

0:07:04 > 0:07:07Would you rather go to Rome with a free breakfast?

0:07:07 > 0:07:10And opened a Pandora's Box of mistakes.

0:07:10 > 0:07:12A year and a day.

0:07:12 > 0:07:1425?

0:07:14 > 0:07:17But the really interesting thing about these mistakes is,

0:07:17 > 0:07:18they're not accidents.

0:07:18 > 0:07:2075?

0:07:20 > 0:07:22They have a shape, a structure.

0:07:22 > 0:07:23I think Rome.

0:07:23 > 0:07:25Skewing our judgment.

0:07:25 > 0:07:2620.

0:07:26 > 0:07:29What makes them interesting is that they are not random errors.

0:07:29 > 0:07:32They are biases, so the difference between a bias

0:07:32 > 0:07:36and a random error is that a bias is predictable.

0:07:36 > 0:07:38It's a systematic error that is predictable.

0:07:40 > 0:07:44Kahneman's puzzles prompt the wrong reply again.

0:07:44 > 0:07:45More likely.

0:07:45 > 0:07:46And again.

0:07:46 > 0:07:48- More likely.- More likely?

0:07:48 > 0:07:49And again.

0:07:49 > 0:07:50Probably more likely?

0:07:51 > 0:07:55It's a pattern of human error that affects every single one of us.

0:07:57 > 0:07:59On their own, they may seem small.

0:08:01 > 0:08:04Ah, that seems to be the right drawer.

0:08:04 > 0:08:07But by rummaging around in our everyday mistakes...

0:08:07 > 0:08:09That's very odd.

0:08:09 > 0:08:13..Kahneman started a revolution in our understanding of human thinking.

0:08:15 > 0:08:17A revolution so profound

0:08:17 > 0:08:20and far-reaching that he was awarded a Nobel prize.

0:08:22 > 0:08:27So if you want to see the medal, that's what it looks like.

0:08:30 > 0:08:31That's it.

0:08:38 > 0:08:42Psychologists have long strived to pick apart the moments

0:08:42 > 0:08:43when people make decisions.

0:08:45 > 0:08:48Much of the focus has been on our rational mind,

0:08:48 > 0:08:51our capacity for logic.

0:08:51 > 0:08:55But Kahneman saw the mind differently.

0:08:55 > 0:08:58He saw a much more powerful role for the other side of our minds,

0:08:58 > 0:09:00intuition.

0:09:02 > 0:09:04And at the heart of human thinking,

0:09:04 > 0:09:08there's a conflict between logic and intuition that leads to mistakes.

0:09:09 > 0:09:11Kahneman and Tversky started this trend

0:09:11 > 0:09:13of seeing the mind differently.

0:09:15 > 0:09:17They found these decision-making illusions,

0:09:17 > 0:09:20these spots where our intuitions just make us decide

0:09:20 > 0:09:23these things that just don't make any sense.

0:09:23 > 0:09:26The work of Kahneman and Tversky has really been revolutionary.

0:09:29 > 0:09:31It kicked off a flurry of experimentation

0:09:31 > 0:09:35and observation to understand the meaning of these mistakes.

0:09:37 > 0:09:40People didn't really appreciate, as recently as 40 years ago,

0:09:40 > 0:09:45that the mind didn't really work like a computer.

0:09:45 > 0:09:48We thought that we were very deliberative, conscious creatures

0:09:48 > 0:09:51who weighed up the costs and benefits of action,

0:09:51 > 0:09:52just like Mr Spock would do.

0:09:56 > 0:10:00By now, it's a fairly coherent body of work

0:10:00 > 0:10:01about ways in which

0:10:01 > 0:10:07intuition departs from the rules, if you will.

0:10:11 > 0:10:13And the body of evidence is growing.

0:10:16 > 0:10:19Some of the best clues to the working of our minds come not

0:10:19 > 0:10:22when we get things right, but when we get things wrong.

0:10:35 > 0:10:38In a corner of this otherwise peaceful campus,

0:10:38 > 0:10:42Professor Chris Chabris is about to start a fight.

0:10:42 > 0:10:49All right, so what I want you guys to do is stay in this area over here.

0:10:49 > 0:10:51The two big guys grab you

0:10:51 > 0:10:54and sort of like start pretending to punch you, make some sound effects.

0:10:54 > 0:10:56All right, this looks good.

0:10:58 > 0:10:59All right, that seemed pretty good to me.

0:11:01 > 0:11:03It's part of an experiment

0:11:03 > 0:11:06that shows a pretty shocking mistake that any one of us could make.

0:11:08 > 0:11:10A mistake where you don't notice

0:11:10 > 0:11:12what's happening right in front of your eyes.

0:11:19 > 0:11:22As well as a fight, the experiment also involves a chase.

0:11:26 > 0:11:29It was inspired by an incident in Boston in 1995,

0:11:29 > 0:11:32when a young police officer, Kenny Conley,

0:11:32 > 0:11:34was in hot pursuit of a murder suspect.

0:11:37 > 0:11:39It turned out that this police officer,

0:11:39 > 0:11:41while he was chasing the suspect,

0:11:41 > 0:11:43had run right past some other police officers

0:11:43 > 0:11:46who were beating up another suspect, which, of course,

0:11:46 > 0:11:49police officers are not supposed to do under any circumstances.

0:11:50 > 0:11:54When the police tried to investigate this case of police brutality,

0:11:54 > 0:11:57he said, "I didn't see anything going on there,

0:11:57 > 0:11:59"all I saw was the suspect I was chasing."

0:11:59 > 0:12:00And nobody could believe this

0:12:00 > 0:12:04and he was prosecuted for perjury and obstruction of justice.

0:12:05 > 0:12:08Everyone was convinced that Conley was lying.

0:12:08 > 0:12:11We don't want you to be, like, closer than about...

0:12:11 > 0:12:13Everyone, that is, apart from Chris Chabris.

0:12:17 > 0:12:19He wondered if our ability to pay attention

0:12:19 > 0:12:23is so limited that any one of us could run past a vicious fight

0:12:23 > 0:12:24without even noticing.

0:12:27 > 0:12:29And it's something he's putting to the test.

0:12:29 > 0:12:32Now, when you see someone jogging across the footbridge,

0:12:32 > 0:12:34then you should get started.

0:12:34 > 0:12:35Jackie, you can go.

0:12:40 > 0:12:41In the experiment,

0:12:41 > 0:12:44the subjects are asked to focus carefully on a cognitive task.

0:12:45 > 0:12:47They must count the number of times

0:12:47 > 0:12:49the runner taps her head with each hand.

0:12:56 > 0:12:58Would they, like the Boston police officer,

0:12:58 > 0:13:00be so blinded by their limited attention

0:13:00 > 0:13:03that they would completely fail to notice the fight?

0:13:07 > 0:13:10About 45 seconds or a minute into the run, there was the fight.

0:13:10 > 0:13:14And they could actually see the fight from a ways away,

0:13:14 > 0:13:17and it was about 20 feet away from them when they got closest to them.

0:13:20 > 0:13:22The fight is right in their field of view,

0:13:22 > 0:13:26and at least partially visible from as far back as the footbridge.

0:13:34 > 0:13:37It seems incredible that anyone would fail to notice something

0:13:37 > 0:13:38so apparently obvious.

0:13:42 > 0:13:44They completed the three-minute course

0:13:44 > 0:13:47and then we said, "Did you notice anything unusual?"

0:13:47 > 0:13:48Yes.

0:13:48 > 0:13:50- What was it? - It was a fight.

0:13:50 > 0:13:53Sometimes they would have noticed the fight and they would say,

0:13:53 > 0:13:57"Yeah, I saw some guys fighting", but a large percentage of people said

0:13:57 > 0:13:59"We didn't see anything unusual at all."

0:13:59 > 0:14:00And when we asked them specifically

0:14:00 > 0:14:03about whether they saw anybody fighting, they still said no.

0:14:10 > 0:14:12In fact, nearly 50% of people in the experiment

0:14:12 > 0:14:15completely failed to notice the fight.

0:14:17 > 0:14:20Did you see anything unusual during the run?

0:14:20 > 0:14:21No.

0:14:21 > 0:14:23OK. Did you see some people fighting?

0:14:23 > 0:14:25No.

0:14:28 > 0:14:31We did it at night time and we did it in the daylight.

0:14:32 > 0:14:34Even when we did it in daylight,

0:14:34 > 0:14:38many people ran right past the fight and didn't notice it at all.

0:14:41 > 0:14:43Did you see anything unusual during the run?

0:14:43 > 0:14:45No, not really.

0:14:45 > 0:14:47OK, did you see some people fighting?

0:14:47 > 0:14:48No.

0:14:48 > 0:14:50- You really didn't see anyone fighting?- No.

0:14:50 > 0:14:52Does it surprise you that you would have missed that?

0:14:52 > 0:14:55They were about 20 feet off the path.

0:14:55 > 0:14:56- Oh! - You ran right past them.

0:14:56 > 0:14:58- Completely missed that, then.- OK.

0:15:00 > 0:15:02Maybe what happened to Conley was,

0:15:02 > 0:15:04when you're really paying attention to one thing

0:15:04 > 0:15:07and focusing a lot of mental energy on it, you can miss things

0:15:07 > 0:15:10that other people are going to think are completely obvious, and in fact,

0:15:10 > 0:15:12that's what the jurors said after Conley's trial.

0:15:12 > 0:15:15They said "We couldn't believe that he could miss something like that".

0:15:15 > 0:15:17It didn't make any sense. He had to have been lying.

0:15:20 > 0:15:24It's an unsettling phenomenon called inattentional blindness

0:15:24 > 0:15:26that can affect us all.

0:15:27 > 0:15:29Some people have said things like

0:15:29 > 0:15:30"This shatters my faith in my own mind",

0:15:30 > 0:15:32or, "Now I don't know what to believe",

0:15:32 > 0:15:35or, "I'm going to be confused from now on."

0:15:35 > 0:15:38But I'm not sure that that feeling really stays with them very long.

0:15:38 > 0:15:40They are going to go out from the experiment, you know,

0:15:40 > 0:15:43walk to the next place they're going or something like that

0:15:43 > 0:15:45and they're going to have just as much inattentional blindness

0:15:45 > 0:15:48when they're walking down the street that afternoon as they did before.

0:15:53 > 0:15:56This experiment reveals a powerful quandary about our minds.

0:15:58 > 0:16:00We glide through the world blissfully unaware

0:16:00 > 0:16:04of most of what we do and how little we really know our minds.

0:16:07 > 0:16:08For all its brilliance,

0:16:08 > 0:16:12the part of our mind we call ourselves is extremely limited.

0:16:15 > 0:16:17So how do we manage to navigate our way through

0:16:17 > 0:16:19the complexity of daily life?

0:16:31 > 0:16:32Every day, each one of us

0:16:32 > 0:16:35makes somewhere between 2,000 and 10,000 decisions.

0:16:40 > 0:16:42When you think about our daily lives,

0:16:42 > 0:16:45it's really a long, long sequence of decisions.

0:16:47 > 0:16:50We make decisions probably at a frequency

0:16:50 > 0:16:52that is close to the frequency we breathe.

0:16:52 > 0:16:56Every minute, every second, you're deciding where to move your legs,

0:16:56 > 0:16:58and where to move your eyes, and where to move your limbs,

0:16:58 > 0:17:00and when you're eating a meal,

0:17:00 > 0:17:01you're making all kinds of decisions.

0:17:03 > 0:17:05And yet the vast majority of these decisions,

0:17:05 > 0:17:07we make without even realising.

0:17:13 > 0:17:17It was Danny Kahneman's insight that we have two systems

0:17:17 > 0:17:19in the mind for making decisions.

0:17:21 > 0:17:23Two ways of thinking:

0:17:23 > 0:17:25fast and slow.

0:17:30 > 0:17:33You know, our mind has really two ways of operating,

0:17:33 > 0:17:39and one is sort of fast-thinking, an automatic effortless mode,

0:17:39 > 0:17:41and that's the one we're in most of the time.

0:17:44 > 0:17:48This fast, automatic mode of thinking, he called System 1.

0:17:51 > 0:17:54It's powerful, effortless and responsible for most of what we do.

0:17:56 > 0:18:00And System 1 is, you know, that's what happens most of the time.

0:18:00 > 0:18:05You're there, the world around you provides all kinds of stimuli

0:18:05 > 0:18:07and you respond to them.

0:18:07 > 0:18:11Everything that you see and that you understand, you know,

0:18:11 > 0:18:14this is a tree, that's a helicopter back there,

0:18:14 > 0:18:16that's the Statue of Liberty.

0:18:16 > 0:18:20All of this visual perception, all of this comes through System 1.

0:18:21 > 0:18:26The other mode is slow, deliberate, logical and rational.

0:18:27 > 0:18:32This is System 2 and it's the bit you think of as you,

0:18:32 > 0:18:34the voice in your head.

0:18:34 > 0:18:36The simplest example of the two systems is

0:18:36 > 0:18:42really two plus two is on one side, and 17 times 24 is on the other.

0:18:42 > 0:18:43What is two plus two?

0:18:43 > 0:18:45- Four.- Four.- Four.

0:18:45 > 0:18:49Fast System 1 is always in gear, producing instant answers.

0:18:49 > 0:18:51- And what's two plus two?- Four.

0:18:51 > 0:18:53A number comes to your mind.

0:18:53 > 0:18:55- Four.- Four.- Four.

0:18:55 > 0:18:59It is automatic. You do not intend for it to happen.

0:18:59 > 0:19:01It just happens to you. It's almost like a reflex.

0:19:01 > 0:19:04And what's 22 times 17?

0:19:04 > 0:19:05That's a good one.

0:19:10 > 0:19:13But when we have to pay attention to a tricky problem,

0:19:13 > 0:19:16we engage slow-but-logical System 2.

0:19:17 > 0:19:19If you can do that in your head,

0:19:19 > 0:19:22you'll have to follow some rules and to do it sequentially.

0:19:24 > 0:19:26And that is not automatic at all.

0:19:26 > 0:19:30That involves work, it involves effort, it involves concentration.

0:19:30 > 0:19:3222 times 17?

0:19:36 > 0:19:38There will be physiological symptoms.

0:19:38 > 0:19:41Your heart rate will accelerate, your pupils will dilate,

0:19:41 > 0:19:45so many changes will occur while you're performing this computation.

0:19:46 > 0:19:49Three's...oh, God!

0:19:52 > 0:19:53That's, so...

0:19:53 > 0:19:56220 and seven times 22 is...

0:20:00 > 0:20:02..154. 374?
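
(374 is right, and shows the sequential rule-following System 2 is stuck with: 22 × 17 = 22 × 10 + 22 × 7 = 220 + 154 = 374.)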

0:20:02 > 0:20:06- OK. And can I get you to just walk with me for a second?- OK.

0:20:06 > 0:20:07Who's the current...?

0:20:07 > 0:20:13System 2 may be clever, but it's also slow, limited and lazy.

0:20:13 > 0:20:16I live in Berkeley during summers and I walk a lot.

0:20:16 > 0:20:19And when I walk very fast, I cannot think.

0:20:19 > 0:20:24- Can I get you to count backwards from 100 by 7?- Sure.

0:20:24 > 0:20:2893, 86... It's hard when you're walking.

0:20:28 > 0:20:31It takes up, interestingly enough,

0:20:31 > 0:20:36the same kind of executive function as...as thinking.

0:20:36 > 0:20:39Forty...four?

0:20:41 > 0:20:46If you are expected to do something that demands a lot of effort,

0:20:46 > 0:20:48you will stop even walking.

0:20:48 > 0:20:52Eighty...um...six?

0:20:52 > 0:20:5351.

0:20:55 > 0:20:56Uh....

0:20:57 > 0:20:5916,

0:20:59 > 0:21:009, 2?

0:21:03 > 0:21:06Everything that you're aware of in your own mind

0:21:06 > 0:21:09is part of this slow, deliberative System 2.

0:21:09 > 0:21:13As far as you're concerned, it is the star of the show.

0:21:14 > 0:21:19Actually, I describe System 2 as not the star.

0:21:19 > 0:21:23I describe it as, as a minor character who thinks he is the star,

0:21:23 > 0:21:25because, in fact,

0:21:25 > 0:21:28most of what goes on in our mind is automatic.

0:21:28 > 0:21:31You know, it's in the domain that I call System 1.

0:21:31 > 0:21:34System 1 is an old, evolved bit of our brain, and it's remarkable.

0:21:34 > 0:21:38We couldn't survive without it because System 2 would explode.

0:21:38 > 0:21:41If Mr Spock had to make every decision for us,

0:21:41 > 0:21:45it would be very slow and effortful and our heads would explode.

0:21:45 > 0:21:48And this vast, hidden domain is responsible

0:21:48 > 0:21:51for far more than you would possibly believe.

0:21:51 > 0:21:55Having an opinion, you have an opinion immediately,

0:21:55 > 0:21:58whether you like it or not, whether you like something or not,

0:21:58 > 0:22:02whether you're for something or not, liking someone or not liking them.

0:22:02 > 0:22:04That, quite often, is something you have no control over.

0:22:04 > 0:22:08Later, when you're asked for reasons, you will invent reasons.

0:22:08 > 0:22:11And a lot of what System 2 does is, it provides reason.

0:22:11 > 0:22:17It provides rationalisations which are not necessarily the true reasons

0:22:17 > 0:22:22for our beliefs and our emotions and our intentions and what we do.

0:22:28 > 0:22:31You have two systems of thinking that steer you through life...

0:22:34 > 0:22:38Fast, intuitive System 1 that is incredibly powerful

0:22:38 > 0:22:40and does most of the driving.

0:22:42 > 0:22:45And slow, logical System 2

0:22:45 > 0:22:48that is clever, but a little lazy.

0:22:48 > 0:22:52Trouble is, there's a bit of a battle between them

0:22:52 > 0:22:54as to which one is driving your decisions.

0:23:02 > 0:23:06And this is where the mistakes creep in,

0:23:06 > 0:23:09when we use the wrong system to make a decision.

0:23:09 > 0:23:10Just going to ask you a few questions.

0:23:10 > 0:23:12We're interested in what you think.

0:23:12 > 0:23:15This question concerns this nice bottle of champagne I have here.

0:23:15 > 0:23:19Millesime 2005, it's a good year, genuinely nice, vintage bottle.

0:23:20 > 0:23:25These people think they're about to use slow, sensible System 2

0:23:25 > 0:23:28to make a rational decision about how much they would pay

0:23:28 > 0:23:30for a bottle of champagne.

0:23:30 > 0:23:33But what they don't know is that their decision

0:23:33 > 0:23:35will actually be taken totally

0:23:35 > 0:23:40without their knowledge by their hidden, fast auto-pilot, System 1.

0:23:42 > 0:23:45And with the help of a bag of ping-pong balls,

0:23:45 > 0:23:46we can influence that decision.

0:23:48 > 0:23:52I've got a set of numbered balls here from 1 to 100 in this bag.

0:23:52 > 0:23:55I'd like you to reach in and draw one out at random for me,

0:23:55 > 0:23:56if you would.

0:23:56 > 0:23:58First, they've got to choose a ball.

0:23:58 > 0:24:00- The number says ten.- Ten.- Ten.- Ten.

0:24:00 > 0:24:04They think it's a random number, but in fact, it's rigged.

0:24:04 > 0:24:07All the balls are marked with the low number ten.

0:24:07 > 0:24:13This experiment is all about the thoughtless creation of habits.

0:24:13 > 0:24:17It's about how we make one decision, and then other decisions follow it

0:24:17 > 0:24:20as if the first decision was actually meaningful.

0:24:22 > 0:24:24What we do is purposefully,

0:24:24 > 0:24:28we give people a first decision that is clearly meaningless.

0:24:28 > 0:24:30- Ten.- Ten, OK. Would you be willing to pay ten pounds

0:24:30 > 0:24:33for this nice bottle of vintage champagne?

0:24:33 > 0:24:34I would, yes.

0:24:34 > 0:24:37- No.- Yeah, I guess.- OK.

0:24:37 > 0:24:40This first decision is meaningless,

0:24:40 > 0:24:43based as it is on a seemingly random number.

0:24:44 > 0:24:49But what it does do is lodge the low number ten in their heads.

0:24:49 > 0:24:53- Would you buy it for ten pounds? - Yes, I would. Yes.- You would? OK.

0:24:54 > 0:24:57Now for the real question where we ask them

0:24:57 > 0:25:00how much they'd actually pay for the champagne.

0:25:00 > 0:25:03What's the maximum amount you think you'd be willing to pay?

0:25:03 > 0:25:05- 20?- OK.

0:25:05 > 0:25:07- Seven pounds.- Seven pounds, OK.

0:25:07 > 0:25:08Probably ten pound.

0:25:08 > 0:25:10A range of fairly low offers.

0:25:12 > 0:25:15But what happens if we prime people with a much higher number,

0:25:15 > 0:25:1765 instead of 10?

0:25:20 > 0:25:22- What does that one say?- 65.- 65, OK.

0:25:22 > 0:25:23- 65.- OK.

0:25:23 > 0:25:25It says 65.

0:25:27 > 0:25:31How will this affect the price people are prepared to pay?

0:25:31 > 0:25:33What's the maximum you would be willing to pay for this

0:25:33 > 0:25:35bottle of champagne?

0:25:35 > 0:25:37- 40?- £45.- 45, OK.

0:25:37 > 0:25:3950.

0:25:39 > 0:25:41- 40 quid?- OK.

0:25:41 > 0:25:45- £50?- £50? - Yeah, I'd pay between 50 and £80.

0:25:45 > 0:25:48- Between 50 and 80?- Yeah.

0:25:48 > 0:25:50Logic has gone out of the window.

0:25:52 > 0:25:55The price people are prepared to pay is influenced by nothing more

0:25:55 > 0:25:58than a number written on a ping-pong ball.

0:26:01 > 0:26:04It suggests that when we can't make decisions,

0:26:04 > 0:26:06we don't evaluate the decision in itself.

0:26:06 > 0:26:08Instead what we do is,

0:26:08 > 0:26:12we try to look at other similar decisions we've made in the past

0:26:12 > 0:26:16and we take those decisions as if they were good decisions

0:26:16 > 0:26:17and we say to ourselves,

0:26:17 > 0:26:19"Oh, I've made this decision before.

0:26:19 > 0:26:23"Clearly, I don't need to go ahead and solve this decision.

0:26:23 > 0:26:25"Let me just use what I did before and repeat it,

0:26:25 > 0:26:27"maybe with some modifications."

0:26:29 > 0:26:32This anchoring effect comes from the conflict

0:26:32 > 0:26:35between our two systems of thinking.

0:26:37 > 0:26:41Fast System 1 is a master of taking short cuts

0:26:41 > 0:26:45to bring about the quickest possible decision.

0:26:45 > 0:26:48What happens is, they ask you a question

0:26:48 > 0:26:50and if the question is difficult

0:26:50 > 0:26:54but there is a related question that is a lot...that is somewhat simpler,

0:26:54 > 0:26:58you're just going to answer the other question and...

0:26:58 > 0:27:00and not even notice.

0:27:00 > 0:27:02So the system does all kinds of short cuts to feed us

0:27:02 > 0:27:06the information in a faster way and we can make actions,

0:27:06 > 0:27:08and the system is accepting some mistakes.

0:27:11 > 0:27:13We make decisions using fast System 1

0:27:13 > 0:27:17when we really should be using slow System 2.

0:27:18 > 0:27:22And this is why we make the mistakes we do,

0:27:22 > 0:27:26systematic mistakes known as cognitive biases.

0:27:28 > 0:27:30Nice day.

0:27:36 > 0:27:41Since Kahneman first began investigating the glitches in our thinking,

0:27:41 > 0:27:45more than 150 cognitive biases have been identified.

0:27:46 > 0:27:49We are riddled with these systematic mistakes

0:27:49 > 0:27:52and they affect every aspect of our daily lives.

0:27:54 > 0:27:57Wikipedia has a very big list of biases

0:27:57 > 0:28:00and we are finding new ones all the time.

0:28:00 > 0:28:03One of the biases that I think is the most important

0:28:03 > 0:28:05is what's called the present bias focus.

0:28:05 > 0:28:07It's the fact that we focus on now

0:28:07 > 0:28:09and don't think very much about the future.

0:28:09 > 0:28:13That's the bias that causes things like overeating and smoking,

0:28:13 > 0:28:16and texting and driving, and having unprotected sex.

0:28:16 > 0:28:19Another one is called the halo effect,

0:28:19 > 0:28:21and this is the idea

0:28:21 > 0:28:25that if you like somebody or an organisation,

0:28:25 > 0:28:29you're biased to think that all of its aspects are good,

0:28:29 > 0:28:30that everything is good about it.

0:28:30 > 0:28:33If you dislike it, everything is bad.

0:28:33 > 0:28:35People really are quite uncomfortable, you know,

0:28:35 > 0:28:39by the idea that Hitler loved children, you know.

0:28:39 > 0:28:40He did.

0:28:40 > 0:28:43Now, that doesn't make him a good person,

0:28:43 > 0:28:47but we feel uncomfortable to see an attractive trait

0:28:47 > 0:28:51in a person that we consider, you know, the epitome of evil.

0:28:51 > 0:28:54We are prone to think that what we like is all good

0:28:54 > 0:28:56and what we dislike is all bad.

0:28:56 > 0:28:57That's a bias.

0:28:57 > 0:29:01Another particular favourite of mine is the bias to get attached

0:29:01 > 0:29:04to things that we ourselves have created.

0:29:04 > 0:29:06We call it the IKEA effect.

0:29:06 > 0:29:10Well, you've got loss aversion, risk aversion, present bias.

0:29:10 > 0:29:13Spotlight effect, and the spotlight effect is the idea

0:29:13 > 0:29:16that we think that other people pay a lot of attention to us

0:29:16 > 0:29:17when in fact, they don't.

0:29:17 > 0:29:20Confirmation bias. Overconfidence is a big one.

0:29:20 > 0:29:23But what's clear is that there's lots of them.

0:29:23 > 0:29:25There's lots of ways for us to get things wrong.

0:29:25 > 0:29:27You know, there's one way to do things right

0:29:27 > 0:29:31and many ways to do things wrong, and we're capable of many of them.

0:29:37 > 0:29:41These biases explain so many things that we get wrong.

0:29:42 > 0:29:45Our impulsive spending.

0:29:45 > 0:29:48Trusting the wrong people.

0:29:48 > 0:29:51Not seeing the other person's point of view.

0:29:51 > 0:29:53Succumbing to temptation.

0:29:56 > 0:29:59We are so riddled with these biases,

0:29:59 > 0:30:02it's hard to believe we ever make a rational decision.

0:30:10 > 0:30:13But it's not just our everyday decisions that are affected.

0:30:28 > 0:30:30What happens if you're an expert,

0:30:30 > 0:30:34trained in making decisions that are a matter of life and death?

0:30:38 > 0:30:41Are you still destined to make these systematic mistakes?

0:30:44 > 0:30:47On the outskirts of Washington DC,

0:30:47 > 0:30:52Horizon has been granted access to spy on the spooks.

0:30:56 > 0:31:00Welcome to Analytical Exercise Number Four.

0:31:01 > 0:31:04Former intelligence analyst Donald Kretz

0:31:04 > 0:31:07is running an ultra-realistic spy game.

0:31:08 > 0:31:13This exercise will take place in the fictitious city of Vastopolis.

0:31:15 > 0:31:19Taking part are a mixture of trained intelligence analysts

0:31:19 > 0:31:20and some novices.

0:31:23 > 0:31:25Due to an emerging threat...

0:31:26 > 0:31:30..a terrorism taskforce has been stood up.

0:31:30 > 0:31:31I will be the terrorism taskforce lead

0:31:31 > 0:31:36and I have recruited all of you to be our terrorism analysts.

0:31:39 > 0:31:41The challenge facing the analysts

0:31:41 > 0:31:45is to thwart a terrorist threat against a US city.

0:31:47 > 0:31:50The threat at this point has not been determined.

0:31:50 > 0:31:53It's up to you to figure out the type of terrorism...

0:31:54 > 0:31:56..and who's responsible for planning it.

0:31:59 > 0:32:02The analysts face a number of tasks.

0:32:02 > 0:32:06They must first investigate any groups who may pose a threat.

0:32:06 > 0:32:08Your task is to write a report.

0:32:08 > 0:32:11The subject in this case is the Network of Dread.

0:32:11 > 0:32:14The mayor has asked for this 15 minutes from now.

0:32:17 > 0:32:21Just like in the real world, the analysts have access

0:32:21 > 0:32:25to a huge amount of data streaming in, from government agencies,

0:32:25 > 0:32:29social media, mobile phones and emergency services.

0:32:33 > 0:32:35The Network of Dread turns out to be

0:32:35 > 0:32:38a well-known international terror group.

0:32:38 > 0:32:41They have the track record, the capability

0:32:41 > 0:32:45and the personnel to carry out an attack.

0:32:45 > 0:32:46The scenario that's emerging

0:32:46 > 0:32:48is a bio-terror event,

0:32:48 > 0:32:51meaning it's a biological terrorism attack

0:32:51 > 0:32:53that's going to take place against the city.

0:32:56 > 0:33:00If there is an emerging threat, they are the likely candidate.

0:33:00 > 0:33:02We need to move onto the next task.

0:33:04 > 0:33:06It's now 9th April.

0:33:06 > 0:33:09This is another request for information,

0:33:09 > 0:33:13this time on something or someone called the Masters of Chaos.

0:33:16 > 0:33:21The Masters of Chaos are a group of cyber-hackers,

0:33:21 > 0:33:24a local bunch of misfits with no history of violence.

0:33:27 > 0:33:31And while the analysts continue to sift through the incoming data,

0:33:31 > 0:33:36behind the scenes, Kretz is watching their every move.

0:33:36 > 0:33:39In this room, we're able to monitor what the analysts are doing

0:33:39 > 0:33:41throughout the entire exercise.

0:33:42 > 0:33:44We have set up a knowledge base

0:33:44 > 0:33:46into which we have been inserting data

0:33:46 > 0:33:48throughout the course of the day.

0:33:48 > 0:33:50Some of them are related to our terrorist threat,

0:33:50 > 0:33:52many of them are not.

0:33:53 > 0:33:56Amidst the wealth of data on the known terror group,

0:33:56 > 0:33:58there's also evidence coming in

0:33:58 > 0:34:00of a theft at a university biology lab

0:34:00 > 0:34:05and someone has hacked into the computers of a local freight firm.

0:34:06 > 0:34:10Each of these messages represents, essentially, a piece of the puzzle,

0:34:10 > 0:34:14but it's a puzzle that you don't have the box top to,

0:34:14 > 0:34:17so you don't have the picture in advance,

0:34:17 > 0:34:20so you don't know what pieces go where.

0:34:20 > 0:34:23Furthermore, what we have is a bunch of puzzle pieces that don't

0:34:23 > 0:34:25even go with this puzzle.

0:34:27 > 0:34:31The exercise is part of a series of experiments to investigate

0:34:31 > 0:34:33whether expert intelligence agents

0:34:33 > 0:34:37are just as prone to mistakes from cognitive bias as the rest of us,

0:34:37 > 0:34:43or whether their training and expertise makes them immune.

0:34:44 > 0:34:47I have a sort of insider's point of view of this problem.

0:34:47 > 0:34:51I worked a number of years as an intelligence analyst.

0:34:51 > 0:34:53The stakes are incredibly high.

0:34:53 > 0:34:57Mistakes can often be life and death.

0:35:00 > 0:35:02We roll ahead now. The date is 21st May.

0:35:04 > 0:35:07If the analysts are able to think rationally,

0:35:07 > 0:35:09they should be able to solve the puzzle.

0:35:10 > 0:35:15But the danger is, they will fall into the trap set by Kretz

0:35:15 > 0:35:18and only pay attention to the established terror group,

0:35:18 > 0:35:20the Network of Dread.

0:35:21 > 0:35:27Their judgment may be clouded by a bias called confirmation bias.

0:35:27 > 0:35:30Confirmation bias is the most prevalent bias of all,

0:35:30 > 0:35:33and it's where we tend to search for information

0:35:33 > 0:35:35that supports what we already believe.

0:35:37 > 0:35:39Confirmation bias can easily lead people

0:35:39 > 0:35:42to ignore the evidence in front of their eyes.

0:35:44 > 0:35:47And Kretz is able to monitor if the bias kicks in.

0:35:48 > 0:35:51We still see that they're searching for Network of Dread.

0:35:51 > 0:35:55That's an indication that we may have a confirmation bias operating.

0:35:57 > 0:35:59The Network of Dread are the big guys -

0:35:59 > 0:36:04they've done it before, so you would expect they'd do it again.

0:36:04 > 0:36:07And I think we're starting to see some biases here.

0:36:08 > 0:36:11Analysts desperately want to get to the correct answer,

0:36:11 > 0:36:15but they're affected by the same biases as the rest of us.

0:36:15 > 0:36:19So far, most of our analysts seem to believe

0:36:19 > 0:36:24that the Network of Dread is responsible for planning this attack,

0:36:24 > 0:36:26and that is completely wrong.

0:36:30 > 0:36:31How are we doing?

0:36:33 > 0:36:36It's time for the analysts to put themselves on the line

0:36:36 > 0:36:40and decide who the terrorists are and what they're planning.

0:36:42 > 0:36:43So what do you think?

0:36:43 > 0:36:47It was a bio-terrorist attack.

0:36:47 > 0:36:49- I had a different theory. - What's your theory?

0:36:49 > 0:36:51Cos I may be missing something here, too.

0:36:51 > 0:36:54They know that the Network of Dread is a terrorist group.

0:36:54 > 0:36:58They know that the Masters of Chaos is a cyber-hacking group.

0:36:58 > 0:37:01Either to the new factory or in the water supply.

0:37:01 > 0:37:05Lots of dead fish floating up in the river.

0:37:06 > 0:37:09The question is, did any of the analysts manage to dig out

0:37:09 > 0:37:12the relevant clues and find the true threat?

0:37:14 > 0:37:18In this case, the actual threat is due to the cyber-group,

0:37:18 > 0:37:20the Masters of Chaos,

0:37:20 > 0:37:23who become increasingly radicalised throughout the scenario

0:37:23 > 0:37:27and decide to take out their anger on society, essentially.

0:37:27 > 0:37:31Who convinced them to switch from cyber-crime to bio-terrorism?

0:37:31 > 0:37:34Or did they succumb to confirmation bias

0:37:34 > 0:37:38and simply pin the blame on the usual suspects?

0:37:38 > 0:37:40Will they make that connection?

0:37:40 > 0:37:43Will they process that evidence and assess it accordingly,

0:37:43 > 0:37:46or will their confirmation bias drive them

0:37:46 > 0:37:50to believe that it's a more traditional type of terrorist group?

0:37:50 > 0:37:53I believe that the Masters of Chaos are actually the ones behind it.

0:37:53 > 0:37:56It's either a threat or not a threat, but the Network of Dread?

0:37:56 > 0:37:59And time's up. Please go ahead and save those reports.

0:38:02 > 0:38:03At the end of the exercise,

0:38:03 > 0:38:08Kretz reveals the true identity of the terrorists.

0:38:08 > 0:38:10We have a priority message from City Hall.

0:38:12 > 0:38:15The terrorist attack was thwarted,

0:38:15 > 0:38:20the planned bio-terrorist attack by the Masters of Chaos

0:38:20 > 0:38:22against Vastopolis was thwarted.

0:38:22 > 0:38:25The mayor expresses his thanks for a job well done.

0:38:25 > 0:38:29Show of hands, who...who got it?

0:38:30 > 0:38:32Yeah.

0:38:32 > 0:38:37Out of 12 subjects, 11 of them got the wrong answer.

0:38:38 > 0:38:42The only person to spot the true threat was in fact a novice.

0:38:44 > 0:38:48All the trained experts fell prey to confirmation bias.

0:38:53 > 0:38:55It is not typically the case

0:38:55 > 0:38:57that simply being trained as an analyst

0:38:57 > 0:39:00gives you the tools you need to overcome cognitive bias.

0:39:02 > 0:39:05You can learn techniques for memory improvement.

0:39:05 > 0:39:08You can learn techniques for better focus,

0:39:08 > 0:39:11but techniques to eliminate cognitive bias

0:39:11 > 0:39:13just simply don't work.

0:39:19 > 0:39:21And for intelligence analysts in the real world,

0:39:21 > 0:39:26the implications of making mistakes from these biases are drastic.

0:39:28 > 0:39:32Government reports and studies over the past decade or so

0:39:32 > 0:39:35have cited experts as believing that cognitive bias

0:39:35 > 0:39:37may have played a role

0:39:37 > 0:39:41in a number of very significant intelligence failures,

0:39:41 > 0:39:44and yet it remains an understudied problem.

0:39:51 > 0:39:53Heads.

0:39:54 > 0:39:55Heads.

0:39:58 > 0:39:59But the area of our lives

0:39:59 > 0:40:04in which these systematic mistakes have the most explosive impact

0:40:04 > 0:40:06is in the world of money.

0:40:07 > 0:40:11The moment money enters the picture, the rules change.

0:40:13 > 0:40:16Many of us think that we're at our most rational

0:40:16 > 0:40:18when it comes to decisions about money.

0:40:20 > 0:40:24We like to think we know how to spot a bargain, to strike a good deal,

0:40:24 > 0:40:28sell our house at the right time, invest wisely.

0:40:29 > 0:40:32Thinking about money the right way

0:40:32 > 0:40:36is one of the most challenging things for human nature.

0:40:38 > 0:40:41But if we're not as rational as we like to think,

0:40:41 > 0:40:45and there is a hidden force at work shaping our decisions,

0:40:45 > 0:40:47are we deluding ourselves?

0:40:47 > 0:40:51Money brings with it a mode of thinking.

0:40:51 > 0:40:54It changes the way we react to the world.

0:40:55 > 0:40:57When it comes to money,

0:40:57 > 0:41:00cognitive biases play havoc with our best intentions.

0:41:02 > 0:41:05There are many mistakes that people make when it comes to money.

0:41:18 > 0:41:21Kahneman's insights into our mistakes with money

0:41:21 > 0:41:25were to revolutionise our understanding of economics.

0:41:27 > 0:41:29It's all about a crucial difference in how we feel

0:41:29 > 0:41:32when we win or lose

0:41:32 > 0:41:35and our readiness to take a risk.

0:41:35 > 0:41:37I would like to take a risk.

0:41:37 > 0:41:39Take a risk, OK?

0:41:39 > 0:41:40Let's take a risk.

0:41:40 > 0:41:44Our willingness to take a gamble is very different depending on

0:41:44 > 0:41:47whether we are faced with a loss or a gain.

0:41:47 > 0:41:49Excuse me, guys, can you spare two minutes to help us

0:41:49 > 0:41:52with a little experiment? Try and win as much money as you can, OK?

0:41:52 > 0:41:54- OK.- OK?

0:41:54 > 0:41:56In my hands here I have £20, OK?

0:41:56 > 0:41:58Here are two scenarios.

0:41:58 > 0:41:59And I'm going to give you ten.

0:41:59 > 0:42:03In the first case, you are given ten pounds.

0:42:03 > 0:42:06That's now yours. Put it in your pocket, take it away,

0:42:06 > 0:42:08spend it on a drink on the South Bank later.

0:42:08 > 0:42:09- OK.- OK?

0:42:09 > 0:42:10OK.

0:42:10 > 0:42:15Then you have to make a choice about how much more you could gain.

0:42:15 > 0:42:17You can either take the safe option, in which case,

0:42:17 > 0:42:19I give you an additional five,

0:42:19 > 0:42:21or you can take a risk.

0:42:21 > 0:42:24If you take a risk, I'm going to flip this coin.

0:42:24 > 0:42:28If it comes up heads, you win ten,

0:42:28 > 0:42:30but if it comes up tails,

0:42:30 > 0:42:32you're not going to win any more.

0:42:32 > 0:42:36Would you choose the safe option and get an extra five pounds

0:42:36 > 0:42:40or take a risk and maybe win an extra ten or nothing?

0:42:40 > 0:42:41Which is it going to be?

0:42:45 > 0:42:47I'd go safe.

0:42:47 > 0:42:49- Safe, five?- Yeah.

0:42:49 > 0:42:52- Take five.- You'd take five? - Yeah, man.- Sure? There we go.

0:42:52 > 0:42:54Most people presented with this choice

0:42:54 > 0:42:56go for the certainty of the extra fiver.

0:42:56 > 0:42:59- Thank you very much. - Told you it was easy.

0:42:59 > 0:43:03In a winning frame of mind, people are naturally rather cautious.

0:43:03 > 0:43:04That's yours, too.

0:43:04 > 0:43:06That was it?

0:43:06 > 0:43:09- That was it.- Really?- Yes.- Eh?

0:43:11 > 0:43:13But what about losing?

0:43:13 > 0:43:17Are we similarly cautious when faced with a potential loss?

0:43:17 > 0:43:20In my hands, I've got £20 and I'm going to give that to you.

0:43:20 > 0:43:21That's now yours.

0:43:21 > 0:43:23- OK.- You can put it in your handbag.

0:43:23 > 0:43:26This time, you're given £20.

0:43:28 > 0:43:30And again, you must make a choice.

0:43:32 > 0:43:37Would you choose to accept a safe loss of £5 or would you take a risk?

0:43:39 > 0:43:42If you take a risk, I'm going to flip this coin.

0:43:42 > 0:43:45If it comes up heads, you don't lose anything,

0:43:45 > 0:43:48but if it comes up tails, then you lose ten pounds.

0:43:49 > 0:43:52In fact, it's exactly the same outcome.

0:43:52 > 0:43:55In both cases, you face a choice between ending up with

0:43:55 > 0:44:00a certain £15 or tossing a coin to get either ten or twenty.
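
Slow System 2 could verify that equivalence with a few lines of arithmetic; a minimal sketch of the two frames:

    # Both frames lead to identical outcomes; only the wording differs.
    # Gain frame: given £10, then +£5 for sure, or a fair coin for +£10/+£0.
    gain_safe = 10 + 5
    gain_gamble = (10 + 10, 10 + 0)    # (heads, tails)

    # Loss frame: given £20, then -£5 for sure, or a fair coin for -£0/-£10.
    loss_safe = 20 - 5
    loss_gamble = (20 - 0, 20 - 10)    # (heads, tails)

    ev = lambda outcomes: sum(outcomes) / len(outcomes)  # fair coin
    print(gain_safe == loss_safe)        # True: £15 for certain either way
    print(gain_gamble == loss_gamble)    # True: £20 or £10 either way
    print(ev(gain_gamble) == gain_safe)  # True: the gamble's EV is also £15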

0:44:01 > 0:44:06- I will risk losing ten or nothing. - OK.

0:44:06 > 0:44:09But the crucial surprise here is that when the choice is framed

0:44:09 > 0:44:12in terms of a loss, most people take a risk.

0:44:13 > 0:44:16- Take a risk.- Take a risk, OK.

0:44:16 > 0:44:18- I'll risk it.- You'll risk it? OK.

0:44:18 > 0:44:20Our slow System 2 could probably work out

0:44:20 > 0:44:23that the outcome is the same in both cases.

0:44:23 > 0:44:24And that's heads, you win.

0:44:24 > 0:44:27But it's too limited and too lazy.

0:44:27 > 0:44:29That's the easiest £20 you'll ever make.

0:44:29 > 0:44:34Instead, fast System 1 makes a rough guess based on change.

0:44:34 > 0:44:38- And that's all there is to it, thank you very much.- Oh, no! Look.

0:44:38 > 0:44:41And System 1 doesn't like losing.

0:44:47 > 0:44:49If you were to lose £10 in the street today

0:44:49 > 0:44:53and then find £10 tomorrow, you would be financially unchanged

0:44:53 > 0:44:55but actually we respond to changes,

0:44:55 > 0:45:00so the pain of the loss of £10 looms much larger, it feels more painful.

0:45:00 > 0:45:02In fact, you'd probably have to find £20

0:45:02 > 0:45:05to offset the pain that you feel by losing ten.

0:45:05 > 0:45:06Heads.

0:45:06 > 0:45:10At the heart of this, is a bias called loss aversion,

0:45:10 > 0:45:13which affects many of our financial decisions.

0:45:13 > 0:45:17People think in terms of gains and losses.

0:45:17 > 0:45:20- Heads.- It's tails.- Oh!

0:45:20 > 0:45:25And in their thinking, typically, losses loom larger than gains.

0:45:26 > 0:45:29We even have an idea by...by how much,

0:45:29 > 0:45:33by roughly a factor of two or a little more than two.

0:45:34 > 0:45:36That is loss aversion,

0:45:36 > 0:45:38and it certainly was the most important thing

0:45:38 > 0:45:40that emerged from our work.
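
Kahneman and Tversky captured this asymmetry in prospect theory's value function. A minimal sketch using the median parameters they later estimated (Tversky and Kahneman, 1992: curvature about 0.88, loss-aversion coefficient about 2.25):

    # Prospect theory value function, 1992 median parameter estimates.
    ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

    def value(x):
        # Subjective value of a gain or loss x, relative to a reference point.
        return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** BETA

    # Losing £10 hurts roughly twice as much as winning £10 feels good.
    print(f"v(+10) = {value(10):+.1f}")   # about +7.6
    print(f"v(-10) = {value(-10):+.1f}")  # about -17.1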

0:45:43 > 0:45:46It's a vital insight into human behaviour,

0:45:46 > 0:45:49so important that it led to a Nobel prize

0:45:49 > 0:45:53and the founding of an entirely new branch of economics.

0:45:55 > 0:45:58When we think we're winning, we don't take risks.

0:45:59 > 0:46:04But when we're faced with a loss, frankly, we're a bit reckless.

0:46:14 > 0:46:16But loss aversion doesn't just affect people

0:46:16 > 0:46:19making casual five pound bets.

0:46:19 > 0:46:23It can affect anyone at any time,

0:46:23 > 0:46:27including those who work in the complex system of high finance,

0:46:27 > 0:46:30in which trillions of dollars are traded.

0:46:32 > 0:46:35In our current complex environments,

0:46:35 > 0:46:39we now have the means as well as the motive

0:46:39 > 0:46:42to make very serious mistakes.

0:46:44 > 0:46:48The bedrock of economics is that people think rationally.

0:46:48 > 0:46:52They calculate risks, rewards and decide accordingly.

0:46:55 > 0:46:57But we're not always rational.

0:46:57 > 0:47:00We rarely behave like Mr Spock.

0:47:01 > 0:47:05For most of our decisions, we use fast, intuitive,

0:47:05 > 0:47:08but occasionally unreliable System 1.

0:47:11 > 0:47:13And in a global financial market,

0:47:13 > 0:47:16that can lead to very serious problems.

0:47:18 > 0:47:23I think what the financial crisis did was, it simply said,

0:47:23 > 0:47:25"You know what?

0:47:25 > 0:47:30"People are a lot more vulnerable to psychological pitfalls

0:47:30 > 0:47:33"than we really understood before."

0:47:33 > 0:47:38Basically, human psychology is just too flawed

0:47:38 > 0:47:41to expect that we could avert a crisis.

0:47:48 > 0:47:53Understanding these pitfalls has led to a new branch of economics.

0:47:54 > 0:47:56Behavioural economics.

0:47:58 > 0:48:01Thanks to psychologists like Hersh Shefrin,

0:48:01 > 0:48:04it's beginning to establish a toehold in Wall Street.

0:48:05 > 0:48:09It takes account of the way we actually make decisions

0:48:09 > 0:48:11rather than how we say we do.

0:48:17 > 0:48:21Financial crisis, I think, was as large a problem as it was

0:48:21 > 0:48:24because certain psychological traits like optimism,

0:48:24 > 0:48:27over-confidence and confirmation bias

0:48:27 > 0:48:31played a very large role among a part of the economy

0:48:31 > 0:48:36where serious mistakes could be made, and were.

0:48:44 > 0:48:48But for as long as our financial system assumes we are rational,

0:48:48 > 0:48:50our economy will remain vulnerable.

0:48:52 > 0:48:53I'm quite certain

0:48:53 > 0:48:58that if the regulators listened to behavioural economists early on,

0:48:58 > 0:49:01we would have designed a very different financial system

0:49:01 > 0:49:05and we wouldn't have had the incredible increase

0:49:05 > 0:49:10in housing market and we wouldn't have this financial catastrophe.

0:49:11 > 0:49:15And so when Kahneman collected his Nobel prize,

0:49:15 > 0:49:17it wasn't for psychology,

0:49:17 > 0:49:19it was for economics.

0:49:25 > 0:49:29The big question is, what can we do about these systematic mistakes?

0:49:30 > 0:49:34Can we hope to find a way round our fast-thinking biases

0:49:34 > 0:49:36and make better decisions?

0:49:39 > 0:49:40To answer this,

0:49:40 > 0:49:43we need to know the evolutionary origins of our mistakes.

0:49:46 > 0:49:48Just off the coast of Puerto Rico

0:49:48 > 0:49:52is probably the best place in the world to find out.

0:49:59 > 0:50:03The tiny island of Cayo Santiago.

0:50:05 > 0:50:08So we're now in the boat, heading over to Cayo Santiago.

0:50:08 > 0:50:12This is an island filled with a thousand rhesus monkeys.

0:50:12 > 0:50:13Once you pull in, it looks

0:50:13 > 0:50:16a little bit like you're going to Jurassic Park. You're not sure

0:50:16 > 0:50:19what you're going to see. Then you'll see your first monkey,

0:50:19 > 0:50:21and it'll be comfortable, like,

0:50:21 > 0:50:24"Ah, the monkeys are here, everything's great."

0:50:30 > 0:50:33It's an island devoted to monkey research.

0:50:34 > 0:50:37You see the guys hanging out on the cliff up there?

0:50:37 > 0:50:38Pretty cool.

0:50:41 > 0:50:43The really special thing about Cayo Santiago

0:50:43 > 0:50:44is that the animals here,

0:50:44 > 0:50:47because they've grown up over the last seventy years around humans,

0:50:47 > 0:50:49they're completely habituated,

0:50:49 > 0:50:52and that means we can get up close to them, show them stuff,

0:50:52 > 0:50:53look at how they make decisions.

0:50:53 > 0:50:54We're able to do this here

0:50:54 > 0:50:57in a way that we'd never be able to do it anywhere else, really.

0:50:57 > 0:50:58It's really unique.

0:51:02 > 0:51:04Laurie Santos is here to find out

0:51:04 > 0:51:08if monkeys make the same mistakes in their decisions that we do.

0:51:11 > 0:51:14Most of the work we do is comparing humans and other primates,

0:51:14 > 0:51:16trying to ask what's special about humans.

0:51:16 > 0:51:18But really, what we want to understand is,

0:51:18 > 0:51:21what's the evolutionary origin of some of our dumber strategies,

0:51:21 > 0:51:23some of those spots where we get things wrong?

0:51:23 > 0:51:26If we could understand where those came from,

0:51:26 > 0:51:28that's where we'll get some insight.

0:51:31 > 0:51:32If Santos can show us

0:51:32 > 0:51:35that monkeys have the same cognitive biases as us,

0:51:35 > 0:51:38it would suggest that they evolved a long time ago.

0:51:43 > 0:51:49And a mental strategy that old would be almost impossible to change.

0:51:50 > 0:51:53We started this work around the time of the financial collapse.

0:51:56 > 0:51:57So, when we were thinking about

0:51:57 > 0:52:01what dumb strategies could we look at in monkeys, it was pretty obvious

0:52:01 > 0:52:03that some of the human economic strategies

0:52:03 > 0:52:06which were in the news might be the first thing to look at.

0:52:06 > 0:52:09And one of the particular things we wanted to look at was

0:52:09 > 0:52:12whether or not the monkeys are loss averse.

0:52:12 > 0:52:16But monkeys, smart as they are, have yet to start using money.

0:52:16 > 0:52:18And so that was kind of where we started.

0:52:18 > 0:52:21We said, "Well, how can we even ask this question

0:52:21 > 0:52:24"of if monkeys make financial mistakes?"

0:52:24 > 0:52:27And so we decided to do it by introducing the monkeys

0:52:27 > 0:52:31to their own new currency and just let them buy their food.

0:52:32 > 0:52:36So I'll show you some of this stuff we've been up to with the monkeys.

0:52:37 > 0:52:39Back in her lab at Yale,

0:52:39 > 0:52:42she introduced a troop of monkeys to their own market,

0:52:42 > 0:52:46giving them round shiny tokens they could exchange for food.

0:52:48 > 0:52:50So here's Holly.

0:52:50 > 0:52:52She comes in, hands over a token

0:52:52 > 0:52:55and you can see, she just gets to grab the grape there.

0:52:55 > 0:52:57One of the first things we wondered was just,

0:52:57 > 0:52:59can they in some sense learn

0:52:59 > 0:53:02that a different store sells different food at different prices?

0:53:02 > 0:53:06So what we did was, we presented the monkeys with situations

0:53:06 > 0:53:10where they met traders who sold different goods at different rates.

0:53:10 > 0:53:13So what you'll see in this clip is the monkeys meeting a new trader.

0:53:13 > 0:53:17She's actually selling grapes for three grapes per one token.

0:53:21 > 0:53:23And what we found is that in this case,

0:53:23 > 0:53:24the monkeys are pretty rational.

0:53:24 > 0:53:26So when they get a choice of a guy who sells, you know,

0:53:26 > 0:53:30three goods for one token, they actually shop more at that guy.

0:53:36 > 0:53:38Having taught the monkeys the value of money,

0:53:38 > 0:53:42the next step was to see if monkeys, like humans,

0:53:42 > 0:53:46suffer from that most crucial bias, loss aversion.

0:53:48 > 0:53:52And so what we did was, we introduced the monkeys to traders

0:53:52 > 0:53:56who either gave out losses or gains relative to what they should.

0:53:57 > 0:54:00So I could make the monkey think he's getting a bonus

0:54:00 > 0:54:01simply by having him trade

0:54:01 > 0:54:04with a trader who's starting with a single grape

0:54:04 > 0:54:06but then when the monkey pays this trader,

0:54:06 > 0:54:08she actually gives him an extra, so she gives him a bonus.

0:54:08 > 0:54:10At the end, the monkey gets two,

0:54:10 > 0:54:13but he thinks he got that second one as a bonus.

0:54:13 > 0:54:16We can then compare what the monkeys do with that guy

0:54:16 > 0:54:18versus a guy who gives the monkey losses.

0:54:18 > 0:54:19This is a guy who shows up,

0:54:19 > 0:54:22who pretends he's going to sell three grapes,

0:54:22 > 0:54:25but then when the monkey actually pays this trader,

0:54:25 > 0:54:28he'll take one of the grapes away and give the monkeys only two.

0:54:30 > 0:54:33The big question then is how the monkeys react

0:54:33 > 0:54:36when faced with a choice between a loss and a gain.

0:54:37 > 0:54:40So she'll come in, she's met these two guys before.

0:54:40 > 0:54:42You can see she goes with the bonus option,

0:54:42 > 0:54:46even waits patiently for her additional piece to be added here,

0:54:47 > 0:54:51and then takes the bonus, avoiding the person who gives her losses.

0:54:57 > 0:55:00So monkeys hate losing just as much as people.

0:55:08 > 0:55:11And crucially, Santos found that monkeys, as well,

0:55:11 > 0:55:15are more likely to take risks when faced with a loss.

0:55:19 > 0:55:20This suggests to us

0:55:20 > 0:55:23that the monkeys seem to frame their decisions

0:55:23 > 0:55:24in exactly the same way we do.

0:55:24 > 0:55:27They're not thinking just about the absolute,

0:55:27 > 0:55:29they're thinking relative to what they expect.

0:55:29 > 0:55:32And when they're getting less than they expect,

0:55:32 > 0:55:35when they're getting losses, they too become more risk-seeking.

0:55:38 > 0:55:42The fact that we share this bias with these monkeys suggests

0:55:42 > 0:55:45that it's an ancient strategy etched into our DNA

0:55:45 > 0:55:47more than 35 million years ago.

0:55:50 > 0:55:53And what we learn from the monkeys is that

0:55:53 > 0:55:55if this bias is really that old,

0:55:55 > 0:55:59if we really have had this strategy for the last 35 million years,

0:55:59 > 0:56:02simply deciding to overcome it is just not going to work.

0:56:02 > 0:56:06We need better ways to make ourselves avoid some of these pitfalls.

0:56:08 > 0:56:13Making mistakes, it seems, is just part of what it is to be human.

0:56:19 > 0:56:23We are stuck with our intuitive inner stranger.

0:56:26 > 0:56:29The challenge this poses is profound.

0:56:31 > 0:56:34If it's human nature to make these predictable mistakes

0:56:34 > 0:56:38and we can't change that, what, then, can we do?

0:56:40 > 0:56:43We need to accept ourselves as we are.

0:56:43 > 0:56:46The cool thing about being a human versus a monkey

0:56:46 > 0:56:50is that we have a deliberative self that can reflect on our biases.

0:56:50 > 0:56:53System 2 in us has for the first time realised

0:56:53 > 0:56:56that there's a System 1, and with that realisation,

0:56:56 > 0:56:59we can shape the way we set up policies.

0:56:59 > 0:57:03We can shape the way we set up situations to allow ourselves

0:57:03 > 0:57:04to make better decisions.

0:57:04 > 0:57:07This is the first time in evolution that this has happened.

0:57:12 > 0:57:16If we want to avoid mistakes, we have to reshape the environment

0:57:16 > 0:57:20we've built around us rather than hope to change ourselves.

0:57:27 > 0:57:31We've achieved a lot despite all of these biases.

0:57:31 > 0:57:34If we are aware of them, we can probably do things

0:57:34 > 0:57:38like design our institutions and our regulations

0:57:38 > 0:57:43and our own personal environments and working lives to minimise

0:57:43 > 0:57:45the effect of those biases

0:57:45 > 0:57:49and help us think about how to overcome them.

0:57:54 > 0:57:56We are limited, we're not perfect.

0:57:56 > 0:57:58We're irrational in all kinds of ways,

0:57:58 > 0:58:01but we can build a world that is compatible with this

0:58:01 > 0:58:05and get us to make better decisions rather than worse decisions.

0:58:05 > 0:58:06That's my hope.

0:58:10 > 0:58:13And by accepting our inner stranger,

0:58:13 > 0:58:18we may come to a better understanding of our own minds.

0:58:18 > 0:58:21I think it is important, in general,

0:58:21 > 0:58:24to be aware of where beliefs come from.

0:58:27 > 0:58:30And if we think that we have reasons for what

0:58:30 > 0:58:34we believe, that is often a mistake,

0:58:34 > 0:58:38that our beliefs and our wishes and our hopes

0:58:38 > 0:58:41are not always anchored in reasons.

0:58:41 > 0:58:43They're anchored in something else

0:58:43 > 0:58:46that comes from within and is different.