Line | From | To | |
---|---|---|---|
We like to think that as a species, we are pretty smart. | 0:00:04 | 0:00:08 | |
We like to think we are wise, rational creatures. | 0:00:11 | 0:00:15 | |
I think we all like to think of ourselves as Mr Spock to some degree. | 0:00:15 | 0:00:18 | |
You know, we make rational, conscious decisions. | 0:00:18 | 0:00:20 | |
But we may have to think again. | 0:00:21 | 0:00:22 | |
It's mostly delusion, and we should just wake up to that fact. | 0:00:24 | 0:00:28 | |
In every decision you make, | 0:00:31 | 0:00:32 | |
there's a battle in your mind between intuition and logic. | 0:00:32 | 0:00:36 | |
It's a conflict that plays out in every aspect of your life. | 0:00:38 | 0:00:41 | |
What you eat. | 0:00:43 | 0:00:44 | |
What you believe. | 0:00:45 | 0:00:46 | |
Who you fall in love with. | 0:00:47 | 0:00:49 | |
And most powerfully, in decisions you make about money. | 0:00:50 | 0:00:54 | |
The moment money enters the picture, the rules change. | 0:00:54 | 0:00:58 | |
Scientists now have a new way | 0:00:58 | 0:01:00 | |
to understand this battle in your mind, | 0:01:00 | 0:01:03 | |
how it shapes the decisions you take, what you believe... | 0:01:03 | 0:01:07 | |
And how it has transformed our understanding | 0:01:09 | 0:01:11 | |
of human nature itself. | 0:01:11 | 0:01:13 | |
TAXI HORN BLARES | 0:01:26 | 0:01:27 | |
Sitting in the back of this New York cab is Professor Danny Kahneman. | 0:01:34 | 0:01:38 | |
He's regarded as one of the most influential psychologists alive today. | 0:01:42 | 0:01:46 | |
Over the last 40 years, | 0:01:48 | 0:01:50 | |
he's developed some extraordinary insights | 0:01:50 | 0:01:52 | |
into the way we make decisions. | 0:01:52 | 0:01:54 | |
I think it can't hurt to have a realistic view of human nature | 0:01:58 | 0:02:02 | |
and of how the mind works. | 0:02:02 | 0:02:03 | |
His insights come largely from puzzles. | 0:02:07 | 0:02:09 | |
Take, for instance, the curious puzzle of New York cab drivers | 0:02:13 | 0:02:17 | |
and their highly illogical working habits. | 0:02:17 | 0:02:20 | |
Business varies according to the weather. | 0:02:23 | 0:02:26 | |
On rainy days, everyone wants a cab. | 0:02:27 | 0:02:29 | |
But on sunny days, like today, fares are hard to find. | 0:02:30 | 0:02:34 | |
Logically, they should spend a lot of time driving on rainy days, | 0:02:34 | 0:02:39 | |
because it's very easy to find passengers on rainy days, | 0:02:39 | 0:02:43 | |
and if they are going to take leisure, it should be on sunny days | 0:02:43 | 0:02:46 | |
but it turns out this is not what many of them do. | 0:02:46 | 0:02:50 | |
Many do the opposite, working long hours on slow, sunny days | 0:02:52 | 0:02:57 | |
and knocking off early when it's rainy and busy. | 0:02:57 | 0:02:59 | |
Instead of thinking logically, the cabbies are driven by an urge | 0:03:01 | 0:03:05 | |
to earn a set amount of cash each day, come rain or shine. | 0:03:05 | 0:03:10 | |
Once they hit that target, they go home. | 0:03:11 | 0:03:13 | |
They view being below the target as a loss | 0:03:17 | 0:03:19 | |
and being above the target as a gain, and they care | 0:03:19 | 0:03:22 | |
more about preventing the loss than about achieving the gain. | 0:03:22 | 0:03:26 | |
So when they reach their goal on a rainy day, they stop... | 0:03:28 | 0:03:31 | |
..which really doesn't make sense. | 0:03:35 | 0:03:37 | |
If they were trying to maximise their income, | 0:03:37 | 0:03:39 | |
they would take their leisure on sunny days | 0:03:39 | 0:03:42 | |
and they would drive all day on rainy days. | 0:03:42 | 0:03:44 | |
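The cabbies' income-targeting rule can be made concrete with a toy calculation. The sketch below is not from the programme: the hourly rates, the £200 daily target and the 12-hour shift are illustrative assumptions, chosen only to show why quitting at a fixed target puts the most hours into the slowest days.

```python
# Toy model of the cab-driver puzzle: same total hours, two quitting rules.
# The fare rates, daily target and shift length are illustrative assumptions.

RAINY_RATE = 40   # assumed pounds earned per hour on a busy, rainy day
SUNNY_RATE = 20   # assumed pounds earned per hour on a slow, sunny day
TARGET = 200      # assumed daily income target
MAX_SHIFT = 12    # assumed hours available in a shift

def hours_until_target(rate):
    """Quit as soon as the daily target is reached, or when the shift ends."""
    return min(TARGET / rate, MAX_SHIFT)

# Target rule: short days when it rains, long days when it is sunny.
rainy_hours = hours_until_target(RAINY_RATE)            # 5 hours
sunny_hours = hours_until_target(SUNNY_RATE)            # 10 hours
target_income = rainy_hours * RAINY_RATE + sunny_hours * SUNNY_RATE    # £400

# Income-maximising rule: spend the same 15 hours where they earn the most.
total_hours = rainy_hours + sunny_hours
optimal_rainy = min(total_hours, MAX_SHIFT)
optimal_income = (optimal_rainy * RAINY_RATE
                  + (total_hours - optimal_rainy) * SUNNY_RATE)        # £540

print(f"Target rule: £{target_income:.0f} over {total_hours:.0f} hours")
print(f"Same hours, rainy days first: £{optimal_income:.0f}")
```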
It was this kind of glitch in thinking | 0:03:49 | 0:03:51 | |
that Kahneman realised could reveal something profound | 0:03:51 | 0:03:54 | |
about the inner workings of the mind. | 0:03:54 | 0:03:56 | |
Anyone want to take part in an experiment? | 0:03:58 | 0:04:00 | |
And he began to devise a series of puzzles and questions | 0:04:00 | 0:04:03 | |
which have become classic psychological tests. | 0:04:03 | 0:04:06 | |
-It's a simple experiment. -It's a very attractive game... | 0:04:10 | 0:04:13 | |
Don't worry, sir. Nothing strenuous. | 0:04:13 | 0:04:14 | |
..Posing problems where you can recognise in yourself | 0:04:14 | 0:04:17 | |
that your intuition is going the wrong way. | 0:04:17 | 0:04:21 | |
The type of puzzle where the answer that intuitively springs to mind, | 0:04:21 | 0:04:25 | |
and that seems obvious, is, in fact, wrong. | 0:04:25 | 0:04:27 | |
Here is one that I think works on just about everybody. | 0:04:30 | 0:04:34 | |
I want you to imagine a guy called Steve. | 0:04:34 | 0:04:36 | |
You tell people that Steve, you know, is a meek and tidy soul | 0:04:36 | 0:04:42 | |
with a passion for detail and very little interest in people. | 0:04:42 | 0:04:48 | |
He's got a good eye for detail. | 0:04:48 | 0:04:49 | |
And then you tell people he was drawn at random | 0:04:49 | 0:04:52 | |
from a census of the American population. | 0:04:52 | 0:04:54 | |
What's the probability that he is a farmer or a librarian? | 0:04:54 | 0:04:59 | |
So do you think it's more likely | 0:04:59 | 0:05:01 | |
that Steve's going to end up working as a librarian or a farmer? | 0:05:01 | 0:05:03 | |
What's he more likely to be? | 0:05:03 | 0:05:06 | |
Maybe a librarian. | 0:05:06 | 0:05:07 | |
Librarian. | 0:05:07 | 0:05:09 | |
Probably a librarian. | 0:05:09 | 0:05:10 | |
A librarian. | 0:05:10 | 0:05:11 | |
Immediately, you know, the thought pops to mind | 0:05:11 | 0:05:15 | |
that it's a librarian, because he resembled the prototype of librarian. | 0:05:15 | 0:05:20 | |
Probably a librarian. | 0:05:20 | 0:05:22 | |
In fact, that's probably the wrong answer, | 0:05:22 | 0:05:25 | |
because, at least in the United States, | 0:05:25 | 0:05:28 | |
there are 20 times as many male farmers as male librarians. | 0:05:28 | 0:05:32 | |
-Librarian. -Librarian. | 0:05:32 | 0:05:33 | |
So there are probably more meek and tidy souls you know | 0:05:33 | 0:05:37 | |
who are farmers than meek and tidy souls who are librarians. | 0:05:37 | 0:05:40 | |
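The arithmetic behind the "Steve" puzzle is a simple base-rate calculation. In the sketch below, the 20-to-1 ratio of male farmers to male librarians comes from the programme; the two "meek and tidy" likelihoods are illustrative assumptions, and the point survives any plausible choice of them.

```python
# Base-rate sketch of the "Steve" puzzle.
# The 20:1 farmer-to-librarian ratio is quoted in the programme;
# the "meek and tidy" likelihoods below are illustrative assumptions.

farmers_per_librarian = 20

p_meek_given_librarian = 0.40   # assumed: many librarians fit the description
p_meek_given_farmer = 0.10      # assumed: far fewer farmers do

# For every 1 librarian there are 20 farmers, so count the "meek" in each group.
meek_librarians = 1 * p_meek_given_librarian                # 0.4
meek_farmers = farmers_per_librarian * p_meek_given_farmer  # 2.0

p_librarian_given_meek = meek_librarians / (meek_librarians + meek_farmers)
print(f"P(librarian | meek and tidy) ≈ {p_librarian_given_meek:.2f}")   # ≈ 0.17
```

Even if a librarian is four times more likely than a farmer to fit the description, the sheer number of farmers means a randomly drawn "meek and tidy" man is still probably a farmer.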
This type of puzzle seemed to reveal a discrepancy | 0:05:45 | 0:05:49 | |
between intuition and logic. | 0:05:49 | 0:05:51 | |
Another example is... | 0:05:51 | 0:05:52 | |
Imagine a dictionary. I'm going to pull a word out of it at random. | 0:05:52 | 0:05:55 | |
Which is more likely, | 0:05:55 | 0:05:56 | |
that a word that you pick out at random has the letter | 0:05:56 | 0:06:02 | |
R in the first position or has the letter R in the third position? | 0:06:02 | 0:06:07 | |
-Erm, start with the letter R. -OK. | 0:06:07 | 0:06:09 | |
People think the first position, | 0:06:10 | 0:06:12 | |
because it's easy to think of examples. | 0:06:12 | 0:06:15 | |
-Start with it. -First. | 0:06:15 | 0:06:16 | |
In fact, there are nearly three times as many words with | 0:06:17 | 0:06:20 | |
R as the third letter, than words that begin with R, | 0:06:20 | 0:06:24 | |
but that's not what our intuition tells us. | 0:06:24 | 0:06:27 | |
So we have examples like that. Like, many of them. | 0:06:27 | 0:06:31 | |
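The first-versus-third-letter claim is easy to test against any machine-readable word list. A minimal sketch, assuming a plain-text list such as the Unix /usr/share/dict/words file (that path, and the filtering, are assumptions; the programme's "nearly three times" figure refers to how often words occur in ordinary English, so a raw dictionary count may differ):

```python
# Count words with "r" as the first letter versus "r" as the third letter.
# The word-list path and filtering are assumptions made for illustration.

def count_r_positions(path="/usr/share/dict/words"):
    first = third = 0
    with open(path) as f:
        for line in f:
            word = line.strip().lower()
            if len(word) < 3 or not word.isalpha():
                continue
            if word[0] == "r":
                first += 1
            if word[2] == "r":
                third += 1
    return first, third

if __name__ == "__main__":
    first, third = count_r_positions()
    print(f"'r' first: {first}, 'r' third: {third}")
```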
Kahneman's interest in human error was first sparked in the 1970s | 0:06:34 | 0:06:38 | |
when he and his colleague, Amos Tversky, | 0:06:38 | 0:06:41 | |
began looking at their own mistakes. | 0:06:41 | 0:06:43 | |
It was all in ourselves. | 0:06:45 | 0:06:47 | |
That is, all the mistakes that we studied were mistakes | 0:06:47 | 0:06:50 | |
that we were prone to make. | 0:06:50 | 0:06:52 | |
In my hand here, I've got £100. | 0:06:52 | 0:06:55 | |
Kahneman and Tversky found a treasure trove of these puzzles. | 0:06:55 | 0:06:59 | |
Which would you prefer? | 0:06:59 | 0:07:00 | |
They unveiled a catalogue of human error. | 0:07:00 | 0:07:03 | |
Would you rather go to Rome with a free breakfast? | 0:07:04 | 0:07:07 | |
And opened a Pandora's Box of mistakes. | 0:07:07 | 0:07:10 | |
A year and a day. | 0:07:10 | 0:07:12 | |
25? | 0:07:12 | 0:07:14 | |
But the really interesting thing about these mistakes is, | 0:07:14 | 0:07:17 | |
they're not accidents. | 0:07:17 | 0:07:18 | |
75? | 0:07:18 | 0:07:20 | |
They have a shape, a structure. | 0:07:20 | 0:07:22 | |
I think Rome. | 0:07:22 | 0:07:23 | |
Skewing our judgment. | 0:07:23 | 0:07:25 | |
20. | 0:07:25 | 0:07:26 | |
What makes them interesting is that they are not random errors. | 0:07:26 | 0:07:29 | |
They are biases, so the difference between a bias | 0:07:29 | 0:07:32 | |
and a random error is that a bias is predictable. | 0:07:32 | 0:07:36 | |
It's a systematic error that is predictable. | 0:07:36 | 0:07:38 | |
Kahneman's puzzles prompt the wrong reply again. | 0:07:40 | 0:07:44 | |
More likely. | 0:07:44 | 0:07:45 | |
And again. | 0:07:45 | 0:07:46 | |
-More likely. -More likely? | 0:07:46 | 0:07:48 | |
And again. | 0:07:48 | 0:07:49 | |
Probably more likely? | 0:07:49 | 0:07:50 | |
It's a pattern of human error that affects every single one of us. | 0:07:51 | 0:07:55 | |
On their own, they may seem small. | 0:07:57 | 0:07:59 | |
Ah, that seems to be the right drawer. | 0:08:01 | 0:08:04 | |
But by rummaging around in our everyday mistakes... | 0:08:04 | 0:08:07 | |
That's very odd. | 0:08:07 | 0:08:09 | |
..Kahneman started a revolution in our understanding of human thinking. | 0:08:09 | 0:08:13 | |
A revolution so profound | 0:08:15 | 0:08:17 | |
and far-reaching that he was awarded a Nobel prize. | 0:08:17 | 0:08:20 | |
So if you want to see the medal, that's what it looks like. | 0:08:22 | 0:08:27 | |
That's it. | 0:08:30 | 0:08:31 | |
Psychologists have long strived to pick apart the moments | 0:08:38 | 0:08:42 | |
when people make decisions. | 0:08:42 | 0:08:43 | |
Much of the focus has been on our rational mind, | 0:08:45 | 0:08:48 | |
our capacity for logic. | 0:08:48 | 0:08:51 | |
But Kahneman saw the mind differently. | 0:08:51 | 0:08:55 | |
He saw a much more powerful role for the other side of our minds, | 0:08:55 | 0:08:58 | |
intuition. | 0:08:58 | 0:09:00 | |
And at the heart of human thinking, | 0:09:02 | 0:09:04 | |
there's a conflict between logic and intuition that leads to mistakes. | 0:09:04 | 0:09:08 | |
Kahneman and Tversky started this trend | 0:09:09 | 0:09:11 | |
of seeing the mind differently. | 0:09:11 | 0:09:13 | |
They found these decision-making illusions, | 0:09:15 | 0:09:17 | |
these spots where our intuitions just make us decide | 0:09:17 | 0:09:20 | |
these things that just don't make any sense. | 0:09:20 | 0:09:23 | |
The work of Kahneman and Tversky has really been revolutionary. | 0:09:23 | 0:09:26 | |
It kicked off a flurry of experimentation | 0:09:29 | 0:09:31 | |
and observation to understand the meaning of these mistakes. | 0:09:31 | 0:09:35 | |
People didn't really appreciate, as recently as 40 years ago, | 0:09:37 | 0:09:40 | |
that the mind didn't really work like a computer. | 0:09:40 | 0:09:45 | |
We thought that we were very deliberative, conscious creatures | 0:09:45 | 0:09:48 | |
who weighed up the costs and benefits of action, | 0:09:48 | 0:09:51 | |
just like Mr Spock would do. | 0:09:51 | 0:09:52 | |
By now, it's a fairly coherent body of work | 0:09:56 | 0:10:00 | |
about ways in which | 0:10:00 | 0:10:01 | |
intuition departs from the rules, if you will. | 0:10:01 | 0:10:07 | |
And the body of evidence is growing. | 0:10:11 | 0:10:13 | |
Some of the best clues to the working of our minds come not | 0:10:16 | 0:10:19 | |
when we get things right, but when we get things wrong. | 0:10:19 | 0:10:22 | |
In a corner of this otherwise peaceful campus, | 0:10:35 | 0:10:38 | |
Professor Chris Chabris is about to start a fight. | 0:10:38 | 0:10:42 | |
All right, so what I want you guys to do is stay in this area over here. | 0:10:42 | 0:10:49 | |
The two big guys grab you | 0:10:49 | 0:10:51 | |
and sort of like start pretending to punch you, make some sound effects. | 0:10:51 | 0:10:54 | |
All right, this looks good. | 0:10:54 | 0:10:56 | |
All right, that seemed pretty good to me. | 0:10:58 | 0:10:59 | |
It's part of an experiment | 0:11:01 | 0:11:03 | |
that shows a pretty shocking mistake that any one of us could make. | 0:11:03 | 0:11:06 | |
A mistake where you don't notice | 0:11:08 | 0:11:10 | |
what's happening right in front of your eyes. | 0:11:10 | 0:11:12 | |
As well as a fight, the experiment also involves a chase. | 0:11:19 | 0:11:22 | |
It was inspired by an incident in Boston in 1995, | 0:11:26 | 0:11:29 | |
when a young police officer, Kenny Conley, | 0:11:29 | 0:11:32 | |
was in hot pursuit of a murder suspect. | 0:11:32 | 0:11:34 | |
It turned out that this police officer, | 0:11:37 | 0:11:39 | |
while he was chasing the suspect, | 0:11:39 | 0:11:41 | |
had run right past some other police officers | 0:11:41 | 0:11:43 | |
who were beating up another suspect, which, of course, | 0:11:43 | 0:11:46 | |
police officers are not supposed to do under any circumstances. | 0:11:46 | 0:11:49 | |
When the police tried to investigate this case of police brutality, | 0:11:50 | 0:11:54 | |
he said, "I didn't see anything going on there, | 0:11:54 | 0:11:57 | |
"all I saw was the suspect I was chasing." | 0:11:57 | 0:11:59 | |
And nobody could believe this | 0:11:59 | 0:12:00 | |
and he was prosecuted for perjury and obstruction of justice. | 0:12:00 | 0:12:04 | |
Everyone was convinced that Conley was lying. | 0:12:05 | 0:12:08 | |
We don't want you to be, like, closer than about... | 0:12:08 | 0:12:11 | |
Everyone, that is, apart from Chris Chabris. | 0:12:11 | 0:12:13 | |
He wondered if our ability to pay attention | 0:12:17 | 0:12:19 | |
is so limited that any one of us could run past a vicious fight | 0:12:19 | 0:12:23 | |
without even noticing. | 0:12:23 | 0:12:24 | |
And it's something he's putting to the test. | 0:12:27 | 0:12:29 | |
Now, when you see someone jogging across the footbridge, | 0:12:29 | 0:12:32 | |
then you should get started. | 0:12:32 | 0:12:34 | |
Jackie, you can go. | 0:12:34 | 0:12:35 | |
In the experiment, | 0:12:40 | 0:12:41 | |
the subjects are asked to focus carefully on a cognitive task. | 0:12:41 | 0:12:44 | |
They must count the number of times | 0:12:45 | 0:12:47 | |
the runner taps her head with each hand. | 0:12:47 | 0:12:49 | |
Would they, like the Boston police officer, | 0:12:56 | 0:12:58 | |
be so blinded by their limited attention | 0:12:58 | 0:13:00 | |
that they would completely fail to notice the fight? | 0:13:00 | 0:13:03 | |
About 45 seconds or a minute into the run, there was the fight. | 0:13:07 | 0:13:10 | |
And they could actually see the fight from a ways away, | 0:13:10 | 0:13:14 | |
and it was about 20 feet away from them when they got closest to them. | 0:13:14 | 0:13:17 | |
The fight is right in their field of view, | 0:13:20 | 0:13:22 | |
and at least partially visible from as far back as the footbridge. | 0:13:22 | 0:13:26 | |
It seems incredible that anyone would fail to notice something | 0:13:34 | 0:13:37 | |
so apparently obvious. | 0:13:37 | 0:13:38 | |
They completed the three-minute course | 0:13:42 | 0:13:44 | |
and then we said, "Did you notice anything unusual?" | 0:13:44 | 0:13:47 | |
Yes. | 0:13:47 | 0:13:48 | |
-What was it? -It was a fight. | 0:13:48 | 0:13:50 | |
Sometimes they would have noticed the fight and they would say, | 0:13:50 | 0:13:53 | |
"Yeah, I saw some guys fighting", but a large percentage of people said | 0:13:53 | 0:13:57 | |
"We didn't see anything unusual at all." | 0:13:57 | 0:13:59 | |
And when we asked them specifically | 0:13:59 | 0:14:00 | |
about whether they saw anybody fighting, they still said no. | 0:14:00 | 0:14:03 | |
In fact, nearly 50% of people in the experiment | 0:14:10 | 0:14:12 | |
completely failed to notice the fight. | 0:14:12 | 0:14:15 | |
Did you see anything unusual during the run? | 0:14:17 | 0:14:20 | |
No. | 0:14:20 | 0:14:21 | |
OK. Did you see some people fighting? | 0:14:21 | 0:14:23 | |
No. | 0:14:23 | 0:14:25 | |
We did it at night time and we did it in the daylight. | 0:14:28 | 0:14:31 | |
Even when we did it in daylight, | 0:14:32 | 0:14:34 | |
many people ran right past the fight and didn't notice it at all. | 0:14:34 | 0:14:38 | |
Did you see anything unusual during the run? | 0:14:41 | 0:14:43 | |
No, not really. | 0:14:43 | 0:14:45 | |
OK, did you see some people fighting? | 0:14:45 | 0:14:47 | |
No. | 0:14:47 | 0:14:48 | |
-You really didn't see anyone fighting? -No. | 0:14:48 | 0:14:50 | |
Does it surprise you that you would have missed that? | 0:14:50 | 0:14:52 | |
They were about 20 feet off the path. | 0:14:52 | 0:14:55 | |
-Oh! -You ran right past them. | 0:14:55 | 0:14:56 | |
-Completely missed that, then. -OK. | 0:14:56 | 0:14:58 | |
Maybe what happened to Conley was, | 0:15:00 | 0:15:02 | |
when you're really paying attention to one thing | 0:15:02 | 0:15:04 | |
and focusing a lot of mental energy on it, you can miss things | 0:15:04 | 0:15:07 | |
that other people are going to think are completely obvious, and in fact, | 0:15:07 | 0:15:10 | |
that's what the jurors said after Conley's trial. | 0:15:10 | 0:15:12 | |
They said "We couldn't believe that he could miss something like that". | 0:15:12 | 0:15:15 | |
It didn't make any sense. He had to have been lying. | 0:15:15 | 0:15:17 | |
It's an unsettling phenomenon called inattentional blindness | 0:15:20 | 0:15:24 | |
that can affect us all. | 0:15:24 | 0:15:26 | |
Some people have said things like | 0:15:27 | 0:15:29 | |
"This shatters my faith in my own mind", | 0:15:29 | 0:15:30 | |
or, "Now I don't know what to believe", | 0:15:30 | 0:15:32 | |
or, "I'm going to be confused from now on." | 0:15:32 | 0:15:35 | |
But I'm not sure that that feeling really stays with them very long. | 0:15:35 | 0:15:38 | |
They are going to go out from the experiment, you know, | 0:15:38 | 0:15:40 | |
walk to the next place they're going or something like that | 0:15:40 | 0:15:43 | |
and they're going to have just as much inattentional blindness | 0:15:43 | 0:15:45 | |
when they're walking down the street that afternoon as they did before. | 0:15:45 | 0:15:48 | |
This experiment reveals a powerful quandary about our minds. | 0:15:53 | 0:15:56 | |
We glide through the world blissfully unaware | 0:15:58 | 0:16:00 | |
of most of what we do and how little we really know our minds. | 0:16:00 | 0:16:04 | |
For all its brilliance, | 0:16:07 | 0:16:08 | |
the part of our mind we call ourselves is extremely limited. | 0:16:08 | 0:16:12 | |
So how do we manage to navigate our way through | 0:16:15 | 0:16:17 | |
the complexity of daily life? | 0:16:17 | 0:16:19 | |
Every day, each one of us | 0:16:31 | 0:16:32 | |
makes somewhere between two and 10,000 decisions. | 0:16:32 | 0:16:35 | |
When you think about our daily lives, | 0:16:40 | 0:16:42 | |
it's really a long, long sequence of decisions. | 0:16:42 | 0:16:45 | |
We make decisions probably at a frequency | 0:16:47 | 0:16:50 | |
that is close to the frequency we breathe. | 0:16:50 | 0:16:52 | |
Every minute, every second, you're deciding where to move your legs, | 0:16:52 | 0:16:56 | |
and where to move your eyes, and where to move your limbs, | 0:16:56 | 0:16:58 | |
and when you're eating a meal, | 0:16:58 | 0:17:00 | |
you're making all kinds of decisions. | 0:17:00 | 0:17:01 | |
And yet the vast majority of these decisions, | 0:17:03 | 0:17:05 | |
we make without even realising. | 0:17:05 | 0:17:07 | |
It was Danny Kahneman's insight that we have two systems | 0:17:13 | 0:17:17 | |
in the mind for making decisions. | 0:17:17 | 0:17:19 | |
Two ways of thinking: | 0:17:21 | 0:17:23 | |
fast and slow. | 0:17:23 | 0:17:25 | |
You know, our mind has really two ways of operating, | 0:17:30 | 0:17:33 | |
and one is sort of fast-thinking, an automatic effortless mode, | 0:17:33 | 0:17:39 | |
and that's the one we're in most of the time. | 0:17:39 | 0:17:41 | |
This fast, automatic mode of thinking, he called System 1. | 0:17:44 | 0:17:48 | |
It's powerful, effortless and responsible for most of what we do. | 0:17:51 | 0:17:54 | |
And System 1 is, you know, that's what happens most of the time. | 0:17:56 | 0:18:00 | |
You're there, the world around you provides all kinds of stimuli | 0:18:00 | 0:18:05 | |
and you respond to them. | 0:18:05 | 0:18:07 | |
Everything that you see and that you understand, you know, | 0:18:07 | 0:18:11 | |
this is a tree, that's a helicopter back there, | 0:18:11 | 0:18:14 | |
that's the Statue of Liberty. | 0:18:14 | 0:18:16 | |
All of this visual perception, all of this comes through System 1. | 0:18:16 | 0:18:20 | |
The other mode is slow, deliberate, logical and rational. | 0:18:21 | 0:18:26 | |
This is System 2 and it's the bit you think of as you, | 0:18:27 | 0:18:32 | |
the voice in your head. | 0:18:32 | 0:18:34 | |
The simplest example of the two systems is | 0:18:34 | 0:18:36 | |
really two plus two is on one side, and 17 times 24 is on the other. | 0:18:36 | 0:18:42 | |
What is two plus two? | 0:18:42 | 0:18:43 | |
-Four. -Four. -Four. | 0:18:43 | 0:18:45 | |
Fast System 1 is always in gear, producing instant answers. | 0:18:45 | 0:18:49 | |
-And what's two plus two? -Four. | 0:18:49 | 0:18:51 | |
A number comes to your mind. | 0:18:51 | 0:18:53 | |
-Four. -Four. -Four. | 0:18:53 | 0:18:55 | |
It is automatic. You do not intend for it to happen. | 0:18:55 | 0:18:59 | |
It just happens to you. It's almost like a reflex. | 0:18:59 | 0:19:01 | |
And what's 22 times 17? | 0:19:01 | 0:19:04 | |
That's a good one. | 0:19:04 | 0:19:05 | |
But when we have to pay attention to a tricky problem, | 0:19:10 | 0:19:13 | |
we engage slow-but-logical System 2. | 0:19:13 | 0:19:16 | |
If you can do that in your head, | 0:19:17 | 0:19:19 | |
you'll have to follow some rules and to do it sequentially. | 0:19:19 | 0:19:22 | |
And that is not automatic at all. | 0:19:24 | 0:19:26 | |
That involves work, it involves effort, it involves concentration. | 0:19:26 | 0:19:30 | |
22 times 17? | 0:19:30 | 0:19:32 | |
There will be physiological symptoms. | 0:19:36 | 0:19:38 | |
Your heart rate will accelerate, your pupils will dilate, | 0:19:38 | 0:19:41 | |
so many changes will occur while you're performing this computation. | 0:19:41 | 0:19:45 | |
Three's...oh, God! | 0:19:46 | 0:19:49 | |
That's, so... | 0:19:52 | 0:19:53 | |
220 and seven times 22 is... | 0:19:53 | 0:19:56 | |
..54. 374? | 0:20:00 | 0:20:02 | |
-OK. And can I get you to just walk with me for a second? -OK. | 0:20:02 | 0:20:06 | |
Who's the current...? | 0:20:06 | 0:20:07 | |
System 2 may be clever, but it's also slow, limited and lazy. | 0:20:07 | 0:20:13 | |
I live in Berkeley during summers and I walk a lot. | 0:20:13 | 0:20:16 | |
And when I walk very fast, I cannot think. | 0:20:16 | 0:20:19 | |
-Can I get you to count backwards from 100 by 7? -Sure. | 0:20:19 | 0:20:24 | |
93, 80... It's hard when you're walking. | 0:20:24 | 0:20:28 | |
It takes up, interestingly enough, | 0:20:28 | 0:20:31 | |
the same kind of executive function as...as thinking. | 0:20:31 | 0:20:36 | |
Forty...four? | 0:20:36 | 0:20:39 | |
If you are expected to do something that demands a lot of effort, | 0:20:41 | 0:20:46 | |
you will stop even walking. | 0:20:46 | 0:20:48 | |
Eighty...um...six? | 0:20:48 | 0:20:52 | |
51. | 0:20:52 | 0:20:53 | |
Uh.... | 0:20:55 | 0:20:56 | |
16, | 0:20:57 | 0:20:59 | |
9, 2? | 0:20:59 | 0:21:00 | |
Everything that you're aware of in your own mind | 0:21:03 | 0:21:06 | |
is part of this slow, deliberative System 2. | 0:21:06 | 0:21:09 | |
As far as you're concerned, it is the star of the show. | 0:21:09 | 0:21:13 | |
Actually, I describe System 2 as not the star. | 0:21:14 | 0:21:19 | |
I describe it as, as a minor character who thinks he is the star, | 0:21:19 | 0:21:23 | |
because, in fact, | 0:21:23 | 0:21:25 | |
most of what goes on in our mind is automatic. | 0:21:25 | 0:21:28 | |
You know, it's in the domain that I call System 1. | 0:21:28 | 0:21:31 | |
System 1 is an old, evolved bit of our brain, and it's remarkable. | 0:21:31 | 0:21:34 | |
We couldn't survive without it because System 2 would explode. | 0:21:34 | 0:21:38 | |
If Mr Spock had to make every decision for us, | 0:21:38 | 0:21:41 | |
it would be very slow and effortful and our heads would explode. | 0:21:41 | 0:21:45 | |
And this vast, hidden domain is responsible | 0:21:45 | 0:21:48 | |
for far more than you would possibly believe. | 0:21:48 | 0:21:51 | |
Having an opinion, you have an opinion immediately, | 0:21:51 | 0:21:55 | |
whether you like it or not, whether you like something or not, | 0:21:55 | 0:21:58 | |
whether you're for something or not, liking someone or not liking them. | 0:21:58 | 0:22:02 | |
That, quite often, is something you have no control over. | 0:22:02 | 0:22:04 | |
Later, when you're asked for reasons, you will invent reasons. | 0:22:04 | 0:22:08 | |
And a lot of what System 2 does is, it provides reasons. | 0:22:08 | 0:22:11 | |
It provides rationalisations which are not necessarily the true reasons | 0:22:11 | 0:22:17 | |
for our beliefs and our emotions and our intentions and what we do. | 0:22:17 | 0:22:22 | |
You have two systems of thinking that steer you through life... | 0:22:28 | 0:22:31 | |
Fast, intuitive System 1 that is incredibly powerful | 0:22:34 | 0:22:38 | |
and does most of the driving. | 0:22:38 | 0:22:40 | |
And slow, logical System 2 | 0:22:42 | 0:22:45 | |
that is clever, but a little lazy. | 0:22:45 | 0:22:48 | |
Trouble is, there's a bit of a battle between them | 0:22:48 | 0:22:52 | |
as to which one is driving your decisions. | 0:22:52 | 0:22:54 | |
And this is where the mistakes creep in, | 0:23:02 | 0:23:06 | |
when we use the wrong system to make a decision. | 0:23:06 | 0:23:09 | |
Just going to ask you a few questions. | 0:23:09 | 0:23:10 | |
We're interested in what you think. | 0:23:10 | 0:23:12 | |
This question concerns this nice bottle of champagne I have here. | 0:23:12 | 0:23:15 | |
Millesime 2005, it's a good year, genuinely nice, vintage bottle. | 0:23:15 | 0:23:19 | |
These people think they're about to use slow, sensible System 2 | 0:23:20 | 0:23:25 | |
to make a rational decision about how much they would pay | 0:23:25 | 0:23:28 | |
for a bottle of champagne. | 0:23:28 | 0:23:30 | |
But what they don't know is that their decision | 0:23:30 | 0:23:33 | |
will actually be taken totally | 0:23:33 | 0:23:35 | |
without their knowledge by their hidden, fast auto-pilot, System 1. | 0:23:35 | 0:23:40 | |
And with the help of a bag of ping-pong balls, | 0:23:42 | 0:23:45 | |
we can influence that decision. | 0:23:45 | 0:23:46 | |
I've got a set of numbered balls here from 1 to 100 in this bag. | 0:23:48 | 0:23:52 | |
I'd like you to reach in and draw one out at random for me, | 0:23:52 | 0:23:55 | |
if you would. | 0:23:55 | 0:23:56 | |
First, they've got to choose a ball. | 0:23:56 | 0:23:58 | |
-The number says ten. -Ten. -Ten. -Ten. | 0:23:58 | 0:24:00 | |
They think it's a random number, but in fact, it's rigged. | 0:24:00 | 0:24:04 | |
All the balls are marked with the low number ten. | 0:24:04 | 0:24:07 | |
This experiment is all about the thoughtless creation of habits. | 0:24:07 | 0:24:13 | |
It's about how we make one decision, and then other decisions follow it | 0:24:13 | 0:24:17 | |
as if the first decision was actually meaningful. | 0:24:17 | 0:24:20 | |
What we do is purposefully, | 0:24:22 | 0:24:24 | |
we give people a first decision that is clearly meaningless. | 0:24:24 | 0:24:28 | |
-Ten. -Ten, OK. Would you be willing to pay ten pounds | 0:24:28 | 0:24:30 | |
for this nice bottle of vintage champagne? | 0:24:30 | 0:24:33 | |
I would, yes. | 0:24:33 | 0:24:34 | |
-No. -Yeah, I guess. -OK. | 0:24:34 | 0:24:37 | |
This first decision is meaningless, | 0:24:37 | 0:24:40 | |
based as it is on a seemingly random number. | 0:24:40 | 0:24:43 | |
But what it does do is lodge the low number ten in their heads. | 0:24:44 | 0:24:49 | |
-Would you buy it for ten pounds? -Yes, I would. Yes. -You would? OK. | 0:24:49 | 0:24:53 | |
Now for the real question where we ask them | 0:24:54 | 0:24:57 | |
how much they'd actually pay for the champagne. | 0:24:57 | 0:25:00 | |
What's the maximum amount you think you'd be willing to pay? | 0:25:00 | 0:25:03 | |
-20? -OK. | 0:25:03 | 0:25:05 | |
-Seven pounds. -Seven pounds, OK. | 0:25:05 | 0:25:07 | |
Probably ten pound. | 0:25:07 | 0:25:08 | |
A range of fairly low offers. | 0:25:08 | 0:25:10 | |
But what happens if we prime people with a much higher number, | 0:25:12 | 0:25:15 | |
65 instead of 10? | 0:25:15 | 0:25:17 | |
-What does that one say? -65. -65, OK. | 0:25:20 | 0:25:22 | |
-65. -OK. | 0:25:22 | 0:25:23 | |
It says 65. | 0:25:23 | 0:25:25 | |
How will this affect the price people are prepared to pay? | 0:25:27 | 0:25:31 | |
What's the maximum you would be willing to pay for this | 0:25:31 | 0:25:33 | |
bottle of champagne? | 0:25:33 | 0:25:35 | |
-40? -£45. -45, OK. | 0:25:35 | 0:25:37 | |
50. | 0:25:37 | 0:25:39 | |
-40 quid? -OK. | 0:25:39 | 0:25:41 | |
-£50? -£50? -Yeah, I'd pay between 50 and £80. | 0:25:41 | 0:25:45 | |
-Between 50 and 80? -Yeah. | 0:25:45 | 0:25:48 | |
Logic has gone out of the window. | 0:25:48 | 0:25:50 | |
The price people are prepared to pay is influenced by nothing more | 0:25:52 | 0:25:55 | |
than a number written on a ping-pong ball. | 0:25:55 | 0:25:58 | |
It suggests that when we can't make decisions, | 0:26:01 | 0:26:04 | |
we don't evaluate the decision in itself. | 0:26:04 | 0:26:06 | |
Instead what we do is, | 0:26:06 | 0:26:08 | |
we try to look at other similar decisions we've made in the past | 0:26:08 | 0:26:12 | |
and we take those decisions as if they were good decisions | 0:26:12 | 0:26:16 | |
and we say to ourselves, | 0:26:16 | 0:26:17 | |
"Oh, I've made this decision before. | 0:26:17 | 0:26:19 | |
"Clearly, I don't need to go ahead and solve this decision. | 0:26:19 | 0:26:23 | |
"Let me just use what I did before and repeat it, | 0:26:23 | 0:26:25 | |
"maybe with some modifications." | 0:26:25 | 0:26:27 | |
This anchoring effect comes from the conflict | 0:26:29 | 0:26:32 | |
between our two systems of thinking. | 0:26:32 | 0:26:35 | |
Fast System 1 is a master of taking short cuts | 0:26:37 | 0:26:41 | |
to bring about the quickest possible decision. | 0:26:41 | 0:26:45 | |
What happens is, they ask you a question | 0:26:45 | 0:26:48 | |
and if the question is difficult | 0:26:48 | 0:26:50 | |
but there is a related question that is a lot...that is somewhat simpler, | 0:26:50 | 0:26:54 | |
you're just going to answer the other question and... | 0:26:54 | 0:26:58 | |
and not even notice. | 0:26:58 | 0:27:00 | |
So the system does all kinds of short cuts to feed us | 0:27:00 | 0:27:02 | |
the information in a faster way and we can make actions, | 0:27:02 | 0:27:06 | |
and the system is accepting some mistakes. | 0:27:06 | 0:27:08 | |
We make decisions using fast System 1 | 0:27:11 | 0:27:13 | |
when we really should be using slow System 2. | 0:27:13 | 0:27:17 | |
And this is why we make the mistakes we do, | 0:27:18 | 0:27:22 | |
systematic mistakes known as cognitive biases. | 0:27:22 | 0:27:26 | |
Nice day. | 0:27:28 | 0:27:30 | |
Since Kahneman first began investigating the glitches in our thinking, | 0:27:36 | 0:27:41 | |
more than 150 cognitive biases have been identified. | 0:27:41 | 0:27:45 | |
We are riddled with these systematic mistakes | 0:27:46 | 0:27:49 | |
and they affect every aspect of our daily lives. | 0:27:49 | 0:27:52 | |
Wikipedia has a very big list of biases | 0:27:54 | 0:27:57 | |
and we are finding new ones all the time. | 0:27:57 | 0:28:00 | |
One of the biases that I think is the most important | 0:28:00 | 0:28:03 | |
is what's called the present-focus bias. | 0:28:03 | 0:28:05 | |
It's the fact that we focus on now | 0:28:05 | 0:28:07 | |
and don't think very much about the future. | 0:28:07 | 0:28:09 | |
That's the bias that causes things like overeating and smoking, | 0:28:09 | 0:28:13 | |
and texting and driving, and having unprotected sex. | 0:28:13 | 0:28:16 | |
Another one is called the halo effect, | 0:28:16 | 0:28:19 | |
and this is the idea | 0:28:19 | 0:28:21 | |
that if you like somebody or an organisation, | 0:28:21 | 0:28:25 | |
you're biased to think that all of its aspects are good, | 0:28:25 | 0:28:29 | |
that everything is good about it. | 0:28:29 | 0:28:30 | |
If you dislike it, everything is bad. | 0:28:30 | 0:28:33 | |
People really are quite uncomfortable, you know, | 0:28:33 | 0:28:35 | |
by the idea that Hitler loved children, you know. | 0:28:35 | 0:28:39 | |
He did. | 0:28:39 | 0:28:40 | |
Now, that doesn't make him a good person, | 0:28:40 | 0:28:43 | |
but we feel uncomfortable to see an attractive trait | 0:28:43 | 0:28:47 | |
in a person that we consider, you know, the epitome of evil. | 0:28:47 | 0:28:51 | |
We are prone to think that what we like is all good | 0:28:51 | 0:28:54 | |
and what we dislike is all bad. | 0:28:54 | 0:28:56 | |
That's a bias. | 0:28:56 | 0:28:57 | |
Another particular favourite of mine is the bias to get attached | 0:28:57 | 0:29:01 | |
to things that we ourselves have created. | 0:29:01 | 0:29:04 | |
We call it the IKEA effect. | 0:29:04 | 0:29:06 | |
Well, you've got loss aversion, risk aversion, present bias. | 0:29:06 | 0:29:10 | |
Spotlight effect, and the spotlight effect is the idea | 0:29:10 | 0:29:13 | |
that we think that other people pay a lot of attention to us | 0:29:13 | 0:29:16 | |
when in fact, they don't. | 0:29:16 | 0:29:17 | |
Confirmation bias. Overconfidence is a big one. | 0:29:17 | 0:29:20 | |
But what's clear is that there's lots of them. | 0:29:20 | 0:29:23 | |
There's lots of ways for us to get things wrong. | 0:29:23 | 0:29:25 | |
You know, there's one way to do things right | 0:29:25 | 0:29:27 | |
and many ways to do things wrong, and we're capable of many of them. | 0:29:27 | 0:29:31 | |
These biases explain so many things that we get wrong. | 0:29:37 | 0:29:41 | |
Our impulsive spending. | 0:29:42 | 0:29:45 | |
Trusting the wrong people. | 0:29:45 | 0:29:48 | |
Not seeing the other person's point of view. | 0:29:48 | 0:29:51 | |
Succumbing to temptation. | 0:29:51 | 0:29:53 | |
We are so riddled with these biases, | 0:29:56 | 0:29:59 | |
it's hard to believe we ever make a rational decision. | 0:29:59 | 0:30:02 | |
But it's not just our everyday decisions that are affected. | 0:30:10 | 0:30:13 | |
What happens if you're an expert, | 0:30:28 | 0:30:30 | |
trained in making decisions that are a matter of life and death? | 0:30:30 | 0:30:34 | |
Are you still destined to make these systematic mistakes? | 0:30:38 | 0:30:41 | |
On the outskirts of Washington DC, | 0:30:44 | 0:30:47 | |
Horizon has been granted access to spy on the spooks. | 0:30:47 | 0:30:52 | |
Welcome to Analytical Exercise Number Four. | 0:30:56 | 0:31:00 | |
Former intelligence analyst Donald Kretz | 0:31:01 | 0:31:04 | |
is running an ultra-realistic spy game. | 0:31:04 | 0:31:07 | |
This exercise will take place in the fictitious city of Vastopolis. | 0:31:08 | 0:31:13 | |
Taking part are a mixture of trained intelligence analysts | 0:31:15 | 0:31:19 | |
and some novices. | 0:31:19 | 0:31:20 | |
Due to an emerging threat... | 0:31:23 | 0:31:25 | |
..a terrorism taskforce has been stood up. | 0:31:26 | 0:31:30 | |
I will be the terrorism taskforce lead | 0:31:30 | 0:31:31 | |
and I have recruited all of you to be our terrorism analysts. | 0:31:31 | 0:31:36 | |
The challenge facing the analysts | 0:31:39 | 0:31:41 | |
is to thwart a terrorist threat against a US city. | 0:31:41 | 0:31:45 | |
The threat at this point has not been determined. | 0:31:47 | 0:31:50 | |
It's up to you to figure out the type of terrorism... | 0:31:50 | 0:31:53 | |
..and who's responsible for planning it. | 0:31:54 | 0:31:56 | |
The analysts face a number of tasks. | 0:31:59 | 0:32:02 | |
They must first investigate any groups who may pose a threat. | 0:32:02 | 0:32:06 | |
Your task is to write a report. | 0:32:06 | 0:32:08 | |
The subject in this case is the Network of Dread. | 0:32:08 | 0:32:11 | |
The mayor has asked for this 15 minutes from now. | 0:32:11 | 0:32:14 | |
Just like in the real world, the analysts have access | 0:32:17 | 0:32:21 | |
to a huge amount of data streaming in, from government agencies, | 0:32:21 | 0:32:25 | |
social media, mobile phones and emergency services. | 0:32:25 | 0:32:29 | |
The Network of Dread turns out to be | 0:32:33 | 0:32:35 | |
a well-known international terror group. | 0:32:35 | 0:32:38 | |
They have the track record, the capability | 0:32:38 | 0:32:41 | |
and the personnel to carry out an attack. | 0:32:41 | 0:32:45 | |
The scenario that's emerging | 0:32:45 | 0:32:46 | |
is a bio-terror event, | 0:32:46 | 0:32:48 | |
meaning it's a biological terrorism attack | 0:32:48 | 0:32:51 | |
that's going to take place against the city. | 0:32:51 | 0:32:53 | |
If there is an emerging threat, they are the likely candidate. | 0:32:56 | 0:33:00 | |
We need to move onto the next task. | 0:33:00 | 0:33:02 | |
It's now 9th April. | 0:33:04 | 0:33:06 | |
This is another request for information, | 0:33:06 | 0:33:09 | |
this time on something or someone called the Masters of Chaos. | 0:33:09 | 0:33:13 | |
The Masters of Chaos are a group of cyber-hackers, | 0:33:16 | 0:33:21 | |
a local bunch of misfits with no history of violence. | 0:33:21 | 0:33:24 | |
And while the analysts continue to sift through the incoming data, | 0:33:27 | 0:33:31 | |
behind the scenes, Kretz is watching their every move. | 0:33:31 | 0:33:36 | |
In this room, we're able to monitor what the analysts are doing | 0:33:36 | 0:33:39 | |
throughout the entire exercise. | 0:33:39 | 0:33:41 | |
We have set up a knowledge base | 0:33:42 | 0:33:44 | |
into which we have been inserting data | 0:33:44 | 0:33:46 | |
throughout the course of the day. | 0:33:46 | 0:33:48 | |
Some of them are related to our terrorist threat, | 0:33:48 | 0:33:50 | |
many of them are not. | 0:33:50 | 0:33:52 | |
Amidst the wealth of data on the known terror group, | 0:33:53 | 0:33:56 | |
there's also evidence coming in | 0:33:56 | 0:33:58 | |
of a theft at a university biology lab | 0:33:58 | 0:34:00 | |
and someone has hacked into the computers of a local freight firm. | 0:34:00 | 0:34:05 | |
Each of these messages represents, essentially, a piece of the puzzle, | 0:34:06 | 0:34:10 | |
but it's a puzzle that you don't have the box top to, | 0:34:10 | 0:34:14 | |
so you don't have the picture in advance, | 0:34:14 | 0:34:17 | |
so you don't know what pieces go where. | 0:34:17 | 0:34:20 | |
Furthermore, what we have is a bunch of puzzle pieces that don't | 0:34:20 | 0:34:23 | |
even go with this puzzle. | 0:34:23 | 0:34:25 | |
The exercise is part of a series of experiments to investigate | 0:34:27 | 0:34:31 | |
whether expert intelligence agents | 0:34:31 | 0:34:33 | |
are just as prone to mistakes from cognitive bias as the rest of us, | 0:34:33 | 0:34:37 | |
or whether their training and expertise makes them immune. | 0:34:37 | 0:34:43 | |
I have a sort of insider's point of view of this problem. | 0:34:44 | 0:34:47 | |
I worked a number of years as an intelligence analyst. | 0:34:47 | 0:34:51 | |
The stakes are incredibly high. | 0:34:51 | 0:34:53 | |
Mistakes can often be life and death. | 0:34:53 | 0:34:57 | |
We roll ahead now. The date is 21st May. | 0:35:00 | 0:35:02 | |
If the analysts are able to think rationally, | 0:35:04 | 0:35:07 | |
they should be able to solve the puzzle. | 0:35:07 | 0:35:09 | |
But the danger is, they will fall into the trap set by Kretz | 0:35:10 | 0:35:15 | |
and only pay attention to the established terror group, | 0:35:15 | 0:35:18 | |
the Network of Dread. | 0:35:18 | 0:35:20 | |
Their judgment may be clouded by a bias called confirmation bias. | 0:35:21 | 0:35:27 | |
Confirmation bias is the most prevalent bias of all, | 0:35:27 | 0:35:30 | |
and it's where we tend to search for information | 0:35:30 | 0:35:33 | |
that supports what we already believe. | 0:35:33 | 0:35:35 | |
Confirmation bias can easily lead people | 0:35:37 | 0:35:39 | |
to ignore the evidence in front of their eyes. | 0:35:39 | 0:35:42 | |
And Kretz is able to monitor if the bias kicks in. | 0:35:44 | 0:35:47 | |
We still see that they're searching for Network of Dread. | 0:35:48 | 0:35:51 | |
That's an indication that we may have a confirmation bias operating. | 0:35:51 | 0:35:55 | |
The Network of Dread are the big guys - | 0:35:57 | 0:35:59 | |
they've done it before, so you would expect they'd do it again. | 0:35:59 | 0:36:04 | |
And I think we're starting to see some biases here. | 0:36:04 | 0:36:07 | |
Analysts desperately want to get to the correct answer, | 0:36:08 | 0:36:11 | |
but they're affected by the same biases as the rest of us. | 0:36:11 | 0:36:15 | |
So far, most of our analysts seem to believe | 0:36:15 | 0:36:19 | |
that the Network of Dread is responsible for planning this attack, | 0:36:19 | 0:36:24 | |
and that is completely wrong. | 0:36:24 | 0:36:26 | |
How are we doing? | 0:36:30 | 0:36:31 | |
It's time for the analysts to put themselves on the line | 0:36:33 | 0:36:36 | |
and decide who the terrorists are and what they're planning. | 0:36:36 | 0:36:40 | |
So what do you think? | 0:36:42 | 0:36:43 | |
It was a bio-terrorist attack. | 0:36:43 | 0:36:47 | |
-I had a different theory. -What's your theory? | 0:36:47 | 0:36:49 | |
Cos I may be missing something here, too. | 0:36:49 | 0:36:51 | |
They know that the Network of Dread is a terrorist group. | 0:36:51 | 0:36:54 | |
They know that the Masters of Chaos is a cyber-hacking group. | 0:36:54 | 0:36:58 | |
Either to the new factory or in the water supply. | 0:36:58 | 0:37:01 | |
Lots of dead fish floating up in the river. | 0:37:01 | 0:37:05 | |
The question is, did any of the analysts manage to dig out | 0:37:06 | 0:37:09 | |
the relevant clues and find the true threat? | 0:37:09 | 0:37:12 | |
In this case, the actual threat is due to the cyber-group, | 0:37:14 | 0:37:18 | |
the Masters of Chaos, | 0:37:18 | 0:37:20 | |
who become increasingly radicalised throughout the scenario | 0:37:20 | 0:37:23 | |
and decide to take out their anger on society, essentially. | 0:37:23 | 0:37:27 | |
Who convinced them to switch from cyber-crime to bio-terrorism? | 0:37:27 | 0:37:31 | |
Or did they succumb to confirmation bias | 0:37:31 | 0:37:34 | |
and simply pin the blame on the usual suspects? | 0:37:34 | 0:37:38 | |
Will they make that connection? | 0:37:38 | 0:37:40 | |
Will they process that evidence and assess it accordingly, | 0:37:40 | 0:37:43 | |
or will their confirmation bias drive them | 0:37:43 | 0:37:46 | |
to believe that it's a more traditional type of terrorist group? | 0:37:46 | 0:37:50 | |
I believe that the Masters of Chaos are actually the ones behind it. | 0:37:50 | 0:37:53 | |
It's either a threat or not a threat, but the Network of Dread? | 0:37:53 | 0:37:56 | |
And time's up. Please go ahead and save those reports. | 0:37:56 | 0:37:59 | |
At the end of the exercise, | 0:38:02 | 0:38:03 | |
Kretz reveals the true identity of the terrorists. | 0:38:03 | 0:38:08 | |
We have a priority message from City Hall. | 0:38:08 | 0:38:10 | |
The terrorist attack was thwarted, | 0:38:12 | 0:38:15 | |
the planned bio-terrorist attack by the Masters of Chaos | 0:38:15 | 0:38:20 | |
against Vastopolis was thwarted. | 0:38:20 | 0:38:22 | |
The mayor expresses his thanks for a job well done. | 0:38:22 | 0:38:25 | |
Show of hands, who...who got it? | 0:38:25 | 0:38:29 | |
Yeah. | 0:38:30 | 0:38:32 | |
Out of 12 subjects, 11 of them got the wrong answer. | 0:38:32 | 0:38:37 | |
The only person to spot the true threat was in fact a novice. | 0:38:38 | 0:38:42 | |
All the trained experts fell prey to confirmation bias. | 0:38:44 | 0:38:48 | |
It is not typically the case | 0:38:53 | 0:38:55 | |
that simply being trained as an analyst | 0:38:55 | 0:38:57 | |
gives you the tools you need to overcome cognitive bias. | 0:38:57 | 0:39:00 | |
You can learn techniques for memory improvement. | 0:39:02 | 0:39:05 | |
You can learn techniques for better focus, | 0:39:05 | 0:39:08 | |
but techniques to eliminate cognitive bias | 0:39:08 | 0:39:11 | |
just simply don't work. | 0:39:11 | 0:39:13 | |
And for intelligence analysts in the real world, | 0:39:19 | 0:39:21 | |
the implications of making mistakes from these biases are drastic. | 0:39:21 | 0:39:26 | |
Government reports and studies over the past decade or so | 0:39:28 | 0:39:32 | |
have cited experts as believing that cognitive bias | 0:39:32 | 0:39:35 | |
may have played a role | 0:39:35 | 0:39:37 | |
in a number of very significant intelligence failures, | 0:39:37 | 0:39:41 | |
and yet it remains an understudied problem. | 0:39:41 | 0:39:44 | |
Heads. | 0:39:51 | 0:39:53 | |
Heads. | 0:39:54 | 0:39:55 | |
But the area of our lives | 0:39:58 | 0:39:59 | |
in which these systematic mistakes have the most explosive impact | 0:39:59 | 0:40:04 | |
is in the world of money. | 0:40:04 | 0:40:06 | |
The moment money enters the picture, the rules change. | 0:40:07 | 0:40:11 | |
Many of us think that we're at our most rational | 0:40:13 | 0:40:16 | |
when it comes to decisions about money. | 0:40:16 | 0:40:18 | |
We like to think we know how to spot a bargain, to strike a good deal, | 0:40:20 | 0:40:24 | |
sell our house at the right time, invest wisely. | 0:40:24 | 0:40:28 | |
Thinking about money the right way | 0:40:29 | 0:40:32 | |
is one of the most challenging things for human nature. | 0:40:32 | 0:40:36 | |
But if we're not as rational as we like to think, | 0:40:38 | 0:40:41 | |
and there is a hidden force at work shaping our decisions, | 0:40:41 | 0:40:45 | |
are we deluding ourselves? | 0:40:45 | 0:40:47 | |
Money brings with it a mode of thinking. | 0:40:47 | 0:40:51 | |
It changes the way we react to the world. | 0:40:51 | 0:40:54 | |
When it comes to money, | 0:40:55 | 0:40:57 | |
cognitive biases play havoc with our best intentions. | 0:40:57 | 0:41:00 | |
There are many mistakes that people make when it comes to money. | 0:41:02 | 0:41:05 | |
Kahneman's insight into our mistakes with money | 0:41:18 | 0:41:21 | |
were to revolutionise our understanding of economics. | 0:41:21 | 0:41:25 | |
It's all about a crucial difference in how we feel | 0:41:27 | 0:41:29 | |
when we win or lose | 0:41:29 | 0:41:32 | |
and our readiness to take a risk. | 0:41:32 | 0:41:35 | |
I would like to take a risk. | 0:41:35 | 0:41:37 | |
Take a risk, OK? | 0:41:37 | 0:41:39 | |
Let's take a risk. | 0:41:39 | 0:41:40 | |
Our willingness to take a gamble is very different depending on | 0:41:40 | 0:41:44 | |
whether we are faced with a loss or a gain. | 0:41:44 | 0:41:47 | |
Excuse me, guys, can you spare two minutes to help us | 0:41:47 | 0:41:49 | |
with a little experiment? Try and win as much money as you can, OK? | 0:41:49 | 0:41:52 | |
-OK. -OK? | 0:41:52 | 0:41:54 | |
In my hands here I have £20, OK? | 0:41:54 | 0:41:56 | |
Here are two scenarios. | 0:41:56 | 0:41:58 | |
And I'm going to give you ten. | 0:41:58 | 0:41:59 | |
In the first case, you are given ten pounds. | 0:41:59 | 0:42:03 | |
That's now yours. Put it in your pocket, take it away, | 0:42:03 | 0:42:06 | |
spend it on a drink on the South Bank later. | 0:42:06 | 0:42:08 | |
-OK. -OK? | 0:42:08 | 0:42:09 | |
OK. | 0:42:09 | 0:42:10 | |
Then you have to make a choice about how much more you could gain. | 0:42:10 | 0:42:15 | |
You can either take the safe option, in which case, | 0:42:15 | 0:42:17 | |
I give you an additional five, | 0:42:17 | 0:42:19 | |
or you can take a risk. | 0:42:19 | 0:42:21 | |
If you take a risk, I'm going to flip this coin. | 0:42:21 | 0:42:24 | |
If it comes up heads, you win ten, | 0:42:24 | 0:42:28 | |
but if it comes up tails, | 0:42:28 | 0:42:30 | |
you're not going to win any more. | 0:42:30 | 0:42:32 | |
Would you choose the safe option and get an extra five pounds | 0:42:32 | 0:42:36 | |
or take a risk and maybe win an extra ten or nothing? | 0:42:36 | 0:42:40 | |
Which is it going to be? | 0:42:40 | 0:42:41 | |
I'd go safe. | 0:42:45 | 0:42:47 | |
-Safe, five? -Yeah. | 0:42:47 | 0:42:49 | |
-Take five. -You'd take five? -Yeah, man. -Sure? There we go. | 0:42:49 | 0:42:52 | |
Most people presented with this choice | 0:42:52 | 0:42:54 | |
go for the certainty of the extra fiver. | 0:42:54 | 0:42:56 | |
-Thank you very much. -Told you it was easy. | 0:42:56 | 0:42:59 | |
In a winning frame of mind, people are naturally rather cautious. | 0:42:59 | 0:43:03 | |
That's yours, too. | 0:43:03 | 0:43:04 | |
That was it? | 0:43:04 | 0:43:06 | |
-That was it. -Really? -Yes. -Eh? | 0:43:06 | 0:43:09 | |
But what about losing? | 0:43:11 | 0:43:13 | |
Are we similarly cautious when faced with a potential loss? | 0:43:13 | 0:43:17 | |
In my hands, I've got £20 and I'm going to give that to you. | 0:43:17 | 0:43:20 | |
That's now yours. | 0:43:20 | 0:43:21 | |
-OK. -You can put it in your handbag. | 0:43:21 | 0:43:23 | |
This time, you're given £20. | 0:43:23 | 0:43:26 | |
And again, you must make a choice. | 0:43:28 | 0:43:30 | |
Would you choose to accept a safe loss of £5 or would you take a risk? | 0:43:32 | 0:43:37 | |
If you take a risk, I'm going to flip this coin. | 0:43:39 | 0:43:42 | |
If it comes up heads, you don't lose anything, | 0:43:42 | 0:43:45 | |
but if it comes up tails, then you lose ten pounds. | 0:43:45 | 0:43:48 | |
In fact, it's exactly the same outcome. | 0:43:49 | 0:43:52 | |
In both cases, you face a choice between ending up with | 0:43:52 | 0:43:55 | |
a certain £15 or tossing a coin to get either ten or twenty. | 0:43:55 | 0:44:00 | |
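Written out in pounds, the two framings used in the street experiment collapse to exactly the same pair of options, which a few lines of arithmetic make explicit (a minimal sketch using the amounts described above):

```python
# Both framings of the street experiment reduce to the same final outcomes.

# Gain framing: start with £10, then a sure +£5, or a coin flip for +£10 / +£0.
gain_safe = 10 + 5                      # £15 for certain
gain_gamble = (10 + 10, 10 + 0)         # £20 or £10, each with probability 0.5

# Loss framing: start with £20, then a sure -£5, or a coin flip for -£0 / -£10.
loss_safe = 20 - 5                      # £15 for certain
loss_gamble = (20 - 0, 20 - 10)         # £20 or £10, each with probability 0.5

assert gain_safe == loss_safe                       # identical certain outcome
assert sorted(gain_gamble) == sorted(loss_gamble)   # identical gamble

expected_value = sum(gain_gamble) / 2
print(f"Certain option: £{gain_safe}; gamble expected value: £{expected_value:.0f}")
```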
-I will risk losing ten or nothing. -OK. | 0:44:01 | 0:44:06 | |
But the crucial surprise here is that when the choice is framed | 0:44:06 | 0:44:09 | |
in terms of a loss, most people take a risk. | 0:44:09 | 0:44:12 | |
-Take a risk. -Take a risk, OK. | 0:44:13 | 0:44:16 | |
-I'll risk it. -You'll risk it? OK. | 0:44:16 | 0:44:18 | |
Our slow System 2 could probably work out | 0:44:18 | 0:44:20 | |
that the outcome is the same in both cases. | 0:44:20 | 0:44:23 | |
And that's heads, you win. | 0:44:23 | 0:44:24 | |
But it's too limited and too lazy. | 0:44:24 | 0:44:27 | |
That's the easiest £20 you'll ever make. | 0:44:27 | 0:44:29 | |
Instead, fast System 1 makes a rough guess based on change. | 0:44:29 | 0:44:34 | |
-And that's all there is to it, thank you very much. -Oh, no! Look. | 0:44:34 | 0:44:38 | |
And System 1 doesn't like losing. | 0:44:38 | 0:44:41 | |
If you were to lose £10 in the street today | 0:44:47 | 0:44:49 | |
and then find £10 tomorrow, you would be financially unchanged | 0:44:49 | 0:44:53 | |
but actually we respond to changes, | 0:44:53 | 0:44:55 | |
so the pain of the loss of £10 looms much larger, it feels more painful. | 0:44:55 | 0:45:00 | |
In fact, you'd probably have to find £20 | 0:45:00 | 0:45:02 | |
to offset the pain that you feel by losing ten. | 0:45:02 | 0:45:05 | |
Heads. | 0:45:05 | 0:45:06 | |
At the heart of this, is a bias called loss aversion, | 0:45:06 | 0:45:10 | |
which affects many of our financial decisions. | 0:45:10 | 0:45:13 | |
People think in terms of gains and losses. | 0:45:13 | 0:45:17 | |
-Heads. -It's tails. -Oh! | 0:45:17 | 0:45:20 | |
And in their thinking, typically, losses loom larger than gains. | 0:45:20 | 0:45:25 | |
We even have an idea by...by how much, | 0:45:26 | 0:45:29 | |
by roughly a factor of two or a little more than two. | 0:45:29 | 0:45:33 | |
That is loss aversion, | 0:45:34 | 0:45:36 | |
and it certainly was the most important thing | 0:45:36 | 0:45:38 | |
that emerged from our work. | 0:45:38 | 0:45:40 | |
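That "factor of two or a little more" is usually written as a loss-aversion coefficient in Kahneman and Tversky's prospect theory. The sketch below uses the commonly cited parameter estimates (a coefficient of about 2.25 and a curvature of about 0.88) as assumptions; the programme itself only gives the rough factor of two.

```python
# Prospect-theory value function: losses loom larger than gains.
# Parameter values are the commonly cited estimates, used here as assumptions;
# the programme only says the factor is "roughly two or a little more".

LAMBDA = 2.25   # loss-aversion coefficient
ALPHA = 0.88    # diminishing sensitivity to larger gains and losses

def subjective_value(x):
    """Felt value of gaining or losing x pounds, relative to the status quo."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

pleasure = subjective_value(10)      # gaining £10
pain = -subjective_value(-10)        # losing £10
print(f"Losing £10 feels about {pain / pleasure:.2f} times as strong as gaining £10")
```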
It's a vital insight into human behaviour, | 0:45:43 | 0:45:46 | |
so important that it led to a Nobel prize | 0:45:46 | 0:45:49 | |
and the founding of an entirely new branch of economics. | 0:45:49 | 0:45:53 | |
When we think we're winning, we don't take risks. | 0:45:55 | 0:45:58 | |
But when we're faced with a loss, frankly, we're a bit reckless. | 0:45:59 | 0:46:04 | |
But loss aversion doesn't just affect people | 0:46:14 | 0:46:16 | |
making casual five pound bets. | 0:46:16 | 0:46:19 | |
It can affect anyone at any time, | 0:46:19 | 0:46:23 | |
including those who work in the complex system of high finance, | 0:46:23 | 0:46:27 | |
in which trillions of dollars are traded. | 0:46:27 | 0:46:30 | |
In our current complex environments, | 0:46:32 | 0:46:35 | |
we now have the means as well as the motive | 0:46:35 | 0:46:39 | |
to make very serious mistakes. | 0:46:39 | 0:46:42 | |
The bedrock of economics is that people think rationally. | 0:46:44 | 0:46:48 | |
They calculate risks, rewards and decide accordingly. | 0:46:48 | 0:46:52 | |
But we're not always rational. | 0:46:55 | 0:46:57 | |
We rarely behave like Mr Spock. | 0:46:57 | 0:47:00 | |
For most of our decisions, we use fast, intuitive, | 0:47:01 | 0:47:05 | |
but occasionally unreliable System 1. | 0:47:05 | 0:47:08 | |
And in a global financial market, | 0:47:11 | 0:47:13 | |
that can lead to very serious problems. | 0:47:13 | 0:47:16 | |
I think what the financial crisis did was, it simply said, | 0:47:18 | 0:47:23 | |
"You know what? | 0:47:23 | 0:47:25 | |
"People are a lot more vulnerable to psychological pitfalls | 0:47:25 | 0:47:30 | |
"than we really understood before." | 0:47:30 | 0:47:33 | |
Basically, human psychology is just too flawed | 0:47:33 | 0:47:38 | |
to expect that we could avert a crisis. | 0:47:38 | 0:47:41 | |
Understanding these pitfalls has led to a new branch of economics. | 0:47:48 | 0:47:53 | |
Behavioural economics. | 0:47:54 | 0:47:56 | |
Thanks to psychologists like Hersh Shefrin, | 0:47:58 | 0:48:01 | |
it's beginning to establish a toehold in Wall Street. | 0:48:01 | 0:48:04 | |
It takes account of the way we actually make decisions | 0:48:05 | 0:48:09 | |
rather than how we say we do. | 0:48:09 | 0:48:11 | |
Financial crisis, I think, was as large a problem as it was | 0:48:17 | 0:48:21 | |
because certain psychological traits like optimism, | 0:48:21 | 0:48:24 | |
over-confidence and confirmation bias | 0:48:24 | 0:48:27 | |
played a very large role among a part of the economy | 0:48:27 | 0:48:31 | |
where serious mistakes could be made, and were. | 0:48:31 | 0:48:36 | |
But for as long as our financial system assumes we are rational, | 0:48:44 | 0:48:48 | |
our economy will remain vulnerable. | 0:48:48 | 0:48:50 | |
I'm quite certain | 0:48:52 | 0:48:53 | |
that if the regulators listened to behavioural economists early on, | 0:48:53 | 0:48:58 | |
we would have designed a very different financial system | 0:48:58 | 0:49:01 | |
and we wouldn't have had the incredible increase | 0:49:01 | 0:49:05 | |
in the housing market and we wouldn't have this financial catastrophe. | 0:49:05 | 0:49:10 | |
And so when Kahneman collected his Nobel prize, | 0:49:11 | 0:49:15 | |
it wasn't for psychology, | 0:49:15 | 0:49:17 | |
it was for economics. | 0:49:17 | 0:49:19 | |
The big question is, what can we do about these systematic mistakes? | 0:49:25 | 0:49:29 | |
Can we hope to find a way round our fast-thinking biases | 0:49:30 | 0:49:34 | |
and make better decisions? | 0:49:34 | 0:49:36 | |
To answer this, | 0:49:39 | 0:49:40 | |
we need to know the evolutionary origins of our mistakes. | 0:49:40 | 0:49:43 | |
Just off the coast of Puerto Rico | 0:49:46 | 0:49:48 | |
is probably the best place in the world to find out. | 0:49:48 | 0:49:52 | |
The tiny island of Cayo Santiago. | 0:49:59 | 0:50:03 | |
So we're now in the boat, heading over to Cayo Santiago. | 0:50:05 | 0:50:08 | |
This is an island filled with a thousand rhesus monkeys. | 0:50:08 | 0:50:12 | |
Once you pull in, it looks | 0:50:12 | 0:50:13 | |
a little bit like you're going to Jurassic Park. You're not sure | 0:50:13 | 0:50:16 | |
what you're going to see. Then you'll see your first monkey, | 0:50:16 | 0:50:19 | |
and it'll be comfortable, like, | 0:50:19 | 0:50:21 | |
"Ah, the monkeys are here, everything's great." | 0:50:21 | 0:50:24 | |
It's an island devoted to monkey research. | 0:50:30 | 0:50:33 | |
You see the guys hanging out on the cliff up there? | 0:50:34 | 0:50:37 | |
Pretty cool. | 0:50:37 | 0:50:38 | |
The really special thing about Cayo Santiago | 0:50:41 | 0:50:43 | |
is that the animals here, | 0:50:43 | 0:50:44 | |
because they've grown up over the last 70 years around humans, | 0:50:44 | 0:50:47 | |
they're completely habituated, | 0:50:47 | 0:50:49 | |
and that means we can get up close to them, show them stuff, | 0:50:49 | 0:50:52 | |
look at how they make decisions. | 0:50:52 | 0:50:53 | |
We're able to do this here | 0:50:53 | 0:50:54 | |
in a way that we'd never be able to do it anywhere else, really. | 0:50:54 | 0:50:57 | |
It's really unique. | 0:50:57 | 0:50:58 | |
Laurie Santos is here to find out | 0:51:02 | 0:51:04 | |
if monkeys make the same mistakes in their decisions that we do. | 0:51:04 | 0:51:08 | |
Most of the work we do is comparing humans and other primates, | 0:51:11 | 0:51:14 | |
trying to ask what's special about humans. | 0:51:14 | 0:51:16 | |
But really, what we want to understand is, | 0:51:16 | 0:51:18 | |
what's the evolutionary origin of some of our dumber strategies, | 0:51:18 | 0:51:21 | |
some of those spots where we get things wrong? | 0:51:21 | 0:51:23 | |
If we could understand where those came from, | 0:51:23 | 0:51:26 | |
that's where we'll get some insight. | 0:51:26 | 0:51:28 | |
If Santos can show us | 0:51:31 | 0:51:32 | |
that monkeys have the same cognitive biases as us, | 0:51:32 | 0:51:35 | |
it would suggest that they evolved a long time ago. | 0:51:35 | 0:51:38 | |
And a mental strategy that old would be almost impossible to change. | 0:51:43 | 0:51:49 | |
We started this work around the time of the financial collapse. | 0:51:50 | 0:51:53 | |
So, when we were thinking about | 0:51:56 | 0:51:57 | |
what dumb strategies we could look at in monkeys, it was pretty obvious | 0:51:57 | 0:52:01 | |
that some of the human economic strategies | 0:52:01 | 0:52:03 | |
which were in the news might be the first thing to look at. | 0:52:03 | 0:52:06 | |
And one of the particular things we wanted to look at was | 0:52:06 | 0:52:09 | |
whether or not the monkeys are loss averse. | 0:52:09 | 0:52:12 | |
But monkeys, smart as they are, have yet to start using money. | 0:52:12 | 0:52:16 | |
And so that was kind of where we started. | 0:52:16 | 0:52:18 | |
We said, "Well, how can we even ask this question | 0:52:18 | 0:52:21 | |
"of if monkeys make financial mistakes?" | 0:52:21 | 0:52:24 | |
And so we decided to do it by introducing the monkeys | 0:52:24 | 0:52:27 | |
to their own new currency and just let them buy their food. | 0:52:27 | 0:52:31 | |
So I'll show you some of this stuff we've been up to with the monkeys. | 0:52:32 | 0:52:36 | |
Back in her lab at Yale, | 0:52:37 | 0:52:39 | |
she introduced a troop of monkeys to their own market, | 0:52:39 | 0:52:42 | |
giving them round shiny tokens they could exchange for food. | 0:52:42 | 0:52:46 | |
So here's Holly. | 0:52:48 | 0:52:50 | |
She comes in, hands over a token | 0:52:50 | 0:52:52 | |
and you can see, she just gets to grab the grape there. | 0:52:52 | 0:52:55 | |
One of the first things we wondered was just, | 0:52:55 | 0:52:57 | |
can they in some sense learn | 0:52:57 | 0:52:59 | |
that different stores sell different foods at different prices? | 0:52:59 | 0:53:02 | |
So what we did was, we presented the monkeys with situations | 0:53:02 | 0:53:06 | |
where they met traders who sold different goods at different rates. | 0:53:06 | 0:53:10 | |
So what you'll see in this clip is the monkeys meeting a new trader. | 0:53:10 | 0:53:13 | |
She's actually selling grapes at a rate of three grapes per token. | 0:53:13 | 0:53:17 | |
And what we found is that in this case, | 0:53:21 | 0:53:23 | |
the monkeys are pretty rational. | 0:53:23 | 0:53:24 | |
So when they get a choice of a guy who sells, you know, | 0:53:24 | 0:53:26 | |
three goods for one token, they actually shop more at that guy. | 0:53:26 | 0:53:30 | |
Having taught the monkeys the value of money, | 0:53:36 | 0:53:38 | |
the next step was to see if monkeys, like humans, | 0:53:38 | 0:53:42 | |
suffer from that most crucial bias, loss aversion. | 0:53:42 | 0:53:46 | |
And so what we did was, we introduced the monkeys to traders | 0:53:48 | 0:53:52 | |
who gave out either losses or gains relative to what they should have. | 0:53:52 | 0:53:56 | |
So I could make the monkey think he's getting a bonus | 0:53:57 | 0:54:00 | |
simply by having him trade | 0:54:00 | 0:54:01 | |
with a trader who's starting with a single grape | 0:54:01 | 0:54:04 | |
but then when the monkey pays this trader, | 0:54:04 | 0:54:06 | |
she actually gives him an extra one, so she gives him a bonus. | 0:54:06 | 0:54:08 | |
At the end, the monkey gets two, | 0:54:08 | 0:54:10 | |
but he thinks he got that second one as a bonus. | 0:54:10 | 0:54:13 | |
We can then compare what the monkeys do with that guy | 0:54:13 | 0:54:16 | |
versus a guy who gives the monkey losses. | 0:54:16 | 0:54:18 | |
This is a guy who shows up, | 0:54:18 | 0:54:19 | |
who pretends he's going to sell three grapes, | 0:54:19 | 0:54:22 | |
but then when the monkey actually pays this trader, | 0:54:22 | 0:54:25 | |
he'll take one of the grapes away and give the monkey only two. | 0:54:25 | 0:54:28 | |
The big question then is how the monkeys react | 0:54:30 | 0:54:33 | |
when faced with a choice between a loss and a gain. | 0:54:33 | 0:54:36 | |
So she'll come in, she's met these two guys before. | 0:54:37 | 0:54:40 | |
You can see she goes with the bonus option, | 0:54:40 | 0:54:42 | |
even waits patiently for her additional piece to be added here, | 0:54:42 | 0:54:46 | |
and then takes the bonus, avoiding the person who gives her losses. | 0:54:47 | 0:54:51 | |
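(A minimal sketch of the arithmetic behind this design. The grape counts are the ones described above; the reference-dependent value function and the loss-aversion coefficient of roughly 2.25 are standard prospect-theory assumptions, not figures from the programme.)

```python
# Sketch of the bonus/loss trader choice described above.
# Assumption (not from the programme): a simple prospect-theory-style
# value function where losses weigh about 2.25 times as much as gains.

LAM = 2.25  # assumed loss-aversion coefficient

def subjective_value(outcome, reference):
    """Value an outcome relative to a reference point, penalising losses."""
    diff = outcome - reference
    return diff if diff >= 0 else LAM * diff

# Both traders hand over two grapes in the end; only the advertised
# starting amount (the reference point) differs.
bonus_trader = subjective_value(outcome=2, reference=1)  # shows 1, gives 2: felt as a gain
loss_trader  = subjective_value(outcome=2, reference=3)  # shows 3, gives 2: felt as a loss

print(bonus_trader)  # 1      (a small gain)
print(loss_trader)   # -2.25  (a loss that looms larger)
# Identical payoffs, opposite framings: a reference-dependent chooser
# prefers the bonus trader, which is what the monkeys chose.
```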
So monkeys hate losing just as much as people. | 0:54:57 | 0:55:00 | |
And crucially, Santos found that monkeys, as well, | 0:55:08 | 0:55:11 | |
are more likely to take risks when faced with a loss. | 0:55:11 | 0:55:15 | |
This suggests to us | 0:55:19 | 0:55:20 | |
that the monkeys seem to frame their decisions | 0:55:20 | 0:55:23 | |
in exactly the same way we do. | 0:55:23 | 0:55:24 | |
They're not thinking just about the absolute, | 0:55:24 | 0:55:27 | |
they're thinking relative to what they expect. | 0:55:27 | 0:55:29 | |
And when they're getting less than they expect, | 0:55:29 | 0:55:32 | |
when they're getting losses, they too become more risk-seeking. | 0:55:32 | 0:55:35 | |
The fact that we share this bias with these monkeys suggests | 0:55:38 | 0:55:42 | |
that it's an ancient strategy etched into our DNA | 0:55:42 | 0:55:45 | |
more than 35 million years ago. | 0:55:45 | 0:55:47 | |
And what we learn from the monkeys is that | 0:55:50 | 0:55:53 | |
if this bias is really that old, | 0:55:53 | 0:55:55 | |
if we really have had this strategy for the last 35 million years, | 0:55:55 | 0:55:59 | |
simply deciding to overcome it is just not going to work. | 0:55:59 | 0:56:02 | |
We need better ways to make ourselves avoid some of these pitfalls. | 0:56:02 | 0:56:06 | |
Making mistakes, it seems, is just part of what it is to be human. | 0:56:08 | 0:56:13 | |
We are stuck with our intuitive inner stranger. | 0:56:19 | 0:56:23 | |
The challenge this poses is profound. | 0:56:26 | 0:56:29 | |
If it's human nature to make these predictable mistakes | 0:56:31 | 0:56:34 | |
and we can't change that, what, then, can we do? | 0:56:34 | 0:56:38 | |
We need to accept ourselves as we are. | 0:56:40 | 0:56:43 | |
The cool thing about being a human versus a monkey | 0:56:43 | 0:56:46 | |
is that we have a deliberative self that can reflect on our biases. | 0:56:46 | 0:56:50 | |
System 2 in us has for the first time realised | 0:56:50 | 0:56:53 | |
that there's a System 1, and with that realisation, | 0:56:53 | 0:56:56 | |
we can shape the way we set up policies. | 0:56:56 | 0:56:59 | |
We can shape the way we set up situations to allow ourselves | 0:56:59 | 0:57:03 | |
to make better decisions. | 0:57:03 | 0:57:04 | |
This is the first time in evolution that this has happened. | 0:57:04 | 0:57:07 | |
If we want to avoid mistakes, we have to reshape the environment | 0:57:12 | 0:57:16 | |
we've built around us rather than hope to change ourselves. | 0:57:16 | 0:57:20 | |
We've achieved a lot despite all of these biases. | 0:57:27 | 0:57:31 | |
If we are aware of them, we can probably do things | 0:57:31 | 0:57:34 | |
like design our institutions and our regulations | 0:57:34 | 0:57:38 | |
and our own personal environments and working lives to minimise | 0:57:38 | 0:57:43 | |
the effect of those biases | 0:57:43 | 0:57:45 | |
and help us think about how to overcome them. | 0:57:45 | 0:57:49 | |
We are limited, we're not perfect. | 0:57:54 | 0:57:56 | |
We're irrational in all kinds of ways, | 0:57:56 | 0:57:58 | |
but we can build a world that is compatible with this | 0:57:58 | 0:58:01 | |
and get us to make better decisions rather than worse decisions. | 0:58:01 | 0:58:05 | |
That's my hope. | 0:58:05 | 0:58:06 | |
And by accepting our inner stranger, | 0:58:10 | 0:58:13 | |
we may come to a better understanding of our own minds. | 0:58:13 | 0:58:18 | |
I think it is important, in general, | 0:58:18 | 0:58:21 | |
to be aware of where beliefs come from. | 0:58:21 | 0:58:24 | |
And if we think that we have reasons for what | 0:58:27 | 0:58:30 | |
we believe, that is often a mistake; | 0:58:30 | 0:58:34 | |
our beliefs and our wishes and our hopes | 0:58:34 | 0:58:38 | |
are not always anchored in reasons. | 0:58:38 | 0:58:41 | |
They're anchored in something else | 0:58:41 | 0:58:43 | |
that comes from within and is different. | 0:58:43 | 0:58:46 |