Why Do I Need You? The Brain with David Eagleman



Transcript



What does a brain need to be healthy?

0:00:040:00:06

Well, it needs nutrients from the food you eat,

0:00:080:00:11

it needs oxygen from your blood, plenty of water.

0:00:110:00:14

But there's something else, something equally as important.

0:00:140:00:18

It needs other people.

0:00:180:00:20

CHEERING

0:00:200:00:25

Human beings are extremely social creatures.

0:00:310:00:34

We come together, we team up,

0:00:360:00:39

we share moments of intense joy and disappointment.

0:00:390:00:43

We don't just seek out other people to have a good time.

0:00:460:00:49

Your brain function depends on the social web that you're in.

0:00:490:00:53

Your neurons require other people's neurons to thrive and survive.

0:00:530:00:58

I want to show you how our brains are fundamentally wired

0:00:590:01:03

to work together...

0:01:030:01:04

..how this social network that envelops us

0:01:070:01:10

from birth is vital for our survival.

0:01:100:01:13

Yeah? OK!

0:01:170:01:19

Understanding how brains deal with each other allows us

0:01:190:01:22

to understand what bonds our species, driving us to help

0:01:220:01:28

one another and what makes us hate...

0:01:280:01:32

..what allows acts of human violence.

0:01:330:01:36

It helps us to make sense of our past

0:01:370:01:41

and holds the key to our future.

0:01:410:01:43

There are seven billion people living today - seven billion brains

0:01:560:02:02

moving, choosing, acting,

0:02:020:02:06

believing and connecting with other brains.

0:02:060:02:10

Brains are traditionally studied in isolation but, in fact,

0:02:120:02:16

much of the circuitry of the brain has to do with other brains.

0:02:160:02:20

We're fundamentally social creatures

0:02:200:02:22

and our society is a complex web of interaction.

0:02:220:02:26

On any normal day, we intersect with an enormous number of people.

0:02:260:02:31

'Our lives are built on these intersections,

0:02:330:02:36

'not just between us and our family and friends and work colleagues,

0:02:360:02:42

'but also between them and the people they meet.'

0:02:420:02:45

'Even the most basic encounter...'

0:02:510:02:54

like getting a cup of coffee...

0:02:540:02:56

'..relies on trust with a stranger.'

0:02:560:02:59

-Could I get a latte, please?

-Definitely.

-Thanks.

0:02:590:03:01

Everywhere we look, we see complex social interactions, relationships

0:03:040:03:10

forming and breaking, bonds of love and support, social networking.

0:03:100:03:16

'We clump into large groups to share our knowledge.'

0:03:180:03:22

You've got all these random spots in your brain that get

0:03:220:03:25

wired up into an associative neural network.

0:03:250:03:27

'We work to impress each other and we swap ideas.'

0:03:270:03:32

I'll stick around for any questions that anyone has. Thank you.

0:03:320:03:34

CLAPPING

0:03:340:03:37

'Most research looks at one brain at a time,

0:03:380:03:41

'but that misses the fact that a great deal of our brain activity

0:03:410:03:45

'is dedicated to communicating with each other,

0:03:450:03:49

'interpreting each other.'

0:03:490:03:50

'Our social drive is deeply rooted in our neural circuitry.'

0:03:510:03:55

Take a look at this film from the 1940s.

0:04:000:04:02

What do you see happening here?

0:04:020:04:04

Is this just a simple animation of some shapes or something more?

0:04:050:04:10

Do you see a chase, a fight, a love story?

0:04:100:04:14

The big one seems to be pushing the little one around.

0:04:160:04:20

It seems like the two triangles are in a little bit of a squabble.

0:04:200:04:25

There are relationships here

0:04:250:04:26

in terms of one is more dominant than the other.

0:04:260:04:29

Back in the 1940s, psychologists Fritz Heider

0:04:310:04:34

and Marianne Simmel created this film as part of an experiment.

0:04:340:04:39

-The ball doesn't seem to want to be in there.

-It's freaking out.

0:04:390:04:42

It's scared. It looks like a trap to me.

0:04:420:04:45

It looks like the smaller triangle is being shut out and, like,

0:04:460:04:49

trying to peer in.

0:04:490:04:52

-They are paired in a way that seems friendly.

-Yeah.

0:04:520:04:56

-This is really fun.

-It's fun, right?

0:04:560:04:58

'What Heider and Simmel found, as I did,

0:04:590:05:02

'is how easy it is to look at moving shapes and to see meaning

0:05:020:05:07

'and motives and emotion, all in the form of a social narrative.'

0:05:070:05:12

I kind of get the sense that they're cats and dogs.

0:05:140:05:16

It seemed like the big one might have been,

0:05:160:05:18

like, his dad or something.

0:05:180:05:20

Just call it more of, like, a mating ritual -

0:05:200:05:22

two competitors going for one possible mate.

0:05:220:05:25

These are just shapes on a screen but we can't help

0:05:280:05:31

but tell stories about them. Why?

0:05:310:05:34

It's because our brains are so primed for social interaction

0:05:340:05:37

that we look for intention in relationships all around us.

0:05:370:05:41

One way we navigate the social world

0:05:430:05:45

is by judging other people's intentions.

0:05:450:05:49

Is she trying to be helpful?

0:05:490:05:52

Are we a trustworthy team?

0:05:540:05:56

Our brains are good at making these sorts of judgements

0:05:570:06:00

and we do it constantly

0:06:000:06:03

but do we learn this skill from life experience or are we born with it?

0:06:030:06:07

To figure out which one it is, I've invited over some people who

0:06:100:06:13

don't have much experience with the world.

0:06:130:06:16

SHE CRIES

0:06:160:06:18

I've invited them to a puppet show.

0:06:180:06:20

These babies are all under 12 months old.

0:06:220:06:26

They're just beginning to explore the world around them.

0:06:260:06:30

You could say they're all a little short on life experience.

0:06:300:06:33

We decided to run a simple experiment

0:06:350:06:37

developed at Yale University.

0:06:370:06:39

Here's a duck struggling to open a box.

0:06:440:06:46

One bear helps the duck.

0:06:510:06:53

The other is mean to the duck.

0:06:590:07:01

OK, Booey. Here you go. There are two puppets.

0:07:050:07:08

'When the show's over, I let the babies choose a bear to play with.'

0:07:080:07:12

Yeah, OK. Is that the one you like? All right.

0:07:150:07:17

'Almost every one of them chooses the bear that's been kind.'

0:07:170:07:21

'These babies can't walk or talk

0:07:230:07:25

'and yet they already have the tools to make judgements about others.'

0:07:250:07:30

Yeah? OK!

0:07:320:07:33

It's often assumed that trust is something that we

0:07:350:07:37

learn from our experience in the world, but these experiments

0:07:370:07:40

demonstrate that, even as babies, we come equipped with

0:07:400:07:44

social antennae for feeling our way through the world.

0:07:440:07:47

The brain comes with inborn instincts for figuring out

0:07:470:07:51

who's trustworthy and who's not.

0:07:510:07:53

BABY GURGLES

0:07:530:07:54

As we grow, our social challenges become even more subtle

0:07:550:08:00

and complex.

0:08:000:08:01

Understanding others is one of the most demanding operations

0:08:030:08:07

that our brains perform.

0:08:070:08:08

They have to interpret words and, more than that, inflection,

0:08:110:08:16

facial expressions, body language.

0:08:160:08:18

Does she like me?

0:08:210:08:22

Is he interested in what I'm saying?

0:08:240:08:26

Do they want my help?

0:08:280:08:29

Society runs on our ability to read each other's social signals.

0:08:310:08:37

Take that ability away and the world becomes a very strange place.

0:08:390:08:44

Car enthusiast John Robison has always struggled to read

0:08:520:08:56

other people.

0:08:560:08:57

When I was a little boy, I was bullied and rejected by

0:08:590:09:03

other kids, and that didn't happen with machines, you know.

0:09:030:09:07

I could stand by a tractor on my grandparents' farm

0:09:070:09:12

and I could learn how to adjust it and it wouldn't tease me

0:09:120:09:16

or do anything bad, it wouldn't run away, it would

0:09:160:09:18

always be there and I could count on it.

0:09:180:09:20

And I guess I learned to make friends with

0:09:200:09:24

the machines before I learned how to make friends with other people.

0:09:240:09:29

In time, John's affinity for technology took him

0:09:290:09:32

to places his bullies could only dream of.

0:09:320:09:35

By 21, he was a roadie for the band Kiss.

0:09:360:09:40

This was me, back with Kiss in the '70s.

0:09:400:09:44

I'm older and fatter and stuff. I don't look the same any more.

0:09:440:09:47

Surrounded by legendary rock and roll excess,

0:09:500:09:53

his outlook remained different from other people's.

0:09:530:09:57

People would come up to me all the time and they would say,

0:09:570:10:00

"What's this guy like?" or, "What's that guy like?"

0:10:000:10:03

I would say, "Yeah, their stage set-up,

0:10:040:10:06

"they had Sunn 2000S bass amps," or, "Gene played Sunn Coliseums,

0:10:060:10:10

"and we had seven bass amps chained together,

0:10:100:10:13

"we had 2,200 watts in the bass system for that."

0:10:130:10:16

But I maybe couldn't tell you

0:10:170:10:18

the first thing about the musicians who sang through them.

0:10:180:10:21

Now I realise that shows that I did kind of live in

0:10:240:10:28

a different world all those years - a world of machines and equipment.

0:10:280:10:32

When he was 40, John was diagnosed with Asperger's - a form of autism.

0:10:350:10:41

Many regions of the brain are engaged during

0:10:480:10:51

social interaction, but in autism, that brain activity isn't seen

0:10:510:10:56

as strongly, and that's paralleled by diminished social skills.

0:10:560:11:01

I didn't really understand that there were complex messages

0:11:020:11:08

in faces until I was well into adulthood and learned about autism.

0:11:080:11:13

I knew that people could display signs of crazed anger...

0:11:150:11:20

..but if you asked about more subtle expressions, you know,

0:11:220:11:28

I think you're sweet and I wonder what you're hiding,

0:11:280:11:31

or, I'd really like to do that, or, I wish you'd do this or that, I...

0:11:310:11:35

I had no idea about things like that.

0:11:360:11:39

But then came a transforming moment in John's life.

0:11:450:11:48

In 2008, he was invited to Harvard Medical School to take part

0:11:490:11:54

in an experiment on his brain - overseen by Dr Alvaro Pascual-Leone.

0:11:540:12:00

It was an attempt to try to understand

0:12:020:12:05

how activity in one area

0:12:050:12:07

affects activity in another area and how that affects behaviour.

0:12:070:12:12

The experiment was only meant to help the scientists gain

0:12:140:12:17

greater knowledge about the autistic brain.

0:12:170:12:19

But then, something unexpected happened.

0:12:210:12:24

John was given transcranial magnetic stimulation, or TMS.

0:12:270:12:32

Magnetic coils were placed next to his head to generate

0:12:320:12:36

minute electrical currents in the brain and alter its activity.

0:12:360:12:40

The researchers targeted different regions of John's brain to see

0:12:420:12:46

whether interfering with his brain activity had

0:12:460:12:50

-any effect on his behaviour.

-They would test me after the session.

0:12:500:12:56

I would go home, kind of not knowing what to expect.

0:12:560:12:59

At first, there was no result,

0:13:000:13:02

but then they targeted the dorsolateral prefrontal cortex,

0:13:020:13:06

a region involved in flexible thinking and abstraction,

0:13:060:13:11

and something dramatic happened.

0:13:110:13:14

Somehow, I became different.

0:13:140:13:15

He contacted us, very excited, to say, you know,

0:13:190:13:24

"The effects of the stimulation seemed to have unlocked

0:13:240:13:27

"something and the effects are still lasting

0:13:270:13:30

"and I now can do things that I could never do!"

0:13:300:13:33

After TMS, I was able to sort of read signals from other people

0:13:380:13:44

and understand what was going on.

0:13:440:13:46

So I listened to that, fascinated by it, and thought, OK, well, whatever.

0:13:480:13:53

It'll go away. But it didn't.

0:13:530:13:55

It actually remained something that had really

0:13:550:13:58

fundamentally changed in him.

0:13:580:14:00

Somehow, and entirely accidentally,

0:14:030:14:06

the TMS had unlocked a whole new world for John.

0:14:060:14:10

A vegetable sandwich to bring home...

0:14:100:14:11

'I'd be tempted to say I couldn't read people and now I can,

0:14:110:14:16

'but that's not really true.'

0:14:160:14:18

-OK, how about a full-sized... One of them?

-Sure.

0:14:180:14:21

It's more accurate to say I had no idea there were these

0:14:210:14:25

messages emanating from other people.

0:14:250:14:28

'TMS showed me those messages, and now that

0:14:300:14:34

'I'm aware that they're out there, everything I do is different.'

0:14:340:14:38

All of a sudden, you can walk around and engage the world...

0:14:390:14:42

'..and it's a big, big thing.'

0:14:430:14:45

OK, thanks.

0:14:460:14:47

We don't know exactly what happened neurobiologically

0:14:500:14:53

but I think it now offers the opportunity for us

0:14:530:14:55

to understand what behavioural modifications, what interventions

0:14:550:15:00

might be possible to learn from him that we can then teach others.

0:15:000:15:04

John's transformation is a reminder that all

0:15:070:15:11

the activities of the human brain,

0:15:110:15:13

including the subtle interplay of emotions and relationships,

0:15:130:15:18

are rooted in the detailed patterns

0:15:180:15:21

of trillions of electrochemical signals.

0:15:210:15:24

Somehow, humans can look at each other

0:15:320:15:34

and study the arrangement of facial muscles

0:15:340:15:38

and then process that information into an understanding of

0:15:380:15:42

other people's thoughts and emotions.

0:15:420:15:45

It's an astonishing skill because the cues are so subtle

0:15:470:15:52

and the processing is so rapid

0:15:520:15:55

that the whole operation runs under your radar.

0:15:550:15:59

It only takes 33 milliseconds for your brain to process basic

0:16:010:16:06

information about someone's facial expression and start reacting to it.

0:16:060:16:12

So we're going to put one electrode right above your eyebrow...

0:16:120:16:15

'So how does it do that?'

0:16:150:16:16

..and the other right on your cheek. There we go. Great.

0:16:160:16:19

'I've invited a group of people to run an experiment.

0:16:220:16:27

'I've wired them up to a machine that measures movements in their

0:16:270:16:30

'facial muscles and I've asked them to look at photographs of faces.'

0:16:300:16:35

When participants are looking at a photograph with a smile

0:16:440:16:47

or a frown, we see this activity on the graph,

0:16:470:16:51

which indicates that their own facial muscles are moving.

0:16:510:16:54

Why?

0:16:550:16:56

Well, it turns out that they are automatically mirroring,

0:16:590:17:02

with their own faces, the expressions that they're seeing.

0:17:020:17:06

-That was fun, right, the last one?

-Yeah.

-Yeah, that was a fun test!

0:17:090:17:14

But what purpose does this mirroring serve?

0:17:140:17:16

I've invited a second group of people.

0:17:180:17:21

They're similar to the first group except for one thing.

0:17:210:17:25

This is the most lethal neurotoxin on the planet.

0:17:320:17:36

If you were to ingest even a fraction of this,

0:17:360:17:39

your brain could no longer tell your muscles how to contract,

0:17:390:17:42

and you would die of total paralysis.

0:17:420:17:44

So it seems unlikely that anyone would pay to have this

0:17:440:17:48

injected into themselves, but they do.

0:17:480:17:50

This is known as botulinum toxin or Botox.

0:17:520:17:55

If you inject it into your forehead muscles, it paralyses them

0:17:570:18:00

to reduce wrinkling.

0:18:000:18:02

But there's a less well-known side-effect.

0:18:040:18:07

When our participants with Botox went through the same tests,

0:18:110:18:16

their facial muscles responded less. No surprise there.

0:18:160:18:20

But replicating an experiment out of Duke University,

0:18:200:18:24

we had both groups look at facial expressions

0:18:240:18:27

and now they were asked to choose the word that best described

0:18:270:18:32

the emotion they were seeing.

0:18:320:18:34

Panic.

0:18:390:18:41

Panicked.

0:18:410:18:43

Upset.

0:18:450:18:46

On average, the Botox group was worse at identifying

0:18:480:18:53

the emotions correctly.

0:18:530:18:54

Sceptical?

0:18:560:18:57

It seems that the lack of feedback from their facial muscles

0:18:570:19:01

impairs their ability to read other people.

0:19:010:19:05

The paralysed faces of Botox users not only make it hard for us

0:19:050:19:10

to tell what THEY'RE feeling,

0:19:100:19:12

those same frozen muscles make it hard for THEM to read US.

0:19:120:19:17

And that tells us something.

0:19:180:19:21

When I'm happy or sad, part of that feeling

0:19:210:19:24

relies on the unconscious feedback from muscles in my face.

0:19:240:19:29

And our social brains take advantage of that,

0:19:290:19:32

so when we're trying to understand what someone else is feeling,

0:19:320:19:36

we try on their facial expression.

0:19:360:19:38

This automatic mirroring of expressions is just one way

0:19:410:19:46

in which we understand others.

0:19:460:19:49

The brain also has a deeper way,

0:19:490:19:52

one that's best explained at the movies.

0:19:520:19:55

One ticket, please.

0:20:030:20:05

Thank you.

0:20:060:20:08

When we go to the movie theatre,

0:20:100:20:12

we know full well that it's make-believe.

0:20:120:20:15

The people on the screen are just acting.

0:20:150:20:19

And yet, we still react. We gasp and flinch and cry.

0:20:190:20:25

Why do we fall for it?

0:20:250:20:26

To understand why we care about other people getting hurt,

0:20:330:20:36

we need to understand what happens in your brain when you get hurt.

0:20:360:20:40

So imagine that somebody

0:20:400:20:41

were to stab your hand with a syringe needle.

0:20:410:20:44

That activates a network of areas in your brain that we call

0:20:440:20:47

the pain matrix.

0:20:470:20:50

There's no single spot in the brain where pain is processed.

0:20:520:20:56

Instead, the perception of pain arises from several different

0:20:570:21:01

areas networking together.

0:21:010:21:03

Strangely enough,

0:21:030:21:06

this pain matrix is at the heart of how we connect with others.

0:21:060:21:10

Now, when you watch someone else get stabbed,

0:21:130:21:16

your pain matrix becomes activated.

0:21:160:21:19

Not the parts that tell you you've actually been touched,

0:21:200:21:24

but the parts involved in the emotional experience of pain.

0:21:240:21:28

In other words, watching someone else in pain and being in pain

0:21:280:21:33

use the same neural machinery and that's the basis of empathy.

0:21:330:21:38

To empathise with another person is to literally feel their pain.

0:21:470:21:52

You run a compelling simulation of what it would be like

0:21:550:21:59

if you were in that situation.

0:21:590:22:02

And our capacity to do this is why stories and movies and novels

0:22:020:22:07

are so absorbing and why they're so pervasive across human culture

0:22:070:22:12

because whether it's about total strangers or made-up characters,

0:22:120:22:17

you experience their agony and their ecstasy, you fluidly become them

0:22:170:22:23

and live their lives and stand in their vantage points.

0:22:230:22:27

You can tell yourself that the stories aren't real,

0:22:300:22:33

but some neurons deep in your brain can't tell the difference.

0:22:330:22:37

Our capacity to feel another person's pain is part of what

0:22:480:22:52

makes us so good at taking other people's perspective,

0:22:520:22:55

to step out of our shoes and into their shoes, neurally speaking.

0:22:550:22:59

We can't help but connect with others.

0:23:020:23:05

We're hardwired to be extremely social creatures.

0:23:050:23:09

And that raises a question - what would happen

0:23:090:23:14

if the brain were starved of human contact?

0:23:140:23:17

In 2009, peace activist Sarah Shourd

0:23:240:23:28

and her two companions were hiking in the mountains of northern Iraq,

0:23:280:23:32

an area that was at the time peaceful,

0:23:320:23:35

but they accidentally strayed into Iran and they were arrested.

0:23:350:23:39

They pulled us apart and threw us in separate cells and slammed the door.

0:23:410:23:45

And um...

0:23:450:23:47

That was the beginning of the next 410 days of my life in that cell.

0:23:470:23:53

In the early weeks and really months of solitary confinement,

0:23:590:24:03

you're reduced to an animal-like state.

0:24:030:24:06

I mean, you are an animal, in a cage.

0:24:060:24:08

And the majority of your hours are pacing,

0:24:140:24:18

and the animal-like state sort of eventually

0:24:230:24:27

transformed into a more plant-like state.

0:24:270:24:30

Your mind starts to slow down and your thoughts become repetitive.

0:24:320:24:37

Your brain turns on itself

0:24:440:24:46

and it becomes the source of your worst pain and your worst torture.

0:24:460:24:53

I would relive every detail of my life.

0:25:000:25:02

And eventually, you run out of memories and you've told them

0:25:040:25:08

all to yourself so many times and it doesn't take that long.

0:25:080:25:12

Extreme social deprivation causes deep psychological pain.

0:25:120:25:17

Without interaction, the brain suffers.

0:25:170:25:21

Solitary confinement is designed to eat away at

0:25:220:25:26

and really attack what essentially makes us human.

0:25:260:25:29

Sarah's brain used the scant sensory information it had

0:25:330:25:37

to construct a reality.

0:25:370:25:40

The sun would come in at a certain time of day at an angle

0:25:410:25:44

through my window and all of the little dust particles in my cell

0:25:440:25:49

were illuminated by the sun.

0:25:490:25:51

I saw all of those particles of dust as being

0:25:530:25:56

other human beings occupying the planet.

0:25:560:25:59

And they were in the stream of life, they were interacting, they

0:25:590:26:03

were bouncing off one another, they were doing something collective.

0:26:030:26:07

And I saw myself as off in a corner, you know,

0:26:110:26:15

walled off by myself, out of the stream of life.

0:26:150:26:18

In September 2010, after 410 days in solitary confinement,

0:26:240:26:30

Sarah was finally released and allowed to rejoin the world.

0:26:300:26:33

But for a long time, she suffered from extreme post-traumatic stress.

0:26:360:26:41

The philosopher Martin Heidegger said we can't talk about being,

0:26:440:26:48

we can only talk about being in the world.

0:26:480:26:51

In other words, the world around you is a part of who you are.

0:26:510:26:56

In a vacuum, you lose your sense of self.

0:26:580:27:01

It's not easy for science to study people

0:27:050:27:08

while they're experiencing solitary confinement,

0:27:080:27:11

but a simple experiment designed by neuroscientist

0:27:110:27:14

Naomi Eisenberger can give us

0:27:140:27:16

an insight into what's happening in the brain when we feel excluded.

0:27:160:27:21

It's based on a game of catch.

0:27:240:27:27

While volunteers played a computer game of catch, Eisenberger

0:27:290:27:33

and her team scanned their brains.

0:27:330:27:35

The volunteers thought the other characters were controlled by other

0:27:370:27:41

participants, but in fact, they were just part of a computer program.

0:27:410:27:45

At first, the other characters played nicely,

0:27:470:27:50

but after a while, they'd cut the volunteer out of the game.

0:27:500:27:54

And simply play between themselves.

0:27:540:27:57

She found that being left out of the game activated the pain matrix.

0:27:590:28:05

Not getting the ball might seem insignificant,

0:28:090:28:12

but to the brain, social rejection is so meaningful that it hurts.

0:28:120:28:17

But that pain, in turn, is useful.

0:28:220:28:25

It pushes us in the direction of bonding with others.

0:28:280:28:31

We all seek out alliances.

0:28:370:28:40

We join with friends, with family,

0:28:400:28:44

with colleagues.

0:28:440:28:47

It could be which team we support.

0:28:470:28:49

What style we go for.

0:28:510:28:53

What our hobbies are.

0:28:540:28:57

It gives comfort to belong to a group.

0:28:570:29:00

And that gives us a critical clue into our success as a species.

0:29:000:29:05

Survival of the fittest isn't just about individuals.

0:29:070:29:10

It's also about groups.

0:29:100:29:13

We're safer, we're more productive, we overcome challenges.

0:29:130:29:17

The drive to work in groups has helped human populations

0:29:190:29:23

thrive across the planet and build entire civilisations.

0:29:230:29:27

And yet, there's a flipside to this drive to come together.

0:29:330:29:37

Because for every ingroup, there are outsiders.

0:29:370:29:40

And the consequences of that can be very dark.

0:29:440:29:47

History is plagued with examples of one group turning on another

0:29:520:29:56

that was defenceless and posed no threat.

0:29:560:30:00

If you were to look at my family tree,

0:30:030:30:06

you would see that most of the branches end in the early 1940s.

0:30:060:30:10

This is because my family is ethnically Jewish.

0:30:100:30:15

That small social marker was enough to prompt Nazi genocide.

0:30:150:30:19

Under normal circumstances,

0:30:230:30:25

you wouldn't find it conscionable to go and murder your neighbour.

0:30:250:30:29

So what is it that allows hundreds or thousands of people to

0:30:290:30:33

suddenly do exactly that?

0:30:330:30:36

What is it about certain situations that short-circuits the normal

0:30:360:30:41

social functioning of the brain?

0:30:410:30:43

While the Nazi Holocaust was on an unprecedented scale,

0:30:470:30:52

it wasn't unique.

0:30:520:30:54

Genocide continued to occur all over the world

0:30:540:30:58

and within a generation, it returned to Eastern Europe.

0:30:580:31:02

This time, it was in Yugoslavia.

0:31:020:31:05

The Bosnian war from 1992 to '95 saw atrocities on both sides.

0:31:150:31:22

In one of the worst, more than 100,000 Bosnian Muslims,

0:31:220:31:26

known as Bosniaks,

0:31:260:31:28

were slaughtered by Serbians in actions known as ethnic cleansing.

0:31:280:31:32

One of the most horrible incidents happened here at Srebrenica.

0:31:350:31:39

Over the course of just ten days,

0:31:430:31:46

8,000 people were systematically killed.

0:31:460:31:49

How does something like this happen?

0:31:510:31:55

Here in 1995, thousands of Bosniaks took refuge inside this

0:31:550:31:59

United Nations compound

0:31:590:32:01

because this village was surrounded by siege forces.

0:32:010:32:04

But then, on July 11th,

0:32:060:32:08

the UN commanders made the decision to expel all the refugees

0:32:080:32:12

and they delivered them right into the hands of their enemies,

0:32:120:32:16

who were waiting just outside this gate.

0:32:160:32:18

Women were raped and men were executed

0:32:190:32:23

and even children were killed and this was just the beginning of what

0:32:230:32:28

would be the largest genocide on European soil since the Holocaust.

0:32:280:32:32

The Dutch were there.

0:32:420:32:44

I mean, the world was there, the UN,

0:32:440:32:47

the Serbs were there as perpetrators.

0:32:470:32:50

Everything was mixed.

0:32:500:32:52

The refugees were there, the babies were crying.

0:32:520:32:55

I was there, being protected with that UN ID card that said

0:32:560:32:59

UN Language Assistant, whatever...

0:32:590:33:02

Hasan Nuhanovic's status as a UN translator made him

0:33:040:33:08

part of a protected group.

0:33:080:33:10

But his family members were marked out by their identity as Muslims.

0:33:120:33:17

At that very moment when my family was being sent out of the compound

0:33:170:33:22

to actually die, I lost my mother, my brother and my father.

0:33:220:33:26

You know, like, you are in a situation where your family

0:33:260:33:30

is being killed.

0:33:300:33:32

And I was thinking, "My God...

0:33:320:33:35

"Why?"

0:33:350:33:38

One of the most striking things

0:33:400:33:42

is that the perpetrators weren't strangers.

0:33:420:33:45

They were people with whom his family had previously shared

0:33:450:33:48

a great deal.

0:33:480:33:50

The continuation, you know, of the killings, of torture,

0:33:510:33:56

was perpetrated by our neighbours.

0:33:560:34:01

You know, the very people we had been living with for decades.

0:34:010:34:05

They were capable of killing their own school friends.

0:34:080:34:12

I remember they said they arrested a dentist who was a Bosniak,

0:34:140:34:18

the best dentist in the town.

0:34:180:34:20

They tied him to a light pole, like this.

0:34:220:34:25

In front of the post office. He was hanging there like this.

0:34:250:34:28

And they beat him with a metal bar, they broke his spine.

0:34:280:34:34

And he was there, dying for days, while Serb children went to school.

0:34:340:34:39

Walking by his body, you know.

0:34:390:34:43

I mean, there are universal values

0:34:450:34:48

and these universal values are kind of very basic.

0:34:480:34:53

Don't kill.

0:34:530:34:54

April '92, this... Don't kill...

0:34:560:35:01

suddenly disappeared.

0:35:010:35:03

It was like - go and kill.

0:35:030:35:05

It was allowed to kill.

0:35:050:35:07

This is where Hasan's family is buried and each year,

0:35:180:35:22

there are new bodies that are found and identified

0:35:220:35:25

and they're brought here.

0:35:250:35:26

Many of these graves are fresh.

0:35:260:35:29

And across the human species, this is just one genocide of many.

0:35:320:35:35

Genocides keep happening. Rwanda, Darfur, Nanking,

0:35:370:35:42

Armenia... And my interest is in understanding why.

0:35:420:35:46

Traditionally, we ask this question through the lens of history

0:35:490:35:53

or economics or politics, and those are all important vantage points,

0:35:530:35:57

but I think for a complete picture, one more lens is needed.

0:35:570:36:01

We need to understand genocide as a neural phenomenon.

0:36:010:36:05

I've been researching this back in my laboratory

0:36:120:36:15

and here is my main question - when we interact with someone,

0:36:150:36:20

does our brain function differ according to which group they're in?

0:36:200:36:24

For every ingroup we belong to,

0:36:270:36:28

there's at least one group that we don't.

0:36:280:36:32

And that division can be based on anything.

0:36:320:36:35

Race, or gender, or wealth, or religion.

0:36:370:36:41

We put 130 participants in the scanner.

0:36:470:36:50

And here's what they saw - six hands on the screen

0:36:530:36:56

and the computer randomly picks one of these

0:36:560:36:58

and then that hand gets stabbed by a syringe needle.

0:36:580:37:01

Now, that activates the pain matrix, which is what comes online

0:37:060:37:09

when you're in pain or you see someone else in pain.

0:37:090:37:12

Now, here's the trick.

0:37:160:37:17

We now added a label to each hand -

0:37:170:37:20

Jewish, Christian, Muslim, Hindu, Atheist, Scientologist.

0:37:200:37:25

And the question is - would they care as much

0:37:250:37:28

when they see a member of their outgroup getting stabbed?

0:37:280:37:31

So, here's what we found.

0:37:400:37:42

Here's a subject and when he watched a member of his ingroup getting

0:37:420:37:47

stabbed, there was a large neural response in this area of his brain.

0:37:470:37:51

But when he watched a member of one of his outgroups get stabbed,

0:37:510:37:55

there was essentially a flat line.

0:37:550:38:00

We scanned a range of volunteers

0:38:000:38:02

and there are individual differences, but the trend is clear.

0:38:020:38:06

A single-word label is enough

0:38:060:38:09

to change your brain's basic preconscious response

0:38:090:38:12

to another person in pain.

0:38:120:38:14

In other words, how much you care about them.

0:38:140:38:17

Now, you might have opinions about religion

0:38:170:38:20

and its historical divisiveness,

0:38:200:38:22

but even atheists here care more about other atheists'

0:38:220:38:27

hands getting stabbed than they do about other people.

0:38:270:38:30

So it's not really about religion. It's about which team you're on.

0:38:300:38:34

This is just the first step in understanding how we get to this.

0:38:400:38:44

To understand how groups of people can commit atrocities,

0:38:510:38:56

it can help to look at the behaviour of individuals, like psychopaths.

0:38:560:39:02

Some of the most callous,

0:39:020:39:03

inhumane crimes ever recorded have been committed by psychopaths.

0:39:030:39:09

But what's different about their brains that allows them

0:39:100:39:15

to act that way?

0:39:150:39:17

There are networks in the medial prefrontal cortex that

0:39:190:39:23

underlie social interaction.

0:39:230:39:26

When we interact with other people, this area becomes active.

0:39:260:39:30

But in the brain of someone with extreme psychopathy, this area

0:39:320:39:36

has a lot less activity.

0:39:360:39:39

A psychopath doesn't care about you.

0:39:400:39:43

He might be able to run a simulation

0:39:430:39:45

of what you're going to do or how you might react,

0:39:450:39:48

but when it comes to an emotional understanding

0:39:480:39:52

of what it's like to be you, he doesn't get that.

0:39:520:39:56

To him, you're just an obstacle to be worked around or manipulated,

0:39:560:40:00

rather than a fellow human being.

0:40:000:40:03

So what accounts for genocide? Is it driven by armies of psychopaths?

0:40:030:40:09

Well, that can't be it

0:40:090:40:10

because psychopaths only make up a small fraction of the population,

0:40:100:40:14

but genocide typically engages a wider community.

0:40:140:40:18

So here's the question - how do you get ordinary citizens on board?

0:40:180:40:23

At the University of Leiden in Holland, Dr Lasana Harris

0:40:290:40:33

has been conducting an experiment to understand a piece of this puzzle.

0:40:330:40:38

So now we're going to start the experiment.

0:40:490:40:52

What you're going to see is a bunch of pictures of different people.

0:40:520:40:56

Your job is just to react naturally to those pictures.

0:40:560:40:59

Lasana is looking at activity

0:41:010:41:03

in the brain areas involved in human social interaction,

0:41:030:41:07

in particular the medial prefrontal cortex.

0:41:070:41:11

This comes online when we think about other people.

0:41:110:41:14

It's less active when dealing with something inanimate, like a cup.

0:41:140:41:18

What Lasana found is that this region has a similarly low

0:41:200:41:24

response when we deal with certain types of other people.

0:41:240:41:28

What he sees now are stereotypical

0:41:320:41:34

images of people from different social groups.

0:41:340:41:38

What we see here is that this network of brain regions,

0:41:410:41:44

including medial prefrontal cortex, is less active

0:41:440:41:48

when our participant looks at the homeless people.

0:41:480:41:52

So what this pattern of activity

0:41:520:41:54

suggests is a type of mental avoidance.

0:41:540:41:56

They are not thinking about the mind of the homeless person

0:41:560:42:00

in the same way they thought about the mind

0:42:000:42:02

of the college student that they saw or the businesspeople.

0:42:020:42:05

So if, for instance,

0:42:080:42:09

you imagine that interacting with a homeless person will be unpleasant,

0:42:090:42:13

it will make you feel bad.

0:42:130:42:15

You may feel some demand to donate some of your money

0:42:150:42:18

and all of these unpleasant pressures that come along with it.

0:42:180:42:22

By shutting off the system, you never experience those feelings.

0:42:220:42:26

To a brain that responds this way, homeless people are dehumanised.

0:42:290:42:34

They are viewed more like objects and that can enable us to not care.

0:42:340:42:39

Of course, if you don't properly diagnose this person

0:42:440:42:47

as a human being - which is happening here -

0:42:470:42:49

then the different moral rules we have

0:42:490:42:52

that are reserved for human people may not apply.

0:42:520:42:55

So, under the right circumstances,

0:42:590:43:01

our brain activity can look more like a psychopath's.

0:43:010:43:05

But to understand how we can get to genocide,

0:43:060:43:10

we need to understand one more thing about group behaviour.

0:43:100:43:14

Genocide is only possible

0:43:140:43:17

when dehumanisation happens on a massive scale -

0:43:170:43:20

not just a few individuals,

0:43:200:43:22

but whole sections of the population.

0:43:220:43:24

We are talking about a group of people committing atrocities.

0:43:260:43:30

And if all the members of that perpetrating group are complicit,

0:43:330:43:37

it's as if they have all somehow experienced

0:43:370:43:40

the same reduction in brain activity

0:43:400:43:42

when they think about their outgroup.

0:43:420:43:44

This can be understood and studied like a disease outbreak -

0:43:470:43:51

a kind of group contagion, one that is most often spread deliberately.

0:43:510:43:56

The perfect tool for this job is propaganda.

0:44:010:44:04

It plugs right into neural networks

0:44:040:44:06

and it dials down the degree to which we care about other people.

0:44:060:44:11

As at other sites of genocide,

0:44:110:44:14

that's what happened in the former Yugoslavia.

0:44:140:44:16

The people who went on to torture and kill their neighbours

0:44:180:44:21

were bombarded with propaganda.

0:44:210:44:24

State-controlled broadcasters demonised the Bosnian Muslims

0:44:240:44:29

with distorted news stories.

0:44:290:44:31

We knew from the beginning

0:44:310:44:33

that somebody is helping the Muslims and arming them.

0:44:330:44:36

They went so far as to claim that the Muslims

0:44:360:44:39

were feeding Serbian children to the lions at the zoo.

0:44:390:44:42

Across place and time, the language of propaganda changes very little.

0:44:470:44:51

It always plays the familiar tune of dehumanisation.

0:44:510:44:55

"Make your enemy less than human.

0:44:550:44:58

"Make him like an animal."

0:44:580:45:00

Propaganda is a weapon.

0:45:030:45:04

And, over the course of human history,

0:45:080:45:10

it's become an art and a science

0:45:100:45:14

and it's become ever more dangerous.

0:45:140:45:17

In our connected age,

0:45:240:45:26

any extremist group can reach millions of people with a keystroke.

0:45:260:45:31

The internet is the perfect carrier for propaganda messages

0:45:310:45:35

to reach the people most likely to act upon them -

0:45:350:45:38

young men.

0:45:380:45:40

The political agendas around us

0:45:440:45:47

actually manipulate the brain activity inside of us.

0:45:470:45:51

So is there any way to stop what has happened in the past

0:45:520:45:55

from continuing into the future?

0:45:550:45:58

One possible solution lies in a 1960s experiment

0:46:010:46:05

that was conducted not in a science lab,

0:46:050:46:08

but a school.

0:46:080:46:10

It was 1968,

0:46:130:46:15

the day after the assassination of Martin Luther King.

0:46:150:46:19

Is there anyone in the United States

0:46:240:46:26

that we do not treat as our brothers?

0:46:260:46:28

-Yeah.

-Who?

-Black people.

0:46:280:46:30

The black people. Who else?

0:46:300:46:32

Jane Elliott was a teacher in a small town in Iowa

0:46:320:46:35

and she wanted to show her class what prejudice really felt like.

0:46:350:46:40

How are black people treated?

0:46:400:46:41

They don't get anything in this world.

0:46:410:46:43

-Why is that?

-Because they are a different colour.

0:46:430:46:47

These two men were in that class.

0:46:500:46:53

This was Rex, back then...

0:46:530:46:55

..and this was Ray.

0:46:580:46:59

How many of you in here have blue eyes?

0:47:010:47:04

OK. How many in here have brown eyes?

0:47:050:47:08

Jane says, "We are going to have this exercise,"

0:47:080:47:11

and she right away launches into the propaganda

0:47:110:47:14

of "blue eyes are better than brown eyes".

0:47:140:47:16

Blue-eyed people are better than brown-eyed people.

0:47:160:47:20

Ray and Rex both had blue eyes.

0:47:220:47:26

You brown-eyed people are not to play

0:47:270:47:30

with the blue-eyed people on the playground

0:47:300:47:32

because you are not as good as blue-eyed people.

0:47:320:47:35

The brown-eyeds were denied privileges given to the blue eyes,

0:47:350:47:39

and they had to wear special collars.

0:47:390:47:41

You begin to notice today

0:47:410:47:43

that we spend a great deal of time waiting for brown-eyed people.

0:47:430:47:46

Do you remember what your own behaviour was like

0:47:540:47:58

when you were on top?

0:47:580:48:00

-I was...tremendously evil to my friends.

-How so?

0:48:000:48:05

I was going out of my way to pick on my brown-eyed friends

0:48:050:48:11

for the sake of my own promotion.

0:48:110:48:14

What did you do?

0:48:140:48:16

I recall...telling Mrs Elliott - Jane - that she should keep

0:48:180:48:24

the yardstick at hand in case those brown-eyeds got out of control.

0:48:240:48:27

I don't see the yardstick, do you?

0:48:270:48:29

It's over there.

0:48:290:48:32

Mrs Elliott, you'd better keep that on your desk,

0:48:320:48:34

so as the brown-eyed people don't get out of hand.

0:48:340:48:37

At that time, my hair was quite blonde and my eyes were quite blue,

0:48:380:48:42

and I was the perfect little Nazi.

0:48:420:48:44

I looked for ways to be mean to my friends

0:48:450:48:50

who, minutes or hours earlier, had been very close to me.

0:48:500:48:56

But next day, there was a reversal of fortune.

0:48:580:49:02

Yesterday, I told you

0:49:040:49:05

that brown-eyed people aren't as good as blue-eyed people.

0:49:050:49:08

That wasn't true. I lied to you, yesterday.

0:49:080:49:11

Oh, boy, here we go again.

0:49:130:49:15

The truth is...

0:49:150:49:16

..that brown-eyed people are better than blue-eyed people.

0:49:170:49:20

The person you trust stands before you

0:49:200:49:22

and says, "I was wrong. Now, here's the truth."

0:49:220:49:27

Takes your world and shatters it,

0:49:280:49:32

like you have never had your world shattered before.

0:49:320:49:35

You blue-eyed people are not to play with the brown-eyed people.

0:49:350:49:39

Blue-eyed people, go to the back,

0:49:390:49:41

the brown-eyed people, come to the front.

0:49:410:49:43

It's not fair!

0:49:460:49:47

Tell me a little more about what it was like

0:49:500:49:52

when you were in the down group.

0:49:520:49:54

You have such a sense of loss of personality and self,

0:49:540:49:59

that makes it almost impossible to function

0:49:590:50:01

with what is going on in the room.

0:50:010:50:03

Should the colour of some other person's eyes

0:50:030:50:06

have anything to do with how you treat them?

0:50:060:50:08

-CLASS: No.

-All right, then -

0:50:080:50:10

should the colour of their skin?

0:50:100:50:11

CLASS: No.

0:50:110:50:13

Should you judge people by the colour of their skin?

0:50:130:50:17

CLASS: No.

0:50:170:50:18

If I were just going to riff, guess at it,

0:50:190:50:21

it's that one of the most important things we learn as humans

0:50:210:50:24

is perspective-taking

0:50:240:50:25

and kids don't often get really meaningful exercise in that,

0:50:250:50:30

and when you're forced into understanding

0:50:300:50:32

what it is like to stand in someone else's shoes,

0:50:320:50:34

that opens up a lot of cognitive pathways for you.

0:50:340:50:37

I remember saying something to my dad about a comment he made,

0:50:370:50:41

saying "No, that's not appropriate."

0:50:410:50:45

And it did change within the family.

0:50:450:50:48

But you talk about a little kid making that statement, it's huge.

0:50:480:50:53

But it reaffirmed that you can do that -

0:50:530:50:56

you could begin to change.

0:50:560:50:58

The brilliance of the blue eyes/brown eyes experiment

0:51:060:51:09

is that the teacher, Jane Elliott, switched which group was on top

0:51:090:51:13

and that allowed the students to extract the larger lesson,

0:51:130:51:17

which is that systems of rules can be arbitrary.

0:51:170:51:20

They learned that the truths of the world are not fixed

0:51:200:51:25

and they are not even necessarily truths.

0:51:250:51:27

And this is what empowered the children as they grew

0:51:270:51:30

to see through the smoke and mirrors of other people's political agendas

0:51:300:51:34

and to form their own opinions -

0:51:340:51:36

surely a skill we should be teaching to all of our children.

0:51:360:51:40

Should the colour of some other person's eyes

0:51:410:51:43

have anything to do with how you treat them?

0:51:430:51:45

CLASS: No.

0:51:450:51:46

GUNFIRE

0:51:470:51:50

When people are armed with an understanding

0:51:500:51:52

of how propaganda works,

0:51:520:51:54

the power of propaganda is reduced.

0:51:540:51:58

As we come to understand the deep importance of cooperation,

0:51:590:52:03

we stand a chance of not only reducing dehumanisation,

0:52:030:52:08

but achieving our potential as a species.

0:52:080:52:11

Genocide doesn't have to be the norm.

0:52:140:52:17

Instead, our fundamentally social nature

0:52:250:52:28

can hold the key to our success as a species.

0:52:280:52:32

Our future, our survival,

0:52:340:52:37

is intimately, permanently bound up with that of the people around us.

0:52:370:52:42

Our social drive is at the root

0:52:450:52:47

of extraordinary acts of bravery and generosity.

0:52:470:52:53

Who you are has everything to do with who we are.

0:52:550:52:59

Our brains are so fundamentally wired to interact

0:52:590:53:02

that it is not always clear where each of us begins and ends.

0:53:020:53:06

Our species is more than just seven billion individuals,

0:53:110:53:16

spread out across the planet.

0:53:160:53:17

We are something more like a single, vast super-organism.

0:53:190:53:23

Because what your friends know and love as you

0:53:250:53:29

is really a neural network,

0:53:290:53:31

embedded in a far larger web of other neural networks.

0:53:310:53:36

In this age of digital connection,

0:53:400:53:42

we desperately need to understand the links between humans.

0:53:420:53:46

If we want our civilisations to have a bright future,

0:53:460:53:50

we'll need to understand how human brains interact,

0:53:500:53:53

the dangers and the opportunities,

0:53:530:53:55

because there is no avoiding the truth

0:53:550:53:57

that is etched into our neural circuitry.

0:53:570:54:00

We need each other.

0:54:000:54:02

Next time on The Brain, I am going to look into the future.

0:54:080:54:11

What if the brain could do more? Handle more?

0:54:130:54:18

What if there were other ways for it to operate?

0:54:180:54:21

We will look at how we can use our brains

0:54:230:54:26

to control new kinds of bodies.

0:54:260:54:28

How our sensory experience can be expanded to new horizons.

0:54:330:54:38

We will look at how we might one day separate our minds

0:54:400:54:43

from our physical selves.

0:54:430:54:44

What if the study of the brain could address our mortality?

0:54:460:54:49

What if, in the future, we didn't have to die?

0:54:490:54:53

So, what is next for our brains?

0:54:550:54:57

What do the next thousand years have in store for us?

0:54:570:55:01

And, in the far future, what is the human race going to look like?

0:55:010:55:05

What will we be capable of?

0:55:050:55:07

We are heading for a fundamental change in the relationship

0:55:080:55:12

between the body, the brain and the outside world.

0:55:120:55:16

We are marrying our biology with our technology

0:55:170:55:20

and that is poised to transform who we will be.

0:55:200:55:25
