Mind Control - The Brain: A Secret History


Mind Control


Transcript


Line | From | To

Why do we do the things we do?

0:00:020:00:04

What really makes us tick?

0:00:040:00:08

How do our minds work?

0:00:080:00:11

For centuries,

0:00:110:00:12

these questions were largely left to philosophers and theologians.

0:00:120:00:17

Then, about 100 years ago, a new science began to shine

0:00:200:00:23

a bright light on the inner workings of the mind.

0:00:230:00:28

It was called "experimental psychology".

0:00:280:00:31

In this series, I'm tracing the history of how this "new science"

0:00:310:00:36

revealed things about ourselves

0:00:360:00:38

that were surprising and often profoundly shocking.

0:00:380:00:41

The experiment requires that we continue.

0:00:410:00:43

But he might be dead in there!

0:00:430:00:45

In this film, I'm investigating the inventive, sometimes barbaric ways

0:00:450:00:50

psychology has been used to control and manipulate people.

0:00:500:00:55

How does it feel, good or bad?

0:00:550:00:57

-Good.

-Good?

0:00:570:00:58

It's a science which strikes

0:00:580:01:00

at the heart of what we hold most dear - our free will.

0:01:000:01:03

I don't think people like having their behaviour tinkered with,

0:01:030:01:06

I don't think they like to be told what to do.

0:01:060:01:10

The pursuit of mind control led to some truly horrific experiments,

0:01:100:01:15

and left many casualties in its wake.

0:01:150:01:18

I only remember that I forget.

0:01:180:01:21

Everything.

0:01:210:01:23

But investigating the forces that shape our minds and our behaviour

0:01:240:01:28

has also led to profound insights into how the brain works.

0:01:280:01:32

Ever since I was a medical student,

0:01:350:01:38

I have been fascinated by psychology,

0:01:380:01:40

by what it has revealed about ourselves

0:01:400:01:43

and by how far some researchers have been prepared to go.

0:01:430:01:47

I'm on my way to Cardiff and I'm feeling apprehensive.

0:02:020:02:06

I'm about to take a class A hallucinogenic drug.

0:02:060:02:10

In the past, this drug and others like it were used in experiments

0:02:120:02:16

which were designed with a very sinister purpose -

0:02:160:02:18

to achieve total mind control.

0:02:180:02:22

The drug I'm going to be given is psilocybin,

0:02:240:02:28

the active ingredient of magic mushrooms.

0:02:280:02:31

Normally, taking psilocybin would be illegal.

0:02:330:02:36

-Michael.

-Hello, nice to see you.

0:02:370:02:40

But in this case, Professor David Nutt

0:02:400:02:42

and his team from Imperial College are doing a scientific study,

0:02:420:02:45

the first of its kind in the UK.

0:02:450:02:48

-So, am I going to enjoy this?

-I think so.

0:02:480:02:51

Most people do, most people get something positive from it.

0:02:510:02:54

We've had no bad reactions,

0:02:540:02:56

so don't worry about what you're going to feel.

0:02:560:02:58

I think you may enjoy it.

0:02:580:03:00

It's a bit odd, the idea of having your consciousness played with.

0:03:000:03:03

You do it all the time, don't you?

0:03:030:03:06

-Every time you go out and get drunk.

-OK.

0:03:060:03:09

The team are expecting psilocybin to have a significant impact on me.

0:03:100:03:15

So I'm put through physical and psychological tests.

0:03:150:03:20

-Would you describe yourself as a spiritual person?

-A bit.

0:03:200:03:24

-And would you describe yourself as a mystical person?

-No.

0:03:240:03:28

So I'm going to prepare the drug now.

0:03:280:03:30

-See that tiny amount in there? Can you see it?

-Just.

0:03:300:03:34

That's 3.5 milligrams, so you get two milligrams.

0:03:340:03:38

-So that's going to be it?

-That's it, but it does quite a lot.

0:03:380:03:42

Surprisingly enough,

0:03:420:03:44

David Nutt believes that by changing the brain,

0:03:440:03:47

psilocybin's mind-bending properties

0:03:470:03:49

may help psychiatric disorders like depression.

0:03:490:03:53

Before they test it on patients, the team want to find out

0:03:530:03:57

exactly how it acts on normal brains like mine.

0:03:570:04:01

It can be quite traumatic.

0:04:010:04:02

The last volunteer we had had quite a dramatic response.

0:04:020:04:05

He said it was probably the strangest experience of his life.

0:04:050:04:09

-OK.

-He said that, um, his whole sense of self dissolved,

0:04:090:04:13

and he only existed as a concept.

0:04:130:04:16

MICHAEL LAUGHS

0:04:160:04:17

OK, I'm not sure I wish to only exist as a concept. We shall see.

0:04:170:04:22

I am starting to wonder what sort of trip I'm going to have.

0:04:220:04:27

I'm placed in a brain scanner

0:04:270:04:30

so the team can map in detail how my brain reacts to psilocybin.

0:04:300:04:35

Then the drug is injected.

0:04:350:04:38

The result? Well, it would turn out

0:04:390:04:42

to be one of the strangest and most unusual experiences of my life.

0:04:420:04:46

Today, drugs are at the cutting edge of mind control.

0:04:470:04:51

But down the years, scientists have experimented with more unusual ways

0:04:510:04:56

to manipulate our brains.

0:04:560:04:58

There is a long and murky history behind scientific attempts

0:05:010:05:06

to control emotions, behaviour, and decision-making.

0:05:060:05:11

I think some of the most interesting stuff can be traced

0:05:160:05:19

to an accidental discovery by a Russian more than 100 years ago.

0:05:190:05:23

In the 1890s, biologist Ivan Pavlov was studying dogs,

0:05:250:05:30

many of which had been snatched off the streets of St Petersburg.

0:05:300:05:34

His early experiments seemed to have nothing to do with mind control.

0:05:340:05:38

He was studying reflexes in animals.

0:05:380:05:42

Pavlov was interested in saliva, and in particular,

0:05:420:05:45

how much saliva is produced when we eat different sorts of foods.

0:05:450:05:48

Now, he studied it in dogs.

0:05:480:05:51

He operated on his dogs so he could accurately measure

0:05:520:05:56

the saliva they produced when they saw food.

0:05:560:05:59

To Pavlov's immense surprise,

0:05:590:06:02

he discovered that the dogs would salivate

0:06:020:06:04

not just when they got the food,

0:06:040:06:06

but when they simply saw the men in white coats

0:06:060:06:10

who brought them the food.

0:06:100:06:12

This was utterly unexpected.

0:06:120:06:14

Pavlov realised that his dogs had made an association

0:06:150:06:18

between the lab coat and the arrival of food.

0:06:180:06:21

It was this association, rather than the food itself,

0:06:210:06:26

that produced the saliva reflex.

0:06:260:06:28

Pavlov wanted to train other responses.

0:06:290:06:32

As he did so, his experiments became increasingly brutal.

0:06:320:06:36

In this experiment, the beat of the metronome

0:06:360:06:39

is combined with an electric shock.

0:06:390:06:42

BUZZING

0:06:430:06:44

The electric shock is given a number of times,

0:06:440:06:48

until the beat of the metronome alone

0:06:480:06:50

is enough to cause the dog to jump.

0:06:500:06:53
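The pairing logic described here can be sketched in a few lines of Python. This is only an illustration of the idea of a conditioned reflex: the learning rate, the number of pairings and the response threshold are invented for the sketch, not taken from Pavlov's work.

```python
# Minimal sketch of classical conditioning as described above: a neutral cue
# (the metronome) is repeatedly paired with a shock, and the strength of the
# association grows until the cue alone is enough to trigger the response.
# The learning rate and threshold are arbitrary, illustrative values.

learning_rate = 0.3   # assumed: how quickly the association builds per pairing
threshold = 0.8       # assumed: strength at which the cue alone causes a jump
association = 0.0     # at the start, the metronome means nothing to the dog

for pairing in range(1, 11):
    # Each metronome-plus-shock pairing moves the association towards 1.0.
    association += learning_rate * (1.0 - association)
    jumps_to_cue_alone = association >= threshold
    print(f"pairing {pairing:2d}: association={association:.2f} "
          f"-> jumps at metronome alone: {jumps_to_cue_alone}")
```

After a handful of simulated pairings the cue alone crosses the threshold, which is the essence of what the film describes: the response has been transferred from the shock to the metronome.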

The Pavlovian response is one of the most famous discoveries

0:06:560:06:59

in the history of psychology.

0:06:590:07:01

What is less well known

0:07:010:07:03

is that Pavlov didn't confine himself to dogs.

0:07:030:07:06

Disturbingly, he was also involved in experiments on children.

0:07:060:07:11

According to Pavlov,

0:07:110:07:12

our reflexes start being conditioned from the moment we're born.

0:07:120:07:16

As we progress through life,

0:07:160:07:18

more and more of these conditioned reflexes

0:07:180:07:20

are implanted in our brains, building up complex behaviours.

0:07:200:07:25

And once you understand the basics

0:07:250:07:27

of how those behaviours are laid down,

0:07:270:07:30

well, perhaps you can begin to also control them.

0:07:300:07:33

BABY CRIES

0:07:330:07:35

Pavlov's discoveries are fundamental to mind control,

0:07:390:07:43

but they are limited.

0:07:430:07:45

They only condition behaviours that already exist.

0:07:450:07:48

By the time Pavlov died in the mid-1930s,

0:07:510:07:54

new techniques had been developed in America

0:07:540:07:57

that promised far more sophisticated mind manipulation.

0:07:570:08:01

These methods also held the enticing possibility

0:08:040:08:08

of being used on a grand scale.

0:08:080:08:12

The man behind these developments was so controversial

0:08:120:08:15

that he was hung in effigy.

0:08:150:08:17

The vice president described him as "most dangerous".

0:08:170:08:20

And the FBI kept a detailed file on him.

0:08:200:08:23

I'm here to meet Deborah Buzan, daughter of BF Skinner,

0:08:240:08:27

one of the most famous and infamous scientists

0:08:270:08:30

in the history of psychology.

0:08:300:08:32

-Hello.

-Hello.

-Michael Mosley.

0:08:330:08:35

Nice to meet you. Come in.

0:08:350:08:37

He's receiving an honorary degree at Harvard here.

0:08:380:08:41

-He was terribly proud of that.

-Uh-huh.

0:08:410:08:43

And here he is with a rat. SHE LAUGHS

0:08:430:08:47

This, again, is the machine behind with the gazillion wires and things.

0:08:470:08:50

It would all be done very differently today.

0:08:500:08:53

That's my father putting the pigeon

0:08:530:08:56

into what became known as the Skinner Box.

0:08:560:08:59

The key to Skinner's research was the so-called Skinner Box.

0:08:590:09:03

He used it to create new behaviour.

0:09:030:09:06

In this Skinner Box, a pigeon is given a reward

0:09:090:09:12

every time it pecks the word "peck",

0:09:120:09:14

or turns full circle when it sees the word "turn" appear.

0:09:140:09:18

The pigeon has learnt

0:09:220:09:23

that when it performs these actions, it gets food.

0:09:230:09:26

So it continues to do them.

0:09:260:09:28

As it does so, this new behaviour is reinforced.

0:09:300:09:34

This pigeon experiment, and many like it, demonstrated graphically

0:09:360:09:40

that carefully timed rewards

0:09:400:09:42

can reinforce, change or even create new behaviours.

0:09:420:09:48
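Skinner's own work was done with pigeons and apparatus, not code, but the reinforcement loop just described can be imitated in a short Python sketch. The action names, reward size and trial count are assumptions made up for the illustration.

```python
import random

# Toy sketch of operant conditioning as described above: an action that is
# followed by a reward becomes more likely to be repeated, so carefully timed
# rewards gradually shape behaviour. Everything numeric here is illustrative.

random.seed(0)
tendency = {"peck": 1.0, "turn": 1.0, "flap": 1.0}  # equal starting tendencies
rewarded = {"peck", "turn"}                          # only these earn food here

def choose(tendencies):
    # Pick an action with probability proportional to its current tendency.
    total = sum(tendencies.values())
    r = random.uniform(0.0, total)
    for action, weight in tendencies.items():
        r -= weight
        if r <= 0.0:
            return action
    return action  # fallback for floating-point edge cases

for _ in range(200):
    action = choose(tendency)
    if action in rewarded:
        tendency[action] += 0.5  # the food reward strengthens that behaviour

total = sum(tendency.values())
for action, weight in tendency.items():
    print(f"{action:4s}: now chosen with probability ~{weight / total:.2f}")
```

By the end of the run the rewarded actions dominate and the unrewarded one is chosen only rarely: new behaviour created purely by the timing of rewards.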

Deborah is using her father's technique

0:09:500:09:52

to train her cats to play the piano.

0:09:520:09:55

Lola's not exactly a virtuoso,

0:09:570:09:59

but she has learnt that a little tinkle earns a piece of chicken.

0:09:590:10:03

I like the fact that she changes notes.

0:10:040:10:07

Oh, that was good.

0:10:130:10:15

Teaching animals cute tricks is hardly controversial,

0:10:150:10:19

but Skinner had defined a precise and powerful way

0:10:190:10:23

of modifying behaviour and said it should be used on humans.

0:10:230:10:27

He wanted to apply

0:10:270:10:28

his scientific methods of psychological manipulation

0:10:280:10:32

to whole populations.

0:10:320:10:33

He was really, essentially, telling people how to behave,

0:10:330:10:37

how to act, maybe how to create a world

0:10:370:10:41

in which people would act differently

0:10:410:10:43

from the way they already acted.

0:10:430:10:45

And I don't think people like to have their behaviour tinkered with.

0:10:450:10:49

I don't think they like to be told what to do.

0:10:490:10:52

Skinner believed that many of the problems of modern society

0:10:530:10:56

could be improved if a ruling elite

0:10:560:11:00

were to use his methods to control behaviour.

0:11:000:11:04

Using them would reduce violence and improve productivity.

0:11:040:11:08

Skinner's psychologically controlled society brushed aside

0:11:110:11:15

an individual's role in determining and shaping their own destiny.

0:11:150:11:19

He saw belief in free will as unscientific nonsense.

0:11:190:11:23

These views polarised opinion.

0:11:230:11:26

On 3rd May, 1972, the same day that 6000 students cheered him on

0:11:280:11:34

in Michigan, he was hung in effigy at Indiana University.

0:11:340:11:38

He very much wanted to get across his message. I don't know

0:11:380:11:42

if he expected such a reception, which was not all good, of course.

0:11:420:11:47

But, certainly, it made people sit up and think.

0:11:470:11:50

Skinner's utopia never materialised.

0:11:520:11:55

But his methods of behaviour control are now widely used

0:11:550:11:59

in everyday life.

0:11:590:12:02

They have been particularly influential

0:12:020:12:05

in the world of child development,

0:12:050:12:07

where the use of well-targeted rewards to encourage good behaviour

0:12:070:12:10

has become an established parenting technique.

0:12:100:12:13

And for some children, Skinner's methods have proved life-changing.

0:12:130:12:17

I've tried applying some of Skinner's techniques

0:12:190:12:22

to my own kids in a slightly sort of haphazard fashion.

0:12:220:12:25

What I've never done is,

0:12:250:12:27

I've never seen it applied systematically

0:12:270:12:30

and in an organised way.

0:12:300:12:32

And I'm hoping to see that in here.

0:12:320:12:34

In the Jigsaw School for Autistic Children in Surrey,

0:12:380:12:41

the teachers use a system based on Skinner's principles

0:12:410:12:45

called Applied Behaviour Analysis.

0:12:450:12:48

We're going to go and watch Rohit working.

0:12:480:12:52

-Take a seat.

-Thank you.

0:12:520:12:54

Rohit started with us last September,

0:12:540:12:56

so he's almost done a full academic year with us.

0:12:560:13:00

He's actually behaving beautifully this morning,

0:13:000:13:02

but within his repertoire,

0:13:020:13:04

he can assault staff, he likes to pull hair.

0:13:040:13:08

Eight-year-old Rohit is working through a series of learning tasks.

0:13:090:13:13

Each task is broken down into small parts.

0:13:130:13:16

He gets a token for completing each part of the task.

0:13:160:13:20

Point to a cube. That's a cube. Good boy, give me high-five.

0:13:200:13:23

Well done. You can take off a token, that was lovely work.

0:13:230:13:28

Every single correct response here is being reinforced by the teacher.

0:13:280:13:31

Our main focus is,

0:13:310:13:32

if we're constantly praising and reinforcing appropriate behaviour,

0:13:320:13:36

we will see an increase in that appropriate behaviour.

0:13:360:13:39

-It is a mushroom.

-It is a mushroom.

0:13:390:13:42

Good boy, you're trying really hard. Good job, Mister.

0:13:420:13:45

The key is to consistently reward the behaviours and responses

0:13:450:13:49

they want to encourage.

0:13:490:13:51

Rohit is working towards the bigger prize of time in the soft play area.

0:13:510:13:56

So they get a combination of reinforcers, if you like,

0:13:560:14:00

so it's a mixture of praise, pat on the back, and a point

0:14:000:14:04

-which they know leads to a prize?

-Yes.

0:14:040:14:07
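The token system being described, praise and a token for each correct response with a full board exchanged for the bigger reward, can be written out as a simple loop. The board size and the list of tasks below are invented for the illustration; they are not the school's actual programme.

```python
# Minimal sketch of a token economy as described above: every correct response
# earns praise plus a token, and a full token board is traded in for the
# "backup" reinforcer (here, time in the soft play area).

tasks = ["point to a cube", "name the mushroom", "match the picture",
         "point to the red card", "name the animal"]
board_size = 5   # assumed number of tokens needed to earn the prize
tokens = 0

for task in tasks:
    answered_correctly = True   # in this sketch every step goes right
    if answered_correctly:
        tokens += 1
        print(f"'{task}': correct -> praise + token ({tokens}/{board_size})")
    if tokens == board_size:
        print("Board full: exchange tokens for soft-play time, then start again.")
        tokens = 0
```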

Rohit eventually gets his prize - playtime in the soft room.

0:14:090:14:14

But more importantly, he'd sat for over an hour

0:14:140:14:17

learning picture-word associations,

0:14:170:14:20

something that would have been impossible just a year ago.

0:14:200:14:24

Are you a fan of Skinner, then?

0:14:240:14:26

I am a fan of Skinner, yes.

0:14:260:14:29

It is really heart-warming watching the way that Skinner's ideas

0:14:290:14:33

of positive reinforcement are being put into practice

0:14:330:14:37

in such a sort of practical way.

0:14:370:14:39

And I also think that without Skinner,

0:14:390:14:42

these kids' lives would be an awful lot bleaker.

0:14:420:14:46

Skinner believed passionately that behaviour modification

0:14:500:14:53

could and should be used to make the world a better place.

0:14:530:14:57

'A false testimony, really,

0:15:000:15:02

'which I had fabricated to keep them happy.'

0:15:020:15:05

But, in the 1950s, when he was developing his ideas,

0:15:050:15:09

research into mind control had already taken a more sinister turn,

0:15:090:15:14

fuelled by the ruthless needs of war.

0:15:140:15:17

By then, America was fighting the communists in Korea.

0:15:170:15:23

As the conflict raged, a sinister new weapon emerged.

0:15:230:15:27

'I did sign a confession, relating to germ warfare.

0:15:270:15:31

'How can I tell them these things?'

0:15:310:15:34

It first came to light

0:15:340:15:35

when the North Koreans paraded their American prisoners of war.

0:15:350:15:39

'How can I go back and face my family?

0:15:420:15:45

'In a civilised world, how can I tell them these things?

0:15:450:15:50

'That I am a criminal in the eyes of humanity?

0:15:500:15:53

'They are my flesh and blood.'

0:15:530:15:56

These images really disturbed people,

0:15:560:15:58

American soldiers denouncing American aggression.

0:15:580:16:02

How had the North Koreans done it?

0:16:020:16:04

How had they, in such a short period of time,

0:16:040:16:07

managed to convert them to the communist cause?

0:16:070:16:10

'How can I go back and face my family?

0:16:100:16:14

'They are my flesh and blood.

0:16:140:16:16

'..and absurd. I did sign a confession...'

0:16:160:16:20

This was a terrifying new weapon

0:16:200:16:22

that threatened the freedom of the West.

0:16:220:16:26

And they gave this weapon a name.

0:16:260:16:28

Brainwashing.

0:16:280:16:30

'People who used to be in the asylum...'

0:16:300:16:34

A huge effort was made

0:16:340:16:35

to try and uncover the secrets of the communist brainwashers.

0:16:350:16:39

In Britain, an ambitious psychiatrist soon became convinced

0:16:410:16:45

he knew how it was done.

0:16:450:16:47

His name was William Sargant, and his radical approach to psychiatry

0:16:480:16:53

inspired both loyalty and loathing.

0:16:530:16:56

He once said of himself, "Some people say I'm a wonderful doctor,

0:16:560:17:02

"others that I am the work of the devil."

0:17:020:17:06

Sargant had a theory on brainwashing,

0:17:060:17:09

and it was based on the work of none other than Ivan Pavlov.

0:17:090:17:13

Sargant had read about one of Pavlov's lesser-known experiments,

0:17:130:17:17

in which he'd tried to remove his dogs' conditioning.

0:17:170:17:20

Typically, Pavlov had not gone for half measures.

0:17:200:17:23

He had induced nervous breakdowns in his dogs,

0:17:230:17:26

and found that this rapidly and dramatically removed

0:17:260:17:29

the conditioned behaviour he had so painstakingly created.

0:17:290:17:33

Pavlov concluded that, during the breakdown,

0:17:350:17:38

established connections in the dogs' brains were lost.

0:17:380:17:42

When the brain rewired itself,

0:17:420:17:44

it made new connections, leading to dramatic changes in behaviour.

0:17:440:17:48

Sargant thought that Pavlov's work

0:17:500:17:52

was central to understanding brainwashing,

0:17:520:17:55

how somebody could change their core beliefs

0:17:550:17:57

in a very short period of time.

0:17:570:17:59

Just as with the dogs, Sargant believed that when a human brain

0:18:000:18:04

is broken down, connections are lost,

0:18:040:18:07

which then need to be rebuilt.

0:18:070:18:09

He decided to use this insight as the basis of a radical treatment

0:18:120:18:15

for mentally ill patients.

0:18:150:18:17

His aim was simple -

0:18:200:18:21

to break down and then reprogramme their troubled minds.

0:18:210:18:26

Mary Thornton was 21 when she was admitted under Sargant's care.

0:18:270:18:32

My memories are like snapshots.

0:18:340:18:38

One is of the electrodes being attached to the side of my head,

0:18:380:18:44

being given a general anaesthetic,

0:18:440:18:47

seeing an image of myself in the mirror one day,

0:18:470:18:51

seeing a strange face looking back at myself.

0:18:510:18:55

And being really, really frightened that I would never get out.

0:18:560:19:02

40 years on, Mary has returned

0:19:100:19:12

to the Royal Waterloo Hospital in London, once part of St Thomas'.

0:19:120:19:16

This... It... One of these rooms is the room I recovered in.

0:19:170:19:23

And it was on the left-hand side.

0:19:230:19:25

And it could have been that room.

0:19:250:19:28

It could have been a room exactly like that.

0:19:280:19:32

Mary's parents had admitted her to the hospital

0:19:320:19:35

following a series of rows over a new boyfriend.

0:19:350:19:38

I understood that I was there

0:19:380:19:41

to be treated for an acute anxiety state.

0:19:410:19:46

And I was told that I would be given sleep treatment to help my, erm...

0:19:460:19:52

..head to have a nice, long rest.

0:19:540:19:57

Mary, and other patients, became unwitting guinea pigs,

0:19:570:20:01

testing Sargant's more radical ideas.

0:20:010:20:04

Every available method is used here, all at the same time.

0:20:040:20:09

Sargant believed it was essential to use a wide range of approaches

0:20:090:20:13

if he was to break down established patterns of thinking.

0:20:130:20:18

He began with a process of "modified narcosis".

0:20:180:20:22

Patients were put into a medicated sleep for up to three months.

0:20:220:20:27

The patients are woken three times a day for feeding,

0:20:270:20:31

and to check on whether they're getting better.

0:20:310:20:34

They were then given copious amounts of drugs,

0:20:340:20:37

combined with bouts of electric shock therapy, or ECT.

0:20:370:20:41

-OK? Go.

-110 volts for three seconds

0:20:410:20:45

through the frontal lobes of the brain.

0:20:450:20:49

Sargant was convinced he had to use all the tools available to him

0:20:490:20:54

if he was to wipe the slate clean.

0:20:540:20:56

He had also suffered from depression,

0:20:560:20:58

and knew it could be life-threatening,

0:20:580:21:00

so the risks were worth it.

0:21:000:21:02

In all my 30 years, I've tried never to use any method

0:21:020:21:06

that I wouldn't use on myself.

0:21:060:21:08

All that matters is,

0:21:080:21:09

don't do unto others what you wouldn't have done to yourself.

0:21:090:21:13

For many patients like Mary,

0:21:170:21:19

this barrage of treatments had a dramatic effect on their minds.

0:21:190:21:23

I only remember that I forget.

0:21:230:21:25

Everything.

0:21:280:21:29

That's what ECT made you do.

0:21:290:21:33

In this building, I saw her maybe four times,

0:21:330:21:37

I visited maybe four times.

0:21:370:21:40

And she progressively got less aware of who I was.

0:21:420:21:47

On one occasion, she didn't know me from Adam.

0:21:470:21:52

After three months at the Royal Waterloo, Mary was discharged.

0:21:520:21:58

Sargant's belief that her brain, shaken and shocked,

0:21:580:22:00

would reassemble itself, free from the shackles of her past, was wrong.

0:22:000:22:05

When I left, I had great big black holes in my memory.

0:22:050:22:12

I was sent to convalesce to my brother in Yorkshire.

0:22:120:22:16

A week or so into my stay,

0:22:160:22:18

I suddenly remembered that I had a boyfriend,

0:22:180:22:21

and I found his telephone number. And I phoned him up.

0:22:210:22:26

And he came to meet me in London.

0:22:260:22:30

Lucky for me!

0:22:300:22:32

And we've been together ever since.

0:22:320:22:34

We had her on two grams.

0:22:370:22:40

-That's 2,000 milligrams.

-Yes.

0:22:400:22:42

Sargant had shown that with drugs and ECT,

0:22:420:22:46

he could disrupt and destroy memories.

0:22:460:22:49

Yet with patients like Mary,

0:22:490:22:51

he failed to break patterns of behaviour

0:22:510:22:54

and alter personal beliefs.

0:22:540:22:56

The human mind had proved remarkably resilient to change.

0:22:560:23:00

And without these new treatments, we did have psychotherapy

0:23:000:23:03

and psychoanalysis...

0:23:030:23:04

Sargant's theories on brainwashing were flawed,

0:23:040:23:07

and rejected by many British psychiatrists.

0:23:070:23:10

'How can I go back and face my family?'

0:23:100:23:14

But across the Atlantic, in the United States, a frantic search

0:23:140:23:18

to learn the secrets of communist mind control continued unabated.

0:23:180:23:22

'They're my flesh and blood.'

0:23:220:23:25

In response to the threat of brainwashing, the CIA had set up

0:23:250:23:29

their own secret operation, code name MK-ULTRA.

0:23:290:23:34

MK-ULTRA employed many of the leading scientists of the day.

0:23:350:23:39

From 1953, they embarked on a series of increasingly bizarre experiments

0:23:390:23:44

designed to replicate the communist brainwashing techniques.

0:23:440:23:48

They subjected people to sensory deprivation,

0:23:480:23:52

total isolation, electric shocks.

0:23:520:23:55

One research area that seemed promising was drugs,

0:23:550:23:59

and one class of drug stood out from all the rest.

0:23:590:24:03

They seemed to change the world.

0:24:050:24:07

They distorted vision, and they warped sound.

0:24:100:24:14

They created vivid hallucinations.

0:24:160:24:18

Hallucinogenic drugs like LSD and psilocybin had only recently

0:24:220:24:26

been discovered, and were the subject of intense research.

0:24:260:24:30

They held out the enticing possibility

0:24:340:24:36

of controlling another person's mind.

0:24:360:24:38

Perhaps they could also change your core beliefs.

0:24:380:24:43

The top-secret operatives from MK-ULTRA experimented on themselves,

0:24:430:24:49

their colleagues and on unsuspecting members of the public.

0:24:490:24:52

And the American military

0:24:520:24:54

also attempted to harness their mind-bending powers.

0:24:540:24:57

-How do you feel?

-I could run 100 miles right now.

0:24:590:25:03

-Is that right?

-That's right.

-Pretty pepped up, huh?

0:25:030:25:05

But, while the drugs were certainly mind-altering,

0:25:050:25:10

they were useless as a form of brainwashing.

0:25:100:25:13

The effects were too temporary, too unpredictable

0:25:130:25:16

to reliably alter an individual.

0:25:160:25:19

The huge effort to produce

0:25:190:25:21

a foolproof method of brainwashing had failed.

0:25:210:25:25

And following the end of the Korean War,

0:25:250:25:28

it emerged that communist techniques were also flawed.

0:25:280:25:31

Many of the soldiers who had denounced America while in captivity

0:25:310:25:34

rediscovered their patriotism when they got home.

0:25:340:25:39

Brainwashing turned out to be an illusory weapon of war.

0:25:390:25:44

For many years, there was diminished scientific interest

0:25:440:25:47

in the mind-altering properties of hallucinogenic drugs.

0:25:470:25:50

Now, half a century on, things are changing.

0:25:530:25:57

As part of an experiment, I have been injected with psilocybin,

0:25:570:26:01

the active component of magic mushrooms.

0:26:010:26:03

Professor Nutt and his team look on as the drug races through my system.

0:26:030:26:09

They want to understand exactly how the drug works on my brain.

0:26:090:26:14

Ten on that scale is extremely intense.

0:26:140:26:18

How anxious he feels now...

0:26:180:26:20

He'll be hallucinating, certainly.

0:26:210:26:24

He might be seeing colours and swirling patterns.

0:26:240:26:27

Probably some bodily feelings as well.

0:26:270:26:29

His body might feel slightly different.

0:26:290:26:31

Maybe a kind of floating sensation.

0:26:310:26:34

Inside the machine, I was convinced I could levitate,

0:26:340:26:39

make the walls of the scanner dissolve

0:26:390:26:42

and fly up to the stars at supersonic speed.

0:26:420:26:46

It was all rather beautiful.

0:26:460:26:50

His sense of time has slowed down.

0:26:500:26:52

OK, Michael, we're all finished, I'll come and get you out.

0:26:520:26:57

Thank you.

0:26:570:26:59

'Although the effects of psilocybin had largely worn off,

0:26:590:27:02

'I had an uncontrollable urge to talk. And talk.'

0:27:020:27:06

That was very strange, I have to say, because I thought,

0:27:060:27:10

"Whoa!" and then, "Whoo!"

0:27:100:27:11

It was really sort of, whoosh, take-off.

0:27:110:27:13

'..And talk. And talk.'

0:27:130:27:18

It took me back to when I was eight years old

0:27:180:27:21

and I used to rub my eyeballs

0:27:210:27:22

in an attempt to have religious experiences.

0:27:220:27:25

You felt it, when it came,

0:27:250:27:27

it was just like being in a sort of Star Trek thing,

0:27:270:27:30

going into hyperspace, and "whoo".

0:27:300:27:32

As soon as you start thinking even the slightest bad thoughts,

0:27:320:27:35

you think, "This is really, really...

0:27:350:27:37

"I don't want to go there."

0:27:370:27:38

And actually, dragging yourself back from it is very difficult.

0:27:380:27:42

I'm still feeling very strange.

0:27:420:27:44

But you want to go out and share it with everyone.

0:27:440:27:47

You want to go and say, "Hello!" and talk about it.

0:27:470:27:50

My results will be merged with scans from other volunteers

0:27:500:27:54

to identify the precise areas inside the brain where the drug is active.

0:27:540:27:59

And you see there are essentially three big, blue blobs.

0:27:590:28:04

What blue means is that

0:28:040:28:06

-the activity of the brain has shut down.

-Switched off!

0:28:060:28:10

When we went into this study, we thought that this drug

0:28:100:28:13

would activate, certainly different parts of the brain to those,

0:28:130:28:17

but we don't find any activation anywhere.

0:28:170:28:20

All we find are reductions in blood flow.

0:28:200:28:22

David Nutt believes that the areas of the brain

0:28:220:28:26

that are being switched off

0:28:260:28:27

are critical to the experience I've just had.

0:28:270:28:30

The blobs in the middle

0:28:300:28:32

are the part of the brain that tells us who and what we are.

0:28:320:28:36

When these were dampened down, I was released from everyday constraints.

0:28:360:28:39

You were able to break free

0:28:390:28:41

from the normal constraints of what you are -

0:28:410:28:44

"I'm a father, I've got to go home in an hour and a half, I've got to make this programme."

0:28:440:28:49

The fact that it dampens activity in these areas

0:28:490:28:51

may provide clues

0:28:510:28:52

as to how psilocybin could be used to treat depression.

0:28:520:28:56

One of the things we think is that,

0:28:560:28:58

if in conditions like depression or obsessive compulsive disorder,

0:28:580:29:03

where people get locked into a mindset which is maladaptive,

0:29:030:29:09

-these regions may be overactive.

-Ah.

0:29:090:29:12

Maybe dampening those down

0:29:120:29:15

will help people move into another mindset

0:29:150:29:18

which might be better and healthier.

0:29:180:29:20

If you wanted to shift somebody who's profoundly depressed

0:29:200:29:23

out of that mindset, something like this might conceivably...

0:29:230:29:26

It's conceivable. We need to do the experiments.

0:29:260:29:30

That's part of the rationale.

0:29:300:29:32

It helps take you from this rigid, motoric process

0:29:320:29:36

of thinking, into something that may be more positive.

0:29:360:29:40

'The idea that you could use a psychedelic drug like psilocybin

0:29:410:29:46

'to treat depression is certainly surprising.

0:29:460:29:48

'If it works, one advantage is, it works fast.'

0:29:480:29:52

You changed immediately. Whether it would produce immediate changes

0:29:520:29:55

in a depressed person, we don't know.

0:29:550:29:57

If it did, it would be wonderful, wouldn't it?

0:29:570:30:00

If particularly, you could hold on to those and maintain them.

0:30:000:30:04

It may turn out that hallucinogenic drugs

0:30:060:30:08

do have a part to play in mind control.

0:30:080:30:11

As a therapeutic tool, to help patients think more positively,

0:30:110:30:16

it's a far cry from the ambitions of would-be brainwashers.

0:30:160:30:19

'"How is it possible?" I ask myself.'

0:30:220:30:25

In the 1960s, a new discovery would uncover a far more pervasive

0:30:250:30:30

and effective way of controlling people.

0:30:300:30:32

It was far more subtle and almost impossible to escape.

0:30:330:30:38

But he might be dead in there!

0:30:380:30:41

The new approach emphasised the importance of social pressure,

0:30:420:30:45

of the social context you find yourself in.

0:30:450:30:48

To find out more,

0:30:480:30:50

psychologists devised elaborate human experiments,

0:30:500:30:54

sometimes with shocking results.

0:30:540:30:56

The son of Jewish immigrants from Eastern Europe,

0:31:020:31:05

Stanley Milgram wanted

0:31:050:31:06

to understand why so many people took part in the perpetration

0:31:060:31:10

of terrible war crimes.

0:31:100:31:12

Milgram was curious to find out

0:31:170:31:19

what drove German soldiers to participate in the Holocaust.

0:31:190:31:22

How is it possible, I ask myself,

0:31:250:31:27

that ordinary people, courteous and decent in everyday life,

0:31:270:31:30

can act callously, inhumanely,

0:31:300:31:33

without any limitations of conscience?

0:31:330:31:36

Milgram described the moment he conceived his experiment

0:31:380:31:41

as "incandescent".

0:31:410:31:42

It was beautifully, almost fiendishly designed

0:31:420:31:46

to reveal uncomfortable flaws about human nature.

0:31:460:31:49

It made a huge impact worldwide.

0:31:490:31:51

I remember, as a teenager, reading about it and later watching footage.

0:31:510:31:56

It had an enormous impact on me.

0:31:560:31:58

It influenced my later decision to train as a doctor

0:31:580:32:01

with the intention of becoming a psychiatrist.

0:32:010:32:03

It's nearly 50 years since the original experiment and today,

0:32:030:32:08

I'm fortunate enough to be meeting one of the few remaining survivors.

0:32:080:32:12

-Good morning.

-Morning.

0:32:140:32:16

-Hello, Michael Mosley.

-Hi, Michael.

0:32:160:32:18

-Bill Menold. Nice to meet you.

-Thank you very much.

0:32:180:32:20

'In 1962, Bill Menold had recently left the army and was working in New Haven.'

0:32:200:32:26

I happened to see an ad in the New Haven Register

0:32:260:32:29

and it said "memory and learning experiment".

0:32:290:32:33

They were going to pay you $4. I thought,

0:32:330:32:37

"Why wouldn't I do this?"

0:32:370:32:39

I've got some footage here from Yale University.

0:32:390:32:42

'Bill's participation in the experiment wasn't filmed, but other volunteers were.'

0:32:420:32:47

-This guy here...

-He was the pupil.

0:32:470:32:51

This is basically the... experimenter.

0:32:510:32:54

-The experimenter.

-The doctor.

0:32:540:32:57

The experiment involved a teacher and a learner,

0:32:570:33:01

who were given a simple set of memory tasks.

0:33:010:33:04

If the learner got the answers wrong,

0:33:060:33:09

the teacher had to give him electric shocks, which steadily increased.

0:33:090:33:14

Were you surprised when they talked about electric shocks?

0:33:140:33:17

I wasn't aghast or anything,

0:33:170:33:19

but I was, "Oh, electric shock, what a novel way to do it.

0:33:190:33:24

"Who knows what they'll think of next?"

0:33:240:33:26

What Bill and the other volunteers weren't told

0:33:270:33:30

was that the electric shocks

0:33:300:33:31

were fake and that both the experimenter and the learner

0:33:310:33:35

were, in fact, actors.

0:33:350:33:36

The real purpose of the experiment was to see how many electric shocks

0:33:380:33:42

the volunteers like Bill would administer.

0:33:420:33:45

It's very convincing, isn't it?

0:33:470:33:49

Oh, it is Academy Award nomination.

0:33:490:33:52

'Bill and the other volunteers were asked to increase the voltage

0:33:520:33:56

'every time the learner got an answer wrong.'

0:33:560:33:59
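The procedure being described follows a fixed escalation rule: every wrong answer, the next switch up. The shock generator is widely reported to have run from 15 to 450 volts in 15-volt steps, and the experimenter answered hesitation with scripted prods like the ones heard in the footage; the point at which the "teacher" starts hesitating in this sketch is an arbitrary assumption.

```python
# Sketch of the escalation procedure described above: each wrong answer moves
# the "teacher" one 15-volt switch up the board, and a hesitating teacher is
# met with the experimenter's scripted prods. Illustrative only.

STEP_VOLTS = 15
MAX_VOLTS = 450
PRODS = [
    "Please continue.",
    "The experiment requires that we continue.",
    "It is absolutely essential that you continue.",
    "You have no other choice, you must go on.",
]

volts = 0
wrong_answers = 0
while volts < MAX_VOLTS:
    wrong_answers += 1              # in this sketch the learner always errs
    volts += STEP_VOLTS
    print(f"Wrong answer {wrong_answers:2d}: administer {volts} volts")
    if volts >= 150:                # assumed point where the teacher protests
        prod = PRODS[min(len(PRODS) - 1, (volts - 150) // 90)]
        print(f"   Experimenter: {prod}")

print(f"The final {MAX_VOLTS}-volt switch is reached after "
      f"only {wrong_answers} wrong answers.")
```

Thirty wrong answers are all it takes to reach the top of the board, which is part of why the gradual, one-switch-at-a-time design proved so hard to step away from.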

You're now getting a shock of 75 volts.

0:33:590:34:03

Ow!

0:34:030:34:04

180 volts.

0:34:040:34:07

Ow!

0:34:070:34:08

Just how far can you go on this thing?

0:34:080:34:11

Despite the apparent pain they were inflicting,

0:34:110:34:15

most continued to increase the shocks.

0:34:150:34:18

Incorrect.

0:34:200:34:21

150 volts.

0:34:230:34:25

Ah!

0:34:250:34:26

If you're sitting in that chair

0:34:260:34:29

with this stuff going on,

0:34:290:34:30

and the pressure that you're under, it's very hard to think clearly.

0:34:300:34:37

I'm up to 180 volts!

0:34:370:34:39

Please continue, teacher.

0:34:390:34:42

Neil, you're going to get a shock, 180 volts.

0:34:420:34:46

-SWITCH CLICKS

-Ow!

0:34:460:34:48

Who's going to take the responsibility

0:34:480:34:50

if anything happens to that gentleman?

0:34:500:34:52

I'm responsible for anything that happens here.

0:34:520:34:55

Continue, please.

0:34:550:34:57

Milgram asked colleagues beforehand how many people they thought

0:34:570:35:00

would administer the lethal 450-volt shock.

0:35:000:35:03

Most said about 1%, and those would probably be psychopaths.

0:35:030:35:08

195 volts. Dance!

0:35:080:35:12

Ow! Let me out of here!

0:35:120:35:15

Then they really cranked it up.

0:35:150:35:19

You were talking about traumatic experiences.

0:35:190:35:22

I've never had anything before or since that was like that.

0:35:220:35:27

Start with blue, please, at the top of the page.

0:35:270:35:30

Continue, please, teacher.

0:35:300:35:31

Did you at any point think that perhaps you'd killed him?

0:35:310:35:35

Yes.

0:35:350:35:37

When he stopped responding.

0:35:370:35:39

The experiment requires that we continue. Go on, please.

0:35:390:35:42

Don't the man's health mean anything?

0:35:420:35:45

-Whether the learner likes it or not...

-He might be dead in there!

0:35:450:35:49

I shut off my moral...

0:35:490:35:51

compass.

0:35:530:35:55

Continue, please.

0:35:550:35:56

435 volts.

0:35:570:35:59

Once you make the decision...

0:36:010:36:04

you've made your decision.

0:36:050:36:08

Let's get this show on the road and get it over with.

0:36:080:36:11

The answer is "horse".

0:36:110:36:13

450 volts.

0:36:130:36:15

Next word.

0:36:170:36:18

Bill, like two thirds of the volunteers,

0:36:190:36:22

gave the lethal electric shock.

0:36:220:36:25

I thought Bill was very honest,

0:36:280:36:30

particularly when he started to talk about that moment

0:36:300:36:33

when he abandoned his moral compass

0:36:330:36:35

and he handed over responsibility for his actions to the experimenter.

0:36:350:36:39

I guess that's what Milgram demonstrated.

0:36:390:36:42

We'd like to believe we're autonomous creatures,

0:36:420:36:45

that we follow some sort of moral principles.

0:36:450:36:48

But what Milgram showed is that, in the right circumstances,

0:36:480:36:51

you can persuade someone to do almost anything.

0:36:510:36:56

According to Milgram, Bill Menold and the other participants

0:36:560:37:00

who administered lethal shocks were not psychopaths.

0:37:000:37:03

They had simply demonstrated

0:37:030:37:05

a striking willingness to obey authority.

0:37:050:37:09

-Wrong.

-It was this willingness to obey authority

0:37:090:37:12

that shaped and controlled their behaviour.

0:37:120:37:15

-SWITCH CLICKS

-Ah!

0:37:150:37:17

Milgram said it didn't matter if you were German or American,

0:37:180:37:22

the power of authority remained the same.

0:37:220:37:25

Milgram's experiments catapulted him to worldwide fame,

0:37:310:37:34

and he used his numerous television appearances to spread his ideas

0:37:340:37:39

to a wider audience.

0:37:390:37:40

Now, one of the illusions about human behaviour

0:37:400:37:44

is that it stems entirely from personality or character.

0:37:440:37:47

But social psychology shows us that often,

0:37:470:37:50

behaviour is dominated

0:37:500:37:51

by the social roles we're asked to play.

0:37:510:37:53

Revealing the pervasive

0:37:550:37:56

and sometimes malign influence of authority

0:37:560:37:59

chimed well with the anti-authority sentiments of the late 1960s.

0:37:590:38:03

But Milgram's own motivation was not mistrust of authority.

0:38:060:38:10

He wanted to understand WHY authority has such a hold over us.

0:38:100:38:15

To do this, he took to the streets.

0:38:160:38:19

-Yes, sir, where to?

-33 West 42nd.

0:38:200:38:23

Milgram started by pointing out how, unquestioningly,

0:38:230:38:26

we obey authority figures in a whole range of different situations.

0:38:260:38:31

In this setting, I allow things to be done to me

0:38:310:38:35

that I wouldn't allow in any other context.

0:38:350:38:37

The dentist is about to put his fist and electric drill into my mouth.

0:38:370:38:42

Open, please.

0:38:430:38:45

In this setting,

0:38:460:38:48

I willingly expose my throat to a man with a razor blade.

0:38:480:38:53

What he did next would throw new light on why we're so obedient.

0:38:530:38:59

Milgram wanted to see how people would behave

0:38:590:39:01

in a situation where there was no obvious authority.

0:39:010:39:04

How would you react if a perfect stranger came up to you

0:39:040:39:08

on the train and said, "I'd like your seat, please?"

0:39:080:39:12

Well, this is exactly what Milgram and his students did

0:39:120:39:15

on the New York subway.

0:39:150:39:17

Now, if you ask a New Yorker,

0:39:170:39:19

would he give up his seat to a man who gives no reason for asking,

0:39:190:39:23

he'd say, "Never." But what would he do? We photographed what happened.

0:39:230:39:27

Excuse me, ma'am, may I have your seat, please?

0:39:270:39:30

-All right.

-Thank you.

0:39:300:39:32

Excuse me, sir, may I have your seat, please?

0:39:320:39:36

-I guess so.

-Thank you very much.

0:39:360:39:38

But would it work today?

0:39:410:39:43

I decided to repeat his experiment in a London shopping centre.

0:39:430:39:47

Excuse me, could I have that seat, please?

0:39:470:39:50

-Huh?

-Could I have that seat, please?

0:39:500:39:53

-Yeah?

-Could I sit there?

-Why?

0:39:530:39:55

I'd like to sit down.

0:39:550:39:57

If you gave me a reason...

0:39:570:39:59

-Excuse me, could I have that seat, please?

-Yeah, sure.

0:39:590:40:04

Thank you very much.

0:40:040:40:05

-Excuse me, could I have your seat, please?

-Yeah, sure.

0:40:080:40:13

Thank you very much.

0:40:130:40:15

I was surprised by how many people complied

0:40:190:40:22

with my totally unreasonable request - about half.

0:40:220:40:25

Excuse me, could I have that seat please?

0:40:260:40:29

-Yeah.

-OK, thank you.

0:40:290:40:30

Excuse me, could I sit there, please?

0:40:300:40:35

-Why do you want to sit here?

-I want to sit down.

-Pardon?

0:40:350:40:39

-I want to sit down.

-Why can't you sit here?

-Can I sit there?

0:40:390:40:44

-Is that a yes or a no?

-No.

0:40:440:40:47

But what really surprised me

0:40:500:40:52

was just how difficult I personally found the whole experience.

0:40:520:40:56

My real feeling at this moment is one of just profound relief

0:40:560:41:02

that it's done.

0:41:020:41:03

I felt really uncomfortable throughout the whole thing.

0:41:030:41:07

This is also what Milgram felt.

0:41:070:41:09

I stood in front of a passenger and I was about to say,

0:41:090:41:13

"Excuse me, sir, may I have your seat?"

0:41:130:41:15

I found something very interesting. There was an enormous inhibition.

0:41:150:41:19

The words wouldn't come out. I simply couldn't utter them.

0:41:190:41:22

There was a terrible constraint against saying this simple phrase.

0:41:220:41:25

Milgram concluded that feeling socially awkward

0:41:250:41:29

and embarrassed plays an important role in governing our behaviour.

0:41:290:41:33

We don't like breaking the social rules,

0:41:330:41:35

whether it's asking for someone's seat for no good reason,

0:41:350:41:39

or disobeying the instruction of somebody

0:41:390:41:41

whose authority we have accepted.

0:41:410:41:42

In ordinary everyday situations,

0:41:420:41:45

there's an implicit set of rules as to who's in charge.

0:41:450:41:48

If we violate these rules, it leads to the disruption of behaviour,

0:41:480:41:51

to embarrassment, awkwardness, to a kind of social chaos.

0:41:510:41:54

Ordinarily, we accept the submissive role that the occasion requires.

0:41:540:42:00

According to Milgram,

0:42:020:42:04

compliance in society and obedience to authority is necessary.

0:42:040:42:09

Our social organisation depends on it.

0:42:090:42:11

If we're to have a society,

0:42:110:42:14

we must have members of society willing to obey its laws.

0:42:140:42:19

It's not simply a question

0:42:190:42:21

of whether or not we want to be controlled.

0:42:210:42:24

According to Milgram, our social organisation demands it.

0:42:240:42:27

As a means of control, there's no doubt that social pressure

0:42:360:42:39

is ever-present and highly effective.

0:42:390:42:43

But social pressure, like drugs or behaviour modification,

0:42:440:42:48

or even Pavlovian training,

0:42:480:42:50

wasn't able to deliver total mind control.

0:42:500:42:52

There was, however, still one group of scientists who believed

0:42:520:42:57

they could control behaviour and emotions with pinpoint accuracy.

0:42:570:43:01

To do so, they would have to develop techniques

0:43:040:43:08

to go directly into the brain.

0:43:080:43:11

CRACKLING

0:43:110:43:13

Are you sleepy or awake or what?

0:43:140:43:17

I'm awake now.

0:43:170:43:18

The origins of this approach lie in a series of bizarre experiments

0:43:180:43:23

conducted by an ambitious doctor in the American Deep South.

0:43:230:43:27

Psychiatrist Robert Heath was a maverick and a pioneer.

0:43:280:43:32

From the early 1950s onwards, in New Orleans, he used electrodes

0:43:320:43:37

to stimulate and also to map out his patients' brains.

0:43:370:43:42

DOORBELL RINGS

0:43:420:43:44

-John! Hello, buddy.

-Welcome, good to see you.

0:43:440:43:47

John Goethe and James Eaton

0:43:470:43:49

worked with Heath on many of his more ambitious experiments.

0:43:490:43:53

He wanted to be a discoverer.

0:43:530:43:55

-He wanted to find a new path.

-But there was no question

0:43:550:44:01

that there was, um...

0:44:010:44:03

a light side and a dark side.

0:44:030:44:06

I really loved the guy,

0:44:060:44:07

but if you love someone,

0:44:070:44:09

and you're very close to them,

0:44:090:44:10

you see all sorts of warts.

0:44:100:44:12

He had plenty of warts.

0:44:120:44:14

Heath was not afraid of controversy.

0:44:160:44:18

He passionately believed you had to experiment on humans

0:44:180:44:22

to make truly significant medical breakthroughs.

0:44:220:44:25

He conducted his most controversial research in the state of Louisiana.

0:44:270:44:33

At the time, the consent process was not as rigorous as it is now.

0:44:330:44:38

His goal was ambitious - to see if he could alter behaviour

0:44:390:44:43

by sticking electrodes into people's brains.

0:44:430:44:46

Heath was inspired by experiments that had been done on rats.

0:44:490:44:53

Electrodes were implanted directly into areas of the rat's brain

0:44:540:44:58

believed to be responsible for pleasure.

0:44:580:45:02

When the rat pressed the lever, it stimulated these areas.

0:45:020:45:07

The scientists found that this process

0:45:070:45:09

could dramatically change the rat's normal behaviour.

0:45:090:45:11

BUZZING

0:45:140:45:15

This rat is prepared to run across an electrified floor

0:45:150:45:18

to reach the lever, press it, and get a pleasure kick.

0:45:180:45:22

Heath decided to take the extreme step of trying this on humans.

0:45:270:45:32

How do you feel?

0:45:340:45:37

It's...like coming on.

0:45:370:45:41

Hmm?

0:45:410:45:42

-Like coming on.

-Like something coming on?

0:45:420:45:45

What do you mean?

0:45:450:45:47

'The depth electrode procedure'

0:45:490:45:52

involved placing these very thin wires through the skull

0:45:520:45:57

and into precisely located areas

0:45:570:46:00

thought to be important in the regulation of emotion.

0:46:000:46:04

This idea that the emotions are deep in the temporal lobes,

0:46:040:46:08

rather than being on the surface,

0:46:080:46:10

was an idea shared by many,

0:46:100:46:12

but he's the one who actually put a needle, put an electrode in,

0:46:120:46:17

in there and turned on the juice.

0:46:170:46:19

Can you keep pushing this yourself?

0:46:190:46:22

-Yes, sir.

-All right, push this button.

0:46:220:46:25

'They had these little boxes'

0:46:250:46:27

and they had the electrodes in their head.

0:46:270:46:31

Whenever they felt they needed a little goose or something,

0:46:310:46:35

again, I'm simplifying it, but they could press it.

0:46:350:46:38

All right, push this button.

0:46:380:46:40

You know how to work this.

0:46:400:46:42

Heath's early patients were schizophrenics,

0:46:420:46:45

severe epileptics and depressives.

0:46:450:46:47

I feel like something is coming on.

0:46:470:46:50

-All right. Does it feel? How does it feel? Good or bad?

-Good.

0:46:500:46:54

-Good?

-Yes, sir.

0:46:540:46:56

Like the rats,

0:46:560:46:57

these human patients were able to stimulate their own brains

0:46:570:47:01

and use this pleasurable sensation to alleviate their symptoms.

0:47:010:47:05

-Well, I think it's... I think it's somewhat of a sexy button.

-Yeah?

0:47:080:47:14

Encouraged by his results, Heath began to explore ways of using

0:47:140:47:17

deep brain stimulation to create new behaviours.

0:47:170:47:21

His most infamous experiment was on a 24-year-old man

0:47:230:47:27

with a long history of depression who had never had sex with a woman.

0:47:270:47:31

He approached Heath and asked him to change him,

0:47:330:47:36

to change his behaviour,

0:47:360:47:38

because he was gay and he wanted to become straight.

0:47:380:47:42

At the time, being gay was regarded as a psychiatric illness,

0:47:420:47:45

so Heath agreed.

0:47:450:47:46

He wasn't the first to do this. Other people had tried,

0:47:460:47:51

but nobody had done so using electrical stimulation.

0:47:510:47:54

Heath implanted a series of electrodes

0:47:550:47:57

into the pleasure regions of the man's brain.

0:47:570:48:00

Then he was shown some soft porn.

0:48:000:48:03

The idea was to link, in the man's mind, the pleasure he was getting

0:48:050:48:09

from the electrical stimulation

0:48:090:48:11

with the heterosexual behaviour he was watching on the screen.

0:48:110:48:16

At the end of the three-week period,

0:48:180:48:21

the man reported that he had growing sexual interest in women.

0:48:210:48:25

So Heath decided to go to the next stage. Now, this really is bizarre.

0:48:250:48:31

What he did is, he hired a prostitute for $50,

0:48:310:48:35

a woman he called a "lady of the night".

0:48:350:48:38

Her job was to have sex with a gay man

0:48:380:48:41

with electrodes sticking out of his head.

0:48:410:48:44

The wires led to recording devices in the next room,

0:48:440:48:47

where the psychiatrists were keeping a record

0:48:470:48:49

of what was going on inside his skull.

0:48:490:48:52

Well, astonishingly enough, I mean, I'm absolutely gobsmacked,

0:48:540:48:58

but astonishingly enough, according to Heath,

0:48:580:49:01

intercourse was successful.

0:49:010:49:04

In fact, the man said he'd had so much fun,

0:49:040:49:07

got so much pleasure out of it,

0:49:070:49:10

he would like to go and do it again and again.

0:49:100:49:14

Unlikely as it sounds,

0:49:140:49:16

Heath's use of electrode stimulation seemed to have worked.

0:49:160:49:20

But his patient's desire to become heterosexual

0:49:200:49:22

wasn't entirely fulfilled.

0:49:220:49:24

The young man went off and he did actually begin

0:49:260:49:29

an affair with a married woman that lasted for ten months.

0:49:290:49:32

But he also continued to have sex with men.

0:49:320:49:35

So a rather mixed outcome,

0:49:350:49:37

and Heath himself never tried to repeat the experiment.

0:49:370:49:41

-Huh?

-Ooh...

-You all right?

-I don't like that.

0:49:440:49:46

You don't like that?

0:49:460:49:48

Heath's methods were primitive,

0:49:480:49:50

but they did pave the way for some important medical developments.

0:49:500:49:54

Today, the technique of deep brain stimulation has been refined

0:49:570:50:02

and improved, and is now used to treat a range of conditions.

0:50:020:50:05

It has been particularly successful for sufferers of Parkinson's.

0:50:080:50:13

Electrodes are placed in the areas of the brain

0:50:140:50:17

that control movement and help to reduce symptoms.

0:50:170:50:20

However, implanting electrodes involves major surgery

0:50:220:50:26

and the risks are considerable.

0:50:260:50:29

Although it is a huge improvement

0:50:300:50:32

on what went before, electrical stimulation is still fairly crude.

0:50:320:50:36

Even the most carefully positioned of electrodes

0:50:360:50:39

stimulates a large area of brain matter,

0:50:390:50:42

consisting of thousands of cells.

0:50:420:50:44

But just recently, the technology has moved on.

0:50:440:50:47

Here at Oxford University,

0:50:490:50:51

they have found new ways to manipulate the brain

0:50:510:50:55

and control behaviour using a technique called optogenetics.

0:50:550:50:59

With optogenetics, scientists can control the brain

0:51:010:51:04

simply using light.

0:51:040:51:05

I'm here to meet neuroscientist Gero Miesenbock,

0:51:050:51:09

who's been refining his technique on flies.

0:51:090:51:13

-Hello, there.

-Hello.

0:51:130:51:15

-Michael Mosley.

-Gero Miesenbock.

0:51:150:51:17

I gather you are lord of the flies?

0:51:170:51:19

In some circles, I am, yes.

0:51:190:51:21

And how many flies do you have here?

0:51:210:51:24

We haven't really counted them, but we have thousands of, erm,

0:51:240:51:28

vials like these.

0:51:280:51:29

Each vial has maybe 50-100 flies in them, so you do the math.

0:51:290:51:32

It's probably millions buzzing around somewhere.

0:51:320:51:35

-Millions. You are lord of the flies!

-Yeah.

0:51:350:51:38

Miesenbock earned his nickname

0:51:400:51:42

through his ability to control and manipulate the choices flies make.

0:51:420:51:47

We've come quite far in controlling not only simple motor acts

0:51:470:51:52

of the animals, but actually their psychology in rather profound ways.

0:51:520:51:57

I don't think of flies as having a psychology!

0:51:570:52:00

You just haven't worked with them for long enough.

0:52:000:52:03

Can we see it in action?

0:52:030:52:04

-Yeah, let's take a look.

-OK.

0:52:040:52:07

Controlling a fly's psychology is possible.

0:52:070:52:10

Of the 200,000 or so cells that make up their brains,

0:52:100:52:14

he has managed to isolate a few that are responsible for learning.

0:52:140:52:19

We did find a particular small group of just 12 cells

0:52:190:52:22

lying in the central brain.

0:52:220:52:25

These 12 cells don't dictate behaviour per se.

0:52:250:52:28

They dictate whether behaviour should be changed.

0:52:280:52:31

And dictating whether behaviour should be changed

0:52:310:52:34

is a crucial part of learning.

0:52:340:52:35

Miesenbock likens the role of this group of cells

0:52:350:52:39

to that of a nagging critic.

0:52:390:52:41

Critic is, if you wish,

0:52:410:52:43

the structure that holds the key to intelligence.

0:52:430:52:47

It's the structure that decides whether something is good or bad,

0:52:470:52:50

whether it's right or wrong.

0:52:500:52:52

Sort of the brain's version of the Catholic Church,

0:52:520:52:55

if you're an Austrian like me,

0:52:550:52:57

or the superego if you're Freudian or your mother if you're Jewish.

0:52:570:53:02

Miesenbock has genetically engineered his flies

0:53:020:53:06

so that he can switch on this negative nagging critical voice

0:53:060:53:10

simply by exposing them to blue light.

0:53:100:53:13

When the blue light comes on, the cells are activated

0:53:130:53:16

and the flies start to learn.

0:53:160:53:18

The flies are taught to dislike whatever is in their surroundings

0:53:180:53:22

when the blue light is on. This is then stored as a bad memory.

0:53:220:53:27
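The training rule Miesenbock describes is simple enough to mimic in a few lines: while the blue light is on (standing in for the twelve "critic" cells firing), whatever odour fills the chamber is written down as a bad memory, and the fly afterwards steers away from it. The odour names match the ones used in the film; the numbers are invented for the sketch.

```python
import random

# Toy sketch of the optogenetic training described above: activating the
# "critic" cells (here, a blue-light flag) while one odour is present tags
# that odour as aversive, and later choices avoid the tagged side.

random.seed(1)
value = {"tennis shoe": 0.0, "liquorice": 0.0}   # the fly starts indifferent

def training_pulse(odour_present, blue_light_on):
    # Pairing an odour with critic-cell activation marks it as a bad memory.
    if blue_light_on:
        value[odour_present] -= 1.0   # size of the penalty is arbitrary

def choose_side():
    # With no preference the fly wanders; otherwise it keeps to the side
    # whose odour it values more highly.
    if value["tennis shoe"] == value["liquorice"]:
        return random.choice(["left (tennis shoe)", "right (liquorice)"])
    return ("left (tennis shoe)"
            if value["tennis shoe"] > value["liquorice"]
            else "right (liquorice)")

print("Before training:", [choose_side() for _ in range(4)])
training_pulse("liquorice", blue_light_on=True)
print("Stored values:  ", value)
print("After training: ", [choose_side() for _ in range(4)])
```

After the single simulated light pulse the "fly" always reports the tennis-shoe side, the toy equivalent of the recoil seen at the odour boundary in the real chamber.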

In this experiment,

0:53:270:53:28

the flies are taught to dislike the smell of liquorice.

0:53:280:53:32

Each of these chambers contains just one fly.

0:53:320:53:35

-Right.

-And the fly walks up and down the length of the chamber.

0:53:350:53:39

And, um, the left and the right half of the chamber

0:53:390:53:43

are each filled with a different odour.

0:53:430:53:45

One of these smells a bit like, as they say, a tennis shoe in July.

0:53:450:53:50

-Right.

-And the other one a little bit like liquorice.

-Liquorice, OK.

0:53:500:53:53

You can see that, before our optogenetic intervention,

0:53:530:53:59

the fly just explores the chamber in its entirety.

0:53:590:54:02

They start off not caring one way or the other

0:54:020:54:05

-and you're going to manipulate their choice?

-Exactly.

0:54:050:54:08

When the blue light is switched on,

0:54:100:54:12

the chamber is flooded with the smell of liquorice.

0:54:120:54:15

OK, now, the effect of this light is to turn on the 12 cells

0:54:150:54:19

of the critic in the fly's brain.

0:54:190:54:21

-And that then leads to the implantation of memory.

-Right.

0:54:210:54:25

When the blue light switches off,

0:54:260:54:28

the chamber returns to its original smells.

0:54:280:54:31

Tennis shoes on the left, liquorice on the right.

0:54:310:54:34

So now, because of that blue light,

0:54:340:54:38

the fly has come to associate the smell of liquorice

0:54:380:54:41

with something it doesn't like,

0:54:410:54:42

something it needs to avoid, something it should avoid. Right.

0:54:420:54:45

What you see is, it's backtracking already. It made a mistake.

0:54:450:54:49

It stepped into the odour of liquorice,

0:54:490:54:51

which is on the right here.

0:54:510:54:52

Now it's back in tennis shoe, which is on the left.

0:54:520:54:56

And you see, whenever it reaches

0:54:560:54:58

the interface between the two odours, it recoils.

0:54:580:55:01

You can almost see how it goes, "Eugh, this is awful!"

0:55:010:55:06

-And turns around.

-It's very impressive.

0:55:060:55:08

The ability of optogenetics

0:55:080:55:10

to pinpoint and act on very specific cells

0:55:100:55:13

in the brain suggests it could be used in the future

0:55:130:55:17

for modifying human behaviour.

0:55:170:55:19

I think there's medical benefits in many possible directions,

0:55:190:55:23

the neurons that matter for a particular brain function,

0:55:230:55:26

be it appetite control,

0:55:260:55:28

mood, anxiety, sleep and wakefulness, attention, you name it.

0:55:280:55:32

But there are some major obstacles

0:55:320:55:34

to optogenetically controlling humans.

0:55:340:55:37

I think it's not something that's around the corner

0:55:370:55:41

in the next few years, because, remember,

0:55:410:55:43

for optogenetics to work,

0:55:430:55:44

you would have to genetically modify the brain...

0:55:440:55:47

Indeed, which we are some way off doing at the moment.

0:55:470:55:50

Exactly. A fly normally doesn't mind, but I certainly would

0:55:500:55:55

if you wanted to implant light receptors into my brain.

0:55:550:55:58

Now, that really was quite astonishing,

0:56:020:56:05

the idea that you can control

0:56:050:56:07

really quite sophisticated behaviour in a fly

0:56:070:56:11

just by manipulating 12 neurons.

0:56:110:56:12

It's astonishing and also slightly worrying,

0:56:120:56:15

because it makes you wonder whether the same is true of humans.

0:56:150:56:18

But I think that because human brains are so much bigger

0:56:180:56:22

and more complex than insect brains,

0:56:220:56:25

it is going to be a long time,

0:56:250:56:27

if ever, before technology like that is applied to us.

0:56:270:56:31

I've no doubt the technology used to study the brain

0:56:370:56:40

will continue to advance...

0:56:400:56:42

And that in doing so, we will make giant strides in understanding

0:56:440:56:48

the workings of our own minds.

0:56:480:56:50

But what I find reassuring is that,

0:56:520:56:54

although many people have tried to manipulate our minds,

0:56:540:56:57

the human brain has so far proved to be too complex to finely control,

0:56:570:57:03

our personalities too resilient.

0:57:030:57:06

Down the years, the brain has been pushed around,

0:57:070:57:11

probed, drugged and electrified.

0:57:110:57:14

But it remains impressively hard to control,

0:57:160:57:20

and I find that very comforting.

0:57:200:57:22
