Line | From | To | |
---|---|---|---|
Why do we do the things we do? | 0:00:02 | 0:00:04 | |
What really makes us tick? | 0:00:04 | 0:00:08 | |
How do our minds work? | 0:00:08 | 0:00:11 | |
For centuries, | 0:00:11 | 0:00:12 | |
these questions were largely left to philosophers and theologians. | 0:00:12 | 0:00:17 | |
Then, about 100 years ago, a new science began to shine | 0:00:20 | 0:00:23 | |
a bright light on the inner workings of the mind. | 0:00:23 | 0:00:28 | |
It was called "experimental psychology". | 0:00:28 | 0:00:31 | |
In this series, I'm tracing the history of how this "new science" | 0:00:31 | 0:00:36 | |
revealed things about ourselves | 0:00:36 | 0:00:38 | |
that were surprising and often profoundly shocking. | 0:00:38 | 0:00:41 | |
The experiment requires that we continue. | 0:00:41 | 0:00:43 | |
But he might be dead in there! | 0:00:43 | 0:00:45 | |
In this film, I'm investigating the inventive, sometimes barbaric ways | 0:00:45 | 0:00:50 | |
psychology has been used to control and manipulate people. | 0:00:50 | 0:00:55 | |
How does it feel, good or bad? | 0:00:55 | 0:00:57 | |
-Good. -Good? | 0:00:57 | 0:00:58 | |
It's a science which strikes | 0:00:58 | 0:01:00 | |
at the heart of what we hold most dear - our free will. | 0:01:00 | 0:01:03 | |
I don't think people like having their behaviour tinkered with, | 0:01:03 | 0:01:06 | |
I don't think they like to be told what to do. | 0:01:06 | 0:01:10 | |
The pursuit of mind control led to some truly horrific experiments, | 0:01:10 | 0:01:15 | |
and left many casualties in its wake. | 0:01:15 | 0:01:18 | |
I only remember that I forget. | 0:01:18 | 0:01:21 | |
Everything. | 0:01:21 | 0:01:23 | |
But investigating the forces that shape our minds and our behaviour | 0:01:24 | 0:01:28 | |
has also led to profound insights into how the brain works. | 0:01:28 | 0:01:32 | |
Ever since I was a medical student, | 0:01:35 | 0:01:38 | |
I have been fascinated by psychology, | 0:01:38 | 0:01:40 | |
by what it has revealed about ourselves | 0:01:40 | 0:01:43 | |
and by how far some researchers have been prepared to go. | 0:01:43 | 0:01:47 | |
I'm on my way to Cardiff and I'm feeling apprehensive. | 0:02:02 | 0:02:06 | |
I'm about to take a class A hallucinogenic drug. | 0:02:06 | 0:02:10 | |
In the past, this drug and others like it were used in experiments | 0:02:12 | 0:02:16 | |
which were designed with a very sinister purpose - | 0:02:16 | 0:02:18 | |
to achieve total mind control. | 0:02:18 | 0:02:22 | |
The drug I'm going to be given is psilocybin, | 0:02:24 | 0:02:28 | |
the active ingredient of magic mushrooms. | 0:02:28 | 0:02:31 | |
Normally, taking psilocybin would be illegal. | 0:02:33 | 0:02:36 | |
-Michael. -Hello, nice to see you. | 0:02:37 | 0:02:40 | |
But in this case, Professor David Nutt | 0:02:40 | 0:02:42 | |
and his team from Imperial College are doing a scientific study, | 0:02:42 | 0:02:45 | |
the first of its kind in the UK. | 0:02:45 | 0:02:48 | |
-So, am I going to enjoy this? -I think so. | 0:02:48 | 0:02:51 | |
Most people do, most people get something positive from it. | 0:02:51 | 0:02:54 | |
We've had no bad reactions, | 0:02:54 | 0:02:56 | |
so don't worry about what you're going to feel. | 0:02:56 | 0:02:58 | |
I think you may enjoy it. | 0:02:58 | 0:03:00 | |
It's a bit odd, the idea of having your consciousness played with. | 0:03:00 | 0:03:03 | |
You do it all the time, don't you? | 0:03:03 | 0:03:06 | |
-Every time you go out and get drunk. -OK. | 0:03:06 | 0:03:09 | |
The team are expecting psilocybin to have a significant impact on me. | 0:03:10 | 0:03:15 | |
So I'm put through physical and psychological tests. | 0:03:15 | 0:03:20 | |
-Would you describe yourself as a spiritual person? -A bit. | 0:03:20 | 0:03:24 | |
-And would you describe yourself as a mystical person? -No. | 0:03:24 | 0:03:28 | |
So I'm going to prepare the drug now. | 0:03:28 | 0:03:30 | |
-See that tiny amount in there? Can you see it? -Just. | 0:03:30 | 0:03:34 | |
That's 3.5 milligrams, so you get two milligrams. | 0:03:34 | 0:03:38 | |
-So that's going to be it? -That's it, but it does quite a lot. | 0:03:38 | 0:03:42 | |
Surprisingly enough, | 0:03:42 | 0:03:44 | |
David Nutt believes that by changing the brain, | 0:03:44 | 0:03:47 | |
psilocybin's mind-bending properties | 0:03:47 | 0:03:49 | |
may help psychiatric disorders like depression. | 0:03:49 | 0:03:53 | |
Before they test it on patients, the team want to find out | 0:03:53 | 0:03:57 | |
exactly how it acts on normal brains like mine. | 0:03:57 | 0:04:01 | |
It can be quite traumatic. | 0:04:01 | 0:04:02 | |
The last volunteer we had had quite a dramatic response. | 0:04:02 | 0:04:05 | |
He said it was probably the strangest experience of his life. | 0:04:05 | 0:04:09 | |
-OK. -He said that, um, his whole sense of self dissolved, | 0:04:09 | 0:04:13 | |
and he only existed as a concept. | 0:04:13 | 0:04:16 | |
MICHAEL LAUGHS | 0:04:16 | 0:04:17 | |
OK, I'm not sure I wish to only exist as a concept. We shall see. | 0:04:17 | 0:04:22 | |
I am starting to wonder what sort of trip I'm going to have. | 0:04:22 | 0:04:27 | |
I'm placed in a brain scanner | 0:04:27 | 0:04:30 | |
so the team can map in detail how my brain reacts to psilocybin. | 0:04:30 | 0:04:35 | |
Then the drug is injected. | 0:04:35 | 0:04:38 | |
The result? Well, it would turn out | 0:04:39 | 0:04:42 | |
to be one of the strangest and most unusual experiences of my life. | 0:04:42 | 0:04:46 | |
Today, drugs are at the cutting edge of mind control. | 0:04:47 | 0:04:51 | |
But down the years, scientists have experimented with more unusual ways | 0:04:51 | 0:04:56 | |
to manipulate our brains. | 0:04:56 | 0:04:58 | |
There is a long and murky history behind scientific attempts | 0:05:01 | 0:05:06 | |
to control emotions, behaviour, and decision-making. | 0:05:06 | 0:05:11 | |
I think some of the most interesting stuff can be traced | 0:05:16 | 0:05:19 | |
to an accidental discovery by a Russian more than 100 years ago. | 0:05:19 | 0:05:23 | |
In the 1890s, biologist Ivan Pavlov was studying dogs, | 0:05:25 | 0:05:30 | |
many of which had been snatched off the streets of St Petersburg. | 0:05:30 | 0:05:34 | |
His early experiments seemed to have nothing to do with mind control. | 0:05:34 | 0:05:38 | |
He was studying reflexes in animals. | 0:05:38 | 0:05:42 | |
Pavlov was interested in saliva, and in particular, | 0:05:42 | 0:05:45 | |
how much saliva is produced when we eat different sorts of foods. | 0:05:45 | 0:05:48 | |
Now, he studied it in dogs. | 0:05:48 | 0:05:51 | |
He operated on his dogs so he could accurately measure | 0:05:52 | 0:05:56 | |
the saliva they produced when they saw food. | 0:05:56 | 0:05:59 | |
To Pavlov's immense surprise, | 0:05:59 | 0:06:02 | |
he discovered that the dogs would salivate | 0:06:02 | 0:06:04 | |
not just when they got the food, | 0:06:04 | 0:06:06 | |
but when they simply saw the men in white coats | 0:06:06 | 0:06:10 | |
who brought them the food. | 0:06:10 | 0:06:12 | |
This was utterly unexpected. | 0:06:12 | 0:06:14 | |
Pavlov realised that his dogs had made an association | 0:06:15 | 0:06:18 | |
between the lab coat and the arrival of food. | 0:06:18 | 0:06:21 | |
It was this association, rather than the food itself, | 0:06:21 | 0:06:26 | |
that produced the saliva reflex. | 0:06:26 | 0:06:28 | |
Pavlov wanted to train other responses. | 0:06:29 | 0:06:32 | |
As he did so, his experiments became increasingly brutal. | 0:06:32 | 0:06:36 | |
In this experiment, the beat of the metronome | 0:06:36 | 0:06:39 | |
is combined with an electric shock. | 0:06:39 | 0:06:42 | |
BUZZING | 0:06:43 | 0:06:44 | |
The electric shock is given a number of times, | 0:06:44 | 0:06:48 | |
until the beat of the metronome alone | 0:06:48 | 0:06:50 | |
is enough to cause the dog to jump. | 0:06:50 | 0:06:53 | |
The Pavlovian response is one of the most famous discoveries | 0:06:56 | 0:06:59 | |
in the history of psychology. | 0:06:59 | 0:07:01 | |
What is less well known | 0:07:01 | 0:07:03 | |
is that Pavlov didn't confine himself to dogs. | 0:07:03 | 0:07:06 | |
Disturbingly, he was also involved in experiments on children. | 0:07:06 | 0:07:11 | |
According to Pavlov, | 0:07:11 | 0:07:12 | |
our reflexes start being conditioned from the moment we're born. | 0:07:12 | 0:07:16 | |
As we progress through life, | 0:07:16 | 0:07:18 | |
more and more of these conditioned reflexes | 0:07:18 | 0:07:20 | |
are implanted in our brains, building up complex behaviours. | 0:07:20 | 0:07:25 | |
And once you understand the basics | 0:07:25 | 0:07:27 | |
of how those behaviours are laid down, | 0:07:27 | 0:07:30 | |
well, perhaps you can begin to also control them. | 0:07:30 | 0:07:33 | |
BABY CRIES | 0:07:33 | 0:07:35 | |
Pavlov's discoveries are fundamental to mind control, | 0:07:39 | 0:07:43 | |
but they are limited. | 0:07:43 | 0:07:45 | |
They only condition behaviours that already exist. | 0:07:45 | 0:07:48 | |
By the time Pavlov died in the mid-1930s, | 0:07:51 | 0:07:54 | |
new techniques had been developed in America | 0:07:54 | 0:07:57 | |
that promised far more sophisticated mind manipulation. | 0:07:57 | 0:08:01 | |
These methods also held the enticing possibility | 0:08:04 | 0:08:08 | |
of being used on a grand scale. | 0:08:08 | 0:08:12 | |
The man behind these developments was so controversial | 0:08:12 | 0:08:15 | |
that he was hung in effigy. | 0:08:15 | 0:08:17 | |
The vice president described him as "most dangerous". | 0:08:17 | 0:08:20 | |
And the FBI kept a detailed file on him. | 0:08:20 | 0:08:23 | |
I'm here to meet Deborah Buzan, daughter of BF Skinner, | 0:08:24 | 0:08:27 | |
one of the most famous and infamous scientists | 0:08:27 | 0:08:30 | |
in the history of psychology. | 0:08:30 | 0:08:32 | |
-Hello. -Hello. -Michael Mosley. | 0:08:33 | 0:08:35 | |
Nice to meet you. Come in. | 0:08:35 | 0:08:37 | |
He's receiving an honorary degree at Harvard here. | 0:08:38 | 0:08:41 | |
-He was terribly proud of that. -Uh-huh. | 0:08:41 | 0:08:43 | |
And here he is with a rat. SHE LAUGHS | 0:08:43 | 0:08:47 | |
This, again, is the machine behind with the gazillion wires and things. | 0:08:47 | 0:08:50 | |
It would all be done very differently today. | 0:08:50 | 0:08:53 | |
That's my father putting the pigeon | 0:08:53 | 0:08:56 | |
into what became known as the Skinner Box. | 0:08:56 | 0:08:59 | |
The key to Skinner's research was the so-called Skinner Box. | 0:08:59 | 0:09:03 | |
He used it to create new behaviour. | 0:09:03 | 0:09:06 | |
In this Skinner Box, a pigeon is given a reward | 0:09:09 | 0:09:12 | |
every time it pecks the word "peck", | 0:09:12 | 0:09:14 | |
or turns full circle when it sees the word "turn" appear. | 0:09:14 | 0:09:18 | |
The pigeon has learnt | 0:09:22 | 0:09:23 | |
that when it performs these actions, it gets food. | 0:09:23 | 0:09:26 | |
So it continues to do them. | 0:09:26 | 0:09:28 | |
As it does so, this new behaviour is reinforced. | 0:09:30 | 0:09:34 | |
This pigeon experiment, and many like it, demonstrated graphically | 0:09:36 | 0:09:40 | |
that carefully timed rewards | 0:06:40 | 0:06:42 | |
can reinforce, change or even create new behaviours. | 0:09:42 | 0:09:48 | |
Deborah is using her father's technique | 0:09:50 | 0:09:52 | |
to train her cats to play the piano. | 0:09:52 | 0:09:55 | |
Lola's not exactly a virtuoso, | 0:09:57 | 0:09:59 | |
but she has learnt that a little tinkle earns a piece of chicken. | 0:09:59 | 0:10:03 | |
I like the fact that she changes notes. | 0:10:04 | 0:10:07 | |
Oh, that was good. | 0:10:13 | 0:10:15 | |
Teaching animals cute tricks is hardly controversial, | 0:10:15 | 0:10:19 | |
but Skinner had defined a precise and powerful way | 0:10:19 | 0:10:23 | |
of modifying behaviour and said it should be used on humans. | 0:10:23 | 0:10:27 | |
He wanted to apply | 0:10:27 | 0:10:28 | |
his scientific methods of psychological manipulation | 0:10:28 | 0:10:32 | |
to whole populations. | 0:10:32 | 0:10:33 | |
He was really, essentially, telling people how to behave, | 0:10:33 | 0:10:37 | |
how to act, maybe how to create a world | 0:10:37 | 0:10:41 | |
in which people would act differently | 0:10:41 | 0:10:43 | |
from the way they already acted. | 0:10:43 | 0:10:45 | |
And I don't think people like to have their behaviour tinkered with. | 0:10:45 | 0:10:49 | |
I don't think they like to be told what to do. | 0:10:49 | 0:10:52 | |
Skinner believed that many of the problems of modern society | 0:10:53 | 0:10:56 | |
could be improved if a ruling elite | 0:10:56 | 0:11:00 | |
were to use his methods to control behaviour. | 0:11:00 | 0:11:04 | |
Using them would reduce violence and improve productivity. | 0:11:04 | 0:11:08 | |
Skinner's psychologically controlled society brushed aside | 0:11:11 | 0:11:15 | |
an individual's role in determining and shaping their own destiny. | 0:11:15 | 0:11:19 | |
He saw belief in free will as unscientific nonsense. | 0:11:19 | 0:11:23 | |
These views polarised opinion. | 0:11:23 | 0:11:26 | |
On 3rd May, 1972, the same day that 6000 students cheered him on | 0:11:28 | 0:11:34 | |
in Michigan, he was hung in effigy at Indiana University. | 0:11:34 | 0:11:38 | |
He very much wanted to get across his message. I don't know | 0:11:38 | 0:11:42 | |
if he expected such a reception, which was not all good, of course. | 0:11:42 | 0:11:47 | |
But, certainly, it made people sit up and think. | 0:11:47 | 0:11:50 | |
Skinner's utopia never materialised. | 0:11:52 | 0:11:55 | |
But his methods of behaviour control are now widely used | 0:11:55 | 0:11:59 | |
in everyday life. | 0:11:59 | 0:12:02 | |
They have been particularly influential | 0:12:02 | 0:12:05 | |
in the world of child development, | 0:12:05 | 0:12:07 | |
where the use of well-targeted rewards to encourage good behaviour | 0:12:07 | 0:12:10 | |
has become an established parenting technique. | 0:12:10 | 0:12:13 | |
And for some children, Skinner's methods have proved life-changing. | 0:12:13 | 0:12:17 | |
I've tried applying some of Skinner's techniques | 0:12:19 | 0:12:22 | |
to my own kids in a slightly sort of haphazard fashion. | 0:12:22 | 0:12:25 | |
What I've never done is, | 0:12:25 | 0:12:27 | |
I've never seen it applied systematically | 0:12:27 | 0:12:30 | |
and in an organised way. | 0:12:30 | 0:12:32 | |
And I'm hoping to see that in here. | 0:12:32 | 0:12:34 | |
In the Jigsaw School for Autistic Children in Surrey, | 0:12:38 | 0:12:41 | |
the teachers use a system based on Skinner's principles | 0:12:41 | 0:12:45 | |
called Applied Behaviour Analysis. | 0:12:45 | 0:12:48 | |
We're going to go and watch Rohit working. | 0:12:48 | 0:12:52 | |
-Take a seat. -Thank you. | 0:12:52 | 0:12:54 | |
Rohit started with us last September, | 0:12:54 | 0:12:56 | |
so he's almost done a full academic year with us. | 0:12:56 | 0:13:00 | |
He's actually behaving beautifully this morning, | 0:13:00 | 0:13:02 | |
but within his repertoire, | 0:13:02 | 0:13:04 | |
he can assault staff, he likes to pull hair. | 0:13:04 | 0:13:08 | |
Eight-year-old Rohit is working through a series of learning tasks. | 0:13:09 | 0:13:13 | |
Each task is broken down into small parts. | 0:13:13 | 0:13:16 | |
He gets a token for completing each part of the task. | 0:13:16 | 0:13:20 | |
Point to a cube. That's a cube. Good boy, give me high-five. | 0:13:20 | 0:13:23 | |
Well done. You can take off a token, that was lovely work. | 0:13:23 | 0:13:28 | |
Every single correct response here is being reinforced by the teacher. | 0:13:28 | 0:13:31 | |
Our main focus is, | 0:13:31 | 0:13:32 | |
if we're constantly praising and reinforcing appropriate behaviour, | 0:13:32 | 0:13:36 | |
we will see an increase in that appropriate behaviour. | 0:13:36 | 0:13:39 | |
-It is a mushroom. -It is a mushroom. | 0:13:39 | 0:13:42 | |
Good boy, you're trying really hard. Good job, Mister. | 0:13:42 | 0:13:45 | |
The key is to consistently reward the behaviours and responses | 0:13:45 | 0:13:49 | |
they want to encourage. | 0:13:49 | 0:13:51 | |
Rohit is working towards the bigger prize of time in the soft play area. | 0:13:51 | 0:13:56 | |
So they get a combination of reinforcers, if you like, | 0:13:56 | 0:14:00 | |
so it's a mixture of praise, pat on the back, and a point | 0:14:00 | 0:14:04 | |
-which they know leads to a prize? -Yes. | 0:14:04 | 0:14:07 | |
Rohit eventually gets his prize - playtime in the soft room. | 0:14:09 | 0:14:14 | |
But more importantly, he'd sat for over an hour | 0:14:14 | 0:14:17 | |
learning picture-word associations, | 0:14:17 | 0:14:20 | |
something that would have been impossible just a year ago. | 0:14:20 | 0:14:24 | |
Are you a fan of Skinner, then? | 0:14:24 | 0:14:26 | |
I am a fan of Skinner, yes. | 0:14:26 | 0:14:29 | |
It is really heart-warming watching the way that Skinner's ideas | 0:14:29 | 0:14:33 | |
of positive reinforcement are being put into practice | 0:14:33 | 0:14:37 | |
in such a sort of practical way. | 0:14:37 | 0:14:39 | |
And I also think that without Skinner, | 0:14:39 | 0:14:42 | |
these kids' lives would be an awful lot bleaker. | 0:14:42 | 0:14:46 | |
Skinner believed passionately that behaviour modification | 0:14:50 | 0:14:53 | |
could and should be used to make the world a better place. | 0:14:53 | 0:14:57 | |
'A false testimony, really, | 0:15:00 | 0:15:02 | |
'which I had fabricated to keep them happy.' | 0:15:02 | 0:15:05 | |
But, in the 1950s, when he was developing his ideas, | 0:15:05 | 0:15:09 | |
research into mind control had already taken a more sinister turn, | 0:15:09 | 0:15:14 | |
fuelled by the ruthless needs of war. | 0:15:14 | 0:15:17 | |
By then, America was fighting the communists in Korea. | 0:15:17 | 0:15:23 | |
As the conflict raged, a sinister new weapon emerged. | 0:15:23 | 0:15:27 | |
'I did sign a confession, relating to germ warfare. | 0:15:27 | 0:15:31 | |
'How can I tell them these things?' | 0:15:31 | 0:15:34 | |
It first came to light | 0:15:34 | 0:15:35 | |
when the North Koreans paraded their American prisoners of war. | 0:15:35 | 0:15:39 | |
'How can I go back and face my family? | 0:15:42 | 0:15:45 | |
'In a civilised world, how can I tell them these things? | 0:15:45 | 0:15:50 | |
'That I am a criminal in the eyes of humanity? | 0:15:50 | 0:15:53 | |
'They are my flesh and blood.' | 0:15:53 | 0:15:56 | |
These images really disturbed people, | 0:15:56 | 0:15:58 | |
American soldiers denouncing American aggression. | 0:15:58 | 0:16:02 | |
How had the North Koreans done it? | 0:16:02 | 0:16:04 | |
How had they, in such a short period of time, | 0:16:04 | 0:16:07 | |
managed to convert them to the communist cause? | 0:16:07 | 0:16:10 | |
'How can I go back and face my family? | 0:16:10 | 0:16:14 | |
'They are my flesh and blood. | 0:16:14 | 0:16:16 | |
'..and absurd. I did sign a confession...' | 0:16:16 | 0:16:20 | |
This was a terrifying new weapon | 0:16:20 | 0:16:22 | |
that threatened the freedom of the West. | 0:16:22 | 0:16:26 | |
And they gave this weapon a name. | 0:16:26 | 0:16:28 | |
Brainwashing. | 0:16:28 | 0:16:30 | |
'People who used to be in the asylum...' | 0:16:30 | 0:16:34 | |
A huge effort was made | 0:16:34 | 0:16:35 | |
to try and uncover the secrets of the communist brainwashers. | 0:16:35 | 0:16:39 | |
In Britain, an ambitious psychiatrist soon became convinced | 0:16:41 | 0:16:45 | |
he knew how it was done. | 0:16:45 | 0:16:47 | |
His name was William Sargant, and his radical approach to psychiatry | 0:16:48 | 0:16:53 | |
inspired both loyalty and loathing. | 0:16:53 | 0:16:56 | |
He once said of himself, "Some people say I'm a wonderful doctor, | 0:16:56 | 0:17:02 | |
"others that I am the work of the devil." | 0:17:02 | 0:17:06 | |
Sargant had a theory on brainwashing, | 0:17:06 | 0:17:09 | |
and it was based on the work of none other than Ivan Pavlov. | 0:17:09 | 0:17:13 | |
Sargant had read about one of Pavlov's lesser-known experiments, | 0:17:13 | 0:17:17 | |
in which he'd tried to remove his dogs' conditioning. | 0:17:17 | 0:17:20 | |
Typically, Pavlov had not gone for half measures. | 0:17:20 | 0:17:23 | |
He had induced nervous breakdowns in his dogs, | 0:17:23 | 0:17:26 | |
and found that this rapidly and dramatically removed | 0:17:26 | 0:17:29 | |
the conditioned behaviour he had so painstakingly created. | 0:17:29 | 0:17:33 | |
Pavlov concluded that, during the breakdown, | 0:17:35 | 0:17:38 | |
established connections in the dogs' brains were lost. | 0:17:38 | 0:17:42 | |
When the brain rewired itself, | 0:17:42 | 0:17:44 | |
it made new connections, leading to dramatic changes in behaviour. | 0:17:44 | 0:17:48 | |
Sargant thought that Pavlov's work | 0:17:50 | 0:17:52 | |
was central to understanding brainwashing, | 0:17:52 | 0:17:55 | |
how somebody could change their core beliefs | 0:17:55 | 0:17:57 | |
in a very short period of time. | 0:17:57 | 0:17:59 | |
Just as with the dogs, Sargant believed that when a human brain | 0:18:00 | 0:18:04 | |
is broken down, connections are lost, | 0:18:04 | 0:18:07 | |
which then need to be rebuilt. | 0:18:07 | 0:18:09 | |
He decided to use this insight as the basis of a radical treatment | 0:18:12 | 0:18:15 | |
for mentally ill patients. | 0:18:15 | 0:18:17 | |
His aim was simple - | 0:18:20 | 0:18:21 | |
to break down and then reprogramme their troubled minds. | 0:18:21 | 0:18:26 | |
Mary Thornton was 21 when she was admitted under Sargant's care. | 0:18:27 | 0:18:32 | |
My memories are like snapshots. | 0:18:34 | 0:18:38 | |
One is of the electrodes being attached to the side of my head, | 0:18:38 | 0:18:44 | |
being given a general anaesthetic, | 0:18:44 | 0:18:47 | |
seeing an image of myself in the mirror one day, | 0:18:47 | 0:18:51 | |
seeing a strange face looking back at myself. | 0:18:51 | 0:18:55 | |
And being really, really frightened that I would never get out. | 0:18:56 | 0:19:02 | |
40 years on, Mary has returned | 0:19:10 | 0:19:12 | |
to the Royal Waterloo Hospital in London, once part of St Thomas'. | 0:19:12 | 0:19:16 | |
This... It... One of these rooms is the room I recovered in. | 0:19:17 | 0:19:23 | |
And it was on the left-hand side. | 0:19:23 | 0:19:25 | |
And it could have been that room. | 0:19:25 | 0:19:28 | |
It could have been a room exactly like that. | 0:19:28 | 0:19:32 | |
Mary's parents had admitted her to the hospital | 0:19:32 | 0:19:35 | |
following a series of rows over a new boyfriend. | 0:19:35 | 0:19:38 | |
I understood that I was there | 0:19:38 | 0:19:41 | |
to be treated for an acute anxiety state. | 0:19:41 | 0:19:46 | |
And I was told that I would be given sleep treatment to help my, erm... | 0:19:46 | 0:19:52 | |
..head to have a nice, long rest. | 0:19:54 | 0:19:57 | |
Mary, and other patients, became unwitting guinea pigs, | 0:19:57 | 0:20:01 | |
testing Sargant's more radical ideas. | 0:20:01 | 0:20:04 | |
Every available method is used here, all at the same time. | 0:20:04 | 0:20:09 | |
Sargant believed it was essential to use a wide range of approaches | 0:20:09 | 0:20:13 | |
if he was to break down established patterns of thinking. | 0:20:13 | 0:20:18 | |
He began with a process of "modified narcosis". | 0:20:18 | 0:20:22 | |
Patients were put into a medicated sleep for up to three months. | 0:20:22 | 0:20:27 | |
The patients are woken three times a day for feeding, | 0:20:27 | 0:20:31 | |
and to check on whether they're getting better. | 0:20:31 | 0:20:34 | |
They were then given copious amounts of drugs, | 0:20:34 | 0:20:37 | |
combined with bouts of electric shock therapy, or ECT. | 0:20:37 | 0:20:41 | |
-OK? Go. -110 volts for three seconds | 0:20:41 | 0:20:45 | |
through the frontal lobes of the brain. | 0:20:45 | 0:20:49 | |
Sargant was convinced he had to use all the tools available to him | 0:20:49 | 0:20:54 | |
if he was to wipe the slate clean. | 0:20:54 | 0:20:56 | |
He had also suffered from depression, | 0:20:56 | 0:20:58 | |
and knew it could be life-threatening, | 0:20:58 | 0:21:00 | |
so the risks were worth it. | 0:21:00 | 0:21:02 | |
In all my 30 years, I've tried never to use any method | 0:21:02 | 0:21:06 | |
that I wouldn't use on myself. | 0:21:06 | 0:21:08 | |
All that matters is, | 0:21:08 | 0:21:09 | |
don't do unto others what you wouldn't have done to yourself. | 0:21:09 | 0:21:13 | |
For many patients like Mary, | 0:21:17 | 0:21:19 | |
this barrage of treatments had a dramatic effect on their minds. | 0:21:19 | 0:21:23 | |
I only remember that I forget. | 0:21:23 | 0:21:25 | |
Everything. | 0:21:28 | 0:21:29 | |
That's what ECT made you do. | 0:21:29 | 0:21:33 | |
In this building, I saw her maybe four times, | 0:21:33 | 0:21:37 | |
I visited maybe four times. | 0:21:37 | 0:21:40 | |
And she progressively got less aware of who I was. | 0:21:42 | 0:21:47 | |
On one occasion, she didn't know me from Adam. | 0:21:47 | 0:21:52 | |
After three months at the Royal Waterloo, Mary was discharged. | 0:21:52 | 0:21:58 | |
Sargant's belief that her brain, shaken and shocked, | 0:21:58 | 0:22:00 | |
would reassemble itself, free from the shackles of her past, was wrong. | 0:22:00 | 0:22:05 | |
When I left, I had great big black holes in my memory. | 0:22:05 | 0:22:12 | |
I was sent to convalesce to my brother in Yorkshire. | 0:22:12 | 0:22:16 | |
A week or so into my stay, | 0:22:16 | 0:22:18 | |
I suddenly remembered that I had a boyfriend, | 0:22:18 | 0:22:21 | |
and I found his telephone number. And I phoned him up. | 0:22:21 | 0:22:26 | |
And he came to meet me in London. | 0:22:26 | 0:22:30 | |
Lucky for me! | 0:22:30 | 0:22:32 | |
And we've been together ever since. | 0:22:32 | 0:22:34 | |
We had her on two grams. | 0:22:37 | 0:22:40 | |
-That's 2,000 milligrams. -Yes. | 0:22:40 | 0:22:42 | |
Sargant had shown that with drugs and ECT, | 0:22:42 | 0:22:46 | |
he could disrupt and destroy memories. | 0:22:46 | 0:22:49 | |
Yet with patients like Mary, | 0:22:49 | 0:22:51 | |
he failed to break patterns of behaviour | 0:22:51 | 0:22:54 | |
and alter personal beliefs. | 0:22:54 | 0:22:56 | |
The human mind had proved remarkably resilient to change. | 0:22:56 | 0:23:00 | |
And without these new treatments, we did have psychotherapy | 0:23:00 | 0:23:03 | |
and psychoanalysis... | 0:23:03 | 0:23:04 | |
Sargant's theories on brainwashing were flawed, | 0:23:04 | 0:23:07 | |
and rejected by many British psychiatrists. | 0:23:07 | 0:23:10 | |
'How can I go back and face my family?' | 0:23:10 | 0:23:14 | |
But across the Atlantic, in the United States, a frantic search | 0:23:14 | 0:23:18 | |
to learn the secrets of communist mind control continued unabated. | 0:23:18 | 0:23:22 | |
'They're my flesh and blood.' | 0:23:22 | 0:23:25 | |
In response to the threat of brainwashing, the CIA had set up | 0:23:25 | 0:23:29 | |
their own secret operation, code name MK-ULTRA. | 0:23:29 | 0:23:34 | |
MK-ULTRA employed many of the leading scientists of the day. | 0:23:35 | 0:23:39 | |
From 1953, they embarked on a series of increasingly bizarre experiments | 0:23:39 | 0:23:44 | |
designed to replicate the communist brainwashing techniques. | 0:23:44 | 0:23:48 | |
They subjected people to sensory deprivation, | 0:23:48 | 0:23:52 | |
total isolation, electric shocks. | 0:23:52 | 0:23:55 | |
One research area that seemed promising was drugs, | 0:23:55 | 0:23:59 | |
and one class of drug stood out from all the rest. | 0:23:59 | 0:24:03 | |
They seemed to change the world. | 0:24:05 | 0:24:07 | |
They distorted vision, and they warped sound. | 0:24:10 | 0:24:14 | |
They created vivid hallucinations. | 0:24:16 | 0:24:18 | |
Hallucinogenic drugs like LSD and psilocybin had only recently | 0:24:22 | 0:24:26 | |
been discovered, and were the subject of intense research. | 0:24:26 | 0:24:30 | |
They held out the enticing possibility | 0:24:34 | 0:24:36 | |
of controlling another person's mind. | 0:24:36 | 0:24:38 | |
Perhaps they could also change your core beliefs. | 0:24:38 | 0:24:43 | |
The top-secret operatives from MK-ULTRA experimented on themselves, | 0:24:43 | 0:24:49 | |
their colleagues and on unsuspecting members of the public. | 0:24:49 | 0:24:52 | |
And the American military | 0:24:52 | 0:24:54 | |
also attempted to harness their mind-bending powers. | 0:24:54 | 0:24:57 | |
-How do you feel? -I could run 100 miles right now. | 0:24:59 | 0:25:03 | |
-Is that right? -That's right. -Pretty pepped up, huh? | 0:25:03 | 0:25:05 | |
But, while the drugs were certainly mind-altering, | 0:25:05 | 0:25:10 | |
they were useless as a form of brainwashing. | 0:25:10 | 0:25:13 | |
The effects were too temporary, too unpredictable | 0:25:13 | 0:25:16 | |
to reliably alter an individual. | 0:25:16 | 0:25:19 | |
The huge effort to produce | 0:25:19 | 0:25:21 | |
a foolproof method of brainwashing had failed. | 0:25:21 | 0:25:25 | |
And following the end of the Korean War, | 0:25:25 | 0:25:28 | |
it emerged that communist techniques were also flawed. | 0:25:28 | 0:25:31 | |
Many of the soldiers who had denounced America while in captivity | 0:25:31 | 0:25:34 | |
rediscovered their patriotism when they got home. | 0:25:34 | 0:25:39 | |
Brainwashing turned out to be an illusory weapon of war. | 0:25:39 | 0:25:44 | |
For many years, there was diminished scientific interest | 0:25:44 | 0:25:47 | |
in the mind-altering properties of hallucinogenic drugs. | 0:25:47 | 0:25:50 | |
Now, half a century on, things are changing. | 0:25:53 | 0:25:57 | |
As part of an experiment, I have been injected with psilocybin, | 0:25:57 | 0:26:01 | |
the active component of magic mushrooms. | 0:26:01 | 0:26:03 | |
Professor Nutt and his team look on as the drug races through my system. | 0:26:03 | 0:26:09 | |
They want to understand exactly how the drug works on my brain. | 0:26:09 | 0:26:14 | |
Ten on that scale is extremely intense. | 0:26:14 | 0:26:18 | |
How anxious he feels now... | 0:26:18 | 0:26:20 | |
He'll be hallucinating, certainly. | 0:26:21 | 0:26:24 | |
He might be seeing colours and swirling patterns. | 0:26:24 | 0:26:27 | |
Probably some bodily feelings as well. | 0:26:27 | 0:26:29 | |
His body might feel slightly different. | 0:26:29 | 0:26:31 | |
Maybe a kind of floating sensation. | 0:26:31 | 0:26:34 | |
Inside the machine, I was convinced I could levitate, | 0:26:34 | 0:26:39 | |
make the walls of the scanner dissolve | 0:26:39 | 0:26:42 | |
and fly up to the stars at supersonic speed. | 0:26:42 | 0:26:46 | |
It was all rather beautiful. | 0:26:46 | 0:26:50 | |
His sense of time has slowed down. | 0:26:50 | 0:26:52 | |
OK, Michael, we're all finished, I'll come and get you out. | 0:26:52 | 0:26:57 | |
Thank you. | 0:26:57 | 0:26:59 | |
'Although the effects of psilocybin had largely worn off, | 0:26:59 | 0:27:02 | |
'I had an uncontrollable urge to talk. And talk.' | 0:27:02 | 0:27:06 | |
That was very strange, I have to say, because I thought, | 0:27:06 | 0:27:10 | |
"Whoa!" and then, "Whoo!" | 0:27:10 | 0:27:11 | |
It was really sort of, whoosh, take-off. | 0:27:11 | 0:27:13 | |
'..And talk. And talk.' | 0:27:13 | 0:27:18 | |
It took me back to when I was eight years old | 0:27:18 | 0:27:21 | |
and I used to rub my eyeballs | 0:27:21 | 0:27:22 | |
in an attempt to have religious experiences. | 0:27:22 | 0:27:25 | |
You felt it, when it came, | 0:27:25 | 0:27:27 | |
it was just like being in a sort of Star Trek thing, | 0:27:27 | 0:27:30 | |
going into hyperspace, and "whoo". | 0:27:30 | 0:27:32 | |
As soon as you start thinking even the slightest bad thoughts, | 0:27:32 | 0:27:35 | |
you think, "This is really, really... | 0:27:35 | 0:27:37 | |
"I don't want to go there." | 0:27:37 | 0:27:38 | |
And actually, dragging yourself back from it is very difficult. | 0:27:38 | 0:27:42 | |
I'm still feeling very strange. | 0:27:42 | 0:27:44 | |
But you want to go out and share it with everyone. | 0:27:44 | 0:27:47 | |
You want to go and say, "Hello!" and talk about it. | 0:27:47 | 0:27:50 | |
My results will be merged with scans from other volunteers | 0:27:50 | 0:27:54 | |
to identify the precise areas inside the brain where the drug is active. | 0:27:54 | 0:27:59 | |
And you see there are essentially three big, blue blobs. | 0:27:59 | 0:28:04 | |
What blue means is that | 0:28:04 | 0:28:06 | |
-the activity of the brain has shut down. -Switched off! | 0:28:06 | 0:28:10 | |
When we went into this study, we thought that this drug | 0:28:10 | 0:28:13 | |
would activate, certainly different parts of the brain to those, | 0:28:13 | 0:28:17 | |
but we don't find any activation anywhere. | 0:28:17 | 0:28:20 | |
All we find are reductions in blood flow. | 0:28:20 | 0:28:22 | |
David Nutt believes that the areas of the brain | 0:28:22 | 0:28:26 | |
that are being switched off | 0:28:26 | 0:28:27 | |
are critical to the experience I've just had. | 0:28:27 | 0:28:30 | |
The blobs in the middle | 0:28:30 | 0:28:32 | |
are the part of the brain that tells us who and what we are. | 0:28:32 | 0:28:36 | |
When these were dampened down, I was released from everyday constraints. | 0:28:36 | 0:28:39 | |
You were able to break free | 0:28:39 | 0:28:41 | |
from the normal constraints of what you are - | 0:28:41 | 0:28:44 | |
"I'm a father, I've got to go home in an hour and a half, I've got to make this programme." | 0:28:44 | 0:28:49 | |
The fact that it dampens activity in these areas | 0:28:49 | 0:28:51 | |
may provide clues | 0:28:51 | 0:28:52 | |
as to how psilocybin could be used to treat depression. | 0:28:52 | 0:28:56 | |
One of the things we think is that, | 0:28:56 | 0:28:58 | |
if in conditions like depression or obsessive compulsive disorder, | 0:28:58 | 0:29:03 | |
where people get locked into a mindset which is maladaptive, | 0:29:03 | 0:29:09 | |
-these regions may be overactive. -Ah. | 0:29:09 | 0:29:12 | |
Maybe dampening those down | 0:29:12 | 0:29:15 | |
will help people move into another mindset | 0:29:15 | 0:29:18 | |
which might be better and healthier. | 0:29:18 | 0:29:20 | |
If you wanted to shift somebody who's profoundly depressed | 0:29:20 | 0:29:23 | |
out of that mindset, something like this might conceivably... | 0:29:23 | 0:29:26 | |
It's conceivable. We need to do the experiments. | 0:29:26 | 0:29:30 | |
That's part of the rationale. | 0:29:30 | 0:29:32 | |
It helps take you from this rigid, motoric process | 0:29:32 | 0:29:36 | |
of thinking, into something that may be more positive. | 0:29:36 | 0:29:40 | |
'The idea that you could use a psychedelic drug like psilocybin | 0:29:41 | 0:29:46 | |
'to treat depression is certainly surprising. | 0:29:46 | 0:29:48 | |
'If it works, one advantage is, it works fast.' | 0:29:48 | 0:29:52 | |
You changed immediately. Whether it would produce immediate changes | 0:29:52 | 0:29:55 | |
into a depressed person, we don't know. | 0:29:55 | 0:29:57 | |
If it did, it would be wonderful, wouldn't it? | 0:29:57 | 0:30:00 | |
If particularly, you could hold on to those and maintain them. | 0:30:00 | 0:30:04 | |
It may turn out that hallucinogenic drugs | 0:30:06 | 0:30:08 | |
do have a part to play in mind control. | 0:30:08 | 0:30:11 | |
As a therapeutic tool, to help patients think more positively, | 0:30:11 | 0:30:16 | |
it's a far cry from the ambitions of would-be brainwashers. | 0:30:16 | 0:30:19 | |
'"How is it possible?" I ask myself.' | 0:30:22 | 0:30:25 | |
In the 1960s, a new discovery would uncover a far more pervasive | 0:30:25 | 0:30:30 | |
and effective way of controlling people. | 0:30:30 | 0:30:32 | |
It was far more subtle and almost impossible to escape. | 0:30:33 | 0:30:38 | |
But he might be dead in there! | 0:30:38 | 0:30:41 | |
The new approach emphasised the importance of social pressure, | 0:30:42 | 0:30:45 | |
of the social context you find yourself in. | 0:30:45 | 0:30:48 | |
To find out more, | 0:30:48 | 0:30:50 | |
psychologists devised elaborate human experiments, | 0:30:50 | 0:30:54 | |
sometimes with shocking results. | 0:30:54 | 0:30:56 | |
The son of Jewish immigrants from Eastern Europe, | 0:31:02 | 0:31:05 | |
Stanley Milgram wanted | 0:31:05 | 0:31:06 | |
to understand why so many people took part in the perpetration | 0:31:06 | 0:31:10 | |
of terrible war crimes. | 0:31:10 | 0:31:12 | |
Milgram was curious to find out | 0:31:17 | 0:31:19 | |
what drove German soldiers to participate in the Holocaust. | 0:31:19 | 0:31:22 | |
How is it possible, I ask myself, | 0:31:25 | 0:31:27 | |
that ordinary people, courteous and decent in everyday life, | 0:31:27 | 0:31:30 | |
can act callously, inhumanely, | 0:31:30 | 0:31:33 | |
without any limitations of conscience? | 0:31:33 | 0:31:36 | |
Milgram described the moment he conceived his experiment | 0:31:38 | 0:31:41 | |
as "incandescent". | 0:31:41 | 0:31:42 | |
It was beautifully, almost fiendishly designed | 0:31:42 | 0:31:46 | |
to reveal uncomfortable flaws about human nature. | 0:31:46 | 0:31:49 | |
It made a huge impact worldwide. | 0:31:49 | 0:31:51 | |
I remember, as a teenager, reading about it and later watching footage. | 0:31:51 | 0:31:56 | |
It had an enormous impact on me. | 0:31:56 | 0:31:58 | |
It influenced my later decision to train as a doctor | 0:31:58 | 0:32:01 | |
with the intention of becoming a psychiatrist. | 0:32:01 | 0:32:03 | |
It's nearly 50 years since the original experiment and today, | 0:32:03 | 0:32:08 | |
I'm fortunate enough to be meeting one of the few remaining survivors. | 0:32:08 | 0:32:12 | |
-Good morning. -Morning. | 0:32:14 | 0:32:16 | |
-Hello, Michael Mosley. -Hi, Michael. | 0:32:16 | 0:32:18 | |
-Bill Menold. Nice to meet you. -Thank you very much. | 0:32:18 | 0:32:20 | |
'In 1962, Bill Menold had recently left the army and was working in New Haven.' | 0:32:20 | 0:32:26 | |
I happened to see an ad in the New Haven Register | 0:32:26 | 0:32:29 | |
and it said "memory and learning experiment". | 0:32:29 | 0:32:33 | |
They were going to pay you $4. I thought, | 0:33:33 | 0:33:37 | |
"Why wouldn't I do this?" | 0:32:37 | 0:32:39 | |
I've got some footage here from Yale University. | 0:32:39 | 0:32:42 | |
'Bill's participation in the experiment wasn't filmed, but other volunteers were.' | 0:32:42 | 0:32:47 | |
-This guy here... -He was the pupil. | 0:32:47 | 0:32:51 | |
This is basically the... experimenter. | 0:32:51 | 0:32:54 | |
-The experimenter. -The doctor. | 0:32:54 | 0:32:57 | |
The experiment involved a teacher and a learner, | 0:32:57 | 0:33:01 | |
who were given a simple set of memory tasks. | 0:33:01 | 0:33:04 | |
If the learner got the answers wrong, | 0:33:06 | 0:33:09 | |
the teacher had to give him electric shocks, which steadily increased. | 0:33:09 | 0:33:14 | |
Were you surprised when they talked about electric shocks? | 0:33:14 | 0:33:17 | |
I wasn't aghast or anything, | 0:33:17 | 0:33:19 | |
but I was, "Oh, electric shock, what a novel way to do it. | 0:33:19 | 0:33:24 | |
"Who knows what they'll think of next?" | 0:33:24 | 0:33:26 | |
What Bill and the other volunteers weren't told | 0:33:27 | 0:33:30 | |
was that the electric shocks | 0:33:30 | 0:33:31 | |
were fake and that both the experimenter and the learner | 0:33:31 | 0:33:35 | |
were, in fact, actors. | 0:33:35 | 0:33:36 | |
The real purpose of the experiment was to see how many electric shocks | 0:33:38 | 0:33:42 | |
the volunteers like Bill would administer. | 0:33:42 | 0:33:45 | |
It's very convincing, isn't it? | 0:33:47 | 0:33:49 | |
Oh, it is Academy Award nomination. | 0:33:49 | 0:33:52 | |
'Bill and the other volunteers were asked to increase the voltage | 0:33:52 | 0:33:56 | |
'every time the learner got an answer wrong.' | 0:33:56 | 0:33:59 | |
You're now getting a shock of 75 volts. | 0:33:59 | 0:34:03 | |
Ow! | 0:34:03 | 0:34:04 | |
180 volts. | 0:34:04 | 0:34:07 | |
Ow! | 0:34:07 | 0:34:08 | |
Just how far can you go on this thing? | 0:34:08 | 0:34:11 | |
Despite the apparent pain they were inflicting, | 0:34:11 | 0:34:15 | |
most continued to increase the shocks. | 0:34:15 | 0:34:18 | |
Incorrect. | 0:34:20 | 0:34:21 | |
150 volts. | 0:34:23 | 0:34:25 | |
Ah! | 0:34:25 | 0:34:26 | |
If you're sitting in that chair | 0:34:26 | 0:34:29 | |
with this stuff going on, | 0:34:29 | 0:34:30 | |
and the pressure that you're under, it's very hard to think clearly. | 0:34:30 | 0:34:37 | |
I'm up to 180 volts! | 0:34:37 | 0:34:39 | |
Please continue, teacher. | 0:34:39 | 0:34:42 | |
Neil, you're going to get a shock, 180 volts. | 0:34:42 | 0:34:46 | |
-SWITCH CLICKS -Ow! | 0:34:46 | 0:34:48 | |
Who's going to take the responsibility | 0:34:48 | 0:34:50 | |
if anything happens to that gentleman? | 0:34:50 | 0:34:52 | |
I'm responsible for anything that happens here. | 0:34:52 | 0:34:55 | |
Continue, please. | 0:34:55 | 0:34:57 | |
Milgram asked colleagues beforehand how many people they thought | 0:34:57 | 0:35:00 | |
would administer the lethal 450-volt shock. | 0:35:00 | 0:35:03 | |
Most said about 1%, and those would probably be psychopaths. | 0:35:03 | 0:35:08 | |
195 volts. Dance! | 0:35:08 | 0:35:12 | |
Ow! Let me out of here! | 0:35:12 | 0:35:15 | |
Then they really cranked it up. | 0:35:15 | 0:35:19 | |
You were talking about traumatic experiences. | 0:35:19 | 0:35:22 | |
I've never had anything before or since that was like that. | 0:35:22 | 0:35:27 | |
Start with blue, please, at the top of the page. | 0:35:27 | 0:35:30 | |
Continue, please, teacher. | 0:35:30 | 0:35:31 | |
Did you at any point think that perhaps you'd killed him? | 0:35:31 | 0:35:35 | |
Yes. | 0:35:35 | 0:35:37 | |
When he stopped responding. | 0:35:37 | 0:35:39 | |
The experiment requires that we continue. Go on, please. | 0:35:39 | 0:35:42 | |
Don't the man's health mean anything? | 0:35:42 | 0:35:45 | |
-Whether the learner likes it or not... -He might be dead in there! | 0:35:45 | 0:35:49 | |
I shut off my moral... | 0:35:49 | 0:35:51 | |
compass. | 0:35:53 | 0:35:55 | |
Continue, please. | 0:35:55 | 0:35:56 | |
435 volts. | 0:35:57 | 0:35:59 | |
Once you make the decision... | 0:36:01 | 0:36:04 | |
you've made your decision. | 0:36:05 | 0:36:08 | |
Let's get this show on the road and get it over with. | 0:36:08 | 0:36:11 | |
The answer is "horse". | 0:36:11 | 0:36:13 | |
450 volts. | 0:36:13 | 0:36:15 | |
Next word. | 0:36:17 | 0:36:18 | |
Bill, like two thirds of the volunteers, | 0:36:19 | 0:36:22 | |
gave the lethal electric shock. | 0:36:22 | 0:36:25 | |
I thought Bill was very honest, | 0:36:28 | 0:36:30 | |
particularly when he started to talk about that moment | 0:36:30 | 0:36:33 | |
when he abandoned his moral compass | 0:36:33 | 0:36:35 | |
and he handed over responsibility for his actions to the experimenter. | 0:36:35 | 0:36:39 | |
I guess that's what Milgram demonstrated. | 0:36:39 | 0:36:42 | |
We'd like to believe we're autonomous creatures, | 0:36:42 | 0:36:45 | |
that we follow some sort of moral principles. | 0:36:45 | 0:36:48 | |
But what Milgram showed is that, in the right circumstances, | 0:36:48 | 0:36:51 | |
you can persuade someone to do almost anything. | 0:36:51 | 0:36:56 | |
According to Milgram, Bill Menold and the other participants | 0:36:56 | 0:37:00 | |
who administered lethal shocks were not psychopaths. | 0:37:00 | 0:37:03 | |
They had simply demonstrated | 0:37:03 | 0:37:05 | |
a striking willingness to obey authority. | 0:37:05 | 0:37:09 | |
-Wrong. -It was this willingness to obey authority | 0:37:09 | 0:37:12 | |
that shaped and controlled their behaviour. | 0:37:12 | 0:37:15 | |
-SWITCH CLICKS -Ah! | 0:37:15 | 0:37:17 | |
Milgram said it didn't matter if you were German or American, | 0:37:18 | 0:37:22 | |
the power of authority remained the same. | 0:37:22 | 0:37:25 | |
Milgram's experiments catapulted him to worldwide fame, | 0:37:31 | 0:37:34 | |
and he used his numerous television appearances to spread his ideas | 0:37:34 | 0:37:39 | |
to a wider audience. | 0:37:39 | 0:37:40 | |
Now, one of the illusions about human behaviour | 0:37:40 | 0:37:44 | |
is that it stems entirely from personality or character. | 0:37:44 | 0:37:47 | |
But social psychology shows us that often, | 0:37:47 | 0:37:50 | |
behaviour is dominated | 0:37:50 | 0:37:51 | |
by the social roles we're asked to play. | 0:37:51 | 0:37:53 | |
Revealing the pervasive | 0:37:55 | 0:37:56 | |
and sometimes malign influence of authority | 0:37:56 | 0:37:59 | |
chimed well with the anti-authority sentiments of the late 1960s. | 0:37:59 | 0:38:03 | |
But Milgram's own motivation was not mistrust of authority. | 0:38:06 | 0:38:10 | |
He wanted to understand WHY authority has such a hold over us. | 0:38:10 | 0:38:15 | |
To do this, he took to the streets. | 0:38:16 | 0:38:19 | |
-Yes, sir, where to? -33 West 42nd. | 0:38:20 | 0:38:23 | |
Milgram started by pointing out how, unquestioningly, | 0:38:23 | 0:38:26 | |
we obey authority figures in a whole range of different situations. | 0:38:26 | 0:38:31 | |
In this setting, I allow things to be done to me | 0:38:31 | 0:38:35 | |
that I wouldn't allow in any other context. | 0:38:35 | 0:38:37 | |
The dentist is about to put his fist and electric drill into my mouth. | 0:38:37 | 0:38:42 | |
Open, please. | 0:38:43 | 0:38:45 | |
In this setting, | 0:38:46 | 0:38:48 | |
I willingly expose my throat to a man with a razor blade. | 0:38:48 | 0:38:53 | |
What he did next would throw new light on why we're so obedient. | 0:38:53 | 0:38:59 | |
Milgram wanted to see how people would behave | 0:38:59 | 0:39:01 | |
in a situation where there was no obvious authority. | 0:39:01 | 0:39:04 | |
How would you react if a perfect stranger came up to you | 0:39:04 | 0:39:08 | |
on the train and said, "I'd like your seat, please?" | 0:39:08 | 0:39:12 | |
Well, this is exactly what Milgram and his students did | 0:39:12 | 0:39:15 | |
on the New York subway. | 0:39:15 | 0:39:17 | |
Now, if you ask a New Yorker, | 0:39:17 | 0:39:19 | |
would he give up his seat to a man who gives no reason for asking, | 0:39:19 | 0:39:23 | |
he'd say, "Never." But what would he do? We photographed what happened. | 0:39:23 | 0:39:27 | |
Excuse me, ma'am, may I have your seat, please? | 0:39:27 | 0:39:30 | |
-All right. -Thank you. | 0:39:30 | 0:39:32 | |
Excuse me, sir, may I have your seat, please? | 0:39:32 | 0:39:36 | |
-I guess so. -Thank you very much. | 0:39:36 | 0:39:38 | |
But would it work today? | 0:39:41 | 0:39:43 | |
I decided to repeat his experiment in a London shopping centre. | 0:39:43 | 0:39:47 | |
Excuse me, could I have that seat, please? | 0:39:47 | 0:39:50 | |
-Huh? -Could I have that seat, please? | 0:39:50 | 0:39:53 | |
-Yeah? -Could I sit there? -Why? | 0:39:53 | 0:39:55 | |
I'd like to sit down. | 0:39:55 | 0:39:57 | |
If you gave me a reason... | 0:39:57 | 0:39:59 | |
-Excuse me, could I have that seat, please? -Yeah, sure. | 0:39:59 | 0:40:04 | |
Thank you very much. | 0:40:04 | 0:40:05 | |
-Excuse me, could I have your seat, please? -Yeah, sure. | 0:40:08 | 0:40:13 | |
Thank you very much. | 0:40:13 | 0:40:15 | |
I was surprised by how many people complied | 0:40:19 | 0:40:22 | |
with my totally unreasonable request - about half. | 0:40:22 | 0:40:25 | |
Excuse me, could I have that seat please? | 0:40:26 | 0:40:29 | |
-Yeah. -OK, thank you. | 0:40:29 | 0:40:30 | |
Excuse me, could I sit there, please? | 0:40:30 | 0:40:35 | |
-Why do you want to sit here? -I want to sit down. -Pardon? | 0:40:35 | 0:40:39 | |
-I want to sit down. -Why can't you sit here? -Can I sit there? | 0:40:39 | 0:40:44 | |
-Is that a yes or a no? -No. | 0:40:44 | 0:40:47 | |
But what really surprised me | 0:40:50 | 0:40:52 | |
was just how difficult I personally found the whole experience. | 0:40:52 | 0:40:56 | |
My real feeling at this moment is one of just profound relief | 0:40:56 | 0:41:02 | |
that it's done. | 0:41:02 | 0:41:03 | |
I felt really uncomfortable throughout the whole thing. | 0:41:03 | 0:41:07 | |
This is also what Milgram felt. | 0:41:07 | 0:41:09 | |
I stood in front of a passenger and I was about to say, | 0:41:09 | 0:41:13 | |
"Excuse me, sir, may I have your seat?" | 0:41:13 | 0:41:15 | |
I found something very interesting. There was an enormous inhibition. | 0:41:15 | 0:41:19 | |
The words wouldn't come out. I simply couldn't utter them. | 0:41:19 | 0:41:22 | |
There was a terrible constraint against saying this simple phrase. | 0:41:22 | 0:41:25 | |
Milgram concluded that feeling socially awkward | 0:41:25 | 0:41:29 | |
and embarrassed plays an important role in governing our behaviour. | 0:41:29 | 0:41:33 | |
We don't like breaking the social rules, | 0:41:33 | 0:41:35 | |
whether it's asking for someone's seat for no good reason, | 0:41:35 | 0:41:39 | |
or disobeying the instruction of somebody | 0:41:39 | 0:41:41 | |
whose authority we have accepted. | 0:41:41 | 0:41:42 | |
In ordinary everyday situations, | 0:41:42 | 0:41:45 | |
there's an implicit set of rules as to who's in charge. | 0:41:45 | 0:41:48 | |
If we violate these rules, it leads to the disruption of behaviour, | 0:41:48 | 0:41:51 | |
to embarrassment, awkwardness, to a kind of social chaos. | 0:41:51 | 0:41:54 | |
Ordinarily, we accept the submissive role that the occasion requires. | 0:41:54 | 0:42:00 | |
According to Milgram, | 0:42:02 | 0:42:04 | |
compliance in society and obedience to authority is necessary. | 0:42:04 | 0:42:09 | |
Our social organisation depends on it. | 0:42:09 | 0:42:11 | |
If we're to have a society, | 0:42:11 | 0:42:14 | |
we must have members of society willing to obey its laws. | 0:42:14 | 0:42:19 | |
It's not simply a question | 0:42:19 | 0:42:21 | |
of whether or not we want to be controlled. | 0:42:21 | 0:42:24 | |
According to Milgram, our social organisation demands it. | 0:42:24 | 0:42:27 | |
As a means of control, there's no doubt that social pressure | 0:42:36 | 0:42:39 | |
is ever-present and highly effective. | 0:42:39 | 0:42:43 | |
But social pressure, like drugs or behaviour modification, | 0:42:44 | 0:42:48 | |
or even Pavlovian training, | 0:42:48 | 0:42:50 | |
wasn't able to deliver total mind control. | 0:42:50 | 0:42:52 | |
There was, however, still one group of scientists who believed | 0:42:52 | 0:42:57 | |
they could control behaviour and emotions with pinpoint accuracy. | 0:42:57 | 0:43:01 | |
To do so, they would have to develop techniques | 0:43:04 | 0:43:08 | |
to go directly into the brain. | 0:43:08 | 0:43:11 | |
CRACKLING | 0:43:11 | 0:43:13 | |
Are you sleepy or awake or what? | 0:43:14 | 0:43:17 | |
I'm awake now. | 0:43:17 | 0:43:18 | |
The origins of this approach lie in a series of bizarre experiments | 0:43:18 | 0:43:23 | |
conducted by an ambitious doctor in the American Deep South. | 0:43:23 | 0:43:27 | |
Psychiatrist Robert Heath was a maverick and a pioneer. | 0:43:28 | 0:43:32 | |
From the early 1950s onwards, in New Orleans, he used electrodes | 0:43:32 | 0:43:37 | |
to stimulate and also to map out his patients' brains. | 0:43:37 | 0:43:42 | |
DOORBELL RINGS | 0:43:42 | 0:43:44 | |
-John! Hello, buddy. -Welcome, good to see you. | 0:43:44 | 0:43:47 | |
John Goethe and James Eaton | 0:43:47 | 0:43:49 | |
worked with Heath on many of his more ambitious experiments. | 0:43:49 | 0:43:53 | |
He wanted to be a discoverer. | 0:43:53 | 0:43:55 | |
-He wanted to find a new path. -But there was no question | 0:43:55 | 0:44:01 | |
that there was, um... | 0:44:01 | 0:44:03 | |
a light side and a dark side. | 0:44:03 | 0:44:06 | |
I really loved the guy, | 0:44:06 | 0:44:07 | |
but if you love someone, | 0:44:07 | 0:44:09 | |
and you're very close to them, | 0:44:09 | 0:44:10 | |
you see all sorts of warts. | 0:44:10 | 0:44:12 | |
He had plenty of warts. | 0:44:12 | 0:44:14 | |
Heath was not afraid of controversy. | 0:44:16 | 0:44:18 | |
He passionately believed you had to experiment on humans | 0:44:18 | 0:44:22 | |
to make truly significant medical breakthroughs. | 0:44:22 | 0:44:25 | |
He conducted his most controversial research in the state of Louisiana. | 0:44:27 | 0:44:33 | |
At the time, the consent process was not as rigorous as it is now. | 0:44:33 | 0:44:38 | |
His goal was ambitious - to see if he could alter behaviour | 0:44:39 | 0:44:43 | |
by sticking electrodes into people's brains. | 0:44:43 | 0:44:46 | |
Heath was inspired by experiments that had been done on rats. | 0:44:49 | 0:44:53 | |
Electrodes were implanted directly into areas of the rat's brain | 0:44:54 | 0:44:58 | |
believed to be responsible for pleasure. | 0:44:58 | 0:45:02 | |
When the rat pressed the lever, it stimulated these areas. | 0:45:02 | 0:45:07 | |
The scientists found that this process | 0:45:07 | 0:45:09 | |
could dramatically change the rat's normal behaviour. | 0:45:09 | 0:45:11 | |
BUZZING | 0:45:14 | 0:45:15 | |
This rat is prepared to run across an electrified floor | 0:45:15 | 0:45:18 | |
to reach the lever, press it, and get a pleasure kick. | 0:45:18 | 0:45:22 | |
Heath decided to take the extreme step of trying this on humans. | 0:45:27 | 0:45:32 | |
How do you feel? | 0:45:34 | 0:45:37 | |
It's...like coming on. | 0:45:37 | 0:45:41 | |
Hmm? | 0:45:41 | 0:45:42 | |
-Like coming on. -Like something coming on? | 0:45:42 | 0:45:45 | |
What do you mean? | 0:45:45 | 0:45:47 | |
'The depth electrode procedure' | 0:45:49 | 0:45:52 | |
involved placing these very thin wires through the skull | 0:45:52 | 0:45:57 | |
and into precisely located areas | 0:45:57 | 0:46:00 | |
thought to be important in the regulation of emotion. | 0:46:00 | 0:46:04 | |
This idea that the emotions are deep in the temporal lobes, | 0:46:04 | 0:46:08 | |
rather than being on the surface, | 0:46:08 | 0:46:10 | |
was an idea shared by many, | 0:46:10 | 0:46:12 | |
but he's the one who actually put a needle, put an electrode in, | 0:46:12 | 0:46:17 | |
in there and turned on the juice. | 0:46:17 | 0:46:19 | |
Can you keep pushing this yourself? | 0:46:19 | 0:46:22 | |
-Yes, sir. -All right, push this button. | 0:46:22 | 0:46:25 | |
'They had these little boxes' | 0:46:25 | 0:46:27 | |
and they had the electrodes in their head. | 0:46:27 | 0:46:31 | |
Whenever they felt they needed a little goose or something, | 0:46:31 | 0:46:35 | |
again, I'm simplifying it, but they could press it. | 0:46:35 | 0:46:38 | |
All right, push this button. | 0:46:38 | 0:46:40 | |
You know how to work this. | 0:46:40 | 0:46:42 | |
Heath's early patients were schizophrenics, | 0:46:42 | 0:46:45 | |
severe epileptics and depressives. | 0:46:45 | 0:46:47 | |
I feel like something is coming on. | 0:46:47 | 0:46:50 | |
-All right. Does it feel? How does it feel? Good or bad? -Good. | 0:46:50 | 0:46:54 | |
-Good? -Yes, sir. | 0:46:54 | 0:46:56 | |
Like the rats, | 0:46:56 | 0:46:57 | |
these human patients were able to stimulate their own brains | 0:46:57 | 0:47:01 | |
and use this pleasurable sensation to alleviate their symptoms. | 0:47:01 | 0:47:05 | |
-Well, I think it's... I think it's somewhat of a sexy button. -Yeah? | 0:47:08 | 0:47:14 | |
Encouraged by his results, Heath began to explore ways of using | 0:47:14 | 0:47:17 | |
deep brain stimulation to create new behaviours. | 0:47:17 | 0:47:21 | |
His most infamous experiment was on a 24-year-old man | 0:47:23 | 0:47:27 | |
with a long history of depression who had never had sex with a woman. | 0:47:27 | 0:47:31 | |
He approached Heath and asked him to change him, | 0:47:33 | 0:47:36 | |
to change his behaviour, | 0:47:36 | 0:47:38 | |
because he was gay and he wanted to become straight. | 0:47:38 | 0:47:42 | |
At the time, being gay was regarded as a psychiatric illness, | 0:47:42 | 0:47:45 | |
so Heath agreed. | 0:47:45 | 0:47:46 | |
He wasn't the first to do this. Other people had tried, | 0:47:46 | 0:47:51 | |
but nobody had done so using electrical stimulation. | 0:47:51 | 0:47:54 | |
Heath implanted a series of electrodes | 0:47:55 | 0:47:57 | |
into the pleasure regions of the man's brain. | 0:47:57 | 0:48:00 | |
Then he was shown some soft porn. | 0:48:00 | 0:48:03 | |
The idea was to link, in the man's mind, the pleasure he was getting | 0:48:05 | 0:48:09 | |
from the electrical stimulation | 0:48:09 | 0:48:11 | |
with the heterosexual behaviour he was watching on the screen. | 0:48:11 | 0:48:16 | |
At the end of the three-week period, | 0:48:18 | 0:48:21 | |
the man reported that he had growing sexual interest in women. | 0:48:21 | 0:48:25 | |
So Heath decided to go to the next stage. Now, this really is bizarre. | 0:48:25 | 0:48:31 | |
What he did is, he hired a prostitute for $50, | 0:48:31 | 0:48:35 | |
a woman he called a "lady of the night". | 0:48:35 | 0:48:38 | |
Her job was to have sex with a gay man | 0:48:38 | 0:48:41 | |
with electrodes sticking out of his head. | 0:48:41 | 0:48:44 | |
The wires led to recording devices in the next room, | 0:48:44 | 0:48:47 | |
where the psychiatrists were keeping a record | 0:48:47 | 0:48:49 | |
of what was going on inside his skull. | 0:48:49 | 0:48:52 | |
Well, astonishingly enough, I mean, I'm absolutely gobsmacked, | 0:48:54 | 0:48:58 | |
but astonishingly enough, according to Heath, | 0:48:58 | 0:49:01 | |
intercourse was successful. | 0:49:01 | 0:49:04 | |
In fact, the man said he'd had so much fun, | 0:49:04 | 0:49:07 | |
got so much pleasure out of it, | 0:49:07 | 0:49:10 | |
he would like to go and do it again and again. | 0:49:10 | 0:49:14 | |
Unlikely as it sounds, | 0:49:14 | 0:49:16 | |
Heath's use of electrode stimulation seemed to have worked. | 0:49:16 | 0:49:20 | |
But his patient's desire to become heterosexual | 0:49:20 | 0:49:22 | |
wasn't entirely fulfilled. | 0:49:22 | 0:49:24 | |
The young man went off and he did actually begin | 0:49:26 | 0:49:29 | |
an affair with a married woman that lasted for ten months. | 0:49:29 | 0:49:32 | |
But he also continued to have sex with men. | 0:49:32 | 0:49:35 | |
So a rather mixed outcome, | 0:49:35 | 0:49:37 | |
and Heath himself never tried to repeat the experiment. | 0:49:37 | 0:49:41 | |
-Huh? -Ooh... -You all right? -I don't like that. | 0:49:44 | 0:49:46 | |
You don't like that? | 0:49:46 | 0:49:48 | |
Heath's methods were primitive, | 0:49:48 | 0:49:50 | |
but they did pave the way for some important medical developments. | 0:49:50 | 0:49:54 | |
Today, the technique of deep brain stimulation has been refined | 0:49:57 | 0:50:02 | |
and improved, and is now used to treat a range of conditions. | 0:50:02 | 0:50:05 | |
It has been particularly successful for sufferers of Parkinson's. | 0:50:08 | 0:50:13 | |
Electrodes are placed in the areas of the brain | 0:50:14 | 0:50:17 | |
that control movement and help to reduce symptoms. | 0:50:17 | 0:50:20 | |
However, implanting electrodes involves major surgery | 0:50:22 | 0:50:26 | |
and the risks are considerable. | 0:50:26 | 0:50:29 | |
Although it is a huge improvement | 0:50:30 | 0:50:32 | |
on what went before, electrical stimulation is still fairly crude. | 0:50:32 | 0:50:36 | |
Even the most carefully positioned of electrodes | 0:50:36 | 0:50:39 | |
stimulates a large area of brain matter, | 0:50:39 | 0:50:42 | |
consisting of thousands of cells. | 0:50:42 | 0:50:44 | |
But just recently, the technology has moved on. | 0:50:44 | 0:50:47 | |
Here at Oxford University, | 0:50:49 | 0:50:51 | |
they have found new ways to manipulate the brain | 0:50:51 | 0:50:55 | |
and control behaviour using a technique called optogenetics. | 0:50:55 | 0:50:59 | |
With optogenetics, scientists can control the brain | 0:51:01 | 0:51:04 | |
simply using light. | 0:51:04 | 0:51:05 | |
I'm here to meet neuroscientist Gero Miesenbock, | 0:51:05 | 0:51:09 | |
who's been refining his technique on flies. | 0:51:09 | 0:51:13 | |
-Hello, there. -Hello. | 0:51:13 | 0:51:15 | |
-Michael Mosley. -Gero Miesenbock. | 0:51:15 | 0:51:17 | |
I gather you are lord of the flies? | 0:51:17 | 0:51:19 | |
In some circles, I am, yes. | 0:51:19 | 0:51:21 | |
And how many flies do you have here? | 0:51:21 | 0:51:24 | |
We haven't really counted them, but we have thousands of, erm, | 0:51:24 | 0:51:28 | |
vials like these. | 0:51:28 | 0:51:29 | |
Each vial has maybe 50-100 flies in them, so you do the math. | 0:51:29 | 0:51:32 | |
It's probably millions buzzing around somewhere. | 0:51:32 | 0:51:35 | |
-Millions. You are lord of the flies! -Yeah. | 0:51:35 | 0:51:38 | |
Miesenbock earned his nickname | 0:51:40 | 0:51:42 | |
through his ability to control and manipulate the choices flies make. | 0:51:42 | 0:51:47 | |
We've come quite far in controlling not only simple motor acts | 0:51:47 | 0:51:52 | |
of the animals, but actually their psychology in rather profound ways. | 0:51:52 | 0:51:57 | |
I don't think of flies as having a psychology! | 0:51:57 | 0:52:00 | |
You just haven't worked with them for long enough. | 0:52:00 | 0:52:03 | |
Can we see it in action? | 0:52:03 | 0:52:04 | |
-Yeah, let's take a look. -OK. | 0:52:04 | 0:52:07 | |
Controlling a fly's psychology is possible. | 0:52:07 | 0:52:10 | |
Of the 200,000 or so cells that make up their brains, | 0:54:10 | 0:54:14 | |
he has managed to isolate a few that are responsible for learning. | 0:52:14 | 0:52:19 | |
We did find a particular small group of just 12 cells | 0:52:19 | 0:52:22 | |
lying in the central brain. | 0:52:22 | 0:52:25 | |
These 12 cells don't dictate behaviour per se. | 0:52:25 | 0:52:28 | |
They dictate whether behaviour should be changed. | 0:52:28 | 0:52:31 | |
And dictating whether behaviour should be changed | 0:52:31 | 0:52:34 | |
is a crucial part of learning. | 0:52:34 | 0:52:35 | |
Miesenbock likens the role of this group of cells | 0:52:35 | 0:52:39 | |
to that of a nagging critic. | 0:52:39 | 0:52:41 | |
Critic is, if you wish, | 0:52:41 | 0:52:43 | |
the structure that holds the key to intelligence. | 0:52:43 | 0:52:47 | |
It's the structure that decides whether something is good or bad, | 0:52:47 | 0:52:50 | |
whether it's right or wrong. | 0:52:50 | 0:52:52 | |
Sort of the brain's version of the Catholic Church, | 0:52:52 | 0:52:55 | |
if you're an Austrian like me, | 0:52:55 | 0:52:57 | |
or the superego if you're Freudian or your mother if you're Jewish. | 0:52:57 | 0:53:02 | |
Miesenbock has genetically engineered his flies | 0:53:02 | 0:53:06 | |
so that he can switch on this negative nagging critical voice | 0:53:06 | 0:53:10 | |
simply by exposing them to blue light. | 0:53:10 | 0:53:13 | |
When the blue light comes on, the cells are activated | 0:53:13 | 0:53:16 | |
and the flies start to learn. | 0:53:16 | 0:53:18 | |
The flies are taught to dislike whatever is in their surroundings | 0:53:18 | 0:53:22 | |
when the blue light is on. This is then stored as a bad memory. | 0:53:22 | 0:53:27 | |
In this experiment, | 0:53:27 | 0:53:28 | |
the flies are taught to dislike the smell of liquorice. | 0:53:28 | 0:53:32 | |
Each of these chambers contains just one fly. | 0:53:32 | 0:53:35 | |
-Right. -And the fly walks up and down the length of the chamber. | 0:53:35 | 0:53:39 | |
And, um, the left and the right half of the chamber | 0:53:39 | 0:53:43 | |
are each filled with a different odour. | 0:53:43 | 0:53:45 | |
One of these smells a bit like, as they say, a tennis shoe in July. | 0:53:45 | 0:53:50 | |
-Right. -And the other one a little bit like liquorice. -Liquorice, OK. | 0:53:50 | 0:53:53 | |
You can see that, before our optogenetic intervention, | 0:53:53 | 0:53:59 | |
the fly just explores the chamber in its entirety. | 0:53:59 | 0:54:02 | |
They start off not caring one way or the other | 0:54:02 | 0:54:05 | |
-and you're going to manipulate their choice? -Exactly. | 0:54:05 | 0:54:08 | |
When the blue light is switched on, | 0:54:10 | 0:54:12 | |
the chamber is flooded with the smell of liquorice. | 0:54:12 | 0:54:15 | |
OK, now, the effect of this light is to turn on the 12 cells | 0:54:15 | 0:54:19 | |
of the critic in the fly's brain. | 0:54:19 | 0:54:21 | |
-And that then leads to the implantation of memory. -Right. | 0:54:21 | 0:54:25 | |
When the blue light switches off, | 0:54:26 | 0:54:28 | |
the chamber returns to its original smells. | 0:54:28 | 0:54:31 | |
Tennis shoes on the left, liquorice on the right. | 0:54:31 | 0:54:34 | |
So now, because of that blue light, | 0:54:34 | 0:54:38 | |
the fly has come to associate the smell of liquorice | 0:54:38 | 0:54:41 | |
with something it doesn't like, | 0:54:41 | 0:54:42 | |
something it should avoid. Right. | 0:54:42 | 0:54:45 | |
What you see is, it's backtracking already. It made a mistake. | 0:54:45 | 0:54:49 | |
It stepped into the odour of liquorice, | 0:54:49 | 0:54:51 | |
which is on the right here. | 0:54:51 | 0:54:52 | |
Now it's back in tennis shoe, which is on the left. | 0:54:52 | 0:54:56 | |
And you see, whenever it reaches | 0:54:56 | 0:54:58 | |
the interface between the two odours, it recoils. | 0:54:58 | 0:55:01 | |
You can almost see how it goes, "Eugh, this is awful!" | 0:55:01 | 0:55:06 | |
-And turns around. -It's very impressive. | 0:55:06 | 0:55:08 | |
The ability of optogenetics | 0:55:08 | 0:55:10 | |
to pinpoint and act on very specific cells | 0:55:10 | 0:55:13 | |
in the brain suggests it could be used in the future | 0:55:13 | 0:55:17 | |
for modifying human behaviour. | 0:55:17 | 0:55:19 | |
I think there are medical benefits in many possible directions, | 0:55:19 | 0:55:23 | |
targeting the neurons that matter for a particular brain function, | 0:55:23 | 0:55:26 | |
be it appetite control, | 0:55:26 | 0:55:28 | |
mood, anxiety, sleep and wakefulness, attention, you name it. | 0:55:28 | 0:55:32 | |
But there are some major obstacles | 0:55:32 | 0:55:34 | |
to optogenetically controlling humans. | 0:55:34 | 0:55:37 | |
I think it's not something that's around the corner | 0:55:37 | 0:55:41 | |
in the next few years, because, remember, | 0:55:41 | 0:55:43 | |
for optogenetics to work, | 0:55:43 | 0:55:44 | |
you would have to genetically modify the brain... | 0:55:44 | 0:55:47 | |
Indeed, which we are some way off doing at the moment. | 0:55:47 | 0:55:50 | |
Exactly. A fly normally doesn't mind, but I certainly would | 0:55:50 | 0:55:55 | |
if you wanted to implant light receptors into my brain. | 0:55:55 | 0:55:58 | |
Now, that really was quite astonishing, | 0:56:02 | 0:56:05 | |
the idea that you can control | 0:56:05 | 0:56:07 | |
really quite sophisticated behaviour in a fly | 0:56:07 | 0:56:11 | |
just by manipulating 12 neurons. | 0:56:11 | 0:56:12 | |
It's astonishing and also slightly worrying, | 0:56:12 | 0:56:15 | |
because it makes you wonder whether the same is true of humans. | 0:56:15 | 0:56:18 | |
But I think that because human brains are so much bigger | 0:56:18 | 0:56:22 | |
and more complex than insect brains, | 0:56:22 | 0:56:25 | |
it is going to be a long time, | 0:56:25 | 0:56:27 | |
if ever, before technology like that is applied to us. | 0:56:27 | 0:56:31 | |
I've no doubt the technology used to study the brain | 0:56:37 | 0:56:40 | |
will continue to advance... | 0:56:40 | 0:56:42 | |
And that in doing so, we will make giant strides in understanding | 0:56:44 | 0:56:48 | |
the workings of our own minds. | 0:56:48 | 0:56:50 | |
But what I find reassuring is that, | 0:56:52 | 0:56:54 | |
although many people have tried to manipulate our minds, | 0:56:54 | 0:56:57 | |
the human brain has so far proved to be too complex to finely control, | 0:56:57 | 0:57:03 | |
our personalities too resilient. | 0:57:03 | 0:57:06 | |
Down the years, the brain has been pushed around, | 0:57:07 | 0:57:11 | |
probed, drugged and electrified. | 0:57:11 | 0:57:14 | |
But it remains impressively hard to control, | 0:57:16 | 0:57:20 | |
and I find that very comforting. | 0:57:20 | 0:57:22 |