Line | From | To | |
---|---|---|---|
What makes us good... | 0:00:05 | 0:00:07 | |
or evil? | 0:00:07 | 0:00:08 | |
Scientists are daring to investigate this unsettling question. | 0:00:10 | 0:00:16 | |
They're trying to peel back the mask of the psychopathic killer. | 0:00:21 | 0:00:25 | |
I hate to use the term evil, but there is something pretty scary about them. | 0:00:29 | 0:00:33 | |
These are people without a conscience. | 0:00:33 | 0:00:35 | |
What separates us from these terrifying people? | 0:00:36 | 0:00:40 | |
Psychopaths really aren't the kind of person you think they are. | 0:00:41 | 0:00:45 | |
They're exposing the biology that divides vice from virtue. | 0:00:45 | 0:00:49 | |
If there's a chemical involved, we can not only measure it but we can manipulate it. | 0:00:50 | 0:00:56 | |
What they're finding reveals something about the good and evil in us all. | 0:00:56 | 0:01:01 | |
Bingo. When we broke the code, there it was. | 0:01:01 | 0:01:04 | |
That group were the killers. | 0:01:04 | 0:01:06 | |
Who, or what, is evil? | 0:01:06 | 0:01:10 | |
I killed Leslie Bradshaw. | 0:01:11 | 0:01:14 | |
Now, scientists are rewriting our ideas of right and wrong, | 0:01:14 | 0:01:17 | |
even of crime and punishment. | 0:01:17 | 0:01:19 | |
This is what one might call novel science, in that it is a new kind of science. | 0:01:19 | 0:01:24 | |
I think this did affect whether he would live or die. | 0:01:24 | 0:01:28 | |
What they're finding could turn your world upside down. | 0:01:28 | 0:01:33 | |
In London, a group of researchers have devised a rather unusual experiment. | 0:01:42 | 0:01:47 | |
They wanted to see, if we have a moral instinct, | 0:01:50 | 0:01:53 | |
what might it look like in action? | 0:01:53 | 0:01:56 | |
They've invited volunteers to face a stark moral choice. | 0:01:58 | 0:02:02 | |
But they've added a twist. | 0:02:02 | 0:02:05 | |
They're not going to rely on what their volunteers say they would do, | 0:02:05 | 0:02:09 | |
but what they actually do. | 0:02:09 | 0:02:11 | |
They've created an alternative world. | 0:02:13 | 0:02:16 | |
No-one really knows themselves that well to know how they would respond in an extreme situation. | 0:02:19 | 0:02:24 | |
Now, we wouldn't want to manufacture extreme situations in physical reality. | 0:02:24 | 0:02:29 | |
But in virtual reality, you can. | 0:02:29 | 0:02:31 | |
Everybody knows what they do has no real consequences. | 0:02:34 | 0:02:38 | |
But nevertheless, there's a basic part of the brain that doesn't know virtual reality. | 0:02:38 | 0:02:43 | |
It just makes people respond like they would in reality, at maybe a lesser level of intensity. | 0:02:43 | 0:02:49 | |
The volunteers find themselves in an art gallery. | 0:02:50 | 0:02:54 | |
Their role is to operate the lift and take visitors to the first floor. | 0:02:54 | 0:02:59 | |
Five people are on the first floor and one on the ground floor. | 0:03:02 | 0:03:06 | |
A man comes in and asks to be taken to the first floor. | 0:03:08 | 0:03:12 | |
GUNSHOTS AND SCREAMING | 0:03:15 | 0:03:18 | |
The man has started shooting the five people upstairs. | 0:03:22 | 0:03:26 | |
Do they move him down, risking the life of one person on the ground floor in the hope of saving the five? | 0:03:26 | 0:03:32 | |
Or do they do nothing? | 0:03:32 | 0:03:36 | |
If they move the gunman down, they will be responsible for the death of that one person. | 0:03:36 | 0:03:40 | |
If they leave him there, more people will die. | 0:03:40 | 0:03:44 | |
But it won't be their fault. | 0:03:44 | 0:03:46 | |
They actually have to do the action. | 0:03:47 | 0:03:50 | |
They actually have to make the lift come down. | 0:03:50 | 0:03:52 | |
It's like they're pressing a button to potentially kill one. | 0:03:52 | 0:03:56 | |
One of the questions Mel is asking is how volunteers make this tough decision. | 0:03:56 | 0:04:03 | |
GUNSHOTS AND SCREAMING | 0:04:08 | 0:04:10 | |
I wasn't really thinking too much. | 0:04:12 | 0:04:16 | |
I definitely acted with my emotions in there. | 0:04:16 | 0:04:19 | |
Once he started shooting it was very much instinctive. | 0:04:19 | 0:04:23 | |
I should get him out of the way of these people. | 0:04:23 | 0:04:25 | |
GUNSHOTS AND SCREAMING | 0:04:25 | 0:04:27 | |
I was stressed. I panicked. | 0:04:28 | 0:04:32 | |
I was surprised and I couldn't manage to operate the buttons properly. | 0:04:32 | 0:04:38 | |
Because I lost the plot! Basically. | 0:04:38 | 0:04:42 | |
I was thinking, it's going to be a case of lots of people turn up at once, who do I put across? | 0:04:46 | 0:04:50 | |
And then a gunshot happens. | 0:04:55 | 0:04:58 | |
GUNSHOTS | 0:04:58 | 0:04:59 | |
And all that logical thinking goes out of the window and you have to revert back to your instincts. | 0:05:05 | 0:05:11 | |
I tried to move him down as quickly as possible, pressed the wrong button! | 0:05:13 | 0:05:16 | |
Mel studied hundreds of people and has found a consistent pattern. | 0:05:18 | 0:05:23 | |
I think it's a conflict between reason and emotion. | 0:05:23 | 0:05:27 | |
There's an immediate reaction, an immediate need to do something. | 0:05:27 | 0:05:31 | |
Then layered on top of this, a bit slower, is the cognitive response, | 0:05:31 | 0:05:36 | |
the kind of rational, analytic response. | 0:05:36 | 0:05:38 | |
But by that time it's already too late. | 0:05:38 | 0:05:41 | |
Something had to change, something had to move. | 0:05:44 | 0:05:47 | |
It just seemed better to get him downstairs, back where he came from. | 0:05:47 | 0:05:51 | |
They may, post-hoc, be making a rationalisation. | 0:05:51 | 0:05:55 | |
Yes, I wanted to save the majority. | 0:05:55 | 0:05:57 | |
But my guess is that they just react out of instinct. | 0:05:57 | 0:06:00 | |
I have to do something and do it fast. | 0:06:00 | 0:06:03 | |
I didn't really think about the guy on the ground floor. | 0:06:03 | 0:06:05 | |
I just thought, get him away from where there's lots of people. | 0:06:05 | 0:06:07 | |
I don't think there was time to calculate or come up with anything clever. | 0:06:07 | 0:06:10 | |
You just did what seemed the right thing to do. | 0:06:10 | 0:06:13 | |
I couldn't control the lift as well as I would have hoped to have done. | 0:06:13 | 0:06:16 | |
I forgot which was left and right. | 0:06:16 | 0:06:18 | |
GUNSHOTS | 0:06:18 | 0:06:19 | |
In the heat of the moment, they'd all instinctively tried to save the five. | 0:06:19 | 0:06:24 | |
But once Emma had time to apply reason, she started to have doubts. | 0:06:24 | 0:06:29 | |
I don't know... I'd like to tell myself I did that because in killing that one... | 0:06:29 | 0:06:35 | |
If you yourself aided that, that's a bad thing and that would be on your conscience. | 0:06:35 | 0:06:39 | |
It was confusing! | 0:06:39 | 0:06:41 | |
No-one is right in these circumstances. | 0:06:42 | 0:06:45 | |
If they choose to save the one, they're doing a moral act. | 0:06:45 | 0:06:47 | |
If they choose to save the five, they're doing a moral act. | 0:06:47 | 0:06:54 | |
Empirically, the majority do decide to try to save the five and sacrifice the one. | 0:06:54 | 0:06:58 | |
I get the impression that people are moral beings and people really care about other people. | 0:06:58 | 0:07:03 | |
And they try to do the best in the situation. | 0:07:03 | 0:07:06 | |
And the remarkable thing is that they are driven, even though these are not actual human beings. | 0:07:06 | 0:07:13 | |
It seems that when we are confronted with a difficult moral choice, we're confused, distressed. | 0:07:15 | 0:07:22 | |
We may not know the right thing to do | 0:07:22 | 0:07:25 | |
but we seem to have a moral impulse to try and do good. | 0:07:25 | 0:07:29 | |
But just how embedded is this feeling? | 0:07:31 | 0:07:34 | |
And where does it come from? | 0:07:34 | 0:07:36 | |
What instincts, if any, are we born with? | 0:07:38 | 0:07:41 | |
At Yale University, scientists have designed an ingenious experiment. | 0:07:49 | 0:07:54 | |
They wanted to see if babies are born good or bad. | 0:07:56 | 0:07:59 | |
Hundreds of parents have volunteered their children. | 0:08:01 | 0:08:04 | |
The two scientists behind the project are Karen Wynn and Paul Bloom. | 0:08:06 | 0:08:10 | |
I would give a year of my life to spend five minutes as a baby. | 0:08:18 | 0:08:22 | |
To recapture what it feels like to be that sort of creature. | 0:08:22 | 0:08:27 | |
I'm interested in the origin of morality, the origin of good and evil. | 0:08:31 | 0:08:35 | |
We want to see what people start off with. | 0:08:36 | 0:08:38 | |
Do they start off with a moral sense? | 0:08:38 | 0:08:41 | |
With good impulses or evil impulses? | 0:08:41 | 0:08:43 | |
And when you have a sense of that, you can ask, | 0:08:44 | 0:08:46 | |
how does this develop into the adult sense of right and wrong, adult moral behaviour? | 0:08:46 | 0:08:53 | |
They wanted to find out what is in a baby's brain. | 0:08:54 | 0:08:59 | |
To try and unlock this secret, they've devised a kind of morality play that each baby will watch. | 0:08:59 | 0:09:05 | |
So, this character has a ball that he is playing with. | 0:09:08 | 0:09:11 | |
And he passes it to this other fellow, | 0:09:11 | 0:09:15 | |
who returns it in a nice, reciprocal manner. | 0:09:15 | 0:09:19 | |
But now, he's playing with his ball again. | 0:09:25 | 0:09:29 | |
And he is now going to pass it to this other fellow, | 0:09:31 | 0:09:34 | |
who takes it and runs away with it. | 0:09:34 | 0:09:36 | |
What they're waiting to see is which character the baby will prefer. | 0:09:39 | 0:09:43 | |
But how will they know? | 0:09:45 | 0:09:47 | |
As an adult seeing this, the person who gave back the ball is good. | 0:09:49 | 0:09:55 | |
The person who ran away with the ball is kind of a jerk. | 0:09:55 | 0:09:57 | |
And for an adult, you just say, "Who's the good guy and who's the bad guy?" | 0:09:59 | 0:10:02 | |
You can't do that with a young baby. | 0:10:02 | 0:10:03 | |
So what you do is you hold them out and you get the baby to choose. | 0:10:04 | 0:10:08 | |
The experimenter who hands the two puppets to the baby, | 0:10:08 | 0:10:12 | |
she doesn't know which puppet was the good one and which puppet was the bad one. | 0:10:12 | 0:10:17 | |
So she can't unconsciously influence the baby's preference. | 0:10:17 | 0:10:20 | |
Happy? Hi! | 0:10:20 | 0:10:24 | |
Do you remember these guys from the show? | 0:10:24 | 0:10:26 | |
Look at me. Which one do you like? | 0:10:26 | 0:10:28 | |
CHILD GURGLES | 0:10:28 | 0:10:29 | |
Which one do you like? | 0:10:32 | 0:10:34 | |
CHILD CHATTERS | 0:10:34 | 0:10:35 | |
-That one? Good job! -OK, that was the nice one. | 0:10:37 | 0:10:41 | |
Aasrith had chosen the good puppet. | 0:10:42 | 0:10:44 | |
The fact is, about 70% of babies do. | 0:10:44 | 0:10:47 | |
Paul and Karen believe this is a sign that these babies are drawn towards kindness. | 0:10:50 | 0:10:56 | |
And that this is a glimmer of a moral feeling. | 0:10:56 | 0:11:00 | |
I was surprised that these experiments worked out as well as they did. | 0:11:04 | 0:11:07 | |
These are very strong findings and we get them over and over again. | 0:11:09 | 0:11:12 | |
That one? All right! Good job! | 0:11:14 | 0:11:16 | |
I think we are tapping something that babies feel strongly about. | 0:11:16 | 0:11:20 | |
These are not subtle effects. | 0:11:20 | 0:11:23 | |
Rather, babies analyse these scenes in a rich and powerful way and respond accordingly. | 0:11:23 | 0:11:29 | |
But if 70% choose the good guy, that leaves 30% who don't. | 0:11:31 | 0:11:36 | |
So what does that say about those babies? | 0:11:38 | 0:11:41 | |
We have always wondered about that. | 0:11:41 | 0:11:43 | |
What do you do about the babies who reach for the bad guy? | 0:11:43 | 0:11:46 | |
The sort of sexy explanation is these are psychopath babies, | 0:11:46 | 0:11:49 | |
babies who see the world differently. | 0:11:49 | 0:11:51 | |
Who actually prefer the bad guy. | 0:11:51 | 0:11:54 | |
And I think that's a logical possibility. | 0:11:55 | 0:11:58 | |
I think it's more likely that in any experiment you run, | 0:11:58 | 0:12:01 | |
if you test 100 babies, 20 of them are going to act funny, no matter what. | 0:12:01 | 0:12:06 | |
Because they just fall asleep or they get distracted. | 0:12:06 | 0:12:09 | |
It's an open question whether babies who reach for the bad guy | 0:12:10 | 0:12:14 | |
are different kinds of babies with a different moral code. | 0:12:14 | 0:12:19 | |
As opposed to, it's just noise, it's the sort of noise you get in every experiment. | 0:12:19 | 0:12:23 | |
The high percentage of babies who do pick the good puppet is striking. | 0:12:25 | 0:12:29 | |
These are the first experiments to show that a moral instinct really seems present in babies. | 0:12:31 | 0:12:37 | |
I'd suggest the moral sense we have as adults is already present by the time we reach our first birthday. | 0:12:38 | 0:12:45 | |
Most of us seem to start life with good impulses, not bad. | 0:12:49 | 0:12:53 | |
The inclination to help each other, to empathise, seems to be built into our brains. | 0:12:54 | 0:13:00 | |
We feel distress when we see someone in pain. | 0:13:01 | 0:13:05 | |
But why? | 0:13:05 | 0:13:06 | |
The fact that this is such a strong feeling has inspired a new and bold scientific quest. | 0:13:08 | 0:13:14 | |
Human beings are obsessed with morality. | 0:13:26 | 0:13:29 | |
We need to know why people are doing what they're doing. | 0:13:29 | 0:13:33 | |
And I, indeed, am as obsessed with morality. | 0:13:33 | 0:13:36 | |
I really want to know when people are good and evil, and why that occurs. | 0:13:36 | 0:13:40 | |
Paul Zak is a neuroscientist. | 0:13:40 | 0:13:43 | |
His mission is to try and trace the basis of our morality. | 0:13:45 | 0:13:49 | |
So I was really looking for a chemical basis for these behaviours. | 0:13:49 | 0:13:54 | |
If there is a chemical involved, that means we can not only measure it but we can manipulate it. | 0:13:54 | 0:14:00 | |
Paul wanted to find the actual chemicals that drive our behaviour. | 0:14:01 | 0:14:06 | |
And to do that, he's doing an experiment he's never done before. | 0:14:06 | 0:14:10 | |
He's bringing his lab outside to see if he can catch good, co-operative behaviour. | 0:14:13 | 0:14:19 | |
He's using a group of people who don't know each other well | 0:14:25 | 0:14:29 | |
but are going to have to work together if they want to succeed. | 0:14:29 | 0:14:33 | |
Hi, guys, thanks for coming out and burning part of your Saturday to be with us. | 0:14:39 | 0:14:44 | |
So, we want to do this experiment where we want to find out how do you bond as a group? | 0:14:44 | 0:14:49 | |
And that's intuitive. | 0:14:49 | 0:14:50 | |
But we want to find some neuroscience behind this. | 0:14:50 | 0:14:52 | |
Paul thinks their brain chemistry may undergo a transformation. | 0:14:55 | 0:14:59 | |
They may release a chemical that will make them feel empathy. | 0:15:00 | 0:15:04 | |
If this is true, this chemical could be driving our morality. | 0:15:05 | 0:15:10 | |
It could be the moral molecule. | 0:15:11 | 0:15:13 | |
We see lots of cooperation in the world but we don't know why. | 0:15:17 | 0:15:21 | |
So I began wondering if there was an underlying biological basis for co-operation. | 0:15:21 | 0:15:26 | |
If there was, | 0:15:26 | 0:15:28 | |
could there be an underlying chemical foundation for this? | 0:15:28 | 0:15:32 | |
One of the chemicals he's interested in he knows is active within families. | 0:15:32 | 0:15:37 | |
But he has never looked for it in a team of relative strangers. | 0:15:37 | 0:15:41 | |
This chemical, oxytocin, that motivates co-operation, is triggered in a variety of ways. | 0:15:42 | 0:15:48 | |
It's released in little children when they are nurtured by their mothers when they are breast-fed. | 0:15:48 | 0:15:52 | |
It's released during touch. | 0:15:52 | 0:15:54 | |
We found it's even released when complete strangers trust us. | 0:15:54 | 0:15:58 | |
So the next question we want to ask is, | 0:15:58 | 0:16:00 | |
are there a variety of rituals that may induce oxytocin release | 0:16:00 | 0:16:04 | |
and lead to bonding among groups, even among groups of strangers? | 0:16:04 | 0:16:08 | |
One of these rituals could indeed be the pre-match warm-up. | 0:16:10 | 0:16:13 | |
So we are going to take a baseline blood draw | 0:16:15 | 0:16:17 | |
and find out what his baseline physiologic state is | 0:16:17 | 0:16:20 | |
and then we'll measure after the warm-up how it changes. | 0:16:20 | 0:16:23 | |
So each individual is different so it's important to get a baseline for each individual. | 0:16:23 | 0:16:27 | |
OK. | 0:16:27 | 0:16:29 | |
What they are really doing is training their movements together, | 0:16:50 | 0:16:53 | |
getting in sync, so they are actually forming themselves as almost a super organism. | 0:16:53 | 0:16:58 | |
Coming together, they're warming up their muscles. | 0:17:00 | 0:17:04 | |
At the same time, they are warming up their brains. | 0:17:04 | 0:17:07 | |
Their brains are starting to bond together. | 0:17:07 | 0:17:10 | |
So as a group they can be aggressive against the other team. | 0:17:10 | 0:17:13 | |
OK, we're starting the second blood draw in just a minute. | 0:17:16 | 0:17:19 | |
At the end of the warm-up, Paul prepares for the second blood draw. | 0:17:20 | 0:17:24 | |
As the blood was sent back to the lab, the match got underway. | 0:17:27 | 0:17:32 | |
This is where we are seeing the real payoff from that warm-up. | 0:17:34 | 0:17:38 | |
They're working as a group. | 0:17:38 | 0:17:40 | |
Oh, look at him go! He almost made it. | 0:17:40 | 0:17:42 | |
Two weeks later and Paul had the results. | 0:17:45 | 0:17:48 | |
This is a brand-new experiment. | 0:17:48 | 0:17:49 | |
We don't really know what we're going to find. | 0:17:49 | 0:17:52 | |
The oxytocin levels of the players had, in fact, converged, | 0:17:53 | 0:17:56 | |
getting them in sync with each other. | 0:17:56 | 0:17:59 | |
This would have helped them feel bonded | 0:17:59 | 0:18:01 | |
and confirms what Paul has found in his many laboratory experiments. | 0:18:01 | 0:18:06 | |
Oxytocin seems to be the key to empathy. | 0:18:06 | 0:18:11 | |
I call oxytocin the moral molecule. | 0:18:11 | 0:18:13 | |
When oxytocin is released we feel empathy, we feel attachment, we connect to people. | 0:18:15 | 0:18:20 | |
But the results showed something else. | 0:18:21 | 0:18:24 | |
Another hormone, testosterone, had increased. | 0:18:24 | 0:18:28 | |
And this drives aggressive behaviour. | 0:18:28 | 0:18:31 | |
So is testosterone the opposite of the moral molecule? | 0:18:31 | 0:18:36 | |
Oxytocin makes us more selfless and testosterone makes us more selfish. | 0:18:38 | 0:18:43 | |
What's interesting in the ritual setting like with the rugby team | 0:18:43 | 0:18:47 | |
is that sometimes these run together. | 0:18:47 | 0:18:49 | |
So if the rugby players want to be both selfless, they want to support their team | 0:18:49 | 0:18:54 | |
but also selfish, they want to grab goals from the other team. | 0:18:54 | 0:18:58 | |
And that's pretty interesting, so they're not always in conflict. | 0:18:58 | 0:19:01 | |
Paul believes that what happens on the sports field reflects our moral battlefield in life. | 0:19:01 | 0:19:09 | |
The way to think about this is that rugby is like society in miniature. | 0:19:09 | 0:19:12 | |
We have to co-operate as a group to achieve a goal. | 0:19:12 | 0:19:15 | |
But yet we have another group that's trying to stop us from doing that. | 0:19:15 | 0:19:18 | |
So there's a balance between testosterone and oxytocin. | 0:19:18 | 0:19:21 | |
There's a way to understand how societies work. | 0:19:21 | 0:19:23 | |
What we experience as a battle between good and evil may be a chemical battle waging inside us. | 0:19:28 | 0:19:36 | |
Perhaps being moral means achieving a balance. | 0:19:38 | 0:19:41 | |
For each one of us, that process will be different. | 0:19:41 | 0:19:45 | |
But what happens if you try and disturb that balance? | 0:19:50 | 0:19:53 | |
If you make someone more aggressive than they naturally are? | 0:19:53 | 0:19:58 | |
What does it do to a human if you suppress their own moral instinct? | 0:19:59 | 0:20:04 | |
MILITARY DRUMROLL | 0:20:06 | 0:20:09 | |
Not all experiments are planned. | 0:20:14 | 0:20:17 | |
Here in Quantico, Virginia, | 0:20:17 | 0:20:20 | |
Marines are part of a radical training programme | 0:20:20 | 0:20:23 | |
that has implications far beyond this camp. | 0:20:23 | 0:20:27 | |
The two warriors at the centre of it are Captain Hoban | 0:20:31 | 0:20:35 | |
and Lieutenant Colonel Shushko. | 0:20:35 | 0:20:40 | |
HE YELLS | 0:20:40 | 0:20:41 | |
'Marines know from day one | 0:20:41 | 0:20:43 | |
'if they are given a mission, they have got to accomplish it.' | 0:20:43 | 0:20:47 | |
That does mean they will have to take a life, | 0:20:47 | 0:20:50 | |
so how do you train to take a life? | 0:20:50 | 0:20:53 | |
You're going to grab your training knives, batons, | 0:20:53 | 0:20:57 | |
and then set up on LZ6... | 0:20:57 | 0:20:59 | |
'What the marines must do goes against | 0:20:59 | 0:21:01 | |
'their natural moral instinct.' | 0:21:01 | 0:21:04 | |
It's not in human nature to take somebody's life. | 0:21:05 | 0:21:08 | |
I don't think it's easy to kill somebody. | 0:21:08 | 0:21:10 | |
It's not easy to even think about killing somebody. | 0:21:10 | 0:21:13 | |
Because it's so unnatural, the marines learn step by step. | 0:21:15 | 0:21:19 | |
'When we start a marine out, we do have him crawl before he walks, | 0:21:19 | 0:21:23 | |
'before he runs.' | 0:21:23 | 0:21:24 | |
To the head, to the head! | 0:21:24 | 0:21:26 | |
'Learning the basics of standing, falling | 0:21:26 | 0:21:29 | |
'throwing punches, throwing kicks, being thrown.' | 0:21:29 | 0:21:32 | |
Then we add a simple thing like a knife. | 0:21:32 | 0:21:35 | |
'How to hold a knife, how to use a knife with your good hand, | 0:21:35 | 0:21:38 | |
'with your bad hand.' | 0:21:38 | 0:21:40 | |
Switch hands with the knife, kill him! | 0:21:40 | 0:21:42 | |
'How to use a baton. Same thing, how to use a pistol.' | 0:21:42 | 0:21:46 | |
Then you practise it over and over again. | 0:21:46 | 0:21:48 | |
They start getting more confident, so it becomes second nature. | 0:21:48 | 0:21:52 | |
From constantly training, it becomes muscle memory | 0:21:54 | 0:21:58 | |
and your body naturally reacts. | 0:21:58 | 0:21:59 | |
I'm a firm believer that if you practise something, | 0:22:03 | 0:22:05 | |
after 21 days it becomes a habit. | 0:22:05 | 0:22:07 | |
The act of repetition aims to push men over the natural barrier | 0:22:07 | 0:22:12 | |
that holds them back from harming. | 0:22:12 | 0:22:17 | |
A combat mindset is being able to turn the switch on when you have to. | 0:22:17 | 0:22:22 | |
So that when the time comes | 0:22:22 | 0:22:25 | |
and it's my life or yours | 0:22:25 | 0:22:27 | |
or an innocent bystander's life, we know how to react. | 0:22:27 | 0:22:30 | |
And we don't think twice about it. | 0:22:30 | 0:22:33 | |
-So if that means killing someone, you're able to do it? -Yes. | 0:22:33 | 0:22:37 | |
Equipping them with the ability to kill must be combined | 0:22:41 | 0:22:46 | |
with the motivation to do so. | 0:22:46 | 0:22:48 | |
In the past, that motivation was often hate. | 0:22:48 | 0:22:53 | |
When I was trained, we were trained as killers. | 0:22:56 | 0:22:59 | |
The easy way out. | 0:22:59 | 0:23:00 | |
Fill in the blank with some pejorative of a subhuman, | 0:23:00 | 0:23:04 | |
whether it's a gook in Vietnam | 0:23:04 | 0:23:07 | |
or a hajj in the conflicts we are in now, | 0:23:07 | 0:23:10 | |
"He did this, he's subhuman. Kill him like an animal." | 0:23:10 | 0:23:15 | |
Well, it turns out that you can try to take that approach, | 0:23:15 | 0:23:19 | |
but it comes back to haunt you. | 0:23:19 | 0:23:22 | |
What they found was that ignoring the Marines' | 0:23:26 | 0:23:30 | |
natural sense of morality was starting to destroy them. | 0:23:30 | 0:23:34 | |
We had people abusing their families and their wives. | 0:23:37 | 0:23:40 | |
They're knocked so far off that way that now everybody is the enemy. | 0:23:40 | 0:23:44 | |
They start to lose respect for all life, | 0:23:44 | 0:23:47 | |
including former friends and family. | 0:23:47 | 0:23:50 | |
Taking away all their ethical parameters | 0:23:50 | 0:23:53 | |
was removing something fundamental to their brains. | 0:23:53 | 0:23:57 | |
Human beings are not natural killers. | 0:23:57 | 0:24:00 | |
When we have tried, over the centuries, | 0:24:00 | 0:24:03 | |
to make soldiers more effective killers, | 0:24:03 | 0:24:06 | |
we may have been effective in the short run. | 0:24:06 | 0:24:09 | |
But when they get back afterwards | 0:24:09 | 0:24:12 | |
and they think about what they've done, | 0:24:12 | 0:24:14 | |
it's not psychologically healthy for them. | 0:24:14 | 0:24:18 | |
They had to come up with a new plan that somehow worked | 0:24:18 | 0:24:21 | |
with their moral instinct, not against it. | 0:24:21 | 0:24:24 | |
What we did is we backed up and thought "What are people?" | 0:24:24 | 0:24:27 | |
If they're not killers, what are they? | 0:24:28 | 0:24:31 | |
If you think about it, people are naturally protectors. | 0:24:31 | 0:24:36 | |
So if you work off that, | 0:24:36 | 0:24:39 | |
would people protect and defend | 0:24:39 | 0:24:42 | |
to the point of killing? | 0:24:42 | 0:24:45 | |
Yes, they would. But only when necessary | 0:24:45 | 0:24:47 | |
to protect and defend life. | 0:24:47 | 0:24:49 | |
What we think is if we calibrate their moral compass, | 0:24:51 | 0:24:55 | |
make them ethical warriors, | 0:24:55 | 0:24:58 | |
whose mission is to protect and defend life, | 0:24:58 | 0:25:02 | |
killing only when necessary to protect life. | 0:25:02 | 0:25:05 | |
'So for the Marines, morality has had to become part | 0:25:08 | 0:25:12 | |
'of their new narrative. | 0:25:12 | 0:25:15 | |
'It seems our moral instinct cannot be suppressed | 0:25:15 | 0:25:19 | |
'without paying a heavy price.' | 0:25:19 | 0:25:22 | |
Don't fly too early. You don't want to get tracked. | 0:25:22 | 0:25:25 | |
But if our natural instinct is to do no harm, | 0:25:28 | 0:25:32 | |
how can we explain those who seem totally devoid of this feeling? | 0:25:32 | 0:25:36 | |
Who have no revulsion at taking a life? | 0:25:36 | 0:25:40 | |
Scientists have embarked on a new, dark voyage to understand evil. | 0:25:44 | 0:25:49 | |
They've turned to the serial killer psychopath. | 0:25:49 | 0:25:54 | |
One man has done more than anyone to understand the mind of a psychopath. | 0:26:04 | 0:26:09 | |
Psychologist Professor Bob Hare | 0:26:11 | 0:26:14 | |
set out on this trail 30 years ago. | 0:26:14 | 0:26:17 | |
He was determined to penetrate what lay beneath the mask. | 0:26:17 | 0:26:22 | |
I'm looking at two pictures of very well-known, | 0:26:27 | 0:26:30 | |
infamous serial killers, Jeffrey Dahmer and Ted Bundy. | 0:26:30 | 0:26:34 | |
When you look at the pictures, you see ordinary people. | 0:26:34 | 0:26:37 | |
That was their strong point. They looked perfectly ordinary | 0:26:37 | 0:26:40 | |
when they were out in society. | 0:26:40 | 0:26:43 | |
After the fact, of course, we realise they were far from normal. | 0:26:43 | 0:26:47 | |
They're very deviant, cold-blooded killers. | 0:26:47 | 0:26:50 | |
This was a world he came into by accident. | 0:26:52 | 0:26:55 | |
I needed a job, | 0:26:56 | 0:26:57 | |
the only job I could find at the time was as the sole psychologist | 0:26:57 | 0:27:01 | |
at the B.C. Penitentiary, | 0:27:01 | 0:27:03 | |
a maximum security institute near Vancouver. | 0:27:03 | 0:27:06 | |
Bob found himself face to face with psychopaths. | 0:27:06 | 0:27:10 | |
When I was first starting out, I had no idea at all | 0:27:10 | 0:27:13 | |
about the sorts of people with whom I was dealing. | 0:27:13 | 0:27:16 | |
They were people. Some would be very difficult to deal with. | 0:27:16 | 0:27:21 | |
You could see there was something strange about them, even predatory. | 0:27:21 | 0:27:25 | |
I hate to use the term evil, but something pretty scary about them. | 0:27:25 | 0:27:29 | |
But many of them were open, warm-appearing people | 0:27:29 | 0:27:33 | |
until you find out what they've done. | 0:27:33 | 0:27:36 | |
'NEWS REPORT: Parts of 11 different bodies were found in Dahmer's flat | 0:27:43 | 0:27:46 | |
'when he was arrested on Tuesday.' | 0:27:46 | 0:27:48 | |
He wanted to find the rules of a psychopath. | 0:27:48 | 0:27:52 | |
'I'm the same person I was on the street, except for the killing. | 0:27:52 | 0:27:56 | |
'I'm still someone's brother, someone's son.' | 0:27:56 | 0:28:01 | |
I tried to find out what makes them tick. What they have in common | 0:28:01 | 0:28:04 | |
and why do they have these things in common. | 0:28:04 | 0:28:07 | |
'I never once had any guilt.' | 0:28:07 | 0:28:10 | |
Who are they? | 0:28:10 | 0:28:12 | |
How do we go about assessing this guy, saying he's psychopathic? | 0:28:12 | 0:28:16 | |
'I never once shed any tears.' | 0:28:16 | 0:28:18 | |
We had to have some sort of diagnostic criteria. | 0:28:18 | 0:28:21 | |
Bob drew up a checklist | 0:28:21 | 0:28:23 | |
defining their core personality traits. | 0:28:23 | 0:28:27 | |
The essential features of psychopathy | 0:28:29 | 0:28:32 | |
would include a profound lack of empathy. | 0:28:32 | 0:28:35 | |
I don't mean a general, I mean a profound lack of empathy. | 0:28:35 | 0:28:39 | |
A general callousness towards other people. | 0:28:39 | 0:28:42 | |
These are people without a conscience, shallow emotions. | 0:28:42 | 0:28:45 | |
"I'm number one in the universe, there's nobody else." | 0:28:45 | 0:28:49 | |
He then devised an experiment looking into their brains. | 0:28:54 | 0:28:58 | |
A psychopathic killer who volunteered was Anthony Frazzel. | 0:29:04 | 0:29:08 | |
If we assume, and the evidence supports this, that psychopaths | 0:29:08 | 0:29:13 | |
have a severely blunted emotional life, | 0:29:13 | 0:29:16 | |
the emotions aren't as powerful, | 0:29:16 | 0:29:18 | |
they don't have the same range they have for most people, | 0:29:18 | 0:29:21 | |
it should be reflected in things like their language. | 0:29:21 | 0:29:24 | |
Bob showed Frazzel both real and made-up words | 0:29:27 | 0:29:30 | |
and asked him to spot the difference. | 0:29:30 | 0:29:33 | |
Some of those words would have an emotional charge. | 0:29:33 | 0:29:37 | |
Most people can decide very quickly when it's emotional. | 0:29:37 | 0:29:41 | |
More quickly than a neutral word. There's a difference. | 0:29:41 | 0:29:43 | |
The brain responses to the emotional words are quite different | 0:29:43 | 0:29:47 | |
than they are to neutral words. | 0:29:47 | 0:29:50 | |
For psychopaths, there was absolutely no difference. | 0:29:50 | 0:29:53 | |
A word was a word was a word. | 0:29:53 | 0:29:55 | |
The word rape had the same emotional impact as the word table or tree. | 0:29:55 | 0:30:02 | |
He ran the experiment with dozens of psychopaths | 0:30:02 | 0:30:06 | |
and got the same result. | 0:30:06 | 0:30:08 | |
The results were so dramatic that reviewers simply didn't believe the findings came from real people. | 0:30:08 | 0:30:14 | |
So we rewrote it, explained it in more detail | 0:30:14 | 0:30:18 | |
and sent it into another top journal, Psychophysiology. | 0:30:18 | 0:30:21 | |
It was published and it's actually a seminal study. | 0:30:21 | 0:30:24 | |
Bob Hare identified one of the lines that might separate good from evil. | 0:30:30 | 0:30:36 | |
It was our emotions. | 0:30:36 | 0:30:39 | |
Psychopaths simply did not seem to have the feelings of empathy | 0:30:39 | 0:30:43 | |
that stop the rest of us from harming. | 0:30:43 | 0:30:45 | |
The search had begun. | 0:30:46 | 0:30:49 | |
What else could we see if we peered into the brain of the psychopath? | 0:30:49 | 0:30:55 | |
In California, neuroscientist Jim Fallon found himself almost by accident picking up the quest. | 0:31:08 | 0:31:16 | |
He had specialised in standard clinical disorders. | 0:31:18 | 0:31:22 | |
Now he was about to become an expert in the brains of psychopaths. | 0:31:23 | 0:31:28 | |
I spent a lot of my research career | 0:31:32 | 0:31:34 | |
looking at different brain abnormalities. | 0:31:34 | 0:31:37 | |
Mostly schizophrenia but also depression and addictions of different sorts. | 0:31:37 | 0:31:41 | |
And then my colleagues started to do something different. | 0:31:41 | 0:31:45 | |
They asked him to analyse a variety of brain scans. | 0:31:46 | 0:31:51 | |
What he didn't know was that some of them were the brain scans of murderers. | 0:31:51 | 0:31:57 | |
They brought me these scans and said, | 0:31:57 | 0:31:58 | |
"What do you think of these? What do you see?" | 0:31:58 | 0:32:01 | |
There were normals mixed in. | 0:32:06 | 0:32:08 | |
People with schizophrenia, depression and there were killers. | 0:32:08 | 0:32:11 | |
But I didn't know the mix. It was just like, "Here it is." | 0:32:11 | 0:32:14 | |
About halfway through I noticed a pattern. It was fascinating. | 0:32:17 | 0:32:21 | |
This one group, no matter what other damage they had or didn't have, | 0:32:21 | 0:32:24 | |
they always had damage to the orbital cortex, above the eyes. | 0:32:24 | 0:32:27 | |
The other part of the brain that looked like it wasn't working right | 0:32:31 | 0:32:34 | |
was the front part of the temporal lobe which houses the amygdala. | 0:32:34 | 0:32:38 | |
That is where your different animal drives are. | 0:32:38 | 0:32:41 | |
I said, "This is extraordinary." | 0:32:41 | 0:32:43 | |
So I separate out the piles and I said, "This is a different group." | 0:32:43 | 0:32:47 | |
And bingo, when we broke the code, there it was. | 0:32:48 | 0:32:52 | |
That group were the killers. | 0:32:52 | 0:32:53 | |
It was really one of those "ah-hah" moments. | 0:32:53 | 0:32:57 | |
The areas that looked abnormal | 0:32:58 | 0:33:00 | |
were crucial for controlling impulsivity and emotions. | 0:33:00 | 0:33:05 | |
Fallon's images seem to confirm what Hare's work had suggested. | 0:33:05 | 0:33:11 | |
It looked like we were getting closer to the signature brain profile of the serial killer. | 0:33:11 | 0:33:17 | |
This is about as dramatic as a difference can be in a PET scan. | 0:33:17 | 0:33:22 | |
It's just a mind-blower, really. | 0:33:22 | 0:33:25 | |
The location of these abnormalities | 0:33:26 | 0:33:29 | |
indicated to Jim | 0:33:29 | 0:33:30 | |
why psychopaths could be driven towards extreme behaviour. | 0:33:30 | 0:33:34 | |
Just to get up to the point of being satisfied, of feeding the amygdala, | 0:33:37 | 0:33:42 | |
that whole system, some of these psychopaths do extraordinary things. | 0:33:42 | 0:33:46 | |
Somebody like that may have to fly to Vegas and get drunk, | 0:33:46 | 0:33:50 | |
and be with a bunch of prostitutes or snort cocaine, | 0:33:50 | 0:33:53 | |
or kill somebody over and over again. | 0:33:53 | 0:33:56 | |
It really indicated that there was a biological basis, | 0:34:02 | 0:34:05 | |
a really hardcore brain basis, for this urge to kill. | 0:34:05 | 0:34:09 | |
That brings the other question, | 0:34:11 | 0:34:13 | |
is that enough to cause someone to be a psychopath or a killer? | 0:34:13 | 0:34:18 | |
Or are there other factors? | 0:34:18 | 0:34:21 | |
That moment was immediately followed by a bunch of question marks. | 0:34:21 | 0:34:26 | |
Once it seemed that the brains of psychopaths were different, | 0:34:30 | 0:34:35 | |
the next urgent question was why? | 0:34:35 | 0:34:38 | |
Back in Vancouver, the direction seemed clear. | 0:34:39 | 0:34:43 | |
The path to pursue was genes. | 0:34:43 | 0:34:45 | |
Once we had determined there were certain differences in brain function and structure, | 0:34:46 | 0:34:51 | |
the next question is, where do they originate from? | 0:34:51 | 0:34:56 | |
That brings up questions of genetic factors. | 0:34:56 | 0:34:58 | |
All behaviour, all physical features have strong genetic contributions. | 0:35:00 | 0:35:05 | |
The search was on. | 0:35:07 | 0:35:10 | |
Were there genes linked to violence? | 0:35:10 | 0:35:13 | |
In 1993, the breakthrough came with one family's history. | 0:35:14 | 0:35:19 | |
Here all the men had a background of violence | 0:35:19 | 0:35:23 | |
and all lacked the same gene. | 0:35:23 | 0:35:25 | |
There was one gene that was missing | 0:35:28 | 0:35:30 | |
and it was in the men and all these men were violent. | 0:35:30 | 0:35:33 | |
What was important was that the loss of one gene profoundly affected behaviour. | 0:35:36 | 0:35:41 | |
That kind of supported the idea that one gene really controlled behaviour. | 0:35:41 | 0:35:46 | |
It then emerged that just being born with one variant of this gene | 0:35:46 | 0:35:51 | |
could also predispose you to violent behaviour. | 0:35:51 | 0:35:54 | |
The MAO-A gene became known as the warrior gene. | 0:35:54 | 0:36:00 | |
That was pretty exciting because it implied first off | 0:36:00 | 0:36:03 | |
that we could identify specific contributing factors to psychopathy | 0:36:03 | 0:36:09 | |
but also because it suggests that | 0:36:09 | 0:36:13 | |
this particular area of research is bound to be fruitful. | 0:36:13 | 0:36:17 | |
It seemed that it could be possible to trace the hallmark of evil | 0:36:21 | 0:36:26 | |
in people's brains and genes. | 0:36:26 | 0:36:29 | |
So did this mean that if you had both elements | 0:36:29 | 0:36:33 | |
you were destined to become a killer? | 0:36:33 | 0:36:36 | |
For Jim Fallon, this question was about to become deeply personal. | 0:36:36 | 0:36:40 | |
At a regular family party, | 0:36:56 | 0:36:58 | |
a casual remark by his mother took him by surprise. | 0:36:58 | 0:37:02 | |
As we were discussing this, and different brains, I said to him, | 0:37:04 | 0:37:09 | |
"You should look into your own history." | 0:37:09 | 0:37:14 | |
I said, "Did you ever hear of Lizzie Borden?" | 0:37:14 | 0:37:16 | |
and I started telling the story about Lizzie Borden | 0:37:16 | 0:37:20 | |
and how she had murdered her father and mother. | 0:37:20 | 0:37:23 | |
I said, "There's a cousin of yours." | 0:37:23 | 0:37:26 | |
Well, he was shocked | 0:37:28 | 0:37:30 | |
and, of course, started to delve a little further into this. | 0:37:30 | 0:37:33 | |
It was pretty startling. | 0:37:35 | 0:37:37 | |
I knew it was true. She doesn't make things up. | 0:37:37 | 0:37:39 | |
There were quite a few murderers in that family. | 0:37:39 | 0:37:42 | |
At least 16 murderers in the one line. | 0:37:45 | 0:37:50 | |
Hearing this, Jim took the bold decision to run a check | 0:37:52 | 0:37:56 | |
on the entire family for the genes and brain structure | 0:37:56 | 0:37:59 | |
linked to violent psychopathic behaviour. | 0:37:59 | 0:38:03 | |
The results of the brain scans came back first. | 0:38:04 | 0:38:07 | |
There were many, many sheets and they all looked normal. Fantastic. | 0:38:10 | 0:38:14 | |
And then I came to one and it was the last one, as it turns out, | 0:38:14 | 0:38:18 | |
and it looked very abnormal. | 0:38:18 | 0:38:21 | |
This particular PET scan had no orbital cortex activity. | 0:38:21 | 0:38:28 | |
It had no temporal lobe activity. | 0:38:28 | 0:38:30 | |
The whole limbic system was not functioning. | 0:38:30 | 0:38:33 | |
I said, "Oh my god, it's one of these killers. | 0:38:33 | 0:38:36 | |
"It's the exact same pattern as a killer." | 0:38:36 | 0:38:39 | |
When I looked down at the code, it wasn't one of the killers. It was me. | 0:38:39 | 0:38:43 | |
It was really a shock but I tried to think, "That's really interesting." | 0:38:45 | 0:38:50 | |
"I'm not in jail, haven't killed anybody or done that stuff. | 0:38:50 | 0:38:53 | |
"At least I don't have the genes. I just have the brain pattern." | 0:38:53 | 0:38:57 | |
I said, "OK". I felt better. | 0:38:57 | 0:39:00 | |
He then did the gene tests, looking not only for the warrior gene | 0:39:00 | 0:39:04 | |
but for other traits, like impulsivity, | 0:39:04 | 0:39:07 | |
that make up the profile of a psychopath. | 0:39:07 | 0:39:11 | |
Back came the results. | 0:39:11 | 0:39:13 | |
Again, everybody had a mix of things in our family. | 0:39:13 | 0:39:16 | |
It looked like an average mix of these different genes | 0:39:16 | 0:39:21 | |
that have to do with aggression and all sorts of behaviours, | 0:39:21 | 0:39:25 | |
except now again there was one that showed all of these high risk genes. | 0:39:25 | 0:39:29 | |
And it was mine. | 0:39:29 | 0:39:30 | |
What are the odds of getting these? | 0:39:32 | 0:39:34 | |
To throw the dice 20 times and it comes up six-six, six-six, six-six? | 0:39:34 | 0:39:38 | |
It's millions to one. | 0:39:38 | 0:39:40 | |
Now Jim started asking himself some unsettling questions. | 0:39:41 | 0:39:46 | |
This really became probably more serious in my mind | 0:39:47 | 0:39:53 | |
because it's like, who am I really? | 0:39:53 | 0:39:56 | |
People with far less dangerous genetics | 0:39:56 | 0:39:59 | |
than I had become killers and psychopaths. | 0:39:59 | 0:40:02 | |
I had almost all of them. | 0:40:02 | 0:40:04 | |
But the reaction from his family was to unsettle him even further. | 0:40:06 | 0:40:10 | |
I knew there was always something off. | 0:40:12 | 0:40:14 | |
It makes more sense now that it's clear he does have the brain | 0:40:14 | 0:40:20 | |
and genetics of a psychopath. | 0:40:20 | 0:40:23 | |
It all falls into place, as it were. | 0:40:23 | 0:40:25 | |
He's got a hot head. | 0:40:27 | 0:40:29 | |
Everything you'd want in a serial killer, he has in a fundamental way. | 0:40:29 | 0:40:36 | |
Because I've been scared of him a few times. | 0:40:36 | 0:40:39 | |
Thanks. | 0:40:41 | 0:40:43 | |
It was surprising but not surprising. | 0:40:43 | 0:40:45 | |
Because he really is in a way, two different people. | 0:40:45 | 0:40:48 | |
Even though he's always been very funny and gregarious, | 0:40:48 | 0:40:52 | |
he's always had a stand-offish part to him. | 0:40:52 | 0:40:55 | |
And that's always been there. That's always been there. | 0:40:55 | 0:40:58 | |
We'll drink to Shannon who's not here. | 0:41:00 | 0:41:02 | |
Having heard what his family thought, | 0:41:02 | 0:41:04 | |
Jim felt forced to be honest with himself. | 0:41:04 | 0:41:08 | |
I have characteristics or traits, | 0:41:08 | 0:41:10 | |
some of which are psychopathic, yes. | 0:41:10 | 0:41:14 | |
I could blow off an aunt's funeral if I thought there was a party that day. | 0:41:15 | 0:41:19 | |
I would just take off. | 0:41:19 | 0:41:21 | |
And that's not right. | 0:41:21 | 0:41:22 | |
The thing is I know that now but I still don't care. | 0:41:22 | 0:41:26 | |
And so I know something's wrong, but I still don't care. | 0:41:26 | 0:41:31 | |
I don't know how else to put that, you're in a position where, | 0:41:34 | 0:41:39 | |
that's not right, I don't give a shit. | 0:41:39 | 0:41:41 | |
And that's the truth. | 0:41:41 | 0:41:42 | |
But Jim still had a puzzle to solve. | 0:41:42 | 0:41:45 | |
If he had the brain and the genes of the killer, why wasn't he one? | 0:41:45 | 0:41:50 | |
The answer is that whether genes are triggered or not will depend | 0:41:50 | 0:41:55 | |
on what happens in your childhood. | 0:41:55 | 0:41:57 | |
Simply having the warrior gene doesn't necessarily mean you'll be violent. | 0:41:57 | 0:42:03 | |
If you have the high-risk form of the gene and you were abused early on in life, | 0:42:03 | 0:42:08 | |
your chances of a life of crime are much higher. | 0:42:08 | 0:42:11 | |
If you have the high-risk gene but you weren't abused, | 0:42:11 | 0:42:15 | |
then there really wasn't much risk. | 0:42:15 | 0:42:17 | |
So just a gene by itself, the variant doesn't really dramatically affect behaviour, | 0:42:17 | 0:42:22 | |
but under certain environmental conditions, a big difference. | 0:42:22 | 0:42:25 | |
And that was a very profound finding. | 0:42:25 | 0:42:28 | |
So what was it about Jim's environment that cancelled out his unlucky genes? | 0:42:30 | 0:42:35 | |
It turns out I had an unbelievably wonderful childhood. | 0:42:39 | 0:42:42 | |
When I went back to look at old movies and pictures, | 0:42:45 | 0:42:48 | |
I was smiling and as happy as a lark. | 0:42:48 | 0:42:51 | |
You can see it all the way through my life. | 0:42:51 | 0:42:54 | |
There's a good chance that offset all these genetic factors, | 0:42:54 | 0:42:59 | |
the brain development and everything. | 0:42:59 | 0:43:01 | |
And it washed that away. | 0:43:01 | 0:43:02 | |
It seems your genes can increase your chances of being a violent psychopath. | 0:43:13 | 0:43:19 | |
Though it's your environment that shapes whether you'll ever be one. | 0:43:19 | 0:43:24 | |
But understanding the world of the psychopath is now leading | 0:43:24 | 0:43:28 | |
scientists beyond prison walls. | 0:43:28 | 0:43:31 | |
Scientists could be looking for psychopaths in a place near you. | 0:43:36 | 0:43:39 | |
When you walk in the city, you're not thinking of psychopaths. | 0:43:53 | 0:43:57 | |
And yet the chances of passing one are higher than you think. | 0:43:57 | 0:44:02 | |
Psychopaths have been adopting a camouflage, | 0:44:02 | 0:44:05 | |
taking even the experts by surprise. | 0:44:05 | 0:44:09 | |
I met my first psychopath a little over 25 years ago. | 0:44:12 | 0:44:15 | |
And it wasn't someone in a prison, | 0:44:17 | 0:44:20 | |
it was someone who was working for a company where I was a consultant. | 0:44:20 | 0:44:25 | |
When I talked to people about it, half thought he was a wonderful leader. | 0:44:29 | 0:44:33 | |
The other half of the team members felt quite the opposite. | 0:44:33 | 0:44:37 | |
They thought he was the devil incarnate. | 0:44:37 | 0:44:38 | |
So I was somewhat puzzled by this and I called Bob Hare, | 0:44:38 | 0:44:45 | |
and at the end of the conversation he said, yep, you got one. | 0:44:45 | 0:44:49 | |
When Paul called me and described the characteristics in the people he was dealing with, | 0:44:50 | 0:44:55 | |
the concept hit me right between the eyes, of course. | 0:44:55 | 0:44:58 | |
Paul applied Bob's psychopathy checklist, | 0:44:58 | 0:45:01 | |
and found this leader fitted the profile. | 0:45:01 | 0:45:04 | |
His high status had hidden the truth. | 0:45:04 | 0:45:07 | |
Psychopaths really aren't the kind of person you think they are. | 0:45:07 | 0:45:11 | |
In fact, you could be living with one, married to one | 0:45:11 | 0:45:14 | |
for 20 years or more and not know that that person is a psychopath. | 0:45:14 | 0:45:19 | |
In more modern times, we've identified individuals | 0:45:22 | 0:45:25 | |
who we might label the successful psychopath. | 0:45:25 | 0:45:28 | |
Whom do you think of when you hear the term psychopath? | 0:45:30 | 0:45:33 | |
Most likely it's Hannibal Lecter, or some other serial killer. | 0:45:33 | 0:45:37 | |
But the actual behaviours they engage in will depend upon | 0:45:37 | 0:45:42 | |
the context, on how bright you are. | 0:45:42 | 0:45:44 | |
What do you look like? What kind of upbringing have you had? | 0:45:44 | 0:45:47 | |
Being a psychopath doesn't mean you can't get a job. | 0:45:47 | 0:45:51 | |
Part of the problem is that the very things we're | 0:45:51 | 0:45:54 | |
looking for in our leaders, the psychopath can easily mimic. | 0:45:54 | 0:45:59 | |
Their natural tendency is to be charming. | 0:46:02 | 0:46:07 | |
Take that charm and couch it in the right business language, | 0:46:07 | 0:46:10 | |
it sounds like charismatic leadership. | 0:46:10 | 0:46:14 | |
You think of psychopaths as having at their disposal | 0:46:14 | 0:46:17 | |
a very, very large repertoire of behaviours. | 0:46:17 | 0:46:20 | |
So they can use charm, manipulation, intimidation, whatever is required. | 0:46:20 | 0:46:28 | |
Psychopaths can also turn their lack of emotion to their advantage. | 0:46:28 | 0:46:32 | |
The psychopath can actually put themselves inside your skin | 0:46:32 | 0:46:36 | |
intellectually, not emotionally. | 0:46:36 | 0:46:38 | |
They can tell what you're thinking in a sense, | 0:46:38 | 0:46:41 | |
they look at your body language, listen to what you're saying. | 0:46:41 | 0:46:44 | |
But what they don't really do is feel what you feel. | 0:46:44 | 0:46:47 | |
What this allows them to do is to use the words to manipulate | 0:46:47 | 0:46:51 | |
and con and interact with you without the baggage of having | 0:46:51 | 0:46:54 | |
this, I really feel your pain. | 0:46:54 | 0:46:57 | |
Paul then constructed his own survey to see how many psychopaths | 0:47:00 | 0:47:04 | |
had infiltrated big business. | 0:47:04 | 0:47:07 | |
The answer? Almost four times as many as in the general population. | 0:47:07 | 0:47:13 | |
These were all individuals who were at the top of an organisation. | 0:47:13 | 0:47:18 | |
Vice-presidents, directors, CEOs, so it was actually quite a shock. | 0:47:18 | 0:47:24 | |
But the biggest surprise was | 0:47:24 | 0:47:26 | |
when they looked at their actual performance. | 0:47:26 | 0:47:28 | |
The higher the psychopathy, the better they looked. | 0:47:29 | 0:47:32 | |
These people walked into the room and everybody got excited, watching them, the room lit up. | 0:47:32 | 0:47:37 | |
Charisma, lots of charisma, and they talked a good line. | 0:47:37 | 0:47:42 | |
But if you look at their actual performance | 0:47:42 | 0:47:44 | |
and ratings as a team player and productivity and so forth, dismal. | 0:47:44 | 0:47:49 | |
Looked good, performed badly. | 0:47:49 | 0:47:52 | |
And that was really quite a dramatic finding. | 0:47:52 | 0:47:54 | |
Their ability to communicate, to charm, | 0:47:54 | 0:47:58 | |
to manipulate those around them overshadowed the hard data. | 0:47:58 | 0:48:02 | |
Paul thinks this is just the beginning. | 0:48:05 | 0:48:07 | |
Corporate culture today seems ideal for the psychopath. | 0:48:07 | 0:48:11 | |
They're thrill-seekers, they're easily bored. | 0:48:12 | 0:48:16 | |
What better place to work than a place that's constantly changing? | 0:48:16 | 0:48:21 | |
That's the perfect environment for a psychopath. | 0:48:22 | 0:48:25 | |
So how do you tell the high-powered, high-talent MBA | 0:48:28 | 0:48:32 | |
student from the lying, cheating, deceitful, manipulative psychopath? | 0:48:32 | 0:48:39 | |
Very, very hard to do. | 0:48:39 | 0:48:40 | |
So the blend of genes and environment determines not only who will be a psychopath, | 0:48:47 | 0:48:53 | |
but whether they end up in the boardroom or behind bars. | 0:48:53 | 0:48:58 | |
Now this new science is about to challenge us all. | 0:49:00 | 0:49:04 | |
It's about to make us question not only our ideas of good and evil, | 0:49:04 | 0:49:09 | |
but even of crime and punishment itself. | 0:49:09 | 0:49:13 | |
I was asking him, "Please." I was begging him to stop. | 0:49:17 | 0:49:22 | |
SIREN | 0:49:22 | 0:49:23 | |
And he wouldn't stop. | 0:49:25 | 0:49:27 | |
In 2006, a brutal murder took place that rocked the state of Tennessee. | 0:49:28 | 0:49:34 | |
It was a horrible crime. | 0:49:36 | 0:49:38 | |
Everybody knows that Mr Waldroup had committed a murder. | 0:49:39 | 0:49:44 | |
He attempted to murder his wife | 0:49:46 | 0:49:48 | |
and he did murder the friend of his wife. | 0:49:48 | 0:49:51 | |
And it was done in a very violent way. | 0:49:51 | 0:49:53 | |
At one point, he had a machete, he ran after her, he cut her. | 0:49:55 | 0:50:01 | |
So it was a pretty grisly scenario. | 0:50:01 | 0:50:05 | |
This was the kind of crime where there is no question who did it. | 0:50:09 | 0:50:13 | |
I killed Leslie Bradshaw. | 0:50:13 | 0:50:16 | |
I cut my wife. | 0:50:18 | 0:50:20 | |
In the state of Tennessee, | 0:50:21 | 0:50:23 | |
that meant Bradley Waldroup was facing the death penalty. | 0:50:23 | 0:50:27 | |
But in this case, the man who could save Waldroup was not a lawyer, | 0:50:29 | 0:50:34 | |
but a forensic psychiatrist. | 0:50:34 | 0:50:37 | |
A fundamental question would lie at the heart of Waldroup's defence. | 0:50:37 | 0:50:42 | |
He may have done it, but was he to blame? | 0:50:42 | 0:50:46 | |
I think in the opening statements, | 0:50:46 | 0:50:49 | |
the defence attorney said that, we are not disputing who did this crime, | 0:50:49 | 0:50:53 | |
but what we would like to talk about is why it happened. | 0:50:53 | 0:50:56 | |
Dr Burnet agreed to gather the evidence. | 0:50:58 | 0:51:02 | |
I first saw Mr Waldroup. We arranged to do an evaluation at Vanderbilt | 0:51:02 | 0:51:06 | |
and I met him in this room. | 0:51:06 | 0:51:09 | |
The Sheriff sits on the other side of the window, | 0:51:09 | 0:51:12 | |
but Mr Waldroup himself was... He's a middle-aged man. | 0:51:12 | 0:51:16 | |
He's pleasant. He's talkative. He's co-operative. | 0:51:16 | 0:51:21 | |
In other words, this is a normal-looking man, who is conversant | 0:51:21 | 0:51:24 | |
and fairly articulate, but who has a story. | 0:51:24 | 0:51:28 | |
And, in his case, I think the story was very important. | 0:51:29 | 0:51:33 | |
Waldroup's actions suggested a brutal, destructive personality. | 0:51:33 | 0:51:38 | |
My initial impression was that this was an unusually violent act. | 0:51:38 | 0:51:43 | |
This was a murder case and it was also a capital case, | 0:51:43 | 0:51:47 | |
meaning that the state of Tennessee were seeking the death penalty. | 0:51:47 | 0:51:51 | |
The evaluation was going to include a new and controversial element. | 0:51:56 | 0:52:01 | |
Did Bradley Waldroup have the warrior gene? | 0:52:02 | 0:52:06 | |
After a week or so, we got the result back | 0:52:12 | 0:52:14 | |
and it was, I guess what one would call a positive result, | 0:52:14 | 0:52:18 | |
in the sense that he had the low-activity version of the MAOA gene. | 0:52:18 | 0:52:24 | |
But this gene would only be relevant to his defence | 0:52:26 | 0:52:30 | |
if he'd also had an extremely difficult early environment. | 0:52:30 | 0:52:35 | |
The question is, does he have a history of child abuse? And he did. | 0:52:36 | 0:52:40 | |
Mr Waldroup described times of getting physically disciplined | 0:52:42 | 0:52:46 | |
where he had welts, he had bruising, and that this was a fairly regular experience for him. | 0:52:46 | 0:52:54 | |
So we thought that that might be important in court. | 0:52:54 | 0:52:58 | |
This genetic evidence was so new, | 0:53:02 | 0:53:04 | |
that Burnet had to work out how he was going to explain it to the jury. | 0:53:04 | 0:53:09 | |
We basically thought back! | 0:53:10 | 0:53:13 | |
We didn't get out old textbooks. We did collect pictures. | 0:53:15 | 0:53:19 | |
We thought that images would be very important. | 0:53:19 | 0:53:22 | |
We obviously want to keep their attention. | 0:53:22 | 0:53:24 | |
It can be very boring to be on a jury for several days. | 0:53:24 | 0:53:28 | |
The DNA evidence revolved around one idea. | 0:53:30 | 0:53:33 | |
A gene composed of four segments is safe. Three, and you're at risk. | 0:53:34 | 0:53:41 | |
But would a judge accept this evidence? | 0:53:43 | 0:53:46 | |
It had never been used in court before. | 0:53:46 | 0:53:48 | |
The day of the trial arrived. | 0:53:54 | 0:53:57 | |
I drove down to this small town in Tennessee. | 0:54:02 | 0:54:05 | |
We were mainly thinking would the judge let us testify about this topic? | 0:54:05 | 0:54:10 | |
This is what one might call novel science, in that it's a new kind of science. | 0:54:13 | 0:54:17 | |
We were the first people, as far as I know, | 0:54:19 | 0:54:22 | |
to introduce this gene environment interaction in a trial. | 0:54:22 | 0:54:28 | |
Despite strong objections from the prosecution, | 0:54:29 | 0:54:32 | |
the judge allowed Burnet to take the stand. | 0:54:32 | 0:54:36 | |
Burnet knew he could be making history. | 0:54:36 | 0:54:40 | |
The jury is sitting in the court and we're asked to go and testify one at a time. | 0:54:41 | 0:54:45 | |
When Penny tried to run, he intentionally drew that weapon up | 0:54:45 | 0:54:51 | |
and she was running behind that trailer, and fired twice. | 0:54:51 | 0:54:56 | |
Remember Penny's testimony. She says that is why she got hit in the back. | 0:54:56 | 0:55:00 | |
That's the best description... | 0:55:00 | 0:55:02 | |
We were assuming that he would be found guilty of first-degree murder, | 0:55:02 | 0:55:07 | |
and then the jury would have to decide | 0:55:07 | 0:55:09 | |
whether he would get the death penalty or something else. | 0:55:09 | 0:55:13 | |
So we thought the fair outcome would be for them to take it into consideration at that time | 0:55:13 | 0:55:19 | |
and to not give him the death penalty. | 0:55:19 | 0:55:21 | |
Burnet had described Waldroup as a highly troubled man | 0:55:25 | 0:55:29 | |
with a gene that made him vulnerable to rage. | 0:55:29 | 0:55:32 | |
Then, it was over to the jury. | 0:55:33 | 0:55:35 | |
What actually happened really surprised us. | 0:55:40 | 0:55:43 | |
This jury, after hearing the testimony, | 0:55:43 | 0:55:47 | |
did not find him guilty of first-degree murder, | 0:55:47 | 0:55:49 | |
but found him guilty of voluntary manslaughter. | 0:55:49 | 0:55:53 | |
So I think the jury was really influenced by the testimony | 0:55:53 | 0:55:58 | |
regarding behavioural genomics. | 0:55:58 | 0:56:01 | |
One juror's comments show just how important it was to them. | 0:56:02 | 0:56:06 | |
WOMAN: A diagnosis is a diagnosis. You know, it's there. | 0:56:07 | 0:56:12 | |
A bad gene is a bad gene. | 0:56:12 | 0:56:14 | |
I think this testimony did affect whether he would live or die. | 0:56:14 | 0:56:18 | |
The implications of the verdict are enormous. | 0:56:19 | 0:56:23 | |
They could rewrite the fundamental rules of crime and punishment. | 0:56:23 | 0:56:27 | |
So was it right? | 0:56:30 | 0:56:32 | |
I think we have to be really careful how we state this. | 0:56:33 | 0:56:36 | |
This increases a person's vulnerability, but it doesn't make the person commit a crime. | 0:56:36 | 0:56:42 | |
In his particular case, | 0:56:42 | 0:56:43 | |
I would say that his free will had been diminished. | 0:56:43 | 0:56:49 | |
I don't think I would ever say it vanished, | 0:56:49 | 0:56:52 | |
but I think it had been diminished. | 0:56:52 | 0:56:55 | |
The verdict has set a powerful precedent. | 0:56:58 | 0:57:02 | |
It's ushering in a brand new era of neuro-law. | 0:57:02 | 0:57:05 | |
I think there's an avalanche coming. | 0:57:09 | 0:57:12 | |
There are hundreds or thousands of research projects on behavioural genomics. | 0:57:12 | 0:57:19 | |
And, I think in 10, 15 years, there will be more information than we have now. | 0:57:19 | 0:57:24 | |
The new science is starting to explain the basis of good and evil | 0:57:27 | 0:57:31 | |
and why we are different from each other. | 0:57:31 | 0:57:33 | |
I am pretty sure if I had not had this very positive environment, | 0:57:35 | 0:57:38 | |
I would have turned out poorly. I would have been a real behaviour problem | 0:57:38 | 0:57:44 | |
and I'm pretty sure of that. | 0:57:44 | 0:57:46 | |
But while this science is giving us more information, | 0:57:49 | 0:57:53 | |
it's also undermining our certainties. | 0:57:53 | 0:57:57 | |
Whether we're good or whether we're evil lies partly in our genes | 0:57:57 | 0:58:01 | |
and partly in our environment. | 0:58:01 | 0:58:05 | |
But as we don't choose either, | 0:58:05 | 0:58:09 | |
are we really free to choose at all? | 0:58:09 | 0:58:12 | |
Subtitling by Red Bee Media Ltd | 0:58:19 | 0:58:21 | |