Justice: The Moral Side of Murder


Transcript


0:00:13 - 0:00:19  'My name is Michael Sandel and I teach Political Philosophy at Harvard University.'
0:00:26 - 0:00:32  The lecture you're about to see is the first one I give in a course called Justice.
0:00:32 - 0:00:35  I begin by posing some moral dilemmas about murder
0:00:35 - 0:00:42  and I ask the students a question - is killing sometimes the right thing to do?
0:00:51 - 0:00:53  APPLAUSE
0:00:55 - 0:00:59  This is a course about justice and we begin with a story.
0:00:59 - 0:01:07  Suppose you're the driver of a trolley car and it's hurtling down the track at 60 miles an hour.
0:01:07 - 0:01:12  At the end of the track, you notice five workers working on the track.
0:01:12 - 0:01:16  You try to stop, but you can't. Your brakes don't work.
0:01:16 - 0:01:21  You feel desperate because you know that if you crash into these workers
0:01:21 - 0:01:26  they will all die. Let's assume you know that for sure.
0:01:26 - 0:01:28  And so you feel helpless
0:01:28 - 0:01:32  until you notice that there is, off to the right,
0:01:32 - 0:01:37  a side track. And at the end of that track
0:01:37 - 0:01:41  there is one worker working on the track.
0:01:41 - 0:01:43  Your steering wheel works.
0:01:43 - 0:01:50  So you can turn the trolley car, if you want to, onto the side track,
0:01:50 - 0:01:54  killing the one, but sparing the five.
0:01:54 - 0:01:57  Here's our first question.
0:01:57 - 0:02:00  What's the right thing to do?
0:02:00 - 0:02:02  What would you do?
0:02:02 - 0:02:04  Let's take a poll.
0:02:05 - 0:02:10  How many would turn the trolley car onto the side track?
0:02:10 - 0:02:12  Raise your hands.
0:02:14 - 0:02:18  How many wouldn't? How many would go straight ahead?
0:02:20 - 0:02:24  Keep your hands up, those of you who would go straight ahead.
0:02:26 - 0:02:30  A handful of people would. The vast majority would turn.
0:02:30 - 0:02:37  Now we need to begin to investigate the reasons why you think it's the right thing to do.
0:02:37 - 0:02:44  Let's begin with those in the majority, who would turn to go onto the side track.
0:02:44 - 0:02:51  Why would you do it? What would be your reason? Who is willing to volunteer a reason?
0:02:51 - 0:02:54  Go ahead. Stand up.
0:02:54 - 0:03:00  Because it can't be right to kill five people when you can only kill one person instead.
0:03:02 - 0:03:07  It wouldn't be right to kill five if you could kill one person instead.
0:03:09 - 0:03:13  That's a good reason. That's a good reason.
0:03:15 - 0:03:19  Who else? Does everybody agree with that reason?
0:03:21 - 0:03:23  Go ahead.
0:03:23 - 0:03:26  It was the same reason on 9/11.
0:03:26 - 0:03:32  We regard the people who flew the plane into the Pennsylvania field as heroes
0:03:32 - 0:03:38  because they chose to kill the people on the plane and not kill more people in big buildings.
0:03:38 - 0:03:44  So the principle there was the same on 9/11. A tragic circumstance,
0:03:44 - 0:03:50  but better to kill one so that five can live? Is that the reason most of you had, those who would turn?
0:03:50 - 0:03:52  Yes?
0:03:52 - 0:03:54  Let's hear now
0:03:54 - 0:04:00  from those in the minority, those who wouldn't turn.
0:04:01 - 0:04:02  Yes.
0:04:02 - 0:04:08  Well, I think that's the same mentality that justifies genocide and totalitarianism.
0:04:08 - 0:04:12  In order to save one type of race, you wipe out the other.
0:04:12 - 0:04:19  So what would you do in this case? You would, to avoid the horrors of genocide,
0:04:19 - 0:04:23  you would crash into the five and kill them?
0:04:26 - 0:04:30  -Presumably, yes.
-You would?
-Yeah.
0:04:30 - 0:04:34  OK. Who else? That's a brave answer. Thank you.
0:04:35 - 0:04:41  Let's consider another trolley car case
0:04:42 - 0:04:45  and see whether
0:04:46 - 0:04:49  those of you in the majority
0:04:49 - 0:04:56  want to adhere to the principle "better that one should die so that five should live".
0:04:56 - 0:05:00  This time you're not the driver of the trolley car. You're an onlooker.
0:05:00 - 0:05:04  You're standing on a bridge overlooking a trolley car track.
0:05:04 - 0:05:08  And down the track comes a trolley car.
0:05:08 - 0:05:11  At the end of the track are five workers.
0:05:11 - 0:05:18  The brakes don't work, the trolley car is about to careen into the five and kill them,
0:05:18 - 0:05:22  and now you're not the driver, you really feel helpless
0:05:22 - 0:05:26  until you notice, standing next to you,
0:05:28 - 0:05:33  leaning over the bridge is a very fat man.
0:05:34 - 0:05:37  LAUGHTER
0:05:37 - 0:05:38  And...
0:05:39 - 0:05:43  you could give him a shove,
0:05:43 - 0:05:48  he would fall over the bridge onto the track,
0:05:50 - 0:05:54  right in the way of the trolley car.
0:05:54 - 0:05:58  He would die, but he would spare the five.
0:05:58 - 0:06:00  Now...
0:06:01 - 0:06:06  How many would push the fat man over the bridge? Raise your hand.
0:06:07 - 0:06:09  LAUGHTER
0:06:10 - 0:06:12  How many wouldn't?
0:06:13 - 0:06:16  Most people wouldn't.
0:06:16 - 0:06:20  Here's the obvious question - what became of the principle...
0:06:22 - 0:06:26  better to save five lives even if it means sacrificing one?
0:06:26 - 0:06:32  What became of the principle that almost everyone endorsed in the first case?
0:06:32 - 0:06:36  I need to hear from someone who was in the majority in both cases.
0:06:36 - 0:06:40  How do you explain the difference between the two? Yes.
0:06:40 - 0:06:46  The second one, I guess, involves an act of choice of pushing the person down.
0:06:46 - 0:06:52  I guess that person himself would otherwise not have been involved in the situation at all
0:06:52 - 0:06:56  and so to choose on his behalf, I guess, to...
0:06:56 - 0:07:04  involve him in something he'd otherwise have escaped is, I guess,
0:07:04 - 0:07:10  more than what you have in the first place where the three parties - the driver and two sets of workers -
0:07:10 - 0:07:16  -are already, I guess, in the situation.
-But the guy working on the track off to the side,
0:07:16 - 0:07:22  he didn't choose to sacrifice his life any more than the fat man did. Did he?
0:07:24 - 0:07:29  -That's true, but he was on the tracks...
-This guy was on the bridge!
0:07:29 - 0:07:31  LAUGHTER
0:07:32 - 0:07:35  Go ahead. You can come back if you want.
0:07:35 - 0:07:41  All right. It's a hard question. You did well, you did very well. It's a hard question.
0:07:41 - 0:07:45  Who else can...find a way
0:07:45 - 0:07:51  of reconciling...the reaction of the majority in these two cases?
0:07:51 - 0:07:52  Yes?
0:07:52 - 0:07:57  Well, I guess in the first case where you have the one worker and the five,
0:07:57 - 0:08:03  it's a choice between those two and you have to make a choice. People will die because of the trolley car,
0:08:03 - 0:08:10  not necessarily because of your direct actions. The trolley car is a runaway, it's a split-second choice,
0:08:10 - 0:08:14  whereas pushing the fat man over is an act of murder on your part.
0:08:14 - 0:08:20  You have control over that, whereas you may not have control over the trolley car.
0:08:20 - 0:08:26  -I think it's a slightly different situation.
-All right. Who has a reply? That's good.
0:08:26 - 0:08:32  Who wants to reply? Is that a way out of this?
0:08:32 - 0:08:38  I don't think that's a very good reason because you choose... Either way, you choose who dies -
0:08:38 - 0:08:42  choose to turn and kill him, which is an act of conscious thought,
0:08:42 - 0:08:47  or choose to push the fat man over, which is also an active, conscious action.
0:08:47 - 0:08:52  -Either way, you're making a choice.
-Do you want to reply?
0:08:52 - 0:08:59  I'm not really sure that's the case. It still seems different - pushing someone over onto the tracks
0:08:59 - 0:09:05  -and killing him. You are actually killing him yourself.
-Pushing him with your own hands.
-Pushing him.
0:09:05 - 0:09:11  And that's different than steering something that is going to cause death into another...
0:09:11 - 0:09:16  -It doesn't really sound right saying it now!
-No, it's good.
0:09:16 - 0:09:20  -What's your name?
-Andrew.
-Let me ask you this question.
0:09:21 - 0:09:23  Suppose...
0:09:23 - 0:09:29  standing on the bridge, next to the fat man, I didn't have to push him.
0:09:29 - 0:09:36  Suppose he was standing over a trap door that I could open by turning a steering wheel like that?
0:09:36 - 0:09:38  LAUGHTER
0:09:38 - 0:09:40  Would you turn?
0:09:40 - 0:09:46  -For some reason, that still just seems more wrong.
-Right.
-I mean...
0:09:46 - 0:09:51  -Maybe if you accidentally leaned into the steering wheel...
-LAUGHTER
0:09:51 - 0:09:58  ..but... Or say that the car is hurtling towards a switch that will drop the trap,
0:09:58 - 0:10:02  -then...I could agree with that.
-Fair enough.
0:10:02 - 0:10:08  It still seems wrong in a way that it doesn't seem wrong in the first place to turn?
0:10:08 - 0:10:14  And in the first situation, you're involved directly with it? In the second, you're an onlooker.
0:10:14 - 0:10:21  -All right.
-So you have the choice of becoming involved or not.
-Let's forget for the moment this case.
0:10:21 - 0:10:23  That's good.
0:10:23 - 0:10:29  Let's imagine a different case. This time you're a doctor in an emergency room.
0:10:29 - 0:10:31  And six patients come to you.
0:10:33 - 0:10:37  They've been in a terrible trolley car wreck.
0:10:37 - 0:10:39  LAUGHTER
0:10:40 - 0:10:45  Five of them sustained moderate injuries, one is severely injured.
0:10:45 - 0:10:51  You could spend all day caring for the one severely injured victim, but in that time the five would die.
0:10:51 - 0:10:58  Or you could look after the five, restore them to health, but the one severely injured person
0:10:58 - 0:11:03  would die. How many would save the five now as the doctor?
0:11:03 - 0:11:05  How many would save the one?
0:11:06 - 0:11:10  Very few people. Just a handful of people.
0:11:11 - 0:11:15  Same reason, I assume - one life versus five?
0:11:18 - 0:11:23  Now consider another doctor case. This time you're a transplant surgeon.
0:11:24 - 0:11:31  And you have five patients each in desperate need of an organ transplant in order to survive.
0:11:31 - 0:11:36  One needs a heart, one a lung, one a kidney, one a liver
0:11:36 - 0:11:40  and the fifth...a pancreas.
0:11:40 - 0:11:44  And you have no organ donors.
0:11:44 - 0:11:48  You are about to see them die.
0:11:49 - 0:11:57  And then it occurs to you that in the next room there's a healthy guy who came in for a check-up.
0:11:57 - 0:11:59  LAUGHTER
0:12:02 - 0:12:04  And he's...
0:12:04 - 0:12:07  You like that?
0:12:08 - 0:12:12  And he's...he's taking a nap.
LAUGHTER
0:12:15 - 0:12:17  You could go in, very quietly,
0:12:17 - 0:12:20  yank out the five organs,
0:12:20 - 0:12:23  that person would die.
0:12:23 - 0:12:27  But you could save the five. How many would do it?
0:12:29 - 0:12:31  Anyone?
0:12:32 - 0:12:36  How many? Put your hands up if you would do it.
0:12:40 - 0:12:42  Anyone in the balcony?
0:12:42 - 0:12:44  You would?
0:12:44 - 0:12:47  Be careful - don't lean over too...!
0:12:48 - 0:12:50  How many wouldn't?
0:12:51 - 0:12:57  All right. What do you say? Speak up in the balcony, you who would yank out the organs.
0:12:57 - 0:13:01  -Why?
-I'd like to explore a slightly alternate possibility
0:13:01 - 0:13:05  of just taking the one of the five who needs an organ who dies first
0:13:05 - 0:13:10  and using their four healthy organs to save the other four.
0:13:11 - 0:13:14  MILD APPLAUSE
That's a pretty good idea.
0:13:16 - 0:13:18  That's a great idea.
0:13:18 - 0:13:20  LAUGHTER
0:13:20 - 0:13:22  Except for the fact
0:13:22 - 0:13:27  that you just wrecked the philosophical point.
LAUGHTER
0:13:27 - 0:13:33  Let's...let's step back from these stories and these arguments
0:13:33 - 0:13:39  to notice a couple of things about the way the arguments have begun to unfold.
0:13:39 - 0:13:45  Certain moral principles have already begun to emerge
0:13:45 - 0:13:48  from the discussions we've had.
0:13:48 - 0:13:53  And let's consider what those moral principles look like.
0:13:53 - 0:13:57  The first moral principle that emerged in the discussion
0:13:57 - 0:14:01  said the right thing to do, the moral thing to do,
0:14:01 - 0:14:06  depends on the consequences that will result from your action.
0:14:08 - 0:14:13  At the end of the day, better that five should live even if one must die.
0:14:14 - 0:14:19  That's an example of consequentialist moral reasoning.
0:14:25 - 0:14:30  The state of the world that will result from the thing you do.
0:14:30 - 0:14:35  But then we went a little further when we considered those other cases
0:14:35 - 0:14:41  and people weren't so sure about consequentialist moral reasoning
0:14:42 - 0:14:47  when people hesitated to push the fat man over the bridge
0:14:48 - 0:14:52  or to yank out the organs of the innocent patient.
0:14:52 - 0:14:55  People gestured toward reasons
0:14:56 - 0:15:02  having to do with the intrinsic quality of the act itself,
0:15:02 - 0:15:04  consequences be what they may.
0:15:05 - 0:15:09  People were reluctant, people thought it was just wrong,
0:15:09 - 0:15:11  categorically wrong,
0:15:11 - 0:15:19  to kill a person, an innocent person, even for the sake of saving five lives.
0:15:19 - 0:15:25  At least people thought that in the second version of each story we considered.
0:15:26 - 0:15:29  So this points to...
0:15:46 - 0:15:49  That's regardless of the consequences.
0:15:49 - 0:15:53  We're going to explore, in the days and weeks to come,
0:15:53 - 0:15:58  the contrast between consequentialist and categorical moral principles.
0:15:59 - 0:16:05  The most influential example of consequentialist moral reasoning is utilitarianism,
0:16:05 - 0:16:11  a doctrine invented by Jeremy Bentham, the 18th-century English political philosopher.
0:16:13 - 0:16:19  The most important philosopher of categorical moral reasoning
0:16:19 - 0:16:24  is the 18th-century German philosopher, Immanuel Kant.
0:16:24 - 0:16:29  So we will look at those two different modes of moral reasoning,
0:16:29 - 0:16:32  assess them and also consider others.
0:16:32 - 0:16:34  If you look at the syllabus,
0:16:34 - 0:16:38  you'll notice that we read a number of great and famous books.
0:16:38 - 0:16:41  Books by Aristotle, John Locke,
0:16:41 - 0:16:45  Immanuel Kant, John Stuart Mill and others.
0:16:46 - 0:16:50  You'll notice, too, from the syllabus that we don't only read these books.
0:16:50 - 0:16:56  We also take up contemporary political and legal controversies
0:16:56 - 0:16:59  that raise philosophical questions.
0:16:59 - 0:17:04  We will debate equality and inequality, affirmative action,
0:17:04 - 0:17:09  free speech versus hate speech, same-sex marriage, military conscription.
0:17:09 - 0:17:17  A range of practical questions. Why? Not just to enliven these abstract and distant books,
0:17:17 - 0:17:23  but to make clear, to bring out what's at stake in our everyday lives, including our political lives,
0:17:24 - 0:17:27  for philosophy.
0:17:27 - 0:17:31  And so we will read these books and we will debate these issues
0:17:31 - 0:17:36  and we'll see how each informs and illuminates the other.
0:17:37 - 0:17:40  This may sound appealing enough,
0:17:40 - 0:17:43  but here I have to issue a warning.
0:17:45 - 0:17:47  And the warning is this -
0:17:47 - 0:17:49  to read these books
0:17:51 - 0:17:53  in this way,
0:17:53 - 0:17:56  as an exercise in self-knowledge,
0:17:56 - 0:18:00  to read them in this way carries certain risks.
0:18:00 - 0:18:04  Risks that are both personal and political,
0:18:04 - 0:18:09  risks that every student of political philosophy has known.
0:18:10 - 0:18:12  These risks spring from the fact
0:18:12 - 0:18:17  that philosophy teaches us and unsettles us
0:18:17 - 0:18:22  by confronting us with what we already know.
0:18:23 - 0:18:25  There's an irony.
0:18:25 - 0:18:32  The difficulty of this course consists in the fact that it teaches what you already know.
0:18:32 - 0:18:37  It works by taking what we know from familiar, unquestioned settings
0:18:37 - 0:18:40  and making it strange.
0:18:42 - 0:18:45  That's how those examples worked,
0:18:45 - 0:18:51  the hypotheticals with which we began, with their mix of playfulness and sobriety.
0:18:51 - 0:18:54  It's also how these philosophical books work.
0:18:54 - 0:18:59  Philosophy estranges us from the familiar
0:18:59 - 0:19:01  not by supplying new information,
0:19:02 - 0:19:07  but by inviting and provoking a new way of seeing.
0:19:09 - 0:19:12  But, and here's the risk,
0:19:12 - 0:19:18  once the familiar turns strange, it's never quite the same again.
0:19:20 - 0:19:24  Self-knowledge is like lost innocence.
0:19:24 - 0:19:28  However unsettling you find it,
0:19:28 - 0:19:33  it can never be unthought or unknown.
0:19:35 - 0:19:38  What makes this enterprise difficult
0:19:40 - 0:19:42  but also riveting
0:19:42 - 0:19:46  is that moral and political philosophy is a story
0:19:48 - 0:19:55  and you don't know where the story will lead, but what you do know is that the story is about you.
0:19:56 - 0:20:01  Those are the personal risks. Now what of the political risks?
0:20:01 - 0:20:05  One way of introducing a course like this
0:20:05 - 0:20:10  would be to promise you that by reading these books and debating these issues
0:20:10 - 0:20:17  you will become a better, more responsible citizen. You will examine presuppositions of public policy,
0:20:17 - 0:20:23  hone your political judgment, you will become a more effective participant in public affairs.
0:20:24 - 0:20:28  But this would be a partial and misleading promise.
0:20:28 - 0:20:34  Political philosophy, for the most part, hasn't worked that way.
0:20:34 - 0:20:40  You have to allow for the possibility that political philosophy may make you a worse citizen,
0:20:40 - 0:20:43  rather than a better one.
0:20:44 - 0:20:49  Or at least a worse citizen before it makes you a better one.
0:20:50 - 0:20:58  And that's because philosophy is a distancing and even debilitating activity.
0:20:58 - 0:21:04  And you see this going back to Socrates. There's a dialogue, the Gorgias,
0:21:04 - 0:21:11  in which one of Socrates' friends, Callicles, tries to talk him out of philosophising.
0:21:12 - 0:21:16  Callicles tells Socrates, "Philosophy is a pretty toy,
0:21:16 - 0:21:20  "if one indulges in it with moderation at the right time of life,
0:21:20 - 0:21:25  "but if one pursues it further than one should, it is absolute ruin.
0:21:25 - 0:21:30  "Take my advice," Callicles says. "Abandon argument.
0:21:30 - 0:21:35  "Learn the accomplishments of active life. Take for your models
0:21:35 - 0:21:39  "not those people who spend their time on these petty quibbles
0:21:39 - 0:21:45  "but those who have a good livelihood and reputation and many other blessings."
0:21:45 - 0:21:49  So Callicles is really saying to Socrates,
0:21:49 - 0:21:54  "Quit philosophising. Get real. Go to business school."
0:21:54 - 0:21:57  LAUGHTER
0:21:57 - 0:22:00  And Callicles did have a point.
0:22:01 - 0:22:07  He had a point because philosophy distances us from conventions, from established assumptions
0:22:07 - 0:22:12  and from settled beliefs. Those are the risks, personal and political,
0:22:12 - 0:22:16  and in the face of these risks there is a characteristic evasion.
0:22:16 - 0:22:21  The name of the evasion is scepticism. It's the idea,
0:22:21 - 0:22:24  "We didn't resolve, once and for all,
0:22:25 - 0:22:30  "either the cases or the principles we were arguing when we began."
0:22:31 - 0:22:39  And if Aristotle and Locke and Kant and Mill haven't solved these questions after all of these years,
0:22:39 - 0:22:47  who are we to think that we here in Sanders Theatre over the course of a semester can resolve them?
0:22:49 - 0:22:54  And so maybe it's just a matter of each person having his or her own principles
0:22:54 - 0:22:58  and there's nothing more to be said about it, no way of reasoning.
0:22:58 - 0:23:04  That's the evasion, the evasion of scepticism, to which I would offer the following reply.
0:23:04 - 0:23:09  It's true, these questions have been debated for a very long time,
0:23:09 - 0:23:13  but the very fact that they have recurred and persisted
0:23:14 - 0:23:19  may suggest that though they are impossible in one sense,
0:23:19 - 0:23:21  they are unavoidable in another.
0:23:21 - 0:23:27  And the reason they are unavoidable, the reason they are inescapable is
0:23:27 - 0:23:31  that we live some answer to these questions every day.
0:23:31 - 0:23:36  So scepticism, just throwing up your hands and giving up on moral reflection,
0:23:37 - 0:23:39  is no solution.
0:23:39 - 0:23:45  Immanuel Kant described very well the problem with scepticism when he wrote,
0:23:45 - 0:23:51  "Scepticism is a resting place for human reason, where it can reflect upon its dogmatic wanderings,
0:23:51 - 0:23:55  "but it is no dwelling place for permanent settlement.
0:23:55 - 0:24:02  "Simply to acquiesce in scepticism can never suffice to overcome the restlessness of reason."
0:24:05 - 0:24:09  I've tried to suggest through these stories and arguments
0:24:09 - 0:24:14  some sense of the risks and temptations, of the perils and possibilities.
0:24:14 - 0:24:20  I would simply conclude by saying that the aim of this course
0:24:20 - 0:24:24  is to awaken the restlessness of reason
0:24:24 - 0:24:28  and to see where it might lead. Thank you very much.
0:24:28 - 0:24:30  APPLAUSE
0:24:36 - 0:24:40  To challenge your views and learn more about justice,
0:24:40 - 0:24:42  go to:
0:24:45 - 0:24:49  And follow the links to the Open University.
0:25:06 - 0:25:10  Subtitles by Subtext for Red Bee Media Ltd - 2011
