Can Scientists Be Morally Neutral? Dinner with Portillo



Over dinner, Michael Portillo and seven guests discuss whether scientists can be morally neutral in their pursuit of scientific knowledge.



Transcript


A scientist, Charles Darwin, made us see ourselves differently with an explanation of how humans came to exist. Even now, his theory of evolution makes many uncomfortable and challenges their beliefs. Later advances in science have produced many moral dilemmas. And in the 20th century, the Nazis used quasi-science to justify mass murder. In this century, many fear what science could unleash. Europe resisted genetically modified crops and America blocked stem cell research. And we shrink from the prospect of designing our babies.

What responsibility does the scientist have, setting out on the path of research? Is it merely to encourage knowledge to grow and to flourish? Or should they reflect on the impact that their discoveries, such as splitting the atom, could have on humanity? Can they safely leave such worries to those who enjoy political power?

At dinner tonight, helping me to test the role of scientists to destruction, are a medical scientist, a Catholic theologian, an Oxford neuroscientist and a science journalist. Today, we trade in our Petri dishes and our test tubes for platters of food and goblets of wine, as we ask...

My guests are Sir Mark Walport - doctor, medical scientist and director of the Wellcome Trust, the UK's biggest medical research charity. Petra Boynton, who is an agony aunt, sex researcher and lecturer at University College London. Lord Dick Taverne - one-time Treasury minister and chair of Sense About Science, a charity promoting public understanding of science. Baroness Susan Greenfield - neuroscientist, cross-bench peer and professor of pharmacology at the University of Oxford. Mark Henderson, who is the award-winning science editor of The Times, and also an author and columnist. Tina Beattie - Roman Catholic theologian and professor of Catholic Studies at Roehampton University. And Bryan Appleyard - author, journalist and three-time award winner at the British Press Awards.

A scientist splits the atom, knowing that that could lead to a nuclear bomb, so what is the responsibility of the scientist?

SIR WALPORT: The responsibility is to argue for peaceful use of that information rather than warlike. I mean...

-Argue for it?

-Yes. Scientists are moral beings in the same way as every other person is a moral being. And I think that the important distinction is that in science, I think, there are no limits to how much we can know. But where morals come in is how you use that knowledge. It's really important that one doesn't say that scientists operate in an amoral sphere.

-No.

So let's focus on splitting the atom. Splitting the atom, you can either argue that this is what's given us the nuclear bomb which has killed a lot of people, or what's given us nuclear power which is going to save the planet.

Right, but I don't think that you should then argue that scientists aren't entitled to an opinion about whether it's better to use that knowledge to produce power as opposed to war. You could go all the way back to fire. I've discovered this stuff called fire. In two different societies, fire can be used for good purposes or bad purposes. Scientists shouldn't be neutered; scientists are entitled to express an opinion...

-Indeed, but they're not terribly good at arguing.

-SUSAN: No.

Scientists are as good as anyone else at arguing, actually.

No, there are professional arguers.

-But coming back...

-But science is about argument. It is about resolving uncertainty through experiment. That's science.

-We're not good at putting our case in general terms in a way that people...

-Some of us are...

-MARK: I disagree...

-Susan, you're rarely off the television. You can't possibly...

MARK: What Susan said about scientists not being good at arguing was probably true ten years ago. And actually, partly because of people like Susan and Dick and initiatives that they've set up, I think that probably has changed to a degree... I think there's still a way to go, I don't know how Dick feels.

This is a truly important point for the scientists round the table - culturally, in our community, it is still considered, in a sense, a no-go area, or certainly a less than desirable area, to talk to the media a lot. And to "dumb down". You're, kind of, sneered at still, I'm afraid. So only a few people will do this. And so despite what you say about it now becoming respectable...

-It's not.

-Culturally, it's an important issue among the scientific community.

DICK: Can I come back to the question of morality in science? Because Mark raised an important point. Science, as he indicated, is morally neutral. I mean, it can be used for good or ill. Scientists are not morally neutral; they're very much involved. I think there's a further connection between science and morality which is often neglected. Because science, by being against superstition, against dogmatism, because it's tentative knowledge, because it's questioning and therefore anti-authority, it's anti-authority and therefore anti-autocracy. Science is, on the whole, a force for tolerance and also very relevant to human rights, which are very great moral issues. Because a lot of the dispute about human rights is based on disputes about misconceptions.

None of those things...

That blacks are less intelligent, that women are second-class citizens.

-Absolutely, because people have made mistakes, because science has managed to dispel the misconceptions there were about the characteristics of gender.

BRYAN: You're making a category error. You're talking about a process called science which works by being neutral. Now, the category error you're making is saying science is not tolerant of dictatorship, tyranny, all those things.

-Historically...

-It questions authority, as Galileo did.

..it's been very useful to tyranny.

Technology can be used for good or ill, but science itself is questioning and therefore is anti-authoritarian.

The Russian scientist who conducted a massive biological weapons programme did produce some very effective biological weapons. The point is, science is a very neutral, very narrow activity. That's what you've got. Everything else is rhetoric surrounding science. You're bringing morals from outside to attach to science.

MARK: There is one point, though, that Dick has made that is very critical, and that is that science is inherently anti-authority. In that if an authority figure says something, all you need to get around that is evidence. When Marshall and Warren came up with the idea that stomach ulcers were caused by a bacterium, everybody laughed at them until they proved their case. And science changed as a result of that. There are countless examples of that, where an unorthodox opinion, provided that people provide the evidence, can take over.

PETRA: That presumes one view of science, and this is where the issue about gender comes in. It would be lovely if that was true. I'm sure we'd have a great time if that were true. You know, evolutionary psychology doesn't think women are equal. The lived experience of women in culture, women in science particularly, is that science hasn't always worked in that sense at all. It's often worked to prove that we shouldn't be studying.

Do you think that it's a coincidence, then, that the position of women has approached equality in those societies that are most scientifically advanced?

No.

Well, again, it depends on how you view equality. Look at the Islamic world.

TINA: Well, excuse me, I think this is really problematic. If we're going to have this discussion, let's be scientific. Countries that are tyrannies, dictatorships, not at all like the, sort of, gender-egalitarian idyll that we're dreaming of in the West, use science just as effectively as we do.

-But I don't think that was the argument.

-No, that's not.

SUSAN: Let's remind ourselves of what science is. Science is a way of approaching things with a certain methodology; it's a certain way of doing something. And let's not dignify it and sanctify it as though anyone doing science has the answer. There are bad scientists, good...

If your knowledge is tentative, scientific knowledge, you're not going to be dogmatic the way that other people are.

I do want to get an idea of... the, kind of, staircase here. We're quite certain the Earth is round, are we?

THEY LAUGH

-Well, I'm not so sure.

-And the sun goes round it?

-No.

-And the Earth goes round the sun. We're quite sure the world is round. Are we sure that HIV causes AIDS?

-Yes.

-Are we sure that man is causing climate change?

We're... pretty sure, yeah.

And are we sure that GM foods are a good thing and safe?

No, because that's the wrong question. The question is, what GM food for what purpose?

-MARK: Absolutely.

-There is nothing generic about GM foods. Let's take the current one, which is synthetic biology. Is it a good thing to synthesise a virus that will carry some terrible toxin and kill the world? Of course not. Is it sensible to try and develop organisms that will make antibiotics? Yes. And so it's about specifics there. GM foods are not generic; they're something specific. You can make the argument about something like guns or hot water. I mean, you can use them for good purposes and bad purposes.

Right, and so I want to ask Bryan, do you believe that science should restrict itself? Limit itself to the development of things that have good moral outcomes. Obviously some outcomes are undesirable - we don't want biological weapons in the hands of rogue states.

BRYAN: The problem with science is its immense power. You have to look at not just the moment of doing the science, but the moment people step away and it becomes real. Now, a very respectable scientist, in his day, was Ernst Haeckel. Generally, his bit of biology seemed pretty good at the time. Right? Well, he informed the politics of Mein Kampf. That is a serious point.

-Well, is it? Is it?

-Yes.

-You're saying that this...

-I'm saying...

-..this lovely, liberal, humankind-loving man was in some way responsible for Hitler?

No, I'm saying that the science he produced entailed Nazism. It was inherent in his science to be adopted by the politics of Hitler. That does not mean Haeckel was guilty - he was an unpleasant man, but that's another matter. I'm not so interested in morality in science, because that's absolutely determined. Scientists are moral creatures. Science is this single, narrow pursuit of something. Everything that happens outside that is moral.

But supposing that it were right to say that different races have different characteristics, different intelligences. That isn't to say that you ought to exterminate all the people who are different, so why are you blaming the scientist? Supposing that...

A lot of scientists have colluded in those policies.

..what would you do with that fact?

But what would you do with the fact? I mean, are you saying that there are things we should not discover in case they're abused by politicians?

-No, no. No, I'm not saying that at all. There should be a world which exists prior to science, prior to that discovery, in which the idea of that discovery being used against people would be absurd. That people would be morally mature enough to see that.

-SIR WALPORT: Are you arguing against finding things out?

-No. He's asking me to argue with that.

TINA: For scientists to measure humans... Why do we want, as a society, to transform our fellow human beings into scientific, measurable entities?

I'm sorry, Tina, your question is, why do we want to understand? I don't like that question.

-No, my question is not that. It's what do I want to understand that matters.

-So, right. So...

-What do I want to understand?

-So you are going to dictate to the rest of us what we should be allowed to understand?

That I would be so powerful! Wouldn't we live in a wonderful world!

You would like to be in that position...

No, I love being in this position - a messy conversation where we can all chip in. That's what I want.

SIR WALPORT: But let's take a specific example. So, there's dyslexia. That's one aspect of cognitive ability - your ability to read and write. And we know that there's quite a high heritability to disorders in being able to read and write. Now, does that mean that we shouldn't investigate the genetics of that? Because if we understand that, then we may start explaining a normal range.

SUSAN: No, let's take dyslexia. All a gene does is cause activation of proteins, and what do proteins do? The knock-on, in brain function, is so indirect. So it's much more interesting to ask, what do we mean by intelligence? What do we mean by understanding? What do we mean by human ability? What do you want your kids to be like? They're much more interesting questions than just focusing on IQ and genes...

And who decides what's more interesting? I mean, obviously you, but who else?

Ha! But I think that's a very interesting question - who decides?

It's something that I've been encountering recently in the impact of screen technologies on how kids think. This is a societal problem; it's something that no one sector owns. It's something we should all be debating and sharing. That's when science is at its most interesting and potent.

The moral question, which we're supposed to be discussing, is whether certain kinds of scientific research are dangerous and should not be pursued. Now, I think this is a very important question, and I don't see any limit on what should be pursued.

-It should ALWAYS be open.

-MARK: One of the issues here is that very often it's a misinterpretation of results that ends up with discrimination at the end of the day. I think genetics is a prime example of this. Genetic effects are not, by and large, as deterministic as is commonly assumed. They are probabilistic. If you have a particular genetic type, it doesn't necessarily doom you or predestine you to anything. It might mean you have an elevated risk of something that you should look out for, but...

BRYAN: The question is not that. The question is, if science did discover something like that, what would we do?

Petra, in your unusually applied field - you're, kind of, helping people - do you see a moral purpose for what you're doing?

PETRA: Yes, but it has to be very careful. You know, I've met young people - 12, 13, 14 - who are having sex for money. Who are being abused. And I would like to help those children. It would be easy to say life for those children is terrible and there's a huge amount of suffering, to make people do something - I can completely see why people do that - but I might misrepresent what's happening. It's not to deny those children are suffering, but I have to contextualise that. And that, for me, is where the morality is very important. In that, I have to be... fair and balanced. That's often unpopular, because people want you to toe a political line.

But do you go to work each morning thinking, "I'm helping people"?

Um... I'd like to, yeah.

-You would, yeah.

-BRYAN: But that's a perfectly good justification for medical research...

-I'm not saying it's a bad justification, I wanted to understand what it was.

I mean, do you go to work thinking that?

No, I don't think I do. I think I go to work recognising that ultimately, the more we understand about health, the better it will be for the health of the population. But do I think about the patient first? I am a doctor, but no, I don't. I think about the wonder of science and finding things...

SUSAN: If I'm honest, like with Mark, my immediate goal is to solve the problem. It is to solve the problem.

DICK: Motivation, I think, is a very important question. There's an awful lot of suspicion of research. For example, the attack on genetic modification has always been based on the great, big corporate concern to make lots of profits. Now, there's no doubt that some pharmaceutical companies have misbehaved. But, by and large, what I think is very important to remember is that most people who work for companies are actually motivated to try and improve the human race. Secondly, in some ways, motivation is unimportant. Because the question is not who paid, or what were his motives or her motives, but: is the research reproducible? Does it stand up? Is it something which can, in fact, be supported on objective grounds? Because if someone who is motivated by the worst possible motive produces good research, that is good research.

TINA: So let's all use Josef Mengele's research, then.

-SIR WALPORT: No, come on.

-Well, that is... His research is reproducible. It's abhorrent, but it's reproducible.

SIR WALPORT: Research does have to be conducted ethically.

Well, I don't think that is what Dick is saying, actually.

MARK: We don't know the extent to which Mengele's research is reproducible, because it's not published in an open fashion. And this is one of the critical things about science: that results are published so they can be challenged, with all the data there and the workings there, so that people can go back and examine it. Everybody always asks, "What are the motives for doing it?" And they ought to be asking, "What are the results of the research? Have they been peer reviewed? Have they been independently tested? Have they been reproduced?" It doesn't matter whether it's been done by Monsanto or by Greenpeace. The question is, is it good research or bad research?

SIR WALPORT: But hang on, there are limits on how you find things out. Absolutely, unequivocally.

TINA: It sounds like there aren't, if I listen to what Dick's saying.

-ALL: No!

-It doesn't matter what you do, so long as you can repeat your experiments and show that you have results.

But it seems to me you say, "Look, I can associate Hitler with Mengele. Mengele was a scientist, therefore I've got a problem with science."

-It's just not a good argument.

-Exactly! I completely agree, which is why I think you're making a really bad argument.

PETRA: Repeating research is important, but it's also about contextualising and appraising it. When work is in the public domain, I can look at motivation. I can start asking questions about who's done the research and why they might have been interested in the question.

-DICK: Right.

-And that is important.

SUSAN: The central issue is, what are we going to use science for? There's a term called transhumanism. Transhumanism, which was described recently as the world's most dangerous idea...

-BRYAN: I second that.

-..is the idea that you can enhance physical and mental powers by artificial means, whether they are biomedical or physical. You could have implants in the brain or genetic manipulation or cognitive enhancers or whatever. The notion being that, above and beyond your healthy norm, you go beyond into some other stratosphere of perfection, which I think is a sinister notion.

Isn't the point that we've got to hope that we live in societies which will actually put limits on what's done? And so... I mean, for example, for a long time there's been the ability to investigate whether a couple who've had a child with a severe genetic disease have the opportunity to identify whether a subsequent pregnancy is affected. And there are options. There's IVF, there's abortion. So we've been doing that for a very long time. And I think most medical practitioners - not all - would think that was reasonable to do, and society at large has thought, in a regulated way, it's a reasonable thing to do. But I hope no one round the table would think it was reasonable to select children on the colour of their eyes, or the colour of their hair, or...

DICK: Male or female.

SUSAN: OK, who is physically and mentally perfect, then?

Wait, there's an important add-on to what Mark's just said too, there. Which is that the whole debate about so-called "designer babies" that is happening at the moment is, by and large, founded on science fiction rather than on actual science. In that, with the except... OK, at the moment we have the technology to look for single genes in embryos that have been created by IVF, and ensure that the embryos that carry Huntington's disease or cystic fibrosis are not implanted. It is a HUGE step from that to attempting to select the dozens or hundreds of genes that might influence intelligence or height or good looks or something. For a large number of reasons. And I think it's very important that, at least in the short to medium term, ethical arguments about what is and isn't acceptable are grounded in what's actually possible rather than what might happen in 50 or 100 years' time.

Couldn't you argue the opposite? You could argue that, given that we're not there, this is the time to decide the ethical position.

BRYAN: Transhumanism is the point. I've spent time with transhumanists, and 50% are crazy, but 50% are quite intelligent. And the argument is that it would basically be consumer pressure. Now, because of consumer pressure: I want my baby to be X. Now, it may be a simple decision about what we agree is a medical condition, but it may not. It may be something else. Now, when you get to that point, it's a consumer pressure. And that will come. And transhumanists are all for it.

SUSAN: Can I answer that? It's an important... What is THE perfect brain? You know? Is it that you have a prolonged attention span? Is it that you have a better memory? No, because computers can do those things, and I don't think one would point to Beethoven, Shakespeare and Einstein and say that what they had in common was an attention span. So the notion is that there's no such thing as THE perfect brain anyway. Surely the whole idea is that we're diverse...

BRYAN: Because we're human, we don't know what a human enhancement is.

Exactly, that's my point.

This is sophistry. We're talking about parents who want their children to be more intelligent rather than less. More beautiful rather than less beautiful. Taller rather than less tall.

So, you're a parent, let's say. OK, let's challenge you, Michael.

-OK, you have children. You want your kids to be more intelligent?

-Yeah.

What will they be able to do with this more intelligence? Will they write more symphonies, will they write novels?

I think they'll have a better life, because they'll be engaged in lovely conversations. I think they'll earn more money.

Oh, that's interesting. But this more that you're going to give them, this "more intelligence" - is it a better memory? A longer attention span? What is it?

No, really. I think you're being absolutely sophistical. Go... Go out and ask, just ask anyone. Do you want to be...

-Do you want to be brighter or less bright?

-I'm asking you.

Yes, and I am answering you.

-What...

-I've answered you already. I want them to have the quality of conversation, I want them to have earning power.

Do you think we've all got brilliant memories round the table? No.

DICK: Susan. Susan, just a moment. There are certain things which are already possible. I mean, it may well be that we can stop people having Huntington's chorea. There are certain... health effects which could be prevented. Now, it's not an awfully long step to stop people being too short. You can begin to influence the way in which people develop. Now, this does raise certain ethical problems. You can't deny that.

SUSAN: How short is too short? 5ft 4? 5ft 5?

It's not... Wait a moment. You're avoiding the issue.

I want to know how short you can be before you have to be manipulated.

No, but that doesn't deal with the question, Susan.

The answer to your question... The question to your ques... The answer to your question is 5ft 7, now let's move on.

Tina, is there any moral difference between selecting an embryo that doesn't have Huntington's chorea and selecting an embryo that doesn't have blue eyes?

TINA: I think there is, but I'm cautious on all this. I mean, this is a question not about human perfectibility, which is such a myth and such a fantasy, but about suffering and our capacity to alleviate it. And I can't give a clear answer to that, because I think we need to go on a case-by-case, painful struggle, to say: where can we alleviate human suffering? And if we want to do that, there are millions of children in this world we could just give a square meal to tonight. That would be enough.

I know, I know, I know. All right, let me give you another issue. Is there any moral difference between selecting whether the baby is relieved from having Down's syndrome or whether the baby is going to have blue eyes?

-I'm... not sure.

-All those kids out there with Down's syndrome are waiting to hear your answer.

-That's why I'm not sure.

I fear a world where we judge perfectibility on external criteria that you... I mean, you, Michael. You want genetic manipulation so that you can have richer, brighter kids. If the economic meltdown is anything to go by, if you want richer kids you should get a gene that is stupid and short-sighted. Then your kids will be richer.

Yes, but I don't think your abuse of me entirely deals with the moral problem.

I can't ENTIRELY deal with moral problems - none of us can. We have to say: what CAN we do? What SHOULD we do? And what good will it do if we do it? And for every capacity we have, we need to ask those questions.

DICK: Tina, you haven't answered the question. Because the point is that there's a very big moral difference between selecting somebody who's got blue eyes and somebody who's got Down's syndrome.

-No.

-Why?

What's the moral difference?

There are a lot of differences. One is irrelevant; the other makes a huge difference to the future of their lives.

SIR WALPORT: The issue that we're not discussing is the potential harm from making the choice. You go through a process of in vitro fertilisation - it has side effects. You go through abortion - again, there are risks. The discussion becomes interesting if you get into a universe where you can create an egg in the laboratory, you can create a sperm; then you can start the selection and there's no harm. And that's where this is a decision that shouldn't be made by scientists any more. It's one that, actually, the whole of society has to debate.

What does the whole of society mean? Does it mean anything?

Well, here we mean parliamentary democracy, actually. We mean... Well, it's the b... It's the best we can do. And actually, on in vitro fertilisation and embryo research, Parliament has done well.

-The track record isn't all bad...

-Mixed.

SUSAN: Leaving that is the very exciting question: what kind of society do we want? What people do you want your kids to be? What values and features do we want them to have? For the first time - this is what's so exciting - biomedical science has freed us up to have longer lives, and technology has freed us from the drudgery of the everyday, so for the first time EVER we can have these debates. And it's exciting, but it's a very scary time as well.

Thank you all very much indeed.

'In tomorrow's world of impeccable blue-eyed babies, carbon-neutral electricity and plentiful, aphid-resistant crops, let us hope that our weather will be calm and our polar bears will be safe. But if so, as we sit down to our perfectly balanced meals at the dinner party of the future, will there be anything left to talk about?'


Over dinner, Michael Portillo and seven guests discuss whether scientists can be morally neutral in their pursuit of scientific knowledge. Should scientists take an ethical stand on how science is used? Should there be limits on what scientists are allowed to discover? What ethical dilemmas will be thrown up by the science of the future? Guests include Baroness Susan Greenfield, Mark Henderson and Bryan Appleyard.