Can Scientists Be Morally Neutral?

Transcript

0:00:03 > 0:00:06A scientist, Charles Darwin,

0:00:06 > 0:00:09made us see ourselves differently

0:00:09 > 0:00:12with an explanation of how humans came to exist.

0:00:12 > 0:00:16Even now, his theory of evolution makes many uncomfortable

0:00:16 > 0:00:19and challenges their beliefs.

0:00:19 > 0:00:23Later advances in science have produced many moral dilemmas.

0:00:23 > 0:00:27And in the 20th century, the Nazis used quasi-science

0:00:27 > 0:00:30to justify mass murder.

0:00:30 > 0:00:34In this century, many fear what science could unleash.

0:00:34 > 0:00:37Europe resisted genetically modified crops

0:00:37 > 0:00:41and America blocked stem cell research.

0:00:41 > 0:00:44And we shrink from the prospect of designing our babies.

0:00:44 > 0:00:47What responsibility does the scientist have,

0:00:47 > 0:00:49setting out on the path of research?

0:00:49 > 0:00:53Is it merely to encourage knowledge?

0:00:53 > 0:00:55To grow and to flourish.

0:00:55 > 0:00:59Or should they reflect on the impact that their discoveries,

0:00:59 > 0:01:03such as splitting the atom, could have on humanity?

0:01:03 > 0:01:08Can they safely leave such worries to those who enjoy political power?

0:01:11 > 0:01:15At dinner tonight, helping me to test the role

0:01:15 > 0:01:18of scientists to destruction,

0:01:18 > 0:01:19are a medical scientist,

0:01:19 > 0:01:21a Catholic theologian,

0:01:21 > 0:01:22an Oxford neuroscientist

0:01:22 > 0:01:25and a science journalist.

0:01:29 > 0:01:33Today, we trade in our Petri dishes and our test tubes

0:01:33 > 0:01:39for platters of food and goblets of wine, as we ask...

0:01:44 > 0:01:47My guests are Sir Mark Walport -

0:01:47 > 0:01:51doctor, medical scientist and director of the Wellcome Trust,

0:01:51 > 0:01:54the UK's biggest medical research charity.

0:01:54 > 0:01:59Petra Boynton, who's an agony aunt, sex researcher

0:01:59 > 0:02:04and lecturer at University College London.

0:02:04 > 0:02:07Lord Dick Taverne - one time treasury minister

0:02:07 > 0:02:09and chair of Sense About Science,

0:02:09 > 0:02:13a charity promoting public understanding of science.

0:02:13 > 0:02:18Baroness Susan Greenfield - neuroscientist, cross-bench peer

0:02:18 > 0:02:23and professor of pharmacology at the University Of Oxford.

0:02:23 > 0:02:28Mark Henderson, who is the award-winning science editor of The Times,

0:02:28 > 0:02:32and also, an author and columnist.

0:02:32 > 0:02:35Tina Beattie - Roman Catholic theologian

0:02:35 > 0:02:40and professor of Catholic Studies at Roehampton University.

0:02:40 > 0:02:43And Bryan Appleyard - author, journalist

0:02:43 > 0:02:48and three times award winner at the British Press Awards.

0:02:48 > 0:02:50A scientist splits the atom,

0:02:50 > 0:02:52knowing that that could lead to a nuclear bomb,

0:02:52 > 0:02:55so what is the responsibility of the scientist?

0:02:55 > 0:02:58SIR WALPORT: The responsibility is to argue for peaceful use

0:02:58 > 0:03:01of that information rather than war-like.

0:03:01 > 0:03:03- I mean...- Argue for it?- Yes.

0:03:03 > 0:03:05Scientists are moral beings

0:03:05 > 0:03:09in the same way as every other person is a moral being.

0:03:09 > 0:03:13And I think that the important distinction is that in science,

0:03:13 > 0:03:18I think, there are no limits to how much we can know.

0:03:18 > 0:03:19But where morals come in,

0:03:19 > 0:03:21is how you use that knowledge.

0:03:21 > 0:03:25It's really important that one doesn't say that scientists

0:03:25 > 0:03:27- operate in an amoral sphere.- No.

0:03:27 > 0:03:29So let's focus on splitting the atom.

0:03:29 > 0:03:32Splitting the atom, you can either argue that this is

0:03:32 > 0:03:36what's given us the nuclear bomb which has killed a lot of people

0:03:36 > 0:03:39or what's given us nuclear power which is going to save the planet.

0:03:39 > 0:03:42Right, but I don't think that you should then argue

0:03:42 > 0:03:45that scientists aren't entitled to an opinion

0:03:45 > 0:03:48about whether it's better to use that knowledge to produce power

0:03:48 > 0:03:51as opposed to war. You could go all the way back to fire.

0:03:51 > 0:03:55I've discovered this stuff called fire. In two different societies,

0:03:55 > 0:03:58fire can be used for good purposes or bad purposes.

0:03:58 > 0:04:00Scientists shouldn't be neutered,

0:04:00 > 0:04:03scientists are entitled to express opinion...

0:04:03 > 0:04:06- Indeed, but they're not terribly good at arguing.- SUSAN: No.

0:04:06 > 0:04:09Scientists are as good as anyone else at arguing, actually.

0:04:09 > 0:04:12No, there are professional arguers.

0:04:12 > 0:04:14- But coming back... - But science is about argument,

0:04:14 > 0:04:17it is about resolving uncertainty through experiment.

0:04:17 > 0:04:20- That's science. - We're not good at putting our case

0:04:20 > 0:04:24- in general terms in a way that people...- Some of us are...

0:04:24 > 0:04:27- MARK: I disagree...- Susan, you're rarely off the television.

0:04:27 > 0:04:29You can't possibly...

0:04:29 > 0:04:33MARK: What Susan said about scientists not being good at arguing

0:04:33 > 0:04:36was probably true ten years ago.

0:04:36 > 0:04:39And actually, partly because of people like Susan and Dick

0:04:39 > 0:04:42and initiatives that they've set up,

0:04:42 > 0:04:45I think, that probably has changed to a degree...

0:04:45 > 0:04:48I think there's still a way to go, I don't know how Dick feels.

0:04:48 > 0:04:52This is a truly important point for the scientists round the table -

0:04:52 > 0:04:55culturally, in our community, it is still considered, in a sense,

0:04:55 > 0:04:58a no-go area, but certainly less than desirable area

0:04:58 > 0:05:01to talk to the media a lot. And to "dumb down".

0:05:01 > 0:05:04You're, kind of, sneered at still, I'm afraid.

0:05:04 > 0:05:06So only a few people will do this.

0:05:06 > 0:05:10And so despite what you say about it now becoming respectable...

0:05:10 > 0:05:11- It's not.- Culturally,

0:05:11 > 0:05:14it's an important issue among the scientific community.

0:05:14 > 0:05:18DICK: Can I come back to the question of morality in science?

0:05:18 > 0:05:21Because Mark raised an important point.

0:05:21 > 0:05:24Science, as he indicated, is morally neutral.

0:05:24 > 0:05:27I mean, it can be used for good or ill.

0:05:27 > 0:05:31Scientists are not morally neutral, they're very much involved.

0:05:31 > 0:05:34I think there's a further connection between science and morality

0:05:34 > 0:05:36which is often neglected.

0:05:36 > 0:05:40Because science, by being against superstition,

0:05:40 > 0:05:44against dogmatism, because it's tentative knowledge,

0:05:44 > 0:05:47because it's questioning and therefore anti-authority,

0:05:47 > 0:05:50it's anti-authority and therefore anti-autocracy.

0:05:50 > 0:05:53Science is, on the whole, a force for tolerance

0:05:53 > 0:05:56and also very relevant to human rights

0:05:56 > 0:05:58which are very great moral issues.

0:05:58 > 0:06:02Because a lot of the dispute about human rights

0:06:02 > 0:06:05is based on disputes about misconceptions.

0:06:05 > 0:06:06None of those things...

0:06:06 > 0:06:08That blacks are less intelligent,

0:06:08 > 0:06:10- that women are second class citizens.- Absolutely,

0:06:10 > 0:06:13because people have made mistakes,

0:06:13 > 0:06:17because science has managed to dispel the misconceptions there were

0:06:17 > 0:06:19about the characteristics of gender.

0:06:19 > 0:06:21BRYAN: You're making a category error.

0:06:21 > 0:06:24You're talking about a process called science

0:06:24 > 0:06:26which works by being neutral.

0:06:26 > 0:06:29Now, the category error you're making is saying science is not

0:06:29 > 0:06:32tolerant of dictatorship, tyranny, all those things.

0:06:32 > 0:06:36- Historically...- It questions authority as Galileo did.

0:06:36 > 0:06:38..it's been very useful to tyranny.

0:06:38 > 0:06:40Technology can be used for good or ill,

0:06:40 > 0:06:44but science itself is questioning and therefore is anti-authoritarian.

0:06:44 > 0:06:46The Russian scientist who conducted

0:06:46 > 0:06:49a massive biological weapons programme,

0:06:49 > 0:06:52did produce some very effective biological weapons.

0:06:52 > 0:06:55The point is, science is a very neutral, very narrow activity.

0:06:55 > 0:06:57That's what you've got.

0:06:57 > 0:07:00The other thing is the rhetoric surrounding science.

0:07:00 > 0:07:03You're bringing morals from outside to attach to science.

0:07:03 > 0:07:06MARK: There is one point though that Dick has made

0:07:06 > 0:07:07that is very critical

0:07:07 > 0:07:11and that is that science is inherently anti-authority.

0:07:11 > 0:07:14In that if an authority figure says something,

0:07:14 > 0:07:17all you need to get around that is evidence.

0:07:17 > 0:07:22When Marshall and Warren came up with the idea that stomach ulcers

0:07:22 > 0:07:24were caused by a bacterium,

0:07:24 > 0:07:27everybody laughed at them until they proved their case.

0:07:27 > 0:07:29And science changed as a result of that.

0:07:29 > 0:07:33There are countless examples of that where an unorthodox opinion,

0:07:33 > 0:07:37provided that people provide the evidence, can take over.

0:07:37 > 0:07:39PETRA: That presumes one view of science

0:07:39 > 0:07:41and this is where the issue about gender,

0:07:41 > 0:07:43would be lovely if that was true.

0:07:43 > 0:07:46I'm sure we'd have a great time if that were true.

0:07:46 > 0:07:49You know, evolutionary psychology doesn't think women are equal.

0:07:49 > 0:07:51The lived experience of women,

0:07:51 > 0:07:54in culture, and of women in science particularly,

0:07:54 > 0:07:58is that science hasn't always worked in that sense at all.

0:07:58 > 0:08:01It's often worked to prove that we shouldn't be studying.

0:08:01 > 0:08:03Do you think that it's a coincidence then,

0:08:03 > 0:08:06that the position of women has approached equality

0:08:06 > 0:08:10in those societies that are most scientifically advanced?

0:08:11 > 0:08:12No.

0:08:12 > 0:08:14Well, again it depends on how you view equality.

0:08:14 > 0:08:16Look at the Islamic world.

0:08:16 > 0:08:19TINA: Well, excuse me, I think this is really problematic.

0:08:19 > 0:08:23If we're going to have this discussion, let's be scientific.

0:08:23 > 0:08:25Countries that are tyrannies, dictatorships,

0:08:25 > 0:08:27not at all like the, sort of,

0:08:27 > 0:08:31gender egalitarian idyll that we're dreaming of in the West,

0:08:31 > 0:08:34use science just as effectively as we do.

0:08:34 > 0:08:37- But I don't think that was the argument.- No, that's not.

0:08:37 > 0:08:40SUSAN: Let's remind ourselves of what science is.

0:08:40 > 0:08:43Science is a certain methodology,

0:08:43 > 0:08:46it's a certain way of doing something

0:08:46 > 0:08:48and let's not dignify it and sanctify it

0:08:48 > 0:08:52as though anyone doing science has the answer.

0:08:52 > 0:08:54There's bad scientists, good...

0:08:54 > 0:08:58If your knowledge is tentative, scientific knowledge,

0:08:58 > 0:09:02you're not going to be dogmatic the way that other people are.

0:09:02 > 0:09:06I do want to get an idea of... the, kind of, staircase here.

0:09:06 > 0:09:09We're quite certain the Earth is round, are we?

0:09:09 > 0:09:11THEY LAUGH

0:09:13 > 0:09:15- Well, I'm not so sure. - And the sun goes round it?

0:09:15 > 0:09:17- No.- And the Earth goes round the sun.

0:09:17 > 0:09:24We're quite sure the world is round. Are we sure that HIV causes AIDS?

0:09:24 > 0:09:28- Yes.- Are we sure that man is causing climate change?

0:09:28 > 0:09:30We're...pretty sure, yeah.

0:09:30 > 0:09:34And are we sure that GM foods are a good thing and safe?

0:09:34 > 0:09:37No, because that's the wrong question.

0:09:37 > 0:09:40The question is, what GM food for what purpose?

0:09:40 > 0:09:43- MARK: Absolutely.- There is nothing generic about GM foods.

0:09:43 > 0:09:47Let's take the current one which is synthetic biology.

0:09:47 > 0:09:50Is it a good thing to synthesise a virus that will carry

0:09:50 > 0:09:53some terrible toxin and kill the world? Of course not.

0:09:53 > 0:09:55Is it sensible to try and develop

0:09:55 > 0:09:58organisms that will make antibiotics? Yes.

0:09:58 > 0:10:00And so it's about specifics there.

0:10:00 > 0:10:03GM foods are not generic, they're something specific.

0:10:03 > 0:10:07You can make the argument about something like guns or hot water.

0:10:07 > 0:10:10I mean, you can use them for good purposes and bad purposes.

0:10:10 > 0:10:13Right, and so I want to ask Bryan,

0:10:13 > 0:10:18do you believe that science should restrict itself? Limit itself

0:10:18 > 0:10:21to the development of things that have good moral outcomes.

0:10:21 > 0:10:23Obviously some outcomes are undesirable -

0:10:23 > 0:10:27we don't want biological weapons in the hands of rogue states.

0:10:27 > 0:10:31The problem with science is its immense power.

0:10:31 > 0:10:36You have to look at not just the moment of doing the science,

0:10:36 > 0:10:39but the moment people step away and it becomes real.

0:10:39 > 0:10:43Now, a very respectable scientist, in his day, was Ernst Haeckel.

0:10:43 > 0:10:44Generally his bit of biology

0:10:44 > 0:10:47seemed pretty good, at the time. Right?

0:10:47 > 0:10:49Well, he informed the politics of Mein Kampf.

0:10:49 > 0:10:50That is a serious point.

0:10:50 > 0:10:53- Well, is it? Is it?- Yes.

0:10:53 > 0:10:58- You're saying that this...- I'm saying is...- ..this lovely, liberal

0:10:58 > 0:11:02human kind loving man, was in some way responsible for Hitler?

0:11:02 > 0:11:09No, I'm saying that the science he produced entailed Nazism.

0:11:09 > 0:11:16It was inherent in his science to be adopted by the politics of Hitler.

0:11:16 > 0:11:19That does not mean Haeckel was guilty,

0:11:19 > 0:11:20he was an unpleasant man,

0:11:20 > 0:11:22but that's another matter.

0:11:22 > 0:11:25I'm not so interested in morality in science,

0:11:25 > 0:11:27because that's absolutely determined.

0:11:27 > 0:11:29Scientists are moral creatures.

0:11:29 > 0:11:33Science is this single, narrow pursuit of something.

0:11:33 > 0:11:35Everything that happens outside that is moral.

0:11:35 > 0:11:38But supposing that it were right to say that different races

0:11:38 > 0:11:42have different characteristics, different intelligences.

0:11:42 > 0:11:45That isn't to say that you ought to exterminate all the people

0:11:45 > 0:11:49who are different, so why are you blaming the scientist?

0:11:49 > 0:11:50Supposing that...

0:11:50 > 0:11:52A lot of scientists have colluded in those policies.

0:11:52 > 0:11:54..what would you do with that fact?

0:11:54 > 0:11:58But what would you do with the fact? I mean, are you saying

0:11:58 > 0:12:00that there are things we should not discover

0:12:00 > 0:12:04- in case they're abused by politicians?- No, no.

0:12:04 > 0:12:06No, I'm not saying that at all.

0:12:06 > 0:12:10There should be a world which exists prior to science,

0:12:10 > 0:12:13prior to that discovery, in which the idea of that discovery

0:12:13 > 0:12:16being used against people would be absurd.

0:12:16 > 0:12:19That people would be morally mature enough to see that.

0:12:19 > 0:12:22- SIR WALPORT: Are you arguing against finding things out?- No.

0:12:22 > 0:12:24He's asking me to argue with that.

0:12:24 > 0:12:27TINA: For scientists to measure humans... Why do we want,

0:12:27 > 0:12:30as a society, to transform our fellow human beings

0:12:30 > 0:12:32into scientific measurable entities?

0:12:32 > 0:12:36I'm sorry, Tina, your question is, why do we want to understand?

0:12:36 > 0:12:39- I don't like that question. - No, my question is not that.

0:12:39 > 0:12:43It's what do I want to understand that matters.

0:12:43 > 0:12:46- So, right. So... - What do I want to understand?

0:12:46 > 0:12:48So you are going to dictate to the rest of us

0:12:48 > 0:12:50what we should be allowed to understand?

0:12:50 > 0:12:52That I would be so powerful!

0:12:52 > 0:12:55Wouldn't we live in a wonderful world!

0:12:55 > 0:12:57You would like to be in that position...

0:12:57 > 0:13:00No, I love being in this position - a messy conversation

0:13:00 > 0:13:02where we can all chip in, that's what I want.

0:13:02 > 0:13:05SIR WALPORT: But let's take a specific example.

0:13:05 > 0:13:06So, there's dyslexia.

0:13:06 > 0:13:09So, that's one aspect of cognitive ability -

0:13:09 > 0:13:11your ability to read and write.

0:13:11 > 0:13:14And we know that there's quite a high hereditability

0:13:14 > 0:13:16to disorders in being able to read and write.

0:13:16 > 0:13:19Now, does that mean that we shouldn't investigate

0:13:19 > 0:13:21the genetics of that? Because if we understand that

0:13:21 > 0:13:24then we may start explaining a normal range.

0:13:24 > 0:13:26SUSAN: No, let's take dyslexia,

0:13:26 > 0:13:30all a gene does is cause activation of proteins and what do proteins do?

0:13:30 > 0:13:33The knock on, in brain function, is so indirect.

0:13:33 > 0:13:35So therefore, much more interesting to ask,

0:13:35 > 0:13:38what do we mean by intelligence?

0:13:38 > 0:13:41What do we mean by understanding? What do we mean by human ability?

0:13:41 > 0:13:43What do you want your kids to be like?

0:13:43 > 0:13:46They're much more interesting questions

0:13:46 > 0:13:48than just focusing on IQ and genes...

0:13:48 > 0:13:51And who decides what's more interesting?

0:13:51 > 0:13:53I mean, obviously you, but who else?

0:13:53 > 0:13:57Ha! But I think a very interesting question, who decides?

0:13:57 > 0:14:00It's something that I've been encountering recently

0:14:00 > 0:14:03in the impact of screen technologies and how kids think.

0:14:03 > 0:14:07This is a societal problem, it's something that no one sector owns.

0:14:07 > 0:14:09Something we should all be debating and sharing.

0:14:09 > 0:14:12That's when science is at its most interesting and potent.

0:14:12 > 0:14:16The moral question, which we're supposed to be discussing,

0:14:16 > 0:14:19is whether certain kinds of scientific research

0:14:19 > 0:14:24are dangerous and should not be pursued.

0:14:24 > 0:14:26Now, I think this is a very important question

0:14:26 > 0:14:29and I don't see any limit on what should be pursued.

0:14:29 > 0:14:32- It should ALWAYS be open. - MARK: One of the issues here

0:14:32 > 0:14:36is that very often it's a misinterpretation of results

0:14:36 > 0:14:40that ends up with discrimination at the end of the day.

0:14:40 > 0:14:43I think, genetics is a prime example of this.

0:14:43 > 0:14:47Genetic effects are not, by and large, as deterministic

0:14:47 > 0:14:50as is commonly assumed. They are probabilistic.

0:14:50 > 0:14:52If you have a particular genetic type,

0:14:52 > 0:14:56it doesn't necessarily doom you or predestine you to anything.

0:14:56 > 0:14:59It might mean you have an elevated risk of something

0:14:59 > 0:15:01that you should look out for, but...

0:15:01 > 0:15:04BRYAN: The question is not that, the question is,

0:15:04 > 0:15:07if science did discover something like that, what would we do?

0:15:07 > 0:15:11Petra, in your unusually applied field,

0:15:11 > 0:15:13you're, kind of, helping people,

0:15:13 > 0:15:15do you see a moral purpose for what you're doing?

0:15:15 > 0:15:19Yes, but it has to be very careful.

0:15:19 > 0:15:25You know, I've met young people, 12, 13, 14, who are having sex for money.

0:15:25 > 0:15:26Who are being abused.

0:15:26 > 0:15:29And I would like to help those children.

0:15:29 > 0:15:32It would be easy to say, life for those children is terrible

0:15:32 > 0:15:35and there's a huge amount of suffering.

0:15:35 > 0:15:38To make people do something, I can completely see why people do that,

0:15:38 > 0:15:41but I might misrepresent what's happening.

0:15:41 > 0:15:43It's not to deny those children are suffering,

0:15:43 > 0:15:45but I have to contextualise that.

0:15:45 > 0:15:49And that, for me, is where the morality is very important.

0:15:49 > 0:15:53In that, I have to be... fair and balanced.

0:15:53 > 0:15:54That's often unpopular,

0:15:54 > 0:15:57because people want you to toe a political line.

0:15:57 > 0:16:01But do you go to work each morning thinking, "I'm helping people."

0:16:01 > 0:16:04Um... I'd like to, yeah.

0:16:04 > 0:16:08- You would, yeah.- BRYAN: But that's a perfectly good justification

0:16:08 > 0:16:11- for medical research...- I'm not saying it's a bad justification,

0:16:11 > 0:16:14I wanted to understand what it was.

0:16:14 > 0:16:16I mean, do you go to work thinking that?

0:16:16 > 0:16:17No, I don't think I do.

0:16:17 > 0:16:19I think I go to work recognising

0:16:19 > 0:16:22that ultimately the more we understand about health,

0:16:22 > 0:16:26it will be good for the health of the population.

0:16:26 > 0:16:29But do I think about the patient first?

0:16:29 > 0:16:30I am a doctor, but no I don't.

0:16:30 > 0:16:33I think about the wonder of science and finding things...

0:16:33 > 0:16:36SUSAN: If I'm honest, like with Mark,

0:16:36 > 0:16:40my immediate goal is to solve the problem. Is to solve the problem.

0:16:40 > 0:16:43Motivation, I think, is a very important question.

0:16:43 > 0:16:47There's an awful lot of suspicion of research.

0:16:47 > 0:16:51For example, the attack on genetic modification's always been based on

0:16:51 > 0:16:55great, big corporate concern to make lots of profits.

0:16:55 > 0:16:56Now, there's no doubt

0:16:56 > 0:17:00that some pharmaceutical companies have misbehaved.

0:17:00 > 0:17:03But, by and large, what I think is very important to remember

0:17:03 > 0:17:06is that most people who work for companies

0:17:06 > 0:17:09are actually motivated to try and improve the human race.

0:17:09 > 0:17:13Secondly, in some ways, motivation is unimportant.

0:17:13 > 0:17:16Because the question is not who paid

0:17:16 > 0:17:19or what were his motives or her motives,

0:17:19 > 0:17:23but, is the research reproducible?

0:17:23 > 0:17:29Does it stand up? Is it something which can, in fact, be supported

0:17:29 > 0:17:32on objective grounds? Because if someone who is motivated

0:17:32 > 0:17:36by the worst possible motive produces good research,

0:17:36 > 0:17:37that is good research.

0:17:37 > 0:17:40TINA: So let's all use Joseph Mengele's research, then.

0:17:40 > 0:17:43- SIR WALPORT: No, come on. - Well, that is...

0:17:43 > 0:17:45His research is reproducible.

0:17:45 > 0:17:47It's abhorrent, but it's reproducible.

0:17:47 > 0:17:51SIR WALPORT: Research does have to be conducted ethically.

0:17:51 > 0:17:54Well, I don't think that is what Dick is saying, actually.

0:17:54 > 0:17:56MARK: We don't know the extent to which

0:17:56 > 0:17:59Mengele's research is reproducible,

0:17:59 > 0:18:01because it's not published in an open fashion

0:18:01 > 0:18:05and this is one of the critical things about science,

0:18:05 > 0:18:08that results are published so they can be challenged

0:18:08 > 0:18:12with all the data there and the workings there,

0:18:12 > 0:18:16so that people can go back and examine it and it's the...

0:18:16 > 0:18:19Everybody always asks, "What are the motives for doing it?"

0:18:19 > 0:18:24And they ought to be asking, "What are the results of the research?

0:18:24 > 0:18:25"Have they been peer reviewed?

0:18:25 > 0:18:29"Have they been independently tested? Have they been reproduced?"

0:18:29 > 0:18:32It doesn't matter whether it's been done by Monsanto

0:18:32 > 0:18:36or by Greenpeace. The question is, is it good research or bad research?

0:18:36 > 0:18:39SIR WALPORT: But hang on, there are limits on how you find things out.

0:18:39 > 0:18:41Absolutely, unequivocally.

0:18:41 > 0:18:45TINA: It sounds like there aren't, if I listen to what Dick's saying.

0:18:45 > 0:18:47- ALL: No! - It doesn't matter what you do,

0:18:47 > 0:18:50so long as you can repeat your experiments

0:18:50 > 0:18:52and show that you have results.

0:18:52 > 0:18:56But it seems to me, you say, "Look. I can associate Hitler with Mengele.

0:18:56 > 0:18:57"Mengele was a scientist,

0:18:57 > 0:19:00"therefore I've got a problem with science."

0:19:00 > 0:19:04- It's just not a good argument. - Exactly! I completely agree,

0:19:04 > 0:19:08which is why I think you're making a really bad argument.

0:19:08 > 0:19:10PETRA: Repeating research is important,

0:19:10 > 0:19:13but it's contextualising and appraising it.

0:19:13 > 0:19:14When work is in the public domain,

0:19:14 > 0:19:16I can look at motivation.

0:19:16 > 0:19:19I can start asking questions about who's done the research

0:19:19 > 0:19:22and why they might have been interested in the question.

0:19:22 > 0:19:25- DICK: Right.- And that is important.

0:19:25 > 0:19:28SUSAN: The central issue is what are we going to use science for?

0:19:28 > 0:19:30There's a term called transhumanism.

0:19:30 > 0:19:33Transhumanism, which was described recently

0:19:33 > 0:19:36- as the world's most dangerous idea...- BRYAN: I second that.

0:19:36 > 0:19:39..is you can enhance physical and mental powers by

0:19:39 > 0:19:42artificial means whether they are biomedical or physical.

0:19:42 > 0:19:45You could have implants in the brain or genetic manipulation

0:19:45 > 0:19:47or cognitive enhancers or whatever.

0:19:47 > 0:19:51The notion being, that above and beyond your healthy norm,

0:19:51 > 0:19:54you go beyond into some other stratosphere of perfection

0:19:54 > 0:19:56which I think is a sinister notion.

0:19:56 > 0:19:58Isn't the point that we've got to hope

0:19:58 > 0:20:03that we live in societies which will actually put limits on what's done?

0:20:03 > 0:20:06And so... I mean, for example,

0:20:06 > 0:20:10for a long time there's been the ability to investigate

0:20:10 > 0:20:13whether a couple who've had a child

0:20:13 > 0:20:16with a severe genetic disease have the opportunity

0:20:16 > 0:20:19to identify whether a subsequent pregnancy's affected.

0:20:19 > 0:20:20And there are options.

0:20:20 > 0:20:22There's IVF, there's abortion.

0:20:22 > 0:20:25So we've been doing that for a very long time.

0:20:25 > 0:20:27And I think most medical practitioners, not all,

0:20:27 > 0:20:31would think that was reasonable to do and society at large has thought,

0:20:31 > 0:20:34in a regulated way, it's a reasonable thing to do.

0:20:34 > 0:20:37But I hope no-one round the table would think it was reasonable

0:20:37 > 0:20:40to select children on the colour of their eyes,

0:20:40 > 0:20:42or the colour of their hair or...

0:20:42 > 0:20:43DICK: Male or female.

0:20:43 > 0:20:46SUSAN: OK, who is physically and mentally perfect, then?

0:20:46 > 0:20:50Wait, there's an important add-on to what Mark's just said too, there.

0:20:50 > 0:20:54Which is that, the whole debate about so-called "designer babies"

0:20:54 > 0:20:58that is happening at the moment,

0:20:58 > 0:21:01is, by and large, founded on science fiction

0:21:01 > 0:21:03rather than on actual science.

0:21:03 > 0:21:05In that, with the except...

0:21:05 > 0:21:09OK, at the moment, we have the technology to look for single genes

0:21:09 > 0:21:14in embryos that have been created by IVF and ensure that the embryos

0:21:14 > 0:21:18that do not carry Huntington's disease or cystic fibrosis

0:21:18 > 0:21:22are not implanted. It is a HUGE step

0:21:22 > 0:21:28from that to attempting to select the dozens or hundreds of genes

0:21:28 > 0:21:31that might influence intelligence or height or good looks or something.

0:21:31 > 0:21:33For a large number of reasons.

0:21:33 > 0:21:38And I think it's very important that, at least in the short to medium term,

0:21:38 > 0:21:40that, actually, ethical arguments

0:21:40 > 0:21:43about what is and isn't acceptable ethically

0:21:43 > 0:21:46are grounded in what's actually possible

0:21:46 > 0:21:49rather than what might happen in 50 or 100 years' time.

0:21:49 > 0:21:51Couldn't you argue the opposite?

0:21:51 > 0:21:54You could argue that given that we're not there,

0:21:54 > 0:21:57this is the time to decide the ethical position.

0:21:57 > 0:22:00BRYAN: Transhumanism is the point. I've spent time with transhumanists

0:22:00 > 0:22:03and 50% are crazy, but 50% are quite intelligent.

0:22:03 > 0:22:07And the argument is that it'd basically be consumer pressure.

0:22:07 > 0:22:10Now, because of consumer pressure, I want my baby to be X.

0:22:10 > 0:22:11Now, it may be a simple decision

0:22:11 > 0:22:14about what we agree is a medical condition,

0:22:14 > 0:22:16but it may not. It may be something else.

0:22:16 > 0:22:19Now, when you get to that point, it's a consumer pressure.

0:22:19 > 0:22:21And that will come.

0:22:21 > 0:22:24And transhumanists are all for it.

0:22:24 > 0:22:26SUSAN: Can I answer that? It's an important...

0:22:26 > 0:22:28What is THE perfect brain?

0:22:28 > 0:22:31You know? Is it that you have prolonged attention span?

0:22:31 > 0:22:34Is it that you have a better memory? No, because computers

0:22:34 > 0:22:37can do those things and I don't think one would point

0:22:37 > 0:22:39to Beethoven, Shakespeare and Einstein

0:22:39 > 0:22:42and say that what they had in common was an attention span.

0:22:42 > 0:22:43So, the notion is

0:22:43 > 0:22:47that there's no such thing as THE perfect brain anyway.

0:22:47 > 0:22:49Surely the whole idea is that we're diverse...

0:22:49 > 0:22:53BRYAN: Because we're human, we don't know what a human enhancement is.

0:22:53 > 0:22:54Exactly, that's my point.

0:22:54 > 0:22:56This is sophistry.

0:22:56 > 0:22:59We're talking about parents who want their children to be

0:22:59 > 0:23:01more intelligent rather than less.

0:23:01 > 0:23:04More beautiful rather than less beautiful.

0:23:04 > 0:23:05More tall rather than less tall.

0:23:05 > 0:23:09So, you're a parent, let's say. OK, let's challenge you, Michael.

0:23:09 > 0:23:13- OK, you have children. You want your kids to be more intelligent?- Yeah.

0:23:13 > 0:23:16What will they be able to do with this more intelligence?

0:23:16 > 0:23:20Will they write more symphonies, will they write novels?

0:23:20 > 0:23:24I think they'll have a better life, because they'll be engaged

0:23:24 > 0:23:27in lovely conversations. I think they'll earn more money.

0:23:27 > 0:23:29Oh, that's interesting.

0:23:29 > 0:23:33But this more that you're going to give them, this "more intelligence",

0:23:33 > 0:23:36is it a better memory? A longer attention span? What is it?

0:23:36 > 0:23:40No, really. I think you're being absolutely sophistical.

0:23:40 > 0:23:44Go... Go out and ask, just ask anyone. Do you want to be...

0:23:44 > 0:23:47- Do you want to be brighter or less bright?- I'm asking you.

0:23:47 > 0:23:49Yes, and I am answering you.

0:23:49 > 0:23:51- What...- I've answered you already.

0:23:51 > 0:23:53I want them to have the quality of conversation,

0:23:53 > 0:23:55I want them to have earning power.

0:23:55 > 0:23:59Do you think we've all got brilliant memories round the table? No.

0:23:59 > 0:24:00DICK: Susan. Susan, just a moment.

0:24:00 > 0:24:04There are certain things which are already possible.

0:24:04 > 0:24:08I mean, it may well be that we can stop people having

0:24:08 > 0:24:14Huntington's chorea. There are certain...health effects

0:24:14 > 0:24:16which could be prevented.

0:24:16 > 0:24:22Now, it's not an awfully long step to stop people being too short.

0:24:22 > 0:24:27You can begin to influence the way in which people develop.

0:24:27 > 0:24:33Now, this does raise certain ethical problems.

0:24:33 > 0:24:35You can't deny that.

0:24:35 > 0:24:37SUSAN: How short is too short? 5ft 4? 5ft 5?

0:24:37 > 0:24:39It's not... Wait a moment.

0:24:39 > 0:24:40You're avoiding the issue.

0:24:40 > 0:24:44I want to know how short you can be before you have to be manipulated.

0:24:44 > 0:24:47No, but that doesn't deal with the question, Susan.

0:24:47 > 0:24:51The answer to your question. The question to your ques...

0:24:51 > 0:24:54The answer to your question is 5ft 7, now let's move on.

0:24:54 > 0:24:57Tina, is there any moral difference between selecting an embryo

0:24:57 > 0:25:00that doesn't have Huntington's chorea

0:25:00 > 0:25:03and selecting an embryo that doesn't have blue eyes?

0:25:03 > 0:25:04I think there is,

0:25:04 > 0:25:06but I'm cautious on all this.

0:25:06 > 0:25:10I mean, this is a question not about human perfectibility

0:25:10 > 0:25:13which is such a myth and such a fantasy.

0:25:13 > 0:25:17But about suffering and our capacity to alleviate it.

0:25:17 > 0:25:20And I can't give a clear answer to that,

0:25:20 > 0:25:22because I think we need to go on a case by case,

0:25:22 > 0:25:26painful struggle, to say, where can we alleviate human suffering?

0:25:26 > 0:25:30And if we want to do that, there are millions of children in this world

0:25:30 > 0:25:34we could just give a square meal to tonight. That would be enough.

0:25:34 > 0:25:37I know, I know, I know. All right, let me give you another issue.

0:25:37 > 0:25:41Is there any moral difference between selecting whether the baby

0:25:41 > 0:25:43is relieved from having Down's syndrome

0:25:43 > 0:25:45or whether the baby is going to have blue eyes?

0:25:49 > 0:25:53- I'm...not sure.- All those kids out there with Down's syndrome

0:25:53 > 0:25:56- are waiting to hear your answer. - That's why I'm not sure.

0:25:56 > 0:25:58I fear a world where we judge perfectibility

0:25:58 > 0:26:01on external criteria that you... I mean, you, Michael.

0:26:01 > 0:26:06You want genetic manipulation so that you can have richer, brighter kids.

0:26:06 > 0:26:10If the economic meltdown is anything to go by, if you want richer kids

0:26:10 > 0:26:13you should get a gene that is stupid and short-sighted.

0:26:13 > 0:26:15Then your kids will be richer.

0:26:15 > 0:26:18Yes, but I don't think your abuse of me entirely deals

0:26:18 > 0:26:19with the moral problem.

0:26:19 > 0:26:23I can't ENTIRELY deal with moral problems, none of us can.

0:26:23 > 0:26:26We have to say, what CAN we do? What SHOULD we do?

0:26:26 > 0:26:28And what good will it do, if we do it?

0:26:28 > 0:26:32And every capacity we have, we need to ask those questions.

0:26:32 > 0:26:35DICK: Tina, you haven't answered the question.

0:26:35 > 0:26:38Because the question is that there's a very big moral difference

0:26:38 > 0:26:41between selecting somebody who's got blue eyes

0:26:41 > 0:26:43- and somebody who's got Down's syndrome.- No.- Why?

0:26:43 > 0:26:46What's the moral difference?

0:26:46 > 0:26:48There are a lot of differences.

0:26:48 > 0:26:49One is irrelevant,

0:26:49 > 0:26:52the other makes a huge difference to the future of their lives.

0:26:52 > 0:26:55SIR MARK: The issue that we're not discussing

0:26:55 > 0:26:57is the potential harm from making the choice.

0:26:57 > 0:27:00You go through a process of in vitro fertilisation,

0:27:00 > 0:27:02it has side effects.

0:27:02 > 0:27:05You go through abortion, again there are risks.

0:27:05 > 0:27:07The discussion becomes interesting

0:27:07 > 0:27:11if you get into a universe where you can create an egg in the laboratory,

0:27:11 > 0:27:12you can create a sperm,

0:27:12 > 0:27:15then you can start the selection and there's no harm.

0:27:15 > 0:27:17And that's where this is a decision

0:27:17 > 0:27:20that shouldn't be made by scientists any more.

0:27:20 > 0:27:23It's one that, actually, the whole of society has to debate.

0:27:23 > 0:27:27What does the whole of society mean? Does it mean anything?

0:27:27 > 0:27:31Well, here we mean parliamentary democracy, actually. We mean...

0:27:31 > 0:27:34Well, it's the b... It's the best we can do.

0:27:34 > 0:27:37And actually, on in vitro fertilisation and embryo research

0:27:37 > 0:27:38Parliament has done well.

0:27:38 > 0:27:42- The track record isn't all bad...- Mixed.

0:27:42 > 0:27:45SUSAN: Leaving that aside, there is the very exciting question,

0:27:45 > 0:27:47what kind of society do we want?

0:27:47 > 0:27:49What people do you want your kids to be?

0:27:49 > 0:27:52What values and features do we want them to have?

0:27:52 > 0:27:54For the first time, this is what's so exciting,

0:27:54 > 0:27:59biomedical science has freed us up to have longer lives,

0:27:59 > 0:28:02technology's freed us from the drudgery of the everyday,

0:28:02 > 0:28:05so for the first time EVER we can debate these issues.

0:28:05 > 0:28:08And it's exciting, but it's a very scary time as well.

0:28:08 > 0:28:10Thank you all very much indeed.

0:28:10 > 0:28:15'In tomorrow's world of impeccable blue-eyed babies,

0:28:15 > 0:28:17'carbon neutral electricity,

0:28:17 > 0:28:19'and plentiful, aphid-resistant crops,

0:28:19 > 0:28:22'let us hope that our weather will be calm

0:28:22 > 0:28:24'and our polar bears will be safe.

0:28:24 > 0:28:29'But if so, as we sit down to our perfectly balanced meals

0:28:29 > 0:28:31'at the dinner party of the future,

0:28:31 > 0:28:34'will there be anything left to talk about?'

0:28:42 > 0:28:45Subtitles by Red Bee Media Ltd

0:28:45 > 0:28:48E-mail subtitling@bbc.co.uk