
| Line | From | To | |
|---|---|---|---|
A scientist, Charles Darwin, | 0:00:03 | 0:00:06 | |
made us see ourselves differently | 0:00:06 | 0:00:09 | |
with an explanation of how humans came to exist. | 0:00:09 | 0:00:12 | |
Even now, his theory of evolution makes many uncomfortable | 0:00:12 | 0:00:16 | |
and challenges their beliefs. | 0:00:16 | 0:00:19 | |
Later advances in science have produced many moral dilemmas. | 0:00:19 | 0:00:23 | |
And in the 20th century, the Nazis used quasi-science | 0:00:23 | 0:00:27 | |
to justify mass murder. | 0:00:27 | 0:00:30 | |
In this century, many fear what science could unleash. | 0:00:30 | 0:00:34 | |
Europe resisted genetically modified crops | 0:00:34 | 0:00:37 | |
and America blocked stem cell research. | 0:00:37 | 0:00:41 | |
And we shrink from the prospect of designing our babies. | 0:00:41 | 0:00:44 | |
What responsibility does the scientist have, | 0:00:44 | 0:00:47 | |
setting out on the path of research? | 0:00:47 | 0:00:49 | |
Is it merely to encourage knowledge | 0:00:49 | 0:00:53 | |
to grow and to flourish? | 0:00:53 | 0:00:55 | |
Or should they reflect on the impact that their discoveries, | 0:00:55 | 0:00:59 | |
such as splitting the atom, could have on humanity? | 0:00:59 | 0:01:03 | |
Can they safely leave such worries to those who enjoy political power? | 0:01:03 | 0:01:08 | |
At dinner tonight, helping me to test the role | 0:01:11 | 0:01:15 | |
of scientists to destruction, | 0:01:15 | 0:01:18 | |
are a medical scientist, | 0:01:18 | 0:01:19 | |
a Catholic theologian, | 0:01:19 | 0:01:21 | |
an Oxford neuroscientist | 0:01:21 | 0:01:22 | |
and a science journalist. | 0:01:22 | 0:01:25 | |
Today, we trade in our Petri dishes and our test tubes | 0:01:29 | 0:01:33 | |
for platters of food and goblets of wine, as we ask... | 0:01:33 | 0:01:39 | |
My guests are Sir Mark Walport - | 0:01:44 | 0:01:47 | |
doctor, medical scientist and director of the Wellcome Trust, | 0:01:47 | 0:01:51 | |
the UK's biggest medical research charity. | 0:01:51 | 0:01:54 | |
Petra Boynton, who's an agony aunt, sex researcher | 0:01:54 | 0:01:59 | |
and lecturer at University College London. | 0:01:59 | 0:02:04 | |
Lord Dick Taverne - one-time Treasury minister | 0:02:04 | 0:02:07 | |
and chair of Sense About Science, | 0:02:07 | 0:02:09 | |
a charity promoting public understanding of science. | 0:02:09 | 0:02:13 | |
Baroness Susan Greenfield - neuroscientist, cross-bench peer | 0:02:13 | 0:02:18 | |
and professor of pharmacology at the University of Oxford. | 0:02:18 | 0:02:23 | |
Mark Henderson, who is the award-winning science editor of The Times, | 0:02:23 | 0:02:28 | |
and also, an author and columnist. | 0:02:28 | 0:02:32 | |
Tina Beattie - Roman Catholic theologian | 0:02:32 | 0:02:35 | |
and professor of Catholic Studies at Roehampton University. | 0:02:35 | 0:02:40 | |
And Bryan Appleyard - author, journalist | 0:02:40 | 0:02:43 | |
and a three-time award winner at the British Press Awards. | 0:02:43 | 0:02:48 | |
A scientist splits the atom, | 0:02:48 | 0:02:50 | |
knowing that that could lead to a nuclear bomb, | 0:02:50 | 0:02:52 | |
so what is the responsibility of the scientist? | 0:02:52 | 0:02:55 | |
SIR WALPORT: The responsibility is to argue for peaceful use | 0:02:55 | 0:02:58 | |
of that information rather than war-like. | 0:02:58 | 0:03:01 | |
-I mean... -Argue for it? -Yes. | 0:03:01 | 0:03:03 | |
Scientists are moral beings | 0:03:03 | 0:03:05 | |
in the same way as every other person is a moral being. | 0:03:05 | 0:03:09 | |
And I think that the important distinction is that in science, | 0:03:09 | 0:03:13 | |
I think, there are no limits to how much we can know. | 0:03:13 | 0:03:18 | |
But where morals come in, | 0:03:18 | 0:03:19 | |
is how you use that knowledge. | 0:03:19 | 0:03:21 | |
It's really important that one doesn't say that scientists | 0:03:21 | 0:03:25 | |
-operate in an amoral sphere. -No. | 0:03:25 | 0:03:27 | |
So let's focus on splitting the atom. | 0:03:27 | 0:03:29 | |
Splitting the atom, you can either argue that this is | 0:03:29 | 0:03:32 | |
what's given us the nuclear bomb which has killed a lot of people | 0:03:32 | 0:03:36 | |
or what's given us nuclear power which is going to save the planet. | 0:03:36 | 0:03:39 | |
Right, but I don't think that you should then argue | 0:03:39 | 0:03:42 | |
that scientists aren't entitled to an opinion | 0:03:42 | 0:03:45 | |
about whether it's better to use that knowledge to produce power | 0:03:45 | 0:03:48 | |
as opposed to war. You could go all the way back to fire. | 0:03:48 | 0:03:51 | |
I've discovered this stuff called fire. In two different societies, | 0:03:51 | 0:03:55 | |
fire can be used for good purposes or bad purposes. | 0:03:55 | 0:03:58 | |
Scientists shouldn't be neutered, | 0:03:58 | 0:04:00 | |
scientists are entitled to express opinion... | 0:04:00 | 0:04:03 | |
-Indeed, but they're not terribly good at arguing. -SUSAN: No. | 0:04:03 | 0:04:06 | |
Scientists are as good as anyone else at arguing, actually. | 0:04:06 | 0:04:09 | |
No, there are professional arguers. | 0:04:09 | 0:04:12 | |
-But coming back... -But science is about argument, | 0:04:12 | 0:04:14 | |
it is about resolving uncertainty through experiment. | 0:04:14 | 0:04:17 | |
-That's science. -We're not good at putting our case | 0:04:17 | 0:04:20 | |
-in general terms in a way that people... -Some of us are... | 0:04:20 | 0:04:24 | |
-MARK: I disagree... -Susan, you're rarely off the television. | 0:04:24 | 0:04:27 | |
You can't possibly... | 0:04:27 | 0:04:29 | |
MARK: What Susan said about scientists not being good at arguing | 0:04:29 | 0:04:33 | |
was probably true ten years ago. | 0:04:33 | 0:04:36 | |
And actually, partly because of people like Susan and Dick | 0:04:36 | 0:04:39 | |
and initiatives that they've set up, | 0:04:39 | 0:04:42 | |
I think, that probably has changed to a degree... | 0:04:42 | 0:04:45 | |
I think there's still a way to go, I don't know how Dick feels. | 0:04:45 | 0:04:48 | |
This is a truly important point for the scientists round the table - | 0:04:48 | 0:04:52 | |
culturally, in our community, it is still considered, in a sense, | 0:04:52 | 0:04:55 | |
a no-go area, or certainly a less than desirable area, | 0:04:55 | 0:04:58 | |
to talk to the media a lot. And to "dumb down". | 0:04:58 | 0:05:01 | |
You're, kind of, sneered at still, I'm afraid. | 0:05:01 | 0:05:04 | |
So only a few people will do this. | 0:05:04 | 0:05:06 | |
And so despite what you say about it now becoming respectable... | 0:05:06 | 0:05:10 | |
-It's not. -Culturally, | 0:05:10 | 0:05:11 | |
it's an important issue among the scientific community. | 0:05:11 | 0:05:14 | |
DICK: Can I come back to the question of morality in science? | 0:05:14 | 0:05:18 | |
Because Mark raised an important point. | 0:05:18 | 0:05:21 | |
Science, as he indicated, is morally neutral. | 0:05:21 | 0:05:24 | |
I mean, it can be used for good or ill. | 0:05:24 | 0:05:27 | |
Scientists are not morally neutral, they're very much involved. | 0:05:27 | 0:05:31 | |
I think there's a further connection between science and morality | 0:05:31 | 0:05:34 | |
which is often neglected. | 0:05:34 | 0:05:36 | |
Because science, by being against superstition, | 0:05:36 | 0:05:40 | |
against dogmatism, because it's tentative knowledge, | 0:05:40 | 0:05:44 | |
because it's questioning and therefore anti-authority, | 0:05:44 | 0:05:47 | |
it's anti-authority and therefore anti-autocracy. | 0:05:47 | 0:05:50 | |
Science is, on the whole, a force for tolerance | 0:05:50 | 0:05:53 | |
and also very relevant to human rights | 0:05:53 | 0:05:56 | |
which are very great moral issues. | 0:05:56 | 0:05:58 | |
Because a lot of the dispute about human rights | 0:05:58 | 0:06:02 | |
is based on disputes about misconceptions. | 0:06:02 | 0:06:05 | |
None of those things... | 0:06:05 | 0:06:06 | |
That blacks are less intelligent, | 0:06:06 | 0:06:08 | |
-that women are second class citizens. -Absolutely, | 0:06:08 | 0:06:10 | |
because people have made mistakes, | 0:06:10 | 0:06:13 | |
because science has managed to dispel the misconceptions there were | 0:06:13 | 0:06:17 | |
about the characteristics of gender. | 0:06:17 | 0:06:19 | |
BRYAN: You're making a category error. | 0:06:19 | 0:06:21 | |
You're talking about a process called science | 0:06:21 | 0:06:24 | |
which works by being neutral. | 0:06:24 | 0:06:26 | |
Now, the category error you're making is saying science is not | 0:06:26 | 0:06:29 | |
tolerant of dictatorship, tyranny, all those things. | 0:06:29 | 0:06:32 | |
-Historically... -It questions authority as Galileo did. | 0:06:32 | 0:06:36 | |
..it's been very useful to tyranny. | 0:06:36 | 0:06:38 | |
Technology can be used for good or ill, | 0:06:38 | 0:06:40 | |
but science itself is questioning and therefore is anti-authoritarian. | 0:06:40 | 0:06:44 | |
The Russian scientist who conducted | 0:06:44 | 0:06:46 | |
a massive biological weapons programme, | 0:06:46 | 0:06:49 | |
did produce some very effective biological weapons. | 0:06:49 | 0:06:52 | |
The point is, science is a very neutral, very narrow activity. | 0:06:52 | 0:06:55 | |
That's what you've got. | 0:06:55 | 0:06:57 | |
The other thing is the rhetoric surrounding science. | 0:06:57 | 0:07:00 | |
You're bringing morals from outside to attach to science. | 0:07:00 | 0:07:03 | |
MARK: There is one point though that Dick has made | 0:07:03 | 0:07:06 | |
that is very critical | 0:07:06 | 0:07:07 | |
and that is that science is inherently anti-authority. | 0:07:07 | 0:07:11 | |
In that if an authority figure says something, | 0:07:11 | 0:07:14 | |
all you need to get around that is evidence. | 0:07:14 | 0:07:17 | |
When Marshall and Warren came up with the idea that stomach ulcers | 0:07:17 | 0:07:22 | |
were caused by a bacterium, | 0:07:22 | 0:07:24 | |
everybody laughed at them until they proved their case. | 0:07:24 | 0:07:27 | |
And science changed as a result of that. | 0:07:27 | 0:07:29 | |
There are countless examples of that where an unorthodox opinion, | 0:07:29 | 0:07:33 | |
provided that people provide the evidence, can take over. | 0:07:33 | 0:07:37 | |
PETRA: That presumes one view of science | 0:07:37 | 0:07:39 | |
and this is where the issue about gender comes in. | 0:07:39 | 0:07:41 | |
It would be lovely if that was true. | 0:07:41 | 0:07:43 | |
I'm sure we'd have a great time if that were true. | 0:07:43 | 0:07:46 | |
You know, evolutionary psychology doesn't think women are equal. | 0:07:46 | 0:07:49 | |
The lived experience of women, | 0:07:49 | 0:07:51 | |
in culture, women in science particularly, | 0:07:51 | 0:07:54 | |
is that science hasn't always worked in that sense at all. | 0:07:54 | 0:07:58 | |
It's often worked to prove that we shouldn't be studying. | 0:07:58 | 0:08:01 | |
Do you think that it's a coincidence then, | 0:08:01 | 0:08:03 | |
that the position of women has approached equality | 0:08:03 | 0:08:06 | |
in those societies that are most scientifically advanced? | 0:08:06 | 0:08:10 | |
No. | 0:08:11 | 0:08:12 | |
Well, again it depends on how you view equality. | 0:08:12 | 0:08:14 | |
Look at the Islamic world. | 0:08:14 | 0:08:16 | |
TINA: Well, excuse me, I think this is really problematic. | 0:08:16 | 0:08:19 | |
If we're going to have this discussion, let's be scientific. | 0:08:19 | 0:08:23 | |
Countries that are tyrannies, dictatorships, | 0:08:23 | 0:08:25 | |
not at all like the, sort of, | 0:08:25 | 0:08:27 | |
gender egalitarian idyll that we're dreaming of in the West, | 0:08:27 | 0:08:31 | |
use science just as effectively as we do. | 0:08:31 | 0:08:34 | |
-But I don't think that was the argument. -No, that's not. | 0:08:34 | 0:08:37 | |
SUSAN: Let's remind ourselves of what science is. | 0:08:37 | 0:08:40 | |
Science is a way of approaching things, a certain methodology, | 0:08:40 | 0:08:43 | |
it's a certain way of doing something | 0:08:43 | 0:08:46 | |
and let's not dignify it and sanctify it | 0:08:46 | 0:08:48 | |
as though anyone doing science has the answer. | 0:08:48 | 0:08:52 | |
There's bad scientists, good... | 0:08:52 | 0:08:54 | |
If your knowledge is tentative, scientific knowledge, | 0:08:54 | 0:08:58 | |
you're not going to be dogmatic the way that other people are. | 0:08:58 | 0:09:02 | |
I do want to get an idea of... the, kind of, staircase here. | 0:09:02 | 0:09:06 | |
We're quite certain the Earth is round, are we? | 0:09:06 | 0:09:09 | |
THEY LAUGH | 0:09:09 | 0:09:11 | |
-Well, I'm not so sure. -And the sun goes round it? | 0:09:13 | 0:09:15 | |
-No. -And the Earth goes round the sun. | 0:09:15 | 0:09:17 | |
We're quite sure the world is round. Are we sure that HIV causes AIDS? | 0:09:17 | 0:09:24 | |
-Yes. -Are we sure that man is causing climate change? | 0:09:24 | 0:09:28 | |
We're...pretty sure, yeah. | 0:09:28 | 0:09:30 | |
And are we sure that GM foods are a good thing and safe? | 0:09:30 | 0:09:34 | |
No, because that's the wrong question. | 0:09:34 | 0:09:37 | |
The question is, what GM food for what purpose? | 0:09:37 | 0:09:40 | |
-MARK: Absolutely. -There is nothing generic about GM foods. | 0:09:40 | 0:09:43 | |
Let's take the current one which is synthetic biology. | 0:09:43 | 0:09:47 | |
Is it a good thing to synthesise a virus that will carry | 0:09:47 | 0:09:50 | |
some terrible toxin and kill the world? Of course not. | 0:09:50 | 0:09:53 | |
Is it sensible to try and develop | 0:09:53 | 0:09:55 | |
organisms that will make antibiotics? Yes. | 0:09:55 | 0:09:58 | |
And so it's about specifics there. | 0:09:58 | 0:10:00 | |
GM foods are not generic, they're something specific. | 0:10:00 | 0:10:03 | |
You can make the argument about something like guns or hot water. | 0:10:03 | 0:10:07 | |
I mean, you can use them for good purposes and bad purposes. | 0:10:07 | 0:10:10 | |
Right, and so I want to ask Bryan, | 0:10:10 | 0:10:13 | |
do you believe that science should restrict itself? Limit itself | 0:10:13 | 0:10:18 | |
to the development of things that have good moral outcomes? | 0:10:18 | 0:10:21 | |
Obviously some outcomes are undesirable - | 0:10:21 | 0:10:23 | |
we don't want biological weapons in the hands of rogue states. | 0:10:23 | 0:10:27 | |
The problem with science is its immense power. | 0:10:27 | 0:10:31 | |
You have to look at not just the moment of doing the science, | 0:10:31 | 0:10:36 | |
but the moment people step away and it becomes real. | 0:10:36 | 0:10:39 | |
Now, a very respectable scientist, in his day, was Ernst Haeckel. | 0:10:39 | 0:10:43 | |
Generally his bit of biology | 0:10:43 | 0:10:44 | |
seemed pretty good, at the time. Right? | 0:10:44 | 0:10:47 | |
Well, he informed the politics of Mein Kampf. | 0:10:47 | 0:10:49 | |
That is a serious point. | 0:10:49 | 0:10:50 | |
-Well, is it? Is it? -Yes. | 0:10:50 | 0:10:53 | |
-You're saying that this... -I'm saying is... -..this lovely, liberal | 0:10:53 | 0:10:58 | |
human kind loving man, was in some way responsible for Hitler? | 0:10:58 | 0:11:02 | |
No, I'm saying that the science he produced entailed Nazism. | 0:11:02 | 0:11:09 | |
It was inherent in his science to be adopted by the politics of Hitler. | 0:11:09 | 0:11:16 | |
That does not mean Haeckel was guilty, | 0:11:16 | 0:11:19 | |
he was an unpleasant man, | 0:11:19 | 0:11:20 | |
but that's another matter. | 0:11:20 | 0:11:22 | |
I'm not so interested in morality in science, | 0:11:22 | 0:11:25 | |
because that's absolutely determined. | 0:11:25 | 0:11:27 | |
Scientists are moral creatures. | 0:11:27 | 0:11:29 | |
Science is this single, narrow pursuit of something. | 0:11:29 | 0:11:33 | |
Everything that happens outside that is moral. | 0:11:33 | 0:11:35 | |
But supposing that it were right to say that different races | 0:11:35 | 0:11:38 | |
have different characteristics, different intelligences. | 0:11:38 | 0:11:42 | |
That isn't to say that you ought to exterminate all the people | 0:11:42 | 0:11:45 | |
who are different, so why are you blaming the scientist? | 0:11:45 | 0:11:49 | |
Supposing that... | 0:11:49 | 0:11:50 | |
A lot of scientists have colluded in those policies. | 0:11:50 | 0:11:52 | |
..what would you do with that fact? | 0:11:52 | 0:11:54 | |
But what would you do with the fact? I mean, are you saying | 0:11:54 | 0:11:58 | |
that there are things we should not discover | 0:11:58 | 0:12:00 | |
-in case they're abused by politicians? -No, no. | 0:12:00 | 0:12:04 | |
No, I'm not saying that at all. | 0:12:04 | 0:12:06 | |
There should be a world which exists prior to science, | 0:12:06 | 0:12:10 | |
prior to that discovery, in which the idea of that discovery | 0:12:10 | 0:12:13 | |
being used against people would be absurd. | 0:12:13 | 0:12:16 | |
That people would be morally mature enough to see that. | 0:12:16 | 0:12:19 | |
-SIR WALPORT: Are you arguing against finding things out? -No. | 0:12:19 | 0:12:22 | |
He's asking me to argue with that. | 0:12:22 | 0:12:24 | |
TINA: For scientists to measure humans... Why do we want, | 0:12:24 | 0:12:27 | |
as a society, to transform our fellow human beings | 0:12:27 | 0:12:30 | |
into scientifically measurable entities? | 0:12:30 | 0:12:32 | |
I'm sorry, Tina, your question is, why do we want to understand? | 0:12:32 | 0:12:36 | |
-I don't like that question. -No, my question is not that. | 0:12:36 | 0:12:39 | |
It's what do I want to understand that matters. | 0:12:39 | 0:12:43 | |
-So, right. So... -What do I want to understand? | 0:12:43 | 0:12:46 | |
So you are going to dictate to the rest of us | 0:12:46 | 0:12:48 | |
what we should be allowed to understand? | 0:12:48 | 0:12:50 | |
That I would be so powerful! | 0:12:50 | 0:12:52 | |
Wouldn't we live in a wonderful world! | 0:12:52 | 0:12:55 | |
You would like to be in that position... | 0:12:55 | 0:12:57 | |
No, I love being in this position - a messy conversation | 0:12:57 | 0:13:00 | |
where we can all chip in, that's what I want. | 0:13:00 | 0:13:02 | |
SIR WALPORT: But let's take a specific example. | 0:13:02 | 0:13:05 | |
So, there's dyslexia. | 0:13:05 | 0:13:06 | |
So, that's one aspect of cognitive ability - | 0:13:06 | 0:13:09 | |
your ability to read and write. | 0:13:09 | 0:13:11 | |
And we know that there's quite a high heritability | 0:13:11 | 0:13:14 | |
to disorders in being able to read and write. | 0:13:14 | 0:13:16 | |
Now, does that mean that we shouldn't investigate | 0:13:16 | 0:13:19 | |
the genetics of that? Because if we understand that | 0:13:19 | 0:13:21 | |
then we may start explaining a normal range. | 0:13:21 | 0:13:24 | |
SUSAN: No, let's take dyslexia, | 0:13:24 | 0:13:26 | |
all a gene does is cause activation of proteins and what do proteins do? | 0:13:26 | 0:13:30 | |
The knock-on effect, in brain function, is so indirect. | 0:13:30 | 0:13:33 | |
So therefore, much more interesting to ask, | 0:13:33 | 0:13:35 | |
what do we mean by intelligence? | 0:13:35 | 0:13:38 | |
What do we mean by understanding? What do we mean by human ability? | 0:13:38 | 0:13:41 | |
What do you want your kids to be like? | 0:13:41 | 0:13:43 | |
They're much more interesting questions | 0:13:43 | 0:13:46 | |
than just focusing on IQ and genes... | 0:13:46 | 0:13:48 | |
And who decides what's more interesting? | 0:13:48 | 0:13:51 | |
I mean, obviously you, but who else? | 0:13:51 | 0:13:53 | |
Ha! But I think a very interesting question, who decides? | 0:13:53 | 0:13:57 | |
It's something that I've been encountering recently | 0:13:57 | 0:14:00 | |
in the impact of screen technologies and how kids think. | 0:14:00 | 0:14:03 | |
This is a societal problem, it's something that no one sector owns. | 0:14:03 | 0:14:07 | |
Something we should all be debating and sharing. | 0:14:07 | 0:14:09 | |
That's when science is at its most interesting and potent. | 0:14:09 | 0:14:12 | |
The moral question, which we're supposed to be discussing, | 0:14:12 | 0:14:16 | |
is whether certain kinds of scientific research | 0:14:16 | 0:14:19 | |
are dangerous and should not be pursued. | 0:14:19 | 0:14:24 | |
Now, I think this is a very important question | 0:14:24 | 0:14:26 | |
and I don't see any limit on what should be pursued. | 0:14:26 | 0:14:29 | |
-It should ALWAYS be open. -MARK: One of the issues here | 0:14:29 | 0:14:32 | |
is that very often it's a misinterpretation of results | 0:14:32 | 0:14:36 | |
that ends up with discrimination at the end of the day. | 0:14:36 | 0:14:40 | |
I think, genetics is a prime example of this. | 0:14:40 | 0:14:43 | |
Genetic effects are not, by and large, as deterministic | 0:14:43 | 0:14:47 | |
as is commonly assumed. They are probabilistic. | 0:14:47 | 0:14:50 | |
If you have a particular genetic type, | 0:14:50 | 0:14:52 | |
it doesn't necessarily doom you or predestine you to anything. | 0:14:52 | 0:14:56 | |
It might mean you have an elevated risk of something | 0:14:56 | 0:14:59 | |
that you should look out for, but... | 0:14:59 | 0:15:01 | |
BRYAN: The question is not that, the question is, | 0:15:01 | 0:15:04 | |
if science did discover something like that, what would we do? | 0:15:04 | 0:15:07 | |
Petra, in your unusually applied field, | 0:15:07 | 0:15:11 | |
you're, kind of, helping people, | 0:15:11 | 0:15:13 | |
do you see a moral purpose for what you're doing? | 0:15:13 | 0:15:15 | |
Yes, but it has to be very careful. | 0:15:15 | 0:15:19 | |
You know, I've met young people, 12, 13, 14, who are having sex for money. | 0:15:19 | 0:15:25 | |
Who are being abused. | 0:15:25 | 0:15:26 | |
And I would like to help those children. | 0:15:26 | 0:15:29 | |
It would be easy to say life for those children is terrible | 0:15:29 | 0:15:32 | |
and there's a huge amount of suffering, | 0:15:32 | 0:15:35 | |
to make people do something. I can completely see why people do that, | 0:15:35 | 0:15:38 | |
but I might misrepresent what's happening. | 0:15:38 | 0:15:41 | |
It's not to deny those children are suffering, | 0:15:41 | 0:15:43 | |
but I have to contextualise that. | 0:15:43 | 0:15:45 | |
And that, for me, is where the morality is very important. | 0:15:45 | 0:15:49 | |
In that, I have to be... fair and balanced. | 0:15:49 | 0:15:53 | |
That's often unpopular, | 0:15:53 | 0:15:54 | |
because people want you to toe a political line. | 0:15:54 | 0:15:57 | |
But do you go to work each morning thinking, "I'm helping people." | 0:15:57 | 0:16:01 | |
Um... I'd like to, yeah. | 0:16:01 | 0:16:04 | |
-You would, yeah. -BRYAN: But that's a perfectly good justification | 0:16:04 | 0:16:08 | |
-for medical research... -I'm not saying it's a bad justification, | 0:16:08 | 0:16:11 | |
I wanted to understand what it was. | 0:16:11 | 0:16:14 | |
I mean, do you go to work thinking that? | 0:16:14 | 0:16:16 | |
No, I don't think I do. | 0:16:16 | 0:16:17 | |
I think I go to work recognising | 0:16:17 | 0:16:19 | |
that ultimately the more we understand about health, | 0:16:19 | 0:16:22 | |
it will be good for the health of the population. | 0:16:22 | 0:16:26 | |
But do I think about the patient first? | 0:16:26 | 0:16:29 | |
I am a doctor, but no I don't. | 0:16:29 | 0:16:30 | |
I think about the wonder of science and finding things... | 0:16:30 | 0:16:33 | |
SUSAN: If I'm honest, like with Mark, | 0:16:33 | 0:16:36 | |
my immediate goal is to solve the problem. Is to solve the problem. | 0:16:36 | 0:16:40 | |
Motivation, I think, is a very important question. | 0:16:40 | 0:16:43 | |
There's an awful lot of suspicion of research. | 0:16:43 | 0:16:47 | |
For example, the attack on genetic modification's always been based on | 0:16:47 | 0:16:51 | |
great big corporate concern to make lots of profits. | 0:16:51 | 0:16:55 | |
Now, there's no doubt | 0:16:55 | 0:16:56 | |
that some pharmaceutical companies have misbehaved. | 0:16:56 | 0:17:00 | |
But, by and large, what I think is very important to remember | 0:17:00 | 0:17:03 | |
is that most people who work for companies | 0:17:03 | 0:17:06 | |
are actually motivated to try and improve the human race. | 0:17:06 | 0:17:09 | |
Secondly, in some ways, motivation is unimportant. | 0:17:09 | 0:17:13 | |
Because the question is not who paid | 0:17:13 | 0:17:16 | |
or what were his motives or her motives, | 0:17:16 | 0:17:19 | |
but, is the research reproducible? | 0:17:19 | 0:17:23 | |
Does it stand up? Is it something which can, in fact, be supported | 0:17:23 | 0:17:29 | |
on objective grounds? Because if someone is motivated | 0:17:29 | 0:17:32 | |
by the worst possible motive, produces good research, | 0:17:32 | 0:17:36 | |
that is good research. | 0:17:36 | 0:17:37 | |
TINA: So let's all use Joseph Mengele's research, then. | 0:17:37 | 0:17:40 | |
-SIR WALPORT: No, come on. -Well, that is... | 0:17:40 | 0:17:43 | |
His research is reproducible. | 0:17:43 | 0:17:45 | |
It's abhorrent, but it's reproducible. | 0:17:45 | 0:17:47 | |
SIR WALPORT: Research does have to be conducted ethically. | 0:17:47 | 0:17:51 | |
Well, I don't think that is what Dick is saying, actually. | 0:17:51 | 0:17:54 | |
MARK: We don't know the extent to which | 0:17:54 | 0:17:56 | |
Mengele's research is reproducible, | 0:17:56 | 0:17:59 | |
because it's not published in an open fashion | 0:17:59 | 0:18:01 | |
and this is one of the critical things about science, | 0:18:01 | 0:18:05 | |
that results are published so they can be challenged | 0:18:05 | 0:18:08 | |
with all the data there and the workings there, | 0:18:08 | 0:18:12 | |
so that people can go back and examine it and it's the... | 0:18:12 | 0:18:16 | |
Everybody always asks, "What are the motives for doing it?" | 0:18:16 | 0:18:19 | |
And they ought to be asking, "What are the results of the research? | 0:18:19 | 0:18:24 | |
"Have they been peer reviewed? | 0:18:24 | 0:18:25 | |
"Have they been independently tested? Have they been reproduced?" | 0:18:25 | 0:18:29 | |
It doesn't matter whether it's been done by Monsanto | 0:18:29 | 0:18:32 | |
or by Greenpeace. The question is, is it good research or bad research? | 0:18:32 | 0:18:36 | |
SIR WALPORT: But hang on, there are limits on how you find things out. | 0:18:36 | 0:18:39 | |
Absolutely, unequivocally. | 0:18:39 | 0:18:41 | |
TINA: It sounds like there aren't, if I listen to what Dick's saying. | 0:18:41 | 0:18:45 | |
-ALL: No! -It doesn't matter what you do, | 0:18:45 | 0:18:47 | |
so long as you can repeat your experiments | 0:18:47 | 0:18:50 | |
and show that you have results. | 0:18:50 | 0:18:52 | |
But it seems to me, you say, "Look. I can associate Hitler with Mengele. | 0:18:52 | 0:18:56 | |
"Mengele was a scientist, | 0:18:56 | 0:18:57 | |
"therefore I've got a problem with science." | 0:18:57 | 0:19:00 | |
-It's just not a good argument. -Exactly! I completely agree, | 0:19:00 | 0:19:04 | |
which is why I think you're making a really bad argument. | 0:19:04 | 0:19:08 | |
PETRA: Repeating research is important, | 0:19:08 | 0:19:10 | |
but so is contextualising and appraising it. | 0:18:10 | 0:18:13 | |
When work is in the public domain, | 0:19:13 | 0:19:14 | |
I can look at motivation. | 0:19:14 | 0:19:16 | |
I can start asking questions about who's done the research | 0:19:16 | 0:19:19 | |
and why they might have been interested in the question. | 0:19:19 | 0:19:22 | |
-DICK: Right. -And that is important. | 0:19:22 | 0:19:25 | |
SUSAN: The central issue is what are we going to use science for? | 0:19:25 | 0:19:28 | |
There's a term called transhumanism. | 0:19:28 | 0:19:30 | |
Transhumanism, which was described recently | 0:19:30 | 0:19:33 | |
-as the world's most dangerous idea... -BRYAN: I second that. | 0:19:33 | 0:19:36 | |
..is you can enhance physical and mental powers by | 0:19:36 | 0:19:39 | |
artificial means whether they are biomedical or physical. | 0:19:39 | 0:19:42 | |
You could have implants in the brain or genetic manipulation | 0:19:42 | 0:19:45 | |
or cognitive enhancers or whatever. | 0:19:45 | 0:19:47 | |
The notion being, that above and beyond your healthy norm, | 0:19:47 | 0:19:51 | |
you go beyond into some other stratosphere of perfection | 0:19:51 | 0:19:54 | |
which I think is a sinister notion. | 0:19:54 | 0:19:56 | |
Isn't the point that we've got to hope | 0:19:56 | 0:19:58 | |
that we live in societies which will actually put limits on what's done? | 0:19:58 | 0:20:03 | |
And so... I mean, for example, | 0:20:03 | 0:20:06 | |
for a long time there's been the ability to investigate | 0:20:06 | 0:20:10 | |
whether a couple who've had a child | 0:20:10 | 0:20:13 | |
with a severe genetic disease have the opportunity | 0:20:13 | 0:20:16 | |
to identify whether a subsequent pregnancy's affected. | 0:20:16 | 0:20:19 | |
And there are options. | 0:20:19 | 0:20:20 | |
There's IVF, there's abortion. | 0:20:20 | 0:20:22 | |
So we've been doing that for a very long time. | 0:20:22 | 0:20:25 | |
And I think most medical practitioners, not all, | 0:20:25 | 0:20:27 | |
would think that was reasonable to do and society at large has thought, | 0:20:27 | 0:20:31 | |
in a regulated way, it's a reasonable thing to do. | 0:20:31 | 0:20:34 | |
But I hope no-one round the table would think it was reasonable | 0:20:34 | 0:20:37 | |
to select children on the colour of their eyes, | 0:20:37 | 0:20:40 | |
or the colour of their hair or... | 0:20:40 | 0:20:42 | |
DICK: Male or female. | 0:20:42 | 0:20:43 | |
SUSAN: OK, who is physically and mentally perfect, then? | 0:20:43 | 0:20:46 | |
Wait, there's an important add-on to what Mark's just said too, there. | 0:20:46 | 0:20:50 | |
Which is that, the whole debate about so-called "designer babies" | 0:20:50 | 0:20:54 | |
that is happening at the moment, | 0:20:54 | 0:20:58 | |
is, by and large, founded on a science fiction | 0:20:58 | 0:21:01 | |
rather than on actual science. | 0:21:01 | 0:21:03 | |
In that, with the except... | 0:21:03 | 0:21:05 | |
OK, at the moment, we have the technology to look for single genes | 0:21:05 | 0:21:09 | |
in embryos that have been created by IVF and ensure that the embryos | 0:21:09 | 0:21:14 | |
that do not carry Huntington's disease or cystic fibrosis | 0:21:14 | 0:21:18 | |
are not implanted. It is a HUGE step | 0:21:18 | 0:21:22 | |
from that to attempting to select the dozens or hundreds of genes | 0:21:22 | 0:21:28 | |
that might influence intelligence or height or good looks or something. | 0:21:28 | 0:21:31 | |
For a large number of reasons. | 0:21:31 | 0:21:33 | |
And I think it's very important that, at least in the short to medium term, | 0:21:33 | 0:21:38 | |
that, actually, ethical arguments | 0:21:38 | 0:21:40 | |
about what is and isn't acceptable ethically | 0:21:40 | 0:21:43 | |
are grounded in what's actually possible | 0:21:43 | 0:21:46 | |
rather than what might happen in 50 or 100 years' time. | 0:21:46 | 0:21:49 | |
Couldn't you argue the opposite? | 0:21:49 | 0:21:51 | |
You could argue that given that we're not there, | 0:21:51 | 0:21:54 | |
this is the time to decide the ethical position. | 0:21:54 | 0:21:57 | |
BRYAN: Transhumanism is the point. I've spent time with transhumanists | 0:21:57 | 0:22:00 | |
and 50% are crazy, but 50% are quite intelligent. | 0:22:00 | 0:22:03 | |
And the argument is that it would basically be consumer pressure. | 0:22:03 | 0:22:07 | |
Now, because of consumer pressure, I want my baby to be X. | 0:22:07 | 0:22:10 | |
Now, it may be a simple decision | 0:22:10 | 0:22:11 | |
about what we agree is a medical condition, | 0:22:11 | 0:22:14 | |
but it may not. It may be something else. | 0:22:14 | 0:22:16 | |
Now, when you get to that point, it's a consumer pressure. | 0:22:16 | 0:22:19 | |
And that will come. | 0:22:19 | 0:22:21 | |
And transhumanists are all for it. | 0:22:21 | 0:22:24 | |
SUSAN: Can I answer that? It's an important... | 0:22:24 | 0:22:26 | |
What is THE perfect brain? | 0:22:26 | 0:22:28 | |
You know? Is it that you have a prolonged attention span? | 0:22:28 | 0:22:31 | |
Is it that you have a better memory? No, because computers | 0:22:31 | 0:22:34 | |
can do those things and I don't think one would point | 0:22:34 | 0:22:37 | |
to Beethoven, Shakespeare and Einstein | 0:22:37 | 0:22:39 | |
and say that what they had in common was an attention span. | 0:22:39 | 0:22:42 | |
So, the notion is | 0:22:42 | 0:22:43 | |
that there's no such thing as THE perfect brain anyway. | 0:22:43 | 0:22:47 | |
Surely the whole idea is that we're diverse... | 0:22:47 | 0:22:49 | |
BRYAN: Because we're human, we don't know what a human enhancement is. | 0:22:49 | 0:22:53 | |
Exactly, that's my point. | 0:22:53 | 0:22:54 | |
This is sophistry. | 0:22:54 | 0:22:56 | |
We're talking about parents who want their children to be | 0:22:56 | 0:22:59 | |
more intelligent rather than less. | 0:22:59 | 0:23:01 | |
More beautiful rather than less beautiful. | 0:23:01 | 0:23:04 | |
More tall rather than less tall. | 0:23:04 | 0:23:05 | |
So, you're a parent, let's say. OK, let's challenge you, Michael. | 0:23:05 | 0:23:09 | |
-OK, you have children. You want your kids to be more intelligent? -Yeah. | 0:23:09 | 0:23:13 | |
What will they be able to do with this more intelligence? | 0:23:13 | 0:23:16 | |
Will they write more symphonies, will they write novels? | 0:23:16 | 0:23:20 | |
I think they'll have a better life, because they'll be engaged | 0:23:20 | 0:23:24 | |
in lovely conversations. I think they'll earn more money. | 0:23:24 | 0:23:27 | |
Oh, that's interesting. | 0:23:27 | 0:23:29 | |
But this "more" that you're going to give them, this "more intelligence", | 0:23:29 | 0:23:33 | |
is it a better memory? A longer attention span? What is it? | 0:23:33 | 0:23:36 | |
No, really. I think you're being absolutely sophistical. | 0:23:36 | 0:23:40 | |
Go... Go out and ask, just ask anyone. Do you want to be... | 0:23:40 | 0:23:44 | |
-Do you want to be brighter or less bright? -I'm asking you. | 0:23:44 | 0:23:47 | |
Yes, and I am answering you. | 0:23:47 | 0:23:49 | |
-What... -I've answered you already. | 0:23:49 | 0:23:51 | |
I want them to have the quality of conversation, | 0:23:51 | 0:23:53 | |
I want them to have earning power. | 0:23:53 | 0:23:55 | |
Do you think we've all got brilliant memories round the table? No. | 0:23:55 | 0:23:59 | |
DICK: Susan. Susan, just a moment. | 0:23:59 | 0:24:00 | |
There are certain things which are already possible. | 0:24:00 | 0:24:04 | |
I mean, it may well be that we can stop people having | 0:24:04 | 0:24:08 | |
Huntington's chorea. There are certain...health effects | 0:24:08 | 0:24:14 | |
which could be prevented. | 0:24:14 | 0:24:16 | |
Now, it's not an awfully long step to stop people being too short. | 0:24:16 | 0:24:22 | |
You can begin to influence the way in which people develop. | 0:24:22 | 0:24:27 | |
Now, this does raise certain ethical problems. | 0:24:27 | 0:24:33 | |
You can't deny that. | 0:24:33 | 0:24:35 | |
SUSAN: How short is too short? 5ft 4? 5ft 5? | 0:24:35 | 0:24:37 | |
It's not... Wait a moment. | 0:24:37 | 0:24:39 | |
You're avoiding the issue. | 0:24:39 | 0:24:40 | |
I want to know how short you can be before you have to be manipulated. | 0:24:40 | 0:24:44 | |
No, but that doesn't deal with the question, Susan. | 0:24:44 | 0:24:47 | |
The answer to your question. The question to your ques... | 0:24:47 | 0:24:51 | |
The answer to your question is 5ft 7, now let's move on. | 0:24:51 | 0:24:54 | |
Tina, is there any moral difference between selecting an embryo | 0:24:54 | 0:24:57 | |
that doesn't have Huntington's chorea | 0:24:57 | 0:25:00 | |
and selecting an embryo that doesn't have blue eyes? | 0:25:00 | 0:25:03 | |
I think there is, | 0:25:03 | 0:25:04 | |
but I'm cautious on all this. | 0:25:04 | 0:25:06 | |
I mean, this is a question not about human perfectibility | 0:25:06 | 0:25:10 | |
which is such a myth and such a fantasy. | 0:25:10 | 0:25:13 | |
But about suffering and our capacity to alleviate it. | 0:25:13 | 0:25:17 | |
And I can't give a clear answer to that, | 0:25:17 | 0:25:20 | |
because I think we need to go on a case by case, | 0:25:20 | 0:25:22 | |
painful struggle, to say, where can we alleviate human suffering? | 0:25:22 | 0:25:26 | |
And if we want to do that, there are millions of children in this world | 0:25:26 | 0:25:30 | |
we could just give a square meal to tonight. That would be enough. | 0:25:30 | 0:25:34 | |
I know, I know, I know. All right, let me give you another issue. | 0:25:34 | 0:25:37 | |
Is there any moral difference between selecting whether the baby | 0:25:37 | 0:25:41 | |
is relieved from having Down's syndrome | 0:25:41 | 0:25:43 | |
or whether the baby is going to have blue eyes? | 0:25:43 | 0:25:45 | |
-I'm...not sure. -All those kids out there with Down's syndrome | 0:25:49 | 0:25:53 | |
-are waiting to hear your answer. -That's why I'm not sure. | 0:25:53 | 0:25:56 | |
I fear a world where we judge perfectibility | 0:25:56 | 0:25:58 | |
on external criteria that you... I mean, you, Michael. | 0:25:58 | 0:26:01 | |
You want genetic manipulation so that you can have richer, brighter kids. | 0:26:01 | 0:26:06 | |
If the economic meltdown is anything to go by, if you want richer kids | 0:26:06 | 0:26:10 | |
you should get a gene that is stupid and short-sighted. | 0:26:10 | 0:26:13 | |
Then your kids will be richer. | 0:26:13 | 0:26:15 | |
Yes, but I don't think your abuse of me entirely deals | 0:26:15 | 0:26:18 | |
with the moral problem. | 0:26:18 | 0:26:19 | |
I can't ENTIRELY deal with moral problems, none of us can. | 0:26:19 | 0:26:23 | |
We have to say, what CAN we do? What SHOULD we do? | 0:26:23 | 0:26:26 | |
And what good will it do, if we do it? | 0:26:26 | 0:26:28 | |
And every capacity we have, we need to ask those questions. | 0:26:28 | 0:26:32 | |
DICK: Tina, you haven't answered the question. | 0:26:32 | 0:26:35 | |
Because the question is that there's a very big moral difference | 0:26:35 | 0:26:38 | |
between selecting somebody who's got blue eyes | 0:26:38 | 0:26:41 | |
-and somebody who's got Down's syndrome. -No. -Why? | 0:26:41 | 0:26:43 | |
What's the moral difference? | 0:26:43 | 0:26:46 | |
There are a lot of differences. | 0:26:46 | 0:26:48 | |
One is irrelevant, | 0:26:48 | 0:26:49 | |
the other makes a huge difference to the future of their lives. | 0:26:49 | 0:26:52 | |
SIR WALPORT: The issue that we're not discussing | 0:26:52 | 0:26:55 | |
is the potential harm from making the choice. | 0:26:55 | 0:26:57 | |
You go through a process of in vitro fertilisation, | 0:26:57 | 0:27:00 | |
it has side effects. | 0:27:00 | 0:27:02 | |
You go through abortion, again there are risks. | 0:27:02 | 0:27:05 | |
The discussion becomes interesting | 0:27:05 | 0:27:07 | |
if you get into a universe where you can create an egg in the laboratory, | 0:27:07 | 0:27:11 | |
you can create a sperm, | 0:27:11 | 0:27:12 | |
then you can start the selection and there's no harm. | 0:27:12 | 0:27:15 | |
And that's where this is a decision | 0:27:15 | 0:27:17 | |
that shouldn't be made by scientists any more. | 0:27:17 | 0:27:20 | |
It's one that, actually, the whole of society has to debate. | 0:27:20 | 0:27:23 | |
What does the whole of society mean? Does it mean anything? | 0:27:23 | 0:27:27 | |
Well, here we mean parliamentary democracy, actually. We mean... | 0:27:27 | 0:27:31 | |
Well, it's the b... It's the best we can do. | 0:27:31 | 0:27:34 | |
And actually, on in vitro fertilisation and embryo research | 0:27:34 | 0:27:37 | |
Parliament has done well. | 0:27:37 | 0:27:38 | |
-The track record isn't all bad... -Mixed. | 0:27:38 | 0:27:42 | |
SUSAN: Leaving that aside, there's the very exciting question: | 0:27:42 | 0:27:45 | |
what kind of society do we want? | 0:27:45 | 0:27:47 | |
What kind of people do you want your kids to be? | 0:27:47 | 0:27:49 | |
What values and features do we want them to have? | 0:27:49 | 0:27:52 | |
For the first time, this is what's so exciting, | 0:27:52 | 0:27:54 | |
biomedical science has freed us up to have longer lives, | 0:27:54 | 0:27:59 | |
and technology's freed us from the drudgery of the everyday, | 0:27:59 | 0:28:02 | |
so for the first time EVER we can have these issues. | 0:28:02 | 0:28:05 | |
And it's exciting, but it's a very scary time as well. | 0:28:05 | 0:28:08 | |
Thank you all very much indeed. | 0:28:08 | 0:28:10 | |
'In tomorrow's world of impeccable blue-eyed babies, | 0:28:10 | 0:28:15 | |
'carbon neutral electricity, | 0:28:15 | 0:28:17 | |
'and plentiful, aphid-resistant crops, | 0:28:17 | 0:28:19 | |
'let us hope that our weather will be calm | 0:28:19 | 0:28:22 | |
'and our polar bears will be safe. | 0:28:22 | 0:28:24 | |
'But if so, as we sit down to our perfectly balanced meals | 0:28:24 | 0:28:29 | |
'at the dinner party of the future, | 0:28:29 | 0:28:31 | |
'will there be anything left to talk about?' | 0:28:31 | 0:28:34 | |