Now on BBC News, HARDtalk.

Welcome to HARDtalk. From genetic engineering to bioscience, humans are close to acquiring the ability not just to combat disease but to enhance and perfect our species. But should we seek to do it? Or should we shy away from the path that led to Nazi eugenics? My guest is Australian-born, Oxford-based medical ethicist Julian Savulescu. Can we trust ourselves to be wise masters of our own biology?

Julian Savulescu, welcome to HARDtalk.
It seems we humans have acquired extraordinary knowledge of how to re-engineer the genetic building blocks of life. Do you regard that as a profoundly positive thing?

I think it is, because we are all products of evolution. We aren't products of design. We aren't designed to live happily, or live long-term, or be peaceful. We are just the blind results of nature.

Is that natural evolution? What we are going to talk about today is human intervention. Human meddling.

You can call it meddling or you can call it modification, as you like. The important thing to ask is: is the natural state of affairs the best state of affairs? There is huge variation in levels of aggression and intelligence. At the extreme ends we call them diseases. We call the bottom 2% diseased or disordered in some way. Why should we draw the line at 2%? Why not 50% or 30%? We now have the possibility of deciding what kind of beings we should be and what sorts of lives we should lead.

To believe that's a good thing, one has to believe in the wisdom of human beings to use the ability in a way that is both individually and collectively beneficial.

It's certainly correct that with any powerful technology or intervention comes the possibility of great good and great harm. We haven't been judicious masters of our technology, that's true. We have developed nuclear weapons. But can we, at this point, use it? Will it be developed anyway, and should we try to control its direction?
Let's start with repairing disease. Few people these days have ethical issues with the notion of genetic intervention used to prevent or fix some of the most difficult inherited diseases, like cystic fibrosis. That's a reality today, isn't it?

It's certainly a reality that you can select to have children who will be free of the genetic mutation that causes cystic fibrosis. But for many characteristics to do with our mental lives, like psychiatric disorders, the line that's drawn is arbitrary. 10% of children now have this new disease called attention deficit disorder. It's called a disease, but suddenly 10% of children have a disease warranting treatment. I believe that, in many of these cases, these are normal human variations that are just disadvantageous. Science now gives us the ability to improve those.

You go further than just discussing the possibilities that science offers. You seem to believe there is some sort of philosophical or moral obligation upon parents, who live in a context where this technology is available, to use it, to enhance and maximise the potential of their progeny. Am I right?
As parents, most people want to have... want their children to have the best lives they can have.

But is it a moral obligation? As we look forward and the technology becomes more pervasive, would you, as a philosopher and ethicist, say to a parent who is offered the chance to take advantage of this technology, and whose kid genetically looks like he might be below average, that there is a responsibility to intervene and make that kid above average? To make it enhanced?

If it makes his or her life better, yes. By moral obligation I mean that there is a reason to do it. We should do lots of things: provide a good education to children, provide a good diet, treat diseases. Where there are things like vitamins or nutritional supplements that improve children's intelligence or reduce unwarranted aggression, then we should use those in the same way. It sounds like strong language. I'm not saying people should be required by law to do these things, but we should encourage people, and using a term like moral obligation is not to say that's what they must do. There's a good reason to do it. The reason is that the child is expected to have a better life.
This is a quote from you: "Technology allows us to avoid the lottery of life, to choose our destiny". Such a strong statement, and it brings to my mind the word hubris. If humans start talking about choosing and deciding their destinies, that is hubris, is it not?

It can be, but it's also important to realise that there is enormous inequality and disadvantage. We are two of the lucky ones; we are sitting here having a comfortable conversation. There are many people who have a life of extraordinary disease, disability and disadvantage, who are really born behind the eight ball, not only socially but in terms of biological obstacles to their being able to do the basic things in life, like work. If we have a chance to overcome that, why shouldn't we ask ourselves whether we should?

Maybe because it's going down a dangerous path. It's a mindset which the philosopher and ethicist Michael Sandel says is undermining one of the most basic and important qualities we, as human beings, have: humility and openness to the unbidden. We have been open to the unbidden. Why shouldn't we remain open to the unbidden? Why should we try to determine our destiny by meddling with genes?
When you provide education to children, it changes their brains. You modify them. And you do that according to certain values.

What is the moral or any other equivalence between genetic engineering of the human embryo and sending your kid to school?

There's no morally relevant difference. Both change the way the brain works. One acts directly, one acts through social mechanisms. The final common pathway is what's going on in people's brains. If you can use knowledge of science, in this case genetics, to augment the effects of education, why would you draw an arbitrary line just because you think it's an internal modification rather than an external one? Why would that make a difference?
Maybe we can tease that out with some examples, starting with one of the most basic uses of genetics that people may be aware of: gender selection. For a long time now people have been able to look at the fertilised embryo and see what sex it is. You have taken a strong position on this, saying it's a basic human right, that everybody should be allowed to choose the sex of their baby.

We have the right to choose when to have children and how many children to have; now we have the possibility of choosing between possible children. If you are going to restrict liberty in a liberal democratic society, you need to have a reason based on a risk of harm to people. You can't go around willy-nilly stopping people doing what they want unless they are really harming somebody. When it comes to sex selection, who are they harming? There are several possible candidates, but the usual one is society, or women, through creating sex ratio imbalances.

Which we see all over the world.

Not through these genetic technologies, but basically through prenatal testing and abortion, or infanticide. But when it comes to the use of these technologies, if you could demonstrate a harm to women or society, that would be a good reason to ban it, but you could easily prevent this. You simply allow for what is called family balancing, allowing it for the second or third child of a family who want a mix of sexes. You can monitor the sex ratio; there are many ways in which you can ensure that these sorts of harms don't occur.
You talk about this in the context of a liberal, free, rich-world society. Do you accept that if you were to apply the logic of freedom to choose for individual parents on this gender question, in many societies you would end up with a terribly unbalanced gender situation inside the country, and it would also entrench the second-class status and the disempowerment of so many women in so many societies?

If it had those effects...

It would.

Then it would be a good reason to ban it in those societies. A basic human right, or respect for freedom, is always limited by the harm that you present to other people. In this case, there is a clear harm to women. But if we don't have a grave social problem, we shouldn't be infringing people's liberty.
Imagine a world where the science of genetics has gone even further than it has today, and one can look at the fertilised embryo and draw all sorts of conclusions about things you have already mentioned: intelligence, the likelihood of an inclination to violent or criminal behaviour. Coming back to moral obligation, do you believe in those circumstances there is a moral obligation to intervene? To end those pregnancies which involve embryos with those signatures, or to somehow repair them?

Again, this is a reason among others. I have never suggested that people have a moral obligation to have a termination of pregnancy. That's a very serious intervention. What I have been discussing is when you have ten embryos and you have tested them all for diseases...

Like IVF?

This is the simplest case. You will soon be able to sequence the embryos, and you have found they are healthy. Why wouldn't you go on to look at their dispositions towards intelligence or towards psychopathy? 1% of the population are psychopaths. We might identify the genetic contributions to psychopathy - not to say it's genetic, but those things that make it more likely - and why wouldn't you take that information into account when deciding? You will implant one of those ten embryos. Why not choose the one that has a lower chance...

That sounds dangerously like eugenics, when you take away all the traits, all of the huge numbers of different traits that make up human beings, and you try to develop a successful stereotype.

We are in a completely different world to the world of the past. It's true, this is eugenics. It is eugenics when you test for cystic fibrosis or Down's syndrome; that's trying to have a child who is healthier or has less intellectual disability.
That's healing a real impairment. What you are now talking about is trying to perfect, trying to maximise, the benefit to any newborn individual. For example, you might say that in a deeply prejudiced society, where it's difficult being an ethnic minority, being black or being gay, you should meddle with the genetic material before birth - in IVF, for example - to lighten the skin or to avoid having the genetic disposition to homosexuality, because those things would create such problems for the future child. Is that what you would contemplate?

You also have the obvious strategy of reforming those institutions.

That's politics. But in the short term, as a scientist, you are faced with the issue: do I get rid of the embryo that is physically disabled, or predisposed to being homosexual, because the individual, if born, would have a much more difficult life than the perfect stereotype that I'm seeking?

As I said, there are a number of ways to address this kind of problem, and one of them is through social intervention.

That does not answer my question. Imagine the society is not fixed; it has not worked the way you would like it to work. But you still have to decide. Do you think it's legitimate to decide, on the basis I just outlined, to get rid of the embryo that is predisposed to be homosexual or might have a physical impairment? Because, to use your logic, why wouldn't you? Why not take the embryo and maximise its own advantage?
From the perspective of the child that is born, in considering whether their life goes well, you have to consider these unjust institutions and choose the child that will have the better future. But there are other reasons not to bend to these forces. Perhaps people want to change these unjust practices by resisting this kind of choice. But you have to remember that you are not doing that for the benefit of the child; you're doing it to change society. And the reasons why people choose to have boys in India... There are two reasons. First, if you have a girl, you have to have a very large dowry. For poor people, it's very expensive to have a girl. The problem there is the institution of the dowry. If you are a poor person and you cannot afford a dowry, there is a very good reason for you to select to have a boy. The point in these cases is not just to say that you are perpetuating the problem; you have to address the causes, not the symptoms, of the disease. The causes are homophobic attitudes, racism and outdated social institutions. Those are the causes. Of course, if you cannot treat the disease, you have to consider symptomatic treatment.

Thereby entrenching those very institutions.

And that might be a reason not to administer symptomatic treatment. But you are picking a case where...

You are clearly an ethicist, and you have to deal with it.

Ethics is about not considering just one reason but considering all the relevant factors. You cannot make a decision in theory about the rightness or wrongness of selection that applies in every single circumstance. I don't want to suggest this is simple. It's complex.
Human cloning. Do you believe that the ultimate logic of what you suggest about perfecting - enhancing and perfecting human genetic material - does lead to cloning? Taking the very best and replicating it?

Well, again, if that were your only goal, to have the most gifted possible child, then...

Sometimes it sounds like that is your goal.

No, it's not. Sometimes we have a reason. But sometimes people want to have their own child, not Albert Einstein's child or Michael Jordan's child. You would need to have a good reason to clone. Here is one. Imagine you were infertile and could only produce one or two embryos. You cannot produce any more eggs. And you have a pregnancy, and it looks like the pregnancy will be lost. But you could clone that foetus in utero in case the pregnancy miscarries. That seems an entirely legitimate use of cloning technology, if you could do it, to enable that couple to have a child. Yes, it would be a clone, but so what?
One line you appear to be happy to cross is cross-species genetic mixing. You talk about a future in which the human being might have, to quote you, sonar ability like a bat, or indeed the ability to convert sunlight into energy for growth, like a plant. I know that this is blue-sky thinking, but you don't have an ethical problem with this cross-fertilisation of genetic material?

Again, it's important to start with the science and the facts. We are the result of the integration of viral DNA into our genome. A lot of our genetic material is viral. We are already integrating material from the environment. Now, if I had a gene sequence that could confer resistance to HIV, that came from some other animal, that was constructed in a laboratory and put into people to prevent the spread of HIV, you have to ask: why is the fact that it comes from an animal different from the fact that you just invented it in a laboratory? In fact, you could just copy the sequence and construct it in a laboratory. The question is: what are the risks and benefits of the particular case under consideration? If we ran out of food and we could then use sunlight to produce energy, why would that be... Certainly unnatural, but why would it be wrong? Risks and benefits. It's always a difficult calculation.
You have been talking about futurology to some extent, but let's bring it back to something real, which is your stance on performance-enhancing drugs in sport. You have made a big noise around the world, saying that you believe it is right to allow athletes, cyclists, to use performance-enhancing drugs, even though many of those same athletes say that it is absolutely not the way they want their sport to go. Why is that?
The rules of sport are entirely arbitrary. There is no inherent rightness or wrongness in the set of rules; you choose them according to various values. Some of those values include their ability to be enforced, the ability to protect the health of participants, the ability to create a fair playing field that is enforceable. What athletes are actually doing today is not what the East Germans did in the 1970s, taking massive doses of drugs to turn women into men. They are moving within the natural range for all of these parameters: blood, testosterone, growth hormone. It's virtually impossible to detect whether someone is moving within this normal range, and it's safe because it's within this normal range.

But if you persist in this, you turn sport into a contest between scientists rather than a contest between physical athletes. Because the best drugs will win.

Of course, you could, if you again moved into supra-normal doses of these things.
You seem to miss the human element. Let me quote you Bradley Wiggins, who says he has never used drugs, in a sport which has been tainted by drugs. He says: "I do what I do because I love it. I don't do it for a power trip. What I love is doing my best and working my hardest. If I felt I had to take drugs in my sport, I would quit tomorrow".

Well, they do take substances. Caffeine used to be banned. It's not a naturally-occurring substance. It's now permitted, and it increases the tolerance of the body. Other examples include analgesics: local anaesthetics, non-steroidal anti-inflammatories. They enable athletes to deal with the pain and damage of competition. Nearly 100% of footballers are on these sorts of drugs to enable them to perform better. They are very unnatural and they are dangerous, but nobody perceives them to be performance-enhancing drugs. Of course they are - that's why they give them.
You put an enormous amount of faith and trust in humanity to use ever-expanding scientific, biological knowledge to the best effect, for the best good of us as a species. There is one area where you seem to fall down in that approach, and that's bioscience, where you have admitted you're terrified by the capacity of human beings to create a new pathogen which could actually wipe us all out. Are you terrified because of what bioscience can do, or are you actually terrified because you have come to realise that human beings, we ourselves, are actually pretty dangerous?

I don't have great faith in human beings and their ability to make decisions about this technology at all. We are the most dangerous species on the planet. We may extinguish all life on the planet. Don't get me wrong.

And yet you want to give us all these freedoms to meddle with nature.
When it comes to meddling at the level of allowing people to choose their family, yes, they should have freedom. But what I would say is that once science gives us the knowledge and the power, we have to make a decision, and we have to make it on the basis of values and ethics. What I have argued in terms of sport and genetic selection is that once you can make a difference, you have to ask whether you should stay with the status quo or not. And if you choose to stay with the status quo, you are then responsible for it. In all these areas we have to start thinking, because unavoidably science is giving us the power to make these sorts of changes. My point is that we need to have a set of values. What I have argued with sport is that you can have a set of values that allows a level of safe performance enhancement that would not threaten the human contribution, would be much more enforceable and would make the whole thing better. That is just a proposal. We know it's up for debate. But that's what ethics is about - making proposals.

And plenty of debate. Thank you for joining us.