Episode 8

Download Subtitles

Transcript

0:00:03 > 0:00:06Today on The Big Questions...

0:00:06 > 0:00:08Social media.

0:00:08 > 0:00:10Can you have the good without the bad?

0:00:10 > 0:00:12And Max's Law.

0:00:12 > 0:00:19Should you have to opt out of being on the organ donor register?

0:00:25 > 0:00:30APPLAUSE

0:00:30 > 0:00:32Good morning, I'm Nicky Campbell, welcome to The Big Questions.

0:00:32 > 0:00:34Today, we're live from Bath Spa University.

0:00:34 > 0:00:36Welcome, everybody, to The Big Questions.

0:00:36 > 0:00:43APPLAUSE

0:00:43 > 0:00:45Social media.

0:00:45 > 0:00:48The online world of Twitter, Facebook, Snapchat and many other

0:00:48 > 0:00:52sites, where ideas and comments can be posted and pictures shared,

0:00:52 > 0:00:58is probably the biggest change to have affected our daily lives

0:00:58 > 0:01:00since the advent of television or the mobile phone.

0:01:00 > 0:01:02People across the globe can share what is happening to them

0:01:02 > 0:01:04with friends and complete strangers in an instant.

0:01:04 > 0:01:06Politicians, businesses, entertainers, artists and conmen

0:01:06 > 0:01:08all have an easy way to peddle their ideas

0:01:08 > 0:01:14and wares direct to you, 24 hours a day, wherever you are.

0:01:14 > 0:01:17The snag is there is no editorial control, there are no real systems

0:01:17 > 0:01:23to filter out the fake news or the scams.

0:01:23 > 0:01:26Indeed, fake news is often used to direct the unwary

0:01:26 > 0:01:27viewer to the scams.

0:01:27 > 0:01:29And while it can bring people closer together,

0:01:29 > 0:01:32it can also be highly divisive, pitting groups against each other

0:01:32 > 0:01:34and unleashing storms of abuse on hapless individuals.

0:01:34 > 0:01:42Is social media beyond control?

0:01:43 > 0:01:49Laura, welcome to The Big Questions, PhD researcher in social media. Out

0:01:49 > 0:01:54of control, isn't that the point, the wonderful thing about social

0:01:54 > 0:01:58media?I am with you, social media is a wonderful thing, but it has got

0:01:58 > 0:02:03to the point where it is well beyond control. In the last general

0:02:03 > 0:02:09election, we were exposed to the amount of abuse social media can bring,

0:02:09 > 0:02:16MPs spoke about the amount of abuse they experienced, and that exposed

0:02:16 > 0:02:21that actually social media can be a threat to democracy. I have had the

0:02:21 > 0:02:24pleasure of interviewing politicians and candidates and I remember a

0:02:24 > 0:02:28candidate speaking to me openly about the abuse she received and she

0:02:28 > 0:02:33turned around and said, if I had children, I would not put myself in

0:02:33 > 0:02:38this position, and that is a threat to our democracy, we are segregating

0:02:38 > 0:02:43certain people from going forward in our political system.Our question,

0:02:43 > 0:02:53is it beyond control, can it be controlled, should it be controlled?

0:02:53 > 0:02:55The other side, a great force for democratisation, it has given people

0:02:55 > 0:02:58a voice, a fantastic thing, don't we have to accept it, uncomfortable as

0:02:58 > 0:03:04it is, because of the good?It does bring good and campaigns have been

0:03:04 > 0:03:12won solely online, The Everyday Sexism Project, brilliant, but there

0:03:12 > 0:03:18have to be restrictions. I am all for freedom of expression. But that is

0:03:18 > 0:03:23not absolute.The establishment is rattled. It has rattled their cage.

0:03:23 > 0:03:28If you start talking about restrictions, we smell a rat. Does

0:03:28 > 0:03:38everybody smell a rat? OK. Let us not move onto the next debate quite

0:03:38 > 0:03:43yet.Ben? A threat to democracy, but there is a real problem here which

0:03:43 > 0:03:48is a case in India last July, seven guys beaten to death by a mob

0:03:48 > 0:03:53because of a fake story on WhatsApp, that they were child abductors.

0:03:53 > 0:03:58Same thing happened in the US, the story Hillary Clinton was running a

0:03:58 > 0:04:06paedophile ring from the basement of a pizzeria. An American guy went in

0:04:06 > 0:04:11with an assault rifle and started firing. What happens online does not

0:04:11 > 0:04:17stay online and it can have real life and real death consequences.Is

0:04:17 > 0:04:22it a new thing? It is an amplified thing now, of that there is no doubt. But

what about online abuse, Emily? People do not want to go into

0:04:26 > 0:04:31politics because of the vile stuff they are receiving; something has to be

0:04:31 > 0:04:37done?Nothing has to be done. People use social media for good, the

0:04:37 > 0:04:41overwhelming majority. As is the case in public life, there is a flip

0:04:41 > 0:04:45side, there will always be people who are nasty and abusive, but I

0:04:45 > 0:04:54would say no regulation, because with any regulation, who is the person who

0:04:54 > 0:04:59decides what is the truth, the line of what people can and cannot say?

0:04:59 > 0:05:05That worries me, deeply concerning, that there will be gatekeepers and

0:05:05 > 0:05:10the right opinion. People get abused, deeply unpleasant. But it is

0:05:10 > 0:05:16a small drop in the massive ocean of good social media does in connecting

0:05:16 > 0:05:20people.Obviously, abuse happens on the street, but if you look at the

0:05:20 > 0:05:24type of abuse going on on social media, I would argue very little of

0:05:24 > 0:05:30that happens on the street. Could you imagine, for example, Jess

0:05:30 > 0:05:34Phillips, an MP in Birmingham, she spoke out about receiving 600

0:05:34 > 0:05:38threats of rape in one night alone on Twitter. Could you imagine

0:05:38 > 0:05:42someone in the street stood there while someone screamed threats of

0:05:42 > 0:05:49rape at you?Death threats as well. Have you had any?I have. I ran a

0:05:49 > 0:05:54press freedom campaign and I received thousands in the night.

0:05:54 > 0:05:59Death threats? Death threats, rape, misogynistic abuse. The difference

0:05:59 > 0:06:03between that and on the street, for example, I could turn off Twitter

0:06:03 > 0:06:09and it was quite a powerful feeling, thinking that, actually, this stuff

0:06:09 > 0:06:14is really unpleasant and no one is saying it is nice. I turned it off,

0:06:14 > 0:06:18my phone, and it stopped.Because of the dominance of social media, you

0:06:18 > 0:06:24are saying, the best way to overcome abuse is switching it off, well, is

0:06:24 > 0:06:29that actually controlling it, switching it off? If we look at the

last general election, social media dominated. For MPs, they campaigned

0:06:37 > 0:06:41a lot on social media. If you said,

0:06:41 > 0:06:44you do not want to receive the threats online, switch off social

0:06:44 > 0:06:49media, would those MPs be in Parliament now?That is not the only

0:06:49 > 0:06:54option. If you are the victim of abuse, you do have ultimate control,

0:06:54 > 0:06:59you can turn it off, mute people, ignore them, it is not a tangible

0:06:59 > 0:07:04threat. It is not pleasant and no one reading through the tweets or

0:07:04 > 0:07:09messages is thinking, this is great fun, but you are in control, it is

0:07:09 > 0:07:15not a real threat to your life.To the audience, what would you like to

0:07:15 > 0:07:24say?We are assuming the mainstream media is not out of control. MSM.

0:07:24 > 0:07:30They are owned by the corrupt elite, brainwashing us for years.We have

0:07:30 > 0:07:35got regulators...Anything on mainstream news, we are saying it is

0:07:35 > 0:07:40real? I would much rather go to YouTube and look at Russell Brand to

0:07:40 > 0:07:44find out the truth behind something rather than immediately believing

0:07:44 > 0:07:50the newspaper. Taking some of the responsibility, like the lady said,

0:07:50 > 0:07:56the social media is about the people and taking control and learning to

0:07:56 > 0:07:59be... Learning how to recognise if something is real or fake, we should

0:07:59 > 0:08:05be teaching it in schools, and teaching about media, so that we can

0:08:05 > 0:08:10look at something, is that the truth?Russell, that is a good

0:08:10 > 0:08:16point, that is a load of baloney?I look at where he has come from, his

0:08:16 > 0:08:21history, background, belief systems, his intention.And it is your belief

0:08:21 > 0:08:26system?It might be, but it might challenge mine.You and Russell are

0:08:26 > 0:08:31in a bubble.My views have changed massively since I was young and

0:08:31 > 0:08:36a Socialist worker, and now I am in my 40s, with totally different beliefs, I

0:08:36 > 0:08:41have children, things change. I would love my children to be

0:08:41 > 0:08:45inquisitive, even when it comes to TV adverts, not much difference

0:08:45 > 0:08:50between TV adverts and Instagram. The mainstream media, people do not

0:08:50 > 0:08:54trust the mainstream media anymore, Ben?There is a conspiracy theory

0:08:54 > 0:09:00they are peddling the same story. Try to find the Telegraph and the

0:09:00 > 0:09:04Guardian agreeing on anything. They have hundreds of years of tradition

0:09:04 > 0:09:09and there is a system in place, fact checkers. A couple of times they

0:09:09 > 0:09:13have run stories I have done and it has taken time to get through

0:09:13 > 0:09:18because they are checking every link. There is a process. There is a

0:09:18 > 0:09:25fact checking process, correction process.This is dangerous.This is

0:09:25 > 0:09:34dangerous? Can I introduce you? Presenter of RT, which used to be Russia

0:09:34 > 0:09:45Today, Afshin Rattansi.The reason there is fertile

0:09:45 > 0:09:48ground for the fake news is because the public have lost faith in the

0:09:48 > 0:09:57media. It is finished. That is why RT is doing well, maybe even Donald

0:09:57 > 0:10:02Trump is doing well. The reason why is these publications, whether on

0:10:02 > 0:10:09the Iraq war, Afghanistan war, Libyan war, issues of war and

0:10:09 > 0:10:15peace...Was it a problem with the BBC? The BBC just about got closed

0:10:15 > 0:10:22down.They fired me, they fired the Director General. We could see we

0:10:22 > 0:10:30were being told by the Government to persuade the people into war. The

0:10:30 > 0:10:34Guardian, the Observer, they supported the same thing.

0:10:34 > 0:10:39Interesting, but on the point... A lot comes back to Russia, how much

0:10:39 > 0:10:46of a problem, a danger even, are RT? In terms of their viewing, very

0:10:46 > 0:10:54small, the figures, 0.4% of the time being watched, but in terms of the

0:10:54 > 0:10:56impact, there are second and third order effects, and what

0:10:56 > 0:11:04we have seen is a full-scale attempt to undermine the mainstream media.

And on... The public respond to us. So does Ofcom.Ofcom has had more complaints

0:11:14 > 0:11:20against the BBC than against RT. When it gets down to media

impartiality, overall, you are right, Ofcom have lots of... They deal with

0:11:27 > 0:11:30nudity, inappropriate language. Violations of journalistic

0:11:30 > 0:11:35standards, the obligations for due accuracy and impartiality:

0:11:35 > 0:11:47RT has had more programmes found guilty of violating those standards.I would say... I worked

0:11:47 > 0:11:54there. I have a TV show. No one has told me what to do.But your chief

0:11:54 > 0:11:59editor refers to RT as the information weapon. In an interview

0:11:59 > 0:12:04in 2012...It is a weapon for the poor and dispossessed. We interview

0:12:04 > 0:12:20the worker in the factory, not the CEO.In an interview in 2008, she said, we are fighting a war

0:12:20 > 0:12:30against the entire Western world. I think

0:12:30 > 0:12:34those comments were taken completely out of...Let us take it back to

social media, what are the Russians doing with bots and trolls? Bots are

0:12:42 > 0:12:45automated accounts. What we saw from Russia was an outfit called the

0:12:45 > 0:12:50Internet Research Agency in St Petersburg running 3,500 troll

0:12:50 > 0:12:54Twitter accounts, fake Facebook accounts, masquerading as Americans

0:12:54 > 0:13:01on both sides of the political divide, a lot of the content was

0:13:01 > 0:13:04pro-Trump, pro-guns, anti-migrant, white supremacist.A

0:13:04 > 0:13:11lot of the content was Black Lives Matter, saying white supremacists

0:13:11 > 0:13:14are evil, I would not necessarily disagree with that, but pushing both

0:13:14 > 0:13:20sides. In May, 2016, the troll factory in St Petersburg ordered two

0:13:20 > 0:13:25simultaneous rallies in Houston, one protesting against the opening of an

0:13:25 > 0:13:28Islamic cultural centre and one in favour.The Soviets used to do this,

0:13:28 > 0:13:34active measures. Propaganda. They spread the rumour that HIV was

0:13:34 > 0:13:38created by the CIA. They kicked off the stuff about the Kennedy

0:13:38 > 0:13:43conspiracy. What is the problem? The same as it ever was, the scale of

0:13:43 > 0:13:48it?The scale and the directness. There were cases from Robert

0:13:48 > 0:13:53Mueller's indictment, the troll factory was interacting with real

0:13:53 > 0:13:57Americans through social media and organising campaign rallies, the

0:13:57 > 0:14:01troll factory paid people to turn up, dressed as Hillary Clinton in

0:14:01 > 0:14:05jail. You had direct contact between Russian agents in St Petersburg and

0:14:05 > 0:14:10Americans on the ground, not a case of collusion, people being fooled.

0:14:10 > 0:14:17This is dangerous. Will Moy.So interesting how much the world has

changed since we started Full Fact. When we started, you could list

0:14:22 > 0:14:30every outlet in the country, that day is over now. It means all of us

0:14:30 > 0:14:34can tell people what we think and what we know about the world, we can

0:14:34 > 0:14:38share those ideas, it is exciting, but it is much harder for all of us

0:14:38 > 0:14:41because things are coming to you from 1000 different places and you

0:14:41 > 0:14:46do not know the track record, you have to do more work to figure it

out. Every link being fact checked, as Ben said, I'm sorry,

0:14:55 > 0:14:59that is not possible. Some are clinging onto enough money to do

0:14:59 > 0:15:05that, but most are publishing stuff too quickly to make that possible.

0:15:05 > 0:15:09That is what Full Fact does. We publish links to every review so

0:15:09 > 0:15:13that you can judge it for yourself and this is where we will have to

0:15:13 > 0:15:17end up. If you want someone to trust you, it will not be enough to say, I

0:15:17 > 0:15:22am so-and-so, take my word. It will have to be, I can show you where you

0:15:22 > 0:15:26got it from, you can judge it.Are people interested in checking it

0:15:26 > 0:15:32out? It is not critical thinking, it is wishful thinking. In a sense. You

0:15:32 > 0:15:42see stuff you want to think is true and you disregard the rest.

0:15:42 > 0:15:52We will all do that, of course. On the BBC, we get both sides of it.

0:15:52 > 0:15:57We have always tended to look for things we agree with, when we choose

0:15:57 > 0:16:03what newspaper to buy, we used to choose because we liked its views.

0:16:03 > 0:16:07I buy things I disagree with. But most of us are human, we look

0:16:07 > 0:16:13for things that we agree with. That is not a new challenge. We have to

0:16:13 > 0:16:19look on the Internet, it is about all of us having conversations with

0:16:19 > 0:16:26each other, but there are parts of it from a news point of view which

0:16:26 > 0:16:30aren't just all of us having conversations with ourselves. When

0:16:30 > 0:16:35it comes to people paying money in secret to target adverts to

0:16:35 > 0:16:41particular parts of the population, you don't know who, that is new and

0:16:41 > 0:16:44an exercise of power and it is reasonable to ask who is doing it

0:16:44 > 0:16:54and it should be transparent. Tell us the outlets you trust.

0:16:54 > 0:16:59First, Steve, if it is beyond control, is there any hope of doing

0:16:59 > 0:17:04anything to rein in not just the abusers but also the fake news

purveyors? We need to design an architecture

0:17:11 > 0:17:14that is more conducive to high-quality information rather than

0:17:14 > 0:17:21fake news. I share the same concerns about censorship, determining what

is fake news. We have to understand how people respond

0:17:24 > 0:17:33to information, how Facebook and Twitter can be used, and the idea of

0:17:33 > 0:17:36targeting specific individuals based on their personality is a real risk

0:17:36 > 0:17:42to democracy. The reason is that it is happening in private. No one else

0:17:42 > 0:17:47knows what this person has perceived in terms of information. That is

0:17:47 > 0:17:51qualitatively different from the way things used to be where parties

0:17:51 > 0:17:54would put up billboards next to the road and we could all see them and

0:17:54 > 0:18:01knew what the message was. We now have the technology and it isn't the

0:18:01 > 0:18:08Russians, but right here in the UK. They are designing custom designed

0:18:08 > 0:18:12messages shown to people on Facebook. We don't know to what

0:18:12 > 0:18:16extent they are customised. Is the British Government doing it

0:18:16 > 0:18:20in the same way albeit on a smaller scale?

0:18:20 > 0:18:24I am not talking about governments. Do we have a mirror image to what

0:18:24 > 0:18:36Vladimir Putin is doing? I am not an expert.And what was that allegation...The man who has

0:18:36 > 0:18:44the billion-dollar contract...Can I say one other element which is

0:18:44 > 0:18:48related, this started with four companies, Snapchat, Twitter...

0:18:48 > 0:18:57There is censorship going on. It is not as free as you make out. The

0:18:57 > 0:19:02idea of fake news as a price to pay for a free Internet, these companies

0:19:02 > 0:19:06are censoring Google search terms, lots of people saying the articles

0:19:06 > 0:19:13that are not the mainstream, the Atlantic Council, the pro-NATO

0:19:13 > 0:19:16organisations, they are given preferential treatment. In the case

0:19:16 > 0:19:20of harassment everyone here would say these companies have to do more.

0:19:20 > 0:19:28They are not democratically... Steve, finish your point. The

0:19:28 > 0:19:35algorithms thing, you go on Amazon, you buy a book, it tells you the

0:19:35 > 0:19:40same set of books you should buy. It doesn't give you the psychological

0:19:40 > 0:19:46impetus to break away. That is right but it is an

0:19:46 > 0:19:50opportunity, we have the technology to change algorithms and make

0:19:50 > 0:19:57suggestions to people on Amazon taking them outside their

0:19:57 > 0:20:05comfort zone. I like this, and I wish I had the money to buy them.

0:20:05 > 0:20:12But for society it would be better for Amazon to tell me a book I would

0:20:12 > 0:20:15not like. Exactly right. That is the avenue we

0:20:15 > 0:20:20should pursue, to think about clever ways to broaden people's access to

0:20:20 > 0:20:25information without censorship. Editorial control, and the situation

0:20:25 > 0:20:31in Germany as well. What would you like to say?Who do

0:20:31 > 0:20:37we trust? Regarding fake news, it is a subjective thing, and what is the

0:20:37 > 0:20:47real definition? CNN? It is a slur which has no objective meaning. As

0:20:47 > 0:20:50it is used it only means a news source I personally disagree with.

0:20:50 > 0:20:56Is it not used to... But if you check the facts behind

0:20:56 > 0:21:05the story?I was trying to get you some business!

0:21:05 > 0:21:12Thank you. Fake news is now a term used to abuse journalists when they

0:21:12 > 0:21:17hold politicians to account which is a bad thing.Does it have a precise

0:21:17 > 0:21:21definition?We need to recognise there are lots of separate problems.

0:21:21 > 0:21:26It started when people noticed there were teenagers in places like

0:21:26 > 0:21:31Macedonia publishing made up stories to get advertising. That was one

0:21:31 > 0:21:35kind of fake news. Another kind is where you take real data about the

0:21:35 > 0:21:40economy, you distort it for your political campaign. If you put both

of those in one bucket we will never solve anything. One of them is part of

0:21:45 > 0:21:51political debate. You can check a fact, the side of a

0:21:51 > 0:21:56bus, £350 million for the NHS, that is not fake news but a claim you can

0:21:56 > 0:22:20check.Absolutely, you can check a fact. We can check the legal basis, all of that, for you.

0:22:20 > 0:22:26I don't know what your protocol is for establishing a procedure to

0:22:26 > 0:22:32investigate whether a story is true. Journalism is a qualitative process,

0:22:32 > 0:22:37to obtain information. You don't have hard metrics for establishing

0:22:37 > 0:22:48whether it is true. You are talking about other people's

0:22:48 > 0:22:52journalistic procedures which comes down to the same procedures

0:22:52 > 0:22:59for checking whether something is true. There is rarely a

0:22:59 > 0:23:06fundamentally absolutely objective truth underlying any story and it is

0:23:06 > 0:23:10not obvious that is something that can be obtained simply, quickly,

0:23:10 > 0:23:14efficiently by a public or private agency who happens to have some

0:23:14 > 0:23:21greater measure of the truth.With Lord McAlpine when he was labelled,

0:23:21 > 0:23:24there was an objective truth underneath that which led to the

0:23:24 > 0:23:34court room. Steve? There is a real risk by saying, no

one knows what the truth is, we are throwing up our hands and saying

0:23:41 > 0:23:46there is too much information, something which has been happening

0:23:46 > 0:23:52on social media, and the culprits are making that claim in what I call a

shock and chaos approach to fake news. The moment we give up on

0:23:56 > 0:24:01this idea that there are certain things we can check like the

0:24:01 > 0:24:12falsehood about the £350 million. And it was false. But the point is,

0:24:12 > 0:24:15there are certain things that are false and other certain things that

0:24:15 > 0:24:25are true. The inauguration crowd for Trump.We

0:24:25 > 0:24:28have an innate ability to critically look at things for us to decide for

0:24:28 > 0:24:37ourselves what is true.Do we have that critical ability? It is the

0:24:37 > 0:24:43same thing we accuse young jihadists of when they are radicalised. Maybe we are

0:24:43 > 0:24:51all losing the ability. Some people still stand by the 350 figure, as a

0:24:51 > 0:24:58gross figure, saying that is about the amount of money we can put in.

0:24:58 > 0:25:04Let us point out lots of things are true and false, lots more important

0:25:04 > 0:25:09things we don't know. A large part of what we are doing is to say this

0:25:09 > 0:25:13is as much as we do know and don't know, we put the shades of grey back

0:25:13 > 0:25:24in.I want to come to you.For me, with fake news, a lot of times it

0:25:24 > 0:25:27will come up on your social media profile and you will read the

0:25:27 > 0:25:33headline, how many go on to click? As soon as you click you might

0:25:33 > 0:25:39realise it is fake news. They will read the headline, I know people who

0:25:39 > 0:25:43then actively go on to share it and believe that is true.

0:25:43 > 0:25:52Facebook made a profit of £39 billion. You might say that is

0:25:52 > 0:26:02terrible. But these adverts, they are making money, some money.For

0:26:02 > 0:26:06me, social networking companies should be held to account. We

0:26:06 > 0:26:11mentioned education and the Government have put forward in their

0:26:11 > 0:26:14green paper compulsory education within schools with regard

0:26:14 > 0:26:20to social media. I believe that is one way forward in that we should be

0:26:20 > 0:26:24educating. We have a whole generation that has been brought up

0:26:24 > 0:26:28with social media. They don't know a life without social media. Shouldn't

0:26:28 > 0:26:34we be educating them on social media?

0:26:34 > 0:26:43How do you counteract the business model? There is a big paradox which

0:26:43 > 0:26:48relies on the fakery? That is right, this is where you

0:26:48 > 0:26:53have the bleeding over from social to traditional media, so much that goes

0:26:53 > 0:26:59on is emotional targeting. Someone will put out a scare headline to get

0:26:59 > 0:27:05people afraid and they will click on it and then realise the story is a

0:27:05 > 0:27:11wind-up.Sometimes it is not. Sometimes it is true. Look at the

0:27:11 > 0:27:15tabloids, they have been doing this for decades, there is emotional

0:27:15 > 0:27:19targeting.Every reason why we shouldn't panic.

0:27:19 > 0:27:29Yes, lots of things are true. We are not machines, the world is really

0:27:29 > 0:27:32new now, and historically every time new communications technology has

0:27:32 > 0:27:39come up people have panicked and tried to limit it. When it comes to

0:27:39 > 0:27:44the comparison with the tabloids we need to be careful.And the

0:27:44 > 0:27:50difference between what the tabloids used to do and Facebook now, one of

0:27:50 > 0:27:53the really important things we know from cognitive science about why

0:27:53 > 0:27:57people stick to their beliefs is because they think they are widely

0:27:57 > 0:28:03shared. With social media we have an opportunity for anyone, no matter

0:28:03 > 0:28:08how absurd their belief, they can find a community of like-minded

0:28:08 > 0:28:12people on the Internet, and think their belief is widely shared.

0:28:12 > 0:28:16People out there seriously think the earth is flat. You can go on to

0:28:16 > 0:28:22Facebook and have a community of a thousand people around the world.We

0:28:22 > 0:28:28are doing that debate next week! So you will be resistant to changing

0:28:28 > 0:28:36your belief.300 years ago... The last word? Google is one of the

0:28:36 > 0:28:41most powerful companies and is already censoring it, it's not free

0:28:41 > 0:28:48at the moment. From WikiLeaks, we know Google works with Hillary

Clinton, you mentioned four corporations, this is far away from the

0:28:51 > 0:28:57dream of the Internet, that it should be free. These are multi-billion dollar

0:28:57 > 0:29:03companies censoring already. We need regulation to stop it?Like

0:29:03 > 0:29:08we nationalised the telephone company here, or the post office, we

0:29:08 > 0:29:13can nationalise some of these so they can come under democratic

0:29:13 > 0:29:17accountability.Broadcast organisations in the pocket of the

0:29:17 > 0:29:24Government?Not in the pocket of the Government.Democratically

0:29:24 > 0:29:29accountable. Thank you very much indeed.

0:29:29 > 0:29:33Thank you for your thoughts on that debate.

0:29:33 > 0:29:37You can join in all this morning's debates by logging

0:29:37 > 0:29:39on to bbc.co.uk/thebigquestions, and following the link

0:29:39 > 0:29:40to the online discussion.

0:29:40 > 0:29:44Or you can tweet using the hashtag #bbctbq.

0:29:44 > 0:29:47Tell us what you think about our last Big Question too.

0:29:47 > 0:29:49Should we presume consent for organ donations in England?

0:29:49 > 0:29:52And if you'd like to apply to be in the audience

0:29:52 > 0:29:55at a future show, you can email audiencetbq@mentorn.tv.

0:29:55 > 0:29:57We're in Edinburgh next week, then Newport in South Wales

0:29:57 > 0:30:00on March 11th, and Brighton the week after that.

0:30:07 > 0:30:10Friday saw a successful Second Reading of Geoffrey Robinson's bill

0:30:10 > 0:30:16to change the basis of the organ donor register to one

0:30:16 > 0:30:17of presumed consent in England.

0:30:17 > 0:30:20The Government is going to go ahead with the idea.

0:30:20 > 0:30:22Last autumn, Theresa May indicated her strong support

0:30:22 > 0:30:25of an opt-out system in a letter to Max Johnson, a nine year old boy

0:30:25 > 0:30:27who had received a heart transplant.

0:30:27 > 0:30:30She said it would be known as Max's Law.

0:30:30 > 0:30:34But there are still several hurdles to get across first,

0:30:34 > 0:30:37not least whether a potential donor's family should have a right

0:30:37 > 0:30:43of veto when the final decision must be made.

0:30:43 > 0:30:45Think about that one.

0:30:45 > 0:30:48Wales, just down the road from here in Bath, changed

0:30:48 > 0:30:49to an opt-out system two years ago.

0:30:49 > 0:30:52It's fair to say the results so far are mixed.

0:30:52 > 0:30:58Should we presume consent for organ donations in England?

0:30:58 > 0:31:01Scotland is still waiting on this one and trying to make up their

0:31:01 > 0:31:11minds. Here is the thing, consultant transplant surgeon, welcome, Mike

0:31:11 > 0:31:16Stephens, I have signed up to the register. As an adult, I expressed

0:31:16 > 0:31:21wishes, as an autonomous human being, but my wife or my daughters,

0:31:21 > 0:31:27my mother, they could stop that happening. How can that be right?

0:31:27 > 0:31:34You cannot think of that happening in any other sphere.Well, if you

0:31:34 > 0:31:38phrase it like that, it is not right, but I think you need to look

0:31:38 > 0:31:42at this in the context of the whole discussion. It is interesting we

0:31:42 > 0:31:47start the discussion here because the family's role is unchanged in

0:31:47 > 0:31:53the opt-out system in Wales. They have the same role as in the opt-in

0:31:53 > 0:32:00system in England. The family can overrule their relative's wishes. I

0:32:00 > 0:32:06think the language you used in your introduction is part of the issue.

0:32:06 > 0:32:13You need to change that around, really. To make it much more

0:32:13 > 0:32:18about your decision. You make a decision in life and make sure

0:32:18 > 0:32:22people know what that decision is and you make it clear about that and

then it is much more difficult for families to go against that. The truth is,

0:32:28 > 0:32:33families are not overriding people's wishes and decisions for the sake of

0:32:33 > 0:32:39being awkward. They do not want to make things

0:32:39 > 0:32:44unpleasant.A period of intense grief.Absolutely. We are asking

0:32:44 > 0:32:48families to make decisions at probably the most emotional point in

0:32:48 > 0:32:52their life. When you make a decision when you are emotional, as we all

0:32:52 > 0:32:57know, you do not always make good decisions. Our research in Wales

shows that if you go back to families who said no to organ

0:33:01 > 0:33:08donation, a few months down the line, many regret the

0:33:08 > 0:33:13decision.It takes the possibility of regret away?You make

0:33:13 > 0:33:17it clear it is your decision as an individual, you convey your decision

0:33:17 > 0:33:23to your loved ones so they know what it is because in that moment of

0:33:23 > 0:33:27intense emotion, they are already bereaved, unexpectedly bereaved,

0:33:27 > 0:33:31normally, because that is what organ donors tend to be. You are asking

0:33:31 > 0:33:38them to make this decision. Imagine a scenario whereby you have died,

0:33:38 > 0:33:42your family are there, incredibly upset, somebody comes in and says,

0:33:42 > 0:33:46they were on the organ donor register, you say, hang on, they

0:33:46 > 0:33:52didn't tell me that, I am their wife, I know everything about them,

0:33:52 > 0:33:56why did they not discuss that with me? I don't understand. They go

0:33:56 > 0:34:01through the process of trying to understand. Was this really

0:34:01 > 0:34:05important to them? Was it just something they did when they got a

0:34:05 > 0:34:08driving licence?Presumed consent does not show you have made a

0:34:08 > 0:34:13decision.Well, it doesn't, and the key thing about all of this debate

0:34:13 > 0:34:20is that what the law change has done is given us a platform to have all

0:34:20 > 0:34:24of these discussions, including about what the family's role is, so

0:34:24 > 0:34:29we can put it out there that you have an opportunity to discuss your

0:34:29 > 0:34:33decision in life and then your relatives will know what to do.I

0:34:33 > 0:34:40cannot wait to hear from the audience in a moment. A lot of that

0:34:40 > 0:34:45will have resonated with you, tell us what happened to your son?My

0:34:45 > 0:34:52son, Conner, he was 18 when he was attacked, murdered. Three years ago.

0:34:52 > 0:34:58As a result of his injuries being so severe, organ donation became an

0:34:58 > 0:35:06option. He had decided that he did agree with organ donation at the

0:35:06 > 0:35:12time, he was only 16. He made that decision. We had a conversation,

0:35:12 > 0:35:16albeit brief, we had a conversation about it. Never for one minute

0:35:16 > 0:35:24thinking we would be put into that position. And then we jumped forward

0:35:24 > 0:35:31two years, that awful night became a reality for us as a family and the

0:35:31 > 0:35:36specialist donor nurse came to speak to us, after lots of discussions

0:35:36 > 0:35:42with the medical team, police, lots of other individuals, to explain

0:35:42 > 0:35:48that Conner would possibly be in a position to become a donor and that

0:35:48 > 0:35:53is when the conversation started. I must say, I had a conversation with

0:35:53 > 0:35:58Conner but his dad had not. Initially, the reception we gave the

0:35:58 > 0:36:04specialist donor nurse was quite hostile, frosty.Really?Because

0:36:04 > 0:36:09from me as a mum, Conner had been through such a horrific ordeal, I

0:36:09 > 0:36:14did not want any more pain for him. I did not want him to be put through

0:36:14 > 0:36:21anything more painful. So with the specialist donor nurse, we talked

0:36:21 > 0:36:25through lots and lots of scenarios, questions, some of them might have

0:36:25 > 0:36:30been really small and insignificant, one of the biggest concerns for me

0:36:30 > 0:36:35was, if we were to continue with this journey, that Conner was never

0:36:35 > 0:36:40on his own, and that was a really big point for me and we had to

0:36:40 > 0:36:49discuss it with her and never at any point did we feel under pressure to

0:36:49 > 0:36:54consent, we never felt under pressure to continue, if we had made

0:36:54 > 0:36:59a decision that we felt we needed to pull back, then that was OK, that

0:36:59 > 0:37:06was an option for us. But we made the decision, Conner made the

0:37:06 > 0:37:10decision, we just facilitated it. He believed life goes on, his mantra in

0:37:10 > 0:37:15life.Did that in a sense keep you going?That kept us very focused.He

0:37:15 > 0:37:22had the tattoo on his arm. What did it say?Life goes on. His tattoo. It

0:37:22 > 0:37:29was really poignant. But for us, that kept us believing we were doing

0:37:29 > 0:37:36the right thing. That is what it was, for Conner, it was the right

0:37:36 > 0:37:41decision. Emotionally, it possibly wasn't for us at the time, but it

0:37:41 > 0:37:43was about what Conner wanted and that is what we did.

0:37:43 > 0:37:48APPLAUSE

0:37:54 > 0:38:07And lives were saved?Yes, three. APPLAUSE

0:38:07 > 0:38:11Knowing that, that amazing thing, three lives were saved, what is that

0:38:11 > 0:38:18like?It is awesome, to use Conner's words. I remember a brief

0:38:18 > 0:38:23conversation we had at the time and he said, in his own naive

0:38:23 > 0:38:29matter-of-fact way, who wouldn't want a bit of this, mum? That was

0:38:29 > 0:38:37his humour, his belief. To say I am happy, it is the wrong expression.

0:38:37 > 0:38:42But I am comfortable and I am immensely proud that Conner made

0:38:42 > 0:38:48that decision.Can I ask, if you had not had a conversation with him

0:38:48 > 0:38:52beforehand, what would your decision have been on that night?It would

0:38:52 > 0:39:00have been to continue to go through the system. Conner was so strong

0:39:00 > 0:39:03willed, stubborn, knowing his personality, it was a difficult

0:39:03 > 0:39:08decision, by no means was it easy, but together, as a family, we would

0:39:08 > 0:39:14have carried on through the journey. Thoughts from the audience. Wow.

0:39:14 > 0:39:20What would you like to say? I would be interested as well, if there is

0:39:20 > 0:39:29anyone here who would opt-out of the register.As a medical student, I

0:39:29 > 0:39:35had the recent privilege of seeing a kidney transplant and just seeing

0:39:35 > 0:39:39this small grey kidney going pink before my eyes and start working,

0:39:39 > 0:39:44this is astonishing, amazing. I think donation is a wonderful thing

0:39:44 > 0:39:52and the Catholic Church encourage it as well. I think that is a great

thing. We are here to discuss how we should increase the number of

0:39:58 > 0:40:03donors, I think most people in the audience...Presumed consent, what

0:40:03 > 0:40:08are your thoughts specifically on that?An opt-out system? One, it has no

0:40:08 > 0:40:15clear evidence that it will increase donors. In

0:40:15 > 0:40:22some cases, it has not increased donors. It is morally dubious and it

0:40:22 > 0:40:25is based upon this notion of presumed consent which is

0:40:25 > 0:40:31nonsensical. How can you say I have said yes to something when I haven't?We

0:40:31 > 0:40:36have a man who knows about the evidence, Professor Roy Thomas. You

0:40:36 > 0:40:40have studied this, looked at the evidence, what is it?We started

0:40:40 > 0:40:45looking at the evidence in 2007 in Wales and that is why we have the

0:40:45 > 0:40:50law now and you have two connecting groups. Connectivity being

0:40:50 > 0:40:56important. Conner, the family, and the donors. The recipients, and the

0:40:56 > 0:41:00donors working together. It is important to realise because we have

0:41:00 > 0:41:0510,000 people waiting in the UK. They call it the invisible death

0:41:05 > 0:41:11row. The invisibility is because they do not have advocacy. Brave

0:41:11 > 0:41:18families, like Conner's case, they are important, they are only 1%.

0:41:18 > 0:41:22When they say there is no moral or ethical guidance here, there is.

0:41:22 > 0:41:27There is a moral base in Wales for this, it is not about the law, that

0:41:27 > 0:41:32is important. The third group, the group that now can opt-out. That is

important as well. They can opt-out, perhaps some people argue they are

0:41:40 > 0:41:45opting out of humanity, maybe, that is an arguing point in moral

0:41:45 > 0:41:52ethics, but the key thing is we look at it and discuss it in the round.

0:41:52 > 0:41:57The ethical debate is important, as we are

0:41:57 > 0:42:01having today, but most people want this law now. Two thirds of the

0:42:01 > 0:42:07people want this law because it saves lives and there are 500 people

0:42:07 > 0:42:12who died last year, five buses went over the cliffs of Dover, and that

0:42:12 > 0:42:21is a big deal, it used to be one bus in Wales.You

0:42:21 > 0:42:26put your hand up, gentleman with the glasses, who

0:42:26 > 0:42:32would opt-out?I would opt-out. Organ donation is a very moral

0:42:32 > 0:42:37thing. If you want to be moral and ethical, do not force your morality

on me, that is unethical. It is quite perverse that, for

0:42:45 > 0:42:48all women and children born after this law passes, the government will

0:42:48 > 0:42:54assume the rights to harvest their organs like a demented keeper, a

0:42:54 > 0:42:57disturbing precedent, not the role of the state. Be moral, donate. But

0:42:57 > 0:43:05do not force it on me.Hand up beside you.I agree. My heart goes

0:43:05 > 0:43:10out to Conner and his mother. It is not some nightmare...Body

0:43:10 > 0:43:17snatchers?We are talking about lives here, lives that can be saved

0:43:17 > 0:43:22by that person making the choice, be it for personal, spiritual or

0:43:22 > 0:43:28religious reasons, they want to opt-out, OK, but the assumption is

0:43:28 > 0:43:34taken, however, that we can save human lives with the organs that

0:43:34 > 0:43:39you have agreed to give up, should, God forbid, the worst happen to you

0:43:39 > 0:43:43or a member of your family. A very important point for those who do not

0:43:43 > 0:43:47agree with that, if it was one of your loved ones that desperately

0:43:47 > 0:43:54needed the organ, how would you feel then?Anyone else who would opt-out?

0:43:54 > 0:44:01I just want to know...I am kind of on the fence, brought up in the

0:44:01 > 0:44:07Jewish faith we would not give donations at that point in our life,

0:44:07 > 0:44:13we can give live donations, it is quite a contentious... Sorry?The

0:44:13 > 0:44:16Jewish leaders do not agree with that. They looked deeper, they

0:44:16 > 0:44:22would...The Jewish faith is quite interesting insofar as there are

0:44:22 > 0:44:27many beliefs and many different interpretations, so what I am trying

0:44:27 > 0:44:31to say is I have been brought up in a belief we would not donate,

0:44:31 > 0:44:38however, I have been living... I'm finding this very emotive. My

0:44:38 > 0:44:4411-year-old son is likely to need a kidney transplant multiple times in

0:44:44 > 0:44:48his life, currently going through potential bladder reconstruction as

well, so for me to sit here and say I would expect myself or someone to be

0:44:52 > 0:45:00asked to donate to him but on death not to...Makes you more likely to

0:45:00 > 0:45:06give than receive? Important point. You have to look at this from so

0:45:06 > 0:45:09many different angles. But I think my inclination would be I would have

0:45:09 > 0:45:13to say that I would not opt-out although I do not believe in the

0:45:13 > 0:45:16system being proposed, I believe that if you are going to come in

0:45:16 > 0:45:20with a system like that, the transition state from how we

0:45:20 > 0:45:25currently are to where you are planning to go towards, it has to be

0:45:25 > 0:45:29managed very carefully, you have to know exactly what it is all about

0:45:29 > 0:45:41and just to presume whether it is at 16, 18, whatever age, the minute

0:45:41 > 0:45:44your birthday comes, does that mean you have to sign on the dotted line

0:45:44 > 0:45:46to make sure God forbid you are not caught in the two-year window?

0:45:46 > 0:45:51Fascinating points. I will be right with you. So interesting. As you

0:45:51 > 0:45:55say, so moving too, thinking about those situations. Sadly, heart

0:45:55 > 0:46:00health campaigner, critical heart condition, at any moment, you could

need a heart transplant, if a viable heart was found and the family

0:46:06 > 0:46:12decided ultimately not to consent and save your life, what would

0:46:12 > 0:46:17that be like for you?

0:46:17 > 0:46:21Of course, I would be appalled, but I want to approach it a different

0:46:21 > 0:46:26way. As a heart patient, everybody says to me, surely you must be in

0:46:26 > 0:46:33favour of the opt out system? I am not in favour at all because it is

0:46:33 > 0:46:35the word, presumed, that really worries me.

0:46:35 > 0:46:42I have spent the year talking with lots of decision makers who are

0:46:42 > 0:46:48invested in this, charities, the NHS, the team at Papworth Hospital,

0:46:48 > 0:46:52and my conclusion is something that everybody I am sure would agree

0:46:52 > 0:46:57with, the most important part of the process is having a conversation

0:46:57 > 0:47:01with your loved ones. Going for presumed consent can take away the

0:47:01 > 0:47:07need for the conversation. How to make sure we have those

0:47:07 > 0:47:09conversations? I want a mandatory decision making

0:47:09 > 0:47:17process. People have their own thoughts and feelings and emotions

0:47:17 > 0:47:22about this. This isn't about bullying anybody into making the

0:47:22 > 0:47:27decision. What this is about is saying, you need to make a decision.

0:47:27 > 0:47:35I have three children. When my eldest son turned 18, he could vote

0:47:35 > 0:47:40for the first time so we had a rite of passage conversation, who will

0:47:40 > 0:47:45you vote for? He could drive, so we discussed never drinking and

0:47:45 > 0:47:50driving. As part of that rite of passage, I think the majority of

0:47:50 > 0:47:54families in this country will sit down together and say, you are 18

0:47:54 > 0:47:59now, you have to make a decision, let us discuss this, this is a rite

0:47:59 > 0:48:05of passage for you. You can tick yes or no but you have to make a

0:48:05 > 0:48:10decision and that forces the conversation, encourages the

conversation in the family. The heart transplant surgeons I have

0:48:12 > 0:48:21spoken to have said they have a care for the patient but also how to care

0:48:21 > 0:48:29for the family. They will never ruin another life to save

0:48:29 > 0:48:35a life. They will never go against wishes because they are humans

0:48:35 > 0:48:38themselves. They don't go into that profession to cause damage and upset

0:48:38 > 0:48:43to others. I do not believe presumed consent is

0:48:43 > 0:48:49the way to go. We have to make a mandatory decision and then educate

0:48:49 > 0:48:57from a very early age. You don't approve of the opt out

0:48:57 > 0:49:04system? Everyone here, we all believe in

0:49:04 > 0:49:08organ donation, we all want to increase organ donation and see more

0:49:08 > 0:49:16transplants. My criticism is it doesn't work. The evidence from the

0:49:16 > 0:49:24Welsh Government report which was still ambivalent, there has been no

0:49:24 > 0:49:28change in the rate. There may be a change in awareness and consent

0:49:28 > 0:49:35rates have gone up but it has not happened in the hard numbers. I am a

0:49:35 > 0:49:40firm believer in organ donation. If you are happy to receive a

0:49:40 > 0:49:43transplant you should be happy to be a donor. Any other position is

0:49:43 > 0:49:53hypocritical. There are different things we can do.That is

0:49:53 > 0:49:59interesting. Am I right in saying you think there should be a

0:49:59 > 0:50:06prioritisation for those who have agreed?I do. There are different

systems in the world. Certainly, Israel has introduced a system

0:50:10 > 0:50:15of reciprocity.

0:50:15 > 0:50:22There are priority points, if you are waiting for a

0:50:22 > 0:50:27heart or liver, your medical need overrides everything. Kidney

0:50:27 > 0:50:36transplantation, where we do have dialysis, which is not as good...The NHS

0:50:36 > 0:50:44should not be about the choices people make in life.

0:50:46 > 0:50:55And the NHS versus private? The figures are really important.

0:50:55 > 0:51:01This is fact: if we knew the answer to whether or not opt-out works, we would

0:51:01 > 0:51:04have done it years ago. We don't have strong statistical evidence to

0:51:04 > 0:51:12prove it. We have got associations but not strong evidence. We may get

0:51:12 > 0:51:17it soon because of what has happened in Wales. But the numbers are small

0:51:17 > 0:51:24and the fluctuations are great. This law change is about consent change,

0:51:24 > 0:51:27a different way of consenting so the only thing that is relevant in terms

0:51:27 > 0:51:35of success is has the consent rate gone up. Year on year since the

0:51:35 > 0:51:43opt out was introduced in Wales, the rates have gone up.On

the statistics and arguments? In Belgium in the mid-1980s they

0:51:48 > 0:51:55introduced opt out. They saw a 50% increase in organ donation rates in

0:51:55 > 0:52:02five years, a clear statistic there for everyone to see. In the opt out

0:52:02 > 0:52:06countries, Spain, Denmark, Norway, they have a better organ donation

0:52:06 > 0:52:14rate than we have in the UK. It is important to look at that. Germany

0:52:14 > 0:52:20hasn't, and is at the other end of the scale. Facts show presumed consent

0:52:20 > 0:52:25works and studies at Harvard, Chicago, they showed these facts.

0:52:25 > 0:52:32Presumed consent works.Spain is usually cited as the perfect example

0:52:32 > 0:52:39of presumed consent.Spain don't have an opt out register.But

0:52:39 > 0:52:54Belgium does. Wales has not shown the same.It is too early. We need to take the right lessons from

0:52:54 > 0:53:01Spain. The Spanish have said it has nothing to do with presumed consent

0:53:01 > 0:53:05but investment, education, raising awareness.

0:53:05 > 0:53:10APPLAUSE

0:53:13 > 0:53:17We have done all those things, we haven't had the consent

0:53:17 > 0:53:23rate increase so it is time to do something else in addition.

0:53:23 > 0:53:32I would like to say, I want to thank my donor's family every day, really,

0:53:32 > 0:53:35so much. I had a liver transplant just over a

0:53:35 > 0:53:44year ago. I was one of the lucky ones with how quickly it happened.

0:53:44 > 0:53:52From feeling just tired and being told it was a virus, to going yellow

0:53:52 > 0:53:59and having a life-saving transplant two weeks later. I got my

0:53:59 > 0:54:06gift. Having said I don't agree with opt out, there are a variety of points,

0:54:06 > 0:54:12one is consent, the biggest one is family. At the moment, even if you

0:54:12 > 0:54:17haven't signed up to the register your family can sign up for you.You

0:54:17 > 0:54:25say, my gift. Is it an important concept that it was a gift?You

0:54:25 > 0:54:29can't even get close to what it feels like when you wake up from a

0:54:29 > 0:54:38transplant. APPLAUSE

0:54:39 > 0:54:46I was 48 hours away, I was on a super urgent list where your life is

0:54:46 > 0:54:54in danger. I was at peace with the fact it

0:54:54 > 0:55:04might not happen for me. When I got woken up, I could breathe for

0:55:04 > 0:55:08myself, my transplant coordinator grabbed my hand and said, we have

0:55:08 > 0:55:16found you a liver and it was the most incredible feeling. There was

0:55:16 > 0:55:26hope, it was amazing. Next thing, I was woken up and it is like a

0:55:26 > 0:55:30euphoria, you can't comprehend, there are no words, when someone

0:55:30 > 0:55:36else saves your life. It is one thing getting better, battling cancer, but

0:55:36 > 0:55:41when someone else saves your life for you, it changes everything. I

0:55:41 > 0:55:46like to think I speak for all people who have had transplants, we don't

0:55:46 > 0:55:50go back to the lives we had before, we are conscious of the

0:55:50 > 0:55:57responsibility we have to the person who helped keep us alive, that we try

0:55:57 > 0:56:07and live bigger and better and richer lives. Organ donations are incredibly important but it will

0:56:07 > 0:56:14never be as simple as opt in or opt out. Probably one year to the date

0:56:14 > 0:56:20my donor died, my best friend's mum died in the same situation and I saw

0:56:20 > 0:56:23it from the other side. In that moment I realised it is not clean

0:56:23 > 0:56:29cut. Not only are you so vulnerable, in agony with grief, but she had

0:56:29 > 0:56:37this huge hope, even if there is no hope. You still think that maybe

0:56:37 > 0:56:44they have got it wrong. That is a huge thing. It is not simple. It has

0:56:44 > 0:56:47to be a conversation, a change in culture.

0:56:47 > 0:56:56On the point that it is not as simple as opt in and opt out. In Wales we

0:56:56 > 0:57:01have devised a resource pack due to go out to bring about a conversation

0:57:01 > 0:57:07with young adults around the subject of organ donation. We give lots of

0:57:07 > 0:57:14advice at school, mental health advice, sexual health advice but no

0:57:14 > 0:57:19one likes to talk about the other side, the dark side of life. In this

0:57:19 > 0:57:25pack, we have got a conversation going within groups that will answer

0:57:25 > 0:57:31those questions. Giving those young people the knowledge it is OK not to

0:57:31 > 0:57:35agree. There is no right or wrong answer but furnishing them with the

0:57:35 > 0:57:43information of the process. What are the steps? If they feel they need to

0:57:43 > 0:57:49know those. We spent a long time asking lots of questions and that is

0:57:49 > 0:57:56what the resource pack is for, to give information and to use my son's

0:57:56 > 0:58:01case as a living case. It does matter, these early conversations

0:58:01 > 0:58:08can make a difference. The keyword you used, living.

0:58:08 > 0:58:11That is it. Because three people are alive because of your son.

0:58:11 > 0:58:22Yes, living. It is really important we talk to

0:58:22 > 0:58:28the young people it affects, don't talk to those already buying into it

0:58:28 > 0:58:34or on the donor list, but the young people who need to sign up and don't

0:58:34 > 0:58:38presume their consent, let them make their decision.

0:58:38 > 0:58:40You are a professional, thank you!

0:58:40 > 0:58:42As always, the debates will continue online and on Twitter.

0:58:42 > 0:58:45Next week we're in Edinburgh, so do join us then.

0:58:45 > 0:58:47But for now, it's goodbye from Bath and have a great Sunday.

0:58:47 > 0:58:50APPLAUSE.