Line | From | To | |
---|---|---|---|
Today on The Big Questions... | 0:00:03 | 0:00:06 | |
Social media. | 0:00:06 | 0:00:08 | |
Can you have the good
without the bad? | 0:00:08 | 0:00:10 | |
And Max's Law. | 0:00:10 | 0:00:12 | |
Should you have to opt out of being
on the organ donor register? | 0:00:12 | 0:00:19 | |
APPLAUSE | 0:00:25 | 0:00:30 | |
Good morning, I'm Nicky Campbell,
welcome to The Big Questions. | 0:00:30 | 0:00:32 | |
Today, we're live from
Bath Spa University. | 0:00:32 | 0:00:34 | |
Welcome, everybody,
to The Big Questions. | 0:00:34 | 0:00:36 | |
APPLAUSE | 0:00:36 | 0:00:43 | |
Social media. | 0:00:43 | 0:00:45 | |
The online world of Twitter,
Facebook, Snapchat and many other | 0:00:45 | 0:00:48 | |
sites, where ideas and comments can
be posted and pictures shared, | 0:00:48 | 0:00:52 | |
is probably the biggest change
to have affected our daily lives | 0:00:52 | 0:00:58 | |
since the advent of television
or the mobile phone. | 0:00:58 | 0:01:00 | |
People across the globe can share
what is happening to them | 0:01:00 | 0:01:02 | |
with friends and complete strangers
in an instant. | 0:01:02 | 0:01:04 | |
Politicians, businesses,
entertainers, artists and conmen | 0:01:04 | 0:01:06 | |
all have an easy way
to peddle their ideas | 0:01:06 | 0:01:08 | |
and wares direct to you,
24 hours a day, wherever you are. | 0:01:08 | 0:01:14 | |
The snag is there is no editorial
control, there are no real systems | 0:01:14 | 0:01:17 | |
to filter out the fake news
or the scams. | 0:01:17 | 0:01:23 | |
Indeed, fake news is often used
to direct the unwary | 0:01:23 | 0:01:26 | |
viewer to the scams. | 0:01:26 | 0:01:27 | |
And while it can bring
people closer together, | 0:01:27 | 0:01:29 | |
it can also be highly divisive,
pitting groups against each other | 0:01:29 | 0:01:32 | |
and unleashing storms of abuse
on hapless individuals. | 0:01:32 | 0:01:34 | |
Is social media beyond control? | 0:01:34 | 0:01:42 | |
Laura, welcome to The Big Questions,
PhD researcher in social media. Out | 0:01:43 | 0:01:49 | |
of control, isn't that the point,
the wonderful thing about social | 0:01:49 | 0:01:54 | |
media? I am with you, social media
is a wonderful thing, but it has got | 0:01:54 | 0:01:58 | |
to the point where it is well beyond
control. In the last general | 0:01:58 | 0:02:03 | |
election, we were exposed to the amount
of abuse social media can bring, | 0:02:03 | 0:02:09 | |
MPs spoke about the amount of abuse
they experienced, and that exposed | 0:02:09 | 0:02:16 | |
that actually social media can be a
threat to democracy. I have had the | 0:02:16 | 0:02:21 | |
pleasure of interviewing politicians
and candidates and I remember a | 0:02:21 | 0:02:24 | |
candidate speaking to me openly
about the abuse she received and she | 0:02:24 | 0:02:28 | |
turned around and said, if I had
children, I would not put myself in | 0:02:28 | 0:02:33 | |
this position, and that is a threat
to our democracy, we are segregating | 0:02:33 | 0:02:38 | |
certain people from going forward in
our political system. Our question, | 0:02:38 | 0:02:43 | |
is it beyond control, can it be
controlled, should it be controlled? | 0:02:43 | 0:02:53 | |
The other side, a great force for
democratisation, it has given people | 0:02:53 | 0:02:55 | |
a voice, a fantastic thing, don't we
have to accept it, uncomfortable as | 0:02:55 | 0:02:58 | |
it is, because of the good? It does
bring good and campaigns have been | 0:02:58 | 0:03:04 | |
won solely online, The Everyday
Sexism Project, brilliant, but there | 0:03:04 | 0:03:12 | |
have to be restrictions. I am all for
freedom of expression. But that is | 0:03:12 | 0:03:18 | |
not absolute. The establishment is
rattled. It has rattled their cage. | 0:03:18 | 0:03:23 | |
If you start talking about
restrictions, we smell a rat. Does | 0:03:23 | 0:03:28 | |
everybody smell a rat? OK. Let us
not move onto the next debate quite | 0:03:28 | 0:03:38 | |
yet. Ben? A threat to democracy, but
there is a real problem here. There | 0:03:38 | 0:03:43 | |
was a case in India last July, seven
guys beaten to death by a mob | 0:03:43 | 0:03:48 | |
because of a fake story on WhatsApp
that they were child abductors. | 0:03:48 | 0:03:53 | |
Same thing happened in the US, the
story Hillary Clinton was running a | 0:03:53 | 0:03:58 | |
paedophile ring from the basement of
a pizzeria. An American guy went in | 0:03:58 | 0:04:06 | |
with an assault rifle and started
firing. What happens online does not | 0:04:06 | 0:04:11 | |
stay online and it can have real
life and real death consequences. Is | 0:04:11 | 0:04:17 | |
it a new thing? An amplified thing
now, of that there is no doubt. But | 0:04:17 | 0:04:22 | |
what about online abuse, Emily?
People who do not want to go into | 0:04:22 | 0:04:26 | |
politics because of the vile stuff they
are receiving, surely something has to be | 0:04:26 | 0:04:31 | |
done? Nothing has to be done. People
use social media for good, the | 0:04:31 | 0:04:37 | |
overwhelming majority. As is the
case in public life, there is a flip | 0:04:37 | 0:04:41 | |
side, there will always be people
who are nasty and abusive, but I | 0:04:41 | 0:04:45 | |
would say no regulation. With any
regulation, who is the person who | 0:04:45 | 0:04:54 | |
decides what is the truth, the line
of what people can and cannot say? | 0:04:54 | 0:04:59 | |
That worries me, deeply concerning,
that there will be gatekeepers deciding | 0:04:59 | 0:05:05 | |
what the right opinion is. People get
abused, deeply unpleasant. But it is | 0:05:05 | 0:05:10 | |
a small drop in the massive ocean of
good social media does in connecting | 0:05:10 | 0:05:16 | |
people. Obviously, abuse happens on
the street, but if you look at the | 0:05:16 | 0:05:20 | |
type of abuse going on on social
media, I would argue very little of | 0:05:20 | 0:05:24 | |
that happens on the street. Could
you imagine, for example, Jess | 0:05:24 | 0:05:30 | |
Phillips, an MP in Birmingham, she
spoke out about receiving 600 | 0:05:30 | 0:05:34 | |
threats of rape in one night alone
on Twitter. Could you imagine | 0:05:34 | 0:05:38 | |
someone stood there in the street
while someone screamed threats of | 0:05:38 | 0:05:42 | |
rape at you? Death threats as well.
Have you had any? I have. I ran a | 0:05:42 | 0:05:49 | |
press freedom campaign and I
received thousands in the night. | 0:05:49 | 0:05:54 | |
Death threats? Death threats, rape,
misogynistic abuse. The difference | 0:05:54 | 0:05:59 | |
between that and on the street, for
example, I could turn off Twitter | 0:05:59 | 0:06:03 | |
and it was quite a powerful feeling,
thinking that, actually, this stuff | 0:06:03 | 0:06:09 | |
is really unpleasant and no one is
saying it is nice. I turned it off, | 0:06:09 | 0:06:14 | |
my phone, and it stopped. Because of
the dominance of social media, you | 0:06:14 | 0:06:18 | |
are saying, the best way to overcome
abuse is switching it off, well, is | 0:06:18 | 0:06:24 | |
that actually controlling it,
switching it off? If we look at the | 0:06:24 | 0:06:29 | |
last general election, social media
dominated. For MPs, they campaigned a | 0:06:29 | 0:06:37 | |
lot on social media. If you said, | 0:06:37 | 0:06:41 | |
you do not want to receive the
threats online, switch off social | 0:06:41 | 0:06:44 | |
media, would those MPs be in
Parliament now? That is not the only | 0:06:44 | 0:06:49 | |
option. If you are the victim of
abuse, you do have ultimate control, | 0:06:49 | 0:06:54 | |
you can turn it off, mute people,
ignore them, it is not a tangible | 0:06:54 | 0:06:59 | |
threat. It is not pleasant and no
one reading through the tweets or | 0:06:59 | 0:07:04 | |
messages is thinking, this is great
fun, but you are in control, it is | 0:07:04 | 0:07:09 | |
not a real threat to your life. To
the audience, what would you like to | 0:07:09 | 0:07:15 | |
say? We are assuming the mainstream
media is not out of control. MSM. | 0:07:15 | 0:07:24 | |
They are owned by the corrupt elite,
brainwashing us for years. We have | 0:07:24 | 0:07:30 | |
got regulators... Anything on
mainstream news, we assume it is | 0:07:30 | 0:07:35 | |
real. I would much rather go to
YouTube and look at Russell Brand to | 0:07:35 | 0:07:40 | |
find out the truth behind something
rather than immediately believing | 0:07:40 | 0:07:44 | |
the newspaper. Taking some of the
responsibility, like the lady said, | 0:07:44 | 0:07:50 | |
social media is about the people
and taking control and learning to | 0:07:50 | 0:07:56 | |
be... Learning how to recognise if
something is real or fake, we should | 0:07:56 | 0:07:59 | |
be teaching it in schools, and
teaching about media, so that we can | 0:07:59 | 0:08:05 | |
look at something, is that the
truth? Russell, is that a good | 0:08:05 | 0:08:10 | |
point, or a load of baloney? I
look at where he has come from, his | 0:08:10 | 0:08:16 | |
history, background, belief systems,
his intention. And it is your belief | 0:08:16 | 0:08:21 | |
system? It might be, but it might
challenge mine. You and Russell are | 0:08:21 | 0:08:26 | |
in a bubble. My views have changed
massively since I was a young | 0:08:26 | 0:08:31 | |
Socialist Worker, and now I am in my
40s, totally different beliefs, I | 0:08:31 | 0:08:36 | |
have children, things change. I
would love my children to be | 0:08:36 | 0:08:41 | |
inquisitive, even when it comes to
TV adverts, not much difference | 0:08:41 | 0:08:45 | |
between TV adverts and Instagram.
The mainstream media, people do not | 0:08:45 | 0:08:50 | |
trust the mainstream media anymore,
Ben? There is a conspiracy theory | 0:08:50 | 0:08:54 | |
they are peddling the same story.
Try to find the Telegraph and the | 0:08:54 | 0:09:00 | |
Guardian agreeing on anything. They
have hundreds of years of tradition | 0:09:00 | 0:09:04 | |
and there is a system in place, fact
checkers. A couple of times they | 0:09:04 | 0:09:09 | |
have run stories I have done and it
has taken time to get through | 0:09:09 | 0:09:13 | |
because they are checking every
link. There is a process. There is a | 0:09:13 | 0:09:18 | |
fact checking process, correction
process. This is dangerous. This is | 0:09:18 | 0:09:25 | |
dangerous? Can I introduce you?
Presenter of RT, formerly Russia | 0:09:25 | 0:09:34 | |
Today, Afshin Rattansi. | 0:09:34 | 0:09:42 | |
The reason there is fertile | 0:09:42 | 0:09:45 | |
ground for the fake news is because
the public have lost faith in the | 0:09:45 | 0:09:48 | |
media. It is finished. That is why
RT is doing well, maybe even Donald | 0:09:48 | 0:09:57 | |
Trump is doing well. The reason why
is that these publications, whether on | 0:09:57 | 0:10:02 | |
the Iraq war, Afghanistan war,
Libyan war, issues of war and | 0:10:02 | 0:10:09 | |
peace... Was it a problem with the
BBC? The BBC just about got closed | 0:10:09 | 0:10:15 | |
down. They fired me, they fired the
Director General. We could see we | 0:10:15 | 0:10:22 | |
were being told by the Government to
persuade the people into war. The | 0:10:22 | 0:10:30 | |
Guardian, the Observer, they
supported the same thing. | 0:10:30 | 0:10:34 | |
Interesting, but on the point... A
lot comes back to Russia, how much | 0:10:34 | 0:10:39 | |
of a problem, a danger even, are RT?
In terms of their viewing, very | 0:10:39 | 0:10:46 | |
small, the figures, 0.4% of
viewing time, but in terms of the | 0:10:46 | 0:10:54 | |
impact, there are second and third
order effects, and what | 0:10:54 | 0:10:56 | |
we have seen is a full-scale attempt
to undermine the mainstream media. | 0:10:56 | 0:11:04 | |
And on... The public response to us.
So does Ofcom. Ofcom has had more | 0:11:04 | 0:11:14 | |
complaints against the BBC than RT.
When it gets down to media, | 0:11:14 | 0:11:20 | |
partiality, overall, you are right,
Ofcom have lots of... They deal with | 0:11:20 | 0:11:27 | |
nudity, inappropriate language.
Violations of journalistic | 0:11:27 | 0:11:30 | |
standards, the obligation for
due accuracy and impartiality, | 0:11:30 | 0:11:35 | |
RT has had more programmes found
guilty of violating those standards. | 0:11:35 | 0:11:41 | |
I would say... I worked | 0:11:41 | 0:11:47 | |
there. I have a TV show. No one has
told me what to do. But your chief | 0:11:47 | 0:11:54 | |
editor refers to RT as the
information weapon. In an interview | 0:11:54 | 0:11:59 | |
in 2012... It is a weapon for the
poor and dispossessed. We interview | 0:11:59 | 0:12:04 | |
the worker in the factory, not the
CEO. An information weapon, the words | 0:12:04 | 0:12:16 | |
she adopted. In an interview in 2008,
she said, we are fighting a war | 0:12:16 | 0:12:20 | |
against the entire Western world. I think | 0:12:20 | 0:12:30 | |
those comments were taken completely
out of... Let us take it back to | 0:12:30 | 0:12:34 | |
social media, what are the Russians
doing with bots and trolls? Bots are | 0:12:34 | 0:12:42 | |
automated accounts. What we saw from
Russia was an outfit called the | 0:12:42 | 0:12:45 | |
Internet Research Agency in St
Petersburg running 3,500 troll | 0:12:45 | 0:12:50 | |
Twitter accounts, fake Facebook
accounts, masquerading as Americans | 0:12:50 | 0:12:54 | |
on both sides of the political
divide, a lot of the content was | 0:12:54 | 0:13:01 | |
pro-Trump, pro-guns,
anti-migrant, white supremacist. A | 0:13:01 | 0:13:04 | |
lot of the content was Black Lives
Matter, saying white supremacists | 0:13:04 | 0:13:11 | |
are evil, I would not necessarily
disagree with that, but pushing both | 0:13:11 | 0:13:14 | |
sides. In May 2016, the troll
factory in St Petersburg organised two | 0:13:14 | 0:13:20 | |
simultaneous rallies in Houston, one
protesting against the opening of an | 0:13:20 | 0:13:25 | |
Islamic cultural centre and one in
favour. The Soviets used to do this, | 0:13:25 | 0:13:28 | |
active measures. Propaganda. They
spread the rumour that HIV was | 0:13:28 | 0:13:34 | |
created by the CIA. They kicked off
the stuff about the Kennedy | 0:13:34 | 0:13:38 | |
conspiracy. What is the problem? The
same as it ever was, the scale of | 0:13:38 | 0:13:43 | |
it? The scale and the directness.
There were cases from Robert | 0:13:43 | 0:13:48 | |
Mueller's indictment, the troll
factory was interacting with real | 0:13:48 | 0:13:53 | |
Americans through social media and
organising campaign rallies, the | 0:13:53 | 0:13:57 | |
troll factory paid people to turn
up, dressed as Hillary Clinton in | 0:13:57 | 0:14:01 | |
jail. You had direct contact between
Russian agents in St Petersburg and | 0:14:01 | 0:14:05 | |
Americans on the ground, not a case
of collusion, people being fooled. | 0:14:05 | 0:14:10 | |
This is dangerous. Will Moy. So
interesting how much the world has | 0:14:10 | 0:14:17 | |
changed since we started Full Fact.
When we started, you could list | 0:14:17 | 0:14:22 | |
every outlet in the country, that
day is over now. It means all of us | 0:14:22 | 0:14:30 | |
can tell people what we think and
what we know about the world, we can | 0:14:30 | 0:14:34 | |
share those ideas, it is exciting,
but it is much harder for all of us | 0:14:34 | 0:14:38 | |
because things are coming to you
from 1000 different places and you | 0:14:38 | 0:14:41 | |
do not know the track record, you
have to do more work to figure it | 0:14:41 | 0:14:46 | |
out. Every link being fact
checked, as Ben said, I'm sorry, | 0:14:46 | 0:14:55 | |
that is not possible. Some are
clinging onto enough money to do | 0:14:55 | 0:14:59 | |
that, but most are publishing stuff
too quickly to make that possible. | 0:14:59 | 0:15:05 | |
That is what Full Fact does. We
publish links to every review so | 0:15:05 | 0:15:09 | |
that you can judge it for yourself
and this is where we will have to | 0:15:09 | 0:15:13 | |
end up. If you want someone to trust
you, it will not be enough to say, I | 0:15:13 | 0:15:17 | |
am so-and-so, take my word. It will
have to be, I can show you where you | 0:15:17 | 0:15:22 | |
got it from, you can judge it. Are
people interested in checking it | 0:15:22 | 0:15:26 | |
out? It is not critical thinking, it
is wishful thinking. In a sense. You | 0:15:26 | 0:15:32 | |
see stuff you want to think is true
and you disregard the rest. | 0:15:32 | 0:15:42 | |
We will all do that, of course.
On the BBC, we get both sides of it. | 0:15:42 | 0:15:52 | |
We have always tended to look for
things we agree with, when we choose | 0:15:52 | 0:15:57 | |
what newspaper to buy, we used to
choose because we liked its views. | 0:15:57 | 0:16:03 | |
I buy things I disagree with.
But most of us are human, we look | 0:16:03 | 0:16:07 | |
for things that we agree with. That
is not a new challenge. If we | 0:16:07 | 0:16:13 | |
look at the Internet, it is about
all of us having conversations with | 0:16:13 | 0:16:19 | |
each other, but there are parts of
it from a news point of view which | 0:16:19 | 0:16:26 | |
aren't just all of us having
conversations with ourselves. When | 0:16:26 | 0:16:30 | |
it comes to people paying money in
secret to target adverts to | 0:16:30 | 0:16:35 | |
particular parts of the population,
you don't know who, that is new and | 0:16:35 | 0:16:41 | |
an exercise of power and it is
reasonable to ask who is doing it | 0:16:41 | 0:16:44 | |
and it should be transparent.
Tell us the outlets you trust. | 0:16:44 | 0:16:54 | |
First, Steve, if it is beyond
control, is there any hope of doing | 0:16:54 | 0:16:59 | |
anything to rein in not just the
abusers but also the fake news | 0:16:59 | 0:17:04 | |
purveyors?
We need to design an architecture | 0:17:04 | 0:17:11 | |
that is more conducive to
high-quality information rather than | 0:17:11 | 0:17:14 | |
fake news. I share the same concerns
about censorship, determining what | 0:17:14 | 0:17:21 | |
is fake news.
We have to understand how people respond | 0:17:21 | 0:17:24 | |
to information, how Facebook and
Twitter can be used, and the idea of | 0:17:24 | 0:17:33 | |
targeting specific individuals based
on their personality is a real risk | 0:17:33 | 0:17:36 | |
to democracy. The reason is that it
is happening in private. No one else | 0:17:36 | 0:17:42 | |
knows what this person has received
in terms of information. That is | 0:17:42 | 0:17:47 | |
qualitatively different from the way
things used to be where parties | 0:17:47 | 0:17:51 | |
would put up billboards next to the
road and we could all see them and | 0:17:51 | 0:17:54 | |
knew what the message was. We now
have the technology and it isn't the | 0:17:54 | 0:18:01 | |
Russians, but right here in the UK.
They are designing custom | 0:18:01 | 0:18:08 | |
messages shown to people on
Facebook. We don't know to what | 0:18:08 | 0:18:12 | |
extent they are customised.
Is the British Government doing it | 0:18:12 | 0:18:16 | |
in the same way albeit on a smaller
scale? | 0:18:16 | 0:18:20 | |
I am not talking about governments.
Do we have a mirror image to what | 0:18:20 | 0:18:24 | |
Vladimir Putin is doing?
I am not an expert. | 0:18:24 | 0:18:33 | |
And what was
that allegation... The man who has | 0:18:33 | 0:18:36 | |
the billion-dollar contract... Can I
say one other element which is | 0:18:36 | 0:18:44 | |
related, this started with four
companies, Snapchat, Twitter... | 0:18:44 | 0:18:48 | |
There is censorship going on. It is
not as free as you make out. The | 0:18:48 | 0:18:57 | |
idea of fake news as a price to pay
for a free Internet, these companies | 0:18:57 | 0:19:02 | |
are censoring Google search terms,
lots of people saying the articles | 0:19:02 | 0:19:06 | |
that are not the mainstream, the
Atlantic Council, the pro-NATO | 0:19:06 | 0:19:13 | |
organisations, they are given
preferential treatment. In the case | 0:19:13 | 0:19:16 | |
of harassment everyone here would
say these companies have to do more. | 0:19:16 | 0:19:20 | |
They are not democratically...
Steve, finish your point. The | 0:19:20 | 0:19:28 | |
algorithms thing, you go on Amazon,
you buy a book, it tells you the | 0:19:28 | 0:19:35 | |
same set of books you should buy. It
doesn't give you the psychological | 0:19:35 | 0:19:40 | |
impetus to break away.
That is right but it is an | 0:19:40 | 0:19:46 | |
opportunity, we have the technology
to change algorithms and make | 0:19:46 | 0:19:50 | |
suggestions to people on Amazon
that take them outside their | 0:19:50 | 0:19:57 | |
comfort zone. I like this, and I
wish I had the money to buy them. | 0:19:57 | 0:20:05 | |
But for society it would be better
for Amazon to tell me a book I would | 0:20:05 | 0:20:12 | |
not like.
Exactly right. That is the avenue we | 0:20:12 | 0:20:15 | |
should pursue, to think about clever
ways to broaden people's access to | 0:20:15 | 0:20:20 | |
information without censorship.
Editorial control, and the situation | 0:20:20 | 0:20:25 | |
in Germany as well.
What would you like to say? Who do | 0:20:25 | 0:20:31 | |
we trust? Regarding fake news, it is
a subjective thing, and what is the | 0:20:31 | 0:20:37 | |
real definition? CNN? It is a slur
which has no objective meaning. As | 0:20:37 | 0:20:47 | |
it is used it only means a news
source I personally disagree with. | 0:20:47 | 0:20:50 | |
Is it not used to...
But if you check the facts behind | 0:20:50 | 0:20:56 | |
the story? | 0:20:56 | 0:21:01 | |
I was trying to get you some
business! | 0:21:02 | 0:21:05 | |
Thank you. Fake news is now a term
used to abuse journalists when they | 0:21:05 | 0:21:12 | |
hold politicians to account which is
a bad thing. Does it have a precise | 0:21:12 | 0:21:17 | |
definition? We need to recognise
there are lots of separate problems. | 0:21:17 | 0:21:21 | |
It started when people noticed there
were teenagers in places like | 0:21:21 | 0:21:26 | |
Macedonia publishing made up stories
to get advertising. That was one | 0:21:26 | 0:21:31 | |
kind of fake news. Another kind is
where you take real data about the | 0:21:31 | 0:21:35 | |
economy, you distort it for your
political campaign. If you put both | 0:21:35 | 0:21:40 | |
of those in one bucket we will never
solve anything. One of them is part | 0:21:40 | 0:21:45 | |
of political debate.
You can check a fact, the side of a | 0:21:45 | 0:21:51 | |
bus, £350 million for the NHS, that
is not fake news but a claim you can | 0:21:51 | 0:21:56 | |
check. Absolutely. | 0:21:56 | 0:22:08 | |
We can check the
legal basis, all of that for you. | 0:22:08 | 0:22:20 | |
I don't know what your proposal is
for establishing a procedure to | 0:22:20 | 0:22:26 | |
investigate whether a story is true.
Journalism is a qualitative process, | 0:22:26 | 0:22:32 | |
to obtain information. You don't
have hard metrics for establishing | 0:22:32 | 0:22:37 | |
whether it is true. You are
talking about other people's | 0:22:37 | 0:22:48 | |
journalistic procedures, which come
down to the same procedures | 0:22:48 | 0:22:52 | |
for checking whether something is
true. There is rarely a | 0:22:52 | 0:22:59 | |
fundamentally absolutely objective
truth underlying any story and it is | 0:22:59 | 0:23:06 | |
not obvious that is something that
can be obtained simply, quickly, | 0:23:06 | 0:23:10 | |
efficiently by a public or private
agency who happens to have some | 0:23:10 | 0:23:14 | |
greater measure of the truth. With
Lord McAlpine when he was labelled, | 0:23:14 | 0:23:21 | |
there was an objective truth
underneath that which led to the | 0:23:21 | 0:23:24 | |
court room. Steve?
There is a real risk by saying, no | 0:23:24 | 0:23:34 | |
one knows what the truth is, we are
throwing up our hands and saying | 0:23:34 | 0:23:41 | |
there is too much information,
something which has been happening | 0:23:41 | 0:23:46 | |
on social media, and the culprit is
making that claim in what I call a | 0:23:46 | 0:23:52 | |
shock and chaos approach to fake
news. At the moment we give up on | 0:23:52 | 0:23:56 | |
this idea that there are certain
things we can check like the | 0:23:56 | 0:24:01 | |
falsehood about the £350 million.
And it was false. But the point is, | 0:24:01 | 0:24:12 | |
there are certain things that are
false and other certain things that | 0:24:12 | 0:24:15 | |
are true. The
inauguration crowd for Trump. We | 0:24:15 | 0:24:25 | |
have an innate ability to critically
look at things for us to decide for | 0:24:25 | 0:24:28 | |
ourselves what is true. Do we have
that critical ability? It is the | 0:24:28 | 0:24:37 | |
same thing we accuse young jihadists
of when they are radicalised. Maybe we are | 0:24:37 | 0:24:43 | |
all losing the ability. Some people
still stand by the 350 figure, as a | 0:24:43 | 0:24:51 | |
gross figure, saying that is about
the amount of money we can put in. | 0:24:51 | 0:24:58 | |
Let us point out lots of things are
true and false, lots more important | 0:24:58 | 0:25:04 | |
things we don't know. A large part
of what we are doing is to say this | 0:25:04 | 0:25:09 | |
is as much as we do know and don't
know, we put the shades of grey back | 0:25:09 | 0:25:13 | |
in. I want to come to you. For me,
with fake news, a lot of times it | 0:25:13 | 0:25:24 | |
will come up on your social media
profile and you will read the | 0:25:24 | 0:25:27 | |
headline, how many go on to click?
As soon as you click you might | 0:25:27 | 0:25:33 | |
realise it is fake news. They will
read the headline, I know people who | 0:25:33 | 0:25:39 | |
then actively go on to share it and
believe that is true. | 0:25:39 | 0:25:43 | |
Facebook made a profit of £39
billion. You might say that is | 0:25:43 | 0:25:52 | |
terrible. But these adverts, they
are making money, some money. For | 0:25:52 | 0:26:02 | |
me, social networking companies
should be held to account. We | 0:26:02 | 0:26:06 | |
mentioned education and the
Government have put forward in their | 0:26:06 | 0:26:11 | |
green paper proposals for compulsory
education within schools with regard | 0:26:11 | 0:26:14 | |
to social media. I believe that is
one way forward in that we should be | 0:26:14 | 0:26:20 | |
educating. We have a whole
generation that has been brought up | 0:26:20 | 0:26:24 | |
with social media. They don't know a
life without social media. Shouldn't | 0:26:24 | 0:26:28 | |
we be educating them on social
media? | 0:26:28 | 0:26:34 | |
How do you counteract the business
model? There is a big paradox which | 0:26:34 | 0:26:43 | |
relies on the fakery?
That is right, this is where you | 0:26:43 | 0:26:48 | |
have the bleeding in from social to
traditional media, so much that goes | 0:26:48 | 0:26:53 | |
on is emotional targeting. Someone
will put out a scare headline to get | 0:26:53 | 0:26:59 | |
people afraid and they will click on
it and then realise the story is a | 0:26:59 | 0:27:05 | |
wind-up. Sometimes it is not.
Sometimes it is true. Look at the | 0:27:05 | 0:27:11 | |
tabloids, they have been doing this
for decades, there is emotional | 0:27:11 | 0:27:15 | |
targeting. Every reason why we
shouldn't panic? | 0:27:15 | 0:27:19 | |
Yes, lots of things are true. We are
not machines, the world is really | 0:27:19 | 0:27:29 | |
new now, and historically every time
new communications technology has | 0:27:29 | 0:27:32 | |
come up people have panicked and
tried to limit it. When it comes to | 0:27:32 | 0:27:39 | |
the app -- the opportunity tabloids
we need to be careful. And the | 0:27:39 | 0:27:44 | |
difference between what the tabloids
used to do and Facebook now, one of | 0:27:44 | 0:27:50 | |
the really important things we know
from cognitive science about why | 0:27:50 | 0:27:53 | |
people stick to their beliefs is
because they think they are widely | 0:27:53 | 0:27:57 | |
shared. With social media we have an
opportunity for anyone, no matter | 0:27:57 | 0:28:03 | |
how absurd their belief, that they
can find a community, like-minded | 0:28:03 | 0:28:08 | |
people on the Internet, and think
their belief is widely shared. | 0:28:08 | 0:28:12 | |
People out there seriously think the
earth is flat. You can go on to | 0:28:12 | 0:28:16 | |
Facebook and have a community of a
thousand people around the world. We | 0:28:16 | 0:28:22 | |
are doing that debate next week!
So you will be resistant to changing | 0:28:22 | 0:28:28 | |
your belief. 300 years ago...
The last word? Google is one of the | 0:28:28 | 0:28:36 | |
most powerful companies and is
already censoring it, it's not free | 0:28:36 | 0:28:41 | |
at the moment. From WikiLeaks, we
know Google works with Hillary | 0:28:41 | 0:28:48 | |
Clinton, you mentioned four
corporations, this is far away from the | 0:28:48 | 0:28:51 | |
dream of the Internet, that it should be
free. These are multi-billion dollar | 0:28:51 | 0:28:57 | |
companies censoring already.
We need regulation to stop it? Like | 0:28:57 | 0:29:03 | |
we nationalised the telephone
company here, or the post office, we | 0:29:03 | 0:29:08 | |
can nationalise some of these so
they can come under democratic | 0:29:08 | 0:29:13 | |
accountability. Broadcast
organisations in the pocket of the | 0:29:13 | 0:29:17 | |
Government? Not in the pocket of the
Government. Democratically | 0:29:17 | 0:29:24 | |
accountable. Thank you very much
indeed. | 0:29:24 | 0:29:29 | |
Thank you for your thoughts on that
debate. | 0:29:29 | 0:29:33 | |
You can join in all this
morning's debates by logging | 0:29:33 | 0:29:37 | |
on to bbc.co.uk/thebigquestions,
and following the link | 0:29:37 | 0:29:39 | |
to the online discussion. | 0:29:39 | 0:29:40 | |
Or you can tweet using
the hashtag #bbctbq. | 0:29:40 | 0:29:44 | |
Tell us what you think
about our last Big Question too. | 0:29:44 | 0:29:47 | |
Should we presume consent for organ
donations in England? | 0:29:47 | 0:29:49 | |
And if you'd like to apply
to be in the audience | 0:29:49 | 0:29:52 | |
at a future show, you
can email [email protected]. | 0:29:52 | 0:29:55 | |
We're in Edinburgh next week,
then Newport in South Wales | 0:29:55 | 0:29:57 | |
on March 11th, and Brighton
the week after that. | 0:29:57 | 0:30:00 | |
Friday saw a successful Second
Reading of Geoffrey Robinson's bill | 0:30:07 | 0:30:10 | |
to change the basis of the organ
donor register to one | 0:30:10 | 0:30:16 | |
of presumed consent in England. | 0:30:16 | 0:30:17 | |
The Government is going to go
ahead with the idea. | 0:30:17 | 0:30:20 | |
Last autumn, Theresa May
indicated her strong support | 0:30:20 | 0:30:22 | |
of an opt-out system in a letter
to Max Johnson, a nine year old boy | 0:30:22 | 0:30:25 | |
who had received a heart transplant. | 0:30:25 | 0:30:27 | |
She said it would be
known as Max's Law. | 0:30:27 | 0:30:30 | |
But there are still several hurdles
to get across first, | 0:30:30 | 0:30:34 | |
not least whether a potential
donor's family should have a right | 0:30:34 | 0:30:37 | |
of veto when the final
decision must be made. | 0:30:37 | 0:30:43 | |
Think about that one. | 0:30:43 | 0:30:45 | |
Wales, just down the road
from here in Bath, changed | 0:30:45 | 0:30:48 | |
to an opt-out system two years ago. | 0:30:48 | 0:30:49 | |
It's fair to say the results
so far are mixed. | 0:30:49 | 0:30:52 | |
Should we presume consent for organ
donations in England? | 0:30:52 | 0:30:58 | |
Scotland is still waiting on this
one and trying to make up their | 0:30:58 | 0:31:01 | |
minds. Here is the thing, consultant
transplant surgeon, welcome, Mike | 0:31:01 | 0:31:11 | |
Stephens, I have signed up to the
register. As an adult, I expressed | 0:31:11 | 0:31:16 | |
wishes, as an autonomous human
being, but my wife or my daughters, | 0:31:16 | 0:31:21 | |
my mother, they could stop that
happening. How can that be right? | 0:31:21 | 0:31:27 | |
You cannot think of that happening
in any other sphere. Well, if you | 0:31:27 | 0:31:34 | |
phrase it like that, it is not
right, but I think you need to look | 0:31:34 | 0:31:38 | |
at this in the context of the whole
discussion. It is interesting we | 0:31:38 | 0:31:42 | |
start the discussion here because
the family's role is unchanged in | 0:31:42 | 0:31:47 | |
the opt-out system in Wales. They
have the same role as in the opt-in | 0:31:47 | 0:31:53 | |
system in England. The family can
overrule their relative's wishes. I | 0:31:53 | 0:32:00 | |
think the language used in your
introduction is part of the issue. | 0:32:00 | 0:32:06 | |
You need to change that
around, really. To make it much more | 0:32:06 | 0:32:13 | |
about your decision. You make a
decision in life and make sure | 0:32:13 | 0:32:18 | |
people know what that decision is
and you make it clear about that and | 0:32:18 | 0:32:22 | |
then it is much more difficult for
families to go against that. The truth is, | 0:32:22 | 0:32:28 | |
families are not overriding people's
wishes and decisions for the sake of | 0:32:28 | 0:32:33 | |
being awkward. They
do not want to make things | 0:32:33 | 0:32:39 | |
unpleasant. A period of intense
grief. Absolutely. We are asking | 0:32:39 | 0:32:44 | |
families to make decisions at
probably the most emotional point in | 0:32:44 | 0:32:48 | |
their life. When you make a decision
when you are emotional, as we all | 0:32:48 | 0:32:52 | |
know, you do not always make good
decisions. Our research in Wales | 0:32:52 | 0:32:57 | |
shows that if you go back to
families who said no to organ | 0:32:57 | 0:33:01 | |
donation, a few months down the
line, many regret the decision. | 0:33:01 | 0:33:08 | |
It takes the
possibility of regret away? You make | 0:33:08 | 0:33:13 | |
it clear it is your decision as an
individual, you convey your decision | 0:33:13 | 0:33:17 | |
to your loved ones so they know what
it is because in that moment of | 0:33:17 | 0:33:23 | |
intense emotion, they are already
grieving, unexpectedly bereaved, | 0:33:23 | 0:33:27 | |
normally, because that is what organ
donors tend to be. You are asking | 0:33:27 | 0:33:31 | |
them to make this decision. Imagine
a scenario whereby you have died, | 0:33:31 | 0:33:38 | |
your family are there, incredibly
upset, somebody comes in and says, | 0:33:38 | 0:33:42 | |
they were on the organ donor
register, you say, hang on, they | 0:33:42 | 0:33:46 | |
didn't tell me that, I am their
wife, I know everything about them, | 0:33:46 | 0:33:52 | |
why did they not discuss that with
me? I don't understand. They go | 0:33:52 | 0:33:56 | |
through the process of trying to
understand. Was this really | 0:33:56 | 0:34:01 | |
important to them? Was it just
something they did when they got a | 0:34:01 | 0:34:05 | |
driving licence? Presumed consent
does not show you have made a | 0:34:05 | 0:34:08 | |
decision. Well, it doesn't, and the
key thing about all of this debate | 0:34:08 | 0:34:13 | |
is that what the law change has done
is given us a platform to have all | 0:34:13 | 0:34:20 | |
of these discussions, including
about what the family's role is, so | 0:34:20 | 0:34:24 | |
we can put it out there that you
have an opportunity to discuss your | 0:34:24 | 0:34:29 | |
decision in life and then your
relatives will know what to do. I | 0:34:29 | 0:34:33 | |
cannot wait to hear from the
audience in a moment. A lot of that | 0:34:33 | 0:34:40 | |
will have resonated with you, tell
us what happened to your son? My | 0:34:40 | 0:34:45 | |
son, Conner, he was 18 when he was
attacked, murdered. Three years ago. | 0:34:45 | 0:34:52 | |
As a result of his injuries being so
severe, organ donation became an | 0:34:52 | 0:34:58 | |
option. He had decided that he did
agree with organ donation at the | 0:34:58 | 0:35:06 | |
time, he was only 16. He made that
decision. We had a conversation, | 0:35:06 | 0:35:12 | |
albeit brief, we had a conversation
about it. Never for one minute | 0:35:12 | 0:35:16 | |
thinking we would be put into that
position. And then we jumped forward | 0:35:16 | 0:35:24 | |
two years, that awful night became a
reality for us as a family and the | 0:35:24 | 0:35:31 | |
specialist donor nurse came to speak
to us, after lots of discussions | 0:35:31 | 0:35:36 | |
with the medical team, police, lots
of other individuals, to explain | 0:35:36 | 0:35:42 | |
that Conner would possibly be in a
position to become a donor and that | 0:35:42 | 0:35:48 | |
is when the conversation started. I
must say, I had a conversation with | 0:35:48 | 0:35:53 | |
Conner but his dad had not.
Initially, the reception we gave the | 0:35:53 | 0:35:58 | |
specialist donor nurse was quite
hostile, frosty. Really? Because | 0:35:58 | 0:36:04 | |
from me as a mum, Conner had been
through such a horrific ordeal, I | 0:36:04 | 0:36:09 | |
did not want any more pain for him.
I did not want him to be put through | 0:36:09 | 0:36:14 | |
anything more painful. So with the
specialist donor nurse, we talked | 0:36:14 | 0:36:21 | |
through lots and lots of scenarios,
questions, some of them might have | 0:36:21 | 0:36:25 | |
been really small and insignificant,
one of the biggest concerns for me | 0:36:25 | 0:36:30 | |
was, if we were to continue with
this journey, that Conner was never | 0:36:30 | 0:36:35 | |
on his own, and that was a really
big point for me and we had to | 0:36:35 | 0:36:40 | |
discuss it with | 0:36:40 | 0:36:46 | |
her and never at any
point did we feel under pressure to | 0:36:46 | 0:36:49 | |
consent, we never felt under
pressure to continue, if we had made | 0:36:49 | 0:36:54 | |
a decision that we felt we needed to
pull back, then that was OK, that | 0:36:54 | 0:36:59 | |
was an option for us. But we made
the decision, Conner made the | 0:36:59 | 0:37:06 | |
decision, we just facilitated it. He
believed life goes on, his mantra in | 0:37:06 | 0:37:10 | |
life. Did that in a sense keep you
going? That kept us very focused. He | 0:37:10 | 0:37:15 | |
had the tattoo on his arm. What did
it say? Life goes on. His tattoo. It | 0:37:15 | 0:37:22 | |
was really poignant. But for us,
that kept us believing we were doing | 0:37:22 | 0:37:29 | |
the right thing. That is what it
was, for Conner, it was the right | 0:37:29 | 0:37:36 | |
decision. Emotionally, it possibly
wasn't for us at the time, but it | 0:37:36 | 0:37:41 | |
was about what Conner wanted and
that is what we did. | 0:37:41 | 0:37:43 | |
APPLAUSE | 0:37:43 | 0:37:48 | |
And lives were saved? Yes, three.
APPLAUSE | 0:37:54 | 0:38:07 | |
Knowing that, that amazing thing,
three lives were saved, what is that | 0:38:07 | 0:38:11 | |
like? It is awesome, to use Conner's
words. I remember a brief | 0:38:11 | 0:38:18 | |
conversation we had at the time and
he said, in his own naive | 0:38:18 | 0:38:23 | |
matter-of-fact way, who wouldn't
want a bit of this, mum? That was | 0:38:23 | 0:38:29 | |
his humour, his belief. To say I am
happy, it is the wrong expression. | 0:38:29 | 0:38:37 | |
But I am comfortable and I am
immensely proud that Conner made | 0:38:37 | 0:38:42 | |
that decision. Can I ask, if you had
not had a conversation with him | 0:38:42 | 0:38:48 | |
beforehand, what would your decision
have been on that night? It would | 0:38:48 | 0:38:52 | |
have been to continue to go through
the system. Conner was so strong | 0:38:52 | 0:39:00 | |
willed, stubborn, knowing his
personality, it was a difficult | 0:39:00 | 0:39:03 | |
decision, by no means was it easy,
but together, as a family, we would | 0:39:03 | 0:39:08 | |
have carried on through the journey.
Thoughts from the audience. Wow. | 0:39:08 | 0:39:14 | |
What would you like to say? I would
be interested as well, if there is | 0:39:14 | 0:39:20 | |
anyone here who would opt out of the
register. As a medical student, I | 0:39:20 | 0:39:29 | |
had the recent privilege of seeing a
kidney transplant and just seeing | 0:39:29 | 0:39:35 | |
this small grey kidney going pink
before my eyes and start working, | 0:39:35 | 0:39:39 | |
this is astonishing, amazing. I
think donation is a wonderful thing | 0:39:39 | 0:39:44 | |
and the Catholic Church encourage it
as well. I think that is a great | 0:39:44 | 0:39:52 | |
thing. We are here to discuss how we
should increase the number of | 0:39:52 | 0:39:58 | |
donors, I think most people in the
audience... Presumed consent, what | 0:39:58 | 0:40:03 | |
are your thoughts specifically on
that? An opt-out system, one, has no | 0:40:03 | 0:40:08 | |
clear evidence that it will increase
donors. In | 0:40:08 | 0:40:15 | |
some cases, it has not increased
donors. It is morally dubious and it | 0:40:15 | 0:40:22 | |
is based upon this notion of
presumed consent which is | 0:40:22 | 0:40:25 | |
nonsensical. How can you say I have
said yes to something, I haven't? We | 0:40:25 | 0:40:31 | |
have a man who knows about the
evidence, Professor Roy Thomas. You | 0:40:31 | 0:40:36 | |
have studied this, looked at the
evidence, what is it? We started | 0:40:36 | 0:40:40 | |
looking at the evidence in 2007 in
Wales and that is why we have the | 0:40:40 | 0:40:45 | |
law now and you have two connecting
groups. Connectivity being | 0:40:45 | 0:40:50 | |
important. Conner, the family, and
the donors. The recipients, and the | 0:40:50 | 0:40:56 | |
donors working together. It is
important to realise because we have | 0:40:56 | 0:41:00 | |
10,000 people waiting in the UK.
They call it the invisible death | 0:41:00 | 0:41:05 | |
row. The invisibility is because they
do not have advocacy. Brave | 0:41:05 | 0:41:11 | |
families, like Conner's case, they
are important, they are only 1%. | 0:41:11 | 0:41:18 | |
When they say there is no moral
ethical guidance here, there is. | 0:41:18 | 0:41:22 | |
There is a moral base in Wales for
this, it is not about the law, that | 0:41:22 | 0:41:27 | |
is important. The third group, the
group that now can opt out. That is | 0:41:27 | 0:41:32 | |
important as well. They can opt out,
perhaps some people argue they are | 0:41:32 | 0:41:40 | |
opting out of humanity, maybe, that
is an argument point in moral | 0:41:40 | 0:41:45 | |
ethics, but the key thing is we look
at and discuss it in the round. | 0:41:45 | 0:41:52 | |
The ethical
debate is important, as we are | 0:41:52 | 0:41:57 | |
having today, but most people want
this law now. Two thirds of the | 0:41:57 | 0:42:01 | |
people want this law because it
saves lives and there are 500 people | 0:42:01 | 0:42:07 | |
who died last year, five buses went
over the cliffs of Dover, and that | 0:42:07 | 0:42:12 | |
is a big deal, it used to be one bus
in Wales. | 0:42:12 | 0:42:21 | |
You put your hand up,
gentleman with the glasses, who | 0:42:21 | 0:42:26 | |
would opt out? I would opt out.
Organ donation is a very moral | 0:42:26 | 0:42:32 | |
thing. If you want to be moral and
ethical, do not force your morality | 0:42:32 | 0:42:37 | |
on me, that is unethical. It is
quite perverse that we assume, for | 0:42:37 | 0:42:45 | |
all women and children born after
this law passes, the government will | 0:42:45 | 0:42:48 | |
assume the right to harvest their
organs like a demented keeper, a | 0:42:48 | 0:42:54 | |
disturbing precedent, not the role
of the state. Be moral, donate. But | 0:42:54 | 0:42:57 | |
do not force it on me. Hand up
beside you. I agree. My heart goes | 0:42:57 | 0:43:05 | |
out to Conner and his mother. It is
not some nightmare... Body | 0:43:05 | 0:43:10 | |
snatchers? We are talking about
lives here, lives that can be saved | 0:43:10 | 0:43:17 | |
by that person making the choice, be
it for personal, spiritual or | 0:43:17 | 0:43:22 | |
religious reasons, they want to
opt out, OK, but the assumption is | 0:43:22 | 0:43:28 | |
taken however, that we can save
human lives with their organs that | 0:43:28 | 0:43:34 | |
you have agreed to give up, should,
God forbid, the worst happen to you | 0:43:34 | 0:43:39 | |
or a member of your family. A very
important point for those who do not | 0:43:39 | 0:43:43 | |
agree with that, if it was one of
your loved ones that desperately | 0:43:43 | 0:43:47 | |
needed the organ, how would you feel
then? Anyone else who would opt out? | 0:43:47 | 0:43:54 | |
I just want to know... I am kind of
on the fence, brought up in the | 0:43:54 | 0:44:01 | |
Jewish faith we would not give
donations at that point in our life, | 0:44:01 | 0:44:07 | |
we can give live donations, it is
quite a contentious... Sorry? The | 0:44:07 | 0:44:13 | |
Jewish leaders do not agree with
that. They looked deeper, they | 0:44:13 | 0:44:16 | |
would... The Jewish faith is quite
interesting insofar as there are | 0:44:16 | 0:44:22 | |
many beliefs and many different
interpretations, so what I am trying | 0:44:22 | 0:44:27 | |
to say is I have been brought up in
a belief we would not donate, | 0:44:27 | 0:44:31 | |
however, I have been living... I'm
finding this very emotive. My | 0:44:31 | 0:44:38 | |
11-year-old son is likely to need a
kidney transplant multiple times in | 0:44:38 | 0:44:44 | |
his life, currently going through
potential bladder reconstruction as | 0:44:44 | 0:44:48 | |
well, so for me to sit here and say
I would expect myself or someone | 0:44:48 | 0:44:52 | |
asked to donate to him but on death
not to... Makes you more likely to | 0:44:52 | 0:45:00 | |
give than receive? Important point.
You have to look at this from so | 0:45:00 | 0:45:06 | |
many different angles. But I think
my inclination would be I would have | 0:45:06 | 0:45:09 | |
to say that I would not opt out
although I do not believe in the | 0:45:09 | 0:45:13 | |
system being proposed, I believe
that if you are going to come in | 0:45:13 | 0:45:16 | |
with a system like that, the
transition state from how we | 0:45:16 | 0:45:20 | |
currently are to where you are
planning to go towards, it has to be | 0:45:20 | 0:45:25 | |
managed very carefully, you have to
know exactly what it is all about | 0:45:25 | 0:45:29 | |
and just to presume, whether it is at | 0:45:29 | 0:45:39 | |
16, 18, whatever age, the minute | 0:45:39 | 0:45:41 | |
your birthday comes, does that mean
you have to sign on the dotted line | 0:45:41 | 0:45:44 | |
to make sure God forbid you are not
caught in the two-year window? | 0:45:44 | 0:45:46 | |
Fascinating points. I will be right
with you. So interesting. As you | 0:45:46 | 0:45:51 | |
say, so moving too, thinking about
those situations. Sadly, heart | 0:45:51 | 0:45:55 | |
health campaigner, critical heart
condition, at any moment, you could | 0:45:55 | 0:46:00 | |
need a heart transplant, if a viable
heart was found and the family | 0:46:00 | 0:46:06 | |
decided ultimately not to consent
and save your life, what would | 0:46:06 | 0:46:12 | |
that be like for you? | 0:46:12 | 0:46:17 | |
Of course I would be appalled, but
I want to approach it a different | 0:46:17 | 0:46:21 | |
way. As a heart patient, everybody
says to me, surely you must be in | 0:46:21 | 0:46:26 | |
favour of the opt out system? I am
not in favour at all because it is | 0:46:26 | 0:46:33 | |
the word, presumed, that really
worries me. | 0:46:33 | 0:46:35 | |
I have spent the year talking with
lots of decision makers who are | 0:46:35 | 0:46:42 | |
invested in this, charities, the
NHS, the team at Papworth Hospital, | 0:46:42 | 0:46:48 | |
and my conclusion is something that
everybody I am sure would agree | 0:46:48 | 0:46:52 | |
with, the most important part of the
process is having a conversation | 0:46:52 | 0:46:57 | |
with your loved ones. Going for
presumed consent can take away the | 0:46:57 | 0:47:01 | |
need for the conversation.
How do we make sure we have those | 0:47:01 | 0:47:07 | |
conversations?
I want a mandatory decision making | 0:47:07 | 0:47:09 | |
process. People have their own
thoughts and feelings and emotions | 0:47:09 | 0:47:17 | |
about this. This isn't about
bullying anybody into making the | 0:47:17 | 0:47:22 | |
decision. What this is about is
saying, you need to make a decision. | 0:47:22 | 0:47:27 | |
I have three children. When my
eldest son turned 18, he could vote | 0:47:27 | 0:47:35 | |
for the first time so we had a rite
of passage conversation, who will | 0:47:35 | 0:47:40 | |
you vote for? He could drive, so we
discussed never drinking and | 0:47:40 | 0:47:45 | |
driving. As part of that rite of
passage, I think the majority of | 0:47:45 | 0:47:50 | |
families in this country will sit
down together and say, you are 18 | 0:47:50 | 0:47:54 | |
now, you have to make a decision,
let us discuss this, this is a rite | 0:47:54 | 0:47:59 | |
of passage for you. You can tick yes
or no but you have to make a | 0:47:59 | 0:48:05 | |
decision and that forces the
conversation, encourages the | 0:48:05 | 0:48:10 | |
conversation in the family. The
heart transplant surgeons I have | 0:48:10 | 0:48:12 | |
spoken to have said they care
for the patient but also care | 0:48:12 | 0:48:21 | |
for the family. They will never ruin
another life to save | 0:48:21 | 0:48:29 | |
a life. They will never go against
wishes because they are humans | 0:48:29 | 0:48:35 | |
themselves. They don't go into that
profession to cause damage and upset | 0:48:35 | 0:48:38 | |
to others.
I do not believe presumed consent is | 0:48:38 | 0:48:43 | |
the way to go. We have to make a
mandatory decision and then educate | 0:48:43 | 0:48:49 | |
from a very early age.
You don't approve of the opt out | 0:48:49 | 0:48:57 | |
system?
Everyone here, we all believe in | 0:48:57 | 0:49:04 | |
organ donation, we all want to
increase organ donation and see more | 0:49:04 | 0:49:08 | |
transplants. My criticism is it
doesn't work. The evidence from the | 0:49:08 | 0:49:16 | |
Welsh Government report is
still ambivalent, there has been no | 0:49:16 | 0:49:24 | |
change in the rate. There may be a
change in awareness and consent | 0:49:24 | 0:49:28 | |
rates have gone up but the change in hard
numbers has not happened. I am a | 0:49:28 | 0:49:35 | |
firm believer in organ donation. If
you are happy to receive a | 0:49:35 | 0:49:40 | |
transplant you should be happy to be
a donor. Any other position is | 0:49:40 | 0:49:43 | |
hypocritical. There are different
things we can do. That is | 0:49:43 | 0:49:53 | |
interesting. Am I right in saying
you think there should be a | 0:49:53 | 0:49:59 | |
prioritisation for those who have
agreed? I do. There are different | 0:49:59 | 0:50:06 | |
systems in the world. Certainly,
Israel have introduced a system where | 0:50:06 | 0:50:10 | |
they bring in this system of
reciprocity. | 0:50:10 | 0:50:15 | |
There are priority
points, if you are waiting for a | 0:50:15 | 0:50:22 | |
heart or liver, your medical need
overrides everything. Kidney | 0:50:22 | 0:50:27 | |
transplantation where we do have
dialysis which is not as good... NHS | 0:50:27 | 0:50:36 | |
should not be about the choices
people make in life. | 0:50:36 | 0:50:44 | |
And the NHS versus private?
The figures are really important. | 0:50:46 | 0:50:55 | |
This is fact, if we knew the answer
whether or not opt out works, we would | 0:50:55 | 0:51:01 | |
have done it years ago. We don't
have strong statistical evidence to | 0:51:01 | 0:51:04 | |
prove it. We have got associations
but not strong evidence. We may get | 0:51:04 | 0:51:12 | |
it soon because of what has happened
in Wales. But the numbers are small | 0:51:12 | 0:51:17 | |
and the fluctuations are great. This
law change is about consent change, | 0:51:17 | 0:51:24 | |
a different way of consenting so the
only thing that is relevant in terms | 0:51:24 | 0:51:27 | |
of success is: has the consent rate
gone up? Year on year since the | 0:51:27 | 0:51:35 | |
opt out was introduced
in Wales, the rates have gone up. On | 0:51:35 | 0:51:43 | |
the statistics and arguments?
In Belgium in the mid-1980s they | 0:51:43 | 0:51:48 | |
introduced opt out. They saw a 50%
increase in organ donation rates in | 0:51:48 | 0:51:55 | |
five years, a clear statistic there
for everyone to see. In the opt out | 0:51:55 | 0:52:02 | |
countries, Spain, Denmark, Norway,
they have a better organ donation | 0:52:02 | 0:52:06 | |
rate than we have in the UK. It is
important to look at that. Germany | 0:52:06 | 0:52:14 | |
hasn't, at the other end of the
scale. Facts show presumed consent | 0:52:14 | 0:52:20 | |
works and studies at Harvard,
Chicago, they showed these facts. | 0:52:20 | 0:52:25 | |
Presumed consent works. Spain is
usually cited as the perfect example | 0:52:25 | 0:52:32 | |
of presumed consent. Spain don't
have an opt out register. But | 0:52:32 | 0:52:39 | |
Belgium does. Wales has not shown
the same. It is too early. | 0:52:39 | 0:52:52 | |
We need
to take the right lessons from | 0:52:52 | 0:52:54 | |
Spain. The Spanish have said it has
nothing to do with presumed consent | 0:52:54 | 0:53:01 | |
but investment, education, raising
awareness. | 0:53:01 | 0:53:05 | |
APPLAUSE | 0:53:05 | 0:53:10 | |
We have done all those
things, we haven't had the consent | 0:53:13 | 0:53:17 | |
rate increase so it is time to try
something else in addition. | 0:53:17 | 0:53:23 | |
I would like to say, I want to thank
my donor's family every day, really, | 0:53:23 | 0:53:32 | |
so much.
I had a liver transplant just over a | 0:53:32 | 0:53:35 | |
year ago. I was one of the lucky
ones with how quickly it happened. | 0:53:35 | 0:53:44 | |
From feeling just tired and being
told it was a virus, to going yellow | 0:53:44 | 0:53:52 | |
and having a life-saving transplant
two weeks later. I got my | 0:53:52 | 0:53:59 | |
gift. Having said I don't agree with
opt out, there are a variety of points, | 0:53:59 | 0:54:06 | |
one is consent, the biggest one is
family. At the moment, even if you | 0:54:06 | 0:54:12 | |
haven't signed up to the register
your family can sign up for you. You | 0:54:12 | 0:54:17 | |
say, my gift. Is it an important
concept that it was a gift? You | 0:54:17 | 0:54:25 | |
can't even get close to what it
feels like when you wake up from a | 0:54:25 | 0:54:29 | |
transplant.
APPLAUSE | 0:54:29 | 0:54:38 | |
I was 48 hours away, I was on a
super urgent list where your life is | 0:54:39 | 0:54:46 | |
in danger.
I was at peace with the fact it | 0:54:46 | 0:54:54 | |
might not happen for me. | 0:54:54 | 0:55:00 | |
When I got
woken up, I could breathe for | 0:55:00 | 0:55:04 | |
myself, my transplant coordinator
grabbed my hand and said, we have | 0:55:04 | 0:55:08 | |
found you a liver and it was the
most incredible feeling. There was | 0:55:08 | 0:55:16 | |
hope, it was amazing. Next thing, I
was woken up and it is like a | 0:55:16 | 0:55:26 | |
euphoria, you can't comprehend,
there are no words, when someone | 0:55:26 | 0:55:30 | |
else saves your life. It is one thing
getting better, battling cancer, but | 0:55:30 | 0:55:36 | |
when someone else saves your life
for you, it changes everything. I | 0:55:36 | 0:55:41 | |
like to think I speak for all people
who have had transplants, we don't | 0:55:41 | 0:55:46 | |
go back to the lives we had before,
we are conscious of the | 0:55:46 | 0:55:50 | |
responsibility we have to the person
who helped keep us alive, so we try | 0:55:50 | 0:55:57 | |
and live bigger and better and
richer lives. | 0:55:57 | 0:56:04 | |
Organ donations are
incredibly important but it will | 0:56:04 | 0:56:07 | |
never be as simple as opt in or opt
out. Probably one year to the day | 0:56:07 | 0:56:14 | |
my donor died, my best friend's mum
died in the same situation and I saw | 0:56:14 | 0:56:20 | |
it from the other side. In that
moment I realised it is not clean | 0:56:20 | 0:56:23 | |
cut. Not only are you so vulnerable,
in agony with grief, but she had | 0:56:23 | 0:56:29 | |
this huge hope, even if there is no
hope. You still think that maybe | 0:56:29 | 0:56:37 | |
they have got it wrong. That is a
huge thing. It is not simple. It has | 0:56:37 | 0:56:44 | |
to be a conversation, a change in
culture. | 0:56:44 | 0:56:47 | |
On the point about it not being as simple
as opt in and opt out. In Wales we | 0:56:47 | 0:56:56 | |
have devised a resource pack due to
go out to bring about a conversation | 0:56:56 | 0:57:01 | |
with young adults around the subject
of organ donation. We give lots of | 0:57:01 | 0:57:07 | |
advice at school, mental health
advice, sexual health advice but no | 0:57:07 | 0:57:14 | |
one likes to talk about the other
side, the dark side of life. In this | 0:57:14 | 0:57:19 | |
pack, we have got a conversation
going within groups that will answer | 0:57:19 | 0:57:25 | |
those questions. Giving those young
people the knowledge it is OK not to | 0:57:25 | 0:57:31 | |
agree. There is no right or wrong
answer but furnishing them with the | 0:57:31 | 0:57:35 | |
information of the process. What are
the steps? If they feel they need to | 0:57:35 | 0:57:43 | |
know those. We spent a long time
asking lots of questions and that is | 0:57:43 | 0:57:49 | |
what the resource pack is for, to
give information and to use my son's | 0:57:49 | 0:57:56 | |
case as a living case. It does
matter, these early conversations | 0:57:56 | 0:58:01 | |
can make a difference.
The keyword you used, living. | 0:58:01 | 0:58:08 | |
That is it. Because three people are
alive because of your son. | 0:58:08 | 0:58:11 | |
Yes, living.
It is really important we talk to | 0:58:11 | 0:58:22 | |
the young people it affects, don't
talk to those already buying into it | 0:58:22 | 0:58:28 | |
or on the donor list, but the young
people who need to sign up and don't | 0:58:28 | 0:58:34 | |
presume their consent, let them make
their decision. | 0:58:34 | 0:58:38 | |
You are a professional, thank you! | 0:58:38 | 0:58:40 | |
As always, the debates will continue
online and on Twitter. | 0:58:40 | 0:58:42 | |
Next week we're in Edinburgh,
so do join us then. | 0:58:42 | 0:58:45 | |
But for now, it's goodbye from Bath
and have a great Sunday. | 0:58:45 | 0:58:47 | |
APPLAUSE | 0:58:47 | 0:58:50 | |