Episode 8 The Big Questions



Nicky Campbell presents topical debate from Bath Spa University, Bath. Is social media beyond control? Should we presume consent for organ donation in England?



Transcript



Today on The Big Questions...

0:00:030:00:06

Social media.

0:00:060:00:08

Can you have the good

without the bad?

0:00:080:00:10

And Max's Law.

0:00:100:00:12

Should you have to opt out of being

on the organ donor register?

0:00:120:00:19

APPLAUSE

0:00:250:00:30

Good morning, I'm Nicky Campbell,

welcome to The Big Questions.

0:00:300:00:32

Today, we're live from

Bath Spa University.

0:00:320:00:34

Welcome, everybody,

to The Big Questions.

0:00:340:00:36

APPLAUSE

0:00:360:00:43

Social media.

0:00:430:00:45

The online world of Twitter,

Facebook, Snapchat and many other

0:00:450:00:48

sites, where ideas and comments can

be posted and pictures shared,

0:00:480:00:52

is probably the biggest change

to have affected our daily lives

0:00:520:00:58

since the advent of television

or the mobile phone.

0:00:580:01:00

People across the globe can share

what is happening to them

0:01:000:01:02

with friends and complete strangers

in an instant.

0:01:020:01:04

Politicians, businesses,

entertainers, artists and conmen

0:01:040:01:06

all have an easy way

to peddle their ideas

0:01:060:01:08

and wares direct to you,

24 hours a day, wherever you are.

0:01:080:01:14

The snag is there is no editorial

control, there are no real systems

0:01:140:01:17

to filter out the fake news

or the scams.

0:01:170:01:23

Indeed, fake news is often used

to direct the unwary

0:01:230:01:26

viewer to the scams.

0:01:260:01:27

And while it can bring

people closer together,

0:01:270:01:29

it can also be highly divisive,

pitting groups against each other

0:01:290:01:32

and unleashing storms of abuse

on hapless individuals.

0:01:320:01:34

Is social media beyond control?

0:01:340:01:42

Laura, welcome to The Big Questions,

PhD researcher in social media. Out

0:01:430:01:49

of control, isn't that the point,

the wonderful thing about social

0:01:490:01:54

media?

I am with you, social media

is a wonderful thing, but it has got

0:01:540:01:58

to the point where it is well beyond

control. In the last general

0:01:580:02:03

election, we were exposed to the amount
of abuse social media can bring,

0:02:030:02:09

MPs spoke about the amount of abuse

they experienced, and that exposed

0:02:090:02:16

that actually social media can be a

threat to democracy. I have had the

0:02:160:02:21

pleasure of interviewing politicians

and candidates and I remember a

0:02:210:02:24

candidate speaking to me openly

about the abuse she received and she

0:02:240:02:28

turned around and said, if I had

children, I would not put myself in

0:02:280:02:33

this position, and that is a threat

to our democracy, we are segregating

0:02:330:02:38

certain people from going forward in

our political system.

Our question,

0:02:380:02:43

is it beyond control,

0:02:430:02:51

can it be

controlled, should it be controlled?

0:02:520:02:53

The other side, a great force for

democratisation, it has given people

0:02:530:02:55

a voice, a fantastic thing, don't we

have to accept it, uncomfortable as

0:02:550:02:58

it is, because of the good?

It does

bring good and campaigns have been

0:02:580:03:04

won solely online, The Everyday

Sexism Project, brilliant, but there

0:03:040:03:12

have to be restrictions. I am all for

freedom of expression. But that is

0:03:120:03:18

not absolute.

The establishment is

rattled. It has rattled their cage.

0:03:180:03:23

If you start talking about

restrictions, we smell a rat. Does

0:03:230:03:28

everybody smell a rat? OK. Let us

not move onto the next debate quite

0:03:280:03:38

yet.

Ben? A threat to democracy, but

there is a real problem here which

0:03:380:03:43

is a case in India last July, seven

guys beaten to death by a mob

0:03:430:03:48

because of a fake story on WhatsApp,

that they were child abductors.

0:03:480:03:53

Same thing happened in the US, the

story Hillary Clinton was running a

0:03:530:03:58

paedophile ring from the basement of

a pizzeria. An American guy went in

0:03:580:04:06

with an assault rifle and started

firing. What happens online does not

0:04:060:04:11

stay online and it can have real

life and real death consequences.

Is

0:04:110:04:17

it a new thing, an amplified thing
now, of that there is no doubt? But

0:04:170:04:22

what about online abuse, Emily?

People who do not want to go into

0:04:220:04:26

politics because of vile stuff they

are receiving, something has to be

0:04:260:04:31

done?

Nothing has to be done. People

use social media for good, the

0:04:310:04:37

overwhelming majority. As is the

case in public life, there is a flip

0:04:370:04:41

side, there will always be people

who are nasty and abusive, but I

0:04:410:04:45

would say no regulation. Any

0:04:450:04:51

regulation, who is the person who

0:04:510:04:54

decides what is the truth, the line

of what people can and cannot say?

0:04:540:04:59

That worries me, deeply concerning,

that there will be gatekeepers of

0:04:590:05:05

the right opinion. People get

abused, deeply unpleasant. But it is

0:05:050:05:10

a small drop in the massive ocean of

good social media does in connecting

0:05:100:05:16

people.

Obviously, abuse happens on

the street, but if you look at the

0:05:160:05:20

type of abuse going on on social

media, I would argue very little of

0:05:200:05:24

that happens on the street. Could

you imagine, for example, Jess

0:05:240:05:30

Phillips, an MP in Birmingham, she

spoke out about receiving 600

0:05:300:05:34

threats of rape in one night alone

on Twitter. Could you imagine

0:05:340:05:38

someone in the street standing there
while someone screamed threats of

0:05:380:05:42

rape at you?

Death threats as well.

Have you had any?

I have. I ran a

0:05:420:05:49

press freedom campaign and I

received thousands in the night.

0:05:490:05:54

Death threats? Death threats, rape,

misogynistic abuse. The difference

0:05:540:05:59

between that and on the street, for

example, I could turn off Twitter

0:05:590:06:03

and it was quite a powerful feeling,

thinking that, actually, this stuff

0:06:030:06:09

is really unpleasant and no one is

saying it is nice. I turned it off,

0:06:090:06:14

my phone, and it stopped.

Because of

the dominance of social media, you

0:06:140:06:18

are saying, the best way to overcome

abuse is switching it off, well, is

0:06:180:06:24

that actually controlling it,

switching it off? If we look at the

0:06:240:06:29

last general election, social media

dominated. For MPs, they campaigned a

0:06:290:06:37

lot on social media. If you said,

0:06:370:06:41

you do not want to receive the

threats online, switch off social

0:06:410:06:44

media, would those MPs be in

Parliament now?

That is not the only

0:06:440:06:49

option. If you are the victim of

abuse, you do have ultimate control,

0:06:490:06:54

you can turn it off, mute people,

ignore them, it is not a tangible

0:06:540:06:59

threat. It is not pleasant and no

one reading through the tweets or

0:06:590:07:04

messages is thinking, this is great

fun, but you are in control, it is

0:07:040:07:09

not a real threat to your life.

To

the audience, what would you like to

0:07:090:07:15

say?

We are assuming the mainstream

media is not out of control. MSM.

0:07:150:07:24

They are owned by the corrupt elite,

brainwashing us for years.

We have

0:07:240:07:30

got regulators...

Anything on

mainstream news, we are saying it is

0:07:300:07:35

real, I would much rather go to

YouTube and look at Russell Brand to

0:07:350:07:40

find out the truth behind something

rather than immediately believing

0:07:400:07:44

the newspaper. Taking some of the

responsibility, like the lady said,

0:07:440:07:50

social media is about the people

and taking control and learning to

0:07:500:07:56

be... Learning how to recognise if

something is real or fake, we should

0:07:560:07:59

be teaching it in schools, and

teaching about media, so that we can

0:07:590:08:05

look at something, is that the

truth?

Russell, that is a good

0:08:050:08:10

point, that is a load of baloney?

I

look at where he has come from, his

0:08:100:08:16

history, background, belief systems,

his intention.

And it is your belief

0:08:160:08:21

system?

It might be, but it might

challenge mine.

You and Russell are

0:08:210:08:26

in a bubble.

My views have changed

massively since I was young and

0:08:260:08:31

Socialist worker and now I am in my

40s, totally different beliefs, I

0:08:310:08:36

have children, things change. I

would love my children to be

0:08:360:08:41

inquisitive, even when it comes to

TV adverts, not much difference

0:08:410:08:45

between TV adverts and Instagram.

The mainstream media, people do not

0:08:450:08:50

trust the mainstream media anymore,

Ben?

There is a conspiracy theory

0:08:500:08:54

they are peddling the same story.

Try to find the Telegraph and the

0:08:540:09:00

Guardian agreeing on anything. They

have hundreds of years of tradition

0:09:000:09:04

and there is a system in place, fact

checkers. A couple of times they

0:09:040:09:09

have run stories I have done and it

has taken time to get through

0:09:090:09:13

because they are checking every

link. There is a process. There is a

0:09:130:09:18

fact checking process, correction

process.

This is dangerous.

This is

0:09:180:09:25

dangerous? Can I introduce you?

Presenter of RT, which used to be Russia

0:09:250:09:34

Today, Afshin Rattansi.

0:09:340:09:42

The reason there is fertile

0:09:420:09:45

ground for the fake news is because

the public have lost faith in the

0:09:450:09:48

media. It is finished. That is why

RT is doing well, maybe even Donald

0:09:480:09:57

Trump is doing well. The reason why

it is these publications, whether

0:09:570:10:02

the Iraq war, Afghanistan war,

Libyan war, issues of war and

0:10:020:10:09

peace...

Was it a problem with the

BBC? The BBC just about got closed

0:10:090:10:15

down.

They fired me, they fired the

Director General. We could see we

0:10:150:10:22

were being told by the Government to

persuade the people into war. The

0:10:220:10:30

Guardian, the Observer, they

supported the same thing.

0:10:300:10:34

Interesting, but on the point... A

lot comes back to Russia, how much

0:10:340:10:39

of a problem, a danger even, are RT?

In terms of their viewing, very

0:10:390:10:46

small, the figures, 0.4% at the time

of being watched, in terms of the

0:10:460:10:54

impact, there are orders of

magnitude second and third and what

0:10:540:10:56

we have seen is a full-scale attempt

to undermine the mainstream media.

0:10:560:11:04

And on... The public response to us.

So does Ofcom.

Ofcom has had more complaints

0:11:040:11:14

against the BBC than against RT.

When it gets down to media,

0:11:140:11:20

partiality, overall, you are right,

Ofcom have lots of... They deal with

0:11:200:11:27

nudity, inappropriate language.

Violations of journalistic

0:11:270:11:30

standards, the obligation for
due accuracy and impartiality:

0:11:300:11:35

RT has had more programmes found
guilty of violating those standards.

I would say... I worked

0:11:410:11:47

there. I have a TV show. No one has

told me what to do.

But your chief

0:11:470:11:54

editor refers to RT as the

information weapon. In an interview

0:11:540:11:59

in 2012...

It is a weapon for the

poor and dispossessed. We interview

0:11:590:12:04

the worker in the factory, not the
CEO.

0:12:040:12:16

In an interview in 2008, she said,
we are fighting a war

0:12:160:12:20

against the entire Western world.
I think

0:12:200:12:30

those comments were taken completely

out of...

Let us take it back to

0:12:300:12:34

social media, what are the Russians

doing with bots and trolls. Bots are

0:12:340:12:42

automated accounts. What we saw from

Russia was an outfit called the

0:12:420:12:45

Internet Research Agency in St
Petersburg running 3,500 troll

0:12:450:12:50

Twitter accounts, fake Facebook

accounts, masquerading as Americans

0:12:500:12:54

on both sides of the political

divide, a lot of the content was

0:12:540:13:01

pro-Trump, pro-guns,

anti-migrant, white supremacist.

A

0:13:010:13:04

lot of the content was Black Lives
Matter, saying white supremacists

0:13:040:13:11

are evil, I would not necessarily

disagree with that, but pushing both

0:13:110:13:14

sides. In May, 2016, the troll

factory in St Petersburg ordered two

0:13:140:13:20

simultaneous rallies in Houston, one

protesting against the opening of an

0:13:200:13:25

Islamic cultural centre and one in

favour.

The Soviets used to do this,

0:13:250:13:28

active measures. Propaganda. They

spread the rumour that HIV was

0:13:280:13:34

created by the CIA. They kicked off

the stuff about the Kennedy

0:13:340:13:38

conspiracy. What is the problem? The

same as it ever was, the scale of

0:13:380:13:43

it?

The scale and the directness.

There were cases from Robert

0:13:430:13:48

Mueller's indictment, the troll

factory was interacting with real

0:13:480:13:53

Americans through social media and

organising campaign rallies, the

0:13:530:13:57

troll factory paid people to turn

up, dressed as Hillary Clinton in

0:13:570:14:01

jail. You had direct contact between

Russian agents in St Petersburg and

0:14:010:14:05

Americans on the ground, not a case

of collusion, people being fooled.

0:14:050:14:10

This is dangerous. Will Moy.

So

interesting how much the world has

0:14:100:14:17

changed since we started Full Fact.

When we started, you could latest

0:14:170:14:22

every outlet in the country, that

day is over now. It means all of us

0:14:220:14:30

can tell people what we think and

what we know about the world, we can

0:14:300:14:34

share those ideas, it is exciting,

but it is much harder for all of us

0:14:340:14:38

because things are coming to you

from 1000 different places and you

0:14:380:14:41

do not know the track record, you

have to do more work to figure it

0:14:410:14:46

out. Every link being fact
checked, as Ben said, I'm sorry,

0:14:460:14:55

that is not possible. Some are

clinging onto enough money to do

0:14:550:14:59

that, but most are publishing stuff

too quickly to make that possible.

0:14:590:15:05

That is what Full Fact does. We

publish links to every review so

0:15:050:15:09

that you can judge it for yourself

and this is where we will have to

0:15:090:15:13

end up. If you want someone to trust

you, it will not be enough to say, I

0:15:130:15:17

am so-and-so, take my word. It will

have to be, I can show you where you

0:15:170:15:22

got it from, you can judge it.

Are

people interested in checking it

0:15:220:15:26

out? It is not critical thinking, it

is wishful thinking. In a sense. You

0:15:260:15:32

see stuff you want to think is true

and you disregard the rest.

0:15:320:15:42

We will all do that, of course.

On the BBC, we get both sides of it.

0:15:420:15:52

We have always tended to look for

things we agree with, when we choose

0:15:520:15:57

what newspaper to buy, we used to

choose because we liked its views.

0:15:570:16:03

I buy things I disagree with.

But most of us are human, we look

0:16:030:16:07

for things that we agree with. That

is not a new challenge. We have to

0:16:070:16:13

look on the Internet, it is about

all of us having conversations with

0:16:130:16:19

each other, but there are parts of

it from a news point of view which

0:16:190:16:26

aren't just all of us having

conversations with ourselves. When

0:16:260:16:30

it comes to people paying money in

secret to target adverts to

0:16:300:16:35

particular parts of the population,

you don't know who, that is new and

0:16:350:16:41

an exercise of power and it is

reasonable to ask who is doing it

0:16:410:16:44

and it should be transparent.

Tell us the outlets you trust.

0:16:440:16:54

First, Steve, if it is beyond

control, is there any hope of doing

0:16:540:16:59

anything to rein in not just the

abusers but also the fake news

0:16:590:17:04

purveyors?

We need to design an architecture

0:17:040:17:11

that is more conducive to

high-quality information rather than

0:17:110:17:14

fake news. I share the same concerns

about censorship, determining what

0:17:140:17:21

is fake news.

We have to understand how people respond

0:17:210:17:24

to information, how Facebook and

Twitter can be used, and the idea of

0:17:240:17:33

targeting specific individuals based

on their personality is a real risk

0:17:330:17:36

to democracy. The reason is that it

is happening in private. No one else

0:17:360:17:42

knows what this person has received

in terms of information. That is

0:17:420:17:47

qualitatively different from the way

things used to be where parties

0:17:470:17:51

would put up billboards next to the

road and we could all see them and

0:17:510:17:54

knew what the message was. We now

have the technology and it isn't the

0:17:540:18:01

Russians, but right here in the UK.

They are designing custom designed

0:18:010:18:08

messages shown to people on

Facebook. We don't know to what

0:18:080:18:12

extent they are customised.

Is the British Government doing it

0:18:120:18:16

in the same way albeit on a smaller

scale?

0:18:160:18:20

I am not talking about governments.

Do we have a mirror image to what

0:18:200:18:24

Vladimir Putin is doing?

I am not an expert.

0:18:240:18:33


And what was

that allegation...

The man who has

0:18:330:18:36

the billion-dollar contract...

Can I

say one other element which is

0:18:360:18:44

related, this started with four

companies, Snapchat, Twitter...

0:18:440:18:48

There is censorship going on. It is

not as free as you make out. The

0:18:480:18:57

idea of fake news as a price to pay

for a free Internet, these companies

0:18:570:19:02

are censoring Google search terms,

lots of people saying the articles

0:19:020:19:06

that are not the mainstream, the

Atlantic Council, the pro-NATO

0:19:060:19:13

organisations, they are given

preferential treatment. In the case

0:19:130:19:16

of harassment everyone here would

say these companies have to do more.

0:19:160:19:20

They are not democratically...

Steve, finish your point. The

0:19:200:19:28

algorithms thing, you go on Amazon,

you buy a book, it tells you the

0:19:280:19:35

same set of books you should buy. It

doesn't give you the psychological

0:19:350:19:40

impetus to break away.

That is right but it is an

0:19:400:19:46

opportunity, we have the technology

to change algorithms and make

0:19:460:19:50

suggestions to people on Amazon

that take them outside their

0:19:500:19:57

comfort zone. I like this, and I

wish I had the money to buy them.

0:19:570:20:05

But for society it would be better

for Amazon to tell me a book I would

0:20:050:20:12

not like.

Exactly right. That is the avenue we

0:20:120:20:15

should pursue, to think about clever

ways to broaden people's access to

0:20:150:20:20

information without censorship.

Editorial control, and the situation

0:20:200:20:25

in Germany as well.

What would you like to say?

Who do

0:20:250:20:31

we trust? Regarding fake news, it is

a subjective thing, and what is the

0:20:310:20:37

real definition? CNN? It is a slur

which has no objective meaning. As

0:20:370:20:47

it is used it only means a news

source I personally disagree with.

0:20:470:20:50

Is it not used to...

But if you check the facts behind

0:20:500:20:56

the story?

0:20:560:21:01

I was trying to get you some

business!

0:21:020:21:05

Thank you. Fake news is now a term

used to abuse journalists when they

0:21:050:21:12

hold politicians to account which is

a bad thing.

Does it have a precise

0:21:120:21:17

definition?

We need to recognise

there are lots of separate problems.

0:21:170:21:21

It started when people noticed there

were teenagers in places like

0:21:210:21:26

Macedonia publishing made up stories

to get advertising. That was one

0:21:260:21:31

kind of fake news. Another kind is

where you take real data about the

0:21:310:21:35

economy, you distort it for your

political campaign. If you put both

0:21:350:21:40

of those in one bucket we will never

solve anything. One of them is part

0:21:400:21:45

of political debate.

You can check a fact, the side of a

0:21:450:21:51

bus, £350 million for the NHS, that

is not fake news but a claim you can

0:21:510:21:56

check.

Absolutely, you can check a
fact.

0:21:560:22:08

We can check the

legal basis, all of that for you.

0:22:080:22:20

I don't know what your proposal is

for establishing a procedure to

0:22:200:22:26

investigate whether a story is true.

Journalism is a qualitative process,

0:22:260:22:32

to obtain information. You don't

have hard metrics for establishing

0:22:320:22:37

whether it is true. You are

talking about other people's

0:22:370:22:48

journalistic procedures which comes

down to the same procedures

0:22:480:22:52

for checking whether something is

true. There is rarely a

0:22:520:22:59

fundamentally absolutely objective

truth underlying any story and it is

0:22:590:23:06

not obvious that is something that

can be obtained simply, quickly,

0:23:060:23:10

efficiently by a public or private

agency who happens to have some

0:23:100:23:14

greater measure of the truth.

With

Lord McAlpine when he was labelled,

0:23:140:23:21

there was an objective truth

underneath that which led to the

0:23:210:23:24

court room. Steve?

There is a real risk by saying, no

0:23:240:23:34

one knows what the truth is, we are

throwing up our hands and saying

0:23:340:23:41

there is too much information,

something which has been happening

0:23:410:23:46

on social media, and the culprit is

making that claim in what I call a

0:23:460:23:52

shock and chaos approach to fake

news. The moment we give up on

0:23:520:23:56

this idea that there are certain

things we can check like the

0:23:560:24:01

falsehood about the £350 million.

And it was false. But the point is,

0:24:010:24:12

there are certain things that are

false and other certain things that

0:24:120:24:15

are true. The
inauguration crowd for Trump.

We

0:24:150:24:25

have an innate ability to critically

look at things for us to decide for

0:24:250:24:28

ourselves what is true.

Do we have

that critical ability? It is the

0:24:280:24:37

same thing we accuse young jihadists

who are radicalised. Maybe we are

0:24:370:24:43

all losing the ability. Some people

still stand by the 350 figure, as a

0:24:430:24:51

gross figure, saying that is about

the amount of money we can put in.

0:24:510:24:58

Let us point out lots of things are

true and false, lots more important

0:24:580:25:04

things we don't know. A large part

of what we are doing is to say this

0:25:040:25:09

is as much as we do know and don't

know, we put the shades of grey back

0:25:090:25:13

in.

I want to come to you.

For me,

with fake news, a lot of times it

0:25:130:25:24

will come up on your social media

profile and you will read the

0:25:240:25:27

headline, how many go on to click?

As soon as you click you might

0:25:270:25:33

realise it is fake news. They will

read the headline, I know people who

0:25:330:25:39

then actively go on to share it and

believe that is true.

0:25:390:25:43

Facebook made a profit of £39

billion. You might say that is

0:25:430:25:52

terrible. But these adverts, they

are making money, some money.

For

0:25:520:26:02

me, social networking companies

should be held to account. We

0:26:020:26:06

mentioned education and the

Government have put forward in their

0:26:060:26:11

green paper proposals for compulsory

education within schools with regard

0:26:110:26:14

to social media. I believe that is

one way forward in that we should be

0:26:140:26:20

educating. We have a whole

generation that has been brought up

0:26:200:26:24

with social media. They don't know a

life without social media. Shouldn't

0:26:240:26:28

we be educating them on social

media?

0:26:280:26:34

How do you counteract the business

model? There is a big paradox which

0:26:340:26:43

relies on the fakery?

That is right, this is where you

0:26:430:26:48

have the bleeding from social to

traditional media, so much that goes

0:26:480:26:53

on is emotional targeting. Someone

will put out a scare headline to get

0:26:530:26:59

people afraid and they will click on

it and then realise the story is a

0:26:590:27:05

wind-up.

Sometimes it is not.

Sometimes it is true. Look at the

0:27:050:27:11

tabloids, they have been doing this

for decades, there is emotional

0:27:110:27:15

targeting.

Every reason why we

shouldn't panic.

0:27:150:27:19

Yes, lots of things are true. We are

not machines, the world is really

0:27:190:27:29

new now, and historically every time

new communications technology has

0:27:290:27:32

come up people have panicked and

tried to limit it. When it comes to

0:27:320:27:39

the comparison with the tabloids

we need to be careful.

And the

0:27:390:27:44

difference between what the tabloids

used to do and Facebook now, one of

0:27:440:27:50

the really important things we know

from cognitive science about why

0:27:500:27:53

people stick to their beliefs is

because they think they are widely

0:27:530:27:57

shared. With social media we have an

opportunity for anyone, no matter

0:27:570:28:03

how absurd their belief, that they

can find a community, like-minded

0:28:030:28:08

people on the Internet, and think

their belief is widely shared.

0:28:080:28:12

People out there seriously think the

earth is flat. You can go on to

0:28:120:28:16

Facebook and have a community of a

thousand people around the world.

We

0:28:160:28:22

are doing that debate next week!

So you will be resistant to changing

0:28:220:28:28

your belief.

300 years ago...

The last word? Google is one of the

0:28:280:28:36

most powerful companies and is

already censoring it, it's not free

0:28:360:28:41

at the moment. From WikiLeaks, we

know Google works with Hillary

0:28:410:28:48

Clinton, you mentioned four

corporations, this is far away from the

0:28:480:28:51

dream of the Internet, that it should be

free. These are multi-billion dollar

0:28:510:28:57

companies censoring already.

We need regulation to stop it?

Like

0:28:570:29:03

we nationalised the telephone

company here, or the post office, we

0:29:030:29:08

can nationalise some of these so

they can come under democratic

0:29:080:29:13

accountability.

Broadcast

organisations in the pocket of the

0:29:130:29:17

Government?

Not in the pocket of the

Government.

Democratically

0:29:170:29:24

accountable. Thank you very much

indeed.

0:29:240:29:29

Thank you for your thoughts on that

debate.

0:29:290:29:33

You can join in all this

morning's debates by logging

0:29:330:29:37

on to bbc.co.uk/thebigquestions,

and following the link

0:29:370:29:39

to the online discussion.

0:29:390:29:40

Or you can tweet using

the hashtag #bbctbq.

0:29:400:29:44

Tell us what you think

about our last Big Question too.

0:29:440:29:47

Should we presume consent for organ

donations in England?

0:29:470:29:49

And if you'd like to apply

to be in the audience

0:29:490:29:52

at a future show, you

can email [email protected]

0:29:520:29:55

We're in Edinburgh next week,

then Newport in South Wales

0:29:550:29:57

on March 11th, and Brighton

the week after that.

0:29:570:30:00

Friday saw a successful Second

Reading of Geoffrey Robinson's bill

0:30:070:30:10

to change the basis of the organ

donor register to one

0:30:100:30:16

of presumed consent in England.

0:30:160:30:17

The Government is going to go

ahead with the idea.

0:30:170:30:20

Last autumn, Theresa May

indicated her strong support

0:30:200:30:22

of an opt-out system in a letter

to Max Johnson, a nine year old boy

0:30:220:30:25

who had received a heart transplant.

0:30:250:30:27

She said it would be

known as Max's Law.

0:30:270:30:30

But there are still several hurdles

to get across first,

0:30:300:30:34

not least whether a potential

donor's family should have a right

0:30:340:30:37

of veto when the final

decision must be made.

0:30:370:30:43

Think about that one.

0:30:430:30:45

Wales, just down the road

from here in Bath, changed

0:30:450:30:48

to an opt-out system two years ago.

0:30:480:30:49

It's fair to say the results

so far are mixed.

0:30:490:30:52

Should we presume consent for organ

donations in England?

0:30:520:30:58

Scotland is still waiting on this

one and trying to make up their

0:30:580:31:01

minds. Here is the thing, consultant

transplant surgeon, welcome, Mike

0:31:010:31:11

Stephens, I have signed up to the

register. As an adult, I expressed

0:31:110:31:16

wishes, as an autonomous human

being, but my wife or my daughters,

0:31:160:31:21

my mother, they could stop that

happening. How can that be right?

0:31:210:31:27

You cannot think of that happening

in any other sphere.

Well, if you

0:31:270:31:34

phrase it like that, it is not

right, but I think you need to look

0:31:340:31:38

at this in the context of the whole

discussion. It is interesting we

0:31:380:31:42

start the discussion here because

the family's role is unchanged in

0:31:420:31:47

the opt-out system in Wales. They

have the same role as in the opt-in

0:31:470:31:53

system in England. The family can

overrule their relative's wishes. I

0:31:530:32:00

think the language you used in your
introduction is part of the issue.

0:32:000:32:06

You need to change that

around, really. To make it much more

0:32:060:32:13

about your decision. You make a

decision in life and make sure

0:32:130:32:18

people know what that decision is

and you make it clear about that and

0:32:180:32:22

then it is much more difficult for

families to go against that. The truth is

0:32:220:32:28

families are not overriding people's
wishes, decisions for the sake of
being awkward. They

being awkward -- the truth is. They

do not want to make things

0:32:330:32:39

unpleasant.

A period of intense

grief.

Absolutely. We are asking

0:32:390:32:44

families to make decisions at

probably the most emotional point in

0:32:440:32:48

their life. When you make a decision

when you are emotional, as we all

0:32:480:32:52

know, you do not always make good

decisions. Our research in Wales

0:32:520:32:57

shows that if you go back to

families who said no to organ

0:32:570:33:01

donation, a few months down the

line, many regret the decision.

It takes the

possibility of regret away?

You make

0:33:080:33:13

it clear it is your decision as an

individual, you convey your decision

0:33:130:33:17

to your loved ones so they know what

it is because in that moment of

0:33:170:33:23

intense emotion, they are already

bereaved, unexpectedly bereaved,

0:33:230:33:27

normally, because that is what organ

donors tend to be. You are asking

0:33:270:33:31

them to make this decision. Imagine

a scenario whereby you have died,

0:33:310:33:38

your family are there, incredibly

upset, somebody comes in and says,

0:33:380:33:42

they were on the organ donor

register, you say, hang on, they

0:33:420:33:46

didn't tell me that, I am their

wife, I know everything about them,

0:33:460:33:52

why did they not discuss that with

me? I don't understand. They go

0:33:520:33:56

through the process of trying to

understand. Was this really

0:33:560:34:01

important to them? Was it just

something they did when they got a

0:34:010:34:05

driving licence?

Presumed consent

does not show you have made a

0:34:050:34:08

decision.

Well, it doesn't, and the

key thing about all of this debate

0:34:080:34:13

is that what the law change has done

is given us a platform to have all

0:34:130:34:20

of these discussions, including

about what the family's role is, so

0:34:200:34:24

we can put it out there that you

have an opportunity to discuss your

0:34:240:34:29

decision in life and then your

relatives will know what to do.

I

0:34:290:34:33

cannot wait to hear from the

audience in a moment. A lot of that

0:34:330:34:40

will have resonated with you, tell

us what happened to your son?

My

0:34:400:34:45

son, Conner, he was 18 when he was

attacked, murdered. Three years ago.

0:34:450:34:52

As a result of his injuries being so

severe, organ donation became an

0:34:520:34:58

option. He had decided that he did

agree with organ donation at the

0:34:580:35:06

time, he was only 16. He made that

decision. We had a conversation,

0:35:060:35:12

albeit brief, we had a conversation

about it. Never for one minute

0:35:120:35:16

thinking we would be put into that

position. And then we jumped forward

0:35:160:35:24

two years, that awful night became a

reality for us as a family and the

0:35:240:35:31

specialist donor nurse came to speak

to us, after lots of discussions

0:35:310:35:36

with the medical team, police, lots

of other individuals, to explain

0:35:360:35:42

that Conner would possibly be in a

position to become a donor and that

0:35:420:35:48

is when the conversation started. I

must say, I had a conversation with

0:35:480:35:53

Conner but his dad had not.

Initially, the reception we gave the

0:35:530:35:58

specialist donor nurse was quite

hostile, frosty.

Really?

Because

0:35:580:36:04

from me as a mum, Conner had been

through such a horrific ordeal, I

0:36:040:36:09

did not want any more pain for him.

I did not want him to be put through

0:36:090:36:14

anything more painful. So with the

specialist donor nurse, we talked

0:36:140:36:21

through lots and lots of scenarios,

questions, some of them might have

0:36:210:36:25

been really small and insignificant,

one of the biggest concerns for me

0:36:250:36:30

was, if we were to continue with

this journey, that Conner was never

0:36:300:36:35

on his own, and that was a really

big point for me and we had to

0:36:350:36:40


0:36:400:36:46

discuss it with her and never at any

point did we feel under pressure to

0:36:460:36:49

consent, we never felt under

pressure to continue, if we had made

0:36:490:36:54

a decision that we felt we needed to

pull back, then that was OK, that

0:36:540:36:59

was an option for us. But we made

the decision, Conner made the

0:36:590:37:06

decision, we just facilitated it. He

believed life goes on, his mantra in

0:37:060:37:10

life.

Did that in a sense keep you

going?

That kept us very focused.

He

0:37:100:37:15

had the tattoo on his arm. What did

it say?

Life goes on. His tattoo. It

0:37:150:37:22

was really poignant. But for us,

that kept us believing we were doing

0:37:220:37:29

the right thing. That is what it

was, for Conner, it was the right

0:37:290:37:36

decision. Emotionally, it wasn't

possibly, for us at the time, but it

0:37:360:37:41

was about what Conner wanted and

that is what we did.

0:37:410:37:43

APPLAUSE

0:37:430:37:48

And lives were saved?

Yes, three.

APPLAUSE

0:37:540:38:07

Knowing that, that amazing thing,

three lives were saved, what is that

0:38:070:38:11

like?

It is awesome, to use Conner's

words. I remember a brief

0:38:110:38:18

conversation we had at the time and

he said, in his own naive

0:38:180:38:23

matter-of-fact way, who wouldn't

want a bit of this, mum? That was

0:38:230:38:29

his humour, his belief. To say I am

happy, it is the wrong expression.

0:38:290:38:37

But I am comfortable and I am

immensely proud that Conner made

0:38:370:38:42

that decision.

Can I ask, if you had

not had a conversation with him

0:38:420:38:48

beforehand, what would your decision

has been on that night?

It would

0:38:480:38:52

have been to continue to go through

the system. Conner was so strong

0:38:520:39:00

willed, stubborn, knowing his

personality, it was a difficult

0:39:000:39:03

decision, by no means was it easy,

but together, as a family, we would

0:39:030:39:08

have carried on through the journey.

Thoughts from the audience. Wow.

0:39:080:39:14

What would you like to say? I would

be interested as well, if there is

0:39:140:39:20

anyone here who would opt-out of the

register.

As a medical student, I

0:39:200:39:29

had the recent privilege of seeing a

kidney transplant and just seeing

0:39:290:39:35

this small grey kidney going pink

before my eyes and start working,

0:39:350:39:39

this is astonishing, amazing. I

think donation is a wonderful thing

0:39:390:39:44

and the Catholic Church encourage it

as well. I think that is a great

0:39:440:39:52

thing. We are here to discuss how we

should increase the number of

0:39:520:39:58

donors, I think most people in the

audience...

Presumed consent, what

0:39:580:40:03

are your thoughts specifically on

that?

An opt-out system, one, it has no

0:40:030:40:08

clear evidence that it will increase
donors. In

0:40:080:40:15

some cases, it has not increased

donors. It is morally dubious and it

0:40:150:40:22

is based upon this notion of

presumed consent which is

0:40:220:40:25

nonsensical. How can you say I have

said yes to something, I haven't?

We

0:40:250:40:31

have a man who knows about the

evidence, Professor Roy Thomas. You

0:40:310:40:36

have studied this, looked at the

evidence, what is it?

We started

0:40:360:40:40

looking at the evidence in 2007 in

Wales and that is why we have the

0:40:400:40:45

law now and you have two connecting

groups. Connectivity being

0:40:450:40:50

important. Conner, the family, and

the donors. The recipients, and the

0:40:500:40:56

donors working together. It is

important to realise because we have

0:40:560:41:00

10,000 people waiting in the UK.

They call it the invisible death

0:41:000:41:05

row. The invisibility is because they

do not have advocacy. Brave

0:41:050:41:11

families, like Conner's case, they

are important, they are only 1%.

0:41:110:41:18

When they say there is no moral

ethical guidance here, there is.

0:41:180:41:22

There is a moral base in Wales for

this, it is not about the law, that

0:41:220:41:27

is important. The third group, the

group that now can opt-out. That is

0:41:270:41:32

important as well. They can opt-out,

perhaps some people argue they are

0:41:320:41:40

opting out of humanity, maybe, that

is an argument point in moral

0:41:400:41:45

ethics, but the key thing is we look
at it and discuss it in the round.

0:41:450:41:52

The ethical

debate is important, as we are

0:41:520:41:57

having today, but most people want

this law now. Two thirds of the

0:41:570:42:01

people want this law because it

saves lives and there are 500 people

0:42:010:42:07

who died last year, five buses went

over the cliffs of Dover, and that

0:42:070:42:12

is a big deal, it used to be one bus

in Wales.

You put your hand up,

gentleman with the glasses, who

0:42:210:42:26

would opt-out?

I would opt-out.

Organ donation is a very moral

0:42:260:42:32

thing. If you want to be moral and

ethical, do not force your morality

0:42:320:42:37

on me, that is unethical. It is

quite perverse that we are to assume

0:42:370:42:45

all women and children born after

this law passes, the government will

0:42:450:42:48

assume the rights to harvest their

organs like a demented keeper, a

0:42:480:42:54

disturbing precedent, not the role

of the state. Be moral, donate. But

0:42:540:42:57

do not force it on me.

Hand up

beside you.

I agree. My heart goes

0:42:570:43:05

out to Conner and his mother. It is

not some nightmare...

Body

0:43:050:43:10

snatchers?

We are talking about

lives here, lives that can be saved

0:43:100:43:17

by that person making the choice, be

it for personal, spiritual or

0:43:170:43:22

religious reasons, they want to

opt-out, OK, but the assumption is

0:43:220:43:28

taken however, that we can save

human lives with their organs that

0:43:280:43:34

you have agreed to give up, should,

God forbid, the worst happen to you

0:43:340:43:39

or a member of your family. A very

important point for those who do not

0:43:390:43:43

agree with that, if it was one of

your loved ones that desperately

0:43:430:43:47

needed the organ, how would you feel

then?

Anyone else who would opt-out?

0:43:470:43:54

I just want to know...

I am kind of

on the fence, brought up in the

0:43:540:44:01

Jewish faith we would not give

donations at that point in our life,

0:44:010:44:07

we can give live donations, it is

quite a contentious... Sorry?

The

0:44:070:44:13

Jewish leaders do not agree with

that. They looked deeper, they

0:44:130:44:16

would...

The Jewish faith is quite

interesting insofar as there are

0:44:160:44:22

many beliefs and many different

interpretations, so what I am trying

0:44:220:44:27

to say is I have been brought up in

a belief we would not donate,

0:44:270:44:31

however, I have been living... I'm

finding this very emotive. My

0:44:310:44:38

11-year-old son is likely to need a

kidney transplant multiple times in

0:44:380:44:44

his life, currently going through

potential bladder reconstruction as

0:44:440:44:48

well, so for me to sit here and say

I would expect myself or someone

0:44:480:44:52

asked to donate to him but on death

not to...

Makes you more likely to

0:44:520:45:00

give than receive? Important point.

You have to look at this from so

0:45:000:45:06

many different angles. But I think

my inclination would be I would have

0:45:060:45:09

to say that I would not opt-out

although I do not believe in the

0:45:090:45:13

system being proposed, I believe

that if you are going to come in

0:45:130:45:16

with a system like that, the

transition state from how we

0:45:160:45:20

currently are to where you are

planning to go towards, it has to be

0:45:200:45:25

managed very carefully, you have to

know exactly what it is all about

0:45:250:45:29


0:45:290:45:39

and just to presume whether it is at

16, 18, whatever age, the minute

0:45:390:45:41

your birthday comes, does that mean

you have to sign on the dotted line

0:45:410:45:44

to make sure God forbid you are not

caught in the two-year window?

0:45:440:45:46

Fascinating points. I will be right

with you. So interesting. As you

0:45:460:45:51

say, so moving too, thinking about

those situations. Sadly, heart

0:45:510:45:55

health campaigner, critical heart

condition, at any moment, you could

0:45:550:46:00

need a heart transplant, if a viable

heart was found and the family

0:46:000:46:06

decided ultimately not to consent

and to save your life, what would

0:46:060:46:12

that be like for you?

0:46:120:46:17

Of course I would be appalled but

I want to approach it a different

0:46:170:46:21

way. As a heart patient, everybody

says to me, surely you must be in

0:46:210:46:26

favour of the opt out system? I am

not in favour at all because it is

0:46:260:46:33

the word, presumed, that really

worries me.

0:46:330:46:35

I have spent the year talking with

lots of decision makers who are

0:46:350:46:42

invested in this, charities, the

NHS, the team at Papworth Hospital,

0:46:420:46:48

and my conclusion is something that

everybody I am sure would agree

0:46:480:46:52

with, the most important part of the

process is having a conversation

0:46:520:46:57

with your loved ones. Going for

presumed consent can take away the

0:46:570:47:01

need for the conversation.

How to make sure we have those

0:47:010:47:07

conversations?

I want a mandatory decision making

0:47:070:47:09

process. People have their own

thoughts and feelings and emotions

0:47:090:47:17

about this. This isn't about

bullying anybody into making the

0:47:170:47:22

decision. What this is about is

saying, you need to make a decision.

0:47:220:47:27

I have three children. When my

eldest son turned 18, he could vote

0:47:270:47:35

for the first time so we had a right

of passage conversation, who will

0:47:350:47:40

you vote for? He could drive, so we

discussed never drinking and

0:47:400:47:45

driving. As part of that rite of

passage, I think the majority of

0:47:450:47:50

families in this country will sit

down together and say, you are 18

0:47:500:47:54

now, you have to make a decision,

let us discuss this, this is a rite

0:47:540:47:59

of passage for you. You can tick yes

or no but you have to make a

0:47:590:48:05

decision and that forces the

conversation, encourages the

0:48:050:48:10

conversation in the family. The

heart transplant surgeons I have

0:48:100:48:12

spoken to have said they care
for the patient but also care

0:48:120:48:21

for the family. They will never ruin

another life to save

0:48:210:48:29

a life. They will never go against

wishes because they are humans

0:48:290:48:35

themselves. They don't go into that

profession to cause damage and upset

0:48:350:48:38

to others.

I do not believe presumed consent is

0:48:380:48:43

the way to go. We have to make a

mandatory decision and then educate

0:48:430:48:49

from a very early age.

You don't approve of the opt out

0:48:490:48:57

system?

Everyone here, we all believe in

0:48:570:49:04

organ donation, we all want to

increase organ donation and see more

0:49:040:49:08

transplants. My criticism is it

doesn't work. The evidence from the

0:49:080:49:16

Welsh Government report, which was
still ambivalent, is that there has been no

0:49:160:49:24

change in the rate. There may be a

change in awareness and consent

0:49:240:49:28

rates have gone up but in the hard
numbers it has not happened. I am a

0:49:280:49:35

firm believer in organ donation. If

you are happy to receive a

0:49:350:49:40

transplant you should be happy to be

a donor. Any other position is

0:49:400:49:43

hypocritical. There are different

things we can do.

That is

0:49:430:49:53

interesting. Am I right in saying

you think there should be a

0:49:530:49:59

prioritisation for those who have

agreed?

I do. There are different

0:49:590:50:06

systems in the world. Certainly,

Israel has introduced a system where

0:50:060:50:10

they bring in this system of
reciprocity.

0:50:100:50:15

There are priority

points, if you are waiting for a

0:50:150:50:22

heart or liver, your medical need

overrides everything. Kidney

0:50:220:50:27

transplantation where we do have

dialysis which is not as good...

NHS

0:50:270:50:36

should not be about the choices

people make in life.

0:50:360:50:44

And the NHS versus private?

The figures are really important.

0:50:460:50:55

The fact is, if we knew the answer,
whether or not opt out worked, we would

0:50:550:51:01

have done it years ago. We don't

have strong statistical evidence to

0:51:010:51:04

prove it. We have got associations

but not strong evidence. We may get

0:51:040:51:12

it soon because of what has happened

in Wales. But the numbers are small

0:51:120:51:17

and the fluctuations are great. This

law change is about consent change,

0:51:170:51:24

a different way of consenting so the

only thing that is relevant in terms

0:51:240:51:27

of success is has the consent rate

gone up. Year on year since the

0:51:270:51:35

the opt out was introduced

in Wales, the rates have gone up.

On

0:51:350:51:43

the statistics and arguments?

In Belgium in the mid-1980s they

0:51:430:51:48

introduced opt out. They saw a 50%

increase in organ donation rates in

0:51:480:51:55

five years, a clear statistic there

for everyone to see. In the opt out

0:51:550:52:02

countries, Spain, Denmark, Norway,

they have a better organ donation

0:52:020:52:06

rate than we have in the UK. It is

important to look at that. Germany

0:52:060:52:14

hasn't, at the other end of the

scale. Facts show presumed consent

0:52:140:52:20

works and studies at Harvard,

Chicago, they showed these facts.

0:52:200:52:25

Presumed consent works.

Spain is

usually cited as the perfect example

0:52:250:52:32

of presumed consent.

Spain don't

have an opt out register.

But

0:52:320:52:39

Belgium does. Wales has not shown

the same.

It is too early.

0:52:390:52:52

We need

to take the right lessons from

0:52:520:52:54

Spain. The Spanish have said it has

nothing to do with presumed consent

0:52:540:53:01

but investment, education, raising

awareness.

0:53:010:53:05

APPLAUSE

0:53:050:53:10

We have done all those

things, we haven't had the consent

0:53:130:53:17

rate increase so it is time for
something else in addition.

0:53:170:53:23

I would like to say, I want to thank

my donor's family every day, really,

0:53:230:53:32

so much.

I had a liver transplant just over a

0:53:320:53:35

year ago. I was one of the lucky

ones with how quickly it happened.

0:53:350:53:44

From feeling just tired and being

told it was a virus, to going yellow

0:53:440:53:52

and having a life-saving transplant

two weeks later. I got my

0:53:520:53:59

gift. Having said I don't agree with

opt out, there are a variety of points,

0:53:590:54:06

one is consent, the biggest one is

family. At the moment, even if you

0:54:060:54:12

haven't signed up to the register

your family can sign up for you.

You

0:54:120:54:17

say, my gift. Is it an important

concept that it was a gift?

You

0:54:170:54:25

can't even get close to what it

feels like when you wake up from a

0:54:250:54:29

transplant.

APPLAUSE

0:54:290:54:38

I was 48 hours away, I was on a

super urgent list where your life is

0:54:390:54:46

in danger.

I was at peace with the fact it

0:54:460:54:54

might not happen for me.

0:54:540:55:00

When I got

woken up, I could breathe for

0:55:000:55:04

myself, my transplant coordinator

grabbed my hand and said, we have

0:55:040:55:08

found you a liver and it was the

most incredible feeling. There was

0:55:080:55:16

hope, it was amazing. Next thing, I

was woken up and it is like a

0:55:160:55:26

euphoria, you can't comprehend,

there are no words, when someone

0:55:260:55:30

else saves your life. It is one thing

getting better, battling cancer, but

0:55:300:55:36

when someone else saves your life

for you, it changes everything. I

0:55:360:55:41

like to think I speak for all people

who have had transplants, we don't

0:55:410:55:46

go back to the lives we had before,

we are conscious of the

0:55:460:55:50

responsibility we had to the person

who helped keep us alive that we try

0:55:500:55:57

and live bigger and better and

richer lives.

0:55:570:56:04

Organ donations are

incredibly important but it will

0:56:040:56:07

never be as simple as opt in or opt

out. Probably one year to the day

0:56:070:56:14

my donor died, my best friend's mum

died in the same situation and I saw

0:56:140:56:20

it from the other side. In that

moment I realised it is not clean

0:56:200:56:23

cut. Not only are you so vulnerable,

in agony with grief, but she had

0:56:230:56:29

this huge hope, even if there is no

hope. You still think that maybe

0:56:290:56:37

they have got it wrong. That is a

huge thing. It is not simple. It has

0:56:370:56:44

to be a conversation, a change in

culture.

0:56:440:56:47

On the point that it is not as simple

as opt in and opt out. In Wales we

0:56:470:56:56

have devised a resource pack due to

go out to bring about a conversation

0:56:560:57:01

with young adults around the subject

of organ donation. We give lots of

0:57:010:57:07

advice at school, mental health

advice, sexual health advice but no

0:57:070:57:14

one likes to talk about the other

side, the dark side of life. In this

0:57:140:57:19

pack, we have got a conversation

going within groups that will answer

0:57:190:57:25

those questions. Giving those young

people the knowledge it is OK not to

0:57:250:57:31

agree. There is no right or wrong

answer but furnishing them with the

0:57:310:57:35

information of the process. What are

the steps? If they feel they need to

0:57:350:57:43

know those. We spent a long time

asking lots of questions and that is

0:57:430:57:49

what the resource pack is for, to

give information and to use my son's

0:57:490:57:56

case as a living case. It does

matter, these early conversations

0:57:560:58:01

can make a difference.

The keyword you used, living.

0:58:010:58:08

That is it. Because three people are

alive because of your son.

0:58:080:58:11

Yes, living.

It is really important we talk to

0:58:110:58:22

the young people it affects, don't

talk to those already buying into it

0:58:220:58:28

or on the donor list, but the young

people who need to sign up and don't

0:58:280:58:34

presume their consent, let them make

their decision.

0:58:340:58:38

You are a professional, thank you!

0:58:380:58:40

As always, the debates will continue

online and on Twitter.

0:58:400:58:42

Next week we're in Edinburgh,

so do join us then.

0:58:420:58:45

But for now, it's goodbye from Bath

and have a great Sunday.

0:58:450:58:47

APPLAUSE.

0:58:470:58:50
