Episode 2 The Big Questions



Transcript



Today, on The Big Questions, is digital media good for democracy?

0:00:020:00:05

APPLAUSE

0:00:150:00:18

Good morning. I'm Nicky Campbell. Welcome to The Big Questions.

0:00:180:00:22

Today, we're back at Brunel University, London, in Uxbridge,

0:00:220:00:25

to debate one very big question - is digital media good for democracy?

0:00:250:00:30

Welcome, everybody, to The Big Questions.

0:00:300:00:33

Now, this Friday, Donald Trump will be sworn in as

0:00:350:00:38

the 45th President of the United States of America.

0:00:380:00:41

Many say he owes his victory to a greater mastery of the tools

0:00:410:00:46

of this digital age - Twitter, Facebook,

0:00:460:00:48

and all the other forms of online communication.

0:00:480:00:51

And, US intelligence agencies are saying that Russian

0:00:510:00:54

intervention in cyberspace played a hand, too.

0:00:540:00:58

Well, here in Britain,

0:00:580:01:00

the Brexit campaign was acknowledged to be far savvier in its use

0:01:000:01:03

of social media than the Remain side,

0:01:030:01:05

and the Scottish Nationalist online army narrowed the gap in

0:01:050:01:08

their independence referendum and helped drive out Labour

0:01:080:01:12

in the 2015 general election.

0:01:120:01:14

So, there's no denying the power of digital media.

0:01:140:01:17

But, it also poses a real challenge.

0:01:170:01:20

How do you know what is true and what is fake in an online world

0:01:200:01:24

with no editorial control or standards?

0:01:240:01:27

And, what is the effect on democracy of trolling, fake news,

0:01:270:01:32

and the echo chambers created by algorithms?

0:01:320:01:35

To debate this brave new world,

0:01:350:01:37

we've assembled a digitally savvy front row of journalists,

0:01:370:01:41

politicians, campaign advisers, techno wizards,

0:01:410:01:45

think tank heavyweights, television executives, learned academics,

0:01:450:01:48

all here in the flesh,

0:01:480:01:50

not created in some virtual world.

0:01:500:01:52

They are all too real.

0:01:520:01:55

And, you can join in on Twitter or online,

0:01:550:01:59

by logging into bbc.co.uk/thebigquestions.

0:01:590:02:02

Follow the link to the online discussion.

0:02:020:02:04

Lots of engagement,

0:02:040:02:05

contributions from our highly representative

0:02:050:02:09

Uxbridge audience as well.

0:02:090:02:11

Well, Jamie Bartlett, if I may...

0:02:110:02:16

director at the Centre for the Analysis of Social Media,

0:02:160:02:19

the very man.

0:02:190:02:21

This has changed politics forever, hasn't it?

0:02:210:02:24

For years, we have been saying how do we get young people?

0:02:240:02:26

How do we get more people engaged?

0:02:260:02:28

How do we stop them from being disenfranchised?

0:02:280:02:31

We have done just that.

0:02:310:02:32

It has got a lot more people engaged.

0:02:320:02:34

There's no doubt about that. And it's not just people tweeting

0:02:340:02:38

and posting things on Facebook.

0:02:380:02:40

Social media has been remarkably good

0:02:400:02:43

at forming a bridge to get people into real-world politics as well.

0:02:430:02:47

Various types of activism, marching, demonstrating, even voting.

0:02:470:02:51

I think people are more likely to vote if they get involved in

0:02:510:02:54

-politics online.

-All good.

-That is all good.

0:02:540:02:57

And I think everybody here would acknowledge that is

0:02:570:03:00

a very good thing. But, of course, it creates new problems.

0:03:000:03:03

And those problems, I think, we're only just beginning to grapple with.

0:03:030:03:07

I think it's making politics, in some ways, far more polarised,

0:03:070:03:11

far more difficult to predict or control.

0:03:110:03:14

In many ways, Donald Trump is the perfect politician for

0:03:140:03:17

a digital age.

0:03:170:03:18

We are overwhelmed with information,

0:03:180:03:20

graphs and charts and infographics and tweets and retweets.

0:03:200:03:24

And in those circumstances, we tend to go for whatever is the

0:03:240:03:28

loudest, the most offensive, the most emotive.

0:03:280:03:31

And that is why politics, I think,

0:03:310:03:33

is becoming more polarised, it is creating more centres of

0:03:330:03:36

power that we barely understand, huge tech companies that can

0:03:360:03:40

control what we see online, and we don't really know how they work.

0:03:400:03:44

So, while it's bringing more people into politics,

0:03:440:03:48

it's not clear to me yet whether it's making politics any better.

0:03:480:03:51

In fact, in some ways, it's making it rather worse.

0:03:510:03:54

Well, there are loads of issues there,

0:03:540:03:56

I'm really looking forward to the next hour or so.

0:03:560:03:58

Helen, he makes some good points.

0:03:580:04:00

Are people actually becoming better informed and more engaged?

0:04:000:04:06

Because a click of a mouse is one thing,

0:04:060:04:09

but actually trudging up the stairwells of high rises

0:04:090:04:12

delivering leaflets on a cold February night is another

0:04:120:04:16

thing entirely.

0:04:160:04:18

I don't think we should denigrate the click of the mouse.

0:04:180:04:21

I think there is a culture, particularly in politics,

0:04:210:04:23

particularly British politics, that politics must be painful.

0:04:230:04:27

You've got to do something cold, or long, or boring, or hard work.

0:04:270:04:31

And what digital media have done

0:04:310:04:33

is they've sort of democratised the act of political participation.

0:04:330:04:36

So, now, there's very small acts of participation.

0:04:360:04:39

Liking or clicking something.

0:04:390:04:41

Indeed, the click of the mouse, which is what is doing the

0:04:410:04:44

work here, it's what's drawing people into politics.

0:04:440:04:47

We shouldn't let go of that, even while we're blaming...

0:04:470:04:51

-Don't denigrate the click.

-Yes, don't denigrate the click.

0:04:510:04:54

Yeah, and it makes a difference.

0:04:540:04:56

People feel they are making a difference, but are they?

0:04:560:04:58

They are only...

0:04:580:04:59

Because of these algorithms which send you your own things

0:04:590:05:02

that they think you will like.

0:05:020:05:04

So, basically, it is that echo chamber, you are only confronting,

0:05:040:05:09

if that's the right word, your own opinions, there's no challenge.

0:05:090:05:12
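
To make the mechanism being described here concrete: below is a minimal, hypothetical Python sketch of click-based personalisation, not any real platform's algorithm (the field names and data are invented). Items sharing topics with past clicks are ranked higher, so the feed narrows towards opinions the user already holds.

    from collections import Counter

    def rank_feed(candidates, click_history):
        # Count how often each topic appears in the user's past clicks.
        clicked = Counter(t for item in click_history for t in item["topics"])
        # Rank items by overlap with previously clicked topics.
        return sorted(candidates,
                      key=lambda item: sum(clicked[t] for t in item["topics"]),
                      reverse=True)

    history = [{"topics": ["brexit", "labour"]}, {"topics": ["brexit"]}]
    feed = [{"id": 1, "topics": ["gardening"]},         # no overlap: sinks
            {"id": 2, "topics": ["brexit", "labour"]}]  # echoes history: rises
    print([item["id"] for item in rank_feed(feed, history)])  # -> [2, 1]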

I don't think that's right.

0:05:120:05:15

The true echo chamber would be just reading the Daily Mail in the 1930s.

0:05:150:05:19

Or...just watching one particular TV station in the US.

0:05:190:05:24

That would be a real echo chamber. And I don't think...

0:05:240:05:28

Of course, our social media environments are personalised

0:05:280:05:33

by us, and personalised by the social media platforms and we need

0:05:330:05:36

to work against that, but I do think, still,

0:05:360:05:40

that our information environment, and there is research to show this,

0:05:400:05:43

that people are exposed to more news sources on social media

0:05:430:05:46

than people who don't use social media.

0:05:460:05:47

-Only the news sources they agree with.

-No. Not necessarily.

0:05:470:05:51

A more heterogeneous range of news sources on social media

0:05:510:05:55

than they are in the offline world.

0:05:550:05:57

We're very good at creating echo chambers outside social media.

0:05:570:06:01

And I don't think we should forget that.

0:06:010:06:03

Yeah, echo chambers, there's nothing new about them. Owen, is that right?

0:06:030:06:06

Well, I think, firstly, Facebook is a bit different,

0:06:060:06:09

but Twitter, I think we overblow its importance because

0:06:090:06:12

only a very small section of the population are regularly using

0:06:120:06:14

Twitter to talk about the world around them.

0:06:140:06:17

And they tend to be younger, and more affluent.

0:06:170:06:20

You take the Brexit... the EU referendum,

0:06:200:06:23

the people who are most likely to vote to leave were people

0:06:230:06:25

over the age of 65, who are the least likely to be on Twitter.

0:06:250:06:29

So, I think if we're talking about these grand political developments

0:06:290:06:32

sweeping the Western world and putting it down to Twitter,

0:06:320:06:34

then I think we are mistaken.

0:06:340:06:36

I think the danger with people like me on the left of politics

0:06:360:06:39

is we think, well, the mainstream media is biased against us,

0:06:390:06:41

most of the press supports the Conservative government,

0:06:410:06:44

so, let's retreat to Twitter. I think that's a mistake,

0:06:440:06:47

because most people, they come home at the end of the day,

0:06:470:06:49

a busy day, they've got jobs, they've got kids,

0:06:490:06:52

they might listen to a bit of radio, watch the evening news,

0:06:520:06:55

maybe flick through a newspaper,

0:06:550:06:57

and that is how most people still get their news coverage.

0:06:570:07:00

Do we get a false impression of what is happening in the world, in a sense?

0:07:000:07:02

Because Helen is saying with all this information, you're better informed than ever,

0:07:020:07:06

but you're saying there's something of a political mirage about it.

0:07:060:07:09

I think when it comes to Twitter. I think on Facebook, we've got, look, you've got someone here

0:07:090:07:13

who did an excellent job for the Conservatives.

0:07:130:07:15

You won't often hear me saying that about people on that wing of British politics,

0:07:150:07:18

but he did because what they did with Facebook,

0:07:180:07:21

it wasn't about joining Facebook groups, which is again just people often agreeing with each other,

0:07:210:07:25

what the Conservatives did is they targeted specific demographics,

0:07:250:07:29

because millions of people use Facebook.

0:07:290:07:31

They're not talking about politics again on Facebook.

0:07:310:07:33

They're clicking on photos, sharing funny messages with friends.

0:07:330:07:36

But they targeted adverts at specific demographics.

0:07:360:07:39

That is very effective.

0:07:390:07:41

But if people like myself who are campaigners think we can just

0:07:410:07:44

send a few tweets going, "Rrrrah, Theresa May!",

0:07:440:07:47

yes, a load of people will retweet that,

0:07:470:07:49

but then we make the mistake of thinking the rest of the country

0:07:490:07:52

are full of the same enthusiasm as me, and that's not true.

0:07:520:07:56

So, Twitter has its use, it can mobilise campaigners,

0:07:560:07:59

put issues on the agenda, but it isn't a substitute, unfortunately,

0:07:590:08:02

for trying to have a strategy to deal with a very hostile press.

0:08:020:08:05

The aforementioned Tom Edmonds in a moment, but, Ellie, you want to come in?

0:08:050:08:08

Yeah, I think there was a point made earlier that social media,

0:08:080:08:11

which I think is a great new tool,

0:08:110:08:12

it's just another kind of technological advance in

0:08:120:08:14

the way that people communicate with each other,

0:08:140:08:17

and that is to be welcomed, makes politics harder to control,

0:08:170:08:20

or makes politics more polarised.

0:08:200:08:21

I think most of us, myself included, would welcome that

0:08:210:08:24

because having more ability to have an argument,

0:08:240:08:28

having more scope to have a debate and have more people involved

0:08:280:08:31

in the conversation is always a positive.

0:08:310:08:33

Certainly, Twitter is a very kind of closed shop in the way that

0:08:330:08:36

it's mainly young people and media commentators -

0:08:360:08:38

as a journalist myself, I know that.

0:08:380:08:39

And Facebook, as Owen said, is a bit different.

0:08:390:08:42

But the idea that this should be...

0:08:420:08:44

What kind of frightens me more about social media

0:08:440:08:47

isn't necessarily who's on it and who's talking about it,

0:08:470:08:49

it's the attempts to control it,

0:08:490:08:52

the attempts to make Facebook and Twitter put out certain

0:08:520:08:56

messages or be worried about what messages they've put out

0:08:560:08:59

-and the drive to censor social media.

-That is a fascinating area.

0:08:590:09:03

I'm going to address that later on

0:09:030:09:04

because we have lots to focus on there with the trolling,

0:09:040:09:07

with the vile abuse, with the fake news,

0:09:070:09:10

with the false news and all that, that's all to come.

0:09:100:09:13

But you were mentioned, Tom, in Dispatches,

0:09:130:09:15

by Owen Jones,

0:09:150:09:17

in a congratulatory tone because of the brilliant strategy

0:09:170:09:21

that you came up with for the Tories in the last election online.

0:09:210:09:24

What did you do and how did you do it?

0:09:240:09:25

Well, broadly,

0:09:250:09:26

political parties use online communications for two things.

0:09:260:09:29

The first thing is to find and recruit supporters who will go and

0:09:290:09:33

take those vital actions to support them, primarily offline,

0:09:330:09:36

as other panel members have mentioned.

0:09:360:09:38

So, it's the people who are going to go and knock on the doors

0:09:380:09:41

because nothing beats that face-to-face communication.

0:09:410:09:44

The second thing they do, and we did in the 2015 campaign,

0:09:440:09:48

is to find and reach out to those swing voters

0:09:480:09:51

in the marginal seats that are going to decide the election,

0:09:510:09:53

and speak to them about the issues they care about.

0:09:530:09:56

Now, if you manage an election campaign, you're speaking from...

0:09:560:09:59

anyone from as diverse as Lib Dem voters in the South West to

0:09:590:10:04

Labour voters in the North of England and in Scotland as well.

0:10:040:10:08

So, you need to speak to them about issues they care about.

0:10:080:10:10

Now, what Facebook allows you to do is find those groups of individuals

0:10:100:10:14

and speak to them about the issues...

0:10:140:10:16

You'll have to work quite hard to find Labour voters in Scotland.

0:10:160:10:18

-Nowadays, yes.

-So, that's what you did -

0:10:180:10:22

very targeted, very focused, and very successful.

0:10:220:10:24

Very targeted, very segmented communications.

0:10:240:10:27
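
As a rough sketch of the segmentation just described, assuming an invented voter file (real campaign data is far richer, and these field names are hypothetical): filter the electorate down to one leaning, in marginal seats, in one region, and pair that segment with its own message.

    voters = [
        {"id": 1, "region": "South West", "marginal": True,  "leaning": "Lib Dem"},
        {"id": 2, "region": "North",      "marginal": True,  "leaning": "Labour"},
        {"id": 3, "region": "South West", "marginal": False, "leaning": "Lib Dem"},
    ]

    def segment(voters, region, leaning):
        # Swing voters of one leaning, in marginal seats, in one region.
        return [v for v in voters
                if v["marginal"] and v["region"] == region and v["leaning"] == leaning]

    # Each segment then gets its own tailored message, served repeatedly.
    targets = segment(voters, "South West", "Lib Dem")
    print([v["id"] for v in targets])  # -> [1]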

Now, digital in itself is just a medium.

0:10:270:10:31

So, you need to have the message right

0:10:310:10:33

before you go and speak to these people.

0:10:330:10:34

There's no point finding individual groups of people and speaking

0:10:340:10:37

about things they don't care about.

0:10:370:10:39

There's no point finding groups of people,

0:10:390:10:41

finding groups of young voters, and then speaking to them about pensions,

0:10:410:10:45

or, erm, about issues they don't care about.

0:10:450:10:47

So, it's about finding groups of people, having the right message,

0:10:470:10:50

-and getting in front of them time and time again.

-Mm. And it was

0:10:500:10:53

Helen, remarkably successful in the marginals, wasn't it?

0:10:530:10:56

Well, it was.

0:10:560:10:58

And, if you...

0:10:580:11:00

And I think also very successful for the Leave campaign, Brexit,

0:11:000:11:04

and very successful for Trump,

0:11:040:11:06

because, after all, if you just reach an echo chamber of people

0:11:060:11:09

who agree with you, you're not going to make any difference.

0:11:090:11:12

I mean, that's not what parties are looking to do.

0:11:120:11:15

They're looking to find people

0:11:150:11:16

who are undecided or don't know what they think yet.

0:11:160:11:18

And what Labour did at the last general election, as you know better than me,

0:11:180:11:22

is just randomly throw out the same message across Facebook.

0:11:220:11:25

So, it didn't target specific people

0:11:250:11:27

in specific seats they needed to win over.

0:11:270:11:29

And, also, they mistook engagement -

0:11:290:11:31

that's lots of people liking and commenting - for success.

0:11:310:11:35

Actually, what that generally meant was the messages their core supporters already agreed with,

0:11:350:11:39

like the NHS, were getting lots of traction,

0:11:390:11:42

but they weren't reaching out to the people they needed to win over.

0:11:420:11:45
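
A toy illustration of the metric confusion described above, with invented numbers: judged by raw engagement, the post core supporters like looks like the winner; judged by reach among the swing voters a campaign must persuade, a different post wins.

    posts = [
        {"name": "core NHS post", "likes": 50_000, "swing_voters_reached": 2_000},
        {"name": "targeted ad",   "likes": 1_200,  "swing_voters_reached": 40_000},
    ]
    # Engagement rewards preaching to the converted...
    print(max(posts, key=lambda p: p["likes"])["name"])                 # core NHS post
    # ...while reach among target voters is what wins seats.
    print(max(posts, key=lambda p: p["swing_voters_reached"])["name"])  # targeted ad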

And, obviously, that's what you learned from so successfully.

0:11:450:11:48

You said, we're not just going to do things for likes or clicks,

0:11:480:11:50

we're going to target the people we need to win over

0:11:500:11:52

who aren't in our natural coalition.

0:11:520:11:54

Yes. Chi, as a Labour MP, just moving on ever so slightly,

0:11:540:11:57

the digital media revolution,

0:11:570:12:01

it's a great process and platform of democratisation, isn't it?

0:12:010:12:06

Isn't it a great leveller?

0:12:060:12:08

Absolutely. And digital media...

0:12:080:12:10

Before I came into politics, I was an electrical engineer

0:12:100:12:13

for 20 years, helping build out these networks.

0:12:130:12:15

And the reason why I went into that is because anything,

0:12:150:12:18

something that gives people

0:12:180:12:20

the power to connect with other people, which enables people

0:12:200:12:23

to reach out to share, that is progress,

0:12:230:12:26

and that, I believe, is good for democracy.

0:12:260:12:29

But, like any technology, it comes with its disadvantages as well.

0:12:290:12:34

And, also, any technology,

0:12:340:12:37

those in power will try and use it to reinforce their power,

0:12:370:12:41

and as they have the most means, they're the most likely

0:12:410:12:43

to be able to use it most effectively to begin with.

0:12:430:12:46

But you don't have to, you know,

0:12:460:12:48

have lunch, schmooze Rupert Murdoch to get your message out there any more.

0:12:480:12:51

-Because there are other ways of doing it.

-Let me say this.

0:12:510:12:54

There's two really big caveats with that,

0:12:540:12:56

and the first one, and that's the reason why

0:12:560:12:58

I still do walk out on cold winter's days and knock on doors,

0:12:580:13:01

is because there are millions of people

0:13:010:13:03

in this country who still don't have broadband,

0:13:030:13:05

who still don't have any digital access, you know.

0:13:050:13:08

The state of broadband in this country is a disgrace.

0:13:080:13:12

And then, there are those who may have access,

0:13:120:13:15

but they can't afford it, it's too expensive,

0:13:150:13:17

or they haven't got the digital skills necessary to use it.

0:13:170:13:21

So, there are millions of people off, away from this debate,

0:13:210:13:25

which we really need to include, and when we have included them,

0:13:250:13:28

then we need to look at the way in which that debate and those

0:13:280:13:32

messages are being put out. And what I would say particularly about...

0:13:320:13:36

And really, it was a most effective campaign,

0:13:360:13:39

the Tory campaign in 2015, spent, I think it was...

0:13:390:13:42

-Was it ten times more than Labour on Facebook?

-Mm.

0:13:420:13:46

But the other point is that those messages came with adverts

0:13:460:13:50

down the sides, you know,

0:13:500:13:53

and when I'm reaching out to my constituents,

0:13:530:13:56

and Facebook enables me to do that and I do use that,

0:13:560:13:59

but I'm very aware that if I'm asking them about how they feel

0:13:590:14:03

about obesity or something,

0:14:030:14:05

there's going to be adverts for, you know, well-known fast food places

0:14:050:14:09

down one side and there's going to be messages which are

0:14:090:14:12

calculated on the basis of algorithms which are entirely,

0:14:120:14:16

you know, invisible, entirely opaque,

0:14:160:14:19

which we have no knowledge of.

0:14:190:14:21

So, what I would say is, we need to update...

0:14:210:14:24

The system feeds you with things that they know you will like,

0:14:240:14:28

because of your previous clicking habits,

0:14:280:14:31

just to highlight that for people. Carry on.

0:14:310:14:34

So, we need to update, you know, the regulations, basically,

0:14:340:14:38

to make sure this fantastic opportunity is fairer and is

0:14:380:14:44

accessible to people and is used to support and enable people

0:14:440:14:47

and not to feed them the same messages.

0:14:470:14:50

Ellie, you don't look happy. The word "regulations"!

0:14:500:14:52

It always gets my back up, yeah. I think that...

0:14:520:14:55

I also think we may be giving a bit too much credit to social media,

0:14:550:14:57

in that the idea that political messages put out by parties

0:14:570:15:01

simply failed because they weren't using the right social media

0:15:010:15:05

strategy is perhaps masking a bigger problem of,

0:15:050:15:07

perhaps they were putting out the wrong messages,

0:15:070:15:10

or messages that people weren't interested in.

0:15:100:15:12

You brought up the fact of adverts being alongside things on

0:15:120:15:15

Facebook, and there is this sense that people are just

0:15:150:15:18

mindlessly clicking on something and then

0:15:180:15:20

a pop-up for McDonald's comes up and they think,

0:15:200:15:22

"Oh, yeah, I want McDonald's," and it's that kind of mindless,

0:15:220:15:25

I think that's a very insulting view of the public,

0:15:250:15:27

not only how they engage with news, but the media in general...

0:15:270:15:29

That's not what I said.

0:15:290:15:31

But advertising is advertising for a reason, because it does

0:15:310:15:34

actually work, so, and that's where Facebook gets its revenues from.

0:15:340:15:38

I'm not saying people are mindless,

0:15:380:15:40

but I am saying that people are influenced by advertising,

0:15:400:15:43

otherwise you wouldn't have such big budgets on it.

0:15:430:15:45

No, absolutely, but if you're putting out a message on obesity,

0:15:450:15:48

and then you're saying that there's a link between that

0:15:480:15:50

and then a fast food advertisement being alongside that,

0:15:500:15:53

you're drawing some kind of relationship between them,

0:15:530:15:55

and then when you use the word "regulation," my alarm bells go off,

0:15:550:15:58

because I think, what do you want to regulate in that?

0:15:580:16:00

-AUDIENCE MEMBER APPLAUDS

-Everything, because...

0:16:000:16:02

I'm going to come to you in a minute,

0:16:020:16:04

because that was quite a round of applause!

0:16:040:16:05

Just this problem with the word "regulation" -

0:16:050:16:08

nothing is above the law.

0:16:080:16:09

Are you saying that the internet should be above the law?

0:16:090:16:11

I mean, there are regulations that exist now - you recognise

0:16:110:16:14

that we have a legal system.

0:16:140:16:16

Yeah, and I would disagree with some of those.

0:16:160:16:18

Yeah, but nothing's above the law. So, the law should apply.

0:16:180:16:21

Yes, but in terms of free speech on the internet, it should be absolute.

0:16:210:16:24

Well, free speech on the internet, that's going to be

0:16:240:16:26

a tasty area for us later on.

0:16:260:16:28

You burst into a rapturous solo round of applause there!

0:16:280:16:33

What... Expand on it.

0:16:330:16:36

Um, I just agree with the lady here, I just think that, um,

0:16:360:16:39

it's very easy to draw what I think is an incorrect, you know, link between

0:16:390:16:45

advertising on social media and problems that we face in society.

0:16:450:16:50

Now, Jamie, you mentioned Trump, I think, earlier on.

0:16:500:16:54

Was Trump's campaign a triumph of digital media?

0:16:540:17:00

In some ways, yes.

0:17:000:17:01

Of course, there are much bigger trends behind that.

0:17:010:17:03

You can't just arrive and use Twitter and Facebook and then

0:17:030:17:06

suddenly get yourself elected.

0:17:060:17:08

You're obviously tapping into various,

0:17:080:17:10

much deeper concerns that people have.

0:17:100:17:12

But the point I want to make about that is...

0:17:120:17:15

when you are completely overwhelmed with different sources of

0:17:150:17:19

information, this is the problem with social media -

0:17:190:17:23

we were promised in the 1990s, from the digital prophets,

0:17:230:17:27

that the politicians of the future,

0:17:270:17:29

when we were all connected and we had access to the world's

0:17:290:17:31

information, would be more informed,

0:17:310:17:33

they'd be smarter and we'd all be kinder and nicer to each other.

0:17:330:17:37

One great academic even said it would be the end of nationalism,

0:17:370:17:40

because we'd be able to link to everybody and we'd all

0:17:400:17:43

understand each other. This is ludicrous.

0:17:430:17:45

The reality is, of course,

0:17:450:17:47

that when we are overwhelmed with information,

0:17:470:17:49

we use heuristics to try to quickly figure out what we think,

0:17:490:17:52

we do tend, I think, I disagree slightly with you on this, Helen,

0:17:520:17:56

we do tend to find things that we already agree with,

0:17:560:17:59

that our friends already agree with, and we trust that more.

0:17:590:18:02
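
One way to picture the heuristic described here, as a toy model with invented weights: an item scores trust for agreeing with the reader's prior view and for being endorsed by friends, with accuracy appearing nowhere in the score.

    def trust_score(item, my_view, friends_sharing):
        # Agreement with the reader's prior view, plus friends' endorsements.
        agreement = 1.0 if item["stance"] == my_view else 0.0
        return agreement + 0.5 * friends_sharing

    careful_report = {"stance": "against_me"}
    congenial_post = {"stance": "with_me"}
    print(trust_score(careful_report, "with_me", friends_sharing=0))  # 0.0
    print(trust_score(congenial_post, "with_me", friends_sharing=4))  # 3.0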

We do that anyway.

0:18:020:18:03

We do that anyway, but we do that to such

0:18:030:18:06

a greater extent now than we ever did before,

0:18:060:18:08

and we're less aware of it.

0:18:080:18:10

Is that true, Helen?

0:18:100:18:11

Well, I think there's a tendency to present it as one thing or

0:18:110:18:15

the other. I mean, it's a very grey area, you know, between...

0:18:150:18:20

You even hear people talking about kind of hermetically sealed

0:18:200:18:23

echo chambers, and that's...

0:18:230:18:25

But people say, "Oh, the Twittersphere has gone mad!"

0:18:250:18:28

What they mean is, their own Twitter feed has gone mad.

0:18:280:18:31

Yeah, but in almost any platform,

0:18:310:18:33

you'll be able to see trending information and you're just

0:18:330:18:36

one click away from a huge array of opinions.

0:18:360:18:39

But the danger is, I think, that social media politics is

0:18:390:18:42

in many ways more angry, more aggressive, more emotive,

0:18:420:18:47

and the danger of that, I think,

0:18:470:18:49

is that it makes compromise more difficult.

0:18:490:18:52

The US Congress is more polarised than it has ever been,

0:18:520:18:55

since the war at the least, and I think that's happening in

0:18:550:18:59

a lot of different democracies, and part of the reason is the way

0:18:590:19:02

that we do now - most of us, not everyone, but most of us -

0:19:020:19:05

engage with politics.

0:19:050:19:06

That makes compromise more difficult and I think that is going

0:19:060:19:09

-to make politics much more difficult in the future.

-But I think...

0:19:090:19:12

Look, all our societies have become more polarised,

0:19:120:19:15

where people, whether it be Trump in the United States, Brexit here,

0:19:150:19:18

across Europe the rise of the populist right,

0:19:180:19:21

we've become far more divided as societies.

0:19:210:19:24

It's obviously not social media that's caused that -

0:19:240:19:26

social media is amplifying aspects of it, though,

0:19:260:19:28

because people communicate on social media in a way they would

0:19:280:19:31

often never dream of speaking to each other in real life.

0:19:310:19:34

It often encourages a pack mentality,

0:19:340:19:37

where you do get groups of people...

0:19:370:19:39

It's like the mob in the French Revolution sometimes.

0:19:390:19:41

But I don't want to get like that,

0:19:410:19:43

-because that does sound kind of "Grrr!"

-It can be "Grrr!"

0:19:430:19:45

It's good that you can democratise information...

0:19:450:19:48

But somebody says something, for example, on Twitter...

0:19:480:19:51

I was on holiday and I looked online on the phone to see what was

0:19:510:19:53

in the newspapers, and there were three stories

0:19:530:19:56

about people who had said something

0:19:560:19:58

on Twitter and had had the fury poured upon them for doing so

0:19:580:20:02

and had to delete the tweets and had to apologise.

0:20:020:20:04

So, is that really good for freedom of speech?

0:20:040:20:06

No, that's not, because you do end up with the situation now

0:20:060:20:09

where basically, people often, if they make a mistake or

0:20:090:20:12

they say something which is obviously quite inflammatory

0:20:120:20:14

or divisive, then a group on Twitter will basically coordinate and

0:20:140:20:18

relish taking that person down,

0:20:180:20:20

and they will throw everything at them,

0:20:200:20:22

and if you are in the middle of a Twitter storm -

0:20:220:20:24

Jon Ronson wrote an excellent book about this,

0:20:240:20:26

about people shamed on Twitter - it can be a terrifying experience,

0:20:260:20:30

where random strangers all over the world are suddenly screaming at you.

0:20:300:20:34

It can be quite threatening and menacing.

0:20:340:20:36

You even get a phenomenon of so-called doxxing,

0:20:360:20:38

where people will expose your personal information,

0:20:380:20:40

go through your back story, everything possible, try to

0:20:400:20:43

get you sacked by your employer, so it can be very menacing.

0:20:430:20:46

And the mainstream newspapers feed off the back of it,

0:20:460:20:48

and have stories about that very thing happening and people

0:20:480:20:51

being shamed for saying something that some of the papers

0:20:510:20:54

reporting on it actually agree with.

0:20:540:20:56

I think that's the other problem,

0:20:560:20:57

because this is the other issue, though,

0:20:570:20:59

because we look at social media as though this is a new phenomenon,

0:20:590:21:01

but there's a very long history of our very politicised media finding

0:21:010:21:05

individuals who they disagree with, hunting them down,

0:21:050:21:09

having reporters outside their front doors, going through their

0:21:090:21:12

private life, going through the private lives of people around them.

0:21:120:21:15

So, yes, we should hold social media to account,

0:21:150:21:17

but we should also hold to account our mainstream media,

0:21:170:21:20

which is responsible often for disseminating false news,

0:21:200:21:23

for targeting individuals they don't agree with and humiliating them.

0:21:230:21:26

So, yes, let's have that concern, but I think we should talk

0:21:260:21:28

about our press as well, which is often out of control itself.

0:21:280:21:31

That's something we'll skip onto in just a second.

0:21:310:21:33

Is this good for freedom of speech, then,

0:21:330:21:36

-if people feel intimidated, Jamie?

-No, I mean, there is always...

0:21:360:21:39

I think there's a danger with social media,

0:21:390:21:41

and I'm in favour of...

0:21:410:21:42

I think social media and democracy work in some ways very well

0:21:420:21:45

together, and I think the positives do outweigh the negatives.

0:21:450:21:49

There is definitely a sense, though, that a lot of people -

0:21:490:21:52

and I feel it myself sometimes - are self-censoring,

0:21:520:21:54

fearing to put something online that,

0:21:540:21:56

"Will I be attacked for that, is it the right thing to do?"

0:21:560:21:59

And just avoiding the scandal, avoiding the mob,

0:21:590:22:02

avoiding saying something that might cause offence.

0:22:020:22:05

-What did you want to say?

-Well, I'm definitely not going to say it now!

0:22:050:22:08

Although if I did, I might get some re-tweets and YouTube views

0:22:080:22:11

and all the rest of it. And that is a danger,

0:22:110:22:13

especially given that anything you post stays online forever.

0:22:130:22:18

And so, you might be called up for something you did ten years ago,

0:22:180:22:22

that is then waved in front of you and you lose your job or you

0:22:220:22:25

have people complaining about you, and that has a real chilling

0:22:250:22:28

effect, because people will be afraid of ever speaking their mind.

0:22:280:22:31

Tom, is this something you take into account when you are

0:22:310:22:34

employing people, for example?

0:22:340:22:35

Do you have to look at their Facebook history and their...

0:22:350:22:38

Absolutely, and we notice increasingly...

0:22:380:22:40

For example, if a political party wants to use

0:22:400:22:42

a real-life case study for a film or party political broadcast,

0:22:420:22:46

a poster campaign, etc, you would look into their background,

0:22:460:22:49

and you find increasingly,

0:22:490:22:50

when you look at people from a younger generation,

0:22:500:22:53

when you look at their Facebook feeds,

0:22:530:22:55

when you look at their Twitter feeds,

0:22:550:22:57

they are saying stuff that kids aged 16, 17, 18 say, which,

0:22:570:23:00

you know, when I was 16, 17, 18, I was saying stupid stuff as well.

0:23:000:23:03

Or you'll find that they will have liked something from UniLad

0:23:030:23:06

or The LAD Bible or something like that,

0:23:060:23:08

which we know that this is just part and parcel of being young and

0:23:080:23:12

being a bit stupid and making mistakes, as we've all done,

0:23:120:23:15

but taken out of context, suddenly it blows up and becomes...

0:23:150:23:18

And now it's written in stone, isn't it?

0:23:180:23:20

Well, exactly, and I remember the case...

0:23:200:23:22

I expect others on the panel will know the details more,

0:23:220:23:25

but a Youth Police and Crime Commissioner, I think she was in

0:23:250:23:28

Birmingham, who had sort of put her head above the parapet for a job...

0:23:280:23:31

-In Kent, I think, yeah.

-Yeah, maybe, yeah.

0:23:310:23:34

She had put her head above the parapet for

0:23:340:23:36

a job and then the press went over her tweets,

0:23:360:23:38

as it is their right to do,

0:23:380:23:40

and they found that she had said some stupid stuff, as we've all

0:23:400:23:43

done, and suddenly, she became the number one victim in the country...

0:23:430:23:46

number one criminal in the country,

0:23:460:23:48

they put her face all over the papers and she had to resign.

0:23:480:23:51

Now, I'm not arguing the rights or wrongs about what she said,

0:23:510:23:53

but increasingly now, the things we say when we are younger

0:23:530:23:56

are being held against us because it's a permanent record online.

0:23:560:23:59

Dan, you want to come in?

0:23:590:24:01

Well, these things are important, but I'd like to get us on to

0:24:010:24:04

fake news, which I think is a cancer in the system, I mean,

0:24:040:24:09

you know, information is the lifeblood of democracy and

0:24:090:24:12

fake news is a terrible thing...

0:24:120:24:14

But Owen's saying fake news is nothing new.

0:24:140:24:16

Well, we've seen the rise of it,

0:24:160:24:17

but we've really seen the rise of it on social media in the USA...

0:24:170:24:20

Have we not had fake news in the mainstream media?

0:24:200:24:22

You know, the Pope supports Donald Trump, Hillary Clinton is

0:24:220:24:25

part of a paedophile ring - I mean, these things are nonsense,

0:24:250:24:27

but they are very serious nonsense...

0:24:270:24:29

But clearly nonsense, aren't they?

0:24:290:24:31

Well, the thing is, I think that what we...

0:24:310:24:33

We've got an election coming up in three years' time,

0:24:330:24:35

there is some evidence to suggest it's had an impact on

0:24:350:24:38

the democratic process in America, I think we need to find ways of

0:24:380:24:42

stopping that tide reaching these shores over the next three years.

0:24:420:24:46

I personally...

0:24:460:24:48

Although I think that social media and the internet are

0:24:480:24:50

fundamentally good for democracy, they disseminate information

0:24:500:24:53

and encourage participation, fake news needs to be stamped out.

0:24:530:24:57

I think the responsibility for that lies with social media companies.

0:24:570:25:01

But what's the difference between...

0:25:010:25:02

I don't think they are being responsible enough about it

0:25:020:25:05

and I think they need to be called out on that.

0:25:050:25:06

OK, are they tech platforms or publishers?

0:25:060:25:09

Well, the clue's in the title, they are social media platforms,

0:25:090:25:11

and I don't know why they are saying they are not media companies.

0:25:110:25:14

On this fake news business,

0:25:140:25:15

some of these fake news stories are so blatantly ludicrous,

0:25:150:25:18

-surely most people must think, "Oh, for goodness' sake!"

-But we saw...

0:25:180:25:22

And which is more insidious,

0:25:220:25:23

that or some of the misreporting and misrepresentation and

0:25:230:25:27

distortions that we've seen over the years from the mainstream media?

0:25:270:25:30

Which is more dangerous?

0:25:300:25:32

I think they are both dangerous.

0:25:320:25:33

There is a big, big difference between those two worlds,

0:25:330:25:36

though, which is that particularly for television,

0:25:360:25:38

it's a much more regulated world, there are rules, there is

0:25:380:25:41

a third-party independent regulator who can investigate complaints.

0:25:410:25:46

Social media is the Wild West,

0:25:460:25:48

it is absolutely the Wild West, and that's a problem.

0:25:480:25:52

Look, back in 1984, during the miners' strike,

0:25:520:25:55

a very divisive moment in British history,

0:25:550:25:57

there was the infamous Battle of Orgreave,

0:25:570:25:59

an infamous battle in the whole struggle,

0:25:590:26:02

and the BBC, I'm afraid to say, at the time,

0:26:020:26:05

they put the tape in a wrong direction, because originally,

0:26:050:26:07

as Liberty said, the human rights organisation,

0:26:070:26:10

it was a police riot, the police had to pay compensation,

0:26:100:26:12

and they put it in the wrong direction, the BBC -

0:26:120:26:15

yet to apologise for it, incidentally -

0:26:150:26:17

and made it look like the miners had attacked the police.

0:26:170:26:20

The other example, infamously, is Hillsborough,

0:26:200:26:23

where the Sun newspaper spread false news, fake news,

0:26:230:26:27

about Liverpool fans, with absolutely horrendous consequences.

0:26:270:26:31

So, we should take on fake news on the new media,

0:26:310:26:34

but also on the old media as well.

0:26:340:26:36

But this happens every day on Twitter and digital media,

0:26:360:26:38

-fake news...

-But I think, often...

0:26:380:26:40

Well, look, whether it be about immigrants, Muslims,

0:26:400:26:43

unemployed people, benefit fraud,

0:26:430:26:45

I think all of these are issues which are either exaggerated

0:26:450:26:47

or distorted, often, as well, by the mainstream press,

0:26:470:26:50

which play into people's prejudices, where you end up, for example,

0:26:500:26:53

with disabled people who have been, as disabled charities have said,

0:26:530:26:56

abused on the streets as a consequence.

0:26:560:26:58

That's because of the mainstream press as well.

0:26:580:27:01

But you can complain to somebody, as Dan says - with the so-called

0:27:010:27:04

mainstream media, the MSM, you can complain.

0:27:040:27:07

Theoretically, there's redress, isn't there?

0:27:070:27:10

In theory, but I'm afraid to say, often the way the media operates

0:27:100:27:13

in this country is that they often behave, still, with impunity.

0:27:130:27:16

You'll get corrections on page 22 in a tiny little box.

0:27:160:27:19

So, yes, obviously, I'm not saying we shouldn't take it seriously,

0:27:190:27:23

absolutely we should do that,

0:27:230:27:25

false news, which is deliberately fake in order to play into

0:27:250:27:28

the prejudices of very angry sections of the population,

0:27:280:27:31

but I just don't think there is enough anger about the...

0:27:310:27:33

Andrew, I'm coming to you, don't worry.

0:27:330:27:35

Because Andrew has a lot to say on this and knows a lot about it.

0:27:350:27:38

But Will is a fact-checker. You must be a very busy man at the moment!

0:27:380:27:42

-It's been a tough year!

-LAUGHTER

0:27:420:27:45

We spend our time not only checking what people are saying,

0:27:450:27:48

be they politicians, be they journalists,

0:27:480:27:50

be they lots of other people in public life,

0:27:500:27:52

but also asking them to go and correct the record.

0:27:520:27:54

But it's not just mischievous, this stuff, it's malicious, isn't it?

0:27:540:27:57

It's deliberate and it's malicious.

0:27:570:27:58

It's not for me to judge that,

0:27:580:27:59

it's for everyone in this room to judge that.

0:27:590:28:01

If we are concerned about the effect of information on democracy,

0:28:010:28:04

we've got to remember that we are democracy,

0:28:040:28:07

it is all of us as voters who decide whether digital media is

0:28:070:28:10

going to be good for us or bad for us.

0:28:100:28:12

But I think the thing we've got to look at is,

0:28:120:28:15

who has power and are they using it fairly?

0:28:150:28:17

Are people who are in powerful positions using information

0:28:170:28:21

to mislead us, and if so, what are we going to do about it?

0:28:210:28:24

And Owen is right, there are lots of examples of mainstream media

0:28:240:28:28

outlets publishing things that are clearly not true.

0:28:280:28:31

-In our experience...

-And getting clobbered for it, eventually.

-Sorry?

0:28:310:28:34

-And getting clobbered for it, eventually.

-Well, clobbered is...

0:28:340:28:37

This stuff is round the world in 80 clicks, isn't it?

0:28:370:28:40

Clobbered is putting it far too strongly.

0:28:400:28:42

Very few mainstream media outlets have any kind of system where

0:28:420:28:45

people get "clobbered" for getting things wrong.

0:28:450:28:48

That's simply not true in television.

0:28:480:28:50

There is a very clear set of rules,

0:28:500:28:52

there's a third-party independent regulator who polices that,

0:28:520:28:55

you can complain, things are investigated, and ultimately,

0:28:550:28:58

if you are a repeat offender, you have your licence revoked.

0:28:580:29:01

Well, A, when did that last happen in a major media programme?

0:29:010:29:04

Well, but I think that some of the...

0:29:040:29:06

Secondly, television is, as you said...

0:29:060:29:07

Some of the news channels can really sail close to the wind,

0:29:070:29:10

and they get called on this stuff by the regulator,

0:29:100:29:13

and they have to amend their behaviour, and they do.

0:29:130:29:16

Television, as you said, is the most regulated,

0:29:160:29:18

and it's absolutely right that they have the strongest kind of

0:29:180:29:21

set of rules around accuracy and the greatest caution.

0:29:210:29:23

It is nonetheless the case that at least one of them

0:29:230:29:25

has the informal slogan of "never wrong for long" - we'll get

0:29:250:29:28

it on air as quickly as possible and we'll correct it if we need to.

0:29:280:29:31

Television and radio have very strong rules.

0:29:310:29:34

Newspapers, which are highly partisan, don't have the same rules.

0:29:340:29:37

They have a whole range of things.

0:29:370:29:39

Some of them will correct things very quickly and very fairly when

0:29:390:29:42

we ask them to, some of them will simply ignore a corrections request.

0:29:420:29:45

So there is a whole range of stuff going on.

0:29:450:29:48

-How long does it take to check a fact?

-Usually, about a day.

-A day?

0:29:480:29:52

To turn around a really solid fact check where you've looked at

0:29:520:29:55

all the sources, you've talked to experts where you need to...

0:29:550:29:57

Well, as I say, it's been around the world about twice

0:29:570:30:00

on digital media by that time.

0:30:000:30:01

Exactly, so the thing that really matters isn't actually

0:30:010:30:05

whether media outlets get things wrong once, it's how often

0:30:050:30:08

things get repeated, because that's what's really powerful.

0:30:080:30:11

So if they do correct things when we point it out,

0:30:110:30:14

or when someone else points it out, that can do a lot of good,

0:30:140:30:16

but not all of them do it consistently.

0:30:160:30:18

Let's talk about social media for a minute,

0:30:180:30:21

because there are kind of two things going on with social media.

0:30:210:30:24

One is that all of us can share whatever we want

0:30:240:30:27

with whoever we want, and that's amazing.

0:30:270:30:29

That's democracy writ large.

0:30:290:30:30

And if you don't like the consequences,

0:30:300:30:32

then change your voters, i.e. it's up to all of us

0:30:320:30:35

to do a better job as voters.

0:30:350:30:37

The other bit is, people in positions of power,

0:30:370:30:40

be they the social media companies or be they the advertisers

0:30:400:30:44

on social media, using that platform,

0:30:440:30:46

either fairly to the users or unfairly.

0:30:460:30:49

Now, the thing we don't know,

0:30:490:30:51

when the political parties spend hundreds of thousands, millions,

0:30:510:30:55

I don't know, advertising, is what they are saying and to whom.

0:30:550:30:59

It has never, ever been possible before to reach millions

0:30:590:31:03

of people in the country and have the other millions

0:31:030:31:06

of people in the country not know.

0:31:060:31:08

You couldn't take out a newspaper advert

0:31:080:31:09

and have the rest of us not see it.

0:31:090:31:11

So now we have a whole set of political advertising

0:31:110:31:14

that is completely unscrutinised,

0:31:140:31:16

and that's the thing we should be a bit worried about.

0:31:160:31:18
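
A toy sketch of the scrutiny gap being described, with invented segments and messages: each user is served only the ad aimed at their own segment, so, unlike a newspaper advert, there is no single public copy for everyone else to see.

    ads = {
        "young_renters":  "Message A, about housing",
        "retired_owners": "Message B, about pensions",
    }

    def ads_visible_to(segment):
        # Each user sees only the ad aimed at their own segment.
        return [ads[segment]]

    print(ads_visible_to("young_renters"))  # ['Message A, about housing']
    # Auditing the campaign would mean collecting what was shown to
    # every segment, one user at a time; no shared record exists.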

Well, Dan's agreeing with you.

0:31:180:31:19

Social media is invisible for the rest of us.

0:31:190:31:22

I think the fact that it's unscrutinised is basically

0:31:220:31:24

-Dan's point. Dr Andrew Calcutt...

-Can I just make a point about...?

0:31:240:31:28

-You can in a minute.

-Just about trust.

-I'll come back to trust.

0:31:280:31:31

-Because the rules...

-TRUST me, I will come back to it.

0:31:310:31:34

Or I'll apologise.

0:31:340:31:37

But Andrew hasn't been in it, and I know, Laura,

0:31:370:31:39

I've not come to you yet.

0:31:390:31:41

Everyone's had a say and will have far more of a say, but, Andrew,

0:31:410:31:45

journalism, humanities and creative industries, that's your thing,

0:31:450:31:48

as Dr Andrew Calcutt.

0:31:480:31:51

We're on false news, fake news now,

0:31:510:31:54

but in this cacophony of calumny and falsehood, is the truth dying?

0:31:540:32:01

Well, I think the idea that democracy is threatened

0:32:010:32:06

and truth is destroyed by a bunch of spotty youths from Macedonia...

0:32:060:32:11

NICKY LAUGHS

0:32:110:32:13

..it's just ludicrous, and equally risible is the CIA,

0:32:130:32:16

of all people, complaining about state agencies interfering

0:32:160:32:20

in the business of other sovereign nations, you couldn't make it up.

0:32:200:32:25

APPLAUSE

0:32:250:32:26

I think though that...

0:32:260:32:28

I think some of my fellow panellists are trying to do the jigsaw

0:32:280:32:32

by just the first couple of pieces that come to hand.

0:32:320:32:36

And I think, really, if we all take a step back

0:32:360:32:40

and look at that question, it's kind of too early to tell...

0:32:400:32:43

whether social media and digital media are good for democracy,

0:32:430:32:48

cos we're not in one, we're not in a democracy,

0:32:480:32:51

we're in an era of something like zombie politics.

0:32:510:32:54

So, if I may, it's a little bit like, was it Deng Xiaoping?

0:32:540:32:56

When asked about the French Revolution.

0:32:560:32:58

Yeah, it's too early to tell.

0:32:580:32:59

And he said, when asked if it worked, "It's too early to tell."

0:32:590:33:02

I don't know if it was Deng, but it's a little bit like that.

0:33:020:33:05

We're in an era of zombie politics where the majority of

0:33:050:33:09

the population is notable by its absence.

0:33:090:33:13

And professional politicians have been like

0:33:130:33:17

a kind of medicine show,

0:33:170:33:19

spooning out what the majority population is supposed

0:33:190:33:24

to accept, and you're meant to just take your medicine,

0:33:240:33:27

and if you don't take it, you're decried as being a deplorable.

0:33:270:33:32

And guess what?

0:33:320:33:35

In the last year, major sections of that population have said,

0:33:350:33:38

"We're not going to be the studio audience any more,

0:33:380:33:41

"we're going to walk out on that scenario."

0:33:410:33:44

And so, with all due respect, a lot of people sitting around

0:33:440:33:48

the inner circle are indeed IN the inner circle.

0:33:480:33:50

-They're in the bubble that is...

-We are the problem.

0:33:500:33:54

..that is rather far-removed from where most people are,

0:33:540:33:58

and most people are not in any substantial sense

0:33:580:34:02

-engaged in politics.

-But if there's a lack of trust.

0:34:020:34:05

The idea that these kinds of interpersonal details amount

0:34:050:34:09

to political polarisation is just ludicrous.

0:34:090:34:12

Put your hand up in the audience if you want to say something

0:34:120:34:14

on this, and I'll get round to you.

0:34:140:34:16

But if you say that people are disillusioned with politics

0:34:160:34:19

because of a lack of trust, we are now playing around

0:34:190:34:22

with this platform which has stuff on it that we cannot trust.

0:34:220:34:27

In there is a paradox somewhere, Dan, isn't there?

0:34:270:34:29

Well, yeah, I'm particularly interested in this question

0:34:290:34:32

of rules and regulations,

0:34:320:34:34

because the most regulated part of the media is television.

0:34:340:34:40

It is, and we've got this diverse public service system in the UK,

0:34:400:34:44

and that is policed by Ofcom, and lo and behold, people's biggest source

0:34:440:34:49

of news in the UK is television, and it's also the most trusted.

0:34:490:34:54

Social media at the moment

0:34:540:34:57

is much, much lower, both in terms of the amount of people

0:34:570:35:00

that do access it for news, but also for levels of trust.

0:35:000:35:03

-The problem is...

-WILL:

-Same is true of newspapers, by the way.

0:35:030:35:06

The problem is you've got this issue of fake news,

0:35:060:35:10

and we have to stamp it out before it gets bigger.

0:35:100:35:12

-Too late.

-It's not.

0:35:120:35:14

It's not too late.

0:35:140:35:15

But there's no such thing as fake news and not fake news

0:35:150:35:18

where it's clear to determine which is which.

0:35:180:35:20

It's such a big grey area between one and the other.

0:35:200:35:24

-There may be, but...

-Stamping it out is censorship.

0:35:240:35:28

-DR ANDREW:

-Aren't our lives micromanaged enough already?

0:35:280:35:31

Do you have to regulate everything?

0:35:310:35:33

We have to bring up the elephant in the room here,

0:35:330:35:35

which is that what no-one wants to admit is that the idea of

0:35:350:35:38

the fear of fake news is the fear of who is reading the fake news

0:35:380:35:41

and what will they do with that information? And, essentially,

0:35:410:35:44

when you say there is a danger of people being influenced

0:35:440:35:47

by fake news negatively,

0:35:470:35:48

you're making a differentiation between you,

0:35:480:35:50

who can understand that the news is fake,

0:35:500:35:52

and those thickos out there who can't, and that's a deeply...

0:35:520:35:55

-DAN:

-No, I don't...

-Hang on.

-Don't think that at all.

0:35:550:35:58

If you think fake news should be stamped out, who do you want

0:35:580:36:01

to stamp it out and why should they be the arbiters of truth?

0:36:010:36:03

-And how do you stamp it out?

-Exactly.

0:36:030:36:05

Well, I think that's complicated.

0:36:050:36:07

That's the easy answer to a really uncomfortable question.

0:36:070:36:10

The horse has bolted.

0:36:100:36:12

But you have different systems within different media,

0:36:120:36:15

and television is heavily regulated,

0:36:150:36:17

it didn't stop television holding power to account,

0:36:170:36:20

generally being extremely accurate, being diverse and being popular.

0:36:200:36:24

-It doesn't stop that.

-But can I ask you why you're worried about...?

0:36:240:36:26

Why are you so worried that people...?

0:36:260:36:28

And I don't know who you're talking about when you're

0:36:280:36:30

talking about people that don't understand that wild story

0:36:300:36:33

about Ed Miliband fixing relationships

0:36:330:36:35

with Hillary Clinton is fake.

0:36:350:36:37

Well, BuzzFeed's research showed that there was

0:36:370:36:41

more consumption of fake news during the US election than true news.

0:36:410:36:46

-Entertainment. It's people having a laugh.

-That's a problem.

0:36:460:36:50

Can I ask a question? Just a hypothetical.

0:36:500:36:52

If something completely fake about you went all over the world,

0:36:520:36:56

like someone claimed you were the head of a drugs cartel,

0:36:560:36:58

something obviously ridiculous,

0:36:580:37:00

that went all around the world, they don't know you from Adam,

0:37:000:37:03

and your picture is there, calling you the head of a drug cartel,

0:37:030:37:06

you'd be like, "No correction needed, free speech,

0:37:060:37:08

"they can say what they want"?

0:37:080:37:09

Would you be comfortable with that, genuinely?

0:37:090:37:11

I suppose I would publicly say that that isn't the case.

0:37:110:37:14

Wouldn't that put you at risk?

0:37:140:37:16

The cost of having a free society is people being able to say whatever

0:37:160:37:19

they want, and sometimes that being untrue, sometimes that being nasty.

0:37:190:37:22

-But you wouldn't want it stamped out?

-That's the cost of a free society.

0:37:220:37:24

I'm going to take that thought,

0:37:240:37:26

people being able to say whatever they want,

0:37:260:37:28

and we're going to discuss it in the context of digital media,

0:37:280:37:30

but first of all, as promised, what would you like to say?

0:37:300:37:33

Thank you.

0:37:330:37:34

What I wanted to say was that this particular issue about news,

0:37:340:37:39

where it's coming from,

0:37:390:37:40

it's not the responsibility of one side, it's the responsibility

0:37:400:37:43

of the receivers, us, general public, as well,

0:37:430:37:46

to find out where is it coming from, if we can't get right to the bottom

0:37:460:37:50

of it, at least do a little bit of research

0:37:500:37:52

or investigation of our own.

0:37:520:37:54

Nowadays, it's not only TV, as somebody said,

0:37:540:37:57

it's mostly free newspapers

0:37:570:37:58

on the train, that's the time we have to just listen and digest.

0:37:580:38:02

Now, just, even if we take simple rules, golden rules...

0:38:020:38:06

This particular rule I'm saying is coming from a Muslim community.

0:38:060:38:10

It's a saying of Prophet Muhammad, a basic saying, a Hadith,

0:38:100:38:13

that for a person to be a liar

0:38:130:38:15

is enough that he passes on whatever he hears

0:38:150:38:18

without making any effort to check.

0:38:180:38:21

So, if everybody takes just this simple, basic rule,

0:38:210:38:24

we'll avoid so much of this spreading around of

0:38:240:38:27

news that we heard on WhatsApp or Facebook...

0:38:270:38:30

-Rumours, gossip.

-That's it, yeah.

0:38:300:38:32

The Hadith are very strong against rumours and backbiting.

0:38:320:38:35

So, if everybody sticks to that rule, so much will be relieved.

0:38:350:38:39

-That's actually a really smart point.

-It's the best point so far.

0:38:390:38:43

-LAUGHTER

-As a fact-checker,

0:38:430:38:45

it's lovely to think that we can use fact-checking to empower everybody.

0:38:450:38:49

The thing you've got to ask,

0:38:490:38:51

the flipside of the question of who's the person who can

0:38:510:38:55

split the difference between all those stupid people

0:38:550:38:57

who believe the fake news and the rest of us is that, actually, none of us

0:38:570:39:01

can just skim through a set of headlines

0:39:010:39:03

and know which is true or not.

0:39:030:39:04

It's immensely difficult to spot people who are

0:39:040:39:07

trying to mislead you and people who aren't.

0:39:070:39:10

Without the thrills and spills and the excitement of,

0:39:100:39:13

"Ooh, what if that's true? That's great."

0:39:130:39:16

-Exactly.

-It's a completely different set of personal standards we apply

0:39:160:39:19

to that than we do to watching TV news or radio news, but you can...

0:39:190:39:22

Never mind splitting the difference,

0:39:220:39:24

here's somebody who made a difference.

0:39:240:39:27

Laura, your campaign, the tampon tax,

0:39:270:39:30

-that was through social media, wasn't it?

-Yep.

0:39:300:39:33

I think one of the main really good points about social media is

0:39:330:39:36

that it really gives power to under-represented communities

0:39:360:39:39

of people, like women, for example.

0:39:390:39:41

So, tampon tax has existed since 1973 and people have campaigned

0:39:410:39:45

for generations to end it, but the reason why this campaign

0:39:450:39:49

finally won - at least I think, anyway -

0:39:490:39:51

is because we've had this platform of social media

0:39:510:39:54

that we've never had before, and upon that platform we can finally,

0:39:540:39:57

really be heard, and politicians can't ignore us any more.

0:39:570:40:00

So, I think that's one really good and important point

0:40:000:40:04

of social media, that individuals can really use it,

0:40:040:40:06

rather than political parties or even news outlets.

0:40:060:40:09

It gives the individual back power,

0:40:090:40:11

-and then communities that are under-represented.

-Yeah.

0:40:110:40:14

There's loads of examples, things I'm interested in,

0:40:140:40:17

the whole pressure on SeaWorld in Florida...

0:40:170:40:19

The Black Lives Matter campaign.

0:40:190:40:21

The Black Lives Matter campaign, your campaign,

0:40:210:40:24

but it's not just a click, is it?

0:40:240:40:25

Cos it's what you were saying earlier on, Helen,

0:40:250:40:27

it's a million clicks.

0:40:270:40:29

Exactly, the clicks scale up to something really significant.

0:40:290:40:33

-Also, people don't just click and do nothing else...

-They're engaged.

0:40:330:40:37

Exactly, people don't just scroll through, like Change.org,

0:40:370:40:39

for example, and click on every petition.

0:40:390:40:42

I get people e-mailing me every day with little spelling errors

0:40:420:40:46

I've made that I've not even seen.

0:40:460:40:47

So they definitely do read through your campaigns.

0:40:470:40:50

We also have lots of communities throughout the world really

0:40:500:40:53

that have bound together and made a difference within

0:40:530:40:56

their community, whether that be through tampon tax or

0:40:560:40:59

the new campaign about homeless people

0:40:590:41:00

and improving sanitary provision to homeless women.

0:41:000:41:03

Lots of people do stuff physically that's inspired by...

0:41:030:41:06

Did you get any abuse and vile trolling?

0:41:060:41:09

-As a result of your campaign.

-I did.

0:41:090:41:12

Especially as a woman, or a feminist campaigner online,

0:41:120:41:15

you're just open to that, which is really sad,

0:41:150:41:18

but I found that online trolling is actually

0:41:180:41:20

a really important form of sexism that I've incurred anyway,

0:41:200:41:23

in that it's really obviously documented and it's evidence

0:41:230:41:26

-to show that sexism really does exist.

-Shocking.

0:41:260:41:29

Yeah, it shocks people, but women face sexism on an everyday basis.

0:41:290:41:33

-Not like that.

-Lots of people basically say,

0:41:330:41:35

"Oh, sexism doesn't exist,

0:41:350:41:37

"we have never experienced it as men, so therefore it doesn't..."

0:41:370:41:40

-But some of that stuff is psychotic, it's not just...

-Yeah, but I mean,

0:41:400:41:43

it's just evidence of feelings that are evident throughout society,

0:41:430:41:48

and if that comes out online that's, in a way, a good thing,

0:41:480:41:50

cos it shows that sexism really does exist and we need to tackle it.

0:41:500:41:54

It's lifted up the stone.

0:41:540:41:55

What about so many female MPs, from all sides of the chamber,

0:41:550:41:59

of the political divide, have had some dreadful, dreadful stuff?

0:41:590:42:05

-Absolutely vile, and...

-What's that about?

-I think it's about...

0:42:050:42:10

It's the power of the network you're talking about, with the tampon tax,

0:42:100:42:14

and networks before were old boys' networks,

0:42:140:42:16

they were for the powerful,

0:42:160:42:18

whereas now we can have networks for everyone to come together.

0:42:180:42:21

But as well as that... what the internet does

0:42:210:42:25

really well is amplify what is out there in the debate.

0:42:250:42:31

It attracts those who are the most vocal and the most aggressive

0:42:310:42:35

to take part in a debate, and what I really dislike is this idea

0:42:350:42:39

that if you respond to that you are feeding the trolls.

0:42:390:42:42

I think that women need to put their voices out there and we need

0:42:420:42:48

to take back that space, we need to take back the internet space,

0:42:480:42:53

and what I would say, again,

0:42:530:42:55

that's why we need a more active government.

0:42:550:43:00

It's too early to tell, I agree, the impact of digital media

0:43:000:43:05

on democracy, but we need to decide what that impact is.

0:43:050:43:08

More active government, what do you mean?

0:43:080:43:10

We need a government that's looking at the issues,

0:43:100:43:15

protecting free speech, but also

0:43:150:43:17

against hate speech online, for example.

0:43:170:43:19

But if you were to meet one of these people, and very often,

0:43:190:43:22

Owen did, they're not very impressive human beings.

0:43:220:43:26

They're pretty sad individuals, but they become

0:43:260:43:28

these keyboard warriors, and as Owen said earlier on,

0:43:280:43:31

they say things to you that they would never say to your face.

0:43:310:43:35

That's not representative, it's just brought out a sense of...

0:43:350:43:38

It's empowered some rather pathetic people.

0:43:380:43:41

Well, it's empowered, if you like, the wrong people, but there

0:43:410:43:44

needs to also be sanctions, and this is the point about...

0:43:440:43:49

It's empowered the wrong people.

0:43:490:43:51

It's empowered the wrong people when they're attacking and trolling.

0:43:510:43:55

And there needs to be sanctions, and I'm very interested in

0:43:550:43:57

the idea that you don't believe there should be sanctions

0:43:570:44:01

online for the sort of behaviour which, in the street,

0:44:010:44:05

if somebody attacks me in the street in the way that they do online,

0:44:050:44:08

there would be sanctions.

0:44:080:44:09

What I'm saying is that regulation that applies in real life

0:44:090:44:13

also applies online.

0:44:130:44:14

I don't think there should be any sanctions on speech

0:44:140:44:17

-in real life or anywhere else.

-It's empowering the wrong people.

0:44:170:44:19

The phrase "empowering the wrong people"

0:44:190:44:21

is so terrifying to hear from an elected MP,

0:44:210:44:23

that is just shocking to me,

0:44:230:44:25

because what this is really masking - and the question about

0:44:250:44:28

women is so interesting, because, so often, censorship online

0:44:280:44:32

is couched in terms of protecting women, protecting minorities,

0:44:320:44:34

and what that basically is saying is that women cannot handle

0:44:340:44:38

really sometimes quite awful abuse online, as well as men,

0:44:380:44:42

to me that's deeply...

0:44:420:44:43

Do you think it stops women putting their head above the parapet,

0:44:430:44:46

wanting to stand for parliament, for example?

0:44:460:44:49

Hang on, if you're a public figure, especially if you're an MP,

0:44:490:44:52

you should be able to accept that you're talking to the public

0:44:520:44:55

and the public should be allowed to talk to you.

0:44:550:44:57

Sometimes that's particularly nasty, actually,

0:44:570:44:59

the majority of so-called sexist abuse that MPs get -

0:44:590:45:02

and I'm thinking about certain MPs who are very vocal about it -

0:45:020:45:05

Jess Phillips, Yvette Cooper - is actually just criticism of them.

0:45:050:45:08

But hang on, if someone came to your constituency surgery and said,

0:45:080:45:11

"I think you should be raped",

0:45:110:45:13

you wouldn't say, "Hang on a minute, I'll be with you in just a second."

0:45:130:45:16

-Is that what you're saying?

-No, no, that's...

0:45:160:45:19

It's fine that they're empowered, is it?

0:45:190:45:21

Death threats and things like that are clearly illegal

0:45:210:45:24

and should be dealt with through law, but what we're really talking

0:45:240:45:27

-about now is a broader question of what is sexist...

-No, we're not!

0:45:270:45:30

..and actually what is talked about is stuff like...

0:45:300:45:33

And actually I've written on this a lot - what is sexist trolling?

0:45:330:45:37

And it's often about calling women fat

0:45:370:45:39

or saying horrible things about women.

0:45:390:45:41

And that is seen - especially with the tampon tax - that is seen...

0:45:410:45:45

Jamie, where are you on this?

0:45:450:45:47

Just to say, just finally,

0:45:470:45:49

just to say that women are more subject to online abuse

0:45:490:45:54

and that there should be sanctions to protect women is deeply

0:45:540:45:56

more sexist than anything an online control can...

0:45:560:45:58

How do we stop this? Should we stop it?

0:45:580:46:00

No, we can't stop it, and we shouldn't stop all types of trolling,

0:46:000:46:03

and we shouldn't stop all types of offensive language or behaviour,

0:46:030:46:07

because when we live in a society where people are slightly worried

0:46:070:46:11

about self-censoring, not being able to speak their mind, actually,

0:46:110:46:15

sometimes internet trolls play quite a valuable role.

0:46:150:46:18

They push at the boundaries of offensiveness and make

0:46:180:46:21

other people feel like they might be able to speak their mind, too.

0:46:210:46:23

Yes, of course there are limits to that,

0:46:230:46:26

where it gets particularly targeted or where it's

0:46:260:46:28

a threat against an individual, but there's always a danger,

0:46:280:46:31

I think, that with internet trolls, we just say,

0:46:310:46:33

"They're terrible, we've got to silence them all."

0:46:330:46:35

That would actually be quite detrimental

0:46:350:46:37

for the health of our democracy.

0:46:370:46:39

-Owen, do they have their uses?

-What, trolls?

-Yeah.

0:46:390:46:41

Well, we conflate, I think, trolls and abuse.

0:46:410:46:43

I get trolled all the time - I look like a 12-year-old,

0:46:430:46:46

some age better than others, what can you do?

0:46:460:46:48

LAUGHTER

0:46:480:46:49

You wrote an article, if I may say, it was provocative,

0:46:490:46:52

as most of your articles can be,

0:46:520:46:54

but it was pretty thoughtful and well-argued about the fact that

0:46:540:46:59

having Jeremy Corbyn as leader might not be the best idea in the world,

0:46:590:47:04

and it was interesting, and I watched it happen,

0:47:040:47:06

it unfolded in front of me like a fight in the school yard.

0:47:060:47:09

You weren't behaving in that manner, but other people were.

0:47:090:47:12

The abuse you got was extraordinary.

0:47:120:47:14

Well, I am a right-wing sell-out, as my mum would tell me.

0:47:140:47:17

-No, I'm joking.

-There's not much room for civilised debate sometimes.

0:47:170:47:20

No, but I would say this actually, because, yes,

0:47:200:47:22

I've had that kind of attack and it seems ridiculous,

0:47:220:47:25

we've had a lot of focus on that,

0:47:250:47:28

but the vast majority of abuse I get is from the far right

0:47:280:47:31

and the hard right, and that is stuff like threats of violence,

0:47:310:47:34

threats of death, sharing my home address,

0:47:340:47:37

sharing pictures from Google Street View of my bedroom

0:47:370:47:40

and my front door, with arrows pointing at them saying,

0:47:400:47:42

"Here's his bedroom, here's his door."

0:47:420:47:45

Now, if you're taking an absolute view on free speech, which is that people

0:47:450:47:49

have the right to say I should be chopped up because I'm gay,

0:47:490:47:52

that people online should... I don't take it seriously, I should say.

0:47:520:47:56

Some of my friends think I should take it

0:47:560:47:57

far more seriously than I do, but nonetheless,

0:47:570:47:59

in the interests of free speech, people can basically issue a hate...

0:47:590:48:03

So, what do we do about it?

0:48:030:48:05

-Well, it should be.

-Over to you...

0:48:050:48:07

In the real world, if someone says to you, "You should..."

0:48:070:48:10

It's not the real world.

0:48:100:48:11

It is the real world online, of course it is,

0:48:110:48:13

and I have to say,

0:48:130:48:15

after the murder of Jo Cox by a far-right terrorist,

0:48:150:48:18

we should take far more seriously the sort of hate speech

0:48:180:48:21

we see online, and it often legitimises the sorts of extremism

0:48:210:48:23

-that can end in violence.

-There you are,

0:48:230:48:25

that's where that ends up, Ella.

0:48:250:48:26

No, I don't think it necessarily ends up

0:48:260:48:28

in the tragic death of Jo Cox,

0:48:280:48:30

which was not through a social media hate campaign,

0:48:300:48:33

that was not what happened.

0:48:330:48:34

-That's not what I said.

-But the thing is,

0:48:340:48:38

the left used to be about protecting and arguing for

0:48:380:48:41

people's freedoms, and right now,

0:48:410:48:42

a lot of the trolling online comes from, yes, right-wing saddos

0:48:420:48:48

who are keyboard warriors and they really wouldn't threaten a mouse,

0:48:480:48:50

let alone women or...

0:48:500:48:52

There's left-wing saddos, as well, for the sake of BBC impartiality.

0:48:520:48:57

It comes from both sides,

0:48:570:48:59

but on the left we used to argue for freedom, and absolute freedom,

0:48:590:49:03

and the freedom for somebody to be able to express themselves

0:49:030:49:06

in any way, and that's actually now the biggest argument

0:49:060:49:10

against free speech,

0:49:100:49:12

which is an absolute right in a free society,

0:49:120:49:14

if you don't have free speech,

0:49:140:49:15

you don't have any kind of freedom,

0:49:150:49:17

cos it's the fundamental right to say what you believe,

0:49:170:49:20

and the left is now trying to close that down through sanctions...

0:49:200:49:23

-Andrew, wait a minute.

-..through sanctions and censorship and rules.

0:49:230:49:26

Dr Andrew Calcutt, is this just a necessary

0:49:260:49:32

or unavoidable consequence of the digital media platforms?

0:49:320:49:35

Is it just a manifestation of aspects of the human condition,

0:49:350:49:40

or is it something that we should stamp on?

0:49:400:49:42

Oh, I certainly don't think we should be stamping

0:49:420:49:46

and regulating things out of existence, not by any means.

0:49:460:49:50

I think it would just be ludicrous to try and close it down,

0:49:500:49:55

and sanitise it, and outlaw the deplorables.

0:49:550:50:01

It's the same kind of Hillary Clinton discourse all over again.

0:50:010:50:06

You shouldn't be allowed to say that "we don't want you",

0:50:060:50:09

even in the echo chamber.

0:50:090:50:11

But what about, this is something, we'll come back to Dan,

0:50:110:50:13

cos Dan mentioned this earlier on,

0:50:130:50:15

and I think somebody else did, too - seems an awful long time ago -

0:50:150:50:17

Facebook and Twitter, are they tech platforms or are they publishers?

0:50:170:50:22

Well, they're something like a hybrid in between the two.

0:50:220:50:26

But I don't think they are either the problem or the solution.

0:50:260:50:31

I think we've lived through decades of intense interpersonalisation,

0:50:310:50:36

privatisation of everyday life,

0:50:360:50:40

mixed in with a kind of post-political managerial elite

0:50:400:50:44

that wants to micromanage every aspect of our existence,

0:50:440:50:48

including what we do online...

0:50:480:50:50

I don't think anyone's saying that.

0:50:500:50:53

-Forgive me...

-Dan, come back on that.

0:50:530:50:56

I don't think anyone's saying, I would imagine...

0:50:560:50:59

Well, of course, you're not going to say it.

0:50:590:51:01

I would imagine everyone on this panel...

0:51:010:51:03

Because I'm revealing what you don't want to say.

0:51:030:51:06

..wants a society of free speech in this country.

0:51:060:51:08

Yeah, well, they all say that, come on.

0:51:080:51:11

But a total and utter free-for-all ignores the fact that

0:51:110:51:15

there are other aspects of existence that also have importance.

0:51:150:51:18

We're not talking about micromanaging that out of existence,

0:51:180:51:21

we're just talking about creating an environment

0:51:210:51:25

where there's some basic decency.

0:51:250:51:26

Creating an environment means managing it.

0:51:260:51:28

But we already have that! We already have that in our media.

0:51:280:51:31

One of the best things about the original incarnation of

0:51:310:51:33

the internet was the Electronic Frontier Foundation,

0:51:330:51:36

which saw it as a frontier for experimentation...

0:51:360:51:39

But we already have it in the world

0:51:390:51:41

of television and newspapers and radio.

0:51:410:51:44

THEY TALK OVER EACH OTHER

0:51:440:51:46

Dan, as a Channel 4 man... Wait a minute, everybody, please!

0:51:460:51:50

Or I'll unfollow you all.

0:51:500:51:51

LAUGHTER

0:51:510:51:53

As a TV man, as a Channel 4 man, are you not slightly on

0:51:540:51:59

the defensive here cos you're old news

0:51:590:52:02

and this is what's happening now?

0:52:020:52:05

It's going to happen in the future, because the TV news -

0:52:050:52:07

Channel 4 News - great programme - Sky News, they do a wonderful job,

0:52:070:52:11

BBC, ITN, likewise - but don't they all look a little bit leaden-footed?

0:52:110:52:16

No, I don't think that's right, actually.

0:52:160:52:19

I don't think that is a view of what is actually happening.

0:52:190:52:22

As I said earlier, television is the main source of news for most people.

0:52:220:52:27

But the traditional media companies - Channel 4 included,

0:52:270:52:30

the BBC included - have done an incredible job of going into

0:52:300:52:33

the new world and distributing video-based news.

0:52:330:52:36

Channel 4 News is now the biggest British broadcaster

0:52:360:52:40

putting news into Facebook.

0:52:400:52:42

But I think you've done a deplorable job of telling the story of

0:52:420:52:46

what's been going on in Western societies for the last 20, 30 years.

0:52:460:52:50

That's why you didn't get

0:52:500:52:51

that people were going to vote for Brexit.

0:52:510:52:54

That's why the mainstream media didn't get that people were

0:52:540:52:57

going to vote for Trump in the way that they did...

0:52:570:52:59

-That's as much about politicians...

-..because you haven't covered the story.

0:52:590:53:02

Sorry, just to complete that point, I think the important

0:53:020:53:05

thing to say about the old media going into the new media space

0:53:050:53:08

is that we, the BBC, the other public service broadcasters,

0:53:080:53:11

we apply the standards that exist in the television world -

0:53:110:53:15

which are much higher, and, in the UK,

0:53:150:53:17

are the gold standard in the world, in any form of media -

0:53:170:53:21

we apply those standards into a world

0:53:210:53:23

where other people aren't applying them.

0:53:230:53:25

That's not to say the internet-only players don't have terrifically

0:53:250:53:28

high standards. Some of them do, people like Huffington Post.

0:53:280:53:31

I think you're covering yourself in medals you don't deserve.

0:53:310:53:34

It's more like self-service broadcasting.

0:53:340:53:36

-But there are many who don't.

-Andrew's on fire now!

0:53:360:53:40

Interesting points though, interesting points.

0:53:400:53:42

Self-service broadcasting, that's quite a... Jamie?

0:53:420:53:45

-What?

-You've been listening?

0:53:450:53:48

You know about this stuff?

0:53:480:53:50

-The reality is, I think...

-The cat's out of the bag.

0:53:500:53:54

Well, it's getting far more unstable and unpredictable,

0:53:540:53:57

and that's also very exciting.

0:53:570:53:59

And we definitely need to reinvigorate our democracies,

0:53:590:54:01

-we need new ideas...

-Do we need new controls?

0:54:010:54:05

What about Facebook, what about Twitter?

0:54:050:54:07

They're publishers, essentially, aren't they?

0:54:070:54:09

Shouldn't they be doing more?

0:54:090:54:10

If they are judged as publishers, they have an opt-out,

0:54:100:54:13

under 1996 legislation in the US, which means they're not

0:54:130:54:17

responsible for things that are published on their platform.

0:54:170:54:20

Because, if they were, they would probably collapse immediately,

0:54:200:54:23

because they'd have to check everything before it was posted,

0:54:230:54:25

and we're talking about millions, billions...

0:54:250:54:27

You can go online right now, you can find IS, you can find vile

0:54:270:54:30

Islamism, you can find vile Nazism, you can find Holocaust denial...

0:54:300:54:34

It's not possible for them to actually check everything all

0:54:340:54:37

the time, it's simply too much.

0:54:370:54:39

But they have a moral imperative to do significantly more

0:54:390:54:42

than they are doing.

0:54:420:54:43

At the moment, they have immense power, as companies,

0:54:430:54:47

and they are simply not exercising enough responsibility.

0:54:470:54:51

I think we're mixing up two things, or, Dan, I think,

0:54:510:54:53

with respect, you're mixing up two things.

0:54:530:54:55

One is that we all, as individuals, are on social media.

0:54:550:54:58

And what we do as individuals is our business,

0:54:580:55:01

and everyone else can mind theirs.

0:55:010:55:03

There are also lots of people publishing on social media,

0:55:030:55:05

who are organisations,

0:55:050:55:07

who are doing what used to be only the remit of proper journalists

0:55:070:55:10

with, hopefully, proper journalistic standards.

0:55:100:55:13

And, obviously, as a professional journalist who lives by those

0:55:130:55:17

standards, you're offended to see people purporting to do that

0:55:170:55:21

and not living up to those standards.

0:55:210:55:23

Those are the people you have a beef with, and I think we need to be...

0:55:230:55:26

No, I have a beef with people who are outright lying,

0:55:260:55:28

and also people who are doing it simply for their own financial ends,

0:55:280:55:31

before we even get to political influence.

0:55:310:55:34

Helen, I haven't heard from you for a while.

0:55:340:55:35

The point is that social media platforms are now,

0:55:350:55:38

to some extent, institutions of democracy, if you like.

0:55:380:55:42

I mean, they are where our democracies are played out.

0:55:420:55:44

They're where our campaigns take place,

0:55:440:55:46

where our elections take place, where our governments act,

0:55:460:55:49

and where the president-elect is actually MAKING policy.

0:55:490:55:52

But they've only been there for about five years,

0:55:520:55:55

they've only been around at all for ten,

0:55:550:55:57

and they've only been intertwined with our political systems for five.

0:55:570:56:00

So, I suppose, it's not surprising we've got

0:56:000:56:02

a lot of institutional catch-up to do.

0:56:020:56:04

But there are things we can do, and there are things being done.

0:56:040:56:08

I mean, not enough, it's too late, et cetera,

0:56:080:56:11

but we shouldn't just throw our hands in the air and give up.

0:56:110:56:14

And we shouldn't lump everything together.

0:56:140:56:17

We're lumping together fake news, trolls -

0:56:170:56:20

the small, very small number of people who are threatening to

0:56:200:56:24

close down Twitter entirely.

0:56:240:56:26

Right now, this is a fantastic debate, but out there,

0:56:260:56:30

there is an even more fantastic debate about this debate,

0:56:300:56:34

happening right now, on Twitter.

0:56:340:56:36

If only they could all...

0:56:360:56:37

That's right, even this debate about politics is being played out

0:56:370:56:41

on social media as we speak, exactly.

0:56:410:56:43

But the solutions to the things are different, as well.

0:56:430:56:46

The solutions to fake news are things like your organisation

0:56:460:56:50

and what Facebook have said they will do about fake news.

0:56:500:56:53

Both they themselves, in terms of fact checking, and allowing

0:56:530:56:56

individual social media users to report.

0:56:560:56:59

The solutions are different for trolls and they are different

0:56:590:57:04

for all the other things we've been talking about.

0:57:040:57:07

But there are possibilities out there, and they are being worked at.

0:57:070:57:11

There are even bots being developed...

0:57:110:57:13

Bots, just explain to people who don't know what a bot is?

0:57:130:57:16

An automated Twitter account, for example.

0:57:160:57:18

There are automated Twitter accounts out there, facing racism,

0:57:180:57:23

attacking racism, or just trying to persuade racist trolls of the

0:57:230:57:30

error of their ways, et cetera.

0:57:300:57:33

And some of those things are being shown to work.

0:57:330:57:36

So, there are a lot of possibilities out there,

0:57:360:57:38

but we can't just lump them all together.

0:57:380:57:39

Chi, we're getting towards the end, but we've got about...

0:57:390:57:43

A 40-second answer from you, if you would?

0:57:430:57:46

Do you celebrate this?

0:57:460:57:47

Is it a better form of contact with your constituents,

0:57:470:57:50

as an elected representative?

0:57:500:57:52

I celebrate it as an elected representative, in terms of

0:57:520:57:55

being able, instantly, almost, to see some constituents -

0:57:550:57:59

those who are on Twitter, or those who are on Facebook -

0:57:590:58:03

respond to really important issues. That is fantastic, online.

0:58:030:58:07

But I also recognise there are so many who aren't making those

0:58:070:58:10

responses, because a lot of people have better things to do with

0:58:100:58:13

their lives than respond to their MPs online!

0:58:130:58:17

But, also, I think we haven't talked about this,

0:58:170:58:19

the rules that Facebook and others use in order to decide what

0:58:190:58:23

you see, the algorithms, they are what control what people see.

0:58:230:58:27

And after this, we're all going to check our phones, right?

0:58:270:58:30

Thank you all very much indeed.

0:58:300:58:32

As always, the debate continues in the digital universe,

0:58:320:58:35

on Twitter and online, join us next Sunday from Bradford,

0:58:350:58:37

goodbye from everyone here in Uxbridge.

0:58:370:58:39

Have a great Sunday.

0:58:390:58:41

APPLAUSE

0:58:410:58:43
