Episode 2

Transcript

0:00:02 > 0:00:05Today, on The Big Questions, is digital media good for democracy?

0:00:15 > 0:00:18APPLAUSE

0:00:18 > 0:00:22Good morning. I'm Nicky Campbell. Welcome to The Big Questions.

0:00:22 > 0:00:25Today, we're back at Brunel University, London, in Uxbridge,

0:00:25 > 0:00:30to debate one very big question - is digital media good for democracy?

0:00:30 > 0:00:33Welcome, everybody, to The Big Questions.

0:00:35 > 0:00:38Now, this Friday, Donald Trump will be sworn in as

0:00:38 > 0:00:41the 45th President of the United States of America.

0:00:41 > 0:00:46Many say he owes his victory to a greater mastery of the tools

0:00:46 > 0:00:48of this digital age - Twitter, Facebook,

0:00:48 > 0:00:51and all the other forms of online communication.

0:00:51 > 0:00:54And, US intelligence agencies are saying that Russian

0:00:54 > 0:00:58intervention in cyberspace played a hand, too.

0:00:58 > 0:01:00Well, here in Britain,

0:01:00 > 0:01:03the Brexit campaign was acknowledged to be far savvier in its use

0:01:03 > 0:01:05of social media than the Remain side,

0:01:05 > 0:01:08and the Scottish Nationalist online army narrowed the gap in

0:01:08 > 0:01:12their independence referendum and helped drive out Labour

0:01:12 > 0:01:14in the 2015 general election.

0:01:14 > 0:01:17So, there's no denying the power of digital media.

0:01:17 > 0:01:20But, it also poses a real challenge.

0:01:20 > 0:01:24How do you know what is true and what is fake in an online world

0:01:24 > 0:01:27with no editorial control or standards?

0:01:27 > 0:01:32And, what is the effect on democracy of trolling, fake news,

0:01:32 > 0:01:35and the echo chambers created by algorithms?

0:01:35 > 0:01:37To debate this brave new world,

0:01:37 > 0:01:41we've assembled a digitally savvy front row of journalists,

0:01:41 > 0:01:45politicians, campaign advisers, techno wizards,

0:01:45 > 0:01:48think tank heavyweights, television executives, learned academics,

0:01:48 > 0:01:50all here in the flesh,

0:01:50 > 0:01:52not created in some virtual world.

0:01:52 > 0:01:55They are all too real.

0:01:55 > 0:01:59And, you can join in on Twitter or online,

0:01:59 > 0:02:02by logging into bbc.co.uk/thebigquestions.

0:02:02 > 0:02:04Follow the link to the online discussion.

0:02:04 > 0:02:05Lots of engagement,

0:02:05 > 0:02:09contributions from our highly representative

0:02:09 > 0:02:11Uxbridge audience as well.

0:02:11 > 0:02:16Well, Jamie Bartlett, if I may...

0:02:16 > 0:02:19director at the Centre for the Analysis of Social Media,

0:02:19 > 0:02:21the very man.

0:02:21 > 0:02:24This has changed politics forever, hasn't it?

0:02:24 > 0:02:26For years, we have been saying how do we get young people?

0:02:26 > 0:02:28How do we get more people engaged?

0:02:28 > 0:02:31How do we stop them from being disenfranchised?

0:02:31 > 0:02:32We have done just that.

0:02:32 > 0:02:34It has got a lot more people engaged.

0:02:34 > 0:02:38There's no doubt about that. And it's not just people tweeting

0:02:38 > 0:02:40and posting things on Facebook.

0:02:40 > 0:02:43Social media has been remarkably good

0:02:43 > 0:02:47at forming a bridge to get people into real-world politics as well.

0:02:47 > 0:02:51Various types of activism, marching, demonstrating, even voting.

0:02:51 > 0:02:54I think people are more likely to vote if they get involved in

0:02:54 > 0:02:57- politics online.- All good. - That is all good.

0:02:57 > 0:03:00And I think everybody here would acknowledge that is

0:03:00 > 0:03:03a very good thing. But, of course, it creates new problems.

0:03:03 > 0:03:07And those problems, I think, we're only just beginning to grapple with.

0:03:07 > 0:03:11I think it's making politics, in some ways, far more polarised,

0:03:11 > 0:03:14far more difficult to predict or control.

0:03:14 > 0:03:17In many ways, Donald Trump is the perfect politician for

0:03:17 > 0:03:18a digital age.

0:03:18 > 0:03:20We are overwhelmed with information,

0:03:20 > 0:03:24graphs and charts and infographics and tweets and retweets.

0:03:24 > 0:03:28And in those circumstances, we tend to go for whatever is the

0:03:28 > 0:03:31loudest, the most offensive, the most emotive.

0:03:31 > 0:03:33And that is why politics, I think,

0:03:33 > 0:03:36is becoming more polarised, it is creating more centres of

0:03:36 > 0:03:40power that we barely understand, huge tech companies that can

0:03:40 > 0:03:44control what we see online, and we don't really know how they work.

0:03:44 > 0:03:48So, while it's bringing more people into politics,

0:03:48 > 0:03:51it's not clear to me yet whether it's making politics any better.

0:03:51 > 0:03:54In fact, in some ways, it's making it rather worse.

0:03:54 > 0:03:56Well, there are loads of issues there,

0:03:56 > 0:03:58I'm really looking forward to the next hour or so.

0:03:58 > 0:04:00Helen, he makes some good points.

0:04:00 > 0:04:06Are people actually becoming better informed and more engaged?

0:04:06 > 0:04:09Because a click of a mouse is one thing,

0:04:09 > 0:04:12but actually trudging up the stairwells of high rises

0:04:12 > 0:04:16delivering leaflets on a cold February night is another

0:04:16 > 0:04:18thing entirely.

0:04:18 > 0:04:21I don't think we should denigrate the click of the mouse.

0:04:21 > 0:04:23I think there is a culture, particularly in politics,

0:04:23 > 0:04:27particularly British politics, that politics must be painful.

0:04:27 > 0:04:31You've got to do something cold, or long, or boring, or hard work.

0:04:31 > 0:04:33And what digital media have done

0:04:33 > 0:04:36is they've sort of democratised the act of political participation.

0:04:36 > 0:04:39So, now, there's very small acts of participation.

0:04:39 > 0:04:41Liking or clicking something.

0:04:41 > 0:04:44Indeed, the click of the mouse, which is what is doing the

0:04:44 > 0:04:47work here, it's what's drawing people into politics.

0:04:47 > 0:04:51We shouldn't let go of that, even while we're blaming...

0:04:51 > 0:04:54- Don't denigrate the click. - Yes, don't denigrate the click.

0:04:54 > 0:04:56Yeah, and it makes a difference.

0:04:56 > 0:04:58People feel they are making a difference, but are they?

0:04:58 > 0:04:59They are only...

0:04:59 > 0:05:02Because of these algorithms which send you your own things

0:05:02 > 0:05:04that they think you will like.

0:05:04 > 0:05:09So, basically, it is that echo chamber, you are only confronting,

0:05:09 > 0:05:12if that's the right word, your own opinions, there's no challenge.

0:05:12 > 0:05:15I don't think that's right.

0:05:15 > 0:05:19The true echo chamber would be just reading the Daily Mail in the 1930s.

0:05:19 > 0:05:24Or...just watching one particular TV station in the US.

0:05:24 > 0:05:28That would be a real echo chamber. And I don't think...

0:05:28 > 0:05:33Of course, our social media environments are personalised

0:05:33 > 0:05:36by us, and personalised by the social media platforms and we need

0:05:36 > 0:05:40to work against that, but I do think, still,

0:05:40 > 0:05:43that our information environment, and there is research to show this,

0:05:43 > 0:05:46that people are exposed to more news sources on social media

0:05:46 > 0:05:47than people who don't use social media.

0:05:47 > 0:05:51- Only the news sources they agree with.- No. Not necessarily.

0:05:51 > 0:05:55They see a more heterogeneous range of news sources on social media

0:05:55 > 0:05:57than they do in the offline world.

0:05:57 > 0:06:01We're very good at creating echo chambers outside social media.

0:06:01 > 0:06:03And I don't think we should forget that.

0:06:03 > 0:06:06Yeah, echo chambers, there's nothing new about them. Owen, is that right?

0:06:06 > 0:06:09Well, I think, firstly, Facebook is a bit different,

0:06:09 > 0:06:12but Twitter, I think we overblow its importance because

0:06:12 > 0:06:14only a very small section of the population are regularly using

0:06:14 > 0:06:17Twitter to talk about the world around them.

0:06:17 > 0:06:20And they tend to be younger, and more affluent.

0:06:20 > 0:06:23You take the Brexit... the EU referendum,

0:06:23 > 0:06:25the people who are most likely to vote to leave were people

0:06:25 > 0:06:29over the age of 65, who are the least likely to be on Twitter.

0:06:29 > 0:06:32So, I think if we're talking about these grand political developments

0:06:32 > 0:06:34sweeping the Western world and putting it down to Twitter,

0:06:34 > 0:06:36then I think we are mistaken.

0:06:36 > 0:06:39I think the danger with people like me on the left of politics

0:06:39 > 0:06:41is we think, well, the mainstream media is biased against us,

0:06:41 > 0:06:44most of the press supports the Conservative government,

0:06:44 > 0:06:47so, let's retreat to Twitter. I think that's a mistake,

0:06:47 > 0:06:49because most people, they come home at the end of the day,

0:06:49 > 0:06:52a busy day, they've got jobs, they've got kids,

0:06:52 > 0:06:55they might listen to a bit of radio, watch the evening news,

0:06:55 > 0:06:57maybe flick through a newspaper,

0:06:57 > 0:07:00and that is how most people still get their news coverage.

0:07:00 > 0:07:02Do we get a false impression of what is happening in the world, in a sense?

0:07:02 > 0:07:06Because Helen is saying with all this information, you're better informed than ever,

0:07:06 > 0:07:09but you're saying there's something of a political mirage about it.

0:07:09 > 0:07:13I think when it comes to Twitter. I think on Facebook, we've got, look, you've got someone here

0:07:13 > 0:07:15who did an excellent job for the Conservatives.

0:07:15 > 0:07:18You won't often hear me saying that about people on that wing of British politics,

0:07:18 > 0:07:21but he did because what they did with Facebook,

0:07:21 > 0:07:25it wasn't about joining Facebook groups, which is again just people often agreeing with each other,

0:07:25 > 0:07:29what the Conservatives did is they targeted specific demographics,

0:07:29 > 0:07:31because millions of people use Facebook.

0:07:31 > 0:07:33They're not talking about politics again on Facebook.

0:07:33 > 0:07:36They're clicking on photos, sharing funny messages with friends.

0:07:36 > 0:07:39But they targeted adverts at specific demographics.

0:07:39 > 0:07:41That is very effective.

0:07:41 > 0:07:44But if people like myself who are campaigners think we can just

0:07:44 > 0:07:47send a few tweets going, "Rrrrah, Theresa May!",

0:07:47 > 0:07:49yes, a load of people will retweet that,

0:07:49 > 0:07:52but then we make the mistake of thinking the rest of the country

0:07:52 > 0:07:56are full of the same enthusiasm as me, and that's not true.

0:07:56 > 0:07:59So, Twitter has its use, it can mobilise campaigners,

0:07:59 > 0:08:02put issues on the agenda, but it isn't a substitute, unfortunately,

0:08:02 > 0:08:05for trying to have a strategy to deal with a very hostile press.

0:08:05 > 0:08:08The aforementioned Tom Edmonds in a moment, but, Ellie, you want to come in?

0:08:08 > 0:08:11Yeah, I think there was a point made earlier that social media,

0:08:11 > 0:08:12which I think is a great new tool,

0:08:12 > 0:08:14it's just another kind of technological advance in

0:08:14 > 0:08:17the way that people communicate with each other,

0:08:17 > 0:08:20and that is to be welcomed, makes politics harder to control,

0:08:20 > 0:08:21or makes politics more polarised.

0:08:21 > 0:08:24I think most of us, myself included, would welcome that

0:08:24 > 0:08:28because having more ability to have an argument,

0:08:28 > 0:08:31having more scope to have a debate and have more people involved

0:08:31 > 0:08:33in the conversation is always a positive.

0:08:33 > 0:08:36Certainly, Twitter is a very kind of closed shop in the way that

0:08:36 > 0:08:38it's mainly young people and media commentators -

0:08:38 > 0:08:39as a journalist myself, I know that.

0:08:39 > 0:08:42And Facebook, as Owen said, is a bit different.

0:08:42 > 0:08:44But the idea that this should be...

0:08:44 > 0:08:47What kind of frightens me more about social media

0:08:47 > 0:08:49isn't necessarily who's on it and who's talking about it,

0:08:49 > 0:08:52it's the attempts to control it,

0:08:52 > 0:08:56the attempts to make Facebook and Twitter put out certain

0:08:56 > 0:08:59messages or be worried about what messages they've put out

0:08:59 > 0:09:03- and the drive to censor social media.- That is a fascinating area.

0:09:03 > 0:09:04I'm going to address that later on

0:09:04 > 0:09:07because we have lots to focus on there with the trolling,

0:09:07 > 0:09:10with the vile abuse, with the fake news,

0:09:10 > 0:09:13with the false news and all that, that's all to come.

0:09:13 > 0:09:15But you were mentioned, Tom, in Dispatches,

0:09:15 > 0:09:17by Owen Jones,

0:09:17 > 0:09:21in a congratulatory tone because of the brilliant strategy

0:09:21 > 0:09:24that you came up with for the Tories in the last election online.

0:09:24 > 0:09:25What did you do and how did you do it?

0:09:25 > 0:09:26Well, broadly,

0:09:26 > 0:09:29political parties use online communications for two things.

0:09:29 > 0:09:33The first thing is to find and recruit supporters who will go and

0:09:33 > 0:09:36take those vital actions to support them, primarily offline,

0:09:36 > 0:09:38as other panel members have mentioned.

0:09:38 > 0:09:41So, it's the people who are going to go and knock on the doors

0:09:41 > 0:09:44because nothing beats that face-to-face communication.

0:09:44 > 0:09:48The second thing they do, and we did in the 2015 campaign,

0:09:48 > 0:09:51is to find and reach out to those swing voters

0:09:51 > 0:09:53in the marginal seats that are going to decide the election,

0:09:53 > 0:09:56and speak to them about the issues they care about.

0:09:56 > 0:09:59Now, if you manage an election campaign, you're speaking from...

0:09:59 > 0:10:04anyone from as diverse as Lib Dem voters in the South West to

0:10:04 > 0:10:08Labour voters in the North of England and in Scotland as well.

0:10:08 > 0:10:10So, you need to speak to them about issues they care about.

0:10:10 > 0:10:14Now, what Facebook allows you to do is find those groups of individuals

0:10:14 > 0:10:16and speak to them about the issues...

0:10:16 > 0:10:18You'll have to work quite hard to find Labour voters in Scotland.

0:10:18 > 0:10:22- Nowadays, yes. - So, that's what you did -

0:10:22 > 0:10:24very targeted, very focused, and very successful.

0:10:24 > 0:10:27Very targeted, very segmented communications.

0:10:27 > 0:10:31Now, digital in itself is just a medium.

0:10:31 > 0:10:33So, you need to have the message right

0:10:33 > 0:10:34before you go and speak to these people.

0:10:34 > 0:10:37There's no point finding individual groups of people and speaking

0:10:37 > 0:10:39about things they don't care about.

0:10:39 > 0:10:41There's no point finding groups of people,

0:10:41 > 0:10:45finding groups of young voters, and then speaking to them about pensions,

0:10:45 > 0:10:47or, erm, about issues they don't care about.

0:10:47 > 0:10:50So, it's about finding groups of people, having the right message,

0:10:50 > 0:10:53- and getting in front of them time and time again.- Mm. And it was

0:10:53 > 0:10:56Helen who was remarkably successful in the marginals, wasn't it?

0:10:56 > 0:10:58Well, it was.

0:10:58 > 0:11:00And, if you...

0:11:00 > 0:11:04And I think also very successful for the Leave campaign, Brexit,

0:11:04 > 0:11:06and very successful for Trump,

0:11:06 > 0:11:09because, after all, if you just reach an echo chamber of people

0:11:09 > 0:11:12who agree with you, you're not going to make any difference.

0:11:12 > 0:11:15I mean, that's not what parties are looking to do.

0:11:15 > 0:11:16They're looking to find people

0:11:16 > 0:11:18who are undecided or don't know what they think yet.

0:11:18 > 0:11:22And what Labour did at the last general election, as you know better than me,

0:11:22 > 0:11:25is just randomly throw out the same message across Facebook.

0:11:25 > 0:11:27So, it didn't target specific people

0:11:27 > 0:11:29in specific seats they needed to win over.

0:11:29 > 0:11:31And, also, they mistook engagement -

0:11:31 > 0:11:35that's lots of people liking and commenting - for success.

0:11:35 > 0:11:39Actually, what that generally meant was the messages their core supporters already agreed with,

0:11:39 > 0:11:42like the NHS, were getting lots of traction,

0:11:42 > 0:11:45but they weren't reaching out to the people they needed to win over.

0:11:45 > 0:11:48And, obviously, that's what you learned from so successfully.

0:11:48 > 0:11:50You said, we're not just going to do things for likes or clicks,

0:11:50 > 0:11:52we're going to target the people we need to win over

0:11:52 > 0:11:54who aren't in our natural coalition.

0:11:54 > 0:11:57Yes. Chi, as a Labour MP, just moving on ever so slightly,

0:11:57 > 0:12:01the digital media revolution,

0:12:01 > 0:12:06it's a great process and platform of democratisation, isn't it?

0:12:06 > 0:12:08Isn't it a great leveller?

0:12:08 > 0:12:10Absolutely. And digital media...

0:12:10 > 0:12:13Before I came into politics, I was an electrical engineer

0:12:13 > 0:12:15for 20 years, helping build out these networks.

0:12:15 > 0:12:18And the reason why I went into that is because anything,

0:12:18 > 0:12:20something that gives people

0:12:20 > 0:12:23the power to connect with other people, which enables people

0:12:23 > 0:12:26to reach out to share, that is progress,

0:12:26 > 0:12:29and that, I believe, is good for democracy.

0:12:29 > 0:12:34But, like any technology, it comes with its disadvantages as well.

0:12:34 > 0:12:37And, also, any technology,

0:12:37 > 0:12:41those in power will try and use it to reinforce their power,

0:12:41 > 0:12:43and as they have the most means, they're the most likely

0:12:43 > 0:12:46to be able to use it most effectively to begin with.

0:12:46 > 0:12:48But you don't have to, you know,

0:12:48 > 0:12:51have lunch, schmooze Rupert Murdoch to get your message out there any more.

0:12:51 > 0:12:54- Because there are other ways of doing it.- Let me say this.

0:12:54 > 0:12:56There's two really big caveats with that,

0:12:56 > 0:12:58and the first one, and that's the reason why

0:12:58 > 0:13:01I still do walk out on cold winter's days and knock on doors,

0:13:01 > 0:13:03is because there are millions of people

0:13:03 > 0:13:05in this country who still don't have broadband,

0:13:05 > 0:13:08who still don't have any digital access, you know.

0:13:08 > 0:13:12The state of broadband in this country is a disgrace.

0:13:12 > 0:13:15And then, there are those who may have access,

0:13:15 > 0:13:17but they can't afford it, it's too expensive,

0:13:17 > 0:13:21or they haven't got the digital skills necessary to use it.

0:13:21 > 0:13:25So, there are millions of people off, away from this debate,

0:13:25 > 0:13:28which we really need to include, and when we have included them,

0:13:28 > 0:13:32then we need to look at the way in which that debate and those

0:13:32 > 0:13:36messages are being put out. And what I would say particularly about...

0:13:36 > 0:13:39And really, it was a most effective campaign,

0:13:39 > 0:13:42the Tory campaign in 2015, spent, I think it was...

0:13:42 > 0:13:46- Was it ten times more than Labour on Facebook?- Mm.

0:13:46 > 0:13:50But the other point is that those messages came with adverts

0:13:50 > 0:13:53down the sides, you know,

0:13:53 > 0:13:56and when I'm reaching out to my constituents,

0:13:56 > 0:13:59and Facebook enables me to do that and I do use that,

0:13:59 > 0:14:03but I'm very aware that if I'm asking them about how they feel

0:14:03 > 0:14:05about obesity or something,

0:14:05 > 0:14:09there's going to be adverts for, you know, well-known fast food places

0:14:09 > 0:14:12down one side and there's going to be messages which are

0:14:12 > 0:14:16calculated on the basis of algorithms which are entirely,

0:14:16 > 0:14:19you know, invisible, entirely opaque,

0:14:19 > 0:14:21which we have no knowledge of.

0:14:21 > 0:14:24So, what I would say is, we need to update...

0:14:24 > 0:14:28The system feeds you with things that they know you will like,

0:14:28 > 0:14:31because of your previous clicking habits,

0:14:31 > 0:14:34just to highlight that for people. Carry on.

0:14:34 > 0:14:38So, we need to update, you know, the regulations, basically,

0:14:38 > 0:14:44to make sure this fantastic opportunity is fairer and is

0:14:44 > 0:14:47accessible to people and is used to support and enable people

0:14:47 > 0:14:50and not to feed them the same messages.

0:14:50 > 0:14:52Allie, you don't look happy. The word "regulations!"

0:14:52 > 0:14:55It always gets my back up, yeah. I think that...

0:14:55 > 0:14:57I also think we may be giving a bit too much credit to social media,

0:14:57 > 0:15:01in that the idea that political messages put out by parties

0:15:01 > 0:15:05simply failed because they weren't using the right social media

0:15:05 > 0:15:07strategy is perhaps masking a bigger problem of,

0:15:07 > 0:15:10perhaps they were putting out the wrong messages,

0:15:10 > 0:15:12or messages that people weren't interested in.

0:15:12 > 0:15:15You brought up the fact of adverts being alongside things on

0:15:15 > 0:15:18Facebook, and there is this sense that people are just

0:15:18 > 0:15:20mindlessly clicking on something and then

0:15:20 > 0:15:22a pop-up for McDonald's comes up and they think,

0:15:22 > 0:15:25"Oh, yeah, I want McDonald's," and it's that kind of mindless,

0:15:25 > 0:15:27I think that's a very insulting view of the public,

0:15:27 > 0:15:29not only how they engage with news, but the media in general...

0:15:29 > 0:15:31That's not what I said.

0:15:31 > 0:15:34But advertising is advertising for a reason, because it does

0:15:34 > 0:15:38actually work, so, and that's where Facebook gets its revenues from.

0:15:38 > 0:15:40I'm not saying people are mindless,

0:15:40 > 0:15:43but I am saying that people are influenced by advertising,

0:15:43 > 0:15:45otherwise you wouldn't have such big budgets on it.

0:15:45 > 0:15:48No, absolutely, but if you're putting out a message on obesity,

0:15:48 > 0:15:50and then you're saying that there's a link between that

0:15:50 > 0:15:53and then a fast food advertisement being alongside that,

0:15:53 > 0:15:55you're drawing some kind of relationship between them,

0:15:55 > 0:15:58and then when you use the word "regulation," my alarm bells go off,

0:15:58 > 0:16:00because I think, what do you want to regulate in that?

0:16:00 > 0:16:02- AUDIENCE MEMBER APPLAUDS - Everything, because...

0:16:02 > 0:16:04I'm going to come to you in a minute,

0:16:04 > 0:16:05because that was quite a round of applause!

0:16:05 > 0:16:08Just this problem with the word "regulation" -

0:16:08 > 0:16:09nothing is above the law.

0:16:09 > 0:16:11Are you saying that the internet should be above the law?

0:16:11 > 0:16:14I mean, there are regulations that exist now - you recognise

0:16:14 > 0:16:16that we have a legal system.

0:16:16 > 0:16:18Yeah, and I would disagree with some of those.

0:16:18 > 0:16:21Yeah, but nothing's above the law. So, the law should apply.

0:16:21 > 0:16:24Yes, but in terms of free speech on the internet, it should be absolute.

0:16:24 > 0:16:26Well, free speech on the internet, that's going to be

0:16:26 > 0:16:28a tasty area for us later on.

0:16:28 > 0:16:33You burst into a rapturous solo round of applause there!

0:16:33 > 0:16:36What... Expand on it.

0:16:36 > 0:16:39Um, I just agree with the lady here, I just think that, um,

0:16:39 > 0:16:45it's very easy and I think it's a correct, you know, link between

0:16:45 > 0:16:50advertising on social media and problems that we face in society.

0:16:50 > 0:16:54Now, Jamie, you mentioned Trump, I think, earlier on.

0:16:54 > 0:17:00Was Trump's campaign a triumph of digital media?

0:17:00 > 0:17:01In some ways, yes.

0:17:01 > 0:17:03Of course, there are much bigger trends behind that.

0:17:03 > 0:17:06You can't just arrive and use Twitter and Facebook and then

0:17:06 > 0:17:08suddenly get yourself elected.

0:17:08 > 0:17:10You're obviously tapping into various,

0:17:10 > 0:17:12much deeper concerns that people have.

0:17:12 > 0:17:15But the point I want to make about that is...

0:17:15 > 0:17:19when you are completely overwhelmed with different sources of

0:17:19 > 0:17:23information, this is the problem with social media -

0:17:23 > 0:17:27we were promised in the 1990s, from the digital prophets,

0:17:27 > 0:17:29that the politicians of the future,

0:17:29 > 0:17:31when we were all connected and we had access to the world's

0:17:31 > 0:17:33information, would be more informed,

0:17:33 > 0:17:37they'd be smarter and we'd all be kinder and nicer to each other.

0:17:37 > 0:17:40One great academic even said it would be the end of nationalism,

0:17:40 > 0:17:43because we'd be able to link to everybody and we'd all

0:17:43 > 0:17:45understand each other. This is ludicrous.

0:17:45 > 0:17:47The reality is, of course,

0:17:47 > 0:17:49that when we are overwhelmed with information,

0:17:49 > 0:17:52we use heuristics to try to quickly figure out what we think,

0:17:52 > 0:17:56we do tend, I think, I disagree slightly with you on this, Helen,

0:17:56 > 0:17:59we do tend to find things that we already agree with,

0:17:59 > 0:18:02that our friends already agree with, and we trust that more.

0:18:02 > 0:18:03We do that anyway.

0:18:03 > 0:18:06We do that anyway, but we do that to such

0:18:06 > 0:18:08a greater extent now than we ever did before,

0:18:08 > 0:18:10and we're less aware of it.

0:18:10 > 0:18:11Is that true, Helen?

0:18:11 > 0:18:15Well, I think it's a tendency to present it as one thing or

0:18:15 > 0:18:20the other. I mean, it's a very grey area, you know, between...

0:18:20 > 0:18:23You even hear people talking about kind of hermetically sealed

0:18:23 > 0:18:25echo chambers, and that's...

0:18:25 > 0:18:28But people say, "Oh, the Twittersphere has gone mad!"

0:18:28 > 0:18:31What they mean is, their own Twitter feed has gone mad.

0:18:31 > 0:18:33Yeah, but in almost any platform,

0:18:33 > 0:18:36you'll be able to see trending information and you're just

0:18:36 > 0:18:39one click away from a huge array of opinions.

0:18:39 > 0:18:42But the danger is, I think, that social media politics is

0:18:42 > 0:18:47in many ways more angry, more aggressive, more emotive,

0:18:47 > 0:18:49and the danger of that, I think,

0:18:49 > 0:18:52is that it makes compromise more difficult.

0:18:52 > 0:18:55The US Congress is more polarised than it has ever been,

0:18:55 > 0:18:59since the war at the least, and I think that's happening in

0:18:59 > 0:19:02a lot of different democracies, and part of the reason is the way

0:19:02 > 0:19:05that we do now - most of us, not everyone, but most of us -

0:19:05 > 0:19:06engage with politics.

0:19:06 > 0:19:09That makes compromise more difficult and I think that is going

0:19:09 > 0:19:12- to make politics much more difficult in the future.- But I think...

0:19:12 > 0:19:15Look, all our societies have become more polarised,

0:19:15 > 0:19:18where people, whether it be Trump in the United States, Brexit here,

0:19:18 > 0:19:21across Europe the rise of the populist right,

0:19:21 > 0:19:24we've become far more divided as societies.

0:19:24 > 0:19:26Not obviously social media that's caused that -

0:19:26 > 0:19:28social media is amplifying aspects of it, though,

0:19:28 > 0:19:31because people communicate on social media in a way they would

0:19:31 > 0:19:34often never dream of speaking to each other in real life.

0:19:34 > 0:19:37It often encourages a pack mentality,

0:19:37 > 0:19:39where you do get groups of people...

0:19:39 > 0:19:41It's like the mob in the French Revolution sometimes.

0:19:41 > 0:19:43But I don't want to get like that,

0:19:43 > 0:19:45- because that does sound kind of "Grrr!"- It can be "Grrr!"

0:19:45 > 0:19:48It's good that you can democratise information...

0:19:48 > 0:19:51But somebody says something, for example, on Twitter...

0:19:51 > 0:19:53I was on holiday and I looked online on the phone to see what was

0:19:53 > 0:19:56in the newspapers, and there were three stories

0:19:56 > 0:19:58about people who had said something

0:19:58 > 0:20:02on Twitter and had had the fury poured upon them for doing so

0:20:02 > 0:20:04and had to delete the tweets and had to apologise.

0:20:04 > 0:20:06So, is that really good for freedom of speech?

0:20:06 > 0:20:09No, that's not, because you do end up with the situation now

0:20:09 > 0:20:12where basically, people often, if they make a mistake or

0:20:12 > 0:20:14they say something which is obviously quite inflammatory

0:20:14 > 0:20:18or divisive, then a group on Twitter will basically coordinate and

0:20:18 > 0:20:20relish taking that person down,

0:20:20 > 0:20:22and they will throw everything at them,

0:20:22 > 0:20:24and if you are in the middle of a Twitter storm -

0:20:24 > 0:20:26Jon Ronson wrote an excellent book about this,

0:20:26 > 0:20:30about people shamed on Twitter - it can be a terrifying experience,

0:20:30 > 0:20:34where random strangers all over the world are suddenly screaming at you.

0:20:34 > 0:20:36It can be quite threatening and menacing.

0:20:36 > 0:20:38You even get a phenomenon of so-called doxxing,

0:20:38 > 0:20:40where people will expose your personal information,

0:20:40 > 0:20:43go through your back story, everything possible, try to

0:20:43 > 0:20:46get you sacked by your employer, so it can be very menacing.

0:20:46 > 0:20:48And the mainstream newspapers feed off the back of it,

0:20:48 > 0:20:51and have stories about that very thing happening and people

0:20:51 > 0:20:54being shamed for saying something that some of the papers

0:20:54 > 0:20:56reporting on it actually agree with.

0:20:56 > 0:20:57I think that's the other problem,

0:20:57 > 0:20:59because this is the other issue, though,

0:20:59 > 0:21:01because we look at social media as though this is a new phenomenon,

0:21:01 > 0:21:05but there's a very long history of our very politicised media finding

0:21:05 > 0:21:09individuals who they disagree with, hunting them down,

0:21:09 > 0:21:12having reporters outside their front doors, going through their

0:21:12 > 0:21:15private life, going through the private lives of people around them.

0:21:15 > 0:21:17So, yes, we should hold social media to account,

0:21:17 > 0:21:20but we should also hold to account our mainstream media,

0:21:20 > 0:21:23which is responsible often for disseminating false news,

0:21:23 > 0:21:26for targeting individuals they don't agree with and humiliating them.

0:21:26 > 0:21:28So, yes, let's have that concern, but I think we should talk

0:21:28 > 0:21:31about our press as well, which is often out of control itself.

0:21:31 > 0:21:33That's something we'll skip onto in just a second.

0:21:33 > 0:21:36Is this good for freedom of speech, then,

0:21:36 > 0:21:39- if people feel intimidated, Jamie? - No, I mean, there is always...

0:21:39 > 0:21:41I think there's a danger with social media,

0:21:41 > 0:21:42and I'm in favour of...

0:21:42 > 0:21:45I think social media and democracy work in some ways very well

0:21:45 > 0:21:49together, and I think the positives do outweigh the negatives.

0:21:49 > 0:21:52There is definitely a sense, though, that a lot of people -

0:21:52 > 0:21:54and I feel it myself sometimes - are self-censoring,

0:21:54 > 0:21:56fearing to put something online that,

0:21:56 > 0:21:59"Will I be attacked for that, is it the right thing to do?"

0:21:59 > 0:22:02And just avoiding the scandal, avoiding the mob,

0:22:02 > 0:22:05avoiding saying something that might cause offence.

0:22:05 > 0:22:08- What did you want to say?- Well, I'm definitely not going to say it now!

0:22:08 > 0:22:11Although if I did, I might get some re-tweets and YouTube views

0:22:11 > 0:22:13and all the rest of it. And that is a danger,

0:22:13 > 0:22:18especially given that anything you post stays online forever.

0:22:18 > 0:22:22And so, you might be called up for something you did ten years ago,

0:22:22 > 0:22:25that is then waved in front of you and you lose your job or you

0:22:25 > 0:22:28have people complaining about you, and that has a real chilling

0:22:28 > 0:22:31effect, because people will be afraid of ever speaking their mind.

0:22:31 > 0:22:34Tom, is this something you take into account when you are

0:22:34 > 0:22:35employing people, for example?

0:22:35 > 0:22:38Do you have to look at their Facebook history and their...

0:22:38 > 0:22:40Absolutely, and we notice increasingly...

0:22:40 > 0:22:42For example, if a political party wants to use

0:22:42 > 0:22:46a real-life case study for a film or party political broadcast,

0:22:46 > 0:22:49a poster campaign, etc, you would look into their background,

0:22:49 > 0:22:50and you find increasingly,

0:22:50 > 0:22:53when you look at people from a younger generation,

0:22:53 > 0:22:55when you look at their Facebook feeds,

0:22:55 > 0:22:57when you look at their Twitter feeds,

0:22:57 > 0:23:00they are saying stuff that kids aged 16, 17, 18 say, which,

0:23:00 > 0:23:03you know, when I was 16, 17, 18, I was saying stupid stuff as well.

0:23:03 > 0:23:06Or you'll find that they will have liked something from UniLad

0:23:06 > 0:23:08or The LAD Bible or something like that,

0:23:08 > 0:23:12which we know that this is just part and parcel of being young and

0:23:12 > 0:23:15being a bit stupid and making mistakes, as we've all done,

0:23:15 > 0:23:18but taken out of context, suddenly it blows up and becomes...

0:23:18 > 0:23:20And now it's written in stone, isn't it?

0:23:20 > 0:23:22Well, exactly, and I remember the case...

0:23:22 > 0:23:25I expect others on the panel will know the details more,

0:23:25 > 0:23:28but a Youth Police and Crime Commissioner, I think she was in

0:23:28 > 0:23:31Birmingham, who had sort of put her head above the parapet for a job...

0:23:31 > 0:23:34- In Kent, I think, yeah. - Yeah, maybe, yeah.

0:23:34 > 0:23:36She had put her head above the parapet for

0:23:36 > 0:23:38a job and then the press went over her tweets,

0:23:38 > 0:23:40as it is their right to do,

0:23:40 > 0:23:43and they found that she had said some stupid stuff, as we've all

0:23:43 > 0:23:46done, and suddenly, she became the number one victim in the country...

0:23:46 > 0:23:48number one criminal in the country,

0:23:48 > 0:23:51they put her face all over the papers and she had to resign.

0:23:51 > 0:23:53Now, I'm not arguing the rights or wrongs about what she said,

0:23:53 > 0:23:56but increasingly now, the things we say when we are younger

0:23:56 > 0:23:59are being held against us because it's a permanent record online.

0:23:59 > 0:24:01Dan, you want to come in?

0:24:01 > 0:24:04Well, these things are important, but I'd like to get us on to

0:24:04 > 0:24:09fake news, which I think is a cancer in the system, I mean,

0:24:09 > 0:24:12you know, information is the lifeblood of democracy and

0:24:12 > 0:24:14fake news is a terrible thing...

0:24:14 > 0:24:16But Owen's saying fake news is nothing new.

0:24:16 > 0:24:17Well, we've seen the rise of it,

0:24:17 > 0:24:20but we've really seen the rise of it on social media in the USA...

0:24:20 > 0:24:22Have we not had fake news in the mainstream media?

0:24:22 > 0:24:25You know, the Pope supports Donald Trump, Hillary Clinton is

0:24:25 > 0:24:27part of a paedophile ring - I mean, these things are nonsense,

0:24:27 > 0:24:29but they are very serious nonsense...

0:24:29 > 0:24:31But clearly nonsense, aren't they?

0:24:31 > 0:24:33Well, the thing is, I think that what we...

0:24:33 > 0:24:35We've got an election coming up in three years' time,

0:24:35 > 0:24:38there is some evidence to suggest it's had an impact on

0:24:38 > 0:24:42the democratic process in America, I think we need to find ways of

0:24:42 > 0:24:46stopping that tide reaching these shores over the next three years.

0:24:46 > 0:24:48I personally...

0:24:48 > 0:24:50Although I think that social media and the internet are

0:24:50 > 0:24:53fundamentally good for democracy, they disseminate information

0:24:53 > 0:24:57and encourage participation, fake news needs to be stamped out.

0:24:57 > 0:25:01I think the responsibility for that lies with social media companies.

0:25:01 > 0:25:02But what's the difference between...

0:25:02 > 0:25:05I don't think they are being responsible enough about it

0:25:05 > 0:25:06and I think they need to be called out on that.

0:25:06 > 0:25:09OK, are they tech platforms or publishers?

0:25:09 > 0:25:11Well, the clue's in the title, they are social media platforms,

0:25:11 > 0:25:14and I don't know why they are saying they are not media companies.

0:25:14 > 0:25:15On this fake news business,

0:25:15 > 0:25:18some of these fake news stories are so blatantly ludicrous,

0:25:18 > 0:25:22- surely most people must think, "Oh, for goodness' sake!"- But we saw...

0:25:22 > 0:25:23And which is more insidious,

0:25:23 > 0:25:27that or some of the misreporting and misrepresentation and

0:25:27 > 0:25:30distortions that we've seen over the years from the mainstream media?

0:25:30 > 0:25:32Which is more dangerous?

0:25:32 > 0:25:33I think they are both dangerous.

0:25:33 > 0:25:36There is a big, big difference between those two worlds,

0:25:36 > 0:25:38though, which is that particularly for television,

0:25:38 > 0:25:41it's a much more regulated world, there are rules, there is

0:25:41 > 0:25:46a third-party independent regulator who can investigate complaints.

0:25:46 > 0:25:48Social media is the Wild West,

0:25:48 > 0:25:52it is absolutely the Wild West, and that's a problem.

0:25:52 > 0:25:55Look, back in 1984, during the miners' strike,

0:25:55 > 0:25:57a very divisive moment in British history,

0:25:57 > 0:25:59there was the infamous Battle of Orgreave,

0:25:59 > 0:26:02an infamous battle in the whole struggle,

0:26:02 > 0:26:05and the BBC, I'm afraid to say, at the time,

0:26:05 > 0:26:07they put the tape in a wrong direction, because originally,

0:26:07 > 0:26:10as Liberty said, the human rights organisation,

0:26:10 > 0:26:12it was a police riot, the police had to pay compensation,

0:26:12 > 0:26:15and they put it in the wrong direction, the BBC -

0:26:15 > 0:26:17yet to apologise for it, incidentally -

0:26:17 > 0:26:20and made it look like the miners had attacked the police.

0:26:20 > 0:26:23The other example, infamously, is Hillsborough,

0:26:23 > 0:26:27where the Sun newspaper spread false news, fake news,

0:26:27 > 0:26:31about Liverpool fans, with absolutely horrendous consequences.

0:26:31 > 0:26:34So, we should take on fake news on the new media,

0:26:34 > 0:26:36but also on the old media as well.

0:26:36 > 0:26:38But this happens every day on Twitter and digital media,

0:26:38 > 0:26:40- fake news...- But I think, often...

0:26:40 > 0:26:43Well, look, whether it be about immigrants, Muslims,

0:26:43 > 0:26:45unemployed people, benefit fraud,

0:26:45 > 0:26:47I think all of these are issues which are either exaggerated

0:26:47 > 0:26:50or distorted, often, as well, by the mainstream press,

0:26:50 > 0:26:53which play into people's prejudices, where you end up, for example,

0:26:53 > 0:26:56with disabled people who have been, as disabled charities have said,

0:26:56 > 0:26:58abused on the streets as a consequence.

0:26:58 > 0:27:01That's because of the mainstream press as well.

0:27:01 > 0:27:04But you can complain to somebody, as Dan says - with the so-called

0:27:04 > 0:27:07mainstream media, the MSM, you can complain.

0:27:07 > 0:27:10Theoretically, there's redress, isn't there?

0:27:10 > 0:27:13In theory, but I'm afraid to say, often the way the media operates

0:27:13 > 0:27:16in this country is that they often behave, still, with impunity.

0:27:16 > 0:27:19You'll get corrections on page 22 in a tiny little box.

0:27:19 > 0:27:23So, yes, obviously, I'm not saying we shouldn't take it seriously,

0:27:23 > 0:27:25absolutely we should do that,

0:27:25 > 0:27:28false news, which is deliberately fake in order to play into

0:27:28 > 0:27:31the prejudices of very angry sections of the population,

0:27:31 > 0:27:33but I just don't think there is enough anger about the...

0:27:33 > 0:27:35Andrew, I'm coming to you, don't worry.

0:27:35 > 0:27:38Because Andrew has a lot to say on this and knows a lot about it.

0:27:38 > 0:27:42But Will is a fact-checker. You must be a very busy man at the moment!

0:27:42 > 0:27:45- It's been a tough year! - LAUGHTER

0:27:45 > 0:27:48We spend our time not only checking what people are saying,

0:27:48 > 0:27:50be they politicians, be they journalists,

0:27:50 > 0:27:52be they lots of other people in public life,

0:27:52 > 0:27:54but also asking them to go and correct the record.

0:27:54 > 0:27:57But it's not just mischievous, this stuff, it's malicious, isn't it?

0:27:57 > 0:27:58It's deliberate and it's malicious.

0:27:58 > 0:27:59It's not for me to judge that,

0:27:59 > 0:28:01it's for everyone in this room to judge that.

0:28:01 > 0:28:04If we are concerned about the effect of information on democracy,

0:28:04 > 0:28:07we've got to remember that we are democracy,

0:28:07 > 0:28:10it is all of us as voters who decide whether digital media is

0:28:10 > 0:28:12going to be good for us or bad for us.

0:28:12 > 0:28:15But I think the thing we've got to look at is,

0:28:15 > 0:28:17who has power and are they using it fairly?

0:28:17 > 0:28:21Are people who are in powerful positions using information

0:28:21 > 0:28:24to mislead us, and if so, what are we going to do about it?

0:28:24 > 0:28:28And Owen is right, there are lots of examples of mainstream media

0:28:28 > 0:28:31outlets publishing things that are clearly not true.

0:28:31 > 0:28:34- In our experience...- And getting clobbered for it, eventually.- Sorry?

0:28:34 > 0:28:37- And getting clobbered for it, eventually.- Well, clobbered is...

0:28:37 > 0:28:40This stuff is round the world in 80 clicks, isn't it?

0:28:40 > 0:28:42Clobbered is putting it far too strongly.

0:28:42 > 0:28:45Very few mainstream media outlets have any kind of system where

0:28:45 > 0:28:48people get "clobbered" for getting things wrong.

0:28:48 > 0:28:50That's simply not true in television.

0:28:50 > 0:28:52There is a very clear set of rules,

0:28:52 > 0:28:55there's a third-party independent regulator who polices that,

0:28:55 > 0:28:58you can complain, things are investigated, and ultimately,

0:28:58 > 0:29:01if you are a repeat offender, you have your licence revoked.

0:29:01 > 0:29:04Well, A, when did that last happen in a major media programme?

0:29:04 > 0:29:06Well, but I think that some of the...

0:29:06 > 0:29:07Secondly, television is, as you said...

0:29:07 > 0:29:10Some of the news channels can really sail close to the wind,

0:29:10 > 0:29:13and they get called on this stuff by the regulator,

0:29:13 > 0:29:16and they have to amend their behaviour, and they do.

0:29:16 > 0:29:18Television, as you said, is the most regulated,

0:29:18 > 0:29:21and it's absolutely right that they have the strongest kind of

0:29:21 > 0:29:23set of rules around accuracy and the greatest caution.

0:29:23 > 0:29:25It is nonetheless the case that at least one of them

0:29:25 > 0:29:28has the informal slogan of "never wrong for long" - we'll get

0:29:28 > 0:29:31it on air as quickly as possible and we'll correct it if we need to.

0:29:31 > 0:29:34Television and radio have very strong rules.

0:29:34 > 0:29:37Newspapers, which are highly partisan, don't have the same rules.

0:29:37 > 0:29:39They have a whole range of things.

0:29:39 > 0:29:42Some of them will correct things very quickly and very fairly when

0:29:42 > 0:29:45we ask them to, some of them will simply ignore a corrections request.

0:29:45 > 0:29:48So there is a whole range of stuff going on.

0:29:48 > 0:29:52- How long does it take to check a fact?- Usually, about a day.- A day?

0:29:52 > 0:29:55To turn around a really solid fact check where you've looked at

0:29:55 > 0:29:57all the sources, you've talked to experts where you need to...

0:29:57 > 0:30:00Well, as I say, it's been around the world about twice

0:30:00 > 0:30:01on digital media by that time.

0:30:01 > 0:30:05Exactly, so the thing that really matters isn't actually

0:30:05 > 0:30:08whether media outlets get things wrong once, it's how often

0:30:08 > 0:30:11things get repeated, because that's what's really powerful.

0:30:11 > 0:30:14So if they do correct things when we point it out,

0:30:14 > 0:30:16or when someone else points it out, that can do a lot of good,

0:30:16 > 0:30:18but not all of them do it consistently.

0:30:18 > 0:30:21Let's talk about social media for a minute,

0:30:21 > 0:30:24because there are kind of two things going on with social media.

0:30:24 > 0:30:27One is that all of us can share whatever we want

0:30:27 > 0:30:29with whoever we want, and that's amazing.

0:30:29 > 0:30:30That's democracy writ large.

0:30:30 > 0:30:32And if you don't like the consequences,

0:30:32 > 0:30:35then change your voters, ie it's up to all of us

0:30:35 > 0:30:37to do a better job as voters.

0:30:37 > 0:30:40The other bit is, people in positions of power,

0:30:40 > 0:30:44be they the social media companies or be they the advertisers

0:30:44 > 0:30:46on social media, using that platform,

0:30:46 > 0:30:49either fairly to the users or unfairly.

0:30:49 > 0:30:51Now, the thing we don't know,

0:30:51 > 0:30:55when the political parties spend hundreds of thousands, millions,

0:30:55 > 0:30:59I don't know, advertising, is what they are saying and to whom.

0:30:59 > 0:31:03It has never, ever been possible before to reach millions

0:31:03 > 0:31:06of people in the country and have the other millions

0:31:06 > 0:31:08of people in the country not know.

0:31:08 > 0:31:09You couldn't take out a newspaper advert

0:31:09 > 0:31:11and have the rest of us not see it.

0:31:11 > 0:31:14So now we have a whole set of political advertising

0:31:14 > 0:31:16that is completely unscrutinised,

0:31:16 > 0:31:18and that's the thing we should be a bit worried about.

0:31:18 > 0:31:19Well, Dan's agreeing with you.

0:31:19 > 0:31:22Social media is invisible for the rest of us.

0:31:22 > 0:31:24I think the fact that it's unscrutinised is basically

0:31:24 > 0:31:28- Dan's point. Dr Andrew Calcutt... - Can I just make a point about...?

0:31:28 > 0:31:31- You can in a minute.- Just about trust.- I'll come back to trust.

0:31:31 > 0:31:34- Because the rules... - TRUST me, I will come back to it.

0:31:34 > 0:31:37Or I'll apologise.

0:31:37 > 0:31:39But Andrew hasn't been in it, and I know, Laura,

0:31:39 > 0:31:41I've not to come to you yet.

0:31:41 > 0:31:45Everyone's had a say and will have far more of a say, but, Andrew,

0:31:45 > 0:31:48journalism, humanities and creative industries, that's your thing,

0:31:48 > 0:31:51as Dr Andrew Calcutt.

0:31:51 > 0:31:54We're on false news, fake news now,

0:31:54 > 0:32:01but in this cacophony of calumny and falsehood, is the truth dying?

0:32:01 > 0:32:06Well, I think the idea that democracy is threatened

0:32:06 > 0:32:11and truth is destroyed by a bunch of spotty youths from Macedonia...

0:32:11 > 0:32:13NICKY LAUGHS

0:32:13 > 0:32:16..it's just ludicrous, and equally risible is the CIA,

0:32:16 > 0:32:20of all people, complaining about state agencies interfering

0:32:20 > 0:32:25in the business of other sovereign nations, you couldn't make it up.

0:32:25 > 0:32:26APPLAUSE

0:32:26 > 0:32:28I think though that...

0:32:28 > 0:32:32I think some of my fellow panellists are trying to do the jigsaw

0:32:32 > 0:32:36by just the first couple of pieces that come to hand.

0:32:36 > 0:32:40And I think, really, if we all take a step back

0:32:40 > 0:32:43and look at that question, it's kind of too early to tell...

0:32:43 > 0:32:48whether social media and digital media are good for democracy,

0:32:48 > 0:32:51cos we're not in one, we're not in a democracy,

0:32:51 > 0:32:54we're in an era of something like zombie politics.

0:32:54 > 0:32:56So, if I may, it's a little bit like, was it Deng Xiaoping?

0:32:56 > 0:32:58When asked about the French Revolution.

0:32:58 > 0:32:59Yeah, it's too early to tell.

0:32:59 > 0:33:02And when asked if it worked, he said, "It's too early to tell."

0:33:02 > 0:33:05I don't know if it was Deng, but it's a little bit like that.

0:33:05 > 0:33:09We're in an era of zombie politics where the majority of

0:33:09 > 0:33:13the population is notable by its absence.

0:33:13 > 0:33:17And professional politicians have been like

0:33:17 > 0:33:19a kind of medicine show,

0:33:19 > 0:33:24spooning out what the majority population is supposed

0:33:24 > 0:33:27to accept, and you're meant to just take your medicine,

0:33:27 > 0:33:32and if you don't take it, you're decried as being a deplorable.

0:33:32 > 0:33:35And guess what?

0:33:35 > 0:33:38In the last year, major sections of that population have said,

0:33:38 > 0:33:41"We're not going to be the studio audience any more,

0:33:41 > 0:33:44"we're going to walk out on that scenario."

0:33:44 > 0:33:48And so, with all due respect, a lot of people sitting around

0:33:48 > 0:33:50the inner circle are indeed IN the inner circle.

0:33:50 > 0:33:54- They're in the bubble that is... - We are the problem.

0:33:54 > 0:33:58..that is rather far-removed from where most people are,

0:33:58 > 0:34:02and most people are not in any substantial sense

0:34:02 > 0:34:05- engaged in politics. - But if there's a lack of trust.

0:34:05 > 0:34:09The idea that this kind of interpersonal detail amounts

0:34:09 > 0:34:12to political polarisation is just ludicrous.

0:34:12 > 0:34:14Put your hand up in the audience if you want to say something

0:34:14 > 0:34:16on this, and I'll get round to you.

0:34:16 > 0:34:19But if you say that people are disillusioned with politics

0:34:19 > 0:34:22because of a lack of trust, we are now playing around

0:34:22 > 0:34:27with this platform which has stuff on it that we cannot trust.

0:34:27 > 0:34:29In there is a paradox somewhere, Dan, isn't there?

0:34:29 > 0:34:32Well, yeah, I'm particularly interested in this question

0:34:32 > 0:34:34of rules and regulations,

0:34:34 > 0:34:40because the most regulated part of the media is television.

0:34:40 > 0:34:44It is, and we've got this diverse public service system in the UK,

0:34:44 > 0:34:49and that is policed by Ofcom, and lo and behold, people's biggest source

0:34:49 > 0:34:54of news in the UK is television, and it's also the most trusted.

0:34:54 > 0:34:57Social media at the moment

0:34:57 > 0:35:00is much, much lower, both in terms of the amount of people

0:35:00 > 0:35:03that do access it for news, but also for levels of trust.

0:35:03 > 0:35:06- The problem is...- WILL:- Same is true of newspapers, by the way.

0:35:06 > 0:35:10The problem is you've got this issue of fake news,

0:35:10 > 0:35:12and we have to stamp it out before it gets bigger.

0:35:12 > 0:35:14- Too late.- It's not.

0:35:14 > 0:35:15It's not too late.

0:35:15 > 0:35:18But there's no such thing as fake news and not fake news

0:35:18 > 0:35:20and it's clear to determine which is which.

0:35:20 > 0:35:24It's such a big grey area between one and the other.

0:35:24 > 0:35:28- There may be, but... - Stamping it out is censorship.

0:35:28 > 0:35:31- DR ANDREW:- Aren't our lives micromanaged enough already?

0:35:31 > 0:35:33Do you have to regulate everything?

0:35:33 > 0:35:35We have to bring up the elephant in the room here,

0:35:35 > 0:35:38which is that what no-one wants to admit is that the idea of

0:35:38 > 0:35:41the fear of fake news is the fear of who is reading the fake news

0:35:41 > 0:35:44and what will they do with that information? And, essentially,

0:35:44 > 0:35:47when you say there is a danger of people being influenced

0:35:47 > 0:35:48by fake news negatively,

0:35:48 > 0:35:50you're making a differentiation between you,

0:35:50 > 0:35:52who can understand that the news is fake,

0:35:52 > 0:35:55and those thickos out there who can't, and that's a deeply...

0:35:55 > 0:35:58- DAN:- No, I don't...- Hang on. - Don't think that at all.

0:35:58 > 0:36:01If you think fake news should be stamped out, who do you want

0:36:01 > 0:36:03to stamp it out and why should they be the arbiters of truth?

0:36:03 > 0:36:05- And how do you stamp it out? - Exactly.

0:36:05 > 0:36:07Well, I think that's complicated.

0:36:07 > 0:36:10That's the easy answer to a really uncomfortable question.

0:36:10 > 0:36:12The horse has bolted.

0:36:12 > 0:36:15But you have different systems within different media,

0:36:15 > 0:36:17and television is heavily regulated,

0:36:17 > 0:36:20it didn't stop television holding power to account,

0:36:20 > 0:36:24generally being extremely accurate, being diverse and being popular.

0:36:24 > 0:36:26- It doesn't stop that.- But can I ask you why you're worried about...?

0:36:26 > 0:36:28Why are you so worried that people...?

0:36:28 > 0:36:30And I don't know who you're talking about when you're

0:36:30 > 0:36:33talking about people that don't understand that wild story

0:36:33 > 0:36:35about Ed Miliband fixing relationships

0:36:35 > 0:36:37with Hillary Clinton is fake.

0:36:37 > 0:36:41Well, BuzzFeed's research showed that there was

0:36:41 > 0:36:46more consumption of fake news during the US election than true news.

0:36:46 > 0:36:50- Entertainment. It's people having a laugh.- That's a problem.

0:36:50 > 0:36:52Can I ask a question? Just a hypothetical.

0:36:52 > 0:36:56If something completely fake about you went all over the world,

0:36:56 > 0:36:58like someone claimed you were the head of a drugs cartel,

0:36:58 > 0:37:00something obviously ridiculous,

0:37:00 > 0:37:03that went all around the world, they don't know you from Adam,

0:37:03 > 0:37:06and your picture is there, calling you the head of a drug cartel,

0:37:06 > 0:37:08you'd be like, "No correction needed, free speech,

0:37:08 > 0:37:09"they can say what they want"?

0:37:09 > 0:37:11Would you be comfortable with that, genuinely?

0:37:11 > 0:37:14I suppose I would publicly say that that isn't the case.

0:37:14 > 0:37:16Wouldn't that put you at risk?

0:37:16 > 0:37:19The cost of having a free society is people being able to say whatever

0:37:19 > 0:37:22they want, and sometimes that being untrue, sometimes that being nasty.

0:37:22 > 0:37:24- But you wouldn't want it stamped out? - That's the cost of a free society.

0:37:24 > 0:37:26I'm going to take that thought,

0:37:26 > 0:37:28people being able to say whatever they want,

0:37:28 > 0:37:30and we're going to discuss it in the context of digital media,

0:37:30 > 0:37:33but first of all, as promised, what would you like to say?

0:37:33 > 0:37:34Thank you.

0:37:34 > 0:37:39What I wanted to say was that this particular issue about news,

0:37:39 > 0:37:40where they are coming from,

0:37:40 > 0:37:43it's not the responsibility of one side, it's the responsibility

0:37:43 > 0:37:46of the receivers, us, the general public, as well,

0:37:46 > 0:37:50to find out where it is coming from, and if we can't get right to the bottom

0:37:50 > 0:37:52of it, at least do a little bit of research

0:37:52 > 0:37:54or investigation of our own.

0:37:54 > 0:37:57Nowadays, it's not only TV, as somebody said,

0:37:57 > 0:37:58it's mostly free newspapers

0:37:58 > 0:38:02on the train, that's the time we have to just listen and digest.

0:38:02 > 0:38:06Now, just, even if we take simple rules, golden rules...

0:38:06 > 0:38:10This particular rule I'm saying is coming from a Muslim community.

0:38:10 > 0:38:13It's a saying of Prophet Muhammad, a basic saying, a Hadith,

0:38:13 > 0:38:15that for a person to be a liar,

0:38:15 > 0:38:18it is enough that he passes on whatever he hears

0:38:18 > 0:38:21without making any effort to check.

0:38:21 > 0:38:24So, if everybody takes just this simple, basic rule,

0:38:24 > 0:38:27we'll avoid so much of the spreading around of

0:38:27 > 0:38:30news that we heard on WhatsApp or Facebook...

0:38:30 > 0:38:32- Rumours, gossip. - That's it, yeah.

0:38:32 > 0:38:35The Hadith are very strong against rumours and backbiting.

0:38:35 > 0:38:39So, if everybody sticks to that rule, so much will be relieved.

0:38:39 > 0:38:43- That's actually a really smart point.- It's the best point so far.

0:38:43 > 0:38:45- LAUGHTER - As a fact-checker,

0:38:45 > 0:38:49it's lovely to think that we can use fact-checking to empower everybody.

0:38:49 > 0:38:51The thing you've got to ask is this -

0:38:51 > 0:38:55the flipside of the question of who's the person who can

0:38:55 > 0:38:57split the difference between all those stupid people

0:38:57 > 0:39:01who believe the fake news and the rest of us is that, actually, none of us

0:39:01 > 0:39:03can just skim through a set of headlines

0:39:03 > 0:39:04and know which is true or not.

0:39:04 > 0:39:07It's immensely difficult to spot people who are

0:39:07 > 0:39:10trying to mislead you and people who aren't.

0:39:10 > 0:39:13Without the thrills and spills and the excitement of,

0:39:13 > 0:39:16"Ooh, what if that's true? That's great."

0:39:16 > 0:39:19- Exactly.- It's a completely different set of personal standards we apply

0:39:19 > 0:39:22to that than we do to watching TV news or radio news, but you can...

0:39:22 > 0:39:24Never mind splitting the difference,

0:39:24 > 0:39:27here's somebody who made a difference.

0:39:27 > 0:39:30Laura, your campaign, the tampon tax,

0:39:30 > 0:39:33- that was through social media, wasn't it?- Yep.

0:39:33 > 0:39:36I think one of the main really good points about social media is

0:39:36 > 0:39:39that it really gives power to under-represented communities

0:39:39 > 0:39:41of people, like women, for example.

0:39:41 > 0:39:45So, tampon tax has existed since 1973 and people have campaigned

0:39:45 > 0:39:49for generations to end it, but the reason why this campaign

0:39:49 > 0:39:51finally won - at least I think, anyway -

0:39:51 > 0:39:54is because we've had this platform of social media

0:39:54 > 0:39:57that we've never had before, and upon that platform we can finally,

0:39:57 > 0:40:00really be heard, and politicians can't ignore us any more.

0:40:00 > 0:40:04So, I think that's one really good and important point

0:40:04 > 0:40:06of social media, that individuals can really use it,

0:40:06 > 0:40:09rather than political parties or even news outlets.

0:40:09 > 0:40:11It gives the individual back power,

0:40:11 > 0:40:14- and then communities that are under-represented.- Yeah.

0:40:14 > 0:40:17There's loads of examples, things I'm interested in,

0:40:17 > 0:40:19the whole pressure on SeaWorld in Florida...

0:40:19 > 0:40:21The Black Lives Matter campaign.

0:40:21 > 0:40:24The Black Lives Matter campaign, your campaign,

0:40:24 > 0:40:25but it's not just a click, is it?

0:40:25 > 0:40:27Cos it's what you were saying earlier on, Helen,

0:40:27 > 0:40:29it's a million clicks.

0:40:29 > 0:40:33Exactly, the clicks scale up to something really significant.

0:40:33 > 0:40:37- Also, people don't just click and do nothing else...- They're engaged.

0:40:37 > 0:40:39Exactly, people don't just scroll through, like Change.org,

0:40:39 > 0:40:42for example, and click on every petition.

0:40:42 > 0:40:46I get people e-mailing me every day with little spelling errors

0:40:46 > 0:40:47I've made that I've not even seen.

0:40:47 > 0:40:50So they definitely do read through your campaigns.

0:40:50 > 0:40:53We also have lots of communities throughout the world really

0:40:53 > 0:40:56that have bound together and made a difference within

0:40:56 > 0:40:59their community, whether that be through tampon tax or

0:40:59 > 0:41:00the new campaign about homeless people

0:41:00 > 0:41:03and improving sanitary provision to homeless women.

0:41:03 > 0:41:06Lots of people do stuff physically that's inspired by...

0:41:06 > 0:41:09Did you get any abuse and vile trolling?

0:41:09 > 0:41:12- As a result of your campaign.- I did.

0:41:12 > 0:41:15Especially as a woman, or a feminist campaigner online,

0:41:15 > 0:41:18you're just open to that, which is really sad,

0:41:18 > 0:41:20but I found that online trolling is actually

0:41:20 > 0:41:23a really important form of sexism that I've incurred anyway,

0:41:23 > 0:41:26in that it's really obviously documented and it's evidence

0:41:26 > 0:41:29- to show that sexism really does exist.- Shocking.

0:41:29 > 0:41:33Yeah, it shocks people, but women face sexism on an everyday basis.

0:41:33 > 0:41:35- Not like that. - Lots of people basically say,

0:41:35 > 0:41:37"Oh, sexism doesn't exist,

0:41:37 > 0:41:40"we have never experienced it as men, so therefore it doesn't..."

0:41:40 > 0:41:43- But some of that stuff is psychotic, it's not just...- Yeah, but I mean,

0:41:43 > 0:41:48it's just evidence of feelings that are evident throughout society,

0:41:48 > 0:41:50and if that comes out online that's, in a way, a good thing,

0:41:50 > 0:41:54cos it shows that sexism really does exist and we need to tackle it.

0:41:54 > 0:41:55It's lifted up the stone.

0:41:55 > 0:41:59What about the fact that so many female MPs, from all sides of the chamber,

0:41:59 > 0:42:05of the political divide, have had some dreadful, dreadful stuff?

0:42:05 > 0:42:10- Absolutely vile, and...- What's that about?- I think it's about...

0:42:10 > 0:42:14It's the power of the network you're talking about, with the tampon tax,

0:42:14 > 0:42:16and networks before were old boys' networks,

0:42:16 > 0:42:18they were for the powerful,

0:42:18 > 0:42:21whereas now we can have networks for everyone to come together.

0:42:21 > 0:42:25But as well as that, that attracts, and what the internet does

0:42:25 > 0:42:31really well is amplify what is out there in the debate.

0:42:31 > 0:42:35It attracts those who are the most vocal and the most aggressive

0:42:35 > 0:42:39to take part in a debate, and what I really dislike is this idea

0:42:39 > 0:42:42that if you respond to that you are feeding the trolls.

0:42:42 > 0:42:48I think that women need to put their voices out there and we need

0:42:48 > 0:42:53to take back that space, we need to take back the internet space,

0:42:53 > 0:42:55and what I would say, again,

0:42:55 > 0:43:00that's why we need a more active government setting.

0:43:00 > 0:43:05It's too early to tell, I agree, the impact of digital media

0:43:05 > 0:43:08on democracy, but we need to decide what that impact is.

0:43:08 > 0:43:10More active government, what do you mean?

0:43:10 > 0:43:15We need a government that's looking at the issues,

0:43:15 > 0:43:17protecting free speech, but also

0:43:17 > 0:43:19against hate speech online, for example.

0:43:19 > 0:43:22But if you were to meet one of these people, and very often,

0:43:22 > 0:43:26Owen did, they're not very impressive human beings.

0:43:26 > 0:43:28They're pretty sad individuals, but they become

0:43:28 > 0:43:31these keyboard warriors, and as Owen said earlier on,

0:43:31 > 0:43:35they say things to you that they would never say to your face.

0:43:35 > 0:43:38That's not representative, it's just brought out a sense of...

0:43:38 > 0:43:41It's empowered some rather pathetic people.

0:43:41 > 0:43:44Well, it's empowered, if you like, the wrong people, but there

0:43:44 > 0:43:49also need to be sanctions, and this is the point about...

0:43:49 > 0:43:51It's empowered the wrong people.

0:43:51 > 0:43:55It's empowered the wrong people when they're attacking and trolling.

0:43:55 > 0:43:57And there need to be sanctions, and I'm very interested,

0:43:57 > 0:44:01the idea that you don't believe there should be sanctions

0:44:01 > 0:44:05online for the sort of behaviour which, in the street,

0:44:05 > 0:44:08if somebody attacks me in the street in the way that they do online,

0:44:08 > 0:44:09there would be sanctions.

0:44:09 > 0:44:13What I'm saying is that regulation that applies in real life

0:44:13 > 0:44:14also applies online.

0:44:14 > 0:44:17I don't think there should be any sanctions on speech

0:44:17 > 0:44:19- in real life or anywhere else. - It's empowering the wrong people.

0:44:19 > 0:44:21The phrase "empowering the wrong people"

0:44:21 > 0:44:23is so terrifying to hear from an elected MP,

0:44:23 > 0:44:25that is just shocking to me,

0:44:25 > 0:44:28because what this is really masking - and the question about

0:44:28 > 0:44:32women is so interesting, because, so often, censorship online

0:44:32 > 0:44:34is couched in terms of protecting women, protecting minorities,

0:44:34 > 0:44:38and what that basically is saying is that women cannot handle

0:44:38 > 0:44:42really sometimes quite awful abuse online, as well as men,

0:44:42 > 0:44:43to me that's deeply...

0:44:43 > 0:44:46Do you think it stops women putting their head above the parapet,

0:44:46 > 0:44:49wanting to stand for parliament for example?

0:44:49 > 0:44:52Hang on, if you're a public figure, especially if you're an MP,

0:44:52 > 0:44:55you should be able to accept that you're talking to the public

0:44:55 > 0:44:57and the public should be allowed to talk to you.

0:44:57 > 0:44:59Sometimes that's particularly nasty, actually,

0:44:59 > 0:45:02the majority of so-called sexist abuse that MPs get -

0:45:02 > 0:45:05and I'm thinking about certain MPs who are very vocal about it -

0:45:05 > 0:45:08Jess Phillips, Yvette Cooper - is actually just criticism of them.

0:45:08 > 0:45:11But hang on, if someone came to your constituency surgery and said,

0:45:11 > 0:45:13"I think you should be raped",

0:45:13 > 0:45:16you wouldn't say, "Hang on a minute, I'll be with you in just a second."

0:45:16 > 0:45:19- Is that what you're saying? - No, no, that's...

0:45:19 > 0:45:21It's fine that they're empowered, is it?

0:45:21 > 0:45:24Death threats and things like that are clearly illegal

0:45:24 > 0:45:27and should be dealt with through law, but what we're really talking

0:45:27 > 0:45:30- about now is a broader question of what is sexist...- No, we're not!

0:45:30 > 0:45:33..and actually what is talked about is saying stuff like...

0:45:33 > 0:45:37And actually I've written on this a lot - what is sexist trolling?

0:45:37 > 0:45:39And it's often about calling women fat

0:45:39 > 0:45:41or saying horrible things about women.

0:45:41 > 0:45:45And that is seen - especially with the tampon tax - that is seen...

0:45:45 > 0:45:47Jamie, where are you on this?

0:45:47 > 0:45:49Just to say, just finally,

0:45:49 > 0:45:54just to say that women are more subject to online abuse

0:45:54 > 0:45:56and that there should be sanctions to protect women is more

0:45:56 > 0:45:58deeply sexist than anything an online troll can...

0:45:58 > 0:46:00How do we stop this? Should we stop it?

0:46:00 > 0:46:03No, we can't stop it, and we shouldn't stop all types of trolling,

0:46:03 > 0:46:07and we shouldn't stop all types of offensive language or behaviour,

0:46:07 > 0:46:11because when we live in a society where people are slightly worried

0:46:11 > 0:46:15about self-censoring, not being able to speak their mind, actually,

0:46:15 > 0:46:18sometimes internet trolls play quite a valuable role.

0:46:18 > 0:46:21They push at the boundaries of offensiveness and make

0:46:21 > 0:46:23other people feel like they might be able to speak their mind, too.

0:46:23 > 0:46:26Yes, of course there are limits to that,

0:46:26 > 0:46:28where it gets particularly targeted or where it's

0:46:28 > 0:46:31a threat against an individual, but there's always a danger,

0:46:31 > 0:46:33I think, that with internet trolls, we just say,

0:46:33 > 0:46:35"They're terrible, we've got to silence them all."

0:46:35 > 0:46:37That would actually be quite detrimental

0:46:37 > 0:46:39for the health of our democracy.

0:46:39 > 0:46:41- Owen, do they have their uses? - What, trolls?- Yeah.

0:46:41 > 0:46:43Well, we conflate, I think, trolls and abuse.

0:46:43 > 0:46:46I get trolled all the time - I look like a 12-year-old,

0:46:46 > 0:46:48some age better than others, what can you do?

0:46:48 > 0:46:49LAUGHTER

0:46:49 > 0:46:52You wrote an article, if I may say, it was provocative,

0:46:52 > 0:46:54as most of your articles can be,

0:46:54 > 0:46:59but it was pretty thoughtful and well-argued about the fact that

0:46:59 > 0:47:04having Jeremy Corbyn as leader might not be the best idea in the world,

0:47:04 > 0:47:06and it was interesting, and I watched it happen,

0:47:06 > 0:47:09it unfolded in front of me like a fight in the school yard.

0:47:09 > 0:47:12You weren't behaving in that manner, but other people were.

0:47:12 > 0:47:14The abuse you got was extraordinary.

0:47:14 > 0:47:17Well, I am a right-wing sell-out, as my mum would tell me.

0:47:17 > 0:47:20- No, I'm joking.- There's not much room for civilised debate sometimes.

0:47:20 > 0:47:22No, but I would say this actually, because, yes,

0:47:22 > 0:47:25I've had that kind of attack and it seems ridiculous,

0:47:25 > 0:47:28we've had a lot of focus on that,

0:47:28 > 0:47:31but the vast majority of abuse I get is from the far right

0:47:31 > 0:47:34and the hard right, and that is stuff like threats of violence,

0:47:34 > 0:47:37threats of death, sharing my home address,

0:47:37 > 0:47:40sharing pictures from Google Street View of my bedroom

0:47:40 > 0:47:42and my front door, with arrows pointing at them saying,

0:47:42 > 0:47:45"Here's his bedroom, here's his door."

0:47:45 > 0:47:49Now, if you're taking an absolute view on free speech, which is that people

0:47:49 > 0:47:52have the right to say I should be chopped up because I'm gay,

0:47:52 > 0:47:56that people online should... I don't take it seriously, I should say.

0:47:56 > 0:47:57Some of my friends think I should take it

0:47:57 > 0:47:59far more seriously than I do, but nonetheless,

0:47:59 > 0:48:03in the interests of free speech, people can basically issue a hate...

0:48:03 > 0:48:05So, what do we do about it?

0:48:05 > 0:48:07- Well, it should be.- Over to you.

0:48:07 > 0:48:10In the real world, if someone says to you that you should...

0:48:10 > 0:48:11It's not the real world.

0:48:11 > 0:48:13It is the real world online, of course it is,

0:48:13 > 0:48:15and I have to say,

0:48:15 > 0:48:18after the death of Jo Cox by a far-right terrorist,

0:48:18 > 0:48:21we should take far more seriously the sort of hate speech

0:48:21 > 0:48:23we see online, and it often legitimises the sorts of extremism

0:48:23 > 0:48:25- that can end in violence. - There you are,

0:48:25 > 0:48:26that's where that ends up, Ella.

0:48:26 > 0:48:28No, I don't think it necessarily ends up

0:48:28 > 0:48:30in the tragic death of Jo Cox,

0:48:30 > 0:48:33which was not through a social media hate campaign,

0:48:33 > 0:48:34that was not what happened.

0:48:34 > 0:48:38- That's not what I said. - But the thing is,

0:48:38 > 0:48:41the left used to be about protecting and arguing for

0:48:41 > 0:48:42people's freedoms, and right now,

0:48:42 > 0:48:48a lot of the trolling online comes from, yes, right-wing saddos

0:48:48 > 0:48:50who are keyboard warriors and they really wouldn't threaten a mouse,

0:48:50 > 0:48:52let alone women or...

0:48:52 > 0:48:57There's left-wing saddos, as well, for the sake of BBC impartiality.

0:48:57 > 0:48:59It comes from both sides,

0:48:59 > 0:49:03but on the left we used to argue for freedom, and absolute freedom,

0:49:03 > 0:49:06and the freedom for somebody to be able to express themselves

0:49:06 > 0:49:10in any way, and that's actually now the biggest argument

0:49:10 > 0:49:12against free speech,

0:49:12 > 0:49:14which is an absolute right in a free society,

0:49:14 > 0:49:15if you don't have free speech,

0:49:15 > 0:49:17you don't have any kind of freedom,

0:49:17 > 0:49:20cos it's the fundamental right to say what you believe,

0:49:20 > 0:49:23and the left is now trying to close that down through sanctions...

0:49:23 > 0:49:26- Andrew, wait a minute.- ..through sanctions and censorship and rules.

0:49:26 > 0:49:32Dr Andrew Calcutt, is this just a necessary

0:49:32 > 0:49:35or unavoidable consequence of the digital media platforms?

0:49:35 > 0:49:40Is it just a manifestation of aspects of the human condition,

0:49:40 > 0:49:42or is it something that we should stamp on?

0:49:42 > 0:49:46Oh, I certainly don't think we should be stamping

0:49:46 > 0:49:50and regulating things out of existence, not by any means.

0:49:50 > 0:49:55I think it would just be ludicrous to try and close it down,

0:49:55 > 0:50:01and sanitise it, and outlaw the deplorables.

0:50:01 > 0:50:06It's the same kind of Hillary Clinton discourse all over again.

0:50:06 > 0:50:09You shouldn't be allowed to say that "we don't want you",

0:50:09 > 0:50:11even in the echo chamber.

0:50:11 > 0:50:13But what about, this is something, we'll come back to Dan,

0:50:13 > 0:50:15cos Dan mentioned this earlier on,

0:50:15 > 0:50:17and I think somebody else did, too - seems an awful long time ago -

0:50:17 > 0:50:22Facebook and Twitter, are they tech platforms or are they publishers?

0:50:22 > 0:50:26Well, they're something like a hybrid in between the two.

0:50:26 > 0:50:31But I don't think they are either the problem or the solution.

0:50:31 > 0:50:36I think we've lived through decades of intense interpersonalisation,

0:50:36 > 0:50:40privatisation of everyday life,

0:50:40 > 0:50:44mixed in with a kind of post-political managerial elite

0:50:44 > 0:50:48that wants to micromanage every aspect of our existence,

0:50:48 > 0:50:50including what we do online...

0:50:50 > 0:50:53I don't think anyone's saying that.

0:50:53 > 0:50:56- Forgive me... - Dan, come back on that.

0:50:56 > 0:50:59I don't think anyone's saying, I would imagine...

0:50:59 > 0:51:01Well, of course, you're not going to say it.

0:51:01 > 0:51:03I would imagine everyone on this panel...

0:51:03 > 0:51:06Because I'm revealing what you don't want to say.

0:51:06 > 0:51:08..wants a society of free speech in this country.

0:51:08 > 0:51:11Yeah, well, they all say that, come on.

0:51:11 > 0:51:15But a total and utter free-for-all ignores the fact that

0:51:15 > 0:51:18there are other aspects of existence that also have importance.

0:51:18 > 0:51:21We're not talking about micromanaging that out of existence,

0:51:21 > 0:51:25we're just talking about creating an environment

0:51:25 > 0:51:26where there's some basic decency.

0:51:26 > 0:51:28Creating an environment means managing it.

0:51:28 > 0:51:31But we already have that! We already have that in our media.

0:51:31 > 0:51:33One of the best things about the original incarnation of

0:51:33 > 0:51:36the internet was the Electronic Frontier Foundation,

0:51:36 > 0:51:39which saw it as a frontier for experimentation...

0:51:39 > 0:51:41But we already have it in the world

0:51:41 > 0:51:44of television and newspapers and radio.

0:51:44 > 0:51:46THEY TALK OVER EACH OTHER

0:51:46 > 0:51:50Dan, as a Channel 4 man... Wait a minute, everybody, please!

0:51:50 > 0:51:51Or I'll unfollow you all.

0:51:51 > 0:51:53LAUGHTER

0:51:54 > 0:51:59As a TV man, as a Channel 4 man, are you not slightly on

0:51:59 > 0:52:02the defensive here cos you're old news

0:52:02 > 0:52:05and this is what's happening now?

0:52:05 > 0:52:07It's going to happen in the future, because the TV news -

0:52:07 > 0:52:11Channel 4 News - great programme - Sky News, they do a wonderful job,

0:52:11 > 0:52:16BBC, ITN, likewise - but don't they all look a little bit leaden-footed?

0:52:16 > 0:52:19No, I don't think that's right, actually.

0:52:19 > 0:52:22I don't think that is a view of what is actually happening.

0:52:22 > 0:52:27As I said earlier, television is the main source of news for most people.

0:52:27 > 0:52:30But the traditional media companies - Channel 4 included,

0:52:30 > 0:52:33the BBC included - have done an incredible job of going into

0:52:33 > 0:52:36the new world and distributing video-based news.

0:52:36 > 0:52:40Channel 4 News is now the biggest British broadcaster

0:52:40 > 0:52:42putting news into Facebook.

0:52:42 > 0:52:46But I think you've done a deplorable job on telling the story of

0:52:46 > 0:52:50what's been going on in Western societies for the last 20, 30 years.

0:52:50 > 0:52:51That's why you didn't get

0:52:51 > 0:52:54that people were going to vote for Brexit.

0:52:54 > 0:52:57That's why the mainstream media didn't get that people were

0:52:57 > 0:52:59going to vote for Trump in the way that they did...

0:52:59 > 0:53:02- That's as much about politicians... - ..because you haven't covered the story.

0:53:02 > 0:53:05Sorry, just to complete on that point, I think the important

0:53:05 > 0:53:08thing to say about the old media going into the new media space

0:53:08 > 0:53:11is that we, the BBC, the other public service broadcasters,

0:53:11 > 0:53:15we apply the standards that exist in the television world -

0:53:15 > 0:53:17which are much higher, and, in the UK,

0:53:17 > 0:53:21are the gold standard in the world, in any form of media -

0:53:21 > 0:53:23we apply those standards in a world

0:53:23 > 0:53:25where other people aren't applying them.

0:53:25 > 0:53:28That's not to say the internet-only players don't have terrifically

0:53:28 > 0:53:31high standards. Some of them do, people like Huffington Post.

0:53:31 > 0:53:34I think you're covering yourself in medals you don't deserve.

0:53:34 > 0:53:36It's more like self-service broadcasting.

0:53:36 > 0:53:40- But there are many who don't. - Andrew's on fire now!

0:53:40 > 0:53:42Interesting points though, interesting points.

0:53:42 > 0:53:45Self-service broadcasting, that's quite a... Jamie?

0:53:45 > 0:53:48- What?- You've been listening?

0:53:48 > 0:53:50You know about this stuff?

0:53:50 > 0:53:54- The reality is, I think... - The cat's out of the bag.

0:53:54 > 0:53:57Well, it's getting far more unstable and unpredictable,

0:53:57 > 0:53:59and that's also very exciting.

0:53:59 > 0:54:01And we definitely need to reinvigorate our democracies,

0:54:01 > 0:54:05- we need new ideas... - Do we need new controls?

0:54:05 > 0:54:07What about Facebook, what about Twitter?

0:54:07 > 0:54:09They're publishers, essentially, aren't they?

0:54:09 > 0:54:10Shouldn't they be doing more?

0:54:10 > 0:54:13If they are judged as publishers, they have an opt-out,

0:54:13 > 0:54:17under 1996 legislation in the US, which means they're not

0:54:17 > 0:54:20responsible for things that are published on their platform.

0:54:20 > 0:54:23Because, if they were, they would probably collapse immediately,

0:54:23 > 0:54:25because they'd have to check everything before it was posted,

0:54:25 > 0:54:27and we're talking about millions, billions...

0:54:27 > 0:54:30You can go online right now, you can find IS, you can find vile

0:54:30 > 0:54:34Islamism, you can find vile Nazism, you can find Holocaust denial...

0:54:34 > 0:54:37It's not possible for them to actually check everything all

0:54:37 > 0:54:39the time, it's simply too much.

0:54:39 > 0:54:42But they have a moral imperative to do significantly more

0:54:42 > 0:54:43than they are doing.

0:54:43 > 0:54:47At the moment, they have immense power, as companies,

0:54:47 > 0:54:51and they are simply not exercising enough responsibility.

0:54:51 > 0:54:53I think we're mixing up two things, or, Dan, I think,

0:54:53 > 0:54:55with respect, you're mixing up two things.

0:54:55 > 0:54:58One is that we all, as individuals, are on social media.

0:54:58 > 0:55:01And what we do as individuals is our business,

0:55:01 > 0:55:03and everyone else can mind theirs.

0:55:03 > 0:55:05There are also lots of people publishing on social media,

0:55:05 > 0:55:07who are organisations,

0:55:07 > 0:55:10who are doing what used to be only the remit of proper journalists

0:55:10 > 0:55:13with, hopefully, proper journalistic standards.

0:55:13 > 0:55:17And, obviously, as a professional journalist who lives by those

0:55:17 > 0:55:21standards, you're offended to see people purporting to do that

0:55:21 > 0:55:23and not living up to those standards.

0:55:23 > 0:55:26Those are the people you have a beef with, and I think we need to be...

0:55:26 > 0:55:28No, I have a beef with people who are outright lying,

0:55:28 > 0:55:31and also people who are doing it simply for their own financial ends,

0:55:31 > 0:55:34before we even get to political influence.

0:55:34 > 0:55:35Helen, I haven't heard from you for a while?

0:55:35 > 0:55:38The point is that social media platforms are now,

0:55:38 > 0:55:42to some extent, institutions of democracy, if you like.

0:55:42 > 0:55:44I mean, they are where our democracies are played out.

0:55:44 > 0:55:46They're where our campaigns take place,

0:55:46 > 0:55:49where our elections take place, where our governments act,

0:55:49 > 0:55:52and where the president-elect is actually MAKING policy.

0:55:52 > 0:55:55But they've only been there for about five years,

0:55:55 > 0:55:57they've only been around at all for ten,

0:55:57 > 0:56:00and they've only been intertwined with our political systems for five.

0:56:00 > 0:56:02So, I suppose, it's not surprising we've got

0:56:02 > 0:56:04a lot of institutional catch-up to do.

0:56:04 > 0:56:08But there are things we can do, and there are things being done.

0:56:08 > 0:56:11I mean, not enough, it's too late, et cetera,

0:56:11 > 0:56:14but we shouldn't just throw our hands in the air and give up.

0:56:14 > 0:56:17And we shouldn't lump everything together.

0:56:17 > 0:56:20We're lumping together fake news, trolls -

0:56:20 > 0:56:24the small, very small number of people who are threatening to

0:56:24 > 0:56:26close down Twitter entirely.

0:56:26 > 0:56:30Right now, this is a fantastic debate, but out there,

0:56:30 > 0:56:34there is an even more fantastic debate about this debate,

0:56:34 > 0:56:36happening right now, on Twitter.

0:56:36 > 0:56:37If only they could all...

0:56:37 > 0:56:41That's right, even this debate about politics is being played out

0:56:41 > 0:56:43on social media as we speak, exactly.

0:56:43 > 0:56:46But the solutions to the things are different, as well.

0:56:46 > 0:56:50The solutions to fake news are things like your organisation

0:56:50 > 0:56:53and what Facebook have said they will do about fake news.

0:56:53 > 0:56:56Both they themselves, in terms of fact-checking, and allowing

0:56:56 > 0:56:59individual social media users to report.

0:56:59 > 0:57:04The solutions are different for trolls and they are different

0:57:04 > 0:57:07for all the other things we've been talking about.

0:57:07 > 0:57:11But there are possibilities out there, and they are being worked at.

0:57:11 > 0:57:13There are even bots being developed...

0:57:13 > 0:57:16Bots, just explain to people who don't know what a bot is?

0:57:16 > 0:57:18An automated Twitter account, for example.

0:57:18 > 0:57:23There are automated Twitter accounts out there, facing racism,

0:57:23 > 0:57:30attacking racism, or just trying to persuade racist trolls of the

0:57:30 > 0:57:33error of their ways, et cetera.

0:57:33 > 0:57:36And some of those things are being shown to work.

0:57:36 > 0:57:38So, there are a lot of possibilities out there,

0:57:38 > 0:57:39but we can't just lump them all together.

0:57:39 > 0:57:43Chi, we're getting towards the end, but we've got about...

0:57:43 > 0:57:46A 40-second answer from you, if you would?

0:57:46 > 0:57:47Do you celebrate this?

0:57:47 > 0:57:50Is it a better form of contact with your constituents,

0:57:50 > 0:57:52as an elected representative?

0:57:52 > 0:57:55I celebrate it as an elected representative, in terms of

0:57:55 > 0:57:59being able, instantly, almost, to see some constituents -

0:57:59 > 0:58:03those who are on Twitter, or those who are on Facebook -

0:58:03 > 0:58:07respond to really important issues. That is fantastic, online.

0:58:07 > 0:58:10But I also recognise there are so many who aren't making those

0:58:10 > 0:58:13responses, because a lot of people have better things to do with

0:58:13 > 0:58:17their lives than respond to their MPs online!

0:58:17 > 0:58:19But, also, I think we haven't talked about this,

0:58:19 > 0:58:23the rules that Facebook and others use in order to decide what

0:58:23 > 0:58:27you see, the algorithms, they are what control what people see.

0:58:27 > 0:58:30And after this, we're all going to check our phones, right?

0:58:30 > 0:58:32Thank you all very much indeed.

0:58:32 > 0:58:35As always, the debate continues on the digital universe,

0:58:35 > 0:58:37on Twitter and online, join us next Sunday from Bradford,

0:58:37 > 0:58:39goodbye from everyone here in Uxbridge.

0:58:39 > 0:58:41Have a great Sunday.

0:58:41 > 0:58:43APPLAUSE