Line | From | To | |
---|---|---|---|
Today, on The Big Questions, is digital media good for democracy? | 0:00:02 | 0:00:05 | |
APPLAUSE | 0:00:15 | 0:00:18 | |
Good morning. I'm Nicky Campbell. Welcome to The Big Questions. | 0:00:18 | 0:00:22 | |
Today, we're back at Brunel University, London, in Uxbridge, | 0:00:22 | 0:00:25 | |
to debate one very big question - is digital media good for democracy? | 0:00:25 | 0:00:30 | |
Welcome, everybody, to The Big Questions. | 0:00:30 | 0:00:33 | |
Now, this Friday, Donald Trump will be sworn in as | 0:00:35 | 0:00:38 | |
the 45th President of the United States of America. | 0:00:38 | 0:00:41 | |
Many say he owes his victory to a greater mastery of the tools | 0:00:41 | 0:00:46 | |
of this digital age - Twitter, Facebook, | 0:00:46 | 0:00:48 | |
and all the other forms of online communication. | 0:00:48 | 0:00:51 | |
And, US intelligence agencies are saying that Russian | 0:00:51 | 0:00:54 | |
intervention in cyberspace played a hand, too. | 0:00:54 | 0:00:58 | |
Well, here in Britain, | 0:00:58 | 0:01:00 | |
the Brexit campaign was acknowledged to be far savvier in its use | 0:01:00 | 0:01:03 | |
of social media than the Remain side, | 0:01:03 | 0:01:05 | |
and the Scottish Nationalist online army narrowed the gap in | 0:01:05 | 0:01:08 | |
their independence referendum and helped drive out Labour | 0:01:08 | 0:01:12 | |
in the 2015 general election. | 0:01:12 | 0:01:14 | |
So, there's no denying the power of digital media. | 0:01:14 | 0:01:17 | |
But, it also poses a real challenge. | 0:01:17 | 0:01:20 | |
How do you know what is true and what is fake in an online world | 0:01:20 | 0:01:24 | |
with no editorial control or standards? | 0:01:24 | 0:01:27 | |
And, what is the effect on democracy of trolling, fake news, | 0:01:27 | 0:01:32 | |
and the echo chambers created by algorithms? | 0:01:32 | 0:01:35 | |
To debate this brave new world, | 0:01:35 | 0:01:37 | |
we've assembled a digitally savvy front row of journalists, | 0:01:37 | 0:01:41 | |
politicians, campaign advisers, techno wizards, | 0:01:41 | 0:01:45 | |
think tank heavyweights, television executives, learned academics, | 0:01:45 | 0:01:48 | |
all here in the flesh, | 0:01:48 | 0:01:50 | |
not created in some virtual world. | 0:01:50 | 0:01:52 | |
They are all too real. | 0:01:52 | 0:01:55 | |
And, you can join in on Twitter or online, | 0:01:55 | 0:01:59 | |
by logging into bbc.co.uk/thebigquestions. | 0:01:59 | 0:02:02 | |
Follow the link to the online discussion. | 0:02:02 | 0:02:04 | |
Lots of engagement, | 0:02:04 | 0:02:05 | |
contributions from our highly representative | 0:02:05 | 0:02:09 | |
Uxbridge audience as well. | 0:02:09 | 0:02:11 | |
Well, Jamie Bartlett, if I may... | 0:02:11 | 0:02:16 | |
director at the Centre for the Analysis of Social Media, | 0:02:16 | 0:02:19 | |
the very man. | 0:02:19 | 0:02:21 | |
This has changed politics forever, hasn't it? | 0:02:21 | 0:02:24 | |
For years, we have been saying how do we get young people? | 0:02:24 | 0:02:26 | |
How do we get more people engaged? | 0:02:26 | 0:02:28 | |
How do we stop them from being disenfranchised? | 0:02:28 | 0:02:31 | |
We have done just that. | 0:02:31 | 0:02:32 | |
It has got a lot more people engaged. | 0:02:32 | 0:02:34 | |
There's no doubt about that. And it's not just people tweeting | 0:02:34 | 0:02:38 | |
and posting things on Facebook. | 0:02:38 | 0:02:40 | |
Social media has been remarkably good | 0:02:40 | 0:02:43 | |
at forming a bridge to get people into real-world politics as well. | 0:02:43 | 0:02:47 | |
Various types of activism, marching, demonstrating, even voting. | 0:02:47 | 0:02:51 | |
I think people are more likely to vote if they get involved in | 0:02:51 | 0:02:54 | |
-politics online. -All good. -That is all good. | 0:02:54 | 0:02:57 | |
And I think everybody here would acknowledge that is | 0:02:57 | 0:03:00 | |
a very good thing. But, of course, it creates new problems. | 0:03:00 | 0:03:03 | |
And those problems, I think, we're only just beginning to grapple with. | 0:03:03 | 0:03:07 | |
I think it's making politics, in some ways, far more polarised, | 0:03:07 | 0:03:11 | |
far more difficult to predict or control. | 0:03:11 | 0:03:14 | |
In many ways, Donald Trump is the perfect politician for | 0:03:14 | 0:03:17 | |
a digital age. | 0:03:17 | 0:03:18 | |
We are overwhelmed with information, | 0:03:18 | 0:03:20 | |
graphs and charts and infographics and tweets and retweets. | 0:03:20 | 0:03:24 | |
And in those circumstances, we tend to go for whatever is the | 0:03:24 | 0:03:28 | |
loudest, the most offensive, the most emotive. | 0:03:28 | 0:03:31 | |
And that is why politics, I think, | 0:03:31 | 0:03:33 | |
is becoming more polarised, it is creating more centres of | 0:03:33 | 0:03:36 | |
power that we barely understand, huge tech companies that can | 0:03:36 | 0:03:40 | |
control what we see online, and we don't really know how they work. | 0:03:40 | 0:03:44 | |
So, while it's bringing more people into politics, | 0:03:44 | 0:03:48 | |
it's not clear to me yet whether it's making politics any better. | 0:03:48 | 0:03:51 | |
In fact, in some ways, it's making it rather worse. | 0:03:51 | 0:03:54 | |
Well, there are loads of issues there, | 0:03:54 | 0:03:56 | |
I'm really looking forward to the next hour or so. | 0:03:56 | 0:03:58 | |
Helen, he makes some good points. | 0:03:58 | 0:04:00 | |
Are people actually becoming better informed and more engaged? | 0:04:00 | 0:04:06 | |
Because a click of a mouse is one thing, | 0:04:06 | 0:04:09 | |
but actually trudging up the stairwells of high rises | 0:04:09 | 0:04:12 | |
delivering leaflets on a cold February night is another | 0:04:12 | 0:04:16 | |
thing entirely. | 0:04:16 | 0:04:18 | |
I don't think we should denigrate the click of the mouse. | 0:04:18 | 0:04:21 | |
I think there is a culture, particularly in politics, | 0:04:21 | 0:04:23 | |
particularly British politics, that politics must be painful. | 0:04:23 | 0:04:27 | |
You've got to do something cold, or long, or boring, or hard work. | 0:04:27 | 0:04:31 | |
And what digital media have done | 0:04:31 | 0:04:33 | |
is they've sort of democratised the act of political participation. | 0:04:33 | 0:04:36 | |
So, now, there are very small acts of participation. | 0:04:36 | 0:04:39 | |
Liking or clicking something. | 0:04:39 | 0:04:41 | |
Indeed, the click of the mouse, which is what is doing the | 0:04:41 | 0:04:44 | |
work here, it's what's drawing people into politics. | 0:04:44 | 0:04:47 | |
We shouldn't let go of that, even while we're blaming... | 0:04:47 | 0:04:51 | |
-Don't denigrate the click. -Yes, don't denigrate the click. | 0:04:51 | 0:04:54 | |
Yeah, and it makes a difference. | 0:04:54 | 0:04:56 | |
People feel they are making a difference, but are they? | 0:04:56 | 0:04:58 | |
They are only... | 0:04:58 | 0:04:59 | |
Because of these algorithms, which send you the things | 0:04:59 | 0:05:02 | |
that they think you will like. | 0:05:02 | 0:05:04 | |
So, basically, it is that echo chamber, you are only confronting, | 0:05:04 | 0:05:09 | |
if that's the right word, your own opinions, there's no challenge. | 0:05:09 | 0:05:12 | |
I don't think that's right. | 0:05:12 | 0:05:15 | |
The true echo chamber would be just reading the Daily Mail in the 1930s. | 0:05:15 | 0:05:19 | |
Or...just watching one particular TV station in the US. | 0:05:19 | 0:05:24 | |
That would be a real echo chamber. And I don't think... | 0:05:24 | 0:05:28 | |
Of course, our social media environments are personalised | 0:05:28 | 0:05:33 | |
by us, and personalised by the social media platforms and we need | 0:05:33 | 0:05:36 | |
to work against that, but I do think, still, | 0:05:36 | 0:05:40 | |
that our information environment is broader - there is research showing | 0:05:40 | 0:05:43 | |
that people are exposed to more news sources on social media | 0:05:43 | 0:05:46 | |
than people who don't use social media. | 0:05:46 | 0:05:47 | |
-Only the news sources they agree with. -No. Not necessarily. | 0:05:47 | 0:05:51 | |
People encounter a more heterogeneous range of news sources on social media | 0:05:51 | 0:05:55 | |
than they do in the offline world. | 0:05:55 | 0:05:57 | |
We're very good at creating echo chambers outside social media. | 0:05:57 | 0:06:01 | |
And I don't think we should forget that. | 0:06:01 | 0:06:03 | |
Yeah, echo chambers, there's nothing new about them. Owen, is that right? | 0:06:03 | 0:06:06 | |
Well, I think, firstly, Facebook is a bit different, | 0:06:06 | 0:06:09 | |
but Twitter, I think we overblow its importance because | 0:06:09 | 0:06:12 | |
only a very small section of the population are regularly using | 0:06:12 | 0:06:14 | |
Twitter to talk about the world around them. | 0:06:14 | 0:06:17 | |
And they tend to be younger, and more affluent. | 0:06:17 | 0:06:20 | |
You take the Brexit... the EU referendum, | 0:06:20 | 0:06:23 | |
the people who are most likely to vote to leave were people | 0:06:23 | 0:06:25 | |
over the age of 65, who are the least likely to be on Twitter. | 0:06:25 | 0:06:29 | |
So, I think if we're talking about these grand political developments | 0:06:29 | 0:06:32 | |
sweeping the Western world and putting it down to Twitter, | 0:06:32 | 0:06:34 | |
then I think we are mistaken. | 0:06:34 | 0:06:36 | |
I think the danger with people like me on the left of politics | 0:06:36 | 0:06:39 | |
is we think, well, the mainstream media is biased against us, | 0:06:39 | 0:06:41 | |
most of the press supports the Conservative government, | 0:06:41 | 0:06:44 | |
so, let's retreat to Twitter. I think that's a mistake, | 0:06:44 | 0:06:47 | |
because most people, they come home at the end of the day, | 0:06:47 | 0:06:49 | |
a busy day, they've got jobs, they've got kids, | 0:06:49 | 0:06:52 | |
they might listen to a bit of radio, watch the evening news, | 0:06:52 | 0:06:55 | |
maybe flick through a newspaper, | 0:06:55 | 0:06:57 | |
and that is how most people still get their news coverage. | 0:06:57 | 0:07:00 | |
Do we get a false impression of what is happening in the world, in a sense? | 0:07:00 | 0:07:02 | |
Because Helen is saying with all this information, you're better informed than ever, | 0:07:02 | 0:07:06 | |
but you're saying there's something of a political mirage about it. | 0:07:06 | 0:07:09 | |
I think so, when it comes to Twitter. On Facebook - look, you've got someone here | 0:07:09 | 0:07:13 | |
who did an excellent job for the Conservatives. | 0:07:13 | 0:07:15 | |
You won't often hear me saying that about people on that wing of British politics, | 0:07:15 | 0:07:18 | |
but he did because what they did with Facebook, | 0:07:18 | 0:07:21 | |
it wasn't about joining Facebook groups, which is again just people often agreeing with each other, | 0:07:21 | 0:07:25 | |
what the Conservatives did is they targeted specific demographics, | 0:07:25 | 0:07:29 | |
because millions of people use Facebook. | 0:07:29 | 0:07:31 | |
Again, they're not talking about politics on Facebook. | 0:07:31 | 0:07:33 | |
They're clicking on photos, sharing funny messages with friends. | 0:07:33 | 0:07:36 | |
But they targeted adverts at specific demographics. | 0:07:36 | 0:07:39 | |
That is very effective. | 0:07:39 | 0:07:41 | |
But if people like myself who are campaigners think we can just | 0:07:41 | 0:07:44 | |
send a few tweets going, "Rrrrah, Theresa May!", | 0:07:44 | 0:07:47 | |
yes, a load of people will retweet that, | 0:07:47 | 0:07:49 | |
but then we make the mistake of thinking the rest of the country | 0:07:49 | 0:07:52 | |
are full of the same enthusiasm as me, and that's not true. | 0:07:52 | 0:07:56 | |
So, Twitter has its use, it can mobilise campaigners, | 0:07:56 | 0:07:59 | |
put issues on the agenda, but it isn't a substitute, unfortunately, | 0:07:59 | 0:08:02 | |
for trying to have a strategy to deal with a very hostile press. | 0:08:02 | 0:08:05 | |
The aforementioned Tom Edmonds in a moment, but, Ella, you want to come in? | 0:08:05 | 0:08:08 | |
Yeah, I think there was a point made earlier that social media - | 0:08:08 | 0:08:11 | |
which I think is a great new tool, | 0:08:11 | 0:08:12 | |
just another kind of technological advance in | 0:08:12 | 0:08:14 | |
the way that people communicate with each other, | 0:08:14 | 0:08:17 | |
and that is to be welcomed - makes politics harder to control, | 0:08:17 | 0:08:20 | |
or makes politics more polarised. | 0:08:20 | 0:08:21 | |
I think most of us, myself included, would welcome that | 0:08:21 | 0:08:24 | |
because having more ability to have an argument, | 0:08:24 | 0:08:28 | |
having more scope to have a debate and have more people involved | 0:08:28 | 0:08:31 | |
in the conversation is always a positive. | 0:08:31 | 0:08:33 | |
Certainly, Twitter is a very kind of closed shop in the way that | 0:08:33 | 0:08:36 | |
it's mainly young people and media commentators - | 0:08:36 | 0:08:38 | |
as a journalist myself, I know that. | 0:08:38 | 0:08:39 | |
And Facebook, as Owen said, is a bit different. | 0:08:39 | 0:08:42 | |
But the idea that this should be... | 0:08:42 | 0:08:44 | |
What kind of frightens me more about social media | 0:08:44 | 0:08:47 | |
isn't necessarily who's on it and who's talking about it, | 0:08:47 | 0:08:49 | |
it's the attempts to control it, | 0:08:49 | 0:08:52 | |
the attempts to make Facebook and Twitter put out certain | 0:08:52 | 0:08:56 | |
messages or be worried about what messages they've put out | 0:08:56 | 0:08:59 | |
-and the drive to censor social media. -That is a fascinating area. | 0:08:59 | 0:09:03 | |
I'm going to address that later on | 0:09:03 | 0:09:04 | |
because we have lots to focus on there with the trolling, | 0:09:04 | 0:09:07 | |
with the vile abuse, with the fake news, | 0:09:07 | 0:09:10 | |
with the false news and all that, that's all to come. | 0:09:10 | 0:09:13 | |
But you were mentioned, Tom, in Dispatches, | 0:09:13 | 0:09:15 | |
by Owen Jones, | 0:09:15 | 0:09:17 | |
in a congratulatory tone because of the brilliant strategy | 0:09:17 | 0:09:21 | |
that you came up with for the Tories in the last election online. | 0:09:21 | 0:09:24 | |
What did you do and how did you do it? | 0:09:24 | 0:09:25 | |
Well, broadly, | 0:09:25 | 0:09:26 | |
political parties use online communications for two things. | 0:09:26 | 0:09:29 | |
The first thing is to find and recruit supporters who will go and | 0:09:29 | 0:09:33 | |
take those vital actions to support them, primarily offline, | 0:09:33 | 0:09:36 | |
as other panel members have mentioned. | 0:09:36 | 0:09:38 | |
So, it's the people who are going to go and knock on the doors | 0:09:38 | 0:09:41 | |
because nothing beats that face-to-face communication. | 0:09:41 | 0:09:44 | |
The second thing they do, and we did in the 2015 campaign, | 0:09:44 | 0:09:48 | |
is to find and reach out to those swing voters | 0:09:48 | 0:09:51 | |
in the marginal seats that are going to decide the election, | 0:09:51 | 0:09:53 | |
and speak to them about the issues they care about. | 0:09:53 | 0:09:56 | |
Now, if you manage an election campaign, you're speaking to | 0:09:56 | 0:09:59 | |
anyone from Lib Dem voters in the South West to | 0:09:59 | 0:10:04 | |
Labour voters in the North of England and in Scotland as well. | 0:10:04 | 0:10:08 | |
So, you need to speak to them about issues they care about. | 0:10:08 | 0:10:10 | |
Now, what Facebook allows you to do is find those groups of individuals | 0:10:10 | 0:10:14 | |
and speak to them about the issues... | 0:10:14 | 0:10:16 | |
You'll have to work quite hard to find Labour voters in Scotland. | 0:10:16 | 0:10:18 | |
-Nowadays, yes. -So, that's what you did - | 0:10:18 | 0:10:22 | |
very targeted, very focused, and very successful. | 0:10:22 | 0:10:24 | |
Very targeted, very segmented communications. | 0:10:24 | 0:10:27 | |
Now, digital in itself is just a medium. | 0:10:27 | 0:10:31 | |
So, you need to have the message right | 0:10:31 | 0:10:33 | |
before you go and speak to these people. | 0:10:33 | 0:10:34 | |
There's no point finding individual groups of people and speaking | 0:10:34 | 0:10:37 | |
about things they don't care about. | 0:10:37 | 0:10:39 | |
There's no point finding groups of people, | 0:10:39 | 0:10:41 | |
finding groups of young voters, and then speaking to them about pensions, | 0:10:41 | 0:10:45 | |
or, erm, about issues they don't care about. | 0:10:45 | 0:10:47 | |
So, it's about finding groups of people, having the right message, | 0:10:47 | 0:10:50 | |
-and getting in front of them time and time again. -Mm. And, Helen, | 0:10:50 | 0:10:53 | |
it was remarkably successful in the marginals, wasn't it? | 0:10:53 | 0:10:56 | |
Well, it was. | 0:10:56 | 0:10:58 | |
And, if you... | 0:10:58 | 0:11:00 | |
And I think also very successful for the Leave campaign, Brexit, | 0:11:00 | 0:11:04 | |
and very successful for Trump, | 0:11:04 | 0:11:06 | |
because, after all, if you just reach an echo chamber of people | 0:11:06 | 0:11:09 | |
who agree with you, you're not going to make any difference. | 0:11:09 | 0:11:12 | |
I mean, that's not what parties are looking to do. | 0:11:12 | 0:11:15 | |
They're looking to find people | 0:11:15 | 0:11:16 | |
who are undecided or don't know what they think yet. | 0:11:16 | 0:11:18 | |
And what Labour did at the last general election, as you know better than me, | 0:11:18 | 0:11:22 | |
is just randomly throw out the same message across Facebook. | 0:11:22 | 0:11:25 | |
So, it didn't target specific people | 0:11:25 | 0:11:27 | |
in specific seats they needed to win over. | 0:11:27 | 0:11:29 | |
And, also, they mistook engagement - | 0:11:29 | 0:11:31 | |
that's lots of people liking and commenting - for success. | 0:11:31 | 0:11:35 | |
Actually, what that generally meant was that messages their core supporters already agreed with, | 0:11:35 | 0:11:39 | |
on issues like the NHS, were getting lots of traction, | 0:11:39 | 0:11:42 | |
but they weren't reaching out to the people they needed to win over. | 0:11:42 | 0:11:45 | |
And, obviously, that's what you learned from so successfully. | 0:11:45 | 0:11:48 | |
You said, we're not just going to do things for likes or clicks, | 0:11:48 | 0:11:50 | |
we're going to target the people we need to win over | 0:11:50 | 0:11:52 | |
who aren't in our natural coalition. | 0:11:52 | 0:11:54 | |
Yes. Chi, as a Labour MP, just moving on ever so slightly, | 0:11:54 | 0:11:57 | |
the digital media revolution, | 0:11:57 | 0:12:01 | |
it's a great process and platform of democratisation, isn't it? | 0:12:01 | 0:12:06 | |
Isn't it a great leveller? | 0:12:06 | 0:12:08 | |
Absolutely. And digital media... | 0:12:08 | 0:12:10 | |
Before I came into politics, I was an electrical engineer | 0:12:10 | 0:12:13 | |
for 20 years, helping build out these networks. | 0:12:13 | 0:12:15 | |
And the reason why I went into that is because anything | 0:12:15 | 0:12:18 | |
that gives people | 0:12:18 | 0:12:20 | |
the power to connect with other people, that enables people | 0:12:20 | 0:12:23 | |
to reach out and share, is progress, | 0:12:23 | 0:12:26 | |
and that, I believe, is good for democracy. | 0:12:26 | 0:12:29 | |
But, like any technology, it comes with its disadvantages as well. | 0:12:29 | 0:12:34 | |
And, also, with any technology, | 0:12:34 | 0:12:37 | |
those in power will try to use it to reinforce their power, | 0:12:37 | 0:12:41 | |
and as they have the most means, they're the most likely | 0:12:41 | 0:12:43 | |
to be able to use it most effectively to begin with. | 0:12:43 | 0:12:46 | |
But you don't have to, you know, | 0:12:46 | 0:12:48 | |
have lunch with and schmooze Rupert Murdoch to get your message out there any more. | 0:12:48 | 0:12:51 | |
-Because there are other ways of doing it. -Let me say this. | 0:12:51 | 0:12:54 | |
There's two really big caveats with that, | 0:12:54 | 0:12:56 | |
and the first one, and that's the reason why | 0:12:56 | 0:12:58 | |
I still do walk out on cold winter's days and knock on doors, | 0:12:58 | 0:13:01 | |
is because there are millions of people | 0:13:01 | 0:13:03 | |
in this country who still don't have broadband, | 0:13:03 | 0:13:05 | |
who still don't have any digital access, you know. | 0:13:05 | 0:13:08 | |
The state of broadband in this country is a disgrace. | 0:13:08 | 0:13:12 | |
And then, there are those who may have access, | 0:13:12 | 0:13:15 | |
but they can't afford it, it's too expensive, | 0:13:15 | 0:13:17 | |
or they haven't got the digital skills necessary to use it. | 0:13:17 | 0:13:21 | |
So, there are millions of people left out of this debate, | 0:13:21 | 0:13:25 | |
whom we really need to include, and when we have included them, | 0:13:25 | 0:13:28 | |
then we need to look at the way in which that debate and those | 0:13:28 | 0:13:32 | |
messages are being put out. And what I would say particularly about... | 0:13:32 | 0:13:36 | |
And really, it was a most effective campaign, | 0:13:36 | 0:13:39 | |
the Tory campaign in 2015, spent, I think it was... | 0:13:39 | 0:13:42 | |
-Was it ten times more than Labour on Facebook? -Mm. | 0:13:42 | 0:13:46 | |
But the other point is that those messages came with adverts | 0:13:46 | 0:13:50 | |
down the sides, you know, | 0:13:50 | 0:13:53 | |
and when I'm reaching out to my constituents, | 0:13:53 | 0:13:56 | |
and Facebook enables me to do that and I do use that, | 0:13:56 | 0:13:59 | |
but I'm very aware that if I'm asking them about how they feel | 0:13:59 | 0:14:03 | |
about obesity or something, | 0:14:03 | 0:14:05 | |
there's going to be adverts for, you know, well-known fast food places | 0:14:05 | 0:14:09 | |
down one side and there's going to be messages which are | 0:14:09 | 0:14:12 | |
calculated on the basis of algorithms which are entirely, | 0:14:12 | 0:14:16 | |
you know, invisible, entirely opaque, | 0:14:16 | 0:14:19 | |
which we have no knowledge of. | 0:14:19 | 0:14:21 | |
So, what I would say is, we need to update... | 0:14:21 | 0:14:24 | |
The system feeds you things that it knows you will like, | 0:14:24 | 0:14:28 | |
because of your previous clicking habits, | 0:14:28 | 0:14:31 | |
just to highlight that for people. Carry on. | 0:14:31 | 0:14:34 | |
So, we need to update, you know, the regulations, basically, | 0:14:34 | 0:14:38 | |
to make sure this fantastic opportunity is fairer and is | 0:14:38 | 0:14:44 | |
accessible to people and is used to support and enable people | 0:14:44 | 0:14:47 | |
and not to feed them the same messages. | 0:14:47 | 0:14:50 | |
Ella, you don't look happy. The word "regulations"! | 0:14:50 | 0:14:52 | |
It always gets my back up, yeah. I think that... | 0:14:52 | 0:14:55 | |
I also think we may be giving a bit too much credit to social media, | 0:14:55 | 0:14:57 | |
in that the idea that political messages put out by parties | 0:14:57 | 0:15:01 | |
simply failed because they weren't using the right social media | 0:15:01 | 0:15:05 | |
strategy is perhaps masking a bigger problem: | 0:15:05 | 0:15:07 | |
perhaps they were putting out the wrong messages, | 0:15:07 | 0:15:10 | |
or messages that people weren't interested in. | 0:15:10 | 0:15:12 | |
You brought up the fact of adverts being alongside things on | 0:15:12 | 0:15:15 | |
Facebook, and there is this sense that people are just | 0:15:15 | 0:15:18 | |
mindlessly clicking on something and then | 0:15:18 | 0:15:20 | |
a pop-up for McDonald's comes up and they think, | 0:15:20 | 0:15:22 | |
"Oh, yeah, I want McDonald's," and it's that kind of mindless, | 0:15:22 | 0:15:25 | |
I think that's a very insulting view of the public, | 0:15:25 | 0:15:27 | |
not only how they engage with news, but the media in general... | 0:15:27 | 0:15:29 | |
That's not what I said. | 0:15:29 | 0:15:31 | |
But advertising is advertising for a reason, because it does | 0:15:31 | 0:15:34 | |
actually work, and that's where Facebook gets its revenues from. | 0:15:34 | 0:15:38 | |
I'm not saying people are mindless, | 0:15:38 | 0:15:40 | |
but I am saying that people are influenced by advertising, | 0:15:40 | 0:15:43 | |
otherwise you wouldn't have such big budgets on it. | 0:15:43 | 0:15:45 | |
No, absolutely, but if you're putting out a message on obesity, | 0:15:45 | 0:15:48 | |
and then you're saying that there's a link between that | 0:15:48 | 0:15:50 | |
and a fast food advertisement being alongside it, | 0:15:50 | 0:15:53 | |
you're drawing some kind of relationship between them, | 0:15:53 | 0:15:55 | |
and then when you use the word "regulation," my alarm bells go off, | 0:15:55 | 0:15:58 | |
because I think, what do you want to regulate in that? | 0:15:58 | 0:16:00 | |
-AUDIENCE MEMBER APPLAUDS -Everything, because... | 0:16:00 | 0:16:02 | |
I'm going to come to you in a minute, | 0:16:02 | 0:16:04 | |
because that was quite a round of applause! | 0:16:04 | 0:16:05 | |
Just this problem with the word "regulation" - | 0:16:05 | 0:16:08 | |
nothing is above the law. | 0:16:08 | 0:16:09 | |
Are you saying that the internet should be above the law? | 0:16:09 | 0:16:11 | |
I mean, there are regulations that exist now - you recognise | 0:16:11 | 0:16:14 | |
that we have a legal system. | 0:16:14 | 0:16:16 | |
Yeah, and I would disagree with some of those. | 0:16:16 | 0:16:18 | |
Yeah, but nothing's above the law. So, the law should apply. | 0:16:18 | 0:16:21 | |
Yes, but in terms of free speech on the internet, it should be absolute. | 0:16:21 | 0:16:24 | |
Well, free speech on the internet, that's going to be | 0:16:24 | 0:16:26 | |
a tasty area for us later on. | 0:16:26 | 0:16:28 | |
You burst into a rapturous solo round of applause there! | 0:16:28 | 0:16:33 | |
What... Expand on it. | 0:16:33 | 0:16:36 | |
Um, I just agree with the lady here, I just think that, um, | 0:16:36 | 0:16:39 | |
it's a very easy, and I think correct, you know, link to draw between | 0:16:39 | 0:16:45 | |
advertising on social media and problems that we face in society. | 0:16:45 | 0:16:50 | |
Now, Jamie, you mentioned Trump, I think, earlier on. | 0:16:50 | 0:16:54 | |
Was Trump's campaign a triumph of digital media? | 0:16:54 | 0:17:00 | |
In some ways, yes. | 0:17:00 | 0:17:01 | |
Of course, there are much bigger trends behind that. | 0:17:01 | 0:17:03 | |
You can't just arrive and use Twitter and Facebook and then | 0:17:03 | 0:17:06 | |
suddenly get yourself elected. | 0:17:06 | 0:17:08 | |
You're obviously tapping into various, | 0:17:08 | 0:17:10 | |
much deeper concerns that people have. | 0:17:10 | 0:17:12 | |
But the point I want to make about that is... | 0:17:12 | 0:17:15 | |
when you are completely overwhelmed with different sources of | 0:17:15 | 0:17:19 | |
information, this is the problem with social media - | 0:17:19 | 0:17:23 | |
we were promised in the 1990s, by the digital prophets, | 0:17:23 | 0:17:27 | |
that the politicians of the future, | 0:17:27 | 0:17:29 | |
when we were all connected and we had access to the world's | 0:17:29 | 0:17:31 | |
information, would be more informed, | 0:17:31 | 0:17:33 | |
they'd be smarter and we'd all be kinder and nicer to each other. | 0:17:33 | 0:17:37 | |
One great academic even said it would be the end of nationalism, | 0:17:37 | 0:17:40 | |
because we'd be able to link to everybody and we'd all | 0:17:40 | 0:17:43 | |
understand each other. This is ludicrous. | 0:17:43 | 0:17:45 | |
The reality is, of course, | 0:17:45 | 0:17:47 | |
that when we are overwhelmed with information, | 0:17:47 | 0:17:49 | |
we use heuristics to try to quickly figure out what we think, | 0:17:49 | 0:17:52 | |
we do tend, I think, I disagree slightly with you on this, Helen, | 0:17:52 | 0:17:56 | |
we do tend to find things that we already agree with, | 0:17:56 | 0:17:59 | |
that our friends already agree with, and we trust that more. | 0:17:59 | 0:18:02 | |
We do that anyway. | 0:18:02 | 0:18:03 | |
We do that anyway, but we do that to such | 0:18:03 | 0:18:06 | |
a greater extent now than we ever did before, | 0:18:06 | 0:18:08 | |
and we're less aware of it. | 0:18:08 | 0:18:10 | |
Is that true, Helen? | 0:18:10 | 0:18:11 | |
Well, I think there's a tendency to present it as one thing or | 0:18:11 | 0:18:15 | |
the other. I mean, it's a very grey area, you know, between... | 0:18:15 | 0:18:20 | |
You even hear people talking about kind of hermetically sealed | 0:18:20 | 0:18:23 | |
echo chambers, and that's... | 0:18:23 | 0:18:25 | |
But people say, "Oh, the Twittersphere has gone mad!" | 0:18:25 | 0:18:28 | |
What they mean is, their own Twitter feed has gone mad. | 0:18:28 | 0:18:31 | |
Yeah, but in almost any platform, | 0:18:31 | 0:18:33 | |
you'll be able to see trending information and you're just | 0:18:33 | 0:18:36 | |
one click away from a huge array of opinions. | 0:18:36 | 0:18:39 | |
But the danger is, I think, that social media politics is | 0:18:39 | 0:18:42 | |
in many ways more angry, more aggressive, more emotive, | 0:18:42 | 0:18:47 | |
and the danger of that, I think, | 0:18:47 | 0:18:49 | |
is that it makes compromise more difficult. | 0:18:49 | 0:18:52 | |
The US Congress is more polarised than it has ever been, | 0:18:52 | 0:18:55 | |
at least since the war, and I think that's happening in | 0:18:55 | 0:18:59 | |
a lot of different democracies, and part of the reason is the way | 0:18:59 | 0:19:02 | |
that we do now - most of us, not everyone, but most of us - | 0:19:02 | 0:19:05 | |
engage with politics. | 0:19:05 | 0:19:06 | |
That makes compromise more difficult and I think that is going | 0:19:06 | 0:19:09 | |
-to make politics much more difficult in the future. -But I think... | 0:19:09 | 0:19:12 | |
Look, all our societies have become more polarised, | 0:19:12 | 0:19:15 | |
where people, whether it be Trump in the United States, Brexit here, | 0:19:15 | 0:19:18 | |
across Europe the rise of the populist right, | 0:19:18 | 0:19:21 | |
we've become far more divided as societies. | 0:19:21 | 0:19:24 | |
It's obviously not social media that's caused that - | 0:19:24 | 0:19:26 | |
social media is amplifying aspects of it, though, | 0:19:26 | 0:19:28 | |
because people communicate on social media in a way they would | 0:19:28 | 0:19:31 | |
often never dream of speaking to each other in real life. | 0:19:31 | 0:19:34 | |
It often encourages a pack mentality, | 0:19:34 | 0:19:37 | |
where you do get groups of people... | 0:19:37 | 0:19:39 | |
It's like the mob in the French Revolution sometimes. | 0:19:39 | 0:19:41 | |
But I don't want to get like that, | 0:19:41 | 0:19:43 | |
-because that does sound kind of "Grrr!" -It can be "Grrr!" | 0:19:43 | 0:19:45 | |
It's good that you can democratise information... | 0:19:45 | 0:19:48 | |
But somebody says something, for example, on Twitter... | 0:19:48 | 0:19:51 | |
I was on holiday and I looked online on the phone to see what was | 0:19:51 | 0:19:53 | |
in the newspapers, and there were three stories | 0:19:53 | 0:19:56 | |
about people who had said something | 0:19:56 | 0:19:58 | |
on Twitter and had had the fury poured upon them for doing so | 0:19:58 | 0:20:02 | |
and had to delete the tweets and had to apologise. | 0:20:02 | 0:20:04 | |
So, is that really good for freedom of speech? | 0:20:04 | 0:20:06 | |
No, that's not, because you do end up with the situation now | 0:20:06 | 0:20:09 | |
where basically, people often, if they make a mistake or | 0:20:09 | 0:20:12 | |
they say something which is obviously quite inflammatory | 0:20:12 | 0:20:14 | |
or divisive, then a group on Twitter will basically coordinate and | 0:20:14 | 0:20:18 | |
relish taking that person down, | 0:20:18 | 0:20:20 | |
and they will throw everything at them, | 0:20:20 | 0:20:22 | |
and if you are in the middle of a Twitter storm - | 0:20:22 | 0:20:24 | |
Jon Ronson wrote an excellent book about this, | 0:20:24 | 0:20:26 | |
about people shamed on Twitter - it can be a terrifying experience, | 0:20:26 | 0:20:30 | |
where random strangers all over the world are suddenly screaming at you. | 0:20:30 | 0:20:34 | |
It can be quite threatening and menacing. | 0:20:34 | 0:20:36 | |
You even get a phenomenon of so-called doxxing, | 0:20:36 | 0:20:38 | |
where people will expose your personal information, | 0:20:38 | 0:20:40 | |
go through your back story, everything possible, try to | 0:20:40 | 0:20:43 | |
get you sacked by your employer, so it can be very menacing. | 0:20:43 | 0:20:46 | |
And the mainstream newspapers feed off the back of it, | 0:20:46 | 0:20:48 | |
and have stories about that very thing happening and people | 0:20:48 | 0:20:51 | |
being shamed for saying something that some of the papers | 0:20:51 | 0:20:54 | |
reporting on it actually agree with. | 0:20:54 | 0:20:56 | |
I think that's the other problem, | 0:20:56 | 0:20:57 | |
because this is the other issue, though, | 0:20:57 | 0:20:59 | |
because we look at social media as though this is a new phenomenon, | 0:20:59 | 0:21:01 | |
but there's a very long history of our very politicised media finding | 0:21:01 | 0:21:05 | |
individuals who they disagree with, hunting them down, | 0:21:05 | 0:21:09 | |
having reporters outside their front doors, going through their | 0:21:09 | 0:21:12 | |
private life, going through the private lives of people around them. | 0:21:12 | 0:21:15 | |
So, yes, we should hold social media to account, | 0:21:15 | 0:21:17 | |
but we should also hold to account our mainstream media, | 0:21:17 | 0:21:20 | |
which is responsible often for disseminating false news, | 0:21:20 | 0:21:23 | |
for targeting individuals they don't agree with and humiliating them. | 0:21:23 | 0:21:26 | |
So, yes, let's have that concern, but I think we should talk | 0:21:26 | 0:21:28 | |
about our press as well, which is often out of control itself. | 0:21:28 | 0:21:31 | |
That's something we'll skip onto in just a second. | 0:21:31 | 0:21:33 | |
Is this good for freedom of speech, then, | 0:21:33 | 0:21:36 | |
-if people feel intimidated, Jamie? -No, I mean, there is always... | 0:21:36 | 0:21:39 | |
I think there's a danger with social media, | 0:21:39 | 0:21:41 | |
and I'm in favour of... | 0:21:41 | 0:21:42 | |
I think social media and democracy work in some ways very well | 0:21:42 | 0:21:45 | |
together, and I think the positives do outweigh the negatives. | 0:21:45 | 0:21:49 | |
There is definitely a sense, though, that a lot of people - | 0:21:49 | 0:21:52 | |
and I feel it myself sometimes - are self-censoring, | 0:21:52 | 0:21:54 | |
fearing to put something online that, | 0:21:54 | 0:21:56 | |
"Will I be attacked for that, is it the right thing to do?" | 0:21:56 | 0:21:59 | |
And just avoiding the scandal, avoiding the mob, | 0:21:59 | 0:22:02 | |
avoiding saying something that might cause offence. | 0:22:02 | 0:22:05 | |
-What did you want to say? -Well, I'm definitely not going to say it now! | 0:22:05 | 0:22:08 | |
Although if I did, I might get some re-tweets and YouTube views | 0:22:08 | 0:22:11 | |
and all the rest of it. And that is a danger, | 0:22:11 | 0:22:13 | |
especially given that anything you post stays online forever. | 0:22:13 | 0:22:18 | |
And so, you might be called up for something you did ten years ago, | 0:22:18 | 0:22:22 | |
that is then waved in front of you and you lose your job or you | 0:22:22 | 0:22:25 | |
have people complaining about you, and that has a real chilling | 0:22:25 | 0:22:28 | |
effect, because people will be afraid of ever speaking their mind. | 0:22:28 | 0:22:31 | |
Tom, is this something you take into account when you are | 0:22:31 | 0:22:34 | |
employing people, for example? | 0:22:34 | 0:22:35 | |
Do you have to look at their Facebook history and their... | 0:22:35 | 0:22:38 | |
Absolutely, and we notice increasingly... | 0:22:38 | 0:22:40 | |
For example, if a political party wants to use | 0:22:40 | 0:22:42 | |
a real-life case study for a film or party political broadcast, | 0:22:42 | 0:22:46 | |
a poster campaign, etc, you would look into their background, | 0:22:46 | 0:22:49 | |
and you find increasingly, | 0:22:49 | 0:22:50 | |
when you look at people from a younger generation, | 0:22:50 | 0:22:53 | |
when you look at their Facebook feeds, | 0:22:53 | 0:22:55 | |
when you look at their Twitter feeds, | 0:22:55 | 0:22:57 | |
they are saying stuff that kids aged 16, 17, 18 say, which, | 0:22:57 | 0:23:00 | |
you know, when I was 16, 17, 18, I was saying stupid stuff as well. | 0:23:00 | 0:23:03 | |
Or you'll find that they will have liked something from UniLad | 0:23:03 | 0:23:06 | |
or The LAD Bible or something like that, | 0:23:06 | 0:23:08 | |
which we know that this is just part and parcel of being young and | 0:23:08 | 0:23:12 | |
being a bit stupid and making mistakes, as we've all done, | 0:23:12 | 0:23:15 | |
but taken out of context, suddenly it blows up and becomes... | 0:23:15 | 0:23:18 | |
And now it's written in stone, isn't it? | 0:23:18 | 0:23:20 | |
Well, exactly, and I remember the case... | 0:23:20 | 0:23:22 | |
I expect others on the panel will know the details more, | 0:23:22 | 0:23:25 | |
but a Youth Police and Crime Commissioner, I think she was in | 0:23:25 | 0:23:28 | |
Birmingham, who had sort of put her head above the parapet for a job... | 0:23:28 | 0:23:31 | |
-In Kent, I think, yeah. -Yeah, maybe, yeah. | 0:23:31 | 0:23:34 | |
She had put her head above the parapet for | 0:23:34 | 0:23:36 | |
a job and then the press went over her tweets, | 0:23:36 | 0:23:38 | |
as it is their right to do, | 0:23:38 | 0:23:40 | |
and they found that she had said some stupid stuff, as we've all | 0:23:40 | 0:23:43 | |
done, and suddenly, she became the number one victim in the country... | 0:23:43 | 0:23:46 | |
number one criminal in the country, | 0:23:46 | 0:23:48 | |
they put her face all over the papers and she had to resign. | 0:23:48 | 0:23:51 | |
Now, I'm not arguing the rights or wrongs about what she said, | 0:23:51 | 0:23:53 | |
but increasingly now, the things we say when we are younger | 0:23:53 | 0:23:56 | |
are being held against us because it's a permanent record online. | 0:23:56 | 0:23:59 | |
Dan, you want to come in? | 0:23:59 | 0:24:01 | |
Well, these things are important, but I'd like to get us on to | 0:24:01 | 0:24:04 | |
fake news, which I think is a cancer in the system, I mean, | 0:24:04 | 0:24:09 | |
you know, information is the lifeblood of democracy and | 0:24:09 | 0:24:12 | |
fake news is a terrible thing... | 0:24:12 | 0:24:14 | |
But Owen's saying fake news is nothing new. | 0:24:14 | 0:24:16 | |
Well, we've seen the rise of it, | 0:24:16 | 0:24:17 | |
but we've really seen the rise of it on social media in the USA... | 0:24:17 | 0:24:20 | |
Have we not had fake news in the mainstream media? | 0:24:20 | 0:24:22 | |
You know, the Pope supports Donald Trump, Hillary Clinton is | 0:24:22 | 0:24:25 | |
part of a paedophile ring - I mean, these things are nonsense, | 0:24:25 | 0:24:27 | |
but they are very serious nonsense... | 0:24:27 | 0:24:29 | |
But clearly nonsense, aren't they? | 0:24:29 | 0:24:31 | |
Well, the thing is, I think that what we... | 0:24:31 | 0:24:33 | |
We've got an election coming up in three years' time, | 0:24:33 | 0:24:35 | |
there is some evidence to suggest it's had an impact on | 0:24:35 | 0:24:38 | |
the democratic process in America, I think we need to find ways of | 0:24:38 | 0:24:42 | |
stopping that tide reaching these shores over the next three years. | 0:24:42 | 0:24:46 | |
I personally... | 0:24:46 | 0:24:48 | |
Although I think that social media and the internet are | 0:24:48 | 0:24:50 | |
fundamentally good for democracy, they disseminate information | 0:24:50 | 0:24:53 | |
and encourage participation, fake news needs to be stamped out. | 0:24:53 | 0:24:57 | |
I think the responsibility for that lies with social media companies. | 0:24:57 | 0:25:01 | |
But what's the difference between... | 0:25:01 | 0:25:02 | |
I don't think they are being responsible enough about it | 0:25:02 | 0:25:05 | |
and I think they need to be called out on that. | 0:25:05 | 0:25:06 | |
OK, are they tech platforms or publishers? | 0:25:06 | 0:25:09 | |
Well, the clue's in the title, they are social media platforms, | 0:25:09 | 0:25:11 | |
and I don't know why they are saying they are not media companies. | 0:25:11 | 0:25:14 | |
On this fake news business, | 0:25:14 | 0:25:15 | |
some of these fake news stories are so blatantly ludicrous, | 0:25:15 | 0:25:18 | |
-surely most people must think, "Oh, for goodness' sake!" -But we saw... | 0:25:18 | 0:25:22 | |
And which is more insidious, | 0:25:22 | 0:25:23 | |
that or some of the misreporting and misrepresentation and | 0:25:23 | 0:25:27 | |
distortions that we've seen over the years from the mainstream media? | 0:25:27 | 0:25:30 | |
Which is more dangerous? | 0:25:30 | 0:25:32 | |
I think they are both dangerous. | 0:25:32 | 0:25:33 | |
There is a big, big difference between those two worlds, | 0:25:33 | 0:25:36 | |
though, which is that particularly for television, | 0:25:36 | 0:25:38 | |
it's a much more regulated world, there are rules, there is | 0:25:38 | 0:25:41 | |
a third-party independent regulator who can investigate complaints. | 0:25:41 | 0:25:46 | |
Social media is the Wild West, | 0:25:46 | 0:25:48 | |
it is absolutely the Wild West, and that's a problem. | 0:25:48 | 0:25:52 | |
Look, back in 1984, during the miners' strike, | 0:25:52 | 0:25:55 | |
a very divisive moment in British history, | 0:25:55 | 0:25:57 | |
there was the infamous Battle of Orgreave, | 0:25:57 | 0:25:59 | |
an infamous battle in the whole struggle, | 0:25:59 | 0:26:02 | |
and the BBC, I'm afraid to say, at the time, | 0:26:02 | 0:26:05 | |
they ran the tape in the wrong order, because originally, | 0:26:05 | 0:26:07 | |
as Liberty, the human rights organisation, said, | 0:26:07 | 0:26:10 | |
it was a police riot, the police had to pay compensation, | 0:26:10 | 0:26:12 | |
and they ran it in the wrong order, the BBC - | 0:26:12 | 0:26:15 | |
yet to apologise for it, incidentally - | 0:26:15 | 0:26:17 | |
and made it look like the miners had attacked the police. | 0:26:17 | 0:26:20 | |
The other example, infamously, is Hillsborough, | 0:26:20 | 0:26:23 | |
where the Sun newspaper spread false news, fake news, | 0:26:23 | 0:26:27 | |
about Liverpool fans, with absolutely horrendous consequences. | 0:26:27 | 0:26:31 | |
So, we should take on fake news on the new media, | 0:26:31 | 0:26:34 | |
but also on the old media as well. | 0:26:34 | 0:26:36 | |
But this happens every day on Twitter and digital media, | 0:26:36 | 0:26:38 | |
-fake news... -But I think, often... | 0:26:38 | 0:26:40 | |
Well, look, whether it be about immigrants, Muslims, | 0:26:40 | 0:26:43 | |
unemployed people, benefit fraud, | 0:26:43 | 0:26:45 | |
I think all of these are issues which are either exaggerated | 0:26:45 | 0:26:47 | |
or distorted, often, as well, by the mainstream press, | 0:26:47 | 0:26:50 | |
which play into people's prejudices, where you end up, for example, | 0:26:50 | 0:26:53 | |
with disabled people who have been, as disabled charities have said, | 0:26:53 | 0:26:56 | |
abused on the streets as a consequence. | 0:26:56 | 0:26:58 | |
That's because of the mainstream press as well. | 0:26:58 | 0:27:01 | |
But you can complain to somebody, as Dan says - with the so-called | 0:27:01 | 0:27:04 | |
mainstream media, the MSM, you can complain. | 0:27:04 | 0:27:07 | |
Theoretically, there's redress, isn't there? | 0:27:07 | 0:27:10 | |
In theory, but I'm afraid to say, often the way the media operates | 0:27:10 | 0:27:13 | |
in this country is that they often behave, still, with impunity. | 0:27:13 | 0:27:16 | |
You'll get corrections on page 22 in a tiny little box. | 0:27:16 | 0:27:19 | |
So, yes, obviously, I'm not saying we shouldn't take it seriously, | 0:27:19 | 0:27:23 | |
absolutely we should do that, | 0:27:23 | 0:27:25 | |
false news, which is deliberately fake in order to play into | 0:27:25 | 0:27:28 | |
the prejudices of very angry sections of the population, | 0:27:28 | 0:27:31 | |
but I just don't think there is enough anger about the... | 0:27:31 | 0:27:33 | |
Andrew, I'm coming to you, don't worry. | 0:27:33 | 0:27:35 | |
Because Andrew has a lot to say on this and knows a lot about it. | 0:27:35 | 0:27:38 | |
But Will is a fact-checker. You must be a very busy man at the moment! | 0:27:38 | 0:27:42 | |
-It's been a tough year! -LAUGHTER | 0:27:42 | 0:27:45 | |
We spend our time not only checking what people are saying, | 0:27:45 | 0:27:48 | |
be they politicians, be they journalists, | 0:27:48 | 0:27:50 | |
be they lots of other people in public life, | 0:27:50 | 0:27:52 | |
but also asking them to go and correct the record. | 0:27:52 | 0:27:54 | |
But it's not just mischievous, this stuff, it's malicious, isn't it? | 0:27:54 | 0:27:57 | |
It's deliberate and it's malicious. | 0:27:57 | 0:27:58 | |
It's not for me to judge that, | 0:27:58 | 0:27:59 | |
it's for everyone in this room to judge that. | 0:27:59 | 0:28:01 | |
If we are concerned about the effect of information on democracy, | 0:28:01 | 0:28:04 | |
we've got to remember that we are democracy, | 0:28:04 | 0:28:07 | |
it is all of us as voters who decide whether digital media is | 0:28:07 | 0:28:10 | |
going to be good for us or bad for us. | 0:28:10 | 0:28:12 | |
But I think the thing we've got to look at is, | 0:28:12 | 0:28:15 | |
who has power and are they using it fairly? | 0:28:15 | 0:28:17 | |
Are people who are in powerful positions using information | 0:28:17 | 0:28:21 | |
to mislead us, and if so, what are we going to do about it? | 0:28:21 | 0:28:24 | |
And Owen is right, there are lots of examples of mainstream media | 0:28:24 | 0:28:28 | |
outlets publishing things that are clearly not true. | 0:28:28 | 0:28:31 | |
-In our experience... -And getting clobbered for it, eventually. -Sorry? | 0:28:31 | 0:28:34 | |
-And getting clobbered for it, eventually. -Well, clobbered is... | 0:28:34 | 0:28:37 | |
This stuff is round the world in 80 clicks, isn't it? | 0:28:37 | 0:28:40 | |
Clobbered is putting it far too strongly. | 0:28:40 | 0:28:42 | |
Very few mainstream media outlets have any kind of system where | 0:28:42 | 0:28:45 | |
people get "clobbered" for getting things wrong. | 0:28:45 | 0:28:48 | |
That's simply not true in television. | 0:28:48 | 0:28:50 | |
There is a very clear set of rules, | 0:28:50 | 0:28:52 | |
there's a third-party independent regulator who polices that, | 0:28:52 | 0:28:55 | |
you can complain, things are investigated, and ultimately, | 0:28:55 | 0:28:58 | |
if you are a repeat offender, you have your licence revoked. | 0:28:58 | 0:29:01 | |
Well, A, when did that last happen in a major media programme? | 0:29:01 | 0:29:04 | |
Well, but I think that some of the... | 0:29:04 | 0:29:06 | |
Secondly, television is, as you said... | 0:29:06 | 0:29:07 | |
Some of the news channels can really sail close to the wind, | 0:29:07 | 0:29:10 | |
and they get called on this stuff by the regulator, | 0:29:10 | 0:29:13 | |
and they have to amend their behaviour, and they do. | 0:29:13 | 0:29:16 | |
Television, as you said, is the most regulated, | 0:29:16 | 0:29:18 | |
and it's absolutely right that they have the strongest kind of | 0:29:18 | 0:29:21 | |
set of rules around accuracy and the greatest caution. | 0:29:21 | 0:29:23 | |
It is nonetheless the case that at least one of them | 0:29:23 | 0:29:25 | |
has the informal slogan of "never wrong for long" - we'll get | 0:29:25 | 0:29:28 | |
it on air as quickly as possible and we'll correct it if we need to. | 0:29:28 | 0:29:31 | |
Television and radio have very strong rules. | 0:29:31 | 0:29:34 | |
Newspapers, which are highly partisan, don't have the same rules. | 0:29:34 | 0:29:37 | |
They have a whole range of things. | 0:29:37 | 0:29:39 | |
Some of them will correct things very quickly and very fairly when | 0:29:39 | 0:29:42 | |
we ask them to, some of them will simply ignore a corrections request. | 0:29:42 | 0:29:45 | |
So there is a whole range of stuff going on. | 0:29:45 | 0:29:48 | |
-How long does it take to check a fact? -Usually, about a day. -A day? | 0:29:48 | 0:29:52 | |
To turn around a really solid fact check where you've looked at | 0:29:52 | 0:29:55 | |
all the sources, you've talked to experts where you need to... | 0:29:55 | 0:29:57 | |
Well, as I say, it's been around the world about twice | 0:29:57 | 0:30:00 | |
on digital media by that time. | 0:30:00 | 0:30:01 | |
Exactly, so the thing that really matters isn't actually | 0:30:01 | 0:30:05 | |
whether media outlets get things wrong once, it's how often | 0:30:05 | 0:30:08 | |
things get repeated, because that's what's really powerful. | 0:30:08 | 0:30:11 | |
So if they do correct things when we point it out, | 0:30:11 | 0:30:14 | |
or when someone else points it out, that can do a lot of good, | 0:30:14 | 0:30:16 | |
but not all of them do it consistently. | 0:30:16 | 0:30:18 | |
Let's talk about social media for a minute, | 0:30:18 | 0:30:21 | |
because there are kind of two things going on with social media. | 0:30:21 | 0:30:24 | |
One is that all of us can share whatever we want | 0:30:24 | 0:30:27 | |
with whoever we want, and that's amazing. | 0:30:27 | 0:30:29 | |
That's democracy writ large. | 0:30:29 | 0:30:30 | |
And if you don't like the consequences, | 0:30:30 | 0:30:32 | |
then change your voters, i.e. it's up to all of us | 0:30:32 | 0:30:35 | |
to do a better job as voters. | 0:30:35 | 0:30:37 | |
The other bit is, people in positions of power, | 0:30:37 | 0:30:40 | |
be they the social media companies or be they the advertisers | 0:30:40 | 0:30:44 | |
on social media, using that platform, | 0:30:44 | 0:30:46 | |
either fairly to the users or unfairly. | 0:30:46 | 0:30:49 | |
Now, the thing we don't know, | 0:30:49 | 0:30:51 | |
when the political parties spend hundreds of thousands, millions, | 0:30:51 | 0:30:55 | |
I don't know, advertising, is what they are saying and to whom. | 0:30:55 | 0:30:59 | |
It has never, ever been possible before to reach millions | 0:30:59 | 0:31:03 | |
of people in the country and have the other millions | 0:31:03 | 0:31:06 | |
of people in the country not know. | 0:31:06 | 0:31:08 | |
You couldn't take out a newspaper advert | 0:31:08 | 0:31:09 | |
and have the rest of us not see it. | 0:31:09 | 0:31:11 | |
So now we have a whole set of political advertising | 0:31:11 | 0:31:14 | |
that is completely unscrutinised, | 0:31:14 | 0:31:16 | |
and that's the thing we should be a bit worried about. | 0:31:16 | 0:31:18 | |
Well, Dan's agreeing with you. | 0:31:18 | 0:31:19 | |
Social media is invisible for the rest of us. | 0:31:19 | 0:31:22 | |
I think the fact that it's unscrutinised is basically | 0:31:22 | 0:31:24 | |
-Dan's point. Dr Andrew Calcutt... -Can I just make a point about...? | 0:31:24 | 0:31:28 | |
-You can in a minute. -Just about trust. -I'll come back to trust. | 0:31:28 | 0:31:31 | |
-Because the rules... -TRUST me, I will come back to it. | 0:31:31 | 0:31:34 | |
Or I'll apologise. | 0:31:34 | 0:31:37 | |
But Andrew hasn't been in it, and I know, Laura, | 0:31:37 | 0:31:39 | |
I've not come to you yet. | 0:31:39 | 0:31:41 | |
Everyone's had a say and will have far more of a say, but, Andrew, | 0:31:41 | 0:31:45 | |
journalism, humanities and creative industries, that's your thing, | 0:31:45 | 0:31:48 | |
as Dr Andrew Calcutt. | 0:31:48 | 0:31:51 | |
We're on false news, fake news now, | 0:31:51 | 0:31:54 | |
but in this cacophony of calumny and falsehood, is the truth dying? | 0:31:54 | 0:32:01 | |
Well, I think the idea that democracy is threatened | 0:32:01 | 0:32:06 | |
and truth is destroyed by a bunch of spotty youths from Macedonia... | 0:32:06 | 0:32:11 | |
NICKY LAUGHS | 0:32:11 | 0:32:13 | |
..it's just ludicrous, and equally risible is the CIA, | 0:32:13 | 0:32:16 | |
of all people, complaining about state agencies interfering | 0:32:16 | 0:32:20 | |
in the business of other sovereign nations, you couldn't make it up. | 0:32:20 | 0:32:25 | |
APPLAUSE | 0:32:25 | 0:32:26 | |
I think though that... | 0:32:26 | 0:32:28 | |
I think some of my fellow panellists are trying to do the jigsaw | 0:32:28 | 0:32:32 | |
with just the first couple of pieces that come to hand. | 0:32:32 | 0:32:36 | |
And I think, really, if we all take a step back | 0:32:36 | 0:32:40 | |
and look at that question, it's kind of too early to tell... | 0:32:40 | 0:32:43 | |
whether social media and digital media are good for democracy, | 0:32:43 | 0:32:48 | |
cos we're not in one, we're not in a democracy, | 0:32:48 | 0:32:51 | |
we're in an era of something like zombie politics. | 0:32:51 | 0:32:54 | |
So, if I may, it's a little bit like, was it Deng Xiaoping? | 0:32:54 | 0:32:56 | |
When asked about the French Revolution. | 0:32:56 | 0:32:58 | |
Yeah, it's too early to tell. | 0:32:58 | 0:32:59 | |
And when asked whether it worked, he said, "It's too early to tell." | 0:32:59 | 0:33:02 | |
I don't know if it was Deng, but it's a little bit like that. | 0:33:02 | 0:33:05 | |
We're in an era of zombie politics where the majority of | 0:33:05 | 0:33:09 | |
the population is notable by its absence. | 0:33:09 | 0:33:13 | |
And professional politicians have been like | 0:33:13 | 0:33:17 | |
a kind of medicine show, | 0:33:17 | 0:33:19 | |
spooning out what the majority population is supposed | 0:33:19 | 0:33:24 | |
to accept, and you're meant to just take your medicine, | 0:33:24 | 0:33:27 | |
and if you don't take it, you're decried as being a deplorable. | 0:33:27 | 0:33:32 | |
And guess what? | 0:33:32 | 0:33:35 | |
In the last year, major sections of that population have said, | 0:33:35 | 0:33:38 | |
"We're not going to be the studio audience any more, | 0:33:38 | 0:33:41 | |
"we're going to walk out on that scenario." | 0:33:41 | 0:33:44 | |
And so, with all due respect, a lot of people sitting around | 0:33:44 | 0:33:48 | |
the inner circle are indeed IN the inner circle. | 0:33:48 | 0:33:50 | |
-They're in the bubble that is... -We are the problem. | 0:33:50 | 0:33:54 | |
..that is rather far-removed from where most people are, | 0:33:54 | 0:33:58 | |
and most people are not in any substantial sense | 0:33:58 | 0:34:02 | |
-engaged in politics. -But if there's a lack of trust... | 0:34:02 | 0:34:05 | |
The idea that these kinds of interpersonal details amount | 0:34:05 | 0:34:09 | |
to political polarisation is just ludicrous. | 0:34:09 | 0:34:12 | |
Put your hand up in the audience if you want to say something | 0:34:12 | 0:34:14 | |
on this, and I'll get round to you. | 0:34:14 | 0:34:16 | |
But if you say that people are disillusioned with politics | 0:34:16 | 0:34:19 | |
because of a lack of trust, we are now playing around | 0:34:19 | 0:34:22 | |
with this platform which has stuff on it that we cannot trust. | 0:34:22 | 0:34:27 | |
In there is a paradox somewhere, Dan, isn't there? | 0:34:27 | 0:34:29 | |
Well, yeah, I'm particularly interested in this question | 0:34:29 | 0:34:32 | |
of rules and regulations, | 0:34:32 | 0:34:34 | |
because the most regulated part of the media is television. | 0:34:34 | 0:34:40 | |
It is, and we've got this diverse public service system in the UK, | 0:34:40 | 0:34:44 | |
and that is policed by Ofcom, and lo and behold, people's biggest source | 0:34:44 | 0:34:49 | |
of news in the UK is television, and it's also the most trusted. | 0:34:49 | 0:34:54 | |
Social media at the moment | 0:34:54 | 0:34:57 | |
is much, much lower, both in terms of the number of people | 0:34:57 | 0:35:00 | |
that do access it for news, but also for levels of trust. | 0:35:00 | 0:35:03 | |
-The problem is... -WILL: -The same is true of newspapers, by the way. | 0:35:03 | 0:35:06 | |
The problem is you've got this issue of fake news, | 0:35:06 | 0:35:10 | |
and we have to stamp it out before it gets bigger. | 0:35:10 | 0:35:12 | |
-Too late. -It's not. | 0:35:12 | 0:35:14 | |
It's not too late. | 0:35:14 | 0:35:15 | |
But it's not as if there's fake news and non-fake news | 0:35:15 | 0:35:18 | |
and it's clear which is which. | 0:35:18 | 0:35:20 | |
It's such a big grey area between one and the other. | 0:35:20 | 0:35:24 | |
-There may be, but... -Stamping it out is censorship. | 0:35:24 | 0:35:28 | |
-DR ANDREW: -Aren't our lives micromanaged enough already? | 0:35:28 | 0:35:31 | |
Do you have to regulate everything? | 0:35:31 | 0:35:33 | |
We have to bring up the elephant in the room here, | 0:35:33 | 0:35:35 | |
which is that what no-one wants to admit is that | 0:35:35 | 0:35:38 | |
the fear of fake news is the fear of who is reading the fake news | 0:35:38 | 0:35:41 | |
and what they will do with that information. And, essentially, | 0:35:41 | 0:35:44 | |
when you say there is a danger of people being negatively influenced | 0:35:44 | 0:35:47 | |
by fake news, | 0:35:47 | 0:35:48 | |
you're making a differentiation between you, | 0:35:48 | 0:35:50 | |
who can understand that the news is fake, | 0:35:50 | 0:35:52 | |
and those thickos out there who can't, and that's a deeply... | 0:35:52 | 0:35:55 | |
-DAN: -No, I don't... -Hang on. -Don't think that at all. | 0:35:55 | 0:35:58 | |
If you think fake news should be stamped out, who do you want | 0:35:58 | 0:36:01 | |
to stamp it out and why should they be the arbiters of truth? | 0:36:01 | 0:36:03 | |
-And how do you stamp it out? -Exactly. | 0:36:03 | 0:36:05 | |
Well, I think that's complicated. | 0:36:05 | 0:36:07 | |
That's the easy answer to a really uncomfortable question. | 0:36:07 | 0:36:10 | |
The horse has bolted. | 0:36:10 | 0:36:12 | |
But you have different systems within different media, | 0:36:12 | 0:36:15 | |
and television is heavily regulated, | 0:36:15 | 0:36:17 | |
it didn't stop television holding power to account, | 0:36:17 | 0:36:20 | |
generally being extremely accurate, being diverse and being popular. | 0:36:20 | 0:36:24 | |
-It doesn't stop that. -But can I ask you why you're worried about...? | 0:36:24 | 0:36:26 | |
Why are you so worried that people...? | 0:36:26 | 0:36:28 | |
And I don't know who you're talking about when you're | 0:36:28 | 0:36:30 | |
talking about people who don't understand that the wild story | 0:36:30 | 0:36:33 | |
about Ed Miliband fixing relationships | 0:36:33 | 0:36:35 | |
with Hillary Clinton is fake. | 0:36:35 | 0:36:37 | |
Well, BuzzFeed's research showed that there was | 0:36:37 | 0:36:41 | |
more consumption of fake news during the US election than true news. | 0:36:41 | 0:36:46 | |
-Entertainment. It's people having a laugh. -That's a problem. | 0:36:46 | 0:36:50 | |
Can I ask a question? Just a hypothetical. | 0:36:50 | 0:36:52 | |
If something completely fake about you went all over the world, | 0:36:52 | 0:36:56 | |
like someone claimed you were the head of a drugs cartel, | 0:36:56 | 0:36:58 | |
something obviously ridiculous, | 0:36:58 | 0:37:00 | |
that went all around the world, they don't know you from Adam, | 0:37:00 | 0:37:03 | |
and your picture is there, calling you the head of a drug cartel, | 0:37:03 | 0:37:06 | |
you'd be like, "No correction needed, free speech, | 0:37:06 | 0:37:08 | |
"they can say what they want"? | 0:37:08 | 0:37:09 | |
Would you be comfortable with that, genuinely? | 0:37:09 | 0:37:11 | |
I suppose I would publicly say that that isn't the case. | 0:37:11 | 0:37:14 | |
Wouldn't that put you at risk? | 0:37:14 | 0:37:16 | |
The cost of having a free society is people being able to say whatever | 0:37:16 | 0:37:19 | |
they want, and sometimes that being untrue, sometimes that being nasty. | 0:37:19 | 0:37:22 | |
-But you wouldn't want it stamped out? -That's the cost of a free society. | 0:37:22 | 0:37:24 | |
I'm going to take that thought, | 0:37:24 | 0:37:26 | |
people being able to say whatever they want, | 0:37:26 | 0:37:28 | |
and we're going to discuss it in the context of digital media, | 0:37:28 | 0:37:30 | |
but first of all, as promised, what would you like to say? | 0:37:30 | 0:37:33 | |
Thank you. | 0:37:33 | 0:37:34 | |
What I wanted to say was that this particular issue about news, | 0:37:34 | 0:37:39 | |
where they are coming from, | 0:37:39 | 0:37:40 | |
it's not the responsibility of one side, it's the responsibility | 0:37:40 | 0:37:43 | |
of the receivers, us, the general public, as well, | 0:37:43 | 0:37:46 | |
to find out where it is coming from; if we can't get right to the bottom | 0:37:46 | 0:37:50 | |
of it, at least do a little bit of research | 0:37:50 | 0:37:52 | |
or investigation of our own. | 0:37:52 | 0:37:54 | |
Nowadays, it's not only TV, as somebody said, | 0:37:54 | 0:37:57 | |
it's mostly free newspapers | 0:37:57 | 0:37:58 | |
on the train, that's the time we have to just listen and digest. | 0:37:58 | 0:38:02 | |
Now, just, even if we take simple rules, golden rules... | 0:38:02 | 0:38:06 | |
This particular rule I'm quoting comes from the Muslim community. | 0:38:06 | 0:38:10 | |
It's a saying of Prophet Muhammad, a basic saying, a Hadith, | 0:38:10 | 0:38:13 | |
that it is enough for a person to be a liar | 0:38:13 | 0:38:15 | |
that he passes on whatever he hears | 0:38:15 | 0:38:18 | |
without making any effort to check. | 0:38:18 | 0:38:21 | |
So, if everybody takes just this simple, basic rule, | 0:38:21 | 0:38:24 | |
we'll avoid so much spreading around of | 0:38:24 | 0:38:27 | |
news that we heard on WhatsApp or Facebook... | 0:38:27 | 0:38:30 | |
-Rumours, gossip. -That's it, yeah. | 0:38:30 | 0:38:32 | |
The Hadith are very strong against rumours and backbiting. | 0:38:32 | 0:38:35 | |
So, if everybody sticks to that rule, so much will be relieved. | 0:38:35 | 0:38:39 | |
-That's actually a really smart point. -It's the best point so far. | 0:38:39 | 0:38:43 | |
-LAUGHTER -As a fact-checker, | 0:38:43 | 0:38:45 | |
it's lovely to think that we can use fact-checking to empower everybody. | 0:38:45 | 0:38:49 | |
The thing you've got to ask, | 0:38:49 | 0:38:51 | |
the flipside of the question of who gets to | 0:38:51 | 0:38:55 | |
split the difference between all those stupid people | 0:38:55 | 0:38:57 | |
who believe the fake news and the rest of us, is this: none of us | 0:38:57 | 0:39:01 | |
can just skim through a set of headlines | 0:39:01 | 0:39:03 | |
and know which is true and which is not. | 0:39:03 | 0:39:04 | |
It's immensely difficult to spot people who are | 0:39:04 | 0:39:07 | |
trying to mislead you and people who aren't. | 0:39:07 | 0:39:10 | |
Without the thrills and spills and the excitement of, | 0:39:10 | 0:39:13 | |
"Ooh, what if that's true? That's great." | 0:39:13 | 0:39:16 | |
-Exactly. -It's a completely different set of personal standards we apply | 0:39:16 | 0:39:19 | |
to that than we do to watching TV news or radio news, but you can... | 0:39:19 | 0:39:22 | |
Never mind splitting the difference, | 0:39:22 | 0:39:24 | |
here's somebody who made a difference. | 0:39:24 | 0:39:27 | |
Laura, your campaign, the tampon tax, | 0:39:27 | 0:39:30 | |
-that was through social media, wasn't it? -Yep. | 0:39:30 | 0:39:33 | |
I think one of the main really good points about social media is | 0:39:33 | 0:39:36 | |
that it really gives power to under-represented communities | 0:39:36 | 0:39:39 | |
of people, like women, for example. | 0:39:39 | 0:39:41 | |
So, tampon tax has existed since 1973 and people have campaigned | 0:39:41 | 0:39:45 | |
for generations to end it, but the reason why this campaign | 0:39:45 | 0:39:49 | |
finally won - at least I think, anyway - | 0:39:49 | 0:39:51 | |
is because we've had this platform of social media | 0:39:51 | 0:39:54 | |
that we've never had before, and upon that platform we can finally, | 0:39:54 | 0:39:57 | |
really be heard, and politicians can't ignore us any more. | 0:39:57 | 0:40:00 | |
So, I think that's one really good and important point | 0:40:00 | 0:40:04 | |
of social media, that individuals can really use it, | 0:40:04 | 0:40:06 | |
rather than political parties or even news outlets. | 0:40:06 | 0:40:09 | |
It gives the individual back power, | 0:40:09 | 0:40:11 | |
-and then communities that are under-represented. -Yeah. | 0:40:11 | 0:40:14 | |
There's loads of examples, things I'm interested in, | 0:40:14 | 0:40:17 | |
the whole pressure on SeaWorld in Florida... | 0:40:17 | 0:40:19 | |
The Black Lives Matter campaign. | 0:40:19 | 0:40:21 | |
The Black Lives Matter campaign, your campaign, | 0:40:21 | 0:40:24 | |
but it's not just a click, is it? | 0:40:24 | 0:40:25 | |
Cos it's what you were saying earlier on, Helen, | 0:40:25 | 0:40:27 | |
it's a million clicks. | 0:40:27 | 0:40:29 | |
Exactly, the clicks scale up to something really significant. | 0:40:29 | 0:40:33 | |
-Also, people don't just click and do nothing else... -They're engaged. | 0:40:33 | 0:40:37 | |
Exactly, people don't just scroll through, like Change.org, | 0:40:37 | 0:40:39 | |
for example, and click on every petition. | 0:40:39 | 0:40:42 | |
I get people e-mailing me every day with little spelling errors | 0:40:42 | 0:40:46 | |
I've made that I've not even seen. | 0:40:46 | 0:40:47 | |
So they definitely do read through your campaigns. | 0:40:47 | 0:40:50 | |
We also have lots of communities throughout the world really | 0:40:50 | 0:40:53 | |
that have bound together and made a difference within | 0:40:53 | 0:40:56 | |
their community, whether that be through tampon tax or | 0:40:56 | 0:40:59 | |
the new campaign about homeless people | 0:40:59 | 0:41:00 | |
and improving sanitary provision to homeless women. | 0:41:00 | 0:41:03 | |
Lots of people do stuff physically that's inspired by... | 0:41:03 | 0:41:06 | |
Did you get any abuse and vile trolling? | 0:41:06 | 0:41:09 | |
-As a result of your campaign. -I did. | 0:41:09 | 0:41:12 | |
Especially as a woman, or a feminist campaigner online, | 0:41:12 | 0:41:15 | |
you're just open to that, which is really sad, | 0:41:15 | 0:41:18 | |
but I found that online trolling is actually | 0:41:18 | 0:41:20 | |
a really important form of sexism that I've encountered anyway, | 0:41:20 | 0:41:23 | |
in that it's really obviously documented and it's evidence | 0:41:23 | 0:41:26 | |
-to show that sexism really does exist. -Shocking. | 0:41:26 | 0:41:29 | |
Yeah, it shocks people, but women face sexism on an everyday basis. | 0:41:29 | 0:41:33 | |
-Not like that. -Lots of people basically say, | 0:41:33 | 0:41:35 | |
"Oh, sexism doesn't exist, | 0:41:35 | 0:41:37 | |
"we have never experienced it as men, so therefore it doesn't..." | 0:41:37 | 0:41:40 | |
-But some of that stuff is psychotic, it's not just... -Yeah, but I mean, | 0:41:40 | 0:41:43 | |
it's just evidence of feelings that are evident throughout society, | 0:41:43 | 0:41:48 | |
and if that comes out online that's, in a way, a good thing, | 0:41:48 | 0:41:50 | |
cos it shows that sexism really does exist and we need to tackle it. | 0:41:50 | 0:41:54 | |
It's lifted up the stone. | 0:41:54 | 0:41:55 | |
What about the fact that so many female MPs, from all sides | 0:41:55 | 0:41:59 | |
of the political divide, have had some dreadful, dreadful stuff? | 0:41:59 | 0:42:05 | |
-Absolutely vile, and... -What's that about? -I think it's about... | 0:42:05 | 0:42:10 | |
It's the power of the network you're talking about, with the tampon tax, | 0:42:10 | 0:42:14 | |
and networks before were old boys' networks, | 0:42:14 | 0:42:16 | |
they were for the powerful, | 0:42:16 | 0:42:18 | |
whereas now we can have networks for everyone to come together. | 0:42:18 | 0:42:21 | |
But as well as that, what the internet does | 0:42:21 | 0:42:25 | |
really well is amplify what is out there in the debate. | 0:42:25 | 0:42:31 | |
It attracts those who are the most vocal and the most aggressive | 0:42:31 | 0:42:35 | |
to take part in a debate, and what I really dislike is this idea | 0:42:35 | 0:42:39 | |
that if you respond to that you are feeding the trolls. | 0:42:39 | 0:42:42 | |
I think that women need to put their voices out there and we need | 0:42:42 | 0:42:48 | |
to take back that space, we need to take back the internet space, | 0:42:48 | 0:42:53 | |
and what I would say, again, | 0:42:53 | 0:42:55 | |
that's why we need a more active government in this space. | 0:42:55 | 0:43:00 | |
It's too early, I agree, to tell what the impact of digital media | 0:43:00 | 0:43:05 | |
on democracy will be, but we need to decide what we want it to be. | 0:43:05 | 0:43:08 | |
More active government, what do you mean? | 0:43:08 | 0:43:10 | |
We need a government that's looking at the issues, | 0:43:10 | 0:43:15 | |
protecting free speech, but also | 0:43:15 | 0:43:17 | |
protecting against hate speech online, for example. | 0:43:17 | 0:43:19 | |
But if you were to meet one of these people - and Owen often | 0:43:19 | 0:43:22 | |
did - they're not very impressive human beings. | 0:43:22 | 0:43:26 | |
They're pretty sad individuals, but they become | 0:43:26 | 0:43:28 | |
these keyboard warriors, and as Owen said earlier on, | 0:43:28 | 0:43:31 | |
they say things to you that they would never say to your face. | 0:43:31 | 0:43:35 | |
That's not representative, it's just brought out a sense of... | 0:43:35 | 0:43:38 | |
It's empowered some rather pathetic people. | 0:43:38 | 0:43:41 | |
Well, it's empowered, if you like, the wrong people, but there | 0:43:41 | 0:43:44 | |
need to also be sanctions, and this is the point about... | 0:43:44 | 0:43:49 | |
It's empowered the wrong people. | 0:43:49 | 0:43:51 | |
It's empowered the wrong people when they're attacking and trolling. | 0:43:51 | 0:43:55 | |
And there need to be sanctions, and I'm very interested in | 0:43:55 | 0:43:57 | |
this idea that you don't believe there should be sanctions | 0:43:57 | 0:44:01 | |
online for the sort of behaviour that, in the street, | 0:44:01 | 0:44:05 | |
would bring sanctions - if somebody attacked me like that | 0:44:05 | 0:44:08 | |
in the street, there would be sanctions. | 0:44:08 | 0:44:09 | |
What I'm saying is that regulation that applies in real life | 0:44:09 | 0:44:13 | |
also applies online. | 0:44:13 | 0:44:14 | |
I don't think there should be any sanctions on speech | 0:44:14 | 0:44:17 | |
-in real life or anywhere else. -It's empowering the wrong people. | 0:44:17 | 0:44:19 | |
The phrase "empowering the wrong people" | 0:44:19 | 0:44:21 | |
is so terrifying to hear from an elected MP, | 0:44:21 | 0:44:23 | |
that is just shocking to me, | 0:44:23 | 0:44:25 | |
because what this is really masking - and the question about | 0:44:25 | 0:44:28 | |
women is so interesting, because, so often, censorship online | 0:44:28 | 0:44:32 | |
is couched in terms of protecting women, protecting minorities, | 0:44:32 | 0:44:34 | |
and what that is basically saying is that women cannot handle | 0:44:34 | 0:44:38 | |
sometimes really quite awful abuse online as well as men can, and | 0:44:38 | 0:44:42 | |
to me that's deeply... | 0:44:42 | 0:44:43 | |
Do you think it stops women putting their head above the parapet, | 0:44:43 | 0:44:46 | |
wanting to stand for parliament for example? | 0:44:46 | 0:44:49 | |
Hang on, if you're a public figure, especially if you're an MP, | 0:44:49 | 0:44:52 | |
you should be able to accept that you're talking to the public | 0:44:52 | 0:44:55 | |
and the public should be allowed to talk to you. | 0:44:55 | 0:44:57 | |
Sometimes that's particularly nasty, actually, | 0:44:57 | 0:44:59 | |
the majority of so-called sexist abuse that MPs get - | 0:44:59 | 0:45:02 | |
and I'm thinking about certain MPs who are very vocal about it - | 0:45:02 | 0:45:05 | |
Jess Phillips, Yvette Cooper - is actually just criticism of them. | 0:45:05 | 0:45:08 | |
But hang on, if someone came to your constituency surgery and said, | 0:45:08 | 0:45:11 | |
"I think you should be raped", | 0:45:11 | 0:45:13 | |
you wouldn't say, "Hang on a minute, I'll be with you in just a second." | 0:45:13 | 0:45:16 | |
-Is that what you're saying? -No, no, that's... | 0:45:16 | 0:45:19 | |
It's fine that they're empowered, is it? | 0:45:19 | 0:45:21 | |
Death threats and things like that are clearly illegal | 0:45:21 | 0:45:24 | |
and should be dealt with through law, but what we're really talking | 0:45:24 | 0:45:27 | |
-about now is a broader question of what is sexist... -No, we're not! | 0:45:27 | 0:45:30 | |
..and actually what is talked about is saying stuff like... | 0:45:30 | 0:45:33 | |
And actually I've written on this a lot - what is sexist trolling? | 0:45:33 | 0:45:37 | |
And it's often about calling women fat | 0:45:37 | 0:45:39 | |
or saying horrible things about women. | 0:45:39 | 0:45:41 | |
And that is seen - especially with the tampon tax - that is seen... | 0:45:41 | 0:45:45 | |
Jamie, where are you on this? | 0:45:45 | 0:45:47 | |
Just to say, just finally, | 0:45:47 | 0:45:49 | |
just to say that women are more subject to online abuse | 0:45:49 | 0:45:54 | |
and that there should be sanctions to protect women is more | 0:45:54 | 0:45:56 | |
deeply sexist than anything an online troll can... | 0:45:56 | 0:45:58 | |
How do we stop this? Should we stop it? | 0:45:58 | 0:46:00 | |
No, we can't stop it, and we shouldn't stop all types of trolling, | 0:46:00 | 0:46:03 | |
and we shouldn't stop all types of offensive language or behaviour, | 0:46:03 | 0:46:07 | |
because when we live in a society where people are slightly worried, | 0:46:07 | 0:46:11 | |
self-censoring, not being able to speak their mind, actually, | 0:46:11 | 0:46:15 | |
sometimes internet trolls play quite a valuable role. | 0:46:15 | 0:46:18 | |
They push at the boundaries of offensiveness and make | 0:46:18 | 0:46:21 | |
other people feel like they might be able to speak their mind, too. | 0:46:21 | 0:46:23 | |
Yes, of course there are limits to that, | 0:46:23 | 0:46:26 | |
where it gets particularly targeted or where it's | 0:46:26 | 0:46:28 | |
a threat against an individual, but there's always a danger, | 0:46:28 | 0:46:31 | |
I think, that with internet trolls, we just say, | 0:46:31 | 0:46:33 | |
"They're terrible, we've got to silence them all." | 0:46:33 | 0:46:35 | |
That would actually be quite detrimental | 0:46:35 | 0:46:37 | |
for the health of our democracy. | 0:46:37 | 0:46:39 | |
-Owen, do they have their uses? -What, trolls? -Yeah. | 0:46:39 | 0:46:41 | |
Well, we conflate, I think, trolls and abuse. | 0:46:41 | 0:46:43 | |
I get trolled all the time - I look like a 12-year-old, | 0:46:43 | 0:46:46 | |
some age better than others, what can you do? | 0:46:46 | 0:46:48 | |
LAUGHTER | 0:46:48 | 0:46:49 | |
You wrote an article, if I may say, it was provocative, | 0:46:49 | 0:46:52 | |
as most of your articles can be, | 0:46:52 | 0:46:54 | |
but it was pretty thoughtful and well-argued about the fact that | 0:46:54 | 0:46:59 | |
having Jeremy Corbyn as leader might not be the best idea in the world, | 0:46:59 | 0:47:04 | |
and it was interesting, and I watched it happen, | 0:47:04 | 0:47:06 | |
it unfolded in front of me like a fight in the school yard. | 0:47:06 | 0:47:09 | |
You weren't behaving in that manner, but other people were. | 0:47:09 | 0:47:12 | |
The abuse you got was extraordinary. | 0:47:12 | 0:47:14 | |
Well, I am a right-wing sell-out, as my mum would tell me. | 0:47:14 | 0:47:17 | |
-No, I'm joking. -There's not much room for civilised debate sometimes. | 0:47:17 | 0:47:20 | |
No, but I would say this actually, because, yes, | 0:47:20 | 0:47:22 | |
I've had that kind of attack and it seems ridiculous, | 0:47:22 | 0:47:25 | |
we've had a lot of focus on that, | 0:47:25 | 0:47:28 | |
but the vast majority of abuse I get is from the far right | 0:47:28 | 0:47:31 | |
and the hard right, and that is stuff like threats of violence, | 0:47:31 | 0:47:34 | |
threats of death, sharing my home address, | 0:47:34 | 0:47:37 | |
sharing pictures from Google Street View of my bedroom | 0:47:37 | 0:47:40 | |
and my front door, with arrows pointing at them saying, | 0:47:40 | 0:47:42 | |
"Here's his bedroom, here's his door." | 0:47:42 | 0:47:45 | |
Now, if you're taking an absolute view on free speech, which is people | 0:47:45 | 0:47:49 | |
have the right to say I should be chopped up because I'm gay, | 0:47:49 | 0:47:52 | |
that people online should... I don't take it seriously, I should say. | 0:47:52 | 0:47:56 | |
Some of my friends think I should take it | 0:47:56 | 0:47:57 | |
far more seriously than I do, but nonetheless, | 0:47:57 | 0:47:59 | |
in the interests of free speech, people can basically issue a hate... | 0:47:59 | 0:48:03 | |
So, what do we do about it? | 0:48:03 | 0:48:05 | |
-Well, it should be. -Over to you... | 0:48:05 | 0:48:07 | |
In the real world, if someone says you should... | 0:48:07 | 0:48:10 | |
It's not the real world. | 0:48:10 | 0:48:11 | |
It is the real world online, of course it is, | 0:48:11 | 0:48:13 | |
and I have to say, | 0:48:13 | 0:48:15 | |
after the murder of Jo Cox by a far-right terrorist, | 0:48:15 | 0:48:18 | |
we should take far more seriously the sort of hate speech | 0:48:18 | 0:48:21 | |
we see online, which often legitimises the sorts of extremism | 0:48:21 | 0:48:23 | |
-that can end in violence. -There you are, | 0:48:23 | 0:48:25 | |
that's where that ends up, Ella. | 0:48:25 | 0:48:26 | |
No, I don't think it necessarily ends up | 0:48:26 | 0:48:28 | |
in the tragic death of Jo Cox, | 0:48:28 | 0:48:30 | |
which was not through a social media hate campaign, | 0:48:30 | 0:48:33 | |
that was not what happened. | 0:48:33 | 0:48:34 | |
-That's not what I said. -But the thing is, | 0:48:34 | 0:48:38 | |
the left used to be about protecting and arguing for | 0:48:38 | 0:48:41 | |
people's freedoms, and right now, | 0:48:41 | 0:48:42 | |
a lot of the trolling online comes from, yes, right-wing saddos | 0:48:42 | 0:48:48 | |
who are keyboard warriors and they really wouldn't threaten a mouse, | 0:48:48 | 0:48:50 | |
let alone women or... | 0:48:50 | 0:48:52 | |
There's left-wing saddos, as well, for the sake of BBC impartiality. | 0:48:52 | 0:48:57 | |
It comes from both sides, | 0:48:57 | 0:48:59 | |
but on the left we used to argue for freedom, and absolute freedom, | 0:48:59 | 0:49:03 | |
and the freedom for somebody to be able to express themselves | 0:49:03 | 0:49:06 | |
in any way, and that's actually now the biggest argument | 0:49:06 | 0:49:10 | |
against free speech, | 0:49:10 | 0:49:12 | |
which is an absolute right in a free society, | 0:49:12 | 0:49:14 | |
if you don't have free speech, | 0:49:14 | 0:49:15 | |
you don't have any kind of freedom, | 0:49:15 | 0:49:17 | |
cos it's the fundamental right to say what you believe, | 0:49:17 | 0:49:20 | |
and the left is now trying to close that down through sanctions... | 0:49:20 | 0:49:23 | |
-Andrew, wait a minute. -..through sanctions and censorship and rules. | 0:49:23 | 0:49:26 | |
Dr Andrew Calcutt, is this just a necessary | 0:49:26 | 0:49:32 | |
or unavoidable consequence of the digital media platforms? | 0:49:32 | 0:49:35 | |
Is it just a manifestation of aspects of the human condition, | 0:49:35 | 0:49:40 | |
or is it something that we should stamp on? | 0:49:40 | 0:49:42 | |
Oh, I certainly don't think we should be stamping | 0:49:42 | 0:49:46 | |
and regulating things out of existence, not by any means. | 0:49:46 | 0:49:50 | |
I think it would just be ludicrous to try and close it down, | 0:49:50 | 0:49:55 | |
and sanitise it, and outlaw the deplorables. | 0:49:55 | 0:50:01 | |
It's the same kind of Hillary Clinton discourse all over again. | 0:50:01 | 0:50:06 | |
You shouldn't be allowed to say that "we don't want you", | 0:50:06 | 0:50:09 | |
even in the echo chamber. | 0:50:09 | 0:50:11 | |
But what about, this is something, we'll come back to Dan, | 0:50:11 | 0:50:13 | |
cos Dan mentioned this earlier on, | 0:50:13 | 0:50:15 | |
and I think somebody else did, too - seems an awful long time ago - | 0:50:15 | 0:50:17 | |
Facebook and Twitter, are they tech platforms or are they publishers? | 0:50:17 | 0:50:22 | |
Well, they're something like a hybrid in between the two. | 0:50:22 | 0:50:26 | |
But I don't think they are either the problem or the solution. | 0:50:26 | 0:50:31 | |
I think we've lived through decades of intense interpersonalisation, | 0:50:31 | 0:50:36 | |
privatisation of everyday life, | 0:50:36 | 0:50:40 | |
mixed in with a kind of post-political managerial elite | 0:50:40 | 0:50:44 | |
that wants to micromanage every aspect of our existence, | 0:50:44 | 0:50:48 | |
including what we do online... | 0:50:48 | 0:50:50 | |
I don't think anyone's saying that. | 0:50:50 | 0:50:53 | |
-Forgive me... -Dan, come back on that. | 0:50:53 | 0:50:56 | |
I don't think anyone's saying, I would imagine... | 0:50:56 | 0:50:59 | |
Well, of course, you're not going to say it. | 0:50:59 | 0:51:01 | |
I would imagine everyone on this panel... | 0:51:01 | 0:51:03 | |
Because I'm revealing what you don't want to say. | 0:51:03 | 0:51:06 | |
..wants a society of free speech in this country. | 0:51:06 | 0:51:08 | |
Yeah, well, they all say that, come on. | 0:51:08 | 0:51:11 | |
But a total and utter free-for-all ignores the fact that | 0:51:11 | 0:51:15 | |
there are other aspects of existence that also have importance. | 0:51:15 | 0:51:18 | |
We're not talking about micromanaging that out of existence, | 0:51:18 | 0:51:21 | |
we're just talking about creating an environment | 0:51:21 | 0:51:25 | |
where there's some basic decency. | 0:51:25 | 0:51:26 | |
Creating an environment means managing it. | 0:51:26 | 0:51:28 | |
But we already have that! We already have that in our media. | 0:51:28 | 0:51:31 | |
One of the best things about the original incarnation of | 0:51:31 | 0:51:33 | |
the internet was the Electronic Frontier Foundation, | 0:51:33 | 0:51:36 | |
which saw it as a frontier for experimentation... | 0:51:36 | 0:51:39 | |
But we already have it in the world | 0:51:39 | 0:51:41 | |
of television and newspapers and radio. | 0:51:41 | 0:51:44 | |
THEY TALK OVER EACH OTHER | 0:51:44 | 0:51:46 | |
Dan, as a Channel 4 man... Wait a minute, everybody, please! | 0:51:46 | 0:51:50 | |
Or I'll unfollow you all. | 0:51:50 | 0:51:51 | |
LAUGHTER | 0:51:51 | 0:51:53 | |
As a TV man, as a Channel 4 man, are you not slightly on | 0:51:54 | 0:51:59 | |
the defensive here cos you're old news | 0:51:59 | 0:52:02 | |
and this is what's happening now? | 0:52:02 | 0:52:05 | |
It's going to happen in the future, because the TV news - | 0:52:05 | 0:52:07 | |
Channel 4 News - great programme - Sky News, they do a wonderful job, | 0:52:07 | 0:52:11 | |
BBC, ITN, likewise - but don't they all look a little bit leaden-footed? | 0:52:11 | 0:52:16 | |
No, I don't think that's right, actually. | 0:52:16 | 0:52:19 | |
I don't think that is a view of what is actually happening. | 0:52:19 | 0:52:22 | |
As I said earlier, television is the main source of news for most people. | 0:52:22 | 0:52:27 | |
But the traditional media companies - Channel 4 included, | 0:52:27 | 0:52:30 | |
the BBC included - have done an incredible job of going into | 0:52:30 | 0:52:33 | |
the new world and distributing video-based news. | 0:52:33 | 0:52:36 | |
Channel 4 News is now the biggest British broadcaster | 0:52:36 | 0:52:40 | |
putting news into Facebook. | 0:52:40 | 0:52:42 | |
But I think you've done a deplorable job of telling the story of | 0:52:42 | 0:52:46 | |
what's been going on in Western societies for the last 20, 30 years. | 0:52:46 | 0:52:50 | |
That's why you didn't get | 0:52:50 | 0:52:51 | |
that people were going to vote for Brexit. | 0:52:51 | 0:52:54 | |
That's why the mainstream media didn't get that people were | 0:52:54 | 0:52:57 | |
going to vote for Trump in the way that they did... | 0:52:57 | 0:52:59 | |
-That's as much about politicians... -..because you haven't covered the story. | 0:52:59 | 0:53:02 | |
Sorry, just to complete that point, I think the important | 0:53:02 | 0:53:05 | |
thing to say about the old media going into the new media space | 0:53:05 | 0:53:08 | |
is that we, the BBC, the other public service broadcasters, | 0:53:08 | 0:53:11 | |
we apply the standards that exist in the television world - | 0:53:11 | 0:53:15 | |
which are much higher, and, in the UK, | 0:53:15 | 0:53:17 | |
are the gold standard in the world, in any form of media - | 0:53:17 | 0:53:21 | |
we apply those standards in a world | 0:53:21 | 0:53:23 | |
where other people aren't applying them. | 0:53:23 | 0:53:25 | |
That's not to say the internet-only players don't have terrifically | 0:53:25 | 0:53:28 | |
high standards. Some of them do, people like Huffington Post. | 0:53:28 | 0:53:31 | |
I think you're covering yourself in medals you don't deserve. | 0:53:31 | 0:53:34 | |
It's more like self-service broadcasting. | 0:53:34 | 0:53:36 | |
-But there are many who don't. -Andrew's on fire now! | 0:53:36 | 0:53:40 | |
Interesting points though, interesting points. | 0:53:40 | 0:53:42 | |
Self-service broadcasting, that's quite a... Jamie? | 0:53:42 | 0:53:45 | |
-What? -You've been listening? | 0:53:45 | 0:53:48 | |
You know about this stuff? | 0:53:48 | 0:53:50 | |
-The reality is, I think... -The cat's out of the bag. | 0:53:50 | 0:53:54 | |
Well, it's getting far more unstable and unpredictable, | 0:53:54 | 0:53:57 | |
and that's also very exciting. | 0:53:57 | 0:53:59 | |
And we definitely need to reinvigorate our democracies, | 0:53:59 | 0:54:01 | |
-we need new ideas... -Do we need new controls? | 0:54:01 | 0:54:05 | |
What about Facebook, what about Twitter? | 0:54:05 | 0:54:07 | |
They're publishers, essentially, aren't they? | 0:54:07 | 0:54:09 | |
Shouldn't they be doing more? | 0:54:09 | 0:54:10 | |
Even if they are judged as publishers, they have an opt-out, | 0:54:10 | 0:54:13 | |
under Section 230 of the 1996 Communications Decency Act in the US, which means they're not | 0:54:13 | 0:54:17 | |
responsible for things that are published on their platform. | 0:54:17 | 0:54:20 | |
Because, if they were, they would probably collapse immediately, | 0:54:20 | 0:54:23 | |
because they'd have to check everything before it was posted, | 0:54:23 | 0:54:25 | |
and we're talking about millions, billions... | 0:54:25 | 0:54:27 | |
You can go online right now, you can find IS, you can find vile | 0:54:27 | 0:54:30 | |
Islamism, you can find vile Nazism, you can find Holocaust denial... | 0:54:30 | 0:54:34 | |
It's not possible for them to actually check everything all | 0:54:34 | 0:54:37 | |
the time, it's simply too much. | 0:54:37 | 0:54:39 | |
But they have a moral imperative to do significantly more | 0:54:39 | 0:54:42 | |
than they are doing. | 0:54:42 | 0:54:43 | |
At the moment, they have immense power, as companies, | 0:54:43 | 0:54:47 | |
and they are simply not exercising enough responsibility. | 0:54:47 | 0:54:51 | |
I think we're mixing up two things, or, Dan, I think, | 0:54:51 | 0:54:53 | |
with respect, you're mixing up two things. | 0:54:53 | 0:54:55 | |
One is that we all, as individuals, are on social media. | 0:54:55 | 0:54:58 | |
And what we do as individuals is our business, | 0:54:58 | 0:55:01 | |
and everyone else can mind theirs. | 0:55:01 | 0:55:03 | |
There are also lots of people publishing on social media, | 0:55:03 | 0:55:05 | |
who are organisations, | 0:55:05 | 0:55:07 | |
who are doing what used to be only the remit of proper journalists | 0:55:07 | 0:55:10 | |
with, hopefully, proper journalistic standards. | 0:55:10 | 0:55:13 | |
And, obviously, as a professional journalist who lives by those | 0:55:13 | 0:55:17 | |
standards, you're offended to see people purporting to do that | 0:55:17 | 0:55:21 | |
and not living up to those standards. | 0:55:21 | 0:55:23 | |
Those are the people you have a beef with, and I think we need to be... | 0:55:23 | 0:55:26 | |
No, I have a beef with people who are outright lying, | 0:55:26 | 0:55:28 | |
and also people who are doing it simply for their own financial ends, | 0:55:28 | 0:55:31 | |
before we even get to political influence. | 0:55:31 | 0:55:34 | |
Helen, I haven't heard from you for a while? | 0:55:34 | 0:55:35 | |
The point is that social media platforms are now, | 0:55:35 | 0:55:38 | |
to some extent, institutions of democracy, if you like. | 0:55:38 | 0:55:42 | |
I mean, they are where our democracies are played out. | 0:55:42 | 0:55:44 | |
They're where our campaigns take place, | 0:55:44 | 0:55:46 | |
where our elections take place, where our governments act, | 0:55:46 | 0:55:49 | |
and where the president-elect is actually MAKING policy. | 0:55:49 | 0:55:52 | |
But they've only been around at all | 0:55:52 | 0:55:55 | |
for about ten years, and they've only been intertwined | 0:55:55 | 0:55:57 | |
with our political systems for five. | 0:55:57 | 0:56:00 | |
So, I suppose, it's not surprising we've got | 0:56:00 | 0:56:02 | |
a lot of institutional catch-up to do. | 0:56:02 | 0:56:04 | |
But there are things we can do, and there are things being done. | 0:56:04 | 0:56:08 | |
I mean, not enough, it's too late, et cetera, | 0:56:08 | 0:56:11 | |
but we shouldn't just throw our hands in the air and give up. | 0:56:11 | 0:56:14 | |
And we shouldn't lump everything together. | 0:56:14 | 0:56:17 | |
We're lumping together fake news and trolls - the small, | 0:56:17 | 0:56:20 | |
very small number of people who make threats - and talk of | 0:56:20 | 0:56:24 | |
closing down Twitter entirely. | 0:56:24 | 0:56:26 | |
Right now, this is a fantastic debate, but out there, | 0:56:26 | 0:56:30 | |
there is an even more fantastic debate about this debate, | 0:56:30 | 0:56:34 | |
happening right now, on Twitter. | 0:56:34 | 0:56:36 | |
If only they could all... | 0:56:36 | 0:56:37 | |
That's right, even this debate about politics is being played out | 0:56:37 | 0:56:41 | |
on social media as we speak, exactly. | 0:56:41 | 0:56:43 | |
But the solutions to these things are different, as well. | 0:56:43 | 0:56:46 | |
The solutions to fake news are things like your organisation | 0:56:46 | 0:56:50 | |
and what Facebook have said they will do about fake news. | 0:56:50 | 0:56:53 | |
Both what they do themselves, in terms of fact-checking, and allowing | 0:56:53 | 0:56:56 | |
individual social media users to report. | 0:56:56 | 0:56:59 | |
The solutions are different for trolls and they are different | 0:56:59 | 0:57:04 | |
for all the other things we've been talking about. | 0:57:04 | 0:57:07 | |
But there are possibilities out there, and they are being worked at. | 0:57:07 | 0:57:11 | |
There are even bots being developed... | 0:57:11 | 0:57:13 | |
Bots, just explain to people who don't know what a bot is? | 0:57:13 | 0:57:16 | |
An automated Twitter account, for example. | 0:57:16 | 0:57:18 | |
There are automated Twitter accounts out there facing down racism, | 0:57:18 | 0:57:23 | |
attacking racism, or just trying to persuade racist trolls of the | 0:57:23 | 0:57:30 | |
error of their ways, et cetera. | 0:57:30 | 0:57:33 | |
And some of those things are being shown to work. | 0:57:33 | 0:57:36 | |
So, there are a lot of possibilities out there, | 0:57:36 | 0:57:38 | |
but we can't just lump them all together. | 0:57:38 | 0:57:39 | |
Chi, we're getting towards the end, but we've got about... | 0:57:39 | 0:57:43 | |
A 40-second answer from you, if you would? | 0:57:43 | 0:57:46 | |
Do you celebrate this? | 0:57:46 | 0:57:47 | |
Is it a better form of contact with your constituents, | 0:57:47 | 0:57:50 | |
as an elected representative? | 0:57:50 | 0:57:52 | |
I celebrate it as an elected representative, in terms of | 0:57:52 | 0:57:55 | |
being able, instantly, almost, to see some constituents - | 0:57:55 | 0:57:59 | |
those who are on Twitter, or those who are on Facebook - | 0:57:59 | 0:58:03 | |
respond to really important issues. That is fantastic, online. | 0:58:03 | 0:58:07 | |
But I also recognise there are so many who aren't making those | 0:58:07 | 0:58:10 | |
responses, because a lot of people have better things to do with | 0:58:10 | 0:58:13 | |
their lives than respond to their MPs online! | 0:58:13 | 0:58:17 | |
But, also, I think we haven't talked about this, | 0:58:17 | 0:58:19 | |
the rules that Facebook and others use in order to decide what | 0:58:19 | 0:58:23 | |
you see, the algorithms, they are what control what people see. | 0:58:23 | 0:58:27 | |
And after this, we're all going to check our phones, right? | 0:58:27 | 0:58:30 | |
Thank you all very much indeed. | 0:58:30 | 0:58:32 | |
As always, the debate continues on the digital universe, | 0:58:32 | 0:58:35 | |
on Twitter and online, join us next Sunday from Bradford, | 0:58:35 | 0:58:37 | |
goodbye from everyone here in Uxbridge. | 0:58:37 | 0:58:39 | |
Have a great Sunday. | 0:58:39 | 0:58:41 | |
APPLAUSE | 0:58:41 | 0:58:43 |