Line | From | To | |
---|---|---|---|
Tonight on Panorama, | 0:00:02 | 0:00:04 | |
we investigate one of the most popular companies on the planet. | 0:00:04 | 0:00:08 | |
I think the relationship of users to Facebook is that of a drug addict with a drug, in many ways. | 0:00:09 | 0:00:14 | |
Facebook may know more about us than any organisation in history. | 0:00:14 | 0:00:18 | |
What I care about is giving people the power to share, giving every person a voice, | 0:00:18 | 0:00:22 | |
so we can make the world more open and connected. | 0:00:22 | 0:00:25 | |
It makes a fortune from advertising, using our personal information. | 0:00:25 | 0:00:29 | |
It's scary, we are all walking barcodes. | 0:00:31 | 0:00:33 | |
And politicians are now using our data to target us, too. | 0:00:34 | 0:00:38 | |
The way we bought media on Facebook was like no-one else | 0:00:40 | 0:00:44 | |
in politics had ever done. | 0:00:44 | 0:00:45 | |
If it's influencing elections, is it now time to control Facebook? | 0:00:45 | 0:00:51 | |
With Facebook, you have a media which is increasingly seen as the most | 0:00:51 | 0:00:55 | |
valuable media that campaigns invest in, in the election period, | 0:00:55 | 0:00:58 | |
but which is totally unregulated. | 0:00:58 | 0:01:00 | |
32 million of us in the UK share our lives on Facebook. | 0:01:19 | 0:01:23 | |
You may have high privacy settings, but anything you share with friends | 0:01:26 | 0:01:31 | |
can become public if they don't have high privacy settings, too. | 0:01:31 | 0:01:34 | |
-I'm trying to get a look at how private your settings really are. -Yeah. | 0:01:36 | 0:01:40 | |
-How much of your life is out there. -Anything on cats I love. | 0:01:40 | 0:01:44 | |
-So everyone can just see those? -Yes. -Oh, my God. | 0:01:44 | 0:01:47 | |
Facebook says it helps people manage their privacy and have control. | 0:01:50 | 0:01:55 | |
Oh! | 0:01:55 | 0:01:56 | |
Right, so I think I'll have to change some of my settings now. | 0:01:56 | 0:02:00 | |
But our tests show how information and pictures can slip into | 0:02:02 | 0:02:07 | |
public view. | 0:02:07 | 0:02:08 | |
We're looking at basically weapons here. | 0:02:08 | 0:02:12 | |
-But are they available? -They are available, yes. -Wow. | 0:02:12 | 0:02:16 | |
Although I support it, I don't want it in public view. | 0:02:16 | 0:02:20 | |
-These are pages that you've liked. -Yeah. | 0:02:20 | 0:02:23 | |
-Metallica. -Yeah. | 0:02:23 | 0:02:25 | |
I'm surprised that people can get to that level and get this information. | 0:02:27 | 0:02:34 | |
I don't think some of my friends will be very happy about that. | 0:02:34 | 0:02:37 | |
So what did you think overall? | 0:02:40 | 0:02:43 | |
Overall, I'm very concerned because I think my pictures are private, | 0:02:43 | 0:02:47 | |
and I like those. | 0:02:47 | 0:02:48 | |
So I thought that it was just your profile picture which you | 0:02:48 | 0:02:51 | |
know is always public. As opposed to... | 0:02:51 | 0:02:55 | |
the rest that we've seen today, so... | 0:02:55 | 0:02:59 | |
-Yeah. -There weren't too many. | 0:02:59 | 0:03:00 | |
No, there weren't, but, you know, there are too many for me. | 0:03:00 | 0:03:05 | |
When we first launched, we were hoping for, you know, | 0:03:09 | 0:03:12 | |
maybe 400, 500 people. | 0:03:12 | 0:03:13 | |
And now we're at 100,000 people, so who knows where we're going next. | 0:03:13 | 0:03:17 | |
Mark Zuckerberg's college project is now used by | 0:03:19 | 0:03:22 | |
almost 2 billion people. | 0:03:22 | 0:03:24 | |
Facebook is where many of us share those important moments in our lives. | 0:03:26 | 0:03:30 | |
It's transformed the way the world stays in touch. | 0:03:30 | 0:03:33 | |
Our development is guided by the idea that every year, | 0:03:34 | 0:03:38 | |
the amount that people want to add and share and express is increasing. | 0:03:38 | 0:03:42 | |
The freedom of the internet has helped Facebook grow so quickly. | 0:03:43 | 0:03:48 | |
But after years of unfettered free speech, | 0:03:48 | 0:03:51 | |
social media companies are now facing severe criticism | 0:03:51 | 0:03:55 | |
and being called to account for the material on their sites. | 0:03:55 | 0:03:58 | |
With respect, I... | 0:04:01 | 0:04:03 | |
You always preface your comments "with respect", but it's a simple | 0:04:03 | 0:04:07 | |
question - do you feel any shame at all by some of what we've seen here? | 0:04:07 | 0:04:11 | |
I don't think it's a simple question to suggest, do you have no shame? | 0:04:11 | 0:04:15 | |
I feel very responsible, as do my colleagues, | 0:04:15 | 0:04:17 | |
for the safety of the 1.9 billion people using our service. | 0:04:17 | 0:04:21 | |
-And I'm very proud of the work we do. -Proud of the work... | 0:04:21 | 0:04:25 | |
I don't think Facebook should be necessarily responsible for | 0:04:25 | 0:04:28 | |
every idiot on the internet. But we need to agree on that. | 0:04:28 | 0:04:32 | |
We need to have regulation that reflects what we think, | 0:04:32 | 0:04:35 | |
what our values are. And right now, we haven't got any... | 0:04:35 | 0:04:39 | |
any kind of framework that sets that out. | 0:04:39 | 0:04:42 | |
Facebook is the giant of social media. | 0:04:49 | 0:04:52 | |
That scale and power makes it different from other internet companies. | 0:04:52 | 0:04:57 | |
It has personal information about every aspect of our lives. | 0:04:57 | 0:05:01 | |
It uses that information to create the most targeted advertising | 0:05:03 | 0:05:07 | |
machine in history. | 0:05:07 | 0:05:09 | |
And the ad industry loves it. | 0:05:09 | 0:05:11 | |
The amount of targeting you can do on Facebook is extraordinary. | 0:05:13 | 0:05:17 | |
Sometimes I'll be on a website called Competitive Cyclists, | 0:05:17 | 0:05:20 | |
and I'll be looking at a new wheel set or something. | 0:05:20 | 0:05:23 | |
Well, they'll chase me all over the internet with ads for bicycle parts. | 0:05:23 | 0:05:27 | |
Well, if I'm on Facebook and I'm going to see ads, I'd rather | 0:05:27 | 0:05:30 | |
see an ad for a bicycle part than a pair of women's shoes, right? | 0:05:30 | 0:05:34 | |
So, as a user, I'm into it because now they're showing | 0:05:34 | 0:05:37 | |
me ads about things that I'm interested in. | 0:05:37 | 0:05:39 | |
It's scary. We are all walking barcodes. | 0:05:41 | 0:05:43 | |
HE LAUGHS | 0:05:43 | 0:05:45 | |
It's a brilliant business model that has turned Facebook into | 0:05:48 | 0:05:51 | |
a $400 billion company. | 0:05:51 | 0:05:54 | |
We produce Facebook's content for free, | 0:05:55 | 0:05:58 | |
while advertisers queue up to sell us products. | 0:05:58 | 0:06:02 | |
It's the ultimate crowd-funded media company. | 0:06:02 | 0:06:05 | |
All the media comes from the users, and all the viewers are users, | 0:06:05 | 0:06:09 | |
and advertisers get to slip into the middle of that. | 0:06:09 | 0:06:12 | |
People love Facebook. But there is a potential downside. | 0:06:16 | 0:06:21 | |
Its computer algorithms track our lives. | 0:06:21 | 0:06:25 | |
It's claimed that Facebook has more information about us than any | 0:06:25 | 0:06:28 | |
government organisation or any other business ever. | 0:06:28 | 0:06:32 | |
And the details can cover just about every aspect of our lives. | 0:06:32 | 0:06:36 | |
This is the house that Mark built. | 0:06:36 | 0:06:38 | |
Ultimately, Facebook is the company that is making billions | 0:06:46 | 0:06:51 | |
out of this data. | 0:06:51 | 0:06:53 | |
It has immense power and it has immense responsibility that | 0:06:53 | 0:06:56 | |
goes with that, and I don't think Facebook recognises that. | 0:06:56 | 0:06:59 | |
The way Facebook's algorithms work | 0:07:13 | 0:07:15 | |
is one of its most closely guarded secrets. | 0:07:15 | 0:07:18 | |
But one former insider has agreed to talk to me. | 0:07:23 | 0:07:27 | |
We're sailing between the San Juan Islands in Washington state, | 0:07:28 | 0:07:33 | |
right up in the north-west corner of America. | 0:07:33 | 0:07:36 | |
Canada, a couple of miles that direction. | 0:07:36 | 0:07:39 | |
Silicon Valley, about 1,000 miles that way. | 0:07:39 | 0:07:42 | |
Antonio Garcia Martinez helped build Facebook's advertising machine. | 0:07:47 | 0:07:52 | |
He now has a very different lifestyle. | 0:07:55 | 0:07:57 | |
He left the company after a difference of opinion. | 0:08:00 | 0:08:02 | |
Somewhat like being inducted into a cult. | 0:08:04 | 0:08:07 | |
There's actually an initiation ritual called "onboarding", in which | 0:08:07 | 0:08:10 | |
they try to inculcate the values of Facebook and the Facebook | 0:08:10 | 0:08:12 | |
way of doing things. | 0:08:12 | 0:08:14 | |
I guess the closest example might be a baptism in the case of a Catholic Church, | 0:08:14 | 0:08:17 | |
or perhaps more an evangelical church, where as an adult, | 0:08:17 | 0:08:19 | |
you sort of leave your sort of past and adopt this new path in life. | 0:08:19 | 0:08:23 | |
Well, if it's a cult, every cult has to have a leader. | 0:08:23 | 0:08:25 | |
-Who's the leader? -Yeah, well, Mark Zuckerberg is the prophet of that religion. | 0:08:25 | 0:08:29 | |
And he's very much in charge. | 0:08:29 | 0:08:32 | |
He's definitely the Jesus of that church. | 0:08:32 | 0:08:34 | |
Antonio knows how the world's most powerful advertising tool works. | 0:08:36 | 0:08:40 | |
This is how the internet gets paid for. | 0:08:43 | 0:08:46 | |
Every time you load almost any page on the internet, | 0:08:50 | 0:08:52 | |
or any app on your phone. | 0:08:52 | 0:08:55 | |
In that moment, literally in that instant, we're talking tens of milliseconds, | 0:08:55 | 0:08:58 | |
there are signals that go out that say, "Hey, this person showed up." | 0:08:58 | 0:09:02 | |
Then comes the real magic. | 0:09:07 | 0:09:09 | |
Instantly, Facebook gathers all it knows about you and matches | 0:09:12 | 0:09:17 | |
it with data from your real life. | 0:09:17 | 0:09:21 | |
Your age, your home. | 0:09:21 | 0:09:24 | |
Your bank, your credit card purchases... | 0:09:25 | 0:09:29 | |
your holidays... | 0:09:29 | 0:09:31 | |
Anything that helps advertisers target you. | 0:09:31 | 0:09:34 | |
It happens hundreds of billions of times a day, all over the internet. | 0:09:35 | 0:09:40 | |
How long does that process take? | 0:09:40 | 0:09:42 | |
Literally half the time that it takes for you to blink your eyes. | 0:09:45 | 0:09:48 | |
That's how fast it is. | 0:09:48 | 0:09:50 | |
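The flow Antonio describes, a page load emitting a "this person showed up" signal, the platform pulling together what it knows about the user, and an ad being chosen within tens of milliseconds, can be sketched in miniature. The Python toy below is purely illustrative: the profile fields, advertisers, bids, and targeting rules are invented for the example and are not Facebook's actual system.

```python
import time

def handle_ad_request(user_id, profiles, advertisers):
    """Toy version of the real-time matching described above: a signal
    says "this person showed up", the platform gathers what it knows
    about them, and eligible advertisers compete for the impression."""
    start = time.perf_counter()
    profile = profiles.get(user_id, {})  # everything known about the user
    eligible = [a for a in advertisers if a["targets"](profile)]
    winner = max(eligible, key=lambda a: a["bid"], default=None)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return {"ad": winner["ad"] if winner else None, "elapsed_ms": elapsed_ms}

# Hypothetical profile and advertisers, invented for illustration.
profiles = {"u1": {"age": 34, "interests": {"cycling"}, "region": "UK"}}
advertisers = [
    {"ad": "bicycle wheel set", "bid": 2.10,
     "targets": lambda p: "cycling" in p.get("interests", set())},
    {"ad": "women's shoes", "bid": 1.40,
     "targets": lambda p: p.get("age", 0) >= 18},
]
print(handle_ad_request("u1", profiles, advertisers))
# Picks "bicycle wheel set": the higher bid whose targeting rule matches.
```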
'Facebook says it complies with all regulations that apply to it. | 0:09:52 | 0:09:55 | |
'But critics say those regulations were not designed for | 0:09:57 | 0:10:00 | |
'a company of Facebook's reach and complexity.' | 0:10:00 | 0:10:03 | |
Facebook has brought real value to many people's lives but the | 0:10:04 | 0:10:08 | |
fact is that it's not about whether or not | 0:10:08 | 0:10:13 | |
Mark Zuckerberg is a nice guy or has benign intentions, | 0:10:13 | 0:10:18 | |
it's about people's rights and responsibilities. | 0:10:18 | 0:10:22 | |
I can see a future where we are too tied into Facebook, as citizens, | 0:10:22 | 0:10:27 | |
where we are scared to move from it, | 0:10:27 | 0:10:29 | |
leave it, because it's got all our data. | 0:10:29 | 0:10:31 | |
But I can also see a future with effective regulation, | 0:10:31 | 0:10:34 | |
and that's also the future, I think, which is best for Facebook. | 0:10:34 | 0:10:37 | |
The trouble is, | 0:10:41 | 0:10:43 | |
Facebook's computer algorithms are beyond the comprehension of | 0:10:43 | 0:10:46 | |
most of its own staff, let alone government regulators. | 0:10:46 | 0:10:50 | |
You know what's naive? | 0:10:51 | 0:10:53 | |
The thought that a European bureaucrat, | 0:10:53 | 0:10:55 | |
who's never managed so much as a blog, could somehow, | 0:10:55 | 0:10:57 | |
in any way, get up to speed or even attempt to regulate the algorithm | 0:10:57 | 0:11:01 | |
that drives literally a quarter of the internet in the entire world, | 0:11:01 | 0:11:05 | |
right, that's never going to happen, in any realistic way. | 0:11:05 | 0:11:08 | |
Knowing what Facebook does with our personal information is now | 0:11:12 | 0:11:15 | |
more important than ever. | 0:11:15 | 0:11:18 | |
Because it's no longer just advertisers using our data. | 0:11:18 | 0:11:21 | |
Politicians are targeting us, too. | 0:11:22 | 0:11:25 | |
-MAN: -Isis scum! -CROWD: -Off our streets! | 0:11:25 | 0:11:28 | |
-Isis scum! -Off our streets! | 0:11:28 | 0:11:30 | |
Take the far-right group Britain First. | 0:11:30 | 0:11:34 | |
They've told Panorama | 0:11:34 | 0:11:36 | |
that they pay Facebook to repeatedly promote their videos. | 0:11:36 | 0:11:40 | |
And it works. Britain First now has 1.6 million Facebook followers. | 0:11:42 | 0:11:48 | |
The British people have spoken and the answer is, we're out. | 0:11:50 | 0:11:53 | |
But it was the Brexit vote last summer that revealed the true | 0:11:55 | 0:11:57 | |
power of Facebook. | 0:11:57 | 0:11:59 | |
CHEERING | 0:11:59 | 0:12:01 | |
I think Facebook was a game-changer for the campaign, | 0:12:03 | 0:12:07 | |
I don't think there's any question about it. | 0:12:07 | 0:12:10 | |
Both sides in the referendum used Facebook to target us individually. | 0:12:10 | 0:12:14 | |
'You can say to Facebook, "I would like to make sure that I can' | 0:12:15 | 0:12:20 | |
"micro-target that fisherman, in certain parts of the UK," | 0:12:20 | 0:12:26 | |
so that they are specifically hearing, on Facebook, | 0:12:26 | 0:12:30 | |
that if you vote to leave, | 0:12:30 | 0:12:33 | |
that you will be able to change the way that the regulations | 0:12:33 | 0:12:40 | |
are set for the fishing industry. | 0:12:40 | 0:12:42 | |
Now, I can do the exact same thing, for example, for people who live | 0:12:42 | 0:12:46 | |
in the Midlands, that are struggling because the factory has shut down. | 0:12:46 | 0:12:50 | |
So I may send a specific message through Facebook to them | 0:12:50 | 0:12:55 | |
that nobody else is seeing. | 0:12:55 | 0:12:57 | |
However you want to micro-target that message, | 0:12:59 | 0:13:02 | |
you can do it on Facebook. | 0:13:02 | 0:13:04 | |
The referendum showed how much Facebook has changed. | 0:13:04 | 0:13:07 | |
We think Facebook is about keeping in touch with family and friends. | 0:13:12 | 0:13:17 | |
But it's quietly become an important political player. | 0:13:18 | 0:13:21 | |
Donald Trump's presidential campaign took full advantage. | 0:13:24 | 0:13:30 | |
The way we bought media on Facebook was like no-one else in politics has ever done. | 0:13:30 | 0:13:35 | |
How much money would you have spent with Facebook? Could you tell me? | 0:13:35 | 0:13:38 | |
Uh, that's a number we haven't publicised. | 0:13:38 | 0:13:41 | |
If I said 70 million, from the official campaign, | 0:13:41 | 0:13:44 | |
-would that be there or thereabouts? -Uh, I would say that's nearby, yes. | 0:13:44 | 0:13:48 | |
By using people's personal information, Facebook helped | 0:13:56 | 0:14:00 | |
the campaigns to target voters more effectively than ever before. | 0:14:00 | 0:14:04 | |
CROWD CHANTS AND APPLAUDS | 0:14:04 | 0:14:07 | |
We take your name, address, you know, phone number, | 0:14:07 | 0:14:09 | |
if we have it, e-mail, and we can take these pieces of data, | 0:14:09 | 0:14:11 | |
basically put 'em in a file, send 'em to Facebook, | 0:14:11 | 0:14:15 | |
Facebook then matches them to the user's profile. | 0:14:15 | 0:14:18 | |
So, if you're on Facebook and your profile says, | 0:14:18 | 0:14:21 | |
"I live at so-and-so address", I can then match you. | 0:14:21 | 0:14:23 | |
And then once you're matched, I can put you into what is called an audience. | 0:14:23 | 0:14:27 | |
An audience is essentially a bucket of users that I can then target. | 0:14:27 | 0:14:31 | |
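The matching step the strategist describes corresponds to what Facebook documents as Custom Audiences: advertisers upload identifiers (in the real product, normalised and SHA-256 hashed rather than sent raw), the platform matches them against its user profiles, and the matched users form a targetable "audience". A minimal sketch, with records invented for illustration:

```python
import hashlib

def normalize_and_hash(value: str) -> str:
    # Custom Audiences-style preparation: trim, lowercase, then SHA-256,
    # so the platform matches hashes rather than receiving raw PII.
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical campaign records, invented for illustration.
supporter_file = [
    {"email": "jane.doe@example.com", "phone": "15551230001"},
    {"email": "john.roe@example.com", "phone": "15551230002"},
]

# "Put 'em in a file, send 'em to Facebook": the upload payload.
payload = [
    {"EMAIL": normalize_and_hash(r["email"]),
     "PHONE": normalize_and_hash(r["phone"])}
    for r in supporter_file
]

# The platform matches each hash to a user profile; every matched user
# lands in an "audience" -- the bucket of users the campaign can target.
print(payload)
```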
-Was Facebook decisive? -Yes. | 0:14:31 | 0:14:34 | |
Now, I'm going to make our country rich again! | 0:14:35 | 0:14:39 | |
CHEERING AND APPLAUSE | 0:14:39 | 0:14:42 | |
Facebook says it wants to be the most open and transparent company. | 0:14:46 | 0:14:50 | |
But it won't tell us how much cash it took from Republicans | 0:14:54 | 0:14:57 | |
and Democrats, citing client confidentiality. | 0:14:57 | 0:15:01 | |
It's estimated the election earned Facebook at least $250 million. | 0:15:04 | 0:15:09 | |
There it is, Facebook's Washington HQ. | 0:15:13 | 0:15:16 | |
Millions and millions of campaign dollars were spent here. | 0:15:16 | 0:15:19 | |
The company had dedicated teams to help | 0:15:19 | 0:15:21 | |
the political parties reach voters. | 0:15:21 | 0:15:23 | |
Up the street, three blocks, the White House. | 0:15:24 | 0:15:27 | |
Most of us had little idea Facebook had become so involved in politics. | 0:15:29 | 0:15:34 | |
It even sent its own people to work directly with the campaigns. | 0:15:35 | 0:15:41 | |
Both parties are going to have a team that are from Facebook, | 0:15:41 | 0:15:44 | |
you know, we had folks on the ground with us, on the campaign, | 0:15:44 | 0:15:47 | |
working with us day in, day out, | 0:15:47 | 0:15:49 | |
we had dedicated staff that we would call when we ran into | 0:15:49 | 0:15:52 | |
problems or got stuck on using a new tool or product of theirs. | 0:15:52 | 0:15:55 | |
Or, we had an idea and we wanted to see if we could do it on | 0:15:55 | 0:15:58 | |
Facebook and they would help us create a unique solution. | 0:15:58 | 0:16:01 | |
We have been asking Facebook about this for weeks. | 0:16:01 | 0:16:04 | |
In terms of the US presidential election, did you have Facebook | 0:16:06 | 0:16:10 | |
people working directly with their respective campaigns? | 0:16:10 | 0:16:14 | |
No Facebook employee worked directly for a campaign. | 0:16:14 | 0:16:18 | |
Not "for", WITH, alongside? | 0:16:18 | 0:16:21 | |
So, one of the things we absolutely are there to do is to | 0:16:21 | 0:16:23 | |
help people make use of Facebook products. | 0:16:23 | 0:16:26 | |
We do have people who are... whose role is to help politicians | 0:16:26 | 0:16:30 | |
and governments make good use of Facebook. | 0:16:30 | 0:16:33 | |
And that's not just around campaigns. | 0:16:33 | 0:16:35 | |
But how many people did you have working effectively with these campaigns? | 0:16:35 | 0:16:39 | |
I can't give you the number of exactly how many people | 0:16:40 | 0:16:42 | |
worked with those campaigns but I can tell you that it was | 0:16:42 | 0:16:45 | |
completely demand driven. So it's really up to the campaigns. | 0:16:45 | 0:16:48 | |
So, according to the winners of both Brexit and the US presidential | 0:16:51 | 0:16:56 | |
race, Facebook was decisive. | 0:16:56 | 0:16:58 | |
Facebook is now expected to play a major role in our general election. | 0:16:59 | 0:17:04 | |
This matters because critics say Facebook is largely | 0:17:06 | 0:17:08 | |
unregulated and unaccountable. | 0:17:08 | 0:17:11 | |
Historically, there have been quite strict rules about the way | 0:17:11 | 0:17:14 | |
information is presented. Broadcasters worked to a very... | 0:17:14 | 0:17:17 | |
a very strict code in terms of impartiality. | 0:17:17 | 0:17:20 | |
And there are restrictions on use of advertising but on something | 0:17:20 | 0:17:23 | |
like Facebook, you have a media which is increasingly seen as | 0:17:23 | 0:17:27 | |
the most important, most valuable media that campaigns | 0:17:27 | 0:17:30 | |
invest in, in an election period, but which is totally unregulated. | 0:17:30 | 0:17:33 | |
Facebook is also under fire for fake news. | 0:17:42 | 0:17:45 | |
It was the main platform for spreading made-up stories | 0:17:49 | 0:17:52 | |
during the US presidential election. | 0:17:52 | 0:17:55 | |
Some of them concerned Donald Trump. | 0:18:02 | 0:18:06 | |
Most were about Hillary Clinton. | 0:18:06 | 0:18:08 | |
There are dozens and dozens of fake news stories about | 0:18:14 | 0:18:17 | |
Hillary Clinton to be found, | 0:18:17 | 0:18:20 | |
linking her to sex scandals and several murders. | 0:18:20 | 0:18:23 | |
'Mark Zuckerberg has tried to play down the importance of fake news.' | 0:18:26 | 0:18:31 | |
The idea that fake news on Facebook influenced the election in | 0:18:32 | 0:18:37 | |
any way, I think, is a pretty crazy idea. | 0:18:37 | 0:18:41 | |
You know, voters make decisions based on their lived experience. | 0:18:41 | 0:18:44 | |
But some of the made-up stories contained false information | 0:18:46 | 0:18:49 | |
about when to vote. | 0:18:49 | 0:18:52 | |
The Democrats say Facebook refused to take them down. | 0:18:54 | 0:18:56 | |
A lot of them were very specifically targeted at African Americans or | 0:18:59 | 0:19:04 | |
targeted at other populations that are very likely to be Democrats. | 0:19:04 | 0:19:08 | |
We took it up with all of the large social media and technology companies. | 0:19:10 | 0:19:14 | |
None of 'em was willing to just sort of get rid of the stuff. | 0:19:14 | 0:19:18 | |
-Including Facebook? -None of 'em was willing, yeah. Yeah, yeah, none of 'em. | 0:19:18 | 0:19:21 | |
Why didn't you simply take them down when you were asked to? | 0:19:24 | 0:19:27 | |
It is my understanding that indeed we did take them down when we | 0:19:27 | 0:19:30 | |
were asked to. When we were told about those posts, we took them down. And we also... | 0:19:30 | 0:19:33 | |
Well, can I just... Can I just show you that there? | 0:19:33 | 0:19:36 | |
I got that on the internet, on a Facebook page, just the other day. | 0:19:36 | 0:19:40 | |
So they're still up there. | 0:19:40 | 0:19:43 | |
One of the things about our platform is we do rely on people to | 0:19:43 | 0:19:45 | |
let us know about content they believe shouldn't be on | 0:19:45 | 0:19:48 | |
Facebook, particularly this kind of material, and therefore, | 0:19:48 | 0:19:51 | |
when people let us know about that, then we take action. | 0:19:51 | 0:19:54 | |
And it's my understanding that we did indeed take down the posts | 0:19:54 | 0:19:57 | |
that we were notified of by the Clinton campaign. | 0:19:57 | 0:20:03 | |
'Facebook says it helped 2 million people register to vote in the US election. | 0:20:03 | 0:20:08 | |
'And it has launched a series of initiatives to combat fake news. | 0:20:08 | 0:20:12 | |
'They include suspending fake accounts and working with | 0:20:14 | 0:20:17 | |
'fact checkers to highlight what Facebook calls disputed stories. | 0:20:17 | 0:20:21 | |
'But why doesn't it simply remove them?' | 0:20:23 | 0:20:26 | |
Facebook says it doesn't want to ban fake news because it doesn't | 0:20:26 | 0:20:30 | |
want to censor the internet. | 0:20:30 | 0:20:32 | |
But there's another reason the company might be reluctant to get rid of fake news. | 0:20:32 | 0:20:37 | |
All those made-up stories - they help boost its profits. | 0:20:37 | 0:20:42 | |
The key to Facebook's profitability is engagement. | 0:20:50 | 0:20:54 | |
The more time people stay online, | 0:20:54 | 0:20:56 | |
the more money Facebook earns from advertising. | 0:20:56 | 0:20:59 | |
And fake news about the US election kept people engaged. | 0:21:00 | 0:21:04 | |
Zuck got up and said, "Well, only about 1% of the content is fake news." | 0:21:06 | 0:21:10 | |
Numerically, that might be true, | 0:21:10 | 0:21:12 | |
in terms of all the content in the world. However, | 0:21:12 | 0:21:14 | |
the fraction of engagement that those posts, you know, drove, and by | 0:21:14 | 0:21:18 | |
engagement I mean likes, comments, shares, is way more than 1%. | 0:21:18 | 0:21:22 | |
And why is that? Well, because click bait, as it's called, works. | 0:21:22 | 0:21:25 | |
Right? A headline that proclaims that Hillary, whatever, | 0:21:25 | 0:21:28 | |
just murdered her campaign manager, | 0:21:28 | 0:21:29 | |
will obviously drive a lot more interest than some quiet | 0:21:29 | 0:21:32 | |
sober analysis about the state of the election, or whatever. | 0:21:32 | 0:21:35 | |
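Antonio's distinction between content share and engagement share is easy to make concrete. The numbers below are invented purely to illustrate the arithmetic, not measured figures:

```python
# Invented toy numbers: 1 fake post out of 100, but it attracts far
# more likes/comments/shares than an average post.
posts = [{"fake": True, "engagements": 500}]       # 1% of the content
posts += [{"fake": False, "engagements": 5}] * 99  # the other 99%

content_share = sum(p["fake"] for p in posts) / len(posts)
fake_engagement = sum(p["engagements"] for p in posts if p["fake"])
engagement_share = fake_engagement / sum(p["engagements"] for p in posts)

print(f"share of content:    {content_share:.0%}")     # 1%
print(f"share of engagement: {engagement_share:.0%}")  # ~50%
```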
How much money did Facebook make through fake news? | 0:21:38 | 0:21:42 | |
This is a really important issue to us, er, | 0:21:43 | 0:21:46 | |
we talk about it a lot as a company. | 0:21:46 | 0:21:48 | |
I can tell you that we estimate the amount of fake news on Facebook is | 0:21:48 | 0:21:51 | |
very small and the amount of money we make from it is negligible. | 0:21:51 | 0:21:55 | |
What does negligible actually mean? | 0:21:55 | 0:21:57 | |
It means a very small amount, compared with everything else, | 0:21:57 | 0:22:01 | |
all the other content on Facebook. And indeed, if you think about it... | 0:22:01 | 0:22:04 | |
OK, you've got your valuation right now of $400 billion, | 0:22:04 | 0:22:08 | |
in that context, can you tell me what negligible really means? | 0:22:08 | 0:22:12 | |
So, the amount of advertising revenue that we get from fake news | 0:22:12 | 0:22:16 | |
is negligible because the amount of fake news on Facebook is very small. | 0:22:16 | 0:22:19 | |
Can you give me an actual figure? | 0:22:19 | 0:22:21 | |
So, we take fake news very seriously, we estimate the amount | 0:22:21 | 0:22:24 | |
is very small and the amount of money we make from it is negligible. | 0:22:24 | 0:22:27 | |
But we want to get both of those to zero. | 0:22:27 | 0:22:28 | |
I appreciate that but what is it actually in pounds, shillings and pence? | 0:22:28 | 0:22:32 | |
We take this issue very seriously. | 0:22:32 | 0:22:34 | |
We think it's a very small amount of the content on Facebook. | 0:22:34 | 0:22:36 | |
But you can't give me an actual figure? | 0:22:36 | 0:22:39 | |
We take the issue very seriously but what is very clear, is we | 0:22:39 | 0:22:41 | |
estimate it to be very small and the amount of money that we make | 0:22:41 | 0:22:44 | |
from it is negligible and we're determined to do everything we can, | 0:22:44 | 0:22:48 | |
humanly, and technically possible, to reduce both of those to zero. | 0:22:48 | 0:22:53 | |
A Westminster enquiry into fake news is expected to resume | 0:22:55 | 0:22:59 | |
after the general election. | 0:22:59 | 0:23:01 | |
The analysis done on the spreading of fake news has shown that Facebook | 0:23:02 | 0:23:05 | |
has been the principal platform whereby fake news is shared. | 0:23:05 | 0:23:08 | |
So, we're not only looking at Facebook but clearly they will be | 0:23:08 | 0:23:11 | |
a major part of our enquiry. | 0:23:11 | 0:23:12 | |
Primarily, I think, there is an obligation on behalf of the | 0:23:12 | 0:23:15 | |
companies themselves to recognise there is a problem, that | 0:23:15 | 0:23:17 | |
they have a social obligation to do something about it. | 0:23:17 | 0:23:20 | |
But if they refuse to, | 0:23:20 | 0:23:21 | |
then we'll be looking to make recommendations about what | 0:23:21 | 0:23:24 | |
could be done to put in a better regime that, you know, | 0:23:24 | 0:23:26 | |
creates a legal obligation on behalf of those companies to combat | 0:23:26 | 0:23:29 | |
fake news and other sort of unhelpful and illicit material. | 0:23:29 | 0:23:32 | |
It's not just fake news. | 0:23:36 | 0:23:39 | |
Pressure is now growing for Facebook to take responsibility for | 0:23:42 | 0:23:46 | |
all the material on its site. | 0:23:46 | 0:23:48 | |
Critics say the company is just too slow to take content down | 0:23:57 | 0:24:01 | |
when harmful and offensive material is posted. | 0:24:01 | 0:24:04 | |
A possible group action is now being prepared by lawyers | 0:24:08 | 0:24:11 | |
representing children who have been victimised and bullied online. | 0:24:11 | 0:24:15 | |
Facebook tend to use the excuse, as most of the social networking | 0:24:18 | 0:24:21 | |
giants do - they say, "Look, we can't monitor everything. | 0:24:21 | 0:24:24 | |
"There's far too much coming in." But suddenly, if they're faced with a major threat, | 0:24:24 | 0:24:28 | |
it's surprising what they can do and what they can do quickly. | 0:24:28 | 0:24:32 | |
This is all about timing. | 0:24:32 | 0:24:33 | |
The classic example was the 1972 photograph of the naked child | 0:24:38 | 0:24:43 | |
running away from a napalm bomb in the Vietnamese village. | 0:24:43 | 0:24:46 | |
Facebook, that fell foul of their algorithm and it was pulled | 0:24:47 | 0:24:50 | |
until somebody drew it to their attention. | 0:24:50 | 0:24:53 | |
Yet, for some reason, they can't act with the same speed and dexterity | 0:24:53 | 0:24:58 | |
if some child's naked picture is posted, in a revenge porn or a schoolyard prank or whatever. | 0:24:58 | 0:25:06 | |
One common complaint is that Facebook is very slow to | 0:25:12 | 0:25:15 | |
remove material once it's complained of. Why is that? | 0:25:15 | 0:25:19 | |
We take our responsibilities really seriously in this area and we | 0:25:19 | 0:25:21 | |
aim to get...to reach all reports very quickly. | 0:25:21 | 0:25:24 | |
Indeed, we recently committed, along with other companies, | 0:25:24 | 0:25:27 | |
that when it comes to hate speech, for instance, | 0:25:27 | 0:25:29 | |
we would aim to review all reports within 24 hours. | 0:25:29 | 0:25:33 | |
And we certainly aim to get all reports of whatever nature addressed within 48 hours. | 0:25:33 | 0:25:38 | |
'After a series of controversies, Facebook last week announced | 0:25:38 | 0:25:41 | |
'plans to hire an extra 3,000 people to review content. | 0:25:41 | 0:25:45 | |
'Critics say Facebook has an unfair commercial advantage. | 0:25:52 | 0:25:56 | |
'It's a media company that doesn't check most of what it publishes. | 0:25:57 | 0:26:02 | |
'It's an advertising company that no regulator fully understands. | 0:26:02 | 0:26:07 | |
'And it's a political tool that seems beyond regulation.' | 0:26:07 | 0:26:10 | |
They have no competitors. | 0:26:13 | 0:26:16 | |
We don't even know what market it is they should be being regulated in. | 0:26:16 | 0:26:19 | |
And we're all kind of dancing around this huge ten-tonne elephant, | 0:26:19 | 0:26:23 | |
which is Facebook, and nobody knows who is responsible. | 0:26:23 | 0:26:26 | |
We're slowly finding out more about what Facebook does with our information. | 0:26:32 | 0:26:37 | |
By finding little things about you and then putting it all | 0:26:38 | 0:26:41 | |
together, in fact, they're learning a lot about my life | 0:26:41 | 0:26:44 | |
and what I'm like as a person, which I think's really weird. | 0:26:44 | 0:26:47 | |
All right. | 0:26:47 | 0:26:49 | |
We're really getting a sense of your character here, I think. | 0:26:49 | 0:26:53 | |
-Of your interests, your job, your music... -Yeah. | 0:26:53 | 0:26:55 | |
I think I should pay more attention to what I comment | 0:26:55 | 0:27:00 | |
and where I leave comments. Especially if, like, if it's a controversial subject - | 0:27:00 | 0:27:06 | |
Brexit, for example, | 0:27:06 | 0:27:09 | |
or, you know, Trump. | 0:27:09 | 0:27:11 | |
Legislators around the world now seem determined to tighten | 0:27:18 | 0:27:22 | |
the rules for Facebook. | 0:27:22 | 0:27:24 | |
They have to accept that as the owners and creators of this platform, | 0:27:26 | 0:27:29 | |
they have a social obligation towards the way it is used. | 0:27:29 | 0:27:32 | |
If their technology is being used by people to, you know, | 0:27:33 | 0:27:36 | |
spread messages of hate, to spread lies, to disrupt elections, | 0:27:36 | 0:27:40 | |
then they have a social obligation to act against that. | 0:27:40 | 0:27:43 | |
And that could be bad news for the company that Mark built. | 0:27:44 | 0:27:48 | |
If Zuck is losing sleep over anything, | 0:27:50 | 0:27:52 | |
it's the government actually imposing that level | 0:27:52 | 0:27:55 | |
of control over messaging and editorial control over what appears. | 0:27:55 | 0:27:58 | |
That's what he's worried about. | 0:27:58 | 0:28:00 | |
A quarter of the world's population has signed up to Facebook. | 0:28:06 | 0:28:10 | |
The simple social network has developed at astonishing speed. | 0:28:11 | 0:28:15 | |
Is it now time for our lawmakers and regulators to catch up? | 0:28:18 | 0:28:22 |