What Facebook Knows About You - Panorama

Transcript

Tonight on Panorama, we investigate one of the most popular companies on the planet.

I think the relationship of users to Facebook is that of a drug addict with a drug, in many ways.

Facebook may know more about us than any organisation in history.

What I care about is giving people the power to share, giving every person a voice, so we can make the world more open and connected.

It makes a fortune from advertising, using our personal information.

It's scary. We are all walking barcodes.

And politicians are now using our data to target us, too.

The way we bought media on Facebook was like no-one else in politics had ever done.

If it's influencing elections, is it now time to control Facebook?

With Facebook, you have a media which is increasingly seen as the most valuable media that campaigns invest in, in the election period, but which is totally unregulated.

32 million of us in the UK share our lives on Facebook.

You may have high privacy settings, but anything you share with friends can become public if they don't have high privacy settings, too.
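
The mechanism described here can be illustrated with a toy model. This is a hedged sketch, not Facebook's actual logic, and all the names in it are invented: the point is simply that a post's effective reach is set by the loosest setting among the people it is shared with.

```python
# Toy illustration of the point above, not Facebook's real code: content
# shared with friends can surface more widely if a friend's own settings
# (or reshares) are more permissive. All names here are invented.

VISIBILITY_RANK = {"only_me": 0, "friends": 1, "friends_of_friends": 2, "public": 3}

def effective_visibility(owner_setting, friends_settings):
    """Widest audience the content can end up reaching once friends
    with looser settings interact with or reshare it."""
    widest = owner_setting
    for setting in friends_settings:
        if VISIBILITY_RANK[setting] > VISIBILITY_RANK[widest]:
            widest = setting
    return widest

# You restrict a photo to "friends", but one friend reshares publicly:
print(effective_visibility("friends", ["friends", "public"]))  # -> public
```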

-I'm trying to get a look at how private your settings really are.
-Yeah.
-How much of your life is out there.
-Anything on cats I love.
-So everyone can just see those?
-Yes.
-Oh, my God.

Facebook says it helps people manage their privacy and have control.

Oh! Right, so I think I'll have to change some of my settings now.

But our tests show how information and pictures can slip into public view.

We're looking at basically weapons here.

-But are they available?
-They are available, yes.
-Wow.

Although I support it, I don't want it in public view.

-These are pages that you've liked.
-Yeah.
-Metallica.
-Yeah.

I'm surprised that people can get to that level and get this information. I don't think some of my friends will be very happy about that.

So what did you think overall?

Overall, I'm very concerned, because I think my pictures are private, and I like those. So I thought that it was just your profile picture which, you know, is always public, as opposed to the rest that we've seen today, so...

-Yeah.
-There weren't too many.

No, there weren't, but, you know, there are too many for me.

When we first launched, we were hoping for, you know, maybe 400, 500 people. And now we're at 100,000 people, so who knows where we're going next.

Mark Zuckerberg's college project is now used by almost 2 billion people. Facebook is where many of us share those important moments in our lives. It's transformed the way the world stays in touch.

Our development is guided by the idea that every year, the amount that people want to add and share and express is increasing.

The freedom of the internet has helped Facebook grow so quickly. But after years of unfettered free speech, social media companies are now facing severe criticism and being called to account for the material on their sites.

With respect, I...

You always preface your comments "with respect", but it's a simple question - do you feel any shame at all at some of what we've seen here?

I don't think it's a simple question to suggest, do you have no shame? I feel very responsible, as do my colleagues, for the safety of the 1.9 billion people using our service.

-And I'm very proud of the work we do.
-Proud of the work...

I don't think Facebook should necessarily be responsible for every idiot on the internet. But we need to agree that. We need to have regulation that reflects what we think, what our values are. And right now, we haven't got any kind of framework that sets that out.

Facebook is the giant of social media. That scale and power makes it different from other internet companies. It has personal information about every aspect of our lives. It uses that information to create the most targeted advertising machine in history. And the ad industry loves it.

The amount of targeting you can do on Facebook is extraordinary. Sometimes I'll be on a website called Competitive Cyclist, and I'll be looking at a new wheel set or something. Well, they'll chase me all over the internet with ads for bicycle parts. Well, if I'm on Facebook and I'm going to see ads, I'd rather see an ad for a bicycle part than a pair of women's shoes, right? So, as a user, I'm into it, because now they're showing me ads about things that I'm interested in.

It's scary. We are all walking barcodes.

HE LAUGHS

It's a brilliant business model that has turned Facebook into a $400 billion company. We produce Facebook's content for free, while advertisers queue up to sell us products.

It's the ultimate crowd-funded media company. All the media comes from the users, and all the viewers are users, and advertisers get to slip into the middle of that.

People love Facebook. But there is a potential downside. Its computer algorithms track our lives. It's claimed that Facebook has more information about us than any government organisation or any other business ever. And the details can cover just about every aspect of our lives. This is the house that Mark built.

Ultimately, Facebook is the company that is making billions out of this data. It has immense power, and it has immense responsibility that goes with that, and I don't think Facebook recognises that.

The way Facebook's algorithms work is one of its most closely guarded secrets. But one former insider has agreed to talk to me.

We're sailing between the San Juan Islands in Washington state, right up in the north-west corner of America. Canada, a couple of miles that direction. Silicon Valley, about 1,000 miles that way.

Antonio Garcia Martinez helped build Facebook's advertising machine. He now has a very different lifestyle. He left the company after a difference of opinion.

Somewhat like being inducted into a cult. There's actually an initiation ritual called "onboarding", in which they try to inculcate the values of Facebook and the Facebook way of doing things. I guess the closest example might be a baptism, in the case of the Catholic Church, or perhaps more an evangelical church, where, as an adult, you sort of leave your past and adopt this new path in life.

Well, if it's a cult, every cult has to have a leader.

-Who's the leader?
-Yeah, well, Mark Zuckerberg is the prophet of that religion. And he's very much in charge. He's definitely the Jesus of that church.

Antonio knows how the world's most powerful advertising tool works.

This is how the internet gets paid for. Every time you load almost any page on the internet, or any app on your phone - in that moment, literally in that instant, we're talking tens of milliseconds - there are signals that go out that say, "Hey, this person showed up."

Then comes the real magic. Instantly, Facebook gathers all it knows about you and matches it with data from your real life. Your age, your home. Your bank, your credit card purchases... your holidays... Anything that helps advertisers target you. It happens hundreds of billions of times a day, all over the internet.

How long does that process take?

Literally half the time that it takes for you to blink your eyes. That's how fast it is.
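
Stripped of scale, the flow he describes can be sketched like this. It is a simplified, hypothetical illustration: the profile data, campaigns, and first-price auction below are invented for the example, not Facebook's actual system or API. A page view emits a signal, stored data is joined to it, and the highest-bidding eligible ad wins, all inside a tight latency budget.

```python
import time

# Hypothetical sketch of the real-time flow described above. The data
# and the auction rule are invented for illustration only.

USER_PROFILES = {
    "user_123": {"age": 34, "city": "Leeds", "interests": {"cycling", "music"}},
}

CAMPAIGNS = [
    {"ad": "carbon wheelset", "targets": "cycling", "bid": 2.40},
    {"ad": "running shoes",   "targets": "running", "bid": 1.10},
]

def handle_page_view(user_id):
    """'Hey, this person showed up': join stored data to the signal,
    find eligible campaigns, and pick the highest bidder."""
    start = time.perf_counter()
    profile = USER_PROFILES.get(user_id)
    if profile is None:
        return None
    eligible = [c for c in CAMPAIGNS if c["targets"] in profile["interests"]]
    winner = max(eligible, key=lambda c: c["bid"], default=None)
    elapsed_ms = (time.perf_counter() - start) * 1000
    # The whole decision has to fit in tens of milliseconds.
    assert elapsed_ms < 50
    return winner["ad"] if winner else None

print(handle_page_view("user_123"))  # -> carbon wheelset
```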

Facebook says it complies with all regulations that apply to it. But critics say those regulations were not designed for a company of Facebook's reach and complexity.

Facebook has brought real value to many people's lives, but the fact is that it's not about whether or not Mark Zuckerberg is a nice guy or has benign intentions - it's about people's rights and responsibilities. I can see a future where we are too tied into Facebook, as citizens, where we are scared to move from it, leave it, because it's got all our data. But I can also see a future with effective regulation, and that's also the future, I think, which is best for Facebook.

The trouble is, Facebook's computer algorithms are beyond the comprehension of most of its own staff, let alone government regulators.

You know what's naive? The thought that a European bureaucrat, who's never managed so much as a blog, could somehow, in any way, get up to speed or even attempt to regulate the algorithm that drives literally a quarter of the internet in the entire world. Right, that's never going to happen, in any realistic way.

Knowing what Facebook does with our personal information is now more important than ever. Because it's no longer just advertisers using our data. Politicians are targeting us, too.

MAN: Isis scum!
CROWD: Off our streets!
MAN: Isis scum!
CROWD: Off our streets!

Take the far-right group Britain First. They've told Panorama that they pay Facebook to repeatedly promote their videos. And it works. Britain First now has 1.6 million Facebook followers.

The British people have spoken and the answer is, we're out.

But it was the Brexit vote last summer that revealed the true power of Facebook.

CHEERING

I think Facebook was a game-changer for the campaign. I don't think there's any question about it.

Both sides in the referendum used Facebook to target us individually.

You can say to Facebook, "I would like to make sure that I can micro-target that fisherman, in certain parts of the UK," so that they are specifically hearing, on Facebook, that if you vote to leave, you will be able to change the way that the regulations are set for the fishing industry. Now, I can do the exact same thing, for example, for people who live in the Midlands, that are struggling because the factory has shut down. So I may send a specific message through Facebook to them that nobody else is seeing. However you want to micro-target that message, you can do it on Facebook.

The referendum showed how much Facebook has changed. We think Facebook is about keeping in touch with family and friends. But it's quietly become an important political player.

Donald Trump's presidential campaign took full advantage.

The way we bought media on Facebook was like no-one else in politics has ever done.

How much money would you have spent with Facebook? Could you tell me?

Uh, that's a number we haven't publicised.

-If I said $70 million, from the official campaign, would that be there or thereabouts?
-Uh, I would say that's nearby, yes.

By using people's personal information, Facebook helped the campaigns to target voters more effectively than ever before.

CROWD CHANTS AND APPLAUDS

We take your name, address, you know, phone number, if we have it, e-mail, and we can take these pieces of data, basically put 'em in a file, send 'em to Facebook. Facebook then matches them to the user's profile. So, if you're on Facebook and you have entered "I live at so-and-so address", I can then match you. And then once you're matched, I can put you into what is called an audience. An audience is essentially a bucket of users that I can then target.
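
What he is describing is the list-matching pattern advertising platforms call a custom audience. Below is a minimal sketch with invented records and an invented index; real systems normalise and hash uploaded contact details (typically with SHA-256) and match the hashes against their own users' details, rather than exchanging raw data.

```python
import hashlib

# Simplified sketch of the matching step described above; the records
# and index are invented, not a real platform's schema or API.

def normalise_and_hash(value):
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# The campaign's file: names, addresses, phone numbers, e-mails...
voter_file = ["alice@example.com", "bob@example.com"]

# The platform's side: hashed contact details mapped to user IDs.
platform_index = {normalise_and_hash("alice@example.com"): "user_123"}

def build_audience(uploaded_records):
    """Match uploaded records to profiles; the matched user IDs form
    the 'bucket of users' a campaign can then target."""
    return {
        platform_index[h]
        for h in map(normalise_and_hash, uploaded_records)
        if h in platform_index
    }

print(build_audience(voter_file))  # -> {'user_123'}
```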

-Was Facebook decisive?
-Yes.

Now, I'm going to make our country rich again!

CHEERING AND APPLAUSE

Facebook says it wants to be the most open and transparent company. But it won't tell us how much cash it took from Republicans and Democrats, citing client confidentiality. It's estimated the election earned Facebook at least $250 million.

There it is, Facebook's Washington HQ. Millions and millions of campaign dollars were spent here. The company had dedicated teams to help the political parties reach voters. Up the street, three blocks, the White House.

Most of us had little idea Facebook had become so involved in politics. It even sent its own people to work directly with the campaigns.

Both parties are going to have a team that are from Facebook. You know, we had folks on the ground with us, on the campaign, working with us day in, day out. We had dedicated staff that we would call when we ran into problems or got stuck on using a new tool or product of theirs. Or we had an idea and we wanted to see if we could do it on Facebook, and they would help us create a unique solution.

We have been asking Facebook about this for weeks.

In terms of the US presidential election, did you have Facebook people working directly with the respective campaigns?

No Facebook employee worked directly for a campaign.

Not "for" - WITH, alongside?

So, one of the things we absolutely are there to do is to help people make use of Facebook products. We do have people whose role is to help politicians and governments make good use of Facebook. And that's not just around campaigns.

But how many people did you have working effectively with these campaigns?

I can't give you the number of exactly how many people worked with those campaigns, but I can tell you that it was completely demand driven. So it's really up to the campaigns.

So, according to the winners of both Brexit and the US presidential race, Facebook was decisive. Facebook is now expected to play a major role in our general election. This matters because critics say Facebook is largely unregulated and unaccountable.

Historically, there have been quite strict rules about the way information is presented. Broadcasters worked to a very strict code in terms of impartiality. And there are restrictions on use of advertising. But on something like Facebook, you have a media which is increasingly seen as the most important, most valuable media that campaigns invest in, in an election period, but which is totally unregulated.

Facebook is also under fire for fake news. It was the main platform for spreading made-up stories during the US presidential election. Some of them concerned Donald Trump. Most were about Hillary Clinton. There are dozens and dozens of fake news stories about Hillary Clinton to be found, linking her to sex scandals and several murders.

Mark Zuckerberg has tried to play down the importance of fake news.

The idea that fake news on Facebook influenced the election in any way, I think, is a pretty crazy idea. You know, voters make decisions based on their lived experience.

But some of the made-up stories contained false information about when to vote.

The Democrats say Facebook refused to take them down.

A lot of them were very specifically targeted at African Americans, or targeted at other populations that are very likely to be Democrats. We took it up with all of the large social media and technology companies. None of 'em was willing to just sort of get rid of the stuff.

-Including Facebook?
-None of 'em was willing, yeah. Yeah, yeah, none of 'em.

Why didn't you simply take them down when you were asked to?

It is my understanding that indeed we did take them down when we were asked to. When we were told about those posts, we took them down. And we also...

Well, can I just... Can I just show you that there? I got that on the internet, just on a Facebook page, just the other day. So they're still up there.

One of the things about our platform is we do rely on people to let us know about content they believe shouldn't be on Facebook, particularly this kind of material, and therefore, when people let us know about that, then we take action. And it's my understanding that we did indeed take down the posts that we were notified of by the Clinton campaign.

Facebook says it helped 2 million people register to vote in the US election. And it has launched a series of initiatives to combat fake news. They include suspending fake accounts and working with fact-checkers to highlight what Facebook calls disputed stories. But why doesn't it simply remove them?

Facebook says it doesn't want to ban fake news because it doesn't want to censor the internet. But there's another reason the company might be reluctant to get rid of fake news. All those made-up stories - they help boost its profits.

The key to Facebook's profitability is engagement. The more time people stay online, the more money Facebook earns from advertising. And fake news about the US election kept people engaged.

Zuck got up and said, "Well, only about 1% of the content is fake news." Numerically, that might be true, in terms of all the content in the world. However, the fraction of engagement that those posts, you know, drove - and by engagement I mean likes, comments, shares - is way more than 1%. And why is that? Well, because clickbait, as it's called, works. Right? A headline that proclaims that Hillary, whatever, just murdered her campaign manager will obviously drive a lot more interest than some quiet, sober analysis about the state of the election, or whatever.
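
The distinction he draws is simple arithmetic: a category can be 1% of posts yet take far more than 1% of engagement if each post earns outsized interactions. A worked toy example, with invented numbers:

```python
# Invented numbers, purely to illustrate the 1%-of-content versus
# share-of-engagement distinction made above.

share_of_posts = {"fake_news": 0.01, "other": 0.99}
avg_engagement = {"fake_news": 50, "other": 5}  # likes + comments + shares per post

total = sum(share_of_posts[k] * avg_engagement[k] for k in share_of_posts)
fake_share = share_of_posts["fake_news"] * avg_engagement["fake_news"] / total

print(f"{fake_share:.1%} of engagement from 1% of posts")  # -> 9.2% ...
```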

How much money did Facebook make through fake news?

This is a really important issue to us, er, we talk about it a lot as a company. I can tell you that we estimate the amount of fake news on Facebook is very small, and the amount of money we make from it is negligible.

What does negligible actually mean?

It means a very small amount, compared with everything else, all the other content on Facebook. And indeed, if you think about it...

OK, you've got your valuation right now of $400 billion. In that context, can you tell me what negligible really means?

So, the amount of advertising revenue that we get from fake news is negligible, because the amount of fake news on Facebook is very small.

Can you give me an actual figure?

So, we take fake news very seriously. We estimate the amount is very small, and the amount of money we make from it is negligible. But we want to get both of those to zero.

I appreciate that, but what is it actually in pounds, shillings and pence?

We take this issue very seriously. We think it's a very small amount of the content on Facebook.

But you can't give me an actual figure?

We take the issue very seriously, but what is very clear is we estimate it to be very small, and the amount of money that we make from it is negligible, and we're determined to do everything we can, humanly and technically possible, to reduce both of those to zero.

A Westminster inquiry into fake news is expected to resume after the general election.

The analysis done on the spreading of fake news has shown that Facebook has been the principal platform whereby fake news is shared. So, we're not only looking at Facebook, but clearly they will be a major part of our inquiry. Primarily, I think, there is an obligation on behalf of the companies themselves to recognise there is a problem, that they have a social obligation to do something about it. But if they refuse to, then we'll be looking to make recommendations about what could be done to put in place a better regime that, you know, creates a legal obligation on behalf of those companies to combat fake news and other sorts of unhelpful and illicit material.

It's not just fake news. Pressure is now growing for Facebook to take responsibility for all the material on its site. Critics say the company is just too slow to take content down when harmful and offensive material is posted. A possible group action is now being prepared by lawyers representing children who have been victimised and bullied online.

Facebook tend to use the excuse, as most of the social networking giants do - they say, "Look, we can't monitor everything. There's far too much coming in." But suddenly, if they're faced with a major threat, it's surprising what they can do and what they can do quickly. This is all about timing. The classic example was the 1972 photograph of the naked child running away from a napalm bomb in a Vietnamese village. On Facebook, that fell foul of their algorithm, and it was pulled until somebody drew it to their attention. Yet, for some reason, they can't act with the same speed and dexterity if some child's naked picture is posted, in a revenge porn or a schoolyard prank or whatever.

One common complaint is that Facebook is very slow to remove material once it's complained of. Why is that?

We take our responsibilities really seriously in this area, and we aim to reach all reports very quickly. Indeed, we recently committed, along with other companies, that when it comes to hate speech, for instance, we would aim to review all reports within 24 hours. And we certainly aim to get all reports, of whatever nature, addressed within 48 hours.

After a series of controversies, Facebook last week announced plans to hire an extra 3,000 people to review content.

Critics say Facebook has an unfair commercial advantage. It's a media company that doesn't check most of what it publishes. It's an advertising company that no regulator fully understands. And it's a political tool that seems beyond regulation.

They have no competitors. We don't even know what market it is they should be being regulated in. And we're all kind of dancing around this huge ten-tonne elephant, which is Facebook, and nobody knows who is responsible.

We're slowly finding out more about what Facebook does with our information.

By finding little things about you and then putting it all together, in fact, they're learning a lot about my life and what I'm like as a person, which I think's really weird.

All right. We're really getting a sense of your character here, I think.

-Of your interests, your job, your music...
-Yeah.

I think I should pay more attention to what I comment and where I leave comments. Especially if, like, it's a controversial subject - Brexit, for example, or, you know, Trump.

Legislators around the world now seem determined to tighten the rules for Facebook.

They have to accept that, as the owners and creators of this platform, they have a social obligation towards the way it is used. If their technology is being used by people to, you know, spread messages of hate, to spread lies, to disrupt elections, then they have a social obligation to act against that.

And that could be bad news for the company that Mark built.

If Zuck is losing sleep over anything, it's the government actually imposing that level of control over messaging and editorial control over what appears. That's what he's worried about.

A quarter of the world's population has signed up to Facebook. The simple social network has developed at astonishing speed. Is it now time for our lawmakers and regulators to catch up?
