0:00:02 > 0:00:04Tonight on Panorama,
0:00:04 > 0:00:08we investigate one of the most popular companies on the planet.
0:00:09 > 0:00:14I think the relationship of users to Facebook is that of a drug addict with a drug, in many ways.
0:00:14 > 0:00:18Facebook may know more about us than any organisation in history.
0:00:18 > 0:00:22What I care about is giving people the power to share, giving every person a voice,
0:00:22 > 0:00:25so we can make the world more open and connected.
0:00:25 > 0:00:29It makes a fortune from advertising, using our personal information.
0:00:31 > 0:00:33It's scary, we are all walking barcodes.
0:00:34 > 0:00:38And politicians are now using our data to target us, too.
0:00:40 > 0:00:44The way we bought media on Facebook was like no-one else
0:00:44 > 0:00:45in politics had ever done.
0:00:45 > 0:00:51If it's influencing elections, is it now time to control Facebook?
0:00:51 > 0:00:55With Facebook, you have a media which is increasingly seen as the most
0:00:55 > 0:00:58valuable media that campaigns invest in, in the election period,
0:00:58 > 0:01:00but which is totally unregulated.
0:01:19 > 0:01:2332 million of us in the UK share our lives on Facebook.
0:01:26 > 0:01:31You may have high privacy settings, but anything you share with friends
0:01:31 > 0:01:34can become public if they don't have high privacy settings, too.
0:01:36 > 0:01:40- I'm trying to get a look at how private your settings really are. - Yeah.
0:01:40 > 0:01:44- How much of your life is out there. - Anything on cats I love.
0:01:44 > 0:01:47- So everyone can just see those? - Yes.- Oh, my God.
0:01:50 > 0:01:55Facebook says it helps people manage their privacy and have control.
0:01:55 > 0:01:56Oh!
0:01:56 > 0:02:00Right, so I think I'll have to change some of my settings now.
0:02:02 > 0:02:07But our tests show how information and pictures can slip into
0:02:07 > 0:02:08public view.
0:02:08 > 0:02:12We're looking at basically weapons here.
0:02:12 > 0:02:16- But are they available? - They are available, yes.- Wow.
0:02:16 > 0:02:20Although I support it, I don't want it in public view.
0:02:20 > 0:02:23- These are pages that you've liked. - Yeah.
0:02:23 > 0:02:25- Metallica.- Yeah.
0:02:27 > 0:02:34I'm surprised that people can get to that level and get this information.
0:02:34 > 0:02:37I don't think some of my friends will be very happy about that.
0:02:40 > 0:02:43So what did you think overall?
0:02:43 > 0:02:47Overall, I'm very concerned because I think my pictures are private,
0:02:47 > 0:02:48and I like those.
0:02:48 > 0:02:51So I thought that it was just your profile picture which you
0:02:51 > 0:02:55know is always public. As opposed to...
0:02:55 > 0:02:59the rest that we've seen today, so...
0:02:59 > 0:03:00- Yeah.- There weren't too many.
0:03:00 > 0:03:05No, there weren't, but, you know, there are too many for me.
0:03:09 > 0:03:12When we first launched, we were hoping for, you know,
0:03:12 > 0:03:13maybe 400, 500 people.
0:03:13 > 0:03:17And now we're at 100,000 people, so who knows where we're going next.
0:03:19 > 0:03:22Mark Zuckerberg's college project is now used by
0:03:22 > 0:03:24almost 2 billion people.
0:03:26 > 0:03:30Facebook is where many of us share those important moments in our lives.
0:03:30 > 0:03:33It's transformed the way the world stays in touch.
0:03:34 > 0:03:38Our development is guided by the idea that every year,
0:03:38 > 0:03:42the amount that people want to add and share and express is increasing.
0:03:43 > 0:03:48The freedom of the internet has helped Facebook grow so quickly.
0:03:48 > 0:03:51But after years of unfettered free speech,
0:03:51 > 0:03:55social media companies are now facing severe criticism
0:03:55 > 0:03:58and being called to account for the material on their sites.
0:04:01 > 0:04:03With respect, I...
0:04:03 > 0:04:07You always preface your comments "with respect", but it's a simple
0:04:07 > 0:04:11question - do you feel any shame at all by some of what we've seen here?
0:04:11 > 0:04:15I don't think it's a simple question to suggest, do you have no shame?
0:04:15 > 0:04:17I feel very responsible, as do my colleagues,
0:04:17 > 0:04:21for the safety of the 1.9 billion people using our service.
0:04:21 > 0:04:25- And I'm very proud of the work we do.- Proud of the work...
0:04:25 > 0:04:28I don't think Facebook should necessarily be responsible for
0:04:28 > 0:04:32every idiot on the internet. But we need to agree on that.
0:04:32 > 0:04:35We need to have regulation that reflects what we think,
0:04:35 > 0:04:39what our values are. And right now, we haven't got any...
0:04:39 > 0:04:42any kind of framework that sets that out.
0:04:49 > 0:04:52Facebook is the giant of social media.
0:04:52 > 0:04:57That scale and power makes it different from other internet companies.
0:04:57 > 0:05:01It has personal information about every aspect of our lives.
0:05:03 > 0:05:07It uses that information to create the most targeted advertising
0:05:07 > 0:05:09machine in history.
0:05:09 > 0:05:11And the ad industry loves it.
0:05:13 > 0:05:17The amount of targeting you can do on Facebook is extraordinary.
0:05:17 > 0:05:20Sometimes I'll be on a website called Competitive Cyclists,
0:05:20 > 0:05:23and I'll be looking at a new wheel set or something.
0:05:23 > 0:05:27Well, they'll chase me all over the internet with ads for bicycle parts.
0:05:27 > 0:05:30Well, if I'm on Facebook and I'm going to see ads, I'd rather
0:05:30 > 0:05:34see an ad for a bicycle part than women's pair of shoes, right?
0:05:34 > 0:05:37So, as a user, I'm into it because now they're showing
0:05:37 > 0:05:39me ads about things that I'm interested in.
0:05:41 > 0:05:43It's scary. We are all walking barcodes.
0:05:43 > 0:05:45HE LAUGHS
0:05:48 > 0:05:51It's a brilliant business model that has turned Facebook into
0:05:51 > 0:05:54a $400 billion company.
0:05:55 > 0:05:58We produce Facebook's content for free,
0:05:58 > 0:06:02while advertisers queue up to sell us products.
0:06:02 > 0:06:05It's the ultimate crowd-funded media company.
0:06:05 > 0:06:09All the media comes from the users, and all the viewers are users,
0:06:09 > 0:06:12and advertisers get to slip into the middle of that.
0:06:16 > 0:06:21People love Facebook. But there is a potential downside.
0:06:21 > 0:06:25Its computer algorithms track our lives.
0:06:25 > 0:06:28It's claimed that Facebook has more information about us than any
0:06:28 > 0:06:32government organisation or any other business ever.
0:06:32 > 0:06:36And the details can cover just about every aspect of our lives.
0:06:36 > 0:06:38This is the house that Mark built.
0:06:46 > 0:06:51Ultimately, Facebook is the company that is making billions
0:06:51 > 0:06:53out of this data.
0:06:53 > 0:06:56It has immense power and it has immense responsibility that
0:06:56 > 0:06:59goes with that, and I don't think Facebook recognises that.
0:07:13 > 0:07:15The way Facebook's algorithms work
0:07:15 > 0:07:18is one of its most closely guarded secrets.
0:07:23 > 0:07:27But one former insider has agreed to talk to me.
0:07:28 > 0:07:33We're sailing between the San Juan Islands in Washington state,
0:07:33 > 0:07:36right up in the north-west corner of America.
0:07:36 > 0:07:39Canada, a couple of miles that direction.
0:07:39 > 0:07:42Silicon Valley, about 1,000 miles that way.
0:07:47 > 0:07:52Antonio Garcia Martinez helped build Facebook's advertising machine.
0:07:55 > 0:07:57He now has a very different lifestyle.
0:08:00 > 0:08:02He left the company after a difference of opinion.
0:08:04 > 0:08:07Somewhat like being inducted into a cult.
0:08:07 > 0:08:10There's actually an initiation ritual called "onboarding", in which
0:08:10 > 0:08:12they try to inculcate the values of Facebook and the Facebook
0:08:12 > 0:08:14way of doing things.
0:08:14 > 0:08:17I guess the closest example might be a baptism in the case of a Catholic Church,
0:08:17 > 0:08:19or perhaps more an evangelical church, where as an adult,
0:08:19 > 0:08:23you sort of leave your sort of past and adopt this new path in life.
0:08:23 > 0:08:25Well, if it's a cult, every cult has to have a leader.
0:08:25 > 0:08:29- Who's the leader?- Yeah, well, Mark Zuckerberg is the prophet of that religion.
0:08:29 > 0:08:32And he's very much in charge.
0:08:32 > 0:08:34He's definitely the Jesus of that church.
0:08:36 > 0:08:40Antonio knows how the world's most powerful advertising tool works.
0:08:43 > 0:08:46This is how the internet gets paid for.
0:08:50 > 0:08:52Every time you load almost any page on the internet,
0:08:52 > 0:08:55or open almost any app on your phone.
0:08:55 > 0:08:58In that moment, literally in that instant, we're talking tens of milliseconds,
0:08:58 > 0:09:02there are signals that go out that say, "Hey, this person showed up."
0:09:07 > 0:09:09Then comes the real magic.
0:09:12 > 0:09:17Instantly, Facebook gathers all it knows about you and matches
0:09:17 > 0:09:21it with data from your real life.
0:09:21 > 0:09:24Your age, your home.
0:09:25 > 0:09:29Your bank, your credit card purchases...
0:09:29 > 0:09:31your holidays...
0:09:31 > 0:09:34Anything that helps advertisers target you.
0:09:35 > 0:09:40It happens hundreds of billions of times a day, all over the internet.
0:09:40 > 0:09:42How long does that process take?
0:09:45 > 0:09:48Literally half the time that it takes for you to blink your eyes.
0:09:48 > 0:09:50That's how fast it is.
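As a rough illustration of the process Antonio describes: a page load emits a signal identifying the visitor, the platform joins that signal against the profile data it already holds, and an ad is chosen, all within tens of milliseconds. The Python sketch below is a hypothetical illustration of that join; the function and field names are invented for the example and are not Facebook's actual internals.

```python
import time

# Hypothetical profile store: what the platform already knows about a user.
# These fields are invented for this sketch, not Facebook's real schema.
PROFILES = {
    "user_123": {"age": 34, "home": "Leeds", "interests": ["cycling"]},
}

def on_page_load(user_id: str) -> dict:
    """Simulate the 'this person showed up' signal and the instant data join."""
    start = time.perf_counter()
    profile = PROFILES.get(user_id, {})          # join the signal to stored data
    interests = profile.get("interests", [])
    ad = "bicycle parts" if "cycling" in interests else "generic offer"
    elapsed_ms = (time.perf_counter() - start) * 1000
    return {"ad": ad, "elapsed_ms": elapsed_ms}  # far quicker than a blink

print(on_page_load("user_123"))
```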
0:09:52 > 0:09:55'Facebook says it complies with all regulations that apply to it.
0:09:57 > 0:10:00'But critics say those regulations were not designed for
0:10:00 > 0:10:03'a company of Facebook's reach and complexity.'
0:10:04 > 0:10:08Facebook has brought real value to many people's lives but the
0:10:08 > 0:10:13fact is that it's not about whether or not
0:10:13 > 0:10:18Mark Zuckerberg is a nice guy or has benign intentions,
0:10:18 > 0:10:22it's about people's rights and responsibilities.
0:10:22 > 0:10:27I can see a future where we are too tied into Facebook, as citizens,
0:10:27 > 0:10:29where we are scared to move from it,
0:10:29 > 0:10:31leave it, because it's got all our data.
0:10:31 > 0:10:34But I can also see a future with effective regulation,
0:10:34 > 0:10:37and that's also the future, I think, which is best for Facebook.
0:10:41 > 0:10:43The trouble is,
0:10:43 > 0:10:46Facebook's computer algorithms are beyond the comprehension of
0:10:46 > 0:10:50most of its own staff, let alone government regulators.
0:10:51 > 0:10:53You know what's naive?
0:10:53 > 0:10:55The thought that a European bureaucrat,
0:10:55 > 0:10:57who's never managed so much as a blog, could somehow,
0:10:57 > 0:11:01in any way, get up to speed or even attempt to regulate the algorithm
0:11:01 > 0:11:05that drives literally a quarter of the internet in the entire world,
0:11:05 > 0:11:08right, that's never going to happen, in any realistic way.
0:11:12 > 0:11:15Knowing what Facebook does with our personal information is now
0:11:15 > 0:11:18more important than ever.
0:11:18 > 0:11:21Because it's no longer just advertisers using our data.
0:11:22 > 0:11:25Politicians are targeting us, too.
0:11:25 > 0:11:28- MAN:- Isis scum! - CROWD:- Off our streets!
0:11:28 > 0:11:30- Isis scum! - Off our streets!
0:11:30 > 0:11:34Take the far-right group Britain First.
0:11:34 > 0:11:36They've told Panorama
0:11:36 > 0:11:40that they pay Facebook to repeatedly promote their videos.
0:11:42 > 0:11:48And it works. Britain First now has 1.6 million Facebook followers.
0:11:50 > 0:11:53The British people have spoken and the answer is, we're out.
0:11:55 > 0:11:57But it was the Brexit vote last summer that revealed the true
0:11:57 > 0:11:59power of Facebook.
0:11:59 > 0:12:01CHEERING
0:12:03 > 0:12:07I think Facebook was a game-changer for the campaign,
0:12:07 > 0:12:10I don't think there's any question about it.
0:12:10 > 0:12:14Both sides in the referendum used Facebook to target us individually.
0:12:15 > 0:12:20'You can say to Facebook, "I would like to make sure that I can'
0:12:20 > 0:12:26"micro-target that fisherman, in certain parts of the UK,"
0:12:26 > 0:12:30so that they are specifically hearing, on Facebook,
0:12:30 > 0:12:33that if you vote to leave,
0:12:33 > 0:12:40that you will be able to change the way that the regulations
0:12:40 > 0:12:42are set for the fishing industry.
0:12:42 > 0:12:46Now, I can do the exact same thing, for example, for people who live
0:12:46 > 0:12:50in the Midlands, that are struggling because the factory has shut down.
0:12:50 > 0:12:55So I may send a specific message through Facebook to them
0:12:55 > 0:12:57that nobody else is seeing.
0:12:59 > 0:13:02However you want to micro-target that message,
0:13:02 > 0:13:04you can do it on Facebook.
0:13:04 > 0:13:07The referendum showed how much Facebook has changed.
0:13:12 > 0:13:17We think Facebook is about keeping in touch with family and friends.
0:13:18 > 0:13:21But it's quietly become an important political player.
0:13:24 > 0:13:30Donald Trump's presidential campaign took full advantage.
0:13:30 > 0:13:35The way we bought media on Facebook was like no-one else in politics has ever done.
0:13:35 > 0:13:38How much money would you have spent with Facebook? Could you tell me?
0:13:38 > 0:13:41Uh, that's a number we haven't publicised.
0:13:41 > 0:13:44If I said $70 million, from the official campaign,
0:13:44 > 0:13:48- would that be there or thereabouts? - Uh, I would say that's nearby, yes.
0:13:56 > 0:14:00By using people's personal information, Facebook helped
0:14:00 > 0:14:04the campaigns to target voters more effectively than ever before.
0:14:04 > 0:14:07CROWD CHANTS AND APPLAUDS
0:14:07 > 0:14:09We take your name, address, you know, phone number,
0:14:09 > 0:14:11if we have it, e-mail, and we can take these pieces of data,
0:14:11 > 0:14:15basically put 'em in a file, send 'em to Facebook,
0:14:15 > 0:14:18Facebook then matches them to the user's profile.
0:14:18 > 0:14:21So, if you're on Facebook and your profile says,
0:14:21 > 0:14:23"I live at so-and-so address," I can then match you.
0:14:23 > 0:14:27And then once you're matched, I can put you into what is called an audience.
0:14:27 > 0:14:31An audience is essentially a bucket of users that I can then target.
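The matching workflow described here - upload a file of names, emails and phone numbers, match it to user profiles, and target the matches as an "audience" - can be sketched roughly as follows. Facebook's actual product matches on hashed identifiers; the sketch assumes that general approach, and every name in it (build_audience, the record layout, the example data) is invented for illustration.

```python
import hashlib

def normalize_and_hash(value: str) -> str:
    """Normalise an identifier (e.g. an email) and hash it.
    Hashing personal data before upload is the standard approach for
    this kind of matching; the exact rules here are illustrative."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# The campaign's file: pieces of data about known supporters (invented records).
voter_file = [
    {"name": "A. Smith", "email": "a.smith@example.com"},
    {"name": "B. Jones", "email": "b.jones@example.com"},
]

# The platform's side: hashed identifiers it holds for its own users (invented).
platform_users = {normalize_and_hash("a.smith@example.com"): "user_123"}

def build_audience(records: list) -> set:
    """Match uploaded records against user profiles. The matched users
    form an 'audience' - a bucket of users that can then be targeted."""
    uploaded = {normalize_and_hash(r["email"]) for r in records}
    return {uid for h, uid in platform_users.items() if h in uploaded}

print(build_audience(voter_file))  # -> {'user_123'}
```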
0:14:31 > 0:14:34- Was Facebook decisive?- Yes.
0:14:35 > 0:14:39Now, I'm going to make our country rich again!
0:14:39 > 0:14:42CHEERING AND APPLAUSE
0:14:46 > 0:14:50Facebook says it wants to be the most open and transparent company.
0:14:54 > 0:14:57But it won't tell us how much cash it took from Republicans
0:14:57 > 0:15:01and Democrats, citing client confidentiality.
0:15:04 > 0:15:09It's estimated the election earned Facebook at least $250 million.
0:15:13 > 0:15:16There it is, Facebook's Washington HQ.
0:15:16 > 0:15:19Millions and millions of campaign dollars were spent here.
0:15:19 > 0:15:21The company had dedicated teams to help
0:15:21 > 0:15:23the political parties reach voters.
0:15:24 > 0:15:27Up the street, three blocks, the White House.
0:15:29 > 0:15:34Most of us had little idea Facebook had become so involved in politics.
0:15:35 > 0:15:41It even sent its own people to work directly with the campaigns.
0:15:41 > 0:15:44Both parties are going to have a team that are from Facebook,
0:15:44 > 0:15:47you know, we had folks on the ground with us, on the campaign,
0:15:47 > 0:15:49working with us day in, day out,
0:15:49 > 0:15:52we had dedicated staff that we would call when we ran into
0:15:52 > 0:15:55problems or got stuck on using a new tool or product of theirs.
0:15:55 > 0:15:58Or, we had an idea and we wanted to see if we could do it on
0:15:58 > 0:16:01Facebook and they would help us create a unique solution.
0:16:01 > 0:16:04We have been asking Facebook about this for weeks.
0:16:06 > 0:16:10In terms of the US presidential election, did you have Facebook
0:16:10 > 0:16:14people working directly with their respective campaigns?
0:16:14 > 0:16:18No Facebook employee worked directly for a campaign.
0:16:18 > 0:16:21Not "for", WITH, alongside?
0:16:21 > 0:16:23So, one of the things we absolutely are there to do is to
0:16:23 > 0:16:26help people make use of Facebook products.
0:16:26 > 0:16:30We do have people who are... whose role is to help politicians
0:16:30 > 0:16:33and governments make good use of Facebook.
0:16:33 > 0:16:35And that's not just around campaigns.
0:16:35 > 0:16:39But how many people did you have working effectively with these campaigns?
0:16:40 > 0:16:42I can't give you the number of exactly how many people
0:16:42 > 0:16:45worked with those campaigns but I can tell you that it was
0:16:45 > 0:16:48completely demand driven. So it's really up to the campaigns.
0:16:51 > 0:16:56So, according to the winners of both Brexit and the US presidential
0:16:56 > 0:16:58race, Facebook was decisive.
0:16:59 > 0:17:04Facebook is now expected to play a major role in our general election.
0:17:06 > 0:17:08This matters because critics say Facebook is largely
0:17:08 > 0:17:11unregulated and unaccountable.
0:17:11 > 0:17:14Historically, there have been quite strict rules about the way
0:17:14 > 0:17:17information is presented. Broadcasters worked to a very...
0:17:17 > 0:17:20a very strict code in terms of impartiality.
0:17:20 > 0:17:23And there are restrictions on use of advertising but on something
0:17:23 > 0:17:27like Facebook, you have a media which is increasingly seen as
0:17:27 > 0:17:30the most important, most valuable media that campaigns
0:17:30 > 0:17:33invest in, in an election period, but which is totally unregulated.
0:17:42 > 0:17:45Facebook is also under fire for fake news.
0:17:49 > 0:17:52It was the main platform for spreading made-up stories
0:17:52 > 0:17:55during the US presidential election.
0:18:02 > 0:18:06Some of them concerned Donald Trump.
0:18:06 > 0:18:08Most were about Hillary Clinton.
0:18:14 > 0:18:17There are dozens and dozens of fake news stories about
0:18:17 > 0:18:20Hillary Clinton to be found,
0:18:20 > 0:18:23linking her to sex scandals and several murders.
0:18:26 > 0:18:31'Mark Zuckerberg has tried to play down the importance of fake news.'
0:18:32 > 0:18:37The idea that fake news on Facebook influenced the election in
0:18:37 > 0:18:41any way, I think, is a pretty crazy idea.
0:18:41 > 0:18:44You know, voters make decisions based on their lived experience.
0:18:46 > 0:18:49But some of the made-up stories contained false information
0:18:49 > 0:18:52about when to vote.
0:18:54 > 0:18:56The Democrats say Facebook refused to take them down.
0:18:59 > 0:19:04A lot of them were very specifically targeted at African Americans or
0:19:04 > 0:19:08targeted at other populations that are very likely to be Democrats.
0:19:10 > 0:19:14We took it up with all of the large social media and technology companies.
0:19:14 > 0:19:18None of 'em was willing to just sort of get rid of the stuff.
0:19:18 > 0:19:21- Including Facebook? - None of 'em was willing, yeah. Yeah, yeah, none of 'em.
0:19:24 > 0:19:27Why didn't you simply take them down when you were asked to?
0:19:27 > 0:19:30It is my understanding that indeed we did take them down when we
0:19:30 > 0:19:33were asked to. When we were told about those posts, we took them down. And we also...
0:19:33 > 0:19:36Well, can I just... Can I just show you that there?
0:19:36 > 0:19:40I got that on the internet just on a Facebook page just the other day.
0:19:40 > 0:19:43So they're still up there.
0:19:43 > 0:19:45One of the things about our platform is we do rely on people to
0:19:45 > 0:19:48let us know about content they believe shouldn't be on
0:19:48 > 0:19:51Facebook, particularly this kind of material, and therefore,
0:19:51 > 0:19:54when people let us know about that, then we take action.
0:19:54 > 0:19:57And it's my understanding that we did indeed take down the posts
0:19:57 > 0:20:03that we were notified of by the Clinton campaign.
0:20:03 > 0:20:08'Facebook says it helped 2 million people register to vote in the US election.
0:20:08 > 0:20:12'And it has launched a series of initiatives to combat fake news.
0:20:14 > 0:20:17'They include suspending fake accounts and working with
0:20:17 > 0:20:21'fact checkers to highlight what Facebook calls disputed stories.
0:20:23 > 0:20:26'But why doesn't it simply remove them?'
0:20:26 > 0:20:30Facebook says it doesn't want to ban fake news because it doesn't
0:20:30 > 0:20:32want to censor the internet.
0:20:32 > 0:20:37But there's another reason the company might be reluctant to get rid of fake news.
0:20:37 > 0:20:42All those made-up stories - they help boost its profits.
0:20:50 > 0:20:54The key to Facebook's profitability is engagement.
0:20:54 > 0:20:56The more time people stay online,
0:20:56 > 0:20:59the more money Facebook earns from advertising.
0:21:00 > 0:21:04And fake news about the US election kept people engaged.
0:21:06 > 0:21:10Zuck got up and said, "Well, only about 1% of the content is fake news."
0:21:10 > 0:21:12Numerically, that might be true,
0:21:12 > 0:21:14in terms of all the content in the world. However,
0:21:14 > 0:21:18the fraction of engagement that those posts, you know, drove, and by
0:21:18 > 0:21:22engagement I mean likes, comments, shares, is way more than 1%.
0:21:22 > 0:21:25And why is that? Well, because click bait, as it's called, works.
0:21:25 > 0:21:28Right? A headline that proclaims that Hillary, whatever,
0:21:28 > 0:21:29just murdered her campaign manager,
0:21:29 > 0:21:32will obviously drive a lot more interest than some quiet
0:21:32 > 0:21:35sober analysis about the state of the election, or whatever.
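The arithmetic behind that claim is worth spelling out: a category that makes up only 1% of posts can still account for a much larger share of total engagement if each of those posts draws many times the average number of likes, comments and shares. The figures below are invented purely to illustrate the effect.

```python
# Hypothetical numbers, purely to illustrate the content-vs-engagement gap.
total_posts = 1_000_000
fake_share_of_content = 0.01        # "only about 1% of the content"
avg_engagement = 10                 # likes + comments + shares per ordinary post
clickbait_multiplier = 20           # clickbait draws far more interaction

fake_posts = total_posts * fake_share_of_content
fake_engagement = fake_posts * avg_engagement * clickbait_multiplier
normal_engagement = (total_posts - fake_posts) * avg_engagement

share = fake_engagement / (fake_engagement + normal_engagement)
print(f"Fake news is 1% of content but {share:.0%} of engagement")  # ~17%
```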
0:21:38 > 0:21:42How much money did Facebook make through fake news?
0:21:43 > 0:21:46This is a really important issue to us, er,
0:21:46 > 0:21:48we talk about it a lot as a company.
0:21:48 > 0:21:51I can tell you that we estimate the amount of fake news on Facebook is
0:21:51 > 0:21:55very small and the amount of money we make from it is negligible.
0:21:55 > 0:21:57What does negligible actually mean?
0:21:57 > 0:22:01It means a very small amount, compared with everything else,
0:22:01 > 0:22:04all the other content on Facebook. And indeed, if you think about it...
0:22:04 > 0:22:08OK, you've got your valuation right now of $400 billion,
0:22:08 > 0:22:12in that context, can you tell me what negligible really means?
0:22:12 > 0:22:16So, the amount of advertising revenue that we get from fake news
0:22:16 > 0:22:19is negligible because the amount of fake news on Facebook is very small.
0:22:19 > 0:22:21Can you give me an actual figure?
0:22:21 > 0:22:24So, we take fake news very seriously, we estimate the amount
0:22:24 > 0:22:27is very small and the amount of money we make from it is negligible.
0:22:27 > 0:22:28But we want to get both of those to zero.
0:22:28 > 0:22:32I appreciate that but what is it actually in pounds, shillings and pence?
0:22:32 > 0:22:34We take this issue very seriously.
0:22:34 > 0:22:36We think it's a very small amount of the content on Facebook.
0:22:36 > 0:22:39But you can't give me an actual figure?
0:22:39 > 0:22:41We take the issue very seriously but what is very clear, is we
0:22:41 > 0:22:44estimate it to be very small and the amount of money that we make
0:22:44 > 0:22:48from it is negligible and we're determined to do everything we can,
0:22:48 > 0:22:53humanly, and technically possible, to reduce both of those to zero.
0:22:55 > 0:22:59A Westminster enquiry into fake news is expected to resume
0:22:59 > 0:23:01after the general election.
0:23:02 > 0:23:05The analysis done on the spreading of fake news has shown that Facebook
0:23:05 > 0:23:08has been the principal platform whereby fake news is shared.
0:23:08 > 0:23:11So, we're not only looking at Facebook but clearly they will be
0:23:11 > 0:23:12a major part of our enquiry.
0:23:12 > 0:23:15Primarily, I think, there is an obligation on behalf of the
0:23:15 > 0:23:17companies themselves to recognise there is a problem, that
0:23:17 > 0:23:20they have a social obligation to do something about it.
0:23:20 > 0:23:21But if they refuse to,
0:23:21 > 0:23:24then we'll be looking to make recommendations about what
0:23:24 > 0:23:26could be done to put in a better regime that, you know,
0:23:26 > 0:23:29creates a legal obligation on behalf of those companies to combat
0:23:29 > 0:23:32fake news and other sort of unhelpful and illicit material.
0:23:36 > 0:23:39It's not just fake news.
0:23:42 > 0:23:46Pressure is now growing for Facebook to take responsibility for
0:23:46 > 0:23:48all the material on its site.
0:23:57 > 0:24:01Critics say the company is just too slow to take content down
0:24:01 > 0:24:04when harmful and offensive material is posted.
0:24:08 > 0:24:11A possible group action is now being prepared by lawyers
0:24:11 > 0:24:15representing children who have been victimised and bullied online.
0:24:18 > 0:24:21Facebook tend to use the excuse, as most of the social networking
0:24:21 > 0:24:24giants do - they say, "Look, we can't monitor everything.
0:24:24 > 0:24:28"There's far too much coming in." But suddenly, if they're faced with a major threat,
0:24:28 > 0:24:32it's surprising what they can do and what they can do quickly.
0:24:32 > 0:24:33This is all about timing.
0:24:38 > 0:24:43The classic example was the 1972 photograph of the naked child
0:24:43 > 0:24:46running away from a napalm bomb in the Vietnamese village.
0:24:47 > 0:24:50On Facebook, that fell foul of their algorithm and it was pulled
0:24:50 > 0:24:53until somebody drew it to their attention.
0:24:53 > 0:24:58Yet, for some reason, they can't act with the same speed and dexterity
0:24:58 > 0:25:06if some child's naked picture is posted, in a revenge porn or a schoolyard prank or whatever.
0:25:12 > 0:25:15One common complaint is that Facebook is very slow to
0:25:15 > 0:25:19remove material once it's complained of. Why is that?
0:25:19 > 0:25:21We take our responsibilities really seriously in this area and we
0:25:21 > 0:25:24aim to get...to reach all reports very quickly.
0:25:24 > 0:25:27Indeed, we recently committed, along with other companies,
0:25:27 > 0:25:29that when it comes to hate speech, for instance,
0:25:29 > 0:25:33we would aim to review all reports within 24 hours.
0:25:33 > 0:25:38And we certainly aim to get all reports of whatever nature addressed within 48 hours.
0:25:38 > 0:25:41'After a series of controversies, Facebook last week announced
0:25:41 > 0:25:45'plans to hire an extra 3,000 people to review content.
0:25:52 > 0:25:56'Critics say Facebook has an unfair commercial advantage.
0:25:57 > 0:26:02'It's a media company that doesn't check most of what it publishes.
0:26:02 > 0:26:07'It's an advertising company that no regulator fully understands.
0:26:07 > 0:26:10'And it's a political tool that seems beyond regulation.'
0:26:13 > 0:26:16They have no competitors.
0:26:16 > 0:26:19We don't even know what market it is they should be being regulated in.
0:26:19 > 0:26:23And we're all kind of dancing around this huge ten-tonne elephant,
0:26:23 > 0:26:26which is Facebook, and nobody knows who is responsible.
0:26:32 > 0:26:37We're slowly finding out more about what Facebook does with our information.
0:26:38 > 0:26:41By finding little things about you and then putting it all
0:26:41 > 0:26:44together, in fact, they're learning a lot about my life
0:26:44 > 0:26:47and what I'm like as a person, which I think's really weird.
0:26:47 > 0:26:49All right.
0:26:49 > 0:26:53We're really getting a sense of your character here, I think.
0:26:53 > 0:26:55- Of your interests, your job, your music...- Yeah.
0:26:55 > 0:27:00I think I should pay more attention to what I comment
0:27:00 > 0:27:06and where I leave comments. Especially if, like, it's a controversial subject -
0:27:06 > 0:27:09Brexit, for example,
0:27:09 > 0:27:11or, you know, Trump.
0:27:18 > 0:27:22Legislators around the world now seem determined to tighten
0:27:22 > 0:27:24the rules for Facebook.
0:27:26 > 0:27:29They have to accept that as the owners and creators of this platform,
0:27:29 > 0:27:32they have a social obligation towards the way it is used.
0:27:33 > 0:27:36If their technology is being used by people to, you know,
0:27:36 > 0:27:40spread messages of hate, to spread lies, to disrupt elections,
0:27:40 > 0:27:43then they have a social obligation to act against that.
0:27:44 > 0:27:48And that could be bad news for the company that Mark built.
0:27:50 > 0:27:52If Zuck is losing sleep over anything,
0:27:52 > 0:27:55it's the government actually imposing that level
0:27:55 > 0:27:58of control over messaging and editorial control over what appears.
0:27:58 > 0:28:00That's what he's worried about.
0:28:06 > 0:28:10A quarter of the world's population has signed up to Facebook.
0:28:11 > 0:28:15The simple social network has developed at astonishing speed.
0:28:18 > 0:28:22Is it now time for our lawmakers and regulators to catch up?