The Persuasion Machine - Secrets of Silicon Valley


The Persuasion Machine


Transcript



It was the biggest political earthquake of the century.

0:00:040:00:08

We will make America great again!

0:00:080:00:12

But just how did Donald Trump defy the predictions

0:00:150:00:17

of political pundits and pollsters?

0:00:170:00:21

The secret lies here, in San Antonio, Texas.

0:00:210:00:25

This was our Project Alamo.

0:00:270:00:29

This is where the digital arm of the Trump campaign operation was held.

0:00:290:00:35

This is the extraordinary story of two men

0:00:370:00:40

with two very different views of the world.

0:00:400:00:43

We will build the wall...

0:00:440:00:46

The path forward is to connect more, not less.

0:00:460:00:49

And how Facebook's Mark Zuckerberg inadvertently helped Donald Trump

0:00:490:00:54

become the most powerful man on the planet.

0:00:540:00:57

Without Facebook, we wouldn't have won.

0:00:570:01:00

I mean, Facebook really and truly put us over the edge.

0:01:000:01:04

With their secret algorithms and online tracking,

0:01:040:01:08

social media companies know more about us than anyone.

0:01:080:01:12

So you can predict mine and everybody else's personality

0:01:130:01:16

based on the things that they've liked?

0:01:160:01:19

That's correct.

0:01:190:01:20

A very accurate prediction of your intimate traits such as

0:01:200:01:24

religiosity, political views, intelligence, sexual orientation.

0:01:240:01:28

Now, this power is transforming politics.

0:01:280:01:31

Can you understand though why maybe

0:01:310:01:33

some people find it a little bit creepy?

0:01:330:01:36

No, I can't - quite the opposite. That is the way the world is moving.

0:01:360:01:38

Whether you like it or not, it's an inevitable fact.

0:01:380:01:42

Social media can bring politics closer to the people,

0:01:420:01:46

but its destructive power

0:01:460:01:48

is creating a new and unpredictable world.

0:01:480:01:51

This is the story of how Silicon Valley's mission to connect

0:01:510:01:55

all of us is disrupting politics,

0:01:550:01:59

plunging us into a world of political turbulence

0:01:590:02:02

that no-one can control.

0:02:020:02:04

It might not look like it, but anger is building in Silicon Valley.

0:02:200:02:24

It's usually pretty quiet around here.

0:02:240:02:27

But not today.

0:02:270:02:29

Every day you wake up and you wonder what's going to be today's grief,

0:02:290:02:34

brought by certain politicians and leaders in the world.

0:02:340:02:37

This is a very unusual demonstration.

0:02:370:02:41

I've been to loads of demonstrations,

0:02:410:02:44

but these aren't the people

0:02:440:02:46

that usually go to demonstrations.

0:02:460:02:48

In some ways, these are the winners of society.

0:02:480:02:52

This is the tech community in Silicon Valley.

0:02:520:02:55

Some of the wealthiest people in the world.

0:02:550:02:59

And they're here protesting and demonstrating...

0:02:590:03:05

against what they see as the kind of changing world

0:03:050:03:07

that they don't like.

0:03:070:03:09

The problem, of course, is the election of Donald Trump.

0:03:090:03:13

Every day, I'm sure, you think,

0:03:150:03:17

"I could be a part of resisting those efforts

0:03:170:03:20

"to mess things up for the rest of us."

0:03:200:03:22

Trump came to power promising to control immigration...

0:03:220:03:26

We will build the wall, 100%.

0:03:260:03:30

..and to disengage from the world.

0:03:300:03:33

From this day forward, it's going to be

0:03:330:03:37

only America first.

0:03:370:03:40

America first.

0:03:420:03:44

Now, Silicon Valley is mobilising against him.

0:03:460:03:51

We are seeing this explosion of political activism, you know,

0:03:510:03:54

all through the US and in Europe.

0:03:540:03:56

Before he became an activist,

0:03:560:03:59

Dex Torricke-Barton was speech writer to the chairman of Google

0:03:590:04:03

and the founder of Facebook.

0:04:030:04:05

It's a moment when people who believe in this global vision,

0:04:070:04:10

as opposed to the nationalist vision of the world,

0:04:100:04:12

who believe in a world that isn't about protectionism,

0:04:120:04:15

whether it's data or whether it's about trade,

0:04:150:04:17

they're coming to stand up and to mobilise in response to that.

0:04:170:04:20

Because it feels like the whole of Silicon Valley

0:04:200:04:23

-has been slightly taken by surprise by what's happening.

-Absolutely.

0:04:230:04:26

But these are the smartest minds in the world.

0:04:260:04:28

-Yeah.

-With the most amazing data models and polling.

0:04:280:04:30

The smartest minds in the world often can be very,

0:04:300:04:33

very ignorant of the things that are going on in the world.

0:04:330:04:36

The tech god with the most ambitious global vision is Dex's old boss -

0:04:360:04:41

Facebook founder Mark Zuckerberg.

0:04:410:04:44

One word captures the world Zuck, as he's known here,

0:04:440:04:48

is trying to build.

0:04:480:04:51

Connectivity and access. If we connected them...

0:04:510:04:53

You give people connectivity... That's the mission.

0:04:530:04:55

Connecting with their friends... You connect people over time...

0:04:550:04:58

Connect everyone in the world... I'm really optimistic about that.

0:04:580:05:00

Make the world more open and connected, that's what I care about.

0:05:000:05:03

This is part of the critical enabling infrastructure for the world.

0:05:030:05:06

Thank you, guys.

0:05:060:05:08

What's Mark Zuckerberg worried about most?

0:05:080:05:10

Well, you know, Mark has dedicated his life to connecting, you know,

0:05:100:05:13

the world. You know, this is something that he really, you know,

0:05:130:05:16

cares passionately about.

0:05:160:05:17

And, you know, as I said, you know,

0:05:170:05:19

the same worldview and set of policies that, you know,

0:05:190:05:22

we'll build walls here,

0:05:220:05:24

we'll build walls against the sharing of information

0:05:240:05:26

and building those kind of, you know, networks.

0:05:260:05:29

It's bigger than just the tech.

0:05:290:05:30

-It is.

-It's about the society that Silicon Valley also wants to create?

0:05:300:05:33

Absolutely.

0:05:330:05:34

The tech gods believe the election

0:05:340:05:36

of Donald Trump threatens their vision

0:05:360:05:39

of a globalised world.

0:05:390:05:41

But in a cruel twist,

0:05:430:05:44

is it possible their mission to connect the world

0:05:440:05:47

actually helped bring him to power?

0:05:470:05:50

The question I have is whether the revolution brought about

0:05:500:05:53

by social media companies like Facebook

0:05:530:05:56

has actually led to the political changes in the world

0:05:560:06:00

that these guys are so worried about.

0:06:000:06:03

To answer that question,

0:06:060:06:08

you have to understand how the tech titans of Silicon Valley

0:06:080:06:12

rose to power.

0:06:120:06:15

For that, you have to go back 20 years

0:06:150:06:18

to a time when the online world was still in its infancy.

0:06:180:06:22

MUSIC: Rock N Roll Star by Oasis

0:06:220:06:25

There were fears the new internet was like the Wild West,

0:06:270:06:31

anarchic and potentially harmful.

0:06:310:06:34

Today our world is being remade, yet again, by an information revolution.

0:06:340:06:40

Changing the way we work, the way we live,

0:06:400:06:43

the way we relate to each other.

0:06:430:06:46

The Telecommunications Act of 1996

0:06:460:06:49

was designed to civilise the internet,

0:06:490:06:51

including protecting children from pornography.

0:06:510:06:56

Today with the stroke of a pen, our laws will catch up with our future.

0:06:560:07:00

But buried deep within the act was a secret whose impact no-one foresaw.

0:07:000:07:05

Jeremy?

0:07:140:07:15

Jamie. How are you doing?

0:07:150:07:16

Nice to meet you.

0:07:160:07:19

VOICEOVER: Jeremy Malcolm is an analyst

0:07:190:07:21

at the Electronic Frontier Foundation,

0:07:210:07:23

a civil liberties group for the digital age.

0:07:230:07:26

Much of Silicon Valley's accelerated growth in the last two decades

0:07:280:07:32

has been enabled by one clause in the legislation.

0:07:320:07:36

Hidden away in the middle of that is this Section 230.

0:07:370:07:40

-What's the key line?

-It literally just says,

0:07:400:07:42

no provider or user of an interactive computer service

0:07:420:07:45

shall be treated as the publisher or speaker

0:07:450:07:48

of any information provided

0:07:480:07:50

by another information content provider. That's it.

0:07:500:07:53

So what that basically means is, if you're an internet platform,

0:07:530:07:56

you don't get treated as the publisher or speaker

0:07:560:07:58

of something that your users say using your platform.

0:07:580:08:01

If the user says something online that is, say, defamatory,

0:08:010:08:05

the platform that they communicate on

0:08:050:08:07

isn't going to be held responsible for it.

0:08:070:08:09

And the user, of course, can be held directly responsible.

0:08:090:08:13

How important is this line for social media companies today?

0:08:130:08:19

I think if we didn't have this, we probably wouldn't have

0:08:190:08:22

the same kind of social media companies that we have today.

0:08:220:08:25

They wouldn't be willing to take on the risk of having so much

0:08:250:08:28

-unfettered discussion.

-It's key to the internet's freedom, really?

0:08:280:08:33

We wouldn't have the internet of today without this.

0:08:330:08:36

And so, if we are going to make any changes to it,

0:08:360:08:39

we have to be really, really careful.

0:08:390:08:41

These 26 words changed the world.

0:08:430:08:46

They allowed a new kind of business to spring up -

0:08:500:08:53

online platforms that became the internet giants of today.

0:08:530:08:58

Facebook, Google, YouTube - they encouraged users to upload content,

0:08:580:09:02

often things about their lives or moments that mattered to them,

0:09:020:09:06

onto their sites for free.

0:09:060:09:09

And in exchange, they got to hoard all of that data

0:09:090:09:13

but without any real responsibility

0:09:130:09:15

for the effects of the content that people were posting.

0:09:150:09:19

Hundreds of millions of us flocked to these new sites,

0:09:220:09:25

putting more of our lives online.

0:09:250:09:29

At first, the tech firms couldn't figure out

0:09:290:09:32

how to turn that data into big money.

0:09:320:09:35

But that changed when a secret within that data was unlocked.

0:09:350:09:40

Antonio, Jamie.

0:09:400:09:42

'A secret Antonio Garcia Martinez helped reveal at Facebook.'

0:09:420:09:46

Tell me a bit about your time at Facebook.

0:09:490:09:51

Well, that was interesting.

0:09:510:09:53

I was what's called a product manager for ads targeting.

0:09:530:09:56

That means basically taking your data and using it to basically

0:09:560:09:58

make money on Facebook, to monetise Facebook's data.

0:09:580:10:02

If you go browse the internet or buy stuff in stores or whatever,

0:10:020:10:04

and then you see ads related to all that stuff

0:10:040:10:06

inside Facebook - I created that.

0:10:060:10:08

Facebook offers advertisers ways

0:10:100:10:12

to target individual users of the site with adverts.

0:10:120:10:15

It can be driven by data about how we use the platform.

0:10:150:10:21

Here's some examples of what's data for Facebook that makes money.

0:10:210:10:24

What you've liked on Facebook, links that you shared,

0:10:240:10:26

who you happen to know on Facebook, for example.

0:10:260:10:29

Where you've used Facebook, what devices, your iPad, your work computer,

0:10:290:10:32

your home computer. In the case of Amazon,

0:10:320:10:34

it's obviously what you've purchased.

0:10:340:10:35

In the case of Google, it's what you searched for.

0:10:350:10:38

How do they turn me... I like something on Facebook,

0:10:380:10:40

and I share a link on Facebook, how could they turn that

0:10:400:10:43

into something that another company would care about?

0:10:430:10:46

There is what's called a targeting system,

0:10:460:10:48

and so the advertiser can actually go in and specify,

0:10:480:10:50

I want people who are within this city and who have liked BMW or Burberry, for example.

0:10:500:10:54

So an advertiser pays Facebook and says,

0:10:540:10:56

-I want these sorts of people?

-That's effectively it, that's right.

0:10:560:11:00
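As a rough illustration of the rule-based audience selection Garcia Martinez describes (an advertiser asking for, say, people in a given city who have liked BMW or Burberry), here is a minimal, hypothetical sketch in Python. The user records, field names and the select_audience helper are invented for illustration; this is not Facebook's actual targeting system or API.

```python
# Hypothetical sketch of rule-based audience selection, loosely modelled on
# the interview example ("people within this city who have liked BMW or
# Burberry"). All data structures and names are invented for illustration.

users = [
    {"id": 1, "city": "London", "likes": {"BMW", "Oasis"}},
    {"id": 2, "city": "London", "likes": {"Burberry"}},
    {"id": 3, "city": "Leeds",  "likes": {"BMW"}},
]

def select_audience(users, city, any_of_likes):
    """Return users in `city` who have liked at least one of `any_of_likes`."""
    return [
        u for u in users
        if u["city"] == city and u["likes"] & set(any_of_likes)
    ]

if __name__ == "__main__":
    audience = select_audience(users, city="London", any_of_likes={"BMW", "Burberry"})
    print([u["id"] for u in audience])   # -> [1, 2]
```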

The innovation that opened up bigger profits was to allow Facebook users

0:11:010:11:05

to be targeted using data about what they do on the rest of the internet.

0:11:050:11:12

The real key thing that marketers want

0:11:120:11:14

is the unique, immutable, flawless, high fidelity ID

0:11:140:11:18

for one person on the internet, and Facebook provides that.

0:11:180:11:21

It is your identity online.

0:11:210:11:23

Facebook can tell an advertiser, this is the real Jamie Barlow,

0:11:230:11:26

-this is what he's like?

-Yeah.

0:11:260:11:28

A company like Walmart can literally take your data, your e-mail,

0:11:280:11:32

phone number, whatever you use for their frequent shopper programme, etc,

0:11:320:11:35

and join that to Facebook and literally target those people based on that data.

0:11:350:11:39

That's part of what I built.

0:11:390:11:40
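Garcia Martinez also describes a retailer joining its own customer records (emails, phone numbers) to Facebook accounts. Custom-audience products of this kind are generally described as matching on hashed identifiers rather than raw ones; the sketch below shows that idea in outline. The data and helper function are hypothetical, not Facebook's implementation.

```python
import hashlib

def normalise_and_hash(identifier: str) -> str:
    """Lower-case, strip whitespace, then SHA-256 hash an email or phone number."""
    return hashlib.sha256(identifier.strip().lower().encode()).hexdigest()

# Hypothetical retailer records and platform accounts (identifiers invented).
retailer_emails = ["Alice@example.com ", "bob@example.com"]
platform_accounts = {
    normalise_and_hash("alice@example.com"): "user_1001",
    normalise_and_hash("carol@example.com"): "user_1002",
}

# The retailer uploads hashes; the platform matches them against its own hashes.
uploaded = {normalise_and_hash(e) for e in retailer_emails}
matched_users = [uid for h, uid in platform_accounts.items() if h in uploaded]
print(matched_users)   # -> ['user_1001']
```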

The tech gods suck in all this data about how we use their technologies

0:11:450:11:49

to build their vast fortunes.

0:11:490:11:53

I mean, it sounds like data is like oil, it's keeping the economy going?

0:11:550:11:58

Right. I mean, the difference is these companies,

0:11:580:12:00

instead of drilling for this oil, they generate this oil via,

0:12:000:12:04

by getting users to actually use their apps and then they actually

0:12:040:12:07

monetise it. Usually via advertising or other mechanisms.

0:12:070:12:09

But, yeah, it is the new oil.

0:12:090:12:12

Data about billions of us is propelling Silicon Valley

0:12:120:12:15

to the pinnacle of the global economy.

0:12:150:12:19

The world's largest hotel company, Airbnb,

0:12:210:12:23

doesn't own a single piece of real estate.

0:12:230:12:25

The world's largest taxi company, Uber, doesn't own any cars.

0:12:250:12:28

The world's largest media company, Facebook, doesn't produce any media, right?

0:12:280:12:31

So what do they have?

0:12:310:12:32

Well, they have the data around how you use those resources and how you

0:12:320:12:35

use those assets. And that's really what they are.

0:12:350:12:38

The secret of targeting us with adverts is keeping us online

0:12:410:12:46

for as long as possible.

0:12:460:12:49

I thought I'd just see how much time I spend on here.

0:12:490:12:53

So I've got an app that counts how often I pick this thing up.

0:12:530:12:58

Our time is the Holy Grail of Silicon Valley.

0:12:580:13:02

Here's what my life looks like on a typical day.

0:13:030:13:06

-Yeah, could I have a flat white, please?

-Flat white?

-Yeah.

0:13:090:13:12

Like more and more of us, my phone is my gateway to the online world.

0:13:120:13:18

It's how I check my social media accounts.

0:13:180:13:20

On average, Facebook users spend 50 minutes every day on the site.

0:13:200:13:26

The longer we spend connected,

0:13:310:13:33

the more Silicon Valley can learn about us,

0:13:330:13:37

and the more targeted and effective their advertising can be.

0:13:370:13:41

So apparently, today, I...

0:14:050:14:08

I've checked my phone 117 times...

0:14:080:14:13

..and I've been on this phone for nearly five and a half hours.

0:14:140:14:20

Well, I mean, that's a lot, that's a lot of hours.

0:14:200:14:22

I mean, it's kind of nearly half the day,

0:14:220:14:25

spent on this phone.

0:14:250:14:28

But it's weird, because it doesn't feel like I spend that long on it.

0:14:280:14:31

The strange thing about it is

0:14:310:14:33

that I don't even really know what I'm doing

0:14:330:14:36

for these five hours that I'm spending on this phone.

0:14:360:14:38

What is it that is keeping us hooked to Silicon Valley's global network?

0:14:400:14:45

I'm in Seattle to meet someone who saw how the tech gods embraced

0:14:490:14:54

new psychological insights into how we all make decisions.

0:14:540:14:59

I was a post-doc with Stephen Hawking.

0:15:020:15:04

Once Chief Technology Officer at Microsoft,

0:15:040:15:07

Nathan Myhrvold is the most passionate technologist

0:15:070:15:10

I have ever met.

0:15:100:15:13

Some complicated-looking equations in the background.

0:15:130:15:15

Well, it turns out if you work with Stephen Hawking,

0:15:150:15:17

you do work with complicated equations.

0:15:170:15:20

It's kind of the nature of the beast!

0:15:200:15:22

Amazing picture.

0:15:220:15:24

A decade ago, Nathan brought together Daniel Kahneman,

0:15:270:15:30

pioneer of the new science of behavioural economics,

0:15:300:15:33

and Silicon Valley's leaders, for a series of meetings.

0:15:330:15:37

I came and Jeff Bezos came.

0:15:370:15:40

That's Jeff Bezos, the founder of Amazon, worth 76 billion.

0:15:400:15:46

Sean Parker. Sean was there.

0:15:460:15:48

That's Sean Parker, the first president of Facebook.

0:15:480:15:52

And the Google founders were there.

0:15:520:15:54

The proposition was, come to this very nice resort in Napa,

0:15:560:16:03

and for several days, just have Kahneman

0:16:030:16:07

and then also a couple of

0:16:070:16:09

other behavioural economists explain things.

0:16:090:16:12

And ask questions and see what happens.

0:16:120:16:15

Kahneman had a simple but brilliant theory on how we make decisions.

0:16:150:16:21

He had found we use one of two different systems of thinking.

0:16:210:16:27

In this dichotomy, you have...

0:16:280:16:31

over here is a hunch...

0:16:310:16:34

..a guess,

0:16:360:16:37

a gut feeling...

0:16:370:16:40

..and "I just know".

0:16:440:16:47

So this is sort of emotional and more like, just instant stuff?

0:16:470:16:53

That's the idea. This set of things

0:16:530:16:56

is not particularly good at a different set of stuff

0:16:560:17:00

that involves...

0:17:000:17:02

..analysis...

0:17:040:17:05

..numbers...

0:17:080:17:10

..probability.

0:17:120:17:13

The meetings in Napa didn't deal with the basics

0:17:140:17:18

of behavioural economics, but how might the insights of the new

0:17:180:17:21

science have helped the tech gods?

0:17:210:17:23

A lot of advertising is about trying to hook people

0:17:230:17:26

in these type-one things to get interested one way or the other.

0:17:260:17:30

Technology companies undoubtedly use that to one degree or another.

0:17:300:17:35

You know, the term "clickbait",

0:17:350:17:37

for things that look exciting to click on.

0:17:370:17:39

There's billions of dollars change hands

0:17:390:17:42

because we all get enticed into clicking something.

0:17:420:17:46

And there's a lot of things that I click on and then you get there,

0:17:460:17:50

you're like, OK, fine, you were just messing with me.

0:17:500:17:53

You're playing to the type-one things,

0:17:530:17:55

you're putting a set of triggers out there

0:17:550:17:58

that make me want to click on it,

0:17:580:18:01

and even though, like, I'm aware of that, I still sometimes click!

0:18:010:18:06

Tech companies both try to understand

0:18:060:18:09

our behaviour by having smart humans think about it and increasingly

0:18:090:18:12

by having machines think about it.

0:18:120:18:15

By having machines track us

0:18:150:18:16

to see, what is the clickbait Nathan falls for?

0:18:160:18:19

What are the things he really likes to spend time on?

0:18:190:18:21

Let's show him more of that stuff!

0:18:210:18:24

Trying to grab the attention of the consumer is nothing new.

0:18:250:18:29

That's what advertising is all about.

0:18:290:18:32

But insights into how we make decisions helped Silicon Valley

0:18:320:18:35

to shape the online world.

0:18:350:18:37

And little wonder, their success depends on keeping us engaged.

0:18:370:18:43

From 1-Click buying on Amazon to the Facebook like,

0:18:430:18:47

the more they've hooked us, the more the money has rolled in.

0:18:470:18:51

As Silicon Valley became more influential,

0:18:530:18:55

it started attracting powerful friends...in politics.

0:18:550:19:00

In 2008, Barack Obama had pioneered political campaigning on Facebook.

0:19:000:19:07

As President, he was drawn to Facebook's founder, Zuck.

0:19:070:19:12

Sorry, I'm kind of nervous.

0:19:120:19:14

We have the President of the United States here!

0:19:140:19:17

My name is Barack Obama and I'm the guy who got Mark

0:19:170:19:21

to wear a jacket and tie!

0:19:210:19:24

How you doing?

0:19:270:19:28

-Great.

-I'll have huevos rancheros, please.

0:19:280:19:31

And if I could have an egg and cheese sandwich on English muffin?

0:19:310:19:36

Aneesh Chopra was Obama's first Chief Technology Officer,

0:19:360:19:40

and saw how close the relationship

0:19:400:19:42

between the White House and Silicon Valley became.

0:19:420:19:47

The President's philosophy and his approach to governing

0:19:470:19:50

garnered a great deal of personal interest

0:19:500:19:53

among many executives in Silicon Valley.

0:19:530:19:56

They were donors to his campaign, volunteers,

0:19:560:19:59

active recruiters of engineering talent

0:19:590:20:01

to support the campaign apparatus.

0:20:010:20:03

He had struck a chord.

0:20:030:20:05

-Why?

-Because frankly,

0:20:050:20:07

it was an inspiring voice that really tapped into

0:20:070:20:10

the hopefulness of the country.

0:20:100:20:13

And a lot of Silicon Valley shares this sense of hopefulness,

0:20:130:20:17

this optimistic view that we can solve problems

0:20:170:20:19

if we would work together

0:20:190:20:20

and take advantage of these new capabilities that are coming online.

0:20:200:20:24

So people in Silicon Valley saw President Obama

0:20:240:20:27

-as a bit of a kindred spirit?

-Oh, yeah. Oh, yeah.

0:20:270:20:32

If you'd like, Mark, we can take our jackets off.

0:20:320:20:34

That's good!

0:20:340:20:36

Facebook's mission to connect the world

0:20:360:20:39

went hand-in-hand with Obama's policies promoting globalisation

0:20:390:20:44

and free markets.

0:20:440:20:46

And Facebook was seen to be improving

0:20:460:20:48

the political process itself.

0:20:480:20:50

Part of what makes for a healthy democracy

0:20:500:20:55

is when you've got citizens who are informed, who are engaged,

0:20:550:20:59

and what Facebook allows us to do

0:20:590:21:02

is make sure this isn't just a one-way conversation.

0:21:020:21:05

You have the tech mind-set and governments increasingly share

0:21:050:21:09

the same view of the world. That's not a natural...

0:21:090:21:12

It's a spirit of liberty.

0:21:120:21:13

It's a spirit of freedom.

0:21:130:21:15

That is manifest today in these new technologies.

0:21:150:21:19

It happens to be that freedom means I can tweet something offensive.

0:21:190:21:23

But it also means that I have a voice.

0:21:230:21:26

Please raise your right hand and repeat after me...

0:21:260:21:29

By the time Obama won his second term,

0:21:290:21:32

he was feted for his mastery of social media's persuasive power.

0:21:320:21:37

But across the political spectrum,

0:21:370:21:39

the race was on to find new ways to gain a digital edge.

0:21:390:21:43

The world was about to change for Facebook.

0:21:430:21:47

Stanford University, in the heart of Silicon Valley.

0:21:560:22:01

Home to a psychologist investigating just how revealing

0:22:010:22:05

Facebook's hoard of information about each of us could really be.

0:22:050:22:10

How are you doing? Nice to meet you.

0:22:100:22:13

Dr Michal Kosinski specialises in psychometrics -

0:22:130:22:18

the science of predicting psychological traits

0:22:180:22:21

like personality.

0:22:210:22:23

So in the past, when you wanted to measure someone's personality

0:22:230:22:26

or intelligence, you needed to give them a question or a test,

0:22:260:22:30

and they would have to answer a bunch of questions.

0:22:300:22:33

Now, many of those questions would basically ask you

0:22:330:22:37

about whether you like poetry,

0:22:370:22:39

or you like hanging out with other people,

0:22:390:22:41

or you like the theatre, and so on.

0:22:410:22:43

But these days, you don't need to ask these questions any more.

0:22:430:22:47

Why? Because while going through our lives,

0:22:470:22:51

we are leaving behind a lot of digital footprints

0:22:510:22:54

that basically contain the same information.

0:22:540:22:57

So instead of asking you whether you like poetry,

0:22:570:23:01

I can just look at your reading history on Amazon

0:23:010:23:05

or your Facebook likes,

0:23:050:23:08

and I would just get exactly the same information.

0:23:080:23:10

In 2011, Dr Kosinski and his team at the University of Cambridge

0:23:120:23:16

developed an online survey

0:23:160:23:18

to measure volunteers' personality traits.

0:23:180:23:21

With their permission, he matched their results

0:23:210:23:24

with their Facebook data.

0:23:240:23:26

More than 6 million people took part.

0:23:260:23:30

We have people's Facebook likes,

0:23:300:23:32

people's status updates and profile data,

0:23:320:23:36

and this allows us to build those... to gain better understanding

0:23:360:23:40

of how psychological traits are being expressed

0:23:400:23:43

in the digital environment.

0:23:430:23:45

How you can measure psychological traits using digital footprints.

0:23:450:23:50

An algorithm that can look at millions of people

0:23:500:23:52

and it can look at hundreds of thousands

0:23:520:23:55

or tens of thousands of your likes

0:23:550:23:56

can extract and utilise even those little pieces of information

0:23:560:24:01

and combine it into a very accurate profile.

0:24:010:24:04

You can quite accurately predict mine, and in fact, everybody else's

0:24:040:24:09

personality, based on the things that they've liked?

0:24:090:24:14

That's correct.

0:24:140:24:15

It can also be used to turn your digital footprint

0:24:150:24:18

into a very accurate prediction

0:24:180:24:21

of your intimate traits, such as religiosity,

0:24:210:24:23

political views, personality, intelligence,

0:24:230:24:26

sexual orientation and a bunch of other psychological traits.

0:24:260:24:30

If I'm logged in, we can maybe see how accurate this actually is.

0:24:300:24:33

So I hit "make prediction" and it's going to try and predict

0:24:330:24:37

my personality from my Facebook page.

0:24:370:24:39

From your Facebook likes.

0:24:390:24:41

According to your likes,

0:24:410:24:42

you're open-minded, liberal and artistic.

0:24:420:24:46

Judges your intelligence to be extremely high.

0:24:460:24:49

-Well done.

-Yes. I'm extremely intelligent!

0:24:490:24:53

You're not religious, but if you are religious,

0:24:530:24:56

-most likely you'd be a Catholic.

-I was raised a Catholic!

0:24:560:24:59

I can't believe it knows that.

0:24:590:25:02

Because I... I don't say anything about being a Catholic anywhere,

0:25:020:25:05

but I was raised as a Catholic, but I'm not a practising Catholic.

0:25:050:25:09

So it's like...

0:25:090:25:11

It's absolutely spot on.

0:25:110:25:13

Oh, my God! Journalism, but also what did I study?

0:25:130:25:17

Studied history, and I didn't put anything about history in there.

0:25:170:25:21

I think this is one of the things

0:25:210:25:22

that people don't really get about those predictions, that they think,

0:25:220:25:26

look, if I like Lady Gaga on Facebook,

0:25:260:25:29

obviously people will know that I like Lady Gaga,

0:25:290:25:31

or the Government will know that I like Lady Gaga.

0:25:310:25:34

Look, you don't need a rocket scientist

0:25:340:25:36

to look at your Spotify playlist or your Facebook likes

0:25:360:25:39

to figure out that you like Lady Gaga.

0:25:390:25:42

What's really world-changing about those algorithms is that they can

0:25:420:25:47

take your music preferences or your book preferences

0:25:470:25:51

and extract from this seemingly innocent information

0:25:510:25:55

very accurate predictions about your religiosity, leadership potential,

0:25:550:25:59

political views, personality and so on.

0:25:590:26:02

Can you predict people's political persuasions with this?

0:26:020:26:05

In fact, the first dimension here, openness to experience,

0:26:050:26:08

is a very good predictor of political views.

0:26:080:26:11

People scoring high on openness, they tend to be liberal,

0:26:110:26:14

people who score low on openness, we even call it conservative and

0:26:140:26:19

traditional, they tend to vote for conservative candidates.

0:26:190:26:22

What about the potential to manipulate people?

0:26:220:26:25

So, obviously, if you now can use an algorithm to get to know millions

0:26:250:26:29

of people very intimately and then use another algorithm

0:26:290:26:33

to adjust the message that you are sending to them,

0:26:330:26:37

to make it most persuasive,

0:26:370:26:40

obviously gives you a lot of power.

0:26:400:26:43

I'm quite surprised at just how accurate this model could be.

0:26:520:26:55

I mean, I cannot believe that on the basis of a few things

0:26:550:27:00

I just, you know, carelessly clicked that I liked,

0:27:000:27:05

the model was able to work out that I could have been Catholic,

0:27:050:27:10

or had a Catholic upbringing.

0:27:100:27:12

And clearly, this is a very powerful way of understanding people.

0:27:120:27:18

Very exciting possibilities, but I can't help

0:27:190:27:22

fearing that there is that potential, whoever has that power,

0:27:220:27:28

whoever can control that model...

0:27:280:27:31

..will have sort of unprecedented possibilities of manipulating

0:27:330:27:37

what people think, how they behave, what they see,

0:27:370:27:41

whether that's selling things to people or how people vote,

0:27:410:27:45

and that's pretty scary too.

0:27:450:27:48
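The published research behind this scene (Kosinski and colleagues' studies of Facebook likes) broadly worked by reducing a sparse user-by-like matrix and fitting simple linear or logistic models to predict traits. As a hedged sketch of that general approach, not the researchers' actual pipeline or data, the toy example below predicts a binary trait from like indicators; the pages, users and labels are invented.

```python
# Toy sketch: predicting a binary trait from "likes", in the spirit of the
# likes-based models discussed above. Data are invented; this is not the
# researchers' actual model or dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["poetry_page", "gun_club", "lady_gaga", "theatre_group"]

# Rows = users, columns = pages; 1 means the user liked the page.
X = np.array([
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
])
y = np.array([1, 0, 1, 0])   # invented binary trait, e.g. "scores high on openness"

model = LogisticRegression().fit(X, y)
new_user = np.array([[1, 0, 1, 0]])          # liked poetry_page and lady_gaga
print(model.predict_proba(new_user)[0, 1])   # predicted probability of the trait
```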

Our era is defined by political shocks.

0:27:520:27:55

None bigger than the rise of Donald Trump,

0:27:590:28:02

who defied pollsters and the mainstream media

0:28:020:28:05

to win the American presidency.

0:28:050:28:07

Now questions swirl around his use of the American affiliate

0:28:080:28:12

of a British insights company,

0:28:120:28:14

Cambridge Analytica, who use psychographics.

0:28:140:28:17

I'm in Texas to uncover how far

0:28:250:28:27

Cambridge Analytica's expertise in personality prediction

0:28:270:28:31

played a part in Trump's political triumph,

0:28:310:28:36

and how his revolutionary campaign

0:28:360:28:38

exploited Silicon Valley's social networks.

0:28:380:28:42

Everyone seems to agree

0:28:440:28:45

that Trump ran an exceptional election campaign

0:28:450:28:49

using digital technologies.

0:28:490:28:52

But no-one really knows what they did,

0:28:520:28:56

who they were working with, who was helping them,

0:28:560:28:59

what the techniques they used were.

0:28:590:29:03

So I've come here to try to unravel the mystery.

0:29:030:29:07

The operation inside this unassuming building in San Antonio

0:29:090:29:13

was largely hidden from view, but crucial to Trump's success.

0:29:130:29:18

Since then, I'm the only person to get in here

0:29:180:29:22

to find out what really happened.

0:29:220:29:25

This was our Project Alamo,

0:29:250:29:27

so this was where the digital arm

0:29:270:29:30

of the Trump campaign operation was held.

0:29:300:29:33

Theresa Hong is speaking publicly

0:29:330:29:36

for the first time about her role as digital content director

0:29:360:29:40

for the Trump campaign.

0:29:400:29:42

So, why is it called Project Alamo?

0:29:430:29:47

It was called Project Alamo based on the data, actually,

0:29:470:29:51

that was Cambridge Analytica -

0:29:510:29:53

they came up with the Alamo data set, right?

0:29:530:29:56

So, we just kind of adopted the name Project Alamo.

0:29:560:30:00

It does conjure up sort of images of a battle of some sort.

0:30:000:30:04

Yeah. It kind of was!

0:30:040:30:06

In a sense, you know. Yeah, yeah.

0:30:060:30:09

Project Alamo was so important,

0:30:130:30:16

Donald Trump visited the hundred or so workers based here

0:30:160:30:20

during the campaign.

0:30:200:30:22

Ever since he started tweeting in 2009,

0:30:220:30:26

Trump had grasped the power of social media.

0:30:260:30:29

Now, in the fight of his life,

0:30:290:30:32

the campaign manipulated his Facebook presence.

0:30:320:30:36

Trump's Twitter account, that's all his, he's the one that ran that,

0:30:360:30:40

and I did a lot of his Facebook,

0:30:400:30:43

so I wrote a lot for him, you know, I kind of channelled Mr Trump.

0:30:430:30:48

How do you possibly write a post on Facebook like Donald Trump?

0:30:480:30:52

"Believe me."

0:30:520:30:54

A lot of believe me's, a lot of alsos, a lot of...verys.

0:30:540:30:58

Actually, he was really wonderful to write for, just because

0:30:580:31:01

it was so refreshing, it was so authentic.

0:31:010:31:06

We headed to the heart of the operation.

0:31:090:31:11

Cambridge Analytica was here.

0:31:140:31:16

It was just a line of computers, right?

0:31:160:31:19

This is where their operation was and this was kind of

0:31:190:31:22

the brain of the data, this was the data centre.

0:31:220:31:26

This was the data centre, this was the centre of the data centre.

0:31:260:31:30

Exactly right, yes, it was. Yes, it was.

0:31:300:31:33

Cambridge Analytica were using data

0:31:330:31:35

on around 220 million Americans to target potential donors and voters.

0:31:350:31:40

It was, you know,

0:31:400:31:43

a bunch of card tables that sat here and a bunch of computers and people

0:31:430:31:48

that were behind the computers, monitoring the data.

0:31:480:31:51

We've got to target this state, that state or this universe or whatever.

0:31:510:31:55

So that's what they were doing, they were gathering all the data.

0:31:550:31:58

A "universe" was the name given to a group of voters

0:31:580:32:02

defined by Cambridge Analytica.

0:32:020:32:05

What sort of attributes did these people have

0:32:050:32:07

that they had been able to work out?

0:32:070:32:09

Some of the attributes would be, when was the last time they voted?

0:32:090:32:12

Who did they vote for?

0:32:120:32:14

You know, what kind of car do they drive?

0:32:140:32:17

What kind of things do they look at on the internet?

0:32:170:32:20

What do they stand for?

0:32:200:32:22

I mean, one person might really be about job creation

0:32:220:32:25

and keeping jobs in America, you know.

0:32:250:32:27

Another person, they might resonate with, you know,

0:32:270:32:30

Second Amendment and gun rights, so...

0:32:300:32:33

How would they know that?

0:32:330:32:34

-..within that...

-How would they know that?

-That is their secret sauce.

0:32:340:32:38

Did the "secret sauce"

0:32:400:32:42

contain predictions of personality and political leanings?

0:32:420:32:46

Were they able to kind of understand people's personalities?

0:32:460:32:50

Yes, I mean, you know,

0:32:500:32:52

they do specialise in psychographics, right?

0:32:520:32:56

But based on personal interests and based on what a person cares for

0:32:560:33:01

and what means something to them,

0:33:010:33:03

they were able to extract and then we were able to target.

0:33:030:33:06

So, the psychographic stuff, were they using that here?

0:33:060:33:10

Was that part of the model you were working on?

0:33:100:33:13

Well, I mean, towards the end with the persuasion, absolutely.

0:33:130:33:17

I mean, we really were targeting

0:33:170:33:19

on these universes that they had collected.

0:33:190:33:22

I mean, did some of the attributes that you were able to use

0:33:220:33:25

from Cambridge Analytica have a sort of emotional effect, you know,

0:33:250:33:29

happy, sad, anxious, worried - moods, that kind of thing?

0:33:290:33:32

Right, yeah.

0:33:320:33:34

I do know Cambridge Analytica follows something called Ocean

0:33:340:33:38

and it's based on, you know, whether or not, like,

0:33:380:33:41

are you an extrovert?

0:33:410:33:43

Are you more of an introvert?

0:33:430:33:45

Are you fearful? Are you positive?

0:33:450:33:50

So, they did use that.

0:33:500:33:52

Armed with Cambridge Analytica's revolutionary insights,

0:33:540:33:58

the next step in the battle to win over millions of Americans

0:33:580:34:02

was to shape the online messages they would see.

0:34:020:34:06

Now we're going to go into the big kind of bull pen where a lot of

0:34:060:34:10

-the creatives were, and this is where I was as well.

-Right.

0:34:100:34:14

For today's modern working-class families,

0:34:140:34:17

the challenge is very, very real.

0:34:170:34:19

With childcare costs...

0:34:190:34:21

Adverts were tailored to particular audiences, defined by data.

0:34:210:34:25

Donald Trump wants to give families the break they deserve.

0:34:250:34:28

This universe right here that Cambridge Analytica,

0:34:280:34:31

they've collected data and they have identified as working mothers

0:34:310:34:34

that are concerned about childcare,

0:34:340:34:37

and childcare, obviously, that's not going to be, like,

0:34:370:34:40

a war-ridden, you know, destructive ad, right?

0:34:400:34:43

That's more warm and fuzzy,

0:34:430:34:44

and this is what Trump is going to do for you.

0:34:440:34:46

But if you notice, Trump's not speaking.

0:34:460:34:49

He wasn't speaking, he wasn't in it at all.

0:34:490:34:51

Right, Trump wasn't speaking in it.

0:34:510:34:53

That audience there, we wanted a softer approach,

0:34:530:34:56

so this is the type of approach that we would take.

0:34:560:34:59

The campaign made thousands of different versions

0:34:590:35:03

of the same fundraising adverts.

0:35:030:35:05

The design was constantly tweaked to see which version performed best.

0:35:050:35:11

It wasn't uncommon to have about 35-45,000 iterations

0:35:110:35:15

of these types of ads every day, right?

0:35:150:35:18

So, you know, it could be

0:35:180:35:22

as subtle and just, you know, people wouldn't even notice,

0:35:220:35:28

where you have the green button,

0:35:280:35:30

sometimes a red button would work better

0:35:300:35:32

or a blue button would work better.

0:35:320:35:34
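Testing tens of thousands of ad variants a day comes down to comparing response rates between versions, a basic A/B test. The sketch below compares click-through rates for two hypothetical button colours using a simple two-proportion z-test; the numbers are invented and this is not the campaign's actual tooling.

```python
# Minimal A/B comparison of two ad variants (invented numbers).
from math import sqrt

# impressions and clicks for each hypothetical variant
variants = {"green_button": (10000, 230), "red_button": (10000, 275)}

(n_a, c_a), (n_b, c_b) = variants["green_button"], variants["red_button"]
p_a, p_b = c_a / n_a, c_b / n_b
p_pool = (c_a + c_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se

print(f"CTR green: {p_a:.3%}, CTR red: {p_b:.3%}, z = {z:.2f}")
# |z| > 1.96 is the usual threshold for significance at the 5% level.
```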

Now, the voters Cambridge Analytica had targeted

0:35:340:35:38

were bombarded with adverts.

0:35:380:35:40

I may have short-circuited...

0:35:460:35:47

I'm Donald Trump and I approve this message.

0:35:490:35:51

On a typical day, the campaign would run more than 100 different adverts.

0:35:520:35:57

The delivery system was Silicon Valley's vast social networks.

0:35:570:36:02

We had the Facebook and YouTube and Google people,

0:36:020:36:05

they would congregate here.

0:36:050:36:07

Almost all of America's voters could now be reached online in an instant.

0:36:070:36:12

I mean, what were Facebook and Google and YouTube people actually

0:36:120:36:16

-doing here, why were they here?

-They were helping us, you know.

0:36:160:36:19

They were basically our kind of hands-on partners

0:36:190:36:24

as far as being able to utilise the platform

0:36:240:36:27

as effectively as possible.

0:36:270:36:29

The Trump campaign spent the lion's share of its advertising budget,

0:36:290:36:34

around 85 million, on Facebook.

0:36:340:36:38

When you're pumping in millions and millions of dollars

0:36:380:36:42

to these social platforms,

0:36:420:36:44

you're going to get white-glove treatment,

0:36:440:36:46

so, they would send people, you know,

0:36:460:36:48

representatives to, you know,

0:36:480:36:51

Project Alamo to ensure that all of our needs were being met.

0:36:510:36:56

The success of Trump's digital strategy was built on

0:36:560:36:59

the effectiveness of Facebook as an advertising medium.

0:36:590:37:03

It's become a very powerful political tool

0:37:030:37:06

that's largely unregulated.

0:37:060:37:08

Without Facebook, we wouldn't have won.

0:37:080:37:12

I mean, Facebook really and truly put us over the edge,

0:37:120:37:16

I mean, Facebook was the medium

0:37:160:37:18

that proved most successful for this campaign.

0:37:180:37:21

Facebook didn't want to meet me, but made it clear that like all

0:37:250:37:28

advertisers on Facebook, political campaigns must ensure

0:37:280:37:32

their ads comply with all applicable laws and regulations.

0:37:320:37:36

The company also said no personally identifiable

0:37:360:37:40

information can be shared with advertising,

0:37:400:37:42

measurement or analytics partners unless people give permission.

0:37:420:37:46

In London, I'm on the trail

0:37:510:37:52

of the data company Cambridge Analytica.

0:37:520:37:56

Ever since the American election, the firm has been under pressure

0:37:590:38:03

to come clean over its use of personality prediction.

0:38:030:38:08

I want to know exactly how the firm used psychographics

0:38:080:38:12

to target voters for the Trump campaign.

0:38:120:38:16

-Alexander.

-Hi.

0:38:160:38:18

How do you do? Pleasure.

0:38:180:38:19

VOICEOVER: Alexander Nix's firm first used data to target voters

0:38:190:38:23

in the American presidential elections while working on

0:38:230:38:27

the Ted Cruz campaign for the Republican nomination.

0:38:270:38:31

When Cruz lost, they went to work for Trump.

0:38:310:38:35

I want to start with the Trump campaign.

0:38:350:38:37

Did Cambridge Analytica ever use

0:38:370:38:40

psychometric or psychographic methods in this campaign?

0:38:400:38:44

We left the Cruz campaign in April after the nomination was over.

0:38:440:38:47

We pivoted right across onto the Trump campaign.

0:38:470:38:50

It was about five and a half months before polling.

0:38:500:38:53

And whilst on the Cruz campaign, we were able to invest

0:38:530:38:58

a lot more time into building psychographic models,

0:38:580:39:01

into profiling, using behavioural profiling to understand different

0:39:010:39:04

personality groups and different personality drivers,

0:39:040:39:07

in order to inform our messaging and our creative.

0:39:070:39:10

We simply didn't have the time to employ this level of

0:39:100:39:14

rigorous methodology for Trump.

0:39:140:39:17

For Cruz, Cambridge Analytica built computer models which could crunch

0:39:170:39:21

huge amounts of data on each voter, including psychographic data

0:39:210:39:26

predicting voters' personality types.

0:39:260:39:30

When the firm transferred to the Trump campaign,

0:39:300:39:32

they took data with them.

0:39:320:39:34

Now, there is clearly some legacy psychographics in the data,

0:39:340:39:40

because the data is, um, model data,

0:39:400:39:44

or a lot of it is model data that we had used across

0:39:440:39:46

the last 14, 15 months of campaigning through the midterms,

0:39:460:39:50

and then through the primaries.

0:39:500:39:52

But specifically, did we build specific psychographic models

0:39:520:39:57

for the Trump campaign? No, we didn't.

0:39:570:39:59

So you didn't build specific models for this campaign,

0:39:590:40:03

but it sounds like you did use some element of psychographic modelling,

0:40:030:40:07

as an approach in the Trump campaign?

0:40:070:40:10

Only as a result of legacy data models.

0:40:100:40:13

So, the answer... The answer you are looking for is no.

0:40:130:40:17

The answer I am looking for is the extent to which it was used.

0:40:170:40:20

I mean, legacy... I don't know what that means, legacy data modelling.

0:40:200:40:23

What does that mean for the Trump campaign?

0:40:230:40:26

Well, so we were able to take models we had made previously

0:40:260:40:28

over the last two or three years,

0:40:280:40:30

and integrate those into some of the work we were doing.

0:40:300:40:33

Where did all the information

0:40:330:40:34

to predict voters' personalities come from?

0:40:340:40:38

Very originally, we used a combination of telephone surveys

0:40:380:40:42

and then we used a number of...

0:40:420:40:45

..online platforms...

0:40:460:40:50

for gathering questions.

0:40:500:40:52

As we started to gather more data,

0:40:520:40:55

we started to look at other platforms.

0:40:550:40:57

Such as Facebook, for instance.

0:40:570:41:00

Predicting personality is just one element in the big data

0:41:020:41:05

companies like Cambridge Analytica are using to revolutionise

0:41:050:41:09

the way democracy works.

0:41:090:41:11

Can you understand, though, why maybe

0:41:110:41:13

some people find it a little bit creepy?

0:41:130:41:16

No, I can't. Quite the opposite.

0:41:160:41:17

I think that the move away from blanket advertising,

0:41:170:41:21

the move towards ever more personalised communication,

0:41:210:41:24

is a very natural progression.

0:41:240:41:25

I think it is only going to increase.

0:41:250:41:27

I find it a little bit weird, if I had a very sort of detailed

0:41:270:41:30

personality assessment of me, based on all sorts of different data

0:41:300:41:35

that I had put all over the internet,

0:41:350:41:37

and that as a result of that, some profile had been made of me

0:41:370:41:40

that was then used to target me for adverts,

0:41:400:41:43

and I didn't really know that any of that had happened.

0:41:430:41:45

That's why some people might find it a little bit sinister.

0:41:450:41:48

Well, you have just said yourself,

0:41:480:41:50

you are putting this data out into the public domain.

0:41:500:41:52

I'm sure that you have a supermarket loyalty card.

0:41:520:41:56

I'm sure you understand the reciprocity that is going on there -

0:41:560:42:01

you get points, and in return, they gather your data

0:42:010:42:05

on your consumer behaviour.

0:42:050:42:07

I mean, we are talking about politics

0:42:070:42:08

and we're talking about shopping. Are they really the same thing?

0:42:080:42:12

The technology is the same.

0:42:120:42:14

In the next ten years,

0:42:140:42:15

the sheer volumes of data that are going to be available,

0:42:150:42:18

that are going to be driving all sorts of things

0:42:180:42:20

including marketing and communications,

0:42:200:42:23

is going to be a paradigm shift from where we are now

0:42:230:42:25

and it's going to be a revolution,

0:42:250:42:27

and that is the way the world is moving.

0:42:270:42:29

And, you know, I think,

0:42:290:42:31

whether you like it or not, it, it...

0:42:310:42:35

it is an inevitable fact.

0:42:350:42:37

To the new data barons, this is all just business.

0:42:390:42:42

But to the rest of us, it's more than that.

0:42:420:42:46

By the time of Donald Trump's inauguration,

0:42:490:42:52

it was accepted that his mastery of data and social media

0:42:520:42:55

had made him the most powerful man in the world.

0:42:550:42:59

We will make America great again.

0:42:590:43:04

The election of Donald Trump

0:43:040:43:06

was greeted with barely concealed fury in Silicon Valley.

0:43:060:43:10

But Facebook and other tech companies had made

0:43:100:43:13

millions of dollars by helping to make it happen.

0:43:130:43:16

Their power as advertising platforms

0:43:160:43:19

had been exploited by a politician

0:43:190:43:21

with a very different view of the world.

0:43:210:43:24

But Facebook's problems were only just beginning.

0:43:270:43:30

Another phenomenon of the election

0:43:320:43:34

was plunging the tech titan into crisis.

0:43:340:43:37

Fake news, often targeting Hillary Clinton,

0:43:430:43:46

had dominated the election campaign.

0:43:460:43:49

Now, the departing President turned on the social media giant

0:43:520:43:55

he had once embraced.

0:43:550:43:58

In an age where, uh...

0:43:590:44:01

there is so much active misinformation,

0:44:010:44:06

and it is packaged very well,

0:44:060:44:08

and it looks the same when you see it on a Facebook page,

0:44:080:44:11

or you turn on your television,

0:44:110:44:13

if everything, uh...

0:44:130:44:17

seems to be the same,

0:44:170:44:20

and no distinctions are made,

0:44:200:44:22

then we won't know what to protect.

0:44:220:44:25

We won't know what to fight for.

0:44:250:44:27

Fake news had provoked a storm of criticism

0:44:330:44:36

over Facebook's impact on democracy.

0:44:360:44:39

Its founder, Zuck, claimed it was extremely unlikely

0:44:390:44:42

fake news had changed the election's outcome.

0:44:420:44:46

But he didn't address why it had spread like wildfire

0:44:460:44:49

across the platform.

0:44:490:44:51

-Jamie.

-Hi, Jamie.

0:44:510:44:53

VOICEOVER: Meet Jeff Hancock,

0:44:530:44:55

a psychologist who has investigated a hidden aspect of Facebook

0:44:550:44:58

that helps explain how the platform became weaponised in this way.

0:44:580:45:03

It turns out the power of Facebook to affect our emotions is key,

0:45:060:45:11

something that had been uncovered

0:45:110:45:13

in an experiment the company itself had run in 2012.

0:45:130:45:18

It was one of the earlier, you know,

0:45:180:45:19

what we would call big data, social science-type studies.

0:45:190:45:22

The newsfeeds of nearly 700,000 users were secretly manipulated

0:45:260:45:30

so they would see fewer positive or negative posts.

0:45:300:45:35

Jeff helped interpret the results.

0:45:350:45:38

So, what did you actually find?

0:45:380:45:40

We found that if you were one of those people

0:45:400:45:43

that were seeing less negative emotion words in their posts,

0:45:430:45:46

then you would write with less negative emotion in your own posts,

0:45:460:45:49

and more positive emotion.

0:45:490:45:51

-This is emotional contagion.

-And what about positive posts?

0:45:510:45:55

Did that have the same kind of effect?

0:45:550:45:57

Yeah, we saw the same effect

0:45:570:45:59

when positive emotion worded posts were decreased.

0:45:590:46:03

We saw the same thing. So I would produce fewer positive

0:46:030:46:06

emotion words, and more negative emotion words.

0:46:060:46:08

And that is consistent with emotional contagion theory.

0:46:080:46:11

Basically, we were showing that people were writing in a way

0:46:110:46:15

that was matching the emotion that they were seeing

0:46:150:46:18

in the Facebook news feed.

0:46:180:46:21
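The experiment Hancock describes measured the rate of positive and negative emotion words in what people subsequently wrote. As a rough illustration of that kind of measurement (not the study's actual code, and with invented word lists standing in for the LIWC dictionaries the published study is described as using), the sketch below counts emotion words in a few invented posts for two groups.

```python
# Toy illustration of measuring emotion-word rates in posts (invented word
# lists and posts; not the study's actual dictionaries or data).
POSITIVE = {"great", "happy", "love", "wonderful"}
NEGATIVE = {"sad", "angry", "awful", "terrible"}

def emotion_rates(posts):
    """Return (positive-word rate, negative-word rate) across a list of posts."""
    words = [w.strip(".,!?").lower() for p in posts for w in p.split()]
    total = len(words) or 1
    pos = sum(w in POSITIVE for w in words) / total
    neg = sum(w in NEGATIVE for w in words) / total
    return pos, neg

group_saw_less_negative = ["What a great day, love it!", "Feeling happy today."]
group_saw_less_positive = ["This is awful.", "Sad and angry about the news."]

print(emotion_rates(group_saw_less_negative))
print(emotion_rates(group_saw_less_positive))
```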

Emotion draws people to fake news, and then supercharges its spread.

0:46:210:46:27

So the more emotional the content,

0:46:270:46:29

the more likely it is to spread online.

0:46:290:46:31

-Is that true?

-Yeah, the more intense the emotion in content,

0:46:310:46:34

the more likely it is to spread, to go viral.

0:46:340:46:38

It doesn't matter whether it is sad or happy, like negative or positive,

0:46:380:46:42

the more important thing is how intense the emotion is.

0:46:420:46:44

The process of emotional contagion

0:46:440:46:47

helps explain why fake news has spread

0:46:470:46:50

so far across social media.

0:46:500:46:52

Advertisers have been well aware of how emotion can be used

0:46:550:46:58

to manipulate people's attention.

0:46:580:46:59

But now we are seeing this with a whole host of other actors,

0:46:590:47:02

some of them nefarious.

0:47:020:47:04

So, other state actors trying to influence your election,

0:47:040:47:06

people trying to manipulate the media

0:47:060:47:08

are using emotional contagion,

0:47:080:47:10

and also using those original platforms like Facebook

0:47:100:47:14

to accomplish other objectives, like sowing distrust,

0:47:140:47:19

or creating false beliefs.

0:47:190:47:21

You see it sort of, maybe weaponised

0:47:210:47:23

and being used at scale.

0:47:230:47:26

I'm in Germany,

0:47:350:47:37

where the presence of more than a million new refugees

0:47:370:47:39

has caused tension across the political spectrum.

0:47:390:47:43

Earlier this year,

0:47:430:47:44

a story appeared on the American alt-right news site Breitbart

0:47:440:47:49

about the torching of a church in Dortmund by a North African mob.

0:47:490:47:54

It was widely shared on social media -

0:47:540:47:57

but it wasn't true.

0:47:570:47:59

The problem with social networks

0:48:040:48:06

is that all information is treated equally.

0:48:060:48:08

So you have good, honest, accurate information

0:48:080:48:12

sitting alongside and treated equally to

0:48:120:48:15

lies and propaganda.

0:48:150:48:18

And the difficulty for citizens is that it can be very hard

0:48:180:48:21

to tell the difference between the two.

0:48:210:48:24

But one data scientist here has discovered an even darker side

0:48:240:48:30

to the way Facebook is being manipulated.

0:48:300:48:32

We created a network of all the likes

0:48:330:48:37

in the refugee debate on Facebook.

0:48:370:48:40

Professor Simon Hegelich has found evidence the debate

0:48:420:48:46

about refugees on Facebook

0:48:460:48:47

is being skewed by anonymous political forces.

0:48:470:48:52

So you are trying to understand who is liking pages?

0:48:520:48:56

Exactly.

0:48:560:48:58

One statistic among many used by Facebook to rank stories in

0:48:580:49:01

your news feed is the number of likes they get.

0:49:010:49:04

The red area of this chart shows a network of people liking

0:49:040:49:08

anti-refugee posts on Facebook.

0:49:080:49:12

Most of the likes are sent by just a handful of people - 25 people are...

0:49:120:49:18

..responsible for more than 90% of all these likes.

0:49:200:49:24

25 Facebook accounts each liked

0:49:240:49:26

more than 30,000 comments over six months.

0:49:260:49:30

These hyperactive accounts could be run by real people, or software.

0:49:300:49:36

I think the rationale behind this

0:49:360:49:38

is that they tried to make their content viral.

0:49:380:49:42

They think if we like all this stuff, then Facebook,

0:49:420:49:46

the algorithm of Facebook,

0:49:460:49:47

will pick up our content and show it to other users,

0:49:470:49:51

and then the whole world sees that refugees are bad

0:49:510:49:54

and that they shouldn't come to Germany.

0:49:540:49:57

This is evidence the number of likes on Facebook can be easily gamed

0:49:570:50:02

as part of an effort to try to influence the prominence

0:50:020:50:05

of anti-refugee content on the site.

0:50:050:50:08

Does this worry you, though?

0:50:080:50:10

It's definitely changing structure of public opinion.

0:50:100:50:14

Democracy is built on public opinion,

0:50:140:50:18

so such a change definitely has to change the way democracy works.

0:50:180:50:23
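Hegelich's finding, that a tiny number of hyperactive accounts produced the vast majority of likes in the network, can be expressed as a simple concentration measure: what share of all likes comes from the top k accounts. The sketch below computes that share over invented like counts; it is not his dataset or code.

```python
from collections import Counter

# Invented like counts per account (account id -> number of likes sent).
likes_per_account = Counter({
    "acct_a": 34000, "acct_b": 31000, "acct_c": 30500,
    "acct_d": 120, "acct_e": 95, "acct_f": 60, "acct_g": 40,
})

def top_k_share(counts: Counter, k: int) -> float:
    """Fraction of all likes sent by the k most active accounts."""
    total = sum(counts.values())
    top = sum(n for _, n in counts.most_common(k))
    return top / total

print(f"Top 3 accounts send {top_k_share(likes_per_account, 3):.1%} of likes")
```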

Facebook told us they are working to disrupt the economic incentives

0:50:260:50:30

behind false news, removing tens of thousands of fake accounts,

0:50:300:50:34

and building new products

0:50:340:50:36

to identify and limit the spread of false news.

0:50:360:50:39

Zuck is trying to hold the line that his company is not a publisher,

0:50:390:50:44

based on that obscure legal clause from the 1990s.

0:50:440:50:49

You know, Facebook is a new kind of platform.

0:50:490:50:51

You know, it's not a traditional technology company,

0:50:510:50:54

it's not a traditional media company.

0:50:540:50:57

Um, we don't write the news that people...

0:50:570:50:59

that people read on the platform.

0:50:590:51:01

In Germany, one man is taking on the tech gods over their responsibility

0:51:030:51:08

for what appears on their sites.

0:51:080:51:10

Ulrich Kelber is a minister in the Justice Department.

0:51:120:51:17

He was once called a "Jewish pig" in a post on Facebook.

0:51:170:51:20

He reported it to the company, who refused to delete it.

0:51:200:51:24

IN GERMAN:

0:51:240:51:26

Under a new law,

0:51:430:51:45

Facebook and other social networking sites

0:51:450:51:47

could be fined up to 50 million euros

0:51:470:51:50

if they fail to take down hate speech posts that are illegal.

0:51:500:51:55

Is this too much?

0:51:550:51:56

Is this a little bit Draconian?

0:51:560:51:59

This is the first time a government has challenged

0:52:200:52:23

the principles underlying a Silicon Valley platform.

0:52:230:52:27

Once this thread has been pulled,

0:52:270:52:29

their whole world could start to unravel.

0:52:290:52:32

They must find you a pain.

0:52:320:52:35

I mean, this must be annoying for them.

0:52:350:52:37

It must be.

0:52:370:52:38

Facebook told us they share the goal of fighting hate speech,

0:52:470:52:51

and they have made substantial progress

0:52:510:52:54

in removing illegal content,

0:52:540:52:56

adding 3,000 people to their community operations team.

0:52:560:53:00

Facebook now connects more than two billion people around the world,

0:53:040:53:08

including more and more voters in the West.

0:53:080:53:12

When you think of what Facebook has become,

0:53:120:53:14

in such a short space of time, it's actually pretty bizarre.

0:53:140:53:18

I mean, this was just a platform

0:53:180:53:20

for sharing photos or chatting to friends,

0:53:200:53:23

but in less than a decade, it has become a platform that has

0:53:230:53:27

dramatic implications for how our democracy works.

0:53:270:53:30

Old structures of power are falling away.

0:53:320:53:35

Social media is giving ordinary people access to huge audiences.

0:53:350:53:40

And politics is changing as a result.

0:53:400:53:43

Significant numbers were motivated

0:53:460:53:47

through social media to vote for Brexit.

0:53:470:53:50

Jeremy Corbyn lost the general election,

0:53:520:53:54

but enjoyed unexpected gains,

0:53:540:53:58

an achievement he put down in part to the power of social media.

0:53:580:54:03

One influential force in this new world is found here in Bristol.

0:54:060:54:12

The Canary is an online political news outlet.

0:54:120:54:15

During the election campaign,

0:54:150:54:18

their stories got more than 25 million hits on a tiny budget.

0:54:180:54:22

So, how much of your readership comes through Facebook?

0:54:240:54:27

It's really high, it's about 80%.

0:54:270:54:30

So it is an enormously important distribution mechanism.

0:54:300:54:33

And free.

0:54:330:54:35

The Canary's presentation of its pro-Corbyn news

0:54:350:54:39

is tailored to social media.

0:54:390:54:43

I have noticed that a lot of your headlines, or your pictures,

0:54:430:54:45

they are quite emotional.

0:54:450:54:47

They are sort of pictures of sad Theresa May,

0:54:470:54:50

or delighted Jeremy Corbyn.

0:54:500:54:52

Is that part of the purpose of it all?

0:54:520:54:54

Yeah, it has to be. We are out there trying to have a conversation

0:54:540:54:57

with a lot of people, so it is on us to be compelling.

0:54:570:55:00

Human beings work on facts, but they also work on gut instinct,

0:55:000:55:04

they work on emotions, feelings, and fidelity and community.

0:55:040:55:09

All of these issues.

0:55:090:55:10

Social media enables those with few resources to compete

0:55:100:55:15

with the mainstream media for the attention of millions of us.

0:55:150:55:20

You put up quite sort of clickbait-y stories, you know,

0:55:200:55:23

the headlines are there to get clicks.

0:55:230:55:26

-Yeah.

-Um... Is that a fair criticism?

0:55:260:55:28

Of course they're there to get clicks.

0:55:280:55:30

We don't want to have a conversation with ten people.

0:55:300:55:32

You can't change the world talking to ten people.

0:55:320:55:34

The tech gods are giving all of us the power to influence the world.

0:55:420:55:48

Connectivity and access... Connect everyone in the world...

0:55:480:55:50

Make the world more open and connected...

0:55:500:55:51

You connect people over time... You give people connectivity...

0:55:510:55:54

That's the mission. That's what I care about.

0:55:540:55:57

Social media's unparalleled power to persuade,

0:55:570:56:01

first developed for advertisers,

0:56:010:56:03

is now being exploited by political forces of all kinds.

0:56:030:56:06

Grassroots movements are regaining their power,

0:56:060:56:11

challenging political elites.

0:56:110:56:13

Extremists are discovering new ways to stoke hatred and spread lies.

0:56:130:56:19

And wealthy political parties are developing the ability

0:56:190:56:21

to manipulate our thoughts and feelings

0:56:210:56:24

using powerful psychological tools,

0:56:240:56:27

which is leading to a world of unexpected political opportunity,

0:56:270:56:32

and turbulence.

0:56:320:56:33

I think the people that connected the world really believed that

0:56:330:56:37

somehow, just by us being connected, our politics would be better.

0:56:370:56:42

But the world is changing in ways that they never imagined,

0:56:420:56:48

and that they are probably not happy about any more.

0:56:480:56:51

But in truth,

0:56:510:56:53

they are no more in charge of this technology than any of us are now.

0:56:530:56:56

Silicon Valley's philosophy is called disruption.

0:56:560:57:00

Breaking down the way we do things

0:57:000:57:02

and using technology to improve the world.

0:57:020:57:06

In this series, I have seen how

0:57:060:57:08

sharing platforms like Uber and Airbnb

0:57:080:57:11

are transforming our cities.

0:57:110:57:14

And how automation and artificial intelligence

0:57:180:57:21

threaten to destroy millions of jobs.

0:57:210:57:24

Within 30 years, half of humanity won't have a job.

0:57:240:57:27

It could get ugly, there could be revolution.

0:57:270:57:29

Now, the technology to connect the world unleashed by a few billionaire

0:57:290:57:34

entrepreneurs is having a dramatic influence on our politics.

0:57:340:57:39

The people who are responsible for building this technology,

0:57:390:57:42

for unleashing this disruption onto all of us,

0:57:420:57:47

don't ever feel like they are responsible for the consequences

0:57:470:57:51

of any of that.

0:57:510:57:53

They retain this absolute religious faith that technology and

0:57:530:57:58

connectivity is always going to make things turn out for the best.

0:57:580:58:03

And it doesn't matter what happens,

0:58:030:58:05

it doesn't matter how much that's proven not to be the case,

0:58:050:58:08

they still believe.

0:58:080:58:10

How did Silicon Valley become so influential?

0:58:220:58:25

The Open University has produced an interactive timeline

0:58:250:58:28

exploring the history of this place.

0:58:280:58:30

To find out more, visit...

0:58:300:58:32

..and follow the links to the Open University.

0:58:360:58:40
