0:00:04 > 0:00:08It was the biggest political earthquake of the century.
0:00:08 > 0:00:12We will make America great again!
0:00:15 > 0:00:17But just how did Donald Trump defy the predictions
0:00:17 > 0:00:21of political pundits and pollsters?
0:00:21 > 0:00:25The secret lies here, in San Antonio, Texas.
0:00:27 > 0:00:29This was our Project Alamo.
0:00:29 > 0:00:35This is where the digital arm of the Trump campaign operation was held.
0:00:37 > 0:00:40This is the extraordinary story of two men
0:00:40 > 0:00:43with two very different views of the world.
0:00:44 > 0:00:46We will build the wall...
0:00:46 > 0:00:49The path forward is to connect more, not less.
0:00:49 > 0:00:54And how Facebook's Mark Zuckerberg inadvertently helped Donald Trump
0:00:54 > 0:00:57become the most powerful man on the planet.
0:00:57 > 0:01:00Without Facebook, we wouldn't have won.
0:01:00 > 0:01:04I mean, Facebook really and truly put us over the edge.
0:01:04 > 0:01:08With their secret algorithms and online tracking,
0:01:08 > 0:01:12social media companies know more about us than anyone.
0:01:13 > 0:01:16So you can predict mine and everybody else's personality
0:01:16 > 0:01:19based on the things that they've liked?
0:01:19 > 0:01:20That's correct.
0:01:20 > 0:01:24A very accurate prediction of your intimate traits such as
0:01:24 > 0:01:28religiosity, political views, intelligence, sexual orientation.
0:01:28 > 0:01:31Now, this power is transforming politics.
0:01:31 > 0:01:33Can you understand though why maybe
0:01:33 > 0:01:36some people find it a little bit creepy?
0:01:36 > 0:01:38No, I can't - quite the opposite. That is the way the world is moving.
0:01:38 > 0:01:42Whether you like it or not, it's an inevitable fact.
0:01:42 > 0:01:46Social media can bring politics closer to the people,
0:01:46 > 0:01:48but its destructive power
0:01:48 > 0:01:51is creating a new and unpredictable world.
0:01:51 > 0:01:55This is the story of how Silicon Valley's mission to connect
0:01:55 > 0:01:59all of us is disrupting politics,
0:01:59 > 0:02:02plunging us into a world of political turbulence
0:02:02 > 0:02:04that no-one can control.
0:02:20 > 0:02:24It might not look like it, but anger is building in Silicon Valley.
0:02:24 > 0:02:27It's usually pretty quiet around here.
0:02:27 > 0:02:29But not today.
0:02:29 > 0:02:34Every day you wake up and you wonder what's going to be today's grief,
0:02:34 > 0:02:37brought by certain politicians and leaders in the world.
0:02:37 > 0:02:41This is a very unusual demonstration.
0:02:41 > 0:02:44I've been to loads of demonstrations,
0:02:44 > 0:02:46but these aren't the people
0:02:46 > 0:02:48that usually go to demonstrations.
0:02:48 > 0:02:52In some ways, these are the winners of society.
0:02:52 > 0:02:55This is the tech community in Silicon Valley.
0:02:55 > 0:02:59Some of the wealthiest people in the world.
0:02:59 > 0:03:05And they're here protesting and demonstrating...
0:03:05 > 0:03:07against what they see as the kind of changing world
0:03:07 > 0:03:09that they don't like.
0:03:09 > 0:03:13The problem, of course, is the election of Donald Trump.
0:03:15 > 0:03:17Every day, I'm sure, you think,
0:03:17 > 0:03:20"I could be a part of resisting those efforts
0:03:20 > 0:03:22"to mess things up for the rest of us."
0:03:22 > 0:03:26Trump came to power promising to control immigration...
0:03:26 > 0:03:30We will build the wall, 100%.
0:03:30 > 0:03:33..and to disengage from the world.
0:03:33 > 0:03:37From this day forward, it's going to be
0:03:37 > 0:03:40only America first.
0:03:42 > 0:03:44America first.
0:03:46 > 0:03:51Now, Silicon Valley is mobilising against him.
0:03:51 > 0:03:54We are seeing this explosion of political activism, you know,
0:03:54 > 0:03:56all through the US and in Europe.
0:03:56 > 0:03:59Before he became an activist,
0:03:59 > 0:04:03Dex Torricke-Barton was speechwriter to the chairman of Google
0:04:03 > 0:04:05and the founder of Facebook.
0:04:07 > 0:04:10It's a moment when people who believe in this global vision,
0:04:10 > 0:04:12as opposed to the nationalist vision of the world,
0:04:12 > 0:04:15who believe in a world that isn't about protectionism,
0:04:15 > 0:04:17whether it's data or whether it's about trade,
0:04:17 > 0:04:20they're coming to stand up and to mobilise in response to that.
0:04:20 > 0:04:23Because it feels like the whole of Silicon Valley
0:04:23 > 0:04:26- has been slightly taken by surprise by what's happening.- Absolutely.
0:04:26 > 0:04:28But these are the smartest minds in the world.
0:04:28 > 0:04:30- Yeah.- With the most amazing data models and polling.
0:04:30 > 0:04:33The smartest minds in the world often can be very,
0:04:33 > 0:04:36very ignorant of the things that are going on in the world.
0:04:36 > 0:04:41The tech god with the most ambitious global vision is Dex's old boss -
0:04:41 > 0:04:44Facebook founder Mark Zuckerberg.
0:04:44 > 0:04:48One word captures the world Zuck, as he's known here,
0:04:48 > 0:04:51is trying to build.
0:04:51 > 0:04:53Connectivity and access. If we connected them...
0:04:53 > 0:04:55You give people connectivity... That's the mission.
0:04:55 > 0:04:58Connecting with their friends... You connect people over time...
0:04:58 > 0:05:00Connect everyone in the world... I'm really optimistic about that.
0:05:00 > 0:05:03Make the world more open and connected, that's what I care about.
0:05:03 > 0:05:06This is part of the critical enabling infrastructure for the world.
0:05:06 > 0:05:08Thank you, guys.
0:05:08 > 0:05:10What's Mark Zuckerberg worried about most?
0:05:10 > 0:05:13Well, you know, Mark has dedicated his life to connecting, you know,
0:05:13 > 0:05:16the world. You know, this is something that he really, you know,
0:05:16 > 0:05:17cares passionately about.
0:05:17 > 0:05:19And, you know, as I said, you know,
0:05:19 > 0:05:22the same worldview and set of policies that, you know,
0:05:22 > 0:05:24would build walls here
0:05:24 > 0:05:26will build walls against the sharing of information
0:05:26 > 0:05:29and against building those kinds of, you know, networks.
0:05:29 > 0:05:30It's bigger than just the tech.
0:05:30 > 0:05:33- It is.- It's about the society that Silicon Valley also wants to create?
0:05:33 > 0:05:34Absolutely.
0:05:34 > 0:05:36The tech gods believe the election
0:05:36 > 0:05:39of Donald Trump threatens their vision
0:05:39 > 0:05:41of a globalised world.
0:05:43 > 0:05:44But in a cruel twist,
0:05:44 > 0:05:47is it possible their mission to connect the world
0:05:47 > 0:05:50actually helped bring him to power?
0:05:50 > 0:05:53The question I have is whether the revolution brought about
0:05:53 > 0:05:56by social media companies like Facebook
0:05:56 > 0:06:00has actually led to the political changes in the world
0:06:00 > 0:06:03that these guys are so worried about.
0:06:06 > 0:06:08To answer that question,
0:06:08 > 0:06:12you have to understand how the tech titans of Silicon Valley
0:06:12 > 0:06:15rose to power.
0:06:15 > 0:06:18For that, you have to go back 20 years
0:06:18 > 0:06:22to a time when the online world was still in its infancy.
0:06:22 > 0:06:25MUSIC: Rock N Roll Star by Oasis
0:06:27 > 0:06:31There were fears the new internet was like the Wild West,
0:06:31 > 0:06:34anarchic and potentially harmful.
0:06:34 > 0:06:40Today our world is being remade, yet again, by an information revolution.
0:06:40 > 0:06:43Changing the way we work, the way we live,
0:06:43 > 0:06:46the way we relate to each other.
0:06:46 > 0:06:49The Telecommunications Act of 1996
0:06:49 > 0:06:51was designed to civilise the internet,
0:06:51 > 0:06:56including protecting children from pornography.
0:06:56 > 0:07:00Today with the stroke of a pen, our laws will catch up with our future.
0:07:00 > 0:07:05But buried deep within the act was a secret whose impact no-one foresaw.
0:07:14 > 0:07:15Jeremy?
0:07:15 > 0:07:16Jamie. How are you doing?
0:07:16 > 0:07:19Nice to meet you.
0:07:19 > 0:07:21VOICEOVER: Jeremy Malcolm is an analyst
0:07:21 > 0:07:23at the Electronic Frontier Foundation,
0:07:23 > 0:07:26a civil liberties group for the digital age.
0:07:28 > 0:07:32Much of Silicon Valley's accelerated growth in the last two decades
0:07:32 > 0:07:36has been enabled by one clause in the legislation.
0:07:37 > 0:07:40Hidden away in the middle of that is this Section 230.
0:07:40 > 0:07:42- What's the key line? - It literally just says,
0:07:42 > 0:07:45no provider or user of an interactive computer service
0:07:45 > 0:07:48shall be treated as the publisher or speaker
0:07:48 > 0:07:50of any information provided
0:07:50 > 0:07:53by another information content provider. That's it.
0:07:53 > 0:07:56So what that basically means is, if you're an internet platform,
0:07:56 > 0:07:58you don't get treated as the publisher or speaker
0:07:58 > 0:08:01of something that your users say using your platform.
0:08:01 > 0:08:05If the user says something online that is, say, defamatory,
0:08:05 > 0:08:07the platform that they communicate on
0:08:07 > 0:08:09isn't going to be held responsible for it.
0:08:09 > 0:08:13And the user, of course, can be held directly responsible.
0:08:13 > 0:08:19How important is this line for social media companies today?
0:08:19 > 0:08:22I think if we didn't have this, we probably wouldn't have
0:08:22 > 0:08:25the same kind of social media companies that we have today.
0:08:25 > 0:08:28They wouldn't be willing to take on the risk of having so much
0:08:28 > 0:08:33- unfettered discussion.- It's key to the internet's freedom, really?
0:08:33 > 0:08:36We wouldn't have the internet of today without this.
0:08:36 > 0:08:39And so, if we are going to make any changes to it,
0:08:39 > 0:08:41we have to be really, really careful.
0:08:43 > 0:08:46These 26 words changed the world.
0:08:50 > 0:08:53They allowed a new kind of business to spring up -
0:08:53 > 0:08:58online platforms that became the internet giants of today.
0:08:58 > 0:09:02Facebook, Google, YouTube - they encouraged users to upload content,
0:09:02 > 0:09:06often things about their lives or moments that mattered to them,
0:09:06 > 0:09:09onto their sites for free.
0:09:09 > 0:09:13And in exchange, they got to hoard all of that data
0:09:13 > 0:09:15but without any real responsibility
0:09:15 > 0:09:19for the effects of the content that people were posting.
0:09:22 > 0:09:25Hundreds of millions of us flocked to these new sites,
0:09:25 > 0:09:29putting more of our lives online.
0:09:29 > 0:09:32At first, the tech firms couldn't figure out
0:09:32 > 0:09:35how to turn that data into big money.
0:09:35 > 0:09:40But that changed when a secret within that data was unlocked.
0:09:40 > 0:09:42Antonio, Jamie.
0:09:42 > 0:09:46'A secret Antonio Garcia Martinez helped reveal at Facebook.'
0:09:49 > 0:09:51Tell me a bit about your time at Facebook.
0:09:51 > 0:09:53Well, that was interesting.
0:09:53 > 0:09:56I was what's called a product manager for ads targeting.
0:09:56 > 0:09:58That means basically taking your data and using it to basically
0:09:58 > 0:10:02make money on Facebook, to monetise Facebook's data.
0:10:02 > 0:10:04If you go browse the internet or buy stuff in stores or whatever,
0:10:04 > 0:10:06and then you see ads related to all that stuff
0:10:06 > 0:10:08inside Facebook - I created that.
0:10:10 > 0:10:12Facebook offers advertisers ways
0:10:12 > 0:10:15to target individual users of the site with adverts.
0:10:15 > 0:10:21It can be driven by data about how we use the platform.
0:10:21 > 0:10:24Here's some examples of what's data for Facebook that makes money.
0:10:24 > 0:10:26What you've liked on Facebook, links that you shared,
0:10:26 > 0:10:29who you happen to know on Facebook, for example.
0:10:29 > 0:10:32Where you've used Facebook, what devices, your iPad, your work computer,
0:10:32 > 0:10:34your home computer. In the case of Amazon,
0:10:34 > 0:10:35it's obviously what you've purchased.
0:10:35 > 0:10:38In the case of Google, it's what you searched for.
0:10:38 > 0:10:40How do they turn me... I like something on Facebook,
0:10:40 > 0:10:43and I share a link on Facebook, how could they turn that
0:10:43 > 0:10:46into something that another company would care about?
0:10:46 > 0:10:48There is what's called a targeting system,
0:10:48 > 0:10:50and so the advertiser can actually go in and specify,
0:10:50 > 0:10:54I want people who are within this city and who have liked BMW or Burberry, for example.
0:10:54 > 0:10:56So an advertiser pays Facebook and says,
0:10:56 > 0:11:00- I want these sorts of people? - That's effectively it, that's right.
0:11:01 > 0:11:05The innovation that opened up bigger profits was to allow Facebook users
0:11:05 > 0:11:12to be targeted using data about what they do on the rest of the internet.
0:11:12 > 0:11:14The real key thing that marketers want
0:11:14 > 0:11:18is the unique, immutable, flawless, high fidelity ID
0:11:18 > 0:11:21for one person on the internet, and Facebook provides that.
0:11:21 > 0:11:23It is your identity online.
0:11:23 > 0:11:26Facebook can tell an advertiser, this is the real Jamie Bartlett,
0:11:26 > 0:11:28- this is what he's like?- Yeah.
0:11:28 > 0:11:32A company like Walmart can literally take your data, your e-mail,
0:11:32 > 0:11:35phone number, whatever you use for their frequent shopper programme, etc,
0:11:35 > 0:11:39and join that to Facebook and literally target those people based on that data.
0:11:39 > 0:11:40That's part of what I built.
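The Walmart example describes what is usually called custom-audience matching: a retailer's customer list is hashed and joined against the platform's own user records. The sketch below assumes hashed e-mail addresses as the join key; the field names and the use of SHA-256 are illustrative, not a description of the real pipeline.

```python
import hashlib

def h(email):
    # Normalise then hash, so the raw address never needs to be shared directly.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

retailer_customers = ["alice@example.com", "bob@example.com"]
platform_users = {h("alice@example.com"): "user_101",
                  h("carol@example.com"): "user_102"}

# Join on hashed e-mail: any match becomes part of the target audience.
audience = [platform_users[h(e)] for e in retailer_customers
            if h(e) in platform_users]
print(audience)  # -> ['user_101']
```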
0:11:45 > 0:11:49The tech gods suck in all this data about how we use their technologies
0:11:49 > 0:11:53to build their vast fortunes.
0:11:55 > 0:11:58I mean, it sounds like data is like oil, it's keeping the economy going?
0:11:58 > 0:12:00Right. I mean, the difference is these companies,
0:12:00 > 0:12:04instead of drilling for this oil, they generate this oil via,
0:12:04 > 0:12:07by getting users to actually use their apps and then they actually
0:12:07 > 0:12:09monetise it. Usually via advertising or other mechanisms.
0:12:09 > 0:12:12But, yeah, it is the new oil.
0:12:12 > 0:12:15Data about billions of us is propelling Silicon Valley
0:12:15 > 0:12:19to the pinnacle of the global economy.
0:12:21 > 0:12:23The world's largest hotel company, Airbnb,
0:12:23 > 0:12:25doesn't own a single piece of real estate.
0:12:25 > 0:12:28The world's largest taxi company, Uber, doesn't own any cars.
0:12:28 > 0:12:31The world's largest media company, Facebook, doesn't produce any media, right?
0:12:31 > 0:12:32So what do they have?
0:12:32 > 0:12:35Well, they have the data around how you use those resources and how you
0:12:35 > 0:12:38use those assets. And that's really what they are.
0:12:41 > 0:12:46The secret of targeting us with adverts is keeping us online
0:12:46 > 0:12:49for as long as possible.
0:12:49 > 0:12:53I thought I'd just see how much time I spend on here.
0:12:53 > 0:12:58So I've got an app that counts how often I pick this thing up.
0:12:58 > 0:13:02Our time is the Holy Grail of Silicon Valley.
0:13:03 > 0:13:06Here's what my life looks like on a typical day.
0:13:09 > 0:13:12- Yeah, could I have a flat white, please?- Flat white?- Yeah.
0:13:12 > 0:13:18Like more and more of us, my phone is my gateway to the online world.
0:13:18 > 0:13:20It's how I check my social media accounts.
0:13:20 > 0:13:26On average, Facebook users spend 50 minutes every day on the site.
0:13:31 > 0:13:33The longer we spend connected,
0:13:33 > 0:13:37the more Silicon Valley can learn about us,
0:13:37 > 0:13:41and the more targeted and effective their advertising can be.
0:14:05 > 0:14:08So apparently, today, I...
0:14:08 > 0:14:13I've checked my phone 117 times...
0:14:14 > 0:14:20..and I've been on this phone for nearly five and a half hours.
0:14:20 > 0:14:22Well, I mean, that's a lot, that's a lot of hours.
0:14:22 > 0:14:25I mean, it's kind of nearly half the day,
0:14:25 > 0:14:28spent on this phone.
0:14:28 > 0:14:31But it's weird, because it doesn't feel like I spend that long on it.
0:14:31 > 0:14:33The strange thing about it is
0:14:33 > 0:14:36that I don't even really know what I'm doing
0:14:36 > 0:14:38for these five hours that I'm spending on this phone.
0:14:40 > 0:14:45What is it that is keeping us hooked to Silicon Valley's global network?
0:14:49 > 0:14:54I'm in Seattle to meet someone who saw how the tech gods embraced
0:14:54 > 0:14:59new psychological insights into how we all make decisions.
0:15:02 > 0:15:04I was a post-doc with Stephen Hawking.
0:15:04 > 0:15:07Once Chief Technology Officer at Microsoft,
0:15:07 > 0:15:10Nathan Myhrvold is the most passionate technologist
0:15:10 > 0:15:13I have ever met.
0:15:13 > 0:15:15Some complicated-looking equations in the background.
0:15:15 > 0:15:17Well, it turns out if you work with Stephen Hawking,
0:15:17 > 0:15:20you do work with complicated equations.
0:15:20 > 0:15:22It's kind of the nature of the beast!
0:15:22 > 0:15:24Amazing picture.
0:15:27 > 0:15:30A decade ago, Nathan brought together Daniel Kahneman,
0:15:30 > 0:15:33pioneer of the new science of behavioural economics,
0:15:33 > 0:15:37and Silicon Valley's leaders, for a series of meetings.
0:15:37 > 0:15:40I came and Jeff Bezos came.
0:15:40 > 0:15:46That's Jeff Bezos, the founder of Amazon, worth $76 billion.
0:15:46 > 0:15:48Sean Parker. Sean was there.
0:15:48 > 0:15:52That's Sean Parker, the first president of Facebook.
0:15:52 > 0:15:54And the Google founders were there.
0:15:56 > 0:16:03The proposition was, come to this very nice resort in Napa,
0:16:03 > 0:16:07and for several days, just have Kahneman
0:16:07 > 0:16:09and then also a couple of
0:16:09 > 0:16:12other behavioural economists explain things.
0:16:12 > 0:16:15And ask questions and see what happens.
0:16:15 > 0:16:21Kahneman had a simple but brilliant theory on how we make decisions.
0:16:21 > 0:16:27He had found we use one of two different systems of thinking.
0:16:28 > 0:16:31In this dichotomy, you have...
0:16:31 > 0:16:34over here is a hunch...
0:16:36 > 0:16:37..a guess,
0:16:37 > 0:16:40a gut feeling...
0:16:44 > 0:16:47..and "I just know".
0:16:47 > 0:16:53So this is sort of emotional and more like, just instant stuff?
0:16:53 > 0:16:56That's the idea. This set of things
0:16:56 > 0:17:00is not particularly good at a different set of stuff
0:17:00 > 0:17:02that involves...
0:17:04 > 0:17:05..analysis...
0:17:08 > 0:17:10..numbers...
0:17:12 > 0:17:13..probability.
0:17:14 > 0:17:18The meetings in Napa didn't deal with the basics
0:17:18 > 0:17:21of behavioural economics, but how might the insights of the new
0:17:21 > 0:17:23science have helped the tech gods?
0:17:23 > 0:17:26A lot of advertising is about trying to hook people
0:17:26 > 0:17:30in these type-one things to get interested one way or the other.
0:17:30 > 0:17:35Technology companies undoubtedly use that to one degree or another.
0:17:35 > 0:17:37You know, the term "clickbait",
0:17:37 > 0:17:39for things that look exciting to click on.
0:17:39 > 0:17:42There's billions of dollars changing hands
0:17:42 > 0:17:46because we all get enticed into clicking something.
0:17:46 > 0:17:50And there's a lot of things that I click on and then you get there,
0:17:50 > 0:17:53you're like, OK, fine, you were just messing with me.
0:17:53 > 0:17:55You're playing to the type-one things,
0:17:55 > 0:17:58you're putting a set of triggers out there
0:17:58 > 0:18:01that make me want to click on it,
0:18:01 > 0:18:06and even though, like, I'm aware of that, I still sometimes click!
0:18:06 > 0:18:09Tech companies both try to understand
0:18:09 > 0:18:12our behaviour by having smart humans think about it and increasingly
0:18:12 > 0:18:15by having machines think about it.
0:18:15 > 0:18:16By having machines track us
0:18:16 > 0:18:19to see, what is the clickbait Nathan falls for?
0:18:19 > 0:18:21What are the things he really likes to spend time on?
0:18:21 > 0:18:24Let's show him more of that stuff!
0:18:25 > 0:18:29Trying to grab the attention of the consumer is nothing new.
0:18:29 > 0:18:32That's what advertising is all about.
0:18:32 > 0:18:35But insights into how we make decisions helped Silicon Valley
0:18:35 > 0:18:37to shape the online world.
0:18:37 > 0:18:43And little wonder, their success depends on keeping us engaged.
0:18:43 > 0:18:47From 1-Click buying on Amazon to the Facebook like,
0:18:47 > 0:18:51the more they've hooked us, the more the money has rolled in.
0:18:53 > 0:18:55As Silicon Valley became more influential,
0:18:55 > 0:19:00it started attracting powerful friends...in politics.
0:19:00 > 0:19:07In 2008, Barack Obama had pioneered political campaigning on Facebook.
0:19:07 > 0:19:12As President, he was drawn to Facebook's founder, Zuck.
0:19:12 > 0:19:14Sorry, I'm kind of nervous.
0:19:14 > 0:19:17We have the President of the United States here!
0:19:17 > 0:19:21My name is Barack Obama and I'm the guy who got Mark
0:19:21 > 0:19:24to wear a jacket and tie!
0:19:27 > 0:19:28How you doing?
0:19:28 > 0:19:31- Great.- I'll have huevos rancheros, please.
0:19:31 > 0:19:36And if I could have an egg and cheese sandwich on English muffin?
0:19:36 > 0:19:40Aneesh Chopra was Obama's first Chief Technology Officer,
0:19:40 > 0:19:42and saw how close the relationship
0:19:42 > 0:19:47between the White House and Silicon Valley became.
0:19:47 > 0:19:50The President's philosophy and his approach to governing
0:19:50 > 0:19:53garnered a great deal of personal interest
0:19:53 > 0:19:56among many executives in Silicon Valley.
0:19:56 > 0:19:59They were donors to his campaign, volunteers,
0:19:59 > 0:20:01active recruiters of engineering talent
0:20:01 > 0:20:03to support the campaign apparatus.
0:20:03 > 0:20:05He had struck a chord.
0:20:05 > 0:20:07- Why?- Because frankly,
0:20:07 > 0:20:10it was an inspiring voice that really tapped into
0:20:10 > 0:20:13the hopefulness of the country.
0:20:13 > 0:20:17And a lot of Silicon Valley shares this sense of hopefulness,
0:20:17 > 0:20:19this optimistic view that we can solve problems
0:20:19 > 0:20:20if we would work together
0:20:20 > 0:20:24and take advantage of these new capabilities that are coming online.
0:20:24 > 0:20:27So people in Silicon Valley saw President Obama
0:20:27 > 0:20:32- as a bit of a kindred spirit? - Oh, yeah. Oh, yeah.
0:20:32 > 0:20:34If you'd like, Mark, we can take our jackets off.
0:20:34 > 0:20:36That's good!
0:20:36 > 0:20:39Facebook's mission to connect the world
0:20:39 > 0:20:44went hand-in-hand with Obama's policies promoting globalisation
0:20:44 > 0:20:46and free markets.
0:20:46 > 0:20:48And Facebook was seen to be improving
0:20:48 > 0:20:50the political process itself.
0:20:50 > 0:20:55Part of what makes for a healthy democracy
0:20:55 > 0:20:59is when you've got citizens who are informed, who are engaged,
0:20:59 > 0:21:02and what Facebook allows us to do
0:21:02 > 0:21:05is make sure this isn't just a one-way conversation.
0:21:05 > 0:21:09You have the tech mind-set and governments increasingly share
0:21:09 > 0:21:12the same view of the world. That's not a natural...
0:21:12 > 0:21:13It's a spirit of liberty.
0:21:13 > 0:21:15It's a spirit of freedom.
0:21:15 > 0:21:19That is manifest today in these new technologies.
0:21:19 > 0:21:23It happens to be that freedom means I can tweet something offensive.
0:21:23 > 0:21:26But it also means that I have a voice.
0:21:26 > 0:21:29Please raise your right hand and repeat after me...
0:21:29 > 0:21:32By the time Obama won his second term,
0:21:32 > 0:21:37he was feted for his mastery of social media's persuasive power.
0:21:37 > 0:21:39But across the political spectrum,
0:21:39 > 0:21:43the race was on to find new ways to gain a digital edge.
0:21:43 > 0:21:47The world was about to change for Facebook.
0:21:56 > 0:22:01Stanford University, in the heart of Silicon Valley.
0:22:01 > 0:22:05Home to a psychologist investigating just how revealing
0:22:05 > 0:22:10Facebook's hoard of information about each of us could really be.
0:22:10 > 0:22:13How are you doing? Nice to meet you.
0:22:13 > 0:22:18Dr Michal Kosinski specialises in psychometrics -
0:22:18 > 0:22:21the science of predicting psychological traits
0:22:21 > 0:22:23like personality.
0:22:23 > 0:22:26So in the past, when you wanted to measure someone's personality
0:22:26 > 0:22:30or intelligence, you needed to give them a question or a test,
0:22:30 > 0:22:33and they would have to answer a bunch of questions.
0:22:33 > 0:22:37Now, many of those questions would basically ask you
0:22:37 > 0:22:39about whether you like poetry,
0:22:39 > 0:22:41or you like hanging out with other people,
0:22:41 > 0:22:43or you like the theatre, and so on.
0:22:43 > 0:22:47But these days, you don't need to ask these questions any more.
0:22:47 > 0:22:51Why? Because while going through our lives,
0:22:51 > 0:22:54we are leaving behind a lot of digital footprints
0:22:54 > 0:22:57that basically contain the same information.
0:22:57 > 0:23:01So instead of asking you whether you like poetry,
0:23:01 > 0:23:05I can just look at your reading history on Amazon
0:23:05 > 0:23:08or your Facebook likes,
0:23:08 > 0:23:10and I would just get exactly the same information.
0:23:12 > 0:23:16In 2011, Dr Kosinski and his team at the University of Cambridge
0:23:16 > 0:23:18developed an online survey
0:23:18 > 0:23:21to measure volunteers' personality traits.
0:23:21 > 0:23:24With their permission, he matched their results
0:23:24 > 0:23:26with their Facebook data.
0:23:26 > 0:23:30More than 6 million people took part.
0:23:30 > 0:23:32We have people's Facebook likes,
0:23:32 > 0:23:36people's status updates and profile data,
0:23:36 > 0:23:40and this allows us to build those... to gain better understanding
0:23:40 > 0:23:43of how psychological traits are being expressed
0:23:43 > 0:23:45in the digital environment.
0:23:45 > 0:23:50How you can measure psychological traits using digital footprints.
0:23:50 > 0:23:52An algorithm that can look at millions of people
0:23:52 > 0:23:55and it can look at hundreds of thousands
0:23:55 > 0:23:56or tens of thousands of your likes
0:23:56 > 0:24:01can extract and utilise even those little pieces of information
0:24:01 > 0:24:04and combine it into a very accurate profile.
0:24:04 > 0:24:09You can quite accurately predict mine, and in fact, everybody else's
0:24:09 > 0:24:14personality, based on the things that they've liked?
0:24:14 > 0:24:15That's correct.
0:24:15 > 0:24:18It can also be used to turn your digital footprint
0:24:18 > 0:24:21into a very accurate prediction
0:24:21 > 0:24:23of your intimate traits, such as religiosity,
0:24:23 > 0:24:26political views, personality, intelligence,
0:24:26 > 0:24:30sexual orientation and a bunch of other psychological traits.
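A minimal sketch of the kind of prediction Kosinski describes, under toy assumptions: each person becomes a row of binary like indicators, and a simple model is fitted to predict a questionnaire trait from those likes. The pages, people and labels below are invented purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

pages = ["Lady Gaga", "Poetry", "Hunting", "Philosophy"]

# Rows = people, columns = whether they liked each page (1) or not (0).
X = np.array([
    [1, 1, 0, 1],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 0, 1, 0],
])
# Toy label: 1 = scored "high openness" on a questionnaire, 0 = low.
y = np.array([1, 1, 0, 0])

model = LogisticRegression().fit(X, y)

# Predict the trait for a new person from their likes alone.
new_person = np.array([[1, 1, 0, 0]])  # liked Lady Gaga and Poetry
print(model.predict_proba(new_person)[0, 1])  # estimated P(high openness)
```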
0:24:30 > 0:24:33If I'm logged in, we can maybe see how accurate this actually is.
0:24:33 > 0:24:37So I hit "make prediction" and it's going to try and predict
0:24:37 > 0:24:39my personality from my Facebook page.
0:24:39 > 0:24:41From your Facebook likes.
0:24:41 > 0:24:42According to your likes,
0:24:42 > 0:24:46you're open-minded, liberal and artistic.
0:24:46 > 0:24:49Judges your intelligence to be extremely high.
0:24:49 > 0:24:53- Well done. - Yes. I'm extremely intelligent!
0:24:53 > 0:24:56You're not religious, but if you are religious,
0:24:56 > 0:24:59- most likely you'd be a Catholic. - I was raised a Catholic!
0:24:59 > 0:25:02I can't believe it knows that.
0:25:02 > 0:25:05Because I... I don't say anything about being a Catholic anywhere,
0:25:05 > 0:25:09but I was raised as a Catholic, but I'm not a practising Catholic.
0:25:09 > 0:25:11So it's like...
0:25:11 > 0:25:13It's absolutely spot on.
0:25:13 > 0:25:17Oh, my God! Journalism, but also what did I study?
0:25:17 > 0:25:21Studied history, and I didn't put anything about history in there.
0:25:21 > 0:25:22I think this is one of the things
0:25:22 > 0:25:26that people don't really get about those predictions, that they think,
0:25:26 > 0:25:29look, if I like Lady Gaga on Facebook,
0:25:29 > 0:25:31obviously people will know that I like Lady Gaga,
0:25:31 > 0:25:34or the Government will know that I like Lady Gaga.
0:25:34 > 0:25:36Look, you don't need a rocket scientist
0:25:36 > 0:25:39to look at your Spotify playlist or your Facebook likes
0:25:39 > 0:25:42to figure out that you like Lady Gaga.
0:25:42 > 0:25:47What's really world-changing about those algorithms is that they can
0:25:47 > 0:25:51take your music preferences or your book preferences
0:25:51 > 0:25:55and extract from this seemingly innocent information
0:25:55 > 0:25:59very accurate predictions about your religiosity, leadership potential,
0:25:59 > 0:26:02political views, personality and so on.
0:26:02 > 0:26:05Can you predict people's political persuasions with this?
0:26:05 > 0:26:08In fact, the first dimension here, openness to experience,
0:26:08 > 0:26:11is a very good predictor of political views.
0:26:11 > 0:26:14People scoring high on openness, they tend to be liberal,
0:26:14 > 0:26:19people who score low on openness - we even call them conservative and
0:26:19 > 0:26:22traditional - they tend to vote for conservative candidates.
0:26:22 > 0:26:25What about the potential to manipulate people?
0:26:25 > 0:26:29So, obviously, if you now can use an algorithm to get to know millions
0:26:29 > 0:26:33of people very intimately and then use another algorithm
0:26:33 > 0:26:37to adjust the message that you are sending to them,
0:26:37 > 0:26:40to make it most persuasive,
0:26:40 > 0:26:43obviously gives you a lot of power.
0:26:52 > 0:26:55I'm quite surprised at just how accurate this model could be.
0:26:55 > 0:27:00I mean, I cannot believe that on the basis of a few things
0:27:00 > 0:27:05I just, you know, carelessly clicked that I liked,
0:27:05 > 0:27:10the model was able to work out that I could have been Catholic,
0:27:10 > 0:27:12or had a Catholic upbringing.
0:27:12 > 0:27:18And clearly, this is a very powerful way of understanding people.
0:27:19 > 0:27:22Very exciting possibilities, but I can't help
0:27:22 > 0:27:28fearing that there is that potential, whoever has that power,
0:27:28 > 0:27:31whoever can control that model...
0:27:33 > 0:27:37..will have sort of unprecedented possibilities of manipulating
0:27:37 > 0:27:41what people think, how they behave, what they see,
0:27:41 > 0:27:45whether that's selling things to people or how people vote,
0:27:45 > 0:27:48and that's pretty scary too.
0:27:52 > 0:27:55Our era is defined by political shocks.
0:27:59 > 0:28:02None bigger than the rise of Donald Trump,
0:28:02 > 0:28:05who defied pollsters and the mainstream media
0:28:05 > 0:28:07to win the American presidency.
0:28:08 > 0:28:12Now questions swirl around his use of the American affiliate
0:28:12 > 0:28:14of a British insights company,
0:28:14 > 0:28:17Cambridge Analytica, which uses psychographics.
0:28:25 > 0:28:27I'm in Texas to uncover how far
0:28:27 > 0:28:31Cambridge Analytica's expertise in personality prediction
0:28:31 > 0:28:36played a part in Trump's political triumph,
0:28:36 > 0:28:38and how his revolutionary campaign
0:28:38 > 0:28:42exploited Silicon Valley's social networks.
0:28:44 > 0:28:45Everyone seems to agree
0:28:45 > 0:28:49that Trump ran an exceptional election campaign
0:28:49 > 0:28:52using digital technologies.
0:28:52 > 0:28:56But no-one really knows what they did,
0:28:56 > 0:28:59who they were working with, who was helping them,
0:28:59 > 0:29:03what the techniques they used were.
0:29:03 > 0:29:07So I've come here to try to unravel the mystery.
0:29:09 > 0:29:13The operation inside this unassuming building in San Antonio
0:29:13 > 0:29:18was largely hidden from view, but crucial to Trump's success.
0:29:18 > 0:29:22Since then, I'm the only person to get in here
0:29:22 > 0:29:25to find out what really happened.
0:29:25 > 0:29:27This was our Project Alamo,
0:29:27 > 0:29:30so this was where the digital arm
0:29:30 > 0:29:33of the Trump campaign operation was held.
0:29:33 > 0:29:36Theresa Hong is speaking publicly
0:29:36 > 0:29:40for the first time about her role as digital content director
0:29:40 > 0:29:42for the Trump campaign.
0:29:43 > 0:29:47So, why is it called Project Alamo?
0:29:47 > 0:29:51It was called Project Alamo based on the data, actually,
0:29:51 > 0:29:53that was Cambridge Analytica -
0:29:53 > 0:29:56they came up with the Alamo data set, right?
0:29:56 > 0:30:00So, we just kind of adopted the name Project Alamo.
0:30:00 > 0:30:04It does conjure up sort of images of a battle of some sort.
0:30:04 > 0:30:06Yeah. It kind of was!
0:30:06 > 0:30:09In a sense, you know. Yeah, yeah.
0:30:13 > 0:30:16Project Alamo was so important,
0:30:16 > 0:30:20Donald Trump visited the hundred or so workers based here
0:30:20 > 0:30:22during the campaign.
0:30:22 > 0:30:26Ever since he started tweeting in 2009,
0:30:26 > 0:30:29Trump had grasped the power of social media.
0:30:29 > 0:30:32Now, in the fight of his life,
0:30:32 > 0:30:36the campaign manipulated his Facebook presence.
0:30:36 > 0:30:40Trump's Twitter account, that's all his, he's the one that ran that,
0:30:40 > 0:30:43and I did a lot of his Facebook,
0:30:43 > 0:30:48so I wrote a lot for him, you know, I kind of channelled Mr Trump.
0:30:48 > 0:30:52How do you possibly write a post on Facebook like Donald Trump?
0:30:52 > 0:30:54"Believe me."
0:30:54 > 0:30:58A lot of believe me's, a lot of alsos, a lot of...verys.
0:30:58 > 0:31:01Actually, he was really wonderful to write for, just because
0:31:01 > 0:31:06it was so refreshing, it was so authentic.
0:31:09 > 0:31:11We headed to the heart of the operation.
0:31:14 > 0:31:16Cambridge Analytica was here.
0:31:16 > 0:31:19It was just a line of computers, right?
0:31:19 > 0:31:22This is where their operation was and this was kind of
0:31:22 > 0:31:26the brain of the data, this was the data centre.
0:31:26 > 0:31:30This was the data centre, this was the centre of the data centre.
0:31:30 > 0:31:33Exactly right, yes, it was. Yes, it was.
0:31:33 > 0:31:35Cambridge Analytica were using data
0:31:35 > 0:31:40on around 220 million Americans to target potential donors and voters.
0:31:40 > 0:31:43It was, you know,
0:31:43 > 0:31:48a bunch of card tables that sat here and a bunch of computers and people
0:31:48 > 0:31:51that were behind the computers, monitoring the data.
0:31:51 > 0:31:55We've got to target this state, that state or this universe or whatever.
0:31:55 > 0:31:58So that's what they were doing, they were gathering all the data.
0:31:58 > 0:32:02A "universe" was the name given to a group of voters
0:32:02 > 0:32:05defined by Cambridge Analytica.
0:32:05 > 0:32:07What sort of attributes did these people have
0:32:07 > 0:32:09that they had been able to work out?
0:32:09 > 0:32:12Some of the attributes would be, when was the last time they voted?
0:32:12 > 0:32:14Who did they vote for?
0:32:14 > 0:32:17You know, what kind of car do they drive?
0:32:17 > 0:32:20What kind of things do they look at on the internet?
0:32:20 > 0:32:22What do they stand for?
0:32:22 > 0:32:25I mean, one person might really be about job creation
0:32:25 > 0:32:27and keeping jobs in America, you know.
0:32:27 > 0:32:30Another person, they might resonate with, you know,
0:32:30 > 0:32:33Second Amendment and gun rights, so...
0:32:33 > 0:32:34How would they know that?
0:32:34 > 0:32:38- ..within that...- How would they know that?- That is their secret sauce.
0:32:40 > 0:32:42Did the "secret sauce"
0:32:42 > 0:32:46contain predictions of personality and political leanings?
0:32:46 > 0:32:50Were they able to kind of understand people's personalities?
0:32:50 > 0:32:52Yes, I mean, you know,
0:32:52 > 0:32:56they do specialise in psychographics, right?
0:32:56 > 0:33:01But based on personal interests and based on what a person cares for
0:33:01 > 0:33:03and what means something to them,
0:33:03 > 0:33:06they were able to extract and then we were able to target.
0:33:06 > 0:33:10So, the psychographic stuff, were they using that here?
0:33:10 > 0:33:13Was that part of the model you were working on?
0:33:13 > 0:33:17Well, I mean, towards the end with the persuasion, absolutely.
0:33:17 > 0:33:19I mean, we really were targeting
0:33:19 > 0:33:22on these universes that they had collected.
0:33:22 > 0:33:25I mean, did some of the attributes that you were able to use
0:33:25 > 0:33:29from Cambridge Analytica have a sort of emotional effect, you know,
0:33:29 > 0:33:32happy, sad, anxious, worried - moods, that kind of thing?
0:33:32 > 0:33:34Right, yeah.
0:33:34 > 0:33:38I do know Cambridge Analytica follows something called Ocean
0:33:38 > 0:33:41and it's based on, you know, whether or not, like,
0:33:41 > 0:33:43are you an extrovert?
0:33:43 > 0:33:45Are you more of an introvert?
0:33:45 > 0:33:50Are you fearful? Are you positive?
0:33:50 > 0:33:52So, they did use that.
0:33:54 > 0:33:58Armed with Cambridge Analytica's revolutionary insights,
0:33:58 > 0:34:02the next step in the battle to win over millions of Americans
0:34:02 > 0:34:06was to shape the online messages they would see.
0:34:06 > 0:34:10Now we're going to go into the big kind of bull pen where a lot of
0:34:10 > 0:34:14- the creatives were, and this is where I was as well.- Right.
0:34:14 > 0:34:17For today's modern working-class families,
0:34:17 > 0:34:19the challenge is very, very real.
0:34:19 > 0:34:21With childcare costs...
0:34:21 > 0:34:25Adverts were tailored to particular audiences, defined by data.
0:34:25 > 0:34:28Donald Trump wants to give families the break they deserve.
0:34:28 > 0:34:31This universe right here is one that Cambridge Analytica
0:34:31 > 0:34:34collected data on and identified as working mothers
0:34:34 > 0:34:37that are concerned about childcare,
0:34:37 > 0:34:40and childcare, obviously, that's not going to be, like,
0:34:40 > 0:34:43a war-ridden, you know, destructive ad, right?
0:34:43 > 0:34:44That's more warm and fuzzy,
0:34:44 > 0:34:46and this is what Trump is going to do for you.
0:34:46 > 0:34:49But if you notice, Trump's not speaking.
0:34:49 > 0:34:51He wasn't speaking, he wasn't in it at all.
0:34:51 > 0:34:53Right, Trump wasn't speaking in it.
0:34:53 > 0:34:56That audience there, we wanted a softer approach,
0:34:56 > 0:34:59so this is the type of approach that we would take.
0:34:59 > 0:35:03The campaign made thousands of different versions
0:35:03 > 0:35:05of the same fundraising adverts.
0:35:05 > 0:35:11The design was constantly tweaked to see which version performed best.
0:35:11 > 0:35:15It wasn't uncommon to have about 35,000-45,000 iterations
0:35:15 > 0:35:18of these types of ads every day, right?
0:35:18 > 0:35:22So, you know, it could be
0:35:22 > 0:35:28so subtle that, you know, people wouldn't even notice,
0:35:28 > 0:35:30where you have the green button,
0:35:30 > 0:35:32sometimes a red button would work better
0:35:32 > 0:35:34or a blue button would work better.
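A minimal sketch of the iteration process Hong describes, with invented variant names and click-through figures: generate every combination of a few creative elements, then keep whichever version performs best.

```python
from itertools import product
import random

headlines = ["Help us make America great again", "Families deserve a break"]
buttons = ["green", "red", "blue"]
images = ["family.jpg", "flag.jpg"]

# Every combination of headline, button colour and image is a separate ad.
variants = list(product(headlines, buttons, images))

# Stand-in for real performance data: pretend each variant was shown and
# a click-through rate was recorded for it.
observed_ctr = {v: random.uniform(0.01, 0.05) for v in variants}

best = max(observed_ctr, key=observed_ctr.get)
print(f"{len(variants)} variants tested; best: {best} "
      f"(CTR {observed_ctr[best]:.3f})")
```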
0:35:34 > 0:35:38Now, the voters Cambridge Analytica had targeted
0:35:38 > 0:35:40were bombarded with adverts.
0:35:46 > 0:35:47I may have short-circuited...
0:35:49 > 0:35:51I'm Donald Trump and I approve this message.
0:35:52 > 0:35:57On a typical day, the campaign would run more than 100 different adverts.
0:35:57 > 0:36:02The delivery system was Silicon Valley's vast social networks.
0:36:02 > 0:36:05We had the Facebook and YouTube and Google people,
0:36:05 > 0:36:07they would congregate here.
0:36:07 > 0:36:12Almost all of America's voters could now be reached online in an instant.
0:36:12 > 0:36:16I mean, what were Facebook and Google and YouTube people actually
0:36:16 > 0:36:19- doing here, why were they here? - They were helping us, you know.
0:36:19 > 0:36:24They were basically our kind of hands-on partners
0:36:24 > 0:36:27as far as being able to utilise the platform
0:36:27 > 0:36:29as effectively as possible.
0:36:29 > 0:36:34The Trump campaign spent the lion's share of its advertising budget,
0:36:34 > 0:36:38around $85 million, on Facebook.
0:36:38 > 0:36:42When you're pumping in millions and millions of dollars
0:36:42 > 0:36:44to these social platforms,
0:36:44 > 0:36:46you're going to get white-glove treatment,
0:36:46 > 0:36:48so, they would send people, you know,
0:36:48 > 0:36:51representatives to, you know,
0:36:51 > 0:36:56Project Alamo to ensure that all of our needs were being met.
0:36:56 > 0:36:59The success of Trump's digital strategy was built on
0:36:59 > 0:37:03the effectiveness of Facebook as an advertising medium.
0:37:03 > 0:37:06It's become a very powerful political tool
0:37:06 > 0:37:08that's largely unregulated.
0:37:08 > 0:37:12Without Facebook, we wouldn't have won.
0:37:12 > 0:37:16I mean, Facebook really and truly put us over the edge,
0:37:16 > 0:37:18I mean, Facebook was the medium
0:37:18 > 0:37:21that proved most successful for this campaign.
0:37:25 > 0:37:28Facebook didn't want to meet me, but made it clear that like all
0:37:28 > 0:37:32advertisers on Facebook, political campaigns must ensure
0:37:32 > 0:37:36their ads comply with all applicable laws and regulations.
0:37:36 > 0:37:40The company also said no personally identifiable
0:37:40 > 0:37:42information can be shared with advertising,
0:37:42 > 0:37:46measurement or analytics partners unless people give permission.
0:37:51 > 0:37:52In London, I'm on the trail
0:37:52 > 0:37:56of the data company Cambridge Analytica.
0:37:59 > 0:38:03Ever since the American election, the firm has been under pressure
0:38:03 > 0:38:08to come clean over its use of personality prediction.
0:38:08 > 0:38:12I want to know exactly how the firm used psychographics
0:38:12 > 0:38:16to target voters for the Trump campaign.
0:38:16 > 0:38:18- Alexander.- Hi.
0:38:18 > 0:38:19How do you do? Pleasure.
0:38:19 > 0:38:23VOICEOVER: Alexander Nix's firm first used data to target voters
0:38:23 > 0:38:27in the American presidential elections while working on
0:38:27 > 0:38:31the Ted Cruz campaign for the Republican nomination.
0:38:31 > 0:38:35When Cruz lost, they went to work for Trump.
0:38:35 > 0:38:37I want to start with the Trump campaign.
0:38:37 > 0:38:40Did Cambridge Analytica ever use
0:38:40 > 0:38:44psychometric or psychographic methods in this campaign?
0:38:44 > 0:38:47We left the Cruz campaign in April after the nomination was over.
0:38:47 > 0:38:50We pivoted right across onto the Trump campaign.
0:38:50 > 0:38:53It was about five and a half months before polling.
0:38:53 > 0:38:58And whilst on the Cruz campaign, we were able to invest
0:38:58 > 0:39:01a lot more time into building psychographic models,
0:39:01 > 0:39:04into profiling, using behavioural profiling to understand different
0:39:04 > 0:39:07personality groups and different personality drivers,
0:39:07 > 0:39:10in order to inform our messaging and our creative.
0:39:10 > 0:39:14We simply didn't have the time to employ this level of
0:39:14 > 0:39:17rigorous methodology for Trump.
0:39:17 > 0:39:21For Cruz, Cambridge Analytica built computer models which could crunch
0:39:21 > 0:39:26huge amounts of data on each voter, including psychographic data
0:39:26 > 0:39:30predicting voters' personality types.
0:39:30 > 0:39:32When the firm transferred to the Trump campaign,
0:39:32 > 0:39:34they took data with them.
0:39:34 > 0:39:40Now, there is clearly some legacy psychographics in the data,
0:39:40 > 0:39:44because the data is, um, model data,
0:39:44 > 0:39:46or a lot of it is model data that we had used across
0:39:46 > 0:39:50the last 14, 15 months of campaigning through the midterms,
0:39:50 > 0:39:52and then through the primaries.
0:39:52 > 0:39:57But specifically, did we build specific psychographic models
0:39:57 > 0:39:59for the Trump campaign? No, we didn't.
0:39:59 > 0:40:03So you didn't build specific models for this campaign,
0:40:03 > 0:40:07but it sounds like you did use some element of psychographic modelling,
0:40:07 > 0:40:10as an approach in the Trump campaign?
0:40:10 > 0:40:13Only as a result of legacy data models.
0:40:13 > 0:40:17So, the answer... The answer you are looking for is no.
0:40:17 > 0:40:20The answer I am looking for is the extent to which it was used.
0:40:20 > 0:40:23I mean, legacy... I don't know what that means, legacy data modelling.
0:40:23 > 0:40:26What does that mean for the Trump campaign?
0:40:26 > 0:40:28Well, so we were able to take models we had made previously
0:40:28 > 0:40:30over the last two or three years,
0:40:30 > 0:40:33and integrate those into some of the work we were doing.
0:40:33 > 0:40:34Where did all the information
0:40:34 > 0:40:38to predict voters' personalities come from?
0:40:38 > 0:40:42Very originally, we used a combination of telephone surveys
0:40:42 > 0:40:45and then we used a number of...
0:40:46 > 0:40:50..online platforms...
0:40:50 > 0:40:52for gathering questions.
0:40:52 > 0:40:55As we started to gather more data,
0:40:55 > 0:40:57we started to look at other platforms.
0:40:57 > 0:41:00Such as Facebook, for instance.
0:41:02 > 0:41:05Predicting personality is just one element of the big data
0:41:05 > 0:41:09that companies like Cambridge Analytica are using to revolutionise
0:41:09 > 0:41:11the way democracy works.
0:41:11 > 0:41:13Can you understand, though, why maybe
0:41:13 > 0:41:16some people find it a little bit creepy?
0:41:16 > 0:41:17No, I can't. Quite the opposite.
0:41:17 > 0:41:21I think that the move away from blanket advertising,
0:41:21 > 0:41:24the move towards ever more personalised communication,
0:41:24 > 0:41:25is a very natural progression.
0:41:25 > 0:41:27I think it is only going to increase.
0:41:27 > 0:41:30I find it a little bit weird, if I had a very sort of detailed
0:41:30 > 0:41:35personality assessment of me, based on all sorts of different data
0:41:35 > 0:41:37that I had put all over the internet,
0:41:37 > 0:41:40and that as a result of that, some profile had been made of me
0:41:40 > 0:41:43that was then used to target me for adverts,
0:41:43 > 0:41:45and I didn't really know that any of that had happened.
0:41:45 > 0:41:48That's why some people might find it a little bit sinister.
0:41:48 > 0:41:50Well, you have just said yourself,
0:41:50 > 0:41:52you are putting this data out into the public domain.
0:41:52 > 0:41:56I'm sure that you have a supermarket loyalty card.
0:41:56 > 0:42:01I'm sure you understand the reciprocity that is going on there -
0:42:01 > 0:42:05you get points, and in return, they gather your data
0:42:05 > 0:42:07on your consumer behaviour.
0:42:07 > 0:42:08I mean, we are talking about politics
0:42:08 > 0:42:12and we're talking about shopping. Are they really the same thing?
0:42:12 > 0:42:14The technology is the same.
0:42:14 > 0:42:15In the next ten years,
0:42:15 > 0:42:18the sheer volumes of data that are going to be available,
0:42:18 > 0:42:20that are going to be driving all sorts of things
0:42:20 > 0:42:23including marketing and communications,
0:42:23 > 0:42:25is going to be a paradigm shift from where we are now
0:42:25 > 0:42:27and it's going to be a revolution,
0:42:27 > 0:42:29and that is the way the world is moving.
0:42:29 > 0:42:31And, you know, I think,
0:42:31 > 0:42:35whether you like it or not, it, it...
0:42:35 > 0:42:37it is an inevitable fact.
0:42:39 > 0:42:42To the new data barons, this is all just business.
0:42:42 > 0:42:46But to the rest of us, it's more than that.
0:42:49 > 0:42:52By the time of Donald Trump's inauguration,
0:42:52 > 0:42:55it was accepted that his mastery of data and social media
0:42:55 > 0:42:59had made him the most powerful man in the world.
0:42:59 > 0:43:04We will make America great again.
0:43:04 > 0:43:06The election of Donald Trump
0:43:06 > 0:43:10was greeted with barely concealed fury in Silicon Valley.
0:43:10 > 0:43:13But Facebook and other tech companies had made
0:43:13 > 0:43:16millions of dollars by helping to make it happen.
0:43:16 > 0:43:19Their power as advertising platforms
0:43:19 > 0:43:21had been exploited by a politician
0:43:21 > 0:43:24with a very different view of the world.
0:43:27 > 0:43:30But Facebook's problems were only just beginning.
0:43:32 > 0:43:34Another phenomenon of the election
0:43:34 > 0:43:37was plunging the tech titan into crisis.
0:43:43 > 0:43:46Fake news, often targeting Hillary Clinton,
0:43:46 > 0:43:49had dominated the election campaign.
0:43:52 > 0:43:55Now, the departing President turned on the social media giant
0:43:55 > 0:43:58he had once embraced.
0:43:59 > 0:44:01In an age where, uh...
0:44:01 > 0:44:06there is so much active misinformation,
0:44:06 > 0:44:08and it is packaged very well,
0:44:08 > 0:44:11and it looks the same when you see it on a Facebook page,
0:44:11 > 0:44:13or you turn on your television,
0:44:13 > 0:44:17if everything, uh...
0:44:17 > 0:44:20seems to be the same,
0:44:20 > 0:44:22and no distinctions are made,
0:44:22 > 0:44:25then we won't know what to protect.
0:44:25 > 0:44:27We won't know what to fight for.
0:44:33 > 0:44:36Fake news had provoked a storm of criticism
0:44:36 > 0:44:39over Facebook's impact on democracy.
0:44:39 > 0:44:42Its founder, Zuck, claimed it was extremely unlikely
0:44:42 > 0:44:46fake news had changed the election's outcome.
0:44:46 > 0:44:49But he didn't address why it had spread like wildfire
0:44:49 > 0:44:51across the platform.
0:44:51 > 0:44:53- Jamie.- Hi, Jamie.
0:44:53 > 0:44:55VOICEOVER: Meet Jeff Hancock,
0:44:55 > 0:44:58a psychologist who has investigated a hidden aspect of Facebook
0:44:58 > 0:45:03that helps explain how the platform became weaponised in this way.
0:45:06 > 0:45:11It turns out the power of Facebook to affect our emotions is key,
0:45:11 > 0:45:13something that had been uncovered
0:45:13 > 0:45:18in an experiment the company itself had run in 2012.
0:45:18 > 0:45:19It was one of the earlier, you know,
0:45:19 > 0:45:22what we would call big data, social science-type studies.
0:45:26 > 0:45:30The newsfeeds of nearly 700,000 users were secretly manipulated
0:45:30 > 0:45:35so they would see fewer positive or negative posts.
0:45:35 > 0:45:38Jeff helped interpret the results.
0:45:38 > 0:45:40So, what did you actually find?
0:45:40 > 0:45:43We found that if you were one of those people
0:45:43 > 0:45:46that were seeing less negative emotion words in their posts,
0:45:46 > 0:45:49then you would write with less negative emotion in your own posts,
0:45:49 > 0:45:51and more positive emotion.
0:45:51 > 0:45:55- This is emotional contagion. - And what about positive posts?
0:45:55 > 0:45:57Did that have the same kind of effect?
0:45:57 > 0:45:59Yeah, we saw the same effect
0:45:59 > 0:46:03when positive emotion worded posts were decreased.
0:46:03 > 0:46:06We saw the same thing. So I would produce fewer positive
0:46:06 > 0:46:08emotion words, and more negative emotion words.
0:46:08 > 0:46:11And that is consistent with emotional contagion theory.
0:46:11 > 0:46:15Basically, we were showing that people were writing in a way
0:46:15 > 0:46:18that was matching the emotion that they were seeing
0:46:18 > 0:46:21in the Facebook news feed.
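A minimal sketch of how emotion in posts can be measured in the way Hancock describes, using tiny placeholder word lists; the published study used far larger emotion-word dictionaries.

```python
# Placeholder word lists for illustration only.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "angry", "awful", "hate"}

def emotion_rates(post):
    """Return (positive, negative) emotion-word rates for one post."""
    words = post.lower().split()
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words) / len(words)
    neg = sum(w in NEGATIVE for w in words) / len(words)
    return pos, neg

print(emotion_rates("I love this wonderful city"))  # more positive words
print(emotion_rates("what an awful sad day"))       # more negative words
```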
0:46:21 > 0:46:27Emotion draws people to fake news, and then supercharges its spread.
0:46:27 > 0:46:29So the more emotional the content,
0:46:29 > 0:46:31the more likely it is to spread online.
0:46:31 > 0:46:34- Is that true?- Yeah, the more intense the emotion in content,
0:46:34 > 0:46:38the more likely it is to spread, to go viral.
0:46:38 > 0:46:42It doesn't matter whether it is sad or happy, like negative or positive,
0:46:42 > 0:46:44the more important thing is how intense the emotion is.
0:46:44 > 0:46:47The process of emotional contagion
0:46:47 > 0:46:50helps explain why fake news has spread
0:46:50 > 0:46:52so far across social media.
0:46:55 > 0:46:58Advertisers have been well aware of how emotion can be used
0:46:58 > 0:46:59to manipulate people's attention.
0:46:59 > 0:47:02But now we are seeing this with a whole host of other actors,
0:47:02 > 0:47:04some of them nefarious.
0:47:04 > 0:47:06So, other state actors trying to influence your election,
0:47:06 > 0:47:08people trying to manipulate the media
0:47:08 > 0:47:10are using emotional contagion,
0:47:10 > 0:47:14and also using those original platforms like Facebook
0:47:14 > 0:47:19to accomplish other objectives, like sowing distrust,
0:47:19 > 0:47:21or creating false beliefs.
0:47:21 > 0:47:23You see it sort of, maybe weaponised
0:47:23 > 0:47:26and being used at scale.
0:47:35 > 0:47:37I'm in Germany,
0:47:37 > 0:47:39where the presence of more than a million new refugees
0:47:39 > 0:47:43has caused tension across the political spectrum.
0:47:43 > 0:47:44Earlier this year,
0:47:44 > 0:47:49a story appeared on the American alt-right news site Breitbart
0:47:49 > 0:47:54about the torching of a church in Dortmund by a North African mob.
0:47:54 > 0:47:57It was widely shared on social media -
0:47:57 > 0:47:59but it wasn't true.
0:48:04 > 0:48:06The problem with social networks
0:48:06 > 0:48:08is that all information is treated equally.
0:48:08 > 0:48:12So you have good, honest, accurate information
0:48:12 > 0:48:15sitting alongside and treated equally to
0:48:15 > 0:48:18lies and propaganda.
0:48:18 > 0:48:21And the difficulty for citizens is that it can be very hard
0:48:21 > 0:48:24to tell the difference between the two.
0:48:24 > 0:48:30But one data scientist here has discovered an even darker side
0:48:30 > 0:48:32to the way Facebook is being manipulated.
0:48:33 > 0:48:37We created a network of all the likes
0:48:37 > 0:48:40in the refugee debate on Facebook.
0:48:42 > 0:48:46Professor Simon Hegelich has found evidence the debate
0:48:46 > 0:48:47about refugees on Facebook
0:48:47 > 0:48:52is being skewed by anonymous political forces.
0:48:52 > 0:48:56So you are trying to understand who is liking pages?
0:48:56 > 0:48:58Exactly.
0:48:58 > 0:49:01One statistic among many used by Facebook to rank stories in
0:49:01 > 0:49:04your news feed is the number of likes they get.
0:49:04 > 0:49:08The red area of this chart shows a network of people liking
0:49:08 > 0:49:12anti-refugee posts on Facebook.
0:49:12 > 0:49:18Most of the likes are sent by just a handful of people - 25 people are...
0:49:20 > 0:49:24..responsible for more than 90% of all these likes.
0:49:24 > 0:49:2625 Facebook accounts each liked
0:49:26 > 0:49:30more than 30,000 comments over six months.
0:49:30 > 0:49:36These hyperactive accounts could be run by real people, or software.
0:49:36 > 0:49:38I think the rationale behind this
0:49:38 > 0:49:42is that they tried to make their content viral.
0:49:42 > 0:49:46They think if we like all this stuff, then Facebook,
0:49:46 > 0:49:47the algorithm of Facebook,
0:49:47 > 0:49:51will pick up our content and show it to other users,
0:49:51 > 0:49:54and then the whole world sees that refugees are bad
0:49:54 > 0:49:57and that they shouldn't come to Germany.
0:49:57 > 0:50:02This is evidence the number of likes on Facebook can be easily gamed
0:50:02 > 0:50:05as part of an effort to try to influence the prominence
0:50:05 > 0:50:08of anti-refugee content on the site.
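A minimal sketch of the concentration measurement Hegelich describes, with invented sample data: from a log of which account liked which comment, compute what share of all likes the most active accounts produce.

```python
from collections import Counter

# Each entry: (account_id, liked_comment_id) - invented sample data.
likes = [
    ("acct_1", "c1"), ("acct_1", "c2"), ("acct_1", "c3"),
    ("acct_2", "c1"), ("acct_2", "c4"),
    ("acct_3", "c5"),
]

def top_share(likes, n):
    """Fraction of all likes sent by the n most active accounts."""
    per_account = Counter(account for account, _ in likes)
    top = sum(count for _, count in per_account.most_common(n))
    return top / len(likes)

# e.g. what share of all likes do the top 2 accounts produce?
print(f"{top_share(likes, 2):.0%}")
```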
0:50:08 > 0:50:10Does this worry you, though?
0:50:10 > 0:50:14It's definitely changing the structure of public opinion.
0:50:14 > 0:50:18Democracy is built on public opinion,
0:50:18 > 0:50:23so such a change definitely has to change the way democracy works.
0:50:26 > 0:50:30Facebook told us they are working to disrupt the economic incentives
0:50:30 > 0:50:34behind false news, removing tens of thousands of fake accounts,
0:50:34 > 0:50:36and building new products
0:50:36 > 0:50:39to identify and limit the spread of false news.
0:50:39 > 0:50:44Zuck is trying to hold the line that his company is not a publisher,
0:50:44 > 0:50:49based on that obscure legal clause from the 1990s.
0:50:49 > 0:50:51You know, Facebook is a new kind of platform.
0:50:51 > 0:50:54You know, it's not a traditional technology company,
0:50:54 > 0:50:57it's not a traditional media company.
0:50:57 > 0:50:59Um, we don't write the news that people...
0:50:59 > 0:51:01that people read on the platform.
0:51:03 > 0:51:08In Germany, one man is taking on the tech gods over their responsibility
0:51:08 > 0:51:10for what appears on their sites.
0:51:12 > 0:51:17Ulrich Kelber is a minister in the Justice Department.
0:51:17 > 0:51:20He was once called a "Jewish pig" in a post on Facebook.
0:51:20 > 0:51:24He reported it to the company, who refused to delete it.
0:51:24 > 0:51:26IN GERMAN:
0:51:43 > 0:51:45Under a new law,
0:51:45 > 0:51:47Facebook and other social networking sites
0:51:47 > 0:51:50could be fined up to 50 million euros
0:51:50 > 0:51:55if they fail to take down hate speech posts that are illegal.
0:51:55 > 0:51:56Is this too much?
0:51:56 > 0:51:59Is this a little bit Draconian?
0:52:20 > 0:52:23This is the first time a government has challenged
0:52:23 > 0:52:27the principles underlying a Silicon Valley platform.
0:52:27 > 0:52:29Once this thread has been pulled,
0:52:29 > 0:52:32their whole world could start to unravel.
0:52:32 > 0:52:35They must find you a pain.
0:52:35 > 0:52:37I mean, this must be annoying for them.
0:52:37 > 0:52:38It must be.
0:52:47 > 0:52:51Facebook told us they share the goal of fighting hate speech,
0:52:51 > 0:52:54and they have made substantial progress
0:52:54 > 0:52:56in removing illegal content,
0:52:56 > 0:53:00adding 3,000 people to their community operations team.
0:53:04 > 0:53:08Facebook now connects more than two billion people around the world,
0:53:08 > 0:53:12including more and more voters in the West.
0:53:12 > 0:53:14When you think of what Facebook has become,
0:53:14 > 0:53:18in such a short space of time, it's actually pretty bizarre.
0:53:18 > 0:53:20I mean, this was just a platform
0:53:20 > 0:53:23for sharing photos or chatting to friends,
0:53:23 > 0:53:27but in less than a decade, it has become a platform that has
0:53:27 > 0:53:30dramatic implications for how our democracy works.
0:53:32 > 0:53:35Old structures of power are falling away.
0:53:35 > 0:53:40Social media is giving ordinary people access to huge audiences.
0:53:40 > 0:53:43And politics is changing as a result.
0:53:46 > 0:53:47Significant numbers were motivated
0:53:47 > 0:53:50through social media to vote for Brexit.
0:53:52 > 0:53:54Jeremy Corbyn lost the general election,
0:53:54 > 0:53:58but enjoyed unexpected gains,
0:53:58 > 0:54:03an achievement he put down in part to the power of social media.
0:54:06 > 0:54:12One influential force in this new world is found here in Bristol.
0:54:12 > 0:54:15The Canary is an online political news outlet.
0:54:15 > 0:54:18During the election campaign,
0:54:18 > 0:54:22their stories got more than 25 million hits on a tiny budget.
0:54:24 > 0:54:27So, how much of your readership comes through Facebook?
0:54:27 > 0:54:30It's really high, it's about 80%.
0:54:30 > 0:54:33So it is an enormously important distribution mechanism.
0:54:33 > 0:54:35And free.
0:54:35 > 0:54:39The Canary's presentation of its pro-Corbyn news
0:54:39 > 0:54:43is tailored to social media.
0:54:43 > 0:54:45I have noticed that a lot of your headlines, or your pictures,
0:54:45 > 0:54:47they are quite emotional.
0:54:47 > 0:54:50They are sort of pictures of sad Theresa May,
0:54:50 > 0:54:52or delighted Jeremy Corbyn.
0:54:52 > 0:54:54Is that part of the purpose of it all?
0:54:54 > 0:54:57Yeah, it has to be. We are out there trying to have a conversation
0:54:57 > 0:55:00with a lot of people, so it is on us to be compelling.
0:55:00 > 0:55:04Human beings work on facts, but they also work on gut instinct,
0:55:04 > 0:55:09they work on emotions, feelings, and fidelity and community.
0:55:09 > 0:55:10All of these issues.
0:55:10 > 0:55:15Social media enables those with few resources to compete
0:55:15 > 0:55:20with the mainstream media for the attention of millions of us.
0:55:20 > 0:55:23You put up quite sort of clickbait-y stories, you know,
0:55:23 > 0:55:26the headlines are there to get clicks.
0:55:26 > 0:55:28- Yeah.- Um... Is that a fair criticism?
0:55:28 > 0:55:30Of course they're there to get clicks.
0:55:30 > 0:55:32We don't want to have a conversation with ten people.
0:55:32 > 0:55:34You can't change the world talking to ten people.
0:55:42 > 0:55:48The tech gods are giving all of us the power to influence the world.
0:55:48 > 0:55:50Connectivity and access... Connect everyone in the world...
0:55:50 > 0:55:51Make the world more open and connected...
0:55:51 > 0:55:54You connect people over time... You get people connectivity...
0:55:54 > 0:55:57That's the mission. That's what I care about.
0:55:57 > 0:56:01Social media's unparalleled power to persuade,
0:56:01 > 0:56:03first developed for advertisers,
0:56:03 > 0:56:06is now being exploited by political forces of all kinds.
0:56:06 > 0:56:11Grassroots movements are regaining their power,
0:56:11 > 0:56:13challenging political elites.
0:56:13 > 0:56:19Extremists are discovering new ways to stoke hatred and spread lies.
0:56:19 > 0:56:21And wealthy political parties are developing the ability
0:56:21 > 0:56:24to manipulate our thoughts and feelings
0:56:24 > 0:56:27using powerful psychological tools.
0:56:27 > 0:56:32All of which is leading to a world of unexpected political opportunity,
0:56:32 > 0:56:33and turbulence.
0:56:33 > 0:56:37I think the people that connected the world really believed that
0:56:37 > 0:56:42somehow, just by us being connected, our politics would be better.
0:56:42 > 0:56:48But the world is changing in ways that they never imagined,
0:56:48 > 0:56:51and that they are probably not happy about any more.
0:56:51 > 0:56:53But in truth,
0:56:53 > 0:56:56they are no more in charge of this technology than any of us are now.
0:56:56 > 0:57:00Silicon Valley's philosophy is called disruption.
0:57:00 > 0:57:02Breaking down the way we do things
0:57:02 > 0:57:06and using technology to improve the world.
0:57:06 > 0:57:08In this series, I have seen how
0:57:08 > 0:57:11sharing platforms like Uber and Airbnb
0:57:11 > 0:57:14are transforming our cities.
0:57:18 > 0:57:21And how automation and artificial intelligence
0:57:21 > 0:57:24threaten to destroy millions of jobs.
0:57:24 > 0:57:27Within 30 years, half of humanity won't have a job.
0:57:27 > 0:57:29It could get ugly, there could be revolution.
0:57:29 > 0:57:34Now, the technology to connect the world unleashed by a few billionaire
0:57:34 > 0:57:39entrepreneurs is having a dramatic influence on our politics.
0:57:39 > 0:57:42The people who are responsible for building this technology,
0:57:42 > 0:57:47for unleashing this disruption onto all of us,
0:57:47 > 0:57:51don't ever feel like they are responsible for the consequences
0:57:51 > 0:57:53of any of that.
0:57:53 > 0:57:58They retain this absolute religious faith that technology and
0:57:58 > 0:58:03connectivity is always going to make things turn out for the best.
0:58:03 > 0:58:05And it doesn't matter what happens,
0:58:05 > 0:58:08it doesn't matter how much that's proven not to be the case,
0:58:08 > 0:58:10they still believe.
0:58:22 > 0:58:25How did Silicon Valley become so influential?
0:58:25 > 0:58:28The Open University has produced an interactive timeline
0:58:28 > 0:58:30exploring the history of this place.
0:58:30 > 0:58:32To find out more, visit...
0:58:36 > 0:58:40..and follow the links to the Open University.