Line | From | To | |
---|---|---|---|
It was the biggest political earthquake of the century. | 0:00:04 | 0:00:08 | |
We will make America great again! | 0:00:08 | 0:00:12 | |
But just how did Donald Trump defy the predictions | 0:00:15 | 0:00:17 | |
of political pundits and pollsters? | 0:00:17 | 0:00:21 | |
The secret lies here, in San Antonio, Texas. | 0:00:21 | 0:00:25 | |
This was our Project Alamo. | 0:00:27 | 0:00:29 | |
This is where the digital arm of the Trump campaign operation was held. | 0:00:29 | 0:00:35 | |
This is the extraordinary story of two men | 0:00:37 | 0:00:40 | |
with two very different views of the world. | 0:00:40 | 0:00:43 | |
We will build the wall... | 0:00:44 | 0:00:46 | |
The path forward is to connect more, not less. | 0:00:46 | 0:00:49 | |
And how Facebook's Mark Zuckerberg inadvertently helped Donald Trump | 0:00:49 | 0:00:54 | |
become the most powerful man on the planet. | 0:00:54 | 0:00:57 | |
Without Facebook, we wouldn't have won. | 0:00:57 | 0:01:00 | |
I mean, Facebook really and truly put us over the edge. | 0:01:00 | 0:01:04 | |
With their secret algorithms and online tracking, | 0:01:04 | 0:01:08 | |
social media companies know more about us than anyone. | 0:01:08 | 0:01:12 | |
So you can predict mine and everybody else's personality | 0:01:13 | 0:01:16 | |
based on the things that they've liked? | 0:01:16 | 0:01:19 | |
That's correct. | 0:01:19 | 0:01:20 | |
A very accurate prediction of your intimate traits such as | 0:01:20 | 0:01:24 | |
religiosity, political views, intelligence, sexual orientation. | 0:01:24 | 0:01:28 | |
Now, this power is transforming politics. | 0:01:28 | 0:01:31 | |
Can you understand though why maybe | 0:01:31 | 0:01:33 | |
some people find it a little bit creepy? | 0:01:33 | 0:01:36 | |
No, I can't - quite the opposite. That is the way the world is moving. | 0:01:36 | 0:01:38 | |
Whether you like it or not, it's an inevitable fact. | 0:01:38 | 0:01:42 | |
Social media can bring politics closer to the people, | 0:01:42 | 0:01:46 | |
but its destructive power | 0:01:46 | 0:01:48 | |
is creating a new and unpredictable world. | 0:01:48 | 0:01:51 | |
This is the story of how Silicon Valley's mission to connect | 0:01:51 | 0:01:55 | |
all of us is disrupting politics, | 0:01:55 | 0:01:59 | |
plunging us into a world of political turbulence | 0:01:59 | 0:02:02 | |
that no-one can control. | 0:02:02 | 0:02:04 | |
It might not look like it, but anger is building in Silicon Valley. | 0:02:20 | 0:02:24 | |
It's usually pretty quiet around here. | 0:02:24 | 0:02:27 | |
But not today. | 0:02:27 | 0:02:29 | |
Every day you wake up and you wonder what's going to be today's grief, | 0:02:29 | 0:02:34 | |
brought by certain politicians and leaders in the world. | 0:02:34 | 0:02:37 | |
This is a very unusual demonstration. | 0:02:37 | 0:02:41 | |
I've been to loads of demonstrations, | 0:02:41 | 0:02:44 | |
but these aren't the people | 0:02:44 | 0:02:46 | |
that usually go to demonstrations. | 0:02:46 | 0:02:48 | |
In some ways, these are the winners of society. | 0:02:48 | 0:02:52 | |
This is the tech community in Silicon Valley. | 0:02:52 | 0:02:55 | |
Some of the wealthiest people in the world. | 0:02:55 | 0:02:59 | |
And they're here protesting and demonstrating... | 0:02:59 | 0:03:05 | |
against what they see as the kind of changing world | 0:03:05 | 0:03:07 | |
that they don't like. | 0:03:07 | 0:03:09 | |
The problem, of course, is the election of Donald Trump. | 0:03:09 | 0:03:13 | |
Every day, I'm sure, you think, | 0:03:15 | 0:03:17 | |
"I could be a part of resisting those efforts | 0:03:17 | 0:03:20 | |
"to mess things up for the rest of us." | 0:03:20 | 0:03:22 | |
Trump came to power promising to control immigration... | 0:03:22 | 0:03:26 | |
We will build the wall, 100%. | 0:03:26 | 0:03:30 | |
..and to disengage from the world. | 0:03:30 | 0:03:33 | |
From this day forward, it's going to be | 0:03:33 | 0:03:37 | |
only America first. | 0:03:37 | 0:03:40 | |
America first. | 0:03:42 | 0:03:44 | |
Now, Silicon Valley is mobilising against him. | 0:03:46 | 0:03:51 | |
We are seeing this explosion of political activism, you know, | 0:03:51 | 0:03:54 | |
all through the US and in Europe. | 0:03:54 | 0:03:56 | |
Before he became an activist, | 0:03:56 | 0:03:59 | |
Dex Torricke-Barton was speech writer to the chairman of Google | 0:03:59 | 0:04:03 | |
and the founder of Facebook. | 0:04:03 | 0:04:05 | |
It's a moment when people who believe in this global vision, | 0:04:07 | 0:04:10 | |
as opposed to the nationalist vision of the world, | 0:04:10 | 0:04:12 | |
who believe in a world that isn't about protectionism, | 0:04:12 | 0:04:15 | |
whether it's data or whether it's about trade, | 0:04:15 | 0:04:17 | |
they're coming to stand up and to mobilise in response to that. | 0:04:17 | 0:04:20 | |
Because it feels like the whole of Silicon Valley | 0:04:20 | 0:04:23 | |
-has been slightly taken by surprise by what's happening. -Absolutely. | 0:04:23 | 0:04:26 | |
But these are the smartest minds in the world. | 0:04:26 | 0:04:28 | |
-Yeah. -With the most amazing data models and polling. | 0:04:28 | 0:04:30 | |
The smartest minds in the world often can be very, | 0:04:30 | 0:04:33 | |
very ignorant of the things that are going on in the world. | 0:04:33 | 0:04:36 | |
The tech god with the most ambitious global vision is Dex's old boss - | 0:04:36 | 0:04:41 | |
Facebook founder Mark Zuckerberg. | 0:04:41 | 0:04:44 | |
One word captures the world Zuck, as he's known here, | 0:04:44 | 0:04:48 | |
is trying to build. | 0:04:48 | 0:04:51 | |
Connectivity and access. If we connected them... | 0:04:51 | 0:04:53 | |
You give people connectivity... That's the mission. | 0:04:53 | 0:04:55 | |
Connecting with their friends... You connect people over time... | 0:04:55 | 0:04:58 | |
Connect everyone in the world... I'm really optimistic about that. | 0:04:58 | 0:05:00 | |
Make the world more open and connected, that's what I care about. | 0:05:00 | 0:05:03 | |
This is part of the critical enabling infrastructure for the world. | 0:05:03 | 0:05:06 | |
Thank you, guys. | 0:05:06 | 0:05:08 | |
What's Mark Zuckerberg worried about most? | 0:05:08 | 0:05:10 | |
Well, you know, Mark has dedicated his life to connecting, you know, | 0:05:10 | 0:05:13 | |
the world. You know, this is something that he really, you know, | 0:05:13 | 0:05:16 | |
cares passionately about. | 0:05:16 | 0:05:17 | |
And, you know, as I said, you know, | 0:05:17 | 0:05:19 | |
the same worldview and set of policies that, you know, | 0:05:19 | 0:05:22 | |
we'll build walls here, | 0:05:22 | 0:05:24 | |
we'll build walls against the sharing of information | 0:05:24 | 0:05:26 | |
and building those kind of, you know, networks. | 0:05:26 | 0:05:29 | |
It's bigger than just the tech. | 0:05:29 | 0:05:30 | |
-It is. -It's about the society that Silicon Valley also wants to create? | 0:05:30 | 0:05:33 | |
Absolutely. | 0:05:33 | 0:05:34 | |
The tech gods believe the election | 0:05:34 | 0:05:36 | |
of Donald Trump threatens their vision | 0:05:36 | 0:05:39 | |
of a globalised world. | 0:05:39 | 0:05:41 | |
But in a cruel twist, | 0:05:43 | 0:05:44 | |
is it possible their mission to connect the world | 0:05:44 | 0:05:47 | |
actually helped bring him to power? | 0:05:47 | 0:05:50 | |
The question I have is whether the revolution brought about | 0:05:50 | 0:05:53 | |
by social media companies like Facebook | 0:05:53 | 0:05:56 | |
has actually led to the political changes in the world | 0:05:56 | 0:06:00 | |
that these guys are so worried about. | 0:06:00 | 0:06:03 | |
To answer that question, | 0:06:06 | 0:06:08 | |
you have to understand how the tech titans of Silicon Valley | 0:06:08 | 0:06:12 | |
rose to power. | 0:06:12 | 0:06:15 | |
For that, you have to go back 20 years | 0:06:15 | 0:06:18 | |
to a time when the online world was still in its infancy. | 0:06:18 | 0:06:22 | |
MUSIC: Rock N Roll Star by Oasis | 0:06:22 | 0:06:25 | |
There were fears the new internet was like the Wild West, | 0:06:27 | 0:06:31 | |
anarchic and potentially harmful. | 0:06:31 | 0:06:34 | |
Today our world is being remade, yet again, by an information revolution. | 0:06:34 | 0:06:40 | |
Changing the way we work, the way we live, | 0:06:40 | 0:06:43 | |
the way we relate to each other. | 0:06:43 | 0:06:46 | |
The Telecommunications Act of 1996 | 0:06:46 | 0:06:49 | |
was designed to civilise the internet, | 0:06:49 | 0:06:51 | |
including protecting children from pornography. | 0:06:51 | 0:06:56 | |
Today with the stroke of a pen, our laws will catch up with our future. | 0:06:56 | 0:07:00 | |
But buried deep within the act was a secret whose impact no-one foresaw. | 0:07:00 | 0:07:05 | |
Jeremy? | 0:07:14 | 0:07:15 | |
Jamie. How are you doing? | 0:07:15 | 0:07:16 | |
Nice to meet you. | 0:07:16 | 0:07:19 | |
VOICEOVER: Jeremy Malcolm is an analyst | 0:07:19 | 0:07:21 | |
at the Electronic Frontier Foundation, | 0:07:21 | 0:07:23 | |
a civil liberties group for the digital age. | 0:07:23 | 0:07:26 | |
Much of Silicon Valley's accelerated growth in the last two decades | 0:07:28 | 0:07:32 | |
has been enabled by one clause in the legislation. | 0:07:32 | 0:07:36 | |
Hidden away in the middle of that is this Section 230. | 0:07:37 | 0:07:40 | |
-What's the key line? -It literally just says, | 0:07:40 | 0:07:42 | |
no provider or user of an interactive computer service | 0:07:42 | 0:07:45 | |
shall be treated as the publisher or speaker | 0:07:45 | 0:07:48 | |
of any information provided | 0:07:48 | 0:07:50 | |
by another information content provider. That's it. | 0:07:50 | 0:07:53 | |
So what that basically means is, if you're an internet platform, | 0:07:53 | 0:07:56 | |
you don't get treated as the publisher or speaker | 0:07:56 | 0:07:58 | |
of something that your users say using your platform. | 0:07:58 | 0:08:01 | |
If the user says something online that is, say, defamatory, | 0:08:01 | 0:08:05 | |
the platform that they communicate on | 0:08:05 | 0:08:07 | |
isn't going to be held responsible for it. | 0:08:07 | 0:08:09 | |
And the user, of course, can be held directly responsible. | 0:08:09 | 0:08:13 | |
How important is this line for social media companies today? | 0:08:13 | 0:08:19 | |
I think if we didn't have this, we probably wouldn't have | 0:08:19 | 0:08:22 | |
the same kind of social media companies that we have today. | 0:08:22 | 0:08:25 | |
They wouldn't be willing to take on the risk of having so much | 0:08:25 | 0:08:28 | |
-unfettered discussion. -It's key to the internet's freedom, really? | 0:08:28 | 0:08:33 | |
We wouldn't have the internet of today without this. | 0:08:33 | 0:08:36 | |
And so, if we are going to make any changes to it, | 0:08:36 | 0:08:39 | |
we have to be really, really careful. | 0:08:39 | 0:08:41 | |
These 26 words changed the world. | 0:08:43 | 0:08:46 | |
They allowed a new kind of business to spring up - | 0:08:50 | 0:08:53 | |
online platforms that became the internet giants of today. | 0:08:53 | 0:08:58 | |
Facebook, Google, YouTube - they encouraged users to upload content, | 0:08:58 | 0:09:02 | |
often things about their lives or moments that mattered to them, | 0:09:02 | 0:09:06 | |
onto their sites for free. | 0:09:06 | 0:09:09 | |
And in exchange, they got to hoard all of that data | 0:09:09 | 0:09:13 | |
but without any real responsibility | 0:09:13 | 0:09:15 | |
for the effects of the content that people were posting. | 0:09:15 | 0:09:19 | |
Hundreds of millions of us flocked to these new sites, | 0:09:22 | 0:09:25 | |
putting more of our lives online. | 0:09:25 | 0:09:29 | |
At first, the tech firms couldn't figure out | 0:09:29 | 0:09:32 | |
how to turn that data into big money. | 0:09:32 | 0:09:35 | |
But that changed when a secret within that data was unlocked. | 0:09:35 | 0:09:40 | |
Antonio, Jamie. | 0:09:40 | 0:09:42 | |
'A secret Antonio Garcia Martinez helped reveal at Facebook.' | 0:09:42 | 0:09:46 | |
Tell me a bit about your time at Facebook. | 0:09:49 | 0:09:51 | |
Well, that was interesting. | 0:09:51 | 0:09:53 | |
I was what's called a product manager for ads targeting. | 0:09:53 | 0:09:56 | |
That means basically taking your data and using it to basically | 0:09:56 | 0:09:58 | |
make money on Facebook, to monetise Facebook's data. | 0:09:58 | 0:10:02 | |
If you go browse the internet or buy stuff in stores or whatever, | 0:10:02 | 0:10:04 | |
and then you see ads related to all that stuff | 0:10:04 | 0:10:06 | |
inside Facebook - I created that. | 0:10:06 | 0:10:08 | |
Facebook offers advertisers ways | 0:10:10 | 0:10:12 | |
to target individual users of the site with adverts. | 0:10:12 | 0:10:15 | |
It can be driven by data about how we use the platform. | 0:10:15 | 0:10:21 | |
Here's some examples of what's data for Facebook that makes money. | 0:10:21 | 0:10:24 | |
What you've liked on Facebook, links that you shared, | 0:10:24 | 0:10:26 | |
who you happen to know on Facebook, for example. | 0:10:26 | 0:10:29 | |
Where you've used Facebook, what devices, your iPad, your work computer, | 0:10:29 | 0:10:32 | |
your home computer. In the case of Amazon, | 0:10:32 | 0:10:34 | |
it's obviously what you've purchased. | 0:10:34 | 0:10:35 | |
In the case of Google, it's what you searched for. | 0:10:35 | 0:10:38 | |
How do they turn me... I like something on Facebook, | 0:10:38 | 0:10:40 | |
and I share a link on Facebook, how could they turn that | 0:10:40 | 0:10:43 | |
into something that another company would care about? | 0:10:43 | 0:10:46 | |
There is what's called a targeting system, | 0:10:46 | 0:10:48 | |
and so the advertiser can actually go in and specify, | 0:10:48 | 0:10:50 | |
I want people who are within this city and who have liked BMW or Burberry, for example. | 0:10:50 | 0:10:54 | |
So an advertiser pays Facebook and says, | 0:10:54 | 0:10:56 | |
-I want these sorts of people? -That's effectively it, that's right. | 0:10:56 | 0:11:00 | |
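What Garcia Martinez calls the targeting system is, at its core, a filter over user attributes. A minimal sketch of that idea in Python, using invented records rather than Facebook's actual schema or API:

```python
# Hypothetical user records; the real platform's schema and scale differ.
users = [
    {"id": 1, "city": "London", "likes": {"BMW", "Oasis"}},
    {"id": 2, "city": "London", "likes": {"Burberry"}},
    {"id": 3, "city": "Leeds",  "likes": {"BMW"}},
]

def build_audience(users, city, any_of_likes):
    """Return users in the given city who have liked any of the given pages."""
    return [u for u in users
            if u["city"] == city and u["likes"] & set(any_of_likes)]

# An advertiser asks for people in this city who have liked BMW or Burberry.
audience = build_audience(users, "London", {"BMW", "Burberry"})
print([u["id"] for u in audience])  # -> [1, 2]
```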
The innovation that opened up bigger profits was to allow Facebook users | 0:11:01 | 0:11:05 | |
to be targeted using data about what they do on the rest of the internet. | 0:11:05 | 0:11:12 | |
The real key thing that marketers want | 0:11:12 | 0:11:14 | |
is the unique, immutable, flawless, high fidelity ID | 0:11:14 | 0:11:18 | |
for one person on the internet, and Facebook provides that. | 0:11:18 | 0:11:21 | |
It is your identity online. | 0:11:21 | 0:11:23 | |
Facebook can tell an advertiser, this is the real Jamie Barlow, | 0:11:23 | 0:11:26 | |
-this is what he's like? -Yeah. | 0:11:26 | 0:11:28 | |
A company like Walmart can literally take your data, your e-mail, | 0:11:28 | 0:11:32 | |
phone number, whatever you use for their frequent shopper programme, etc, | 0:11:32 | 0:11:35 | |
and join that to Facebook and literally target those people based on that data. | 0:11:35 | 0:11:39 | |
That's part of what I built. | 0:11:39 | 0:11:40 | |
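The retailer-to-platform join he describes is typically done on hashed contact details, so neither side hands over raw identifiers. A toy sketch with made-up emails; the hashing step reflects a common implementation pattern, not a description of Facebook's internal system:

```python
import hashlib

def h(email):
    """Normalise and hash an email so raw contact data is never exchanged."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Hypothetical data: a retailer's loyalty-scheme emails...
retailer_emails = ["alice@example.com", "bob@example.com"]
# ...and the platform's own index of hashed emails to user IDs.
platform_index = {h("alice@example.com"): "user_1001",
                  h("carol@example.com"): "user_1002"}

# The retailer uploads hashes; the platform joins them to its users.
matched = [platform_index[h(e)] for e in retailer_emails
           if h(e) in platform_index]
print(matched)  # -> ['user_1001']
```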
The tech gods suck in all this data about how we use their technologies | 0:11:45 | 0:11:49 | |
to build their vast fortunes. | 0:11:49 | 0:11:53 | |
I mean, it sounds like data is like oil, it's keeping the economy going? | 0:11:55 | 0:11:58 | |
Right. I mean, the difference is these companies, | 0:11:58 | 0:12:00 | |
instead of drilling for this oil, they generate this oil via, | 0:12:00 | 0:12:04 | |
by getting users to actually use their apps and then they actually | 0:12:04 | 0:12:07 | |
monetise it. Usually via advertising or other mechanisms. | 0:12:07 | 0:12:09 | |
But, yeah, it is the new oil. | 0:12:09 | 0:12:12 | |
Data about billions of us is propelling Silicon Valley | 0:12:12 | 0:12:15 | |
to the pinnacle of the global economy. | 0:12:15 | 0:12:19 | |
The world's largest hotel company, Airbnb, | 0:12:21 | 0:12:23 | |
doesn't own a single piece of real estate. | 0:12:23 | 0:12:25 | |
The world's largest taxi company, Uber, doesn't own any cars. | 0:12:25 | 0:12:28 | |
The world's largest media company, Facebook, doesn't produce any media, right? | 0:12:28 | 0:12:31 | |
So what do they have? | 0:12:31 | 0:12:32 | |
Well, they have the data around how you use those resources and how you | 0:12:32 | 0:12:35 | |
use those assets. And that's really what they are. | 0:12:35 | 0:12:38 | |
The secret of targeting us with adverts is keeping us online | 0:12:41 | 0:12:46 | |
for as long as possible. | 0:12:46 | 0:12:49 | |
I thought I'd just see how much time I spend on here. | 0:12:49 | 0:12:53 | |
So I've got an app that counts how often I pick this thing up. | 0:12:53 | 0:12:58 | |
Our time is the Holy Grail of Silicon Valley. | 0:12:58 | 0:13:02 | |
Here's what my life looks like on a typical day. | 0:13:03 | 0:13:06 | |
-Yeah, could I have a flat white, please? -Flat white? -Yeah. | 0:13:09 | 0:13:12 | |
Like more and more of us, my phone is my gateway to the online world. | 0:13:12 | 0:13:18 | |
It's how I check my social media accounts. | 0:13:18 | 0:13:20 | |
On average, Facebook users spend 50 minutes every day on the site. | 0:13:20 | 0:13:26 | |
The longer we spend connected, | 0:13:31 | 0:13:33 | |
the more Silicon Valley can learn about us, | 0:13:33 | 0:13:37 | |
and the more targeted and effective their advertising can be. | 0:13:37 | 0:13:41 | |
So apparently, today, I... | 0:14:05 | 0:14:08 | |
I've checked my phone 117 times... | 0:14:08 | 0:14:13 | |
..and I've been on this phone for nearly five and a half hours. | 0:14:14 | 0:14:20 | |
Well, I mean, that's a lot, that's a lot of hours. | 0:14:20 | 0:14:22 | |
I mean, it's kind of nearly half the day, | 0:14:22 | 0:14:25 | |
spent on this phone. | 0:14:25 | 0:14:28 | |
But it's weird, because it doesn't feel like I spend that long on it. | 0:14:28 | 0:14:31 | |
The strange thing about it is | 0:14:31 | 0:14:33 | |
that I don't even really know what I'm doing | 0:14:33 | 0:14:36 | |
for these five hours that I'm spending on this phone. | 0:14:36 | 0:14:38 | |
What is it that is keeping us hooked to Silicon Valley's global network? | 0:14:40 | 0:14:45 | |
I'm in Seattle to meet someone who saw how the tech gods embraced | 0:14:49 | 0:14:54 | |
new psychological insights into how we all make decisions. | 0:14:54 | 0:14:59 | |
I was a post-doc with Stephen Hawking. | 0:15:02 | 0:15:04 | |
Once Chief Technology Officer at Microsoft, | 0:15:04 | 0:15:07 | |
Nathan Myhrvold is the most passionate technologist | 0:15:07 | 0:15:10 | |
I have ever met. | 0:15:10 | 0:15:13 | |
Some complicated-looking equations in the background. | 0:15:13 | 0:15:15 | |
Well, it turns out if you work with Stephen Hawking, | 0:15:15 | 0:15:17 | |
you do work with complicated equations. | 0:15:17 | 0:15:20 | |
It's kind of the nature of the beast! | 0:15:20 | 0:15:22 | |
Amazing picture. | 0:15:22 | 0:15:24 | |
A decade ago, Nathan brought together Daniel Kahneman, | 0:15:27 | 0:15:30 | |
pioneer of the new science of behavioural economics, | 0:15:30 | 0:15:33 | |
and Silicon Valley's leaders, for a series of meetings. | 0:15:33 | 0:15:37 | |
I came and Jeff Bezos came. | 0:15:37 | 0:15:40 | |
That's Jeff Bezos, the founder of Amazon, worth 76 billion. | 0:15:40 | 0:15:46 | |
Sean Parker. Sean was there. | 0:15:46 | 0:15:48 | |
That's Sean Parker, the first president of Facebook. | 0:15:48 | 0:15:52 | |
And the Google founders were there. | 0:15:52 | 0:15:54 | |
The proposition was, come to this very nice resort in Napa, | 0:15:56 | 0:16:03 | |
and for several days, just have Kahneman | 0:16:03 | 0:16:07 | |
and then also a couple of | 0:16:07 | 0:16:09 | |
other behavioural economists explain things. | 0:16:09 | 0:16:12 | |
And ask questions and see what happens. | 0:16:12 | 0:16:15 | |
Kahneman had a simple but brilliant theory on how we make decisions. | 0:16:15 | 0:16:21 | |
He had found we use one of two different systems of thinking. | 0:16:21 | 0:16:27 | |
In this dichotomy, you have... | 0:16:28 | 0:16:31 | |
over here is a hunch... | 0:16:31 | 0:16:34 | |
..a guess, | 0:16:36 | 0:16:37 | |
a gut feeling... | 0:16:37 | 0:16:40 | |
..and "I just know". | 0:16:44 | 0:16:47 | |
So this is sort of emotional and more like, just instant stuff? | 0:16:47 | 0:16:53 | |
That's the idea. This set of things | 0:16:53 | 0:16:56 | |
is not particularly good at a different set of stuff | 0:16:56 | 0:17:00 | |
that involves... | 0:17:00 | 0:17:02 | |
..analysis... | 0:17:04 | 0:17:05 | |
..numbers... | 0:17:08 | 0:17:10 | |
..probability. | 0:17:12 | 0:17:13 | |
The meetings in Napa didn't deal with the basics | 0:17:14 | 0:17:18 | |
of behavioural economics, but how might the insights of the new | 0:17:18 | 0:17:21 | |
science have helped the tech gods? | 0:17:21 | 0:17:23 | |
A lot of advertising is about trying to hook people | 0:17:23 | 0:17:26 | |
in these type-one things to get interested one way or the other. | 0:17:26 | 0:17:30 | |
Technology companies undoubtedly use that to one degree or another. | 0:17:30 | 0:17:35 | |
You know, the term "clickbait", | 0:17:35 | 0:17:37 | |
for things that look exciting to click on. | 0:17:37 | 0:17:39 | |
There's billions of dollars changing hands | 0:18:39 | 0:18:42 | |
because we all get enticed into clicking something. | 0:17:42 | 0:17:46 | |
And there's a lot of things that I click on and then you get there, | 0:17:46 | 0:17:50 | |
you're like, OK, fine, you were just messing with me. | 0:17:50 | 0:17:53 | |
You're playing to the type-one things, | 0:17:53 | 0:17:55 | |
you're putting a set of triggers out there | 0:17:55 | 0:17:58 | |
that make me want to click on it, | 0:17:58 | 0:18:01 | |
and even though, like, I'm aware of that, I still sometimes click! | 0:18:01 | 0:18:06 | |
Tech companies both try to understand | 0:18:06 | 0:18:09 | |
our behaviour by having smart humans think about it and increasingly | 0:18:09 | 0:18:12 | |
by having machines think about it. | 0:18:12 | 0:18:15 | |
By having machines track us | 0:18:15 | 0:18:16 | |
to see, what is the clickbait Nathan falls for? | 0:18:16 | 0:18:19 | |
What are the things he really likes to spend time on? | 0:18:19 | 0:18:21 | |
Let's show him more of that stuff! | 0:18:21 | 0:18:24 | |
Trying to grab the attention of the consumer is nothing new. | 0:18:25 | 0:18:29 | |
That's what advertising is all about. | 0:18:29 | 0:18:32 | |
But insights into how we make decisions helped Silicon Valley | 0:18:32 | 0:18:35 | |
to shape the online world. | 0:18:35 | 0:18:37 | |
And little wonder, their success depends on keeping us engaged. | 0:18:37 | 0:18:43 | |
From 1-Click buying on Amazon to the Facebook like, | 0:18:43 | 0:18:47 | |
the more they've hooked us, the more the money has rolled in. | 0:18:47 | 0:18:51 | |
As Silicon Valley became more influential, | 0:18:53 | 0:18:55 | |
it started attracting powerful friends...in politics. | 0:18:55 | 0:19:00 | |
In 2008, Barack Obama had pioneered political campaigning on Facebook. | 0:19:00 | 0:19:07 | |
As President, he was drawn to Facebook's founder, Zuck. | 0:19:07 | 0:19:12 | |
Sorry, I'm kind of nervous. | 0:19:12 | 0:19:14 | |
We have the President of the United States here! | 0:19:14 | 0:19:17 | |
My name is Barack Obama and I'm the guy who got Mark | 0:19:17 | 0:19:21 | |
to wear a jacket and tie! | 0:19:21 | 0:19:24 | |
How you doing? | 0:19:27 | 0:19:28 | |
-Great. -I'll have huevos rancheros, please. | 0:19:28 | 0:19:31 | |
And if I could have an egg and cheese sandwich on English muffin? | 0:19:31 | 0:19:36 | |
Aneesh Chopra was Obama's first Chief Technology Officer, | 0:19:36 | 0:19:40 | |
and saw how close the relationship | 0:19:40 | 0:19:42 | |
between the White House and Silicon Valley became. | 0:19:42 | 0:19:47 | |
The President's philosophy and his approach to governing | 0:19:47 | 0:19:50 | |
garnered a great deal of personal interest | 0:19:50 | 0:19:53 | |
among many executives in Silicon Valley. | 0:19:53 | 0:19:56 | |
They were donors to his campaign, volunteers, | 0:19:56 | 0:19:59 | |
active recruiters of engineering talent | 0:19:59 | 0:20:01 | |
to support the campaign apparatus. | 0:20:01 | 0:20:03 | |
He had struck a chord. | 0:20:03 | 0:20:05 | |
-Why? -Because frankly, | 0:20:05 | 0:20:07 | |
it was an inspiring voice that really tapped into | 0:20:07 | 0:20:10 | |
the hopefulness of the country. | 0:20:10 | 0:20:13 | |
And a lot of Silicon Valley shares this sense of hopefulness, | 0:20:13 | 0:20:17 | |
this optimistic view that we can solve problems | 0:20:17 | 0:20:19 | |
if we would work together | 0:20:19 | 0:20:20 | |
and take advantage of these new capabilities that are coming online. | 0:20:20 | 0:20:24 | |
So people in Silicon Valley saw President Obama | 0:20:24 | 0:20:27 | |
-as a bit of a kindred spirit? -Oh, yeah. Oh, yeah. | 0:20:27 | 0:20:32 | |
If you'd like, Mark, we can take our jackets off. | 0:20:32 | 0:20:34 | |
That's good! | 0:20:34 | 0:20:36 | |
Facebook's mission to connect the world | 0:20:36 | 0:20:39 | |
went hand-in-hand with Obama's policies promoting globalisation | 0:20:39 | 0:20:44 | |
and free markets. | 0:20:44 | 0:20:46 | |
And Facebook was seen to be improving | 0:20:46 | 0:20:48 | |
the political process itself. | 0:20:48 | 0:20:50 | |
Part of what makes for a healthy democracy | 0:20:50 | 0:20:55 | |
is when you've got citizens who are informed, who are engaged, | 0:20:55 | 0:20:59 | |
and what Facebook allows us to do | 0:20:59 | 0:21:02 | |
is make sure this isn't just a one-way conversation. | 0:21:02 | 0:21:05 | |
You have the tech mind-set and governments increasingly share | 0:21:05 | 0:21:09 | |
the same view of the world. That's not a natural... | 0:21:09 | 0:21:12 | |
It's a spirit of liberty. | 0:21:12 | 0:21:13 | |
It's a spirit of freedom. | 0:21:13 | 0:21:15 | |
That is manifest today in these new technologies. | 0:21:15 | 0:21:19 | |
It happens to be that freedom means I can tweet something offensive. | 0:21:19 | 0:21:23 | |
But it also means that I have a voice. | 0:21:23 | 0:21:26 | |
Please raise your right hand and repeat after me... | 0:21:26 | 0:21:29 | |
By the time Obama won his second term, | 0:21:29 | 0:21:32 | |
he was feted for his mastery of social media's persuasive power. | 0:21:32 | 0:21:37 | |
But across the political spectrum, | 0:21:37 | 0:21:39 | |
the race was on to find new ways to gain a digital edge. | 0:21:39 | 0:21:43 | |
The world was about to change for Facebook. | 0:21:43 | 0:21:47 | |
Stanford University, in the heart of Silicon Valley. | 0:21:56 | 0:22:01 | |
Home to a psychologist investigating just how revealing | 0:22:01 | 0:22:05 | |
Facebook's hoard of information about each of us could really be. | 0:22:05 | 0:22:10 | |
How are you doing? Nice to meet you. | 0:22:10 | 0:22:13 | |
Dr Michal Kosinski specialises in psychometrics - | 0:22:13 | 0:22:18 | |
the science of predicting psychological traits | 0:22:18 | 0:22:21 | |
like personality. | 0:22:21 | 0:22:23 | |
So in the past, when you wanted to measure someone's personality | 0:22:23 | 0:22:26 | |
or intelligence, you needed to give them a question or a test, | 0:22:26 | 0:22:30 | |
and they would have to answer a bunch of questions. | 0:22:30 | 0:22:33 | |
Now, many of those questions would basically ask you | 0:22:33 | 0:22:37 | |
about whether you like poetry, | 0:22:37 | 0:22:39 | |
or you like hanging out with other people, | 0:22:39 | 0:22:41 | |
or you like the theatre, and so on. | 0:22:41 | 0:22:43 | |
But these days, you don't need to ask these questions any more. | 0:22:43 | 0:22:47 | |
Why? Because while going through our lives, | 0:22:47 | 0:22:51 | |
we are leaving behind a lot of digital footprints | 0:22:51 | 0:22:54 | |
that basically contain the same information. | 0:22:54 | 0:22:57 | |
So instead of asking you whether you like poetry, | 0:22:57 | 0:23:01 | |
I can just look at your reading history on Amazon | 0:23:01 | 0:23:05 | |
or your Facebook likes, | 0:23:05 | 0:23:08 | |
and I would just get exactly the same information. | 0:23:08 | 0:23:10 | |
In 2011, Dr Kosinski and his team at the University of Cambridge | 0:23:12 | 0:23:16 | |
developed an online survey | 0:23:16 | 0:23:18 | |
to measure volunteers' personality traits. | 0:23:18 | 0:23:21 | |
With their permission, he matched their results | 0:23:21 | 0:23:24 | |
with their Facebook data. | 0:23:24 | 0:23:26 | |
More than 6 million people took part. | 0:23:26 | 0:23:30 | |
We have people's Facebook likes, | 0:23:30 | 0:23:32 | |
people's status updates and profile data, | 0:23:32 | 0:23:36 | |
and this allows us to build those... to gain better understanding | 0:23:36 | 0:23:40 | |
of how psychological traits are being expressed | 0:23:40 | 0:23:43 | |
in the digital environment. | 0:23:43 | 0:23:45 | |
How you can measure psychological traits using digital footprints. | 0:23:45 | 0:23:50 | |
An algorithm that can look at millions of people | 0:23:50 | 0:23:52 | |
and it can look at hundreds of thousands | 0:23:52 | 0:23:55 | |
or tens of thousands of your likes | 0:23:55 | 0:23:56 | |
can extract and utilise even those little pieces of information | 0:23:56 | 0:24:01 | |
and combine it into a very accurate profile. | 0:24:01 | 0:24:04 | |
You can quite accurately predict mine, and in fact, everybody else's | 0:24:04 | 0:24:09 | |
personality, based on the things that they've liked? | 0:24:09 | 0:24:14 | |
That's correct. | 0:24:14 | 0:24:15 | |
It can also be used to turn your digital footprint | 0:24:15 | 0:24:18 | |
into a very accurate prediction | 0:24:18 | 0:24:21 | |
of your intimate traits, such as religiosity, | 0:24:21 | 0:24:23 | |
political views, personality, intelligence, | 0:24:23 | 0:24:26 | |
sexual orientation and a bunch of other psychological traits. | 0:24:26 | 0:24:30 | |
If I'm logged in, we can maybe see how accurate this actually is. | 0:24:30 | 0:24:33 | |
So I hit "make prediction" and it's going to try and predict | 0:24:33 | 0:24:37 | |
my personality from my Facebook page. | 0:24:37 | 0:24:39 | |
From your Facebook likes. | 0:24:39 | 0:24:41 | |
According to your likes, | 0:24:41 | 0:24:42 | |
you're open-minded, liberal and artistic. | 0:24:42 | 0:24:46 | |
Judges your intelligence to be extremely high. | 0:24:46 | 0:24:49 | |
-Well done. -Yes. I'm extremely intelligent! | 0:24:49 | 0:24:53 | |
You're not religious, but if you are religious, | 0:24:53 | 0:24:56 | |
-most likely you'd be a Catholic. -I was raised a Catholic! | 0:24:56 | 0:24:59 | |
I can't believe it knows that. | 0:24:59 | 0:25:02 | |
Because I... I don't say anything about being a Catholic anywhere, | 0:25:02 | 0:25:05 | |
but I was raised as a Catholic, but I'm not a practising Catholic. | 0:25:05 | 0:25:09 | |
So it's like... | 0:25:09 | 0:25:11 | |
It's absolutely spot on. | 0:25:11 | 0:25:13 | |
Oh, my God! Journalism, but also what did I study? | 0:25:13 | 0:25:17 | |
Studied history, and I didn't put anything about history in there. | 0:25:17 | 0:25:21 | |
I think this is one of the things | 0:25:21 | 0:25:22 | |
that people don't really get about those predictions, that they think, | 0:25:22 | 0:25:26 | |
look, if I like Lady Gaga on Facebook, | 0:25:26 | 0:25:29 | |
obviously people will know that I like Lady Gaga, | 0:25:29 | 0:25:31 | |
or the Government will know that I like Lady Gaga. | 0:25:31 | 0:25:34 | |
Look, you don't need a rocket scientist | 0:25:34 | 0:25:36 | |
to look at your Spotify playlist or your Facebook likes | 0:25:36 | 0:25:39 | |
to figure out that you like Lady Gaga. | 0:25:39 | 0:25:42 | |
What's really world-changing about those algorithms is that they can | 0:25:42 | 0:25:47 | |
take your music preferences or your book preferences | 0:25:47 | 0:25:51 | |
and extract from this seemingly innocent information | 0:25:51 | 0:25:55 | |
very accurate predictions about your religiosity, leadership potential, | 0:25:55 | 0:25:59 | |
political views, personality and so on. | 0:25:59 | 0:26:02 | |
Can you predict people's political persuasions with this? | 0:26:02 | 0:26:05 | |
In fact, the first dimension here, openness to experience, | 0:26:05 | 0:26:08 | |
is a very good predictor of political views. | 0:26:08 | 0:26:11 | |
People scoring high on openness, they tend to be liberal, | 0:26:11 | 0:26:14 | |
people who score low on openness, we even call it conservative and | 0:26:14 | 0:26:19 | |
traditional, they tend to vote for conservative candidates. | 0:26:19 | 0:26:22 | |
What about the potential to manipulate people? | 0:26:22 | 0:26:25 | |
So, obviously, if you now can use an algorithm to get to know millions | 0:26:25 | 0:26:29 | |
of people very intimately and then use another algorithm | 0:26:29 | 0:26:33 | |
to adjust the message that you are sending to them, | 0:26:33 | 0:26:37 | |
to make it most persuasive, | 0:26:37 | 0:26:40 | |
obviously gives you a lot of power. | 0:26:40 | 0:26:43 | |
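Mechanically, the prediction Kosinski describes is supervised learning over a user-by-like matrix: survey scores supply the labels, likes supply the features. A minimal sketch with invented data, using scikit-learn (the published models were trained on millions of volunteers, not six rows):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows: users. Columns: whether each user liked hypothetical pages 0..4.
likes = np.array([
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 1, 0, 1, 1],
    [1, 0, 0, 0, 1],
    [0, 1, 1, 1, 0],
])
# Labels from a personality survey, e.g. 1 = scores high on openness.
openness = np.array([1, 1, 0, 0, 1, 0])

model = LogisticRegression().fit(likes, openness)

# Predict the trait for a new user from their likes alone.
new_user = np.array([[1, 0, 1, 0, 1]])
print(model.predict_proba(new_user)[0, 1])  # estimated P(high openness)
```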
I'm quite surprised at just how accurate this model could be. | 0:26:52 | 0:26:55 | |
I mean, I cannot believe that on the basis of a few things | 0:26:55 | 0:27:00 | |
I just, you know, carelessly clicked that I liked, | 0:27:00 | 0:27:05 | |
the model was able to work out that I could have been Catholic, | 0:27:05 | 0:27:10 | |
or had a Catholic upbringing. | 0:27:10 | 0:27:12 | |
And clearly, this is a very powerful way of understanding people. | 0:27:12 | 0:27:18 | |
Very exciting possibilities, but I can't help | 0:27:19 | 0:27:22 | |
fearing that there is that potential, whoever has that power, | 0:27:22 | 0:27:28 | |
whoever can control that model... | 0:27:28 | 0:27:31 | |
..will have sort of unprecedented possibilities of manipulating | 0:27:33 | 0:27:37 | |
what people think, how they behave, what they see, | 0:27:37 | 0:27:41 | |
whether that's selling things to people or how people vote, | 0:27:41 | 0:27:45 | |
and that's pretty scary too. | 0:27:45 | 0:27:48 | |
Our era is defined by political shocks. | 0:27:52 | 0:27:55 | |
None bigger than the rise of Donald Trump, | 0:27:59 | 0:28:02 | |
who defied pollsters and the mainstream media | 0:28:02 | 0:28:05 | |
to win the American presidency. | 0:28:05 | 0:28:07 | |
Now questions swirl around his use of the American affiliate | 0:28:08 | 0:28:12 | |
of a British insights company, | 0:28:12 | 0:28:14 | |
Cambridge Analytica, who use psychographics. | 0:28:14 | 0:28:17 | |
I'm in Texas to uncover how far | 0:28:25 | 0:28:27 | |
Cambridge Analytica's expertise in personality prediction | 0:28:27 | 0:28:31 | |
played a part in Trump's political triumph, | 0:28:31 | 0:28:36 | |
and how his revolutionary campaign | 0:28:36 | 0:28:38 | |
exploited Silicon Valley's social networks. | 0:28:38 | 0:28:42 | |
Everyone seems to agree | 0:28:44 | 0:28:45 | |
that Trump ran an exceptional election campaign | 0:28:45 | 0:28:49 | |
using digital technologies. | 0:28:49 | 0:28:52 | |
But no-one really knows what they did, | 0:28:52 | 0:28:56 | |
who they were working with, who was helping them, | 0:28:56 | 0:28:59 | |
what the techniques they used were. | 0:28:59 | 0:29:03 | |
So I've come here to try to unravel the mystery. | 0:29:03 | 0:29:07 | |
The operation inside this unassuming building in San Antonio | 0:29:09 | 0:29:13 | |
was largely hidden from view, but crucial to Trump's success. | 0:29:13 | 0:29:18 | |
Since then, I'm the only person to get in here | 0:29:18 | 0:29:22 | |
to find out what really happened. | 0:29:22 | 0:29:25 | |
This was our Project Alamo, | 0:29:25 | 0:29:27 | |
so this was where the digital arm | 0:29:27 | 0:29:30 | |
of the Trump campaign operation was held. | 0:29:30 | 0:29:33 | |
Theresa Hong is speaking publicly | 0:29:33 | 0:29:36 | |
for the first time about her role as digital content director | 0:29:36 | 0:29:40 | |
for the Trump campaign. | 0:29:40 | 0:29:42 | |
So, why is it called Project Alamo? | 0:29:43 | 0:29:47 | |
It was called Project Alamo based on the data, actually, | 0:29:47 | 0:29:51 | |
that was Cambridge Analytica - | 0:29:51 | 0:29:53 | |
they came up with the Alamo data set, right? | 0:29:53 | 0:29:56 | |
So, we just kind of adopted the name Project Alamo. | 0:29:56 | 0:30:00 | |
It does conjure up sort of images of a battle of some sort. | 0:30:00 | 0:30:04 | |
Yeah. It kind of was! | 0:30:04 | 0:30:06 | |
In a sense, you know. Yeah, yeah. | 0:30:06 | 0:30:09 | |
Project Alamo was so important, | 0:30:13 | 0:30:16 | |
Donald Trump visited the hundred or so workers based here | 0:30:16 | 0:30:20 | |
during the campaign. | 0:30:20 | 0:30:22 | |
Ever since he started tweeting in 2009, | 0:30:22 | 0:30:26 | |
Trump had grasped the power of social media. | 0:30:26 | 0:30:29 | |
Now, in the fight of his life, | 0:30:29 | 0:30:32 | |
the campaign manipulated his Facebook presence. | 0:30:32 | 0:30:36 | |
Trump's Twitter account, that's all his, he's the one that ran that, | 0:30:36 | 0:30:40 | |
and I did a lot of his Facebook, | 0:30:40 | 0:30:43 | |
so I wrote a lot for him, you know, I kind of channelled Mr Trump. | 0:30:43 | 0:30:48 | |
How do you possibly write a post on Facebook like Donald Trump? | 0:30:48 | 0:30:52 | |
"Believe me." | 0:30:52 | 0:30:54 | |
A lot of believe me's, a lot of alsos, a lot of...verys. | 0:30:54 | 0:30:58 | |
Actually, he was really wonderful to write for, just because | 0:30:58 | 0:31:01 | |
it was so refreshing, it was so authentic. | 0:31:01 | 0:31:06 | |
We headed to the heart of the operation. | 0:31:09 | 0:31:11 | |
Cambridge Analytica was here. | 0:31:14 | 0:31:16 | |
It was just a line of computers, right? | 0:31:16 | 0:31:19 | |
This is where their operation was and this was kind of | 0:31:19 | 0:31:22 | |
the brain of the data, this was the data centre. | 0:31:22 | 0:31:26 | |
This was the data centre, this was the centre of the data centre. | 0:31:26 | 0:31:30 | |
Exactly right, yes, it was. Yes, it was. | 0:31:30 | 0:31:33 | |
Cambridge Analytica were using data | 0:31:33 | 0:31:35 | |
on around 220 million Americans to target potential donors and voters. | 0:31:35 | 0:31:40 | |
It was, you know, | 0:31:40 | 0:31:43 | |
a bunch of card tables that sat here and a bunch of computers and people | 0:31:43 | 0:31:48 | |
that were behind the computers, monitoring the data. | 0:31:48 | 0:31:51 | |
We've got to target this state, that state or this universe or whatever. | 0:31:51 | 0:31:55 | |
So that's what they were doing, they were gathering all the data. | 0:31:55 | 0:31:58 | |
A "universe" was the name given to a group of voters | 0:31:58 | 0:32:02 | |
defined by Cambridge Analytica. | 0:32:02 | 0:32:05 | |
What sort of attributes did these people have | 0:32:05 | 0:32:07 | |
that they had been able to work out? | 0:32:07 | 0:32:09 | |
Some of the attributes would be, when was the last time they voted? | 0:32:09 | 0:32:12 | |
Who did they vote for? | 0:32:12 | 0:32:14 | |
You know, what kind of car do they drive? | 0:32:14 | 0:32:17 | |
What kind of things do they look at on the internet? | 0:32:17 | 0:32:20 | |
What do they stand for? | 0:32:20 | 0:32:22 | |
I mean, one person might really be about job creation | 0:32:22 | 0:32:25 | |
and keeping jobs in America, you know. | 0:32:25 | 0:32:27 | |
Another person, they might resonate with, you know, | 0:32:27 | 0:32:30 | |
Second Amendment and gun rights, so... | 0:32:30 | 0:32:33 | |
How would they know that? | 0:32:33 | 0:32:34 | |
-..within that... -How would they know that? -That is their secret sauce. | 0:32:34 | 0:32:38 | |
Did the "secret sauce" | 0:32:40 | 0:32:42 | |
contain predictions of personality and political leanings? | 0:32:42 | 0:32:46 | |
Were they able to kind of understand people's personalities? | 0:32:46 | 0:32:50 | |
Yes, I mean, you know, | 0:32:50 | 0:32:52 | |
they do specialise in psychographics, right? | 0:32:52 | 0:32:56 | |
But based on personal interests and based on what a person cares for | 0:32:56 | 0:33:01 | |
and what means something to them, | 0:33:01 | 0:33:03 | |
they were able to extract and then we were able to target. | 0:33:03 | 0:33:06 | |
So, the psychographic stuff, were they using that here? | 0:33:06 | 0:33:10 | |
Was that part of the model you were working on? | 0:33:10 | 0:33:13 | |
Well, I mean, towards the end with the persuasion, absolutely. | 0:33:13 | 0:33:17 | |
I mean, we really were targeting | 0:33:17 | 0:33:19 | |
on these universes that they had collected. | 0:33:19 | 0:33:22 | |
I mean, did some of the attributes that you were able to use | 0:33:22 | 0:33:25 | |
from Cambridge Analytica have a sort of emotional effect, you know, | 0:33:25 | 0:33:29 | |
happy, sad, anxious, worried - moods, that kind of thing? | 0:33:29 | 0:33:32 | |
Right, yeah. | 0:33:32 | 0:33:34 | |
I do know Cambridge Analytica follows something called Ocean | 0:33:34 | 0:33:38 | |
and it's based on, you know, whether or not, like, | 0:33:38 | 0:33:41 | |
are you an extrovert? | 0:33:41 | 0:33:43 | |
Are you more of an introvert? | 0:33:43 | 0:33:45 | |
Are you fearful? Are you positive? | 0:33:45 | 0:33:50 | |
So, they did use that. | 0:33:50 | 0:33:52 | |
Armed with Cambridge Analytica's revolutionary insights, | 0:33:54 | 0:33:58 | |
the next step in the battle to win over millions of Americans | 0:33:58 | 0:34:02 | |
was to shape the online messages they would see. | 0:34:02 | 0:34:06 | |
Now we're going to go into the big kind of bull pen where a lot of | 0:34:06 | 0:34:10 | |
-the creatives were, and this is where I was as well. -Right. | 0:34:10 | 0:34:14 | |
For today's modern working-class families, | 0:34:14 | 0:34:17 | |
the challenge is very, very real. | 0:34:17 | 0:34:19 | |
With childcare costs... | 0:34:19 | 0:34:21 | |
Adverts were tailored to particular audiences, defined by data. | 0:34:21 | 0:34:25 | |
Donald Trump wants to give families the break they deserve. | 0:34:25 | 0:34:28 | |
This universe right here that Cambridge Analytica, | 0:34:28 | 0:34:31 | |
they've collected data and they have identified as working mothers | 0:34:31 | 0:34:34 | |
that are concerned about childcare, | 0:34:34 | 0:34:37 | |
and childcare, obviously, that's not going to be, like, | 0:34:37 | 0:34:40 | |
a war-ridden, you know, destructive ad, right? | 0:34:40 | 0:34:43 | |
That's more warm and fuzzy, | 0:34:43 | 0:34:44 | |
and this is what Trump is going to do for you. | 0:34:44 | 0:34:46 | |
But if you notice, Trump's not speaking. | 0:34:46 | 0:34:49 | |
He wasn't speaking, he wasn't in it at all. | 0:34:49 | 0:34:51 | |
Right, Trump wasn't speaking in it. | 0:34:51 | 0:34:53 | |
That audience there, we wanted a softer approach, | 0:34:53 | 0:34:56 | |
so this is the type of approach that we would take. | 0:34:56 | 0:34:59 | |
The campaign made thousands of different versions | 0:34:59 | 0:35:03 | |
of the same fundraising adverts. | 0:35:03 | 0:35:05 | |
The design was constantly tweaked to see which version performed best. | 0:35:05 | 0:35:11 | |
It wasn't uncommon to have about 35-45,000 iterations | 0:35:11 | 0:35:15 | |
of these types of ads every day, right? | 0:35:15 | 0:35:18 | |
So, you know, it could be | 0:35:18 | 0:35:22 | |
as subtle and just, you know, people wouldn't even notice, | 0:35:22 | 0:35:28 | |
where you have the green button, | 0:35:28 | 0:35:30 | |
sometimes a red button would work better | 0:35:30 | 0:35:32 | |
or a blue button would work better. | 0:35:32 | 0:35:34 | |
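The process Hong describes is large-scale split testing: serve many near-identical variants, measure which performs, and shift delivery toward the winner. A toy sketch of that feedback loop, with invented click-through rates standing in for real user behaviour:

```python
import random

# Hypothetical ad variants differing only in button colour.
variants = {"green": {"shows": 0, "clicks": 0},
            "red":   {"shows": 0, "clicks": 0},
            "blue":  {"shows": 0, "clicks": 0}}

def click_rate(v):
    return v["clicks"] / v["shows"] if v["shows"] else 0.0

def pick_variant():
    """Mostly serve the best performer so far, sometimes explore the rest."""
    if random.random() < 0.1:
        return random.choice(list(variants))
    return max(variants, key=lambda name: click_rate(variants[name]))

# Simulated serving loop: in reality the 'clicked' signal comes from users.
true_rates = {"green": 0.02, "red": 0.035, "blue": 0.025}
for _ in range(10_000):
    name = pick_variant()
    variants[name]["shows"] += 1
    if random.random() < true_rates[name]:
        variants[name]["clicks"] += 1

print({k: round(click_rate(v), 3) for k, v in variants.items()})
```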
Now, the voters Cambridge Analytica had targeted | 0:35:34 | 0:35:38 | |
were bombarded with adverts. | 0:35:38 | 0:35:40 | |
I may have short-circuited... | 0:35:46 | 0:35:47 | |
I'm Donald Trump and I approve this message. | 0:35:49 | 0:35:51 | |
On a typical day, the campaign would run more than 100 different adverts. | 0:35:52 | 0:35:57 | |
The delivery system was Silicon Valley's vast social networks. | 0:35:57 | 0:36:02 | |
We had the Facebook and YouTube and Google people, | 0:36:02 | 0:36:05 | |
they would congregate here. | 0:36:05 | 0:36:07 | |
Almost all of America's voters could now be reached online in an instant. | 0:36:07 | 0:36:12 | |
I mean, what were Facebook and Google and YouTube people actually | 0:36:12 | 0:36:16 | |
-doing here, why were they here? -They were helping us, you know. | 0:36:16 | 0:36:19 | |
They were basically our kind of hands-on partners | 0:36:19 | 0:36:24 | |
as far as being able to utilise the platform | 0:36:24 | 0:36:27 | |
as effectively as possible. | 0:36:27 | 0:36:29 | |
The Trump campaign spent the lion's share of its advertising budget, | 0:36:29 | 0:36:34 | |
around 85 million, on Facebook. | 0:36:34 | 0:36:38 | |
When you're pumping in millions and millions of dollars | 0:36:38 | 0:36:42 | |
to these social platforms, | 0:36:42 | 0:36:44 | |
you're going to get white-glove treatment, | 0:36:44 | 0:36:46 | |
so, they would send people, you know, | 0:36:46 | 0:36:48 | |
representatives to, you know, | 0:36:48 | 0:36:51 | |
Project Alamo to ensure that all of our needs were being met. | 0:36:51 | 0:36:56 | |
The success of Trump's digital strategy was built on | 0:36:56 | 0:36:59 | |
the effectiveness of Facebook as an advertising medium. | 0:36:59 | 0:37:03 | |
It's become a very powerful political tool | 0:37:03 | 0:37:06 | |
that's largely unregulated. | 0:37:06 | 0:37:08 | |
Without Facebook, we wouldn't have won. | 0:37:08 | 0:37:12 | |
I mean, Facebook really and truly put us over the edge, | 0:37:12 | 0:37:16 | |
I mean, Facebook was the medium | 0:37:16 | 0:37:18 | |
that proved most successful for this campaign. | 0:37:18 | 0:37:21 | |
Facebook didn't want to meet me, but made it clear that like all | 0:37:25 | 0:37:28 | |
advertisers on Facebook, political campaigns must ensure | 0:37:28 | 0:37:32 | |
their ads comply with all applicable laws and regulations. | 0:37:32 | 0:37:36 | |
The company also said no personally identifiable | 0:37:36 | 0:37:40 | |
information can be shared with advertising, | 0:37:40 | 0:37:42 | |
measurement or analytics partners unless people give permission. | 0:37:42 | 0:37:46 | |
In London, I'm on the trail | 0:37:51 | 0:37:52 | |
of the data company Cambridge Analytica. | 0:37:52 | 0:37:56 | |
Ever since the American election, the firm has been under pressure | 0:37:59 | 0:38:03 | |
to come clean over its use of personality prediction. | 0:38:03 | 0:38:08 | |
I want to know exactly how the firm used psychographics | 0:38:08 | 0:38:12 | |
to target voters for the Trump campaign. | 0:38:12 | 0:38:16 | |
-Alexander. -Hi. | 0:38:16 | 0:38:18 | |
How do you do? Pleasure. | 0:38:18 | 0:38:19 | |
VOICEOVER: Alexander Nix's firm first used data to target voters | 0:38:19 | 0:38:23 | |
in the American presidential elections while working on | 0:38:23 | 0:38:27 | |
the Ted Cruz campaign for the Republican nomination. | 0:38:27 | 0:38:31 | |
When Cruz lost, they went to work for Trump. | 0:38:31 | 0:38:35 | |
I want to start with the Trump campaign. | 0:38:35 | 0:38:37 | |
Did Cambridge Analytica ever use | 0:38:37 | 0:38:40 | |
psychometric or psychographic methods in this campaign? | 0:38:40 | 0:38:44 | |
We left the Cruz campaign in April after the nomination was over. | 0:38:44 | 0:38:47 | |
We pivoted right across onto the Trump campaign. | 0:38:47 | 0:38:50 | |
It was about five and a half months before polling. | 0:38:50 | 0:38:53 | |
And whilst on the Cruz campaign, we were able to invest | 0:38:53 | 0:38:58 | |
a lot more time into building psychographic models, | 0:38:58 | 0:39:01 | |
into profiling, using behavioural profiling to understand different | 0:39:01 | 0:39:04 | |
personality groups and different personality drivers, | 0:39:04 | 0:39:07 | |
in order to inform our messaging and our creative. | 0:39:07 | 0:39:10 | |
We simply didn't have the time to employ this level of | 0:39:10 | 0:39:14 | |
rigorous methodology for Trump. | 0:39:14 | 0:39:17 | |
For Cruz, Cambridge Analytica built computer models which could crunch | 0:39:17 | 0:39:21 | |
huge amounts of data on each voter, including psychographic data | 0:39:21 | 0:39:26 | |
predicting voters' personality types. | 0:39:26 | 0:39:30 | |
When the firm transferred to the Trump campaign, | 0:39:30 | 0:39:32 | |
they took data with them. | 0:39:32 | 0:39:34 | |
Now, there is clearly some legacy psychographics in the data, | 0:39:34 | 0:39:40 | |
because the data is, um, model data, | 0:39:40 | 0:39:44 | |
or a lot of it is model data that we had used across | 0:39:44 | 0:39:46 | |
the last 14, 15 months of campaigning through the midterms, | 0:39:46 | 0:39:50 | |
and then through the primaries. | 0:39:50 | 0:39:52 | |
But specifically, did we build specific psychographic models | 0:39:52 | 0:39:57 | |
for the Trump campaign? No, we didn't. | 0:39:57 | 0:39:59 | |
So you didn't build specific models for this campaign, | 0:39:59 | 0:40:03 | |
but it sounds like you did use some element of psychographic modelling, | 0:40:03 | 0:40:07 | |
as an approach in the Trump campaign? | 0:40:07 | 0:40:10 | |
Only as a result of legacy data models. | 0:40:10 | 0:40:13 | |
So, the answer... The answer you are looking for is no. | 0:40:13 | 0:40:17 | |
The answer I am looking for is the extent to which it was used. | 0:40:17 | 0:40:20 | |
I mean, legacy... I don't know what that means, legacy data modelling. | 0:40:20 | 0:40:23 | |
What does that mean for the Trump campaign? | 0:40:23 | 0:40:26 | |
Well, so we were able to take models we had made previously | 0:40:26 | 0:40:28 | |
over the last two or three years, | 0:40:28 | 0:40:30 | |
and integrate those into some of the work we were doing. | 0:40:30 | 0:40:33 | |
Where did all the information | 0:40:33 | 0:40:34 | |
to predict voters' personalities come from? | 0:40:34 | 0:40:38 | |
Very originally, we used a combination of telephone surveys | 0:40:38 | 0:40:42 | |
and then we used a number of... | 0:40:42 | 0:40:45 | |
..online platforms... | 0:40:46 | 0:40:50 | |
for gathering questions. | 0:40:50 | 0:40:52 | |
As we started to gather more data, | 0:40:52 | 0:40:55 | |
we started to look at other platforms. | 0:40:55 | 0:40:57 | |
Such as Facebook, for instance. | 0:40:57 | 0:41:00 | |
Predicting personality is just one element in the big data | 0:41:02 | 0:41:05 | |
companies like Cambridge Analytica are using to revolutionise | 0:41:05 | 0:41:09 | |
the way democracy works. | 0:41:09 | 0:41:11 | |
Can you understand, though, why maybe | 0:41:11 | 0:41:13 | |
some people find it a little bit creepy? | 0:41:13 | 0:41:16 | |
No, I can't. Quite the opposite. | 0:41:16 | 0:41:17 | |
I think that the move away from blanket advertising, | 0:41:17 | 0:41:21 | |
the move towards ever more personalised communication, | 0:41:21 | 0:41:24 | |
is a very natural progression. | 0:41:24 | 0:41:25 | |
I think it is only going to increase. | 0:41:25 | 0:41:27 | |
I find it a little bit weird, if I had a very sort of detailed | 0:41:27 | 0:41:30 | |
personality assessment of me, based on all sorts of different data | 0:41:30 | 0:41:35 | |
that I had put all over the internet, | 0:41:35 | 0:41:37 | |
and that as a result of that, some profile had been made of me | 0:41:37 | 0:41:40 | |
that was then used to target me for adverts, | 0:41:40 | 0:41:43 | |
and I didn't really know that any of that had happened. | 0:41:43 | 0:41:45 | |
That's why some people might find it a little bit sinister. | 0:41:45 | 0:41:48 | |
Well, you have just said yourself, | 0:41:48 | 0:41:50 | |
you are putting this data out into the public domain. | 0:41:50 | 0:41:52 | |
I'm sure that you have a supermarket loyalty card. | 0:41:52 | 0:41:56 | |
I'm sure you understand the reciprocity that is going on there - | 0:41:56 | 0:42:01 | |
you get points, and in return, they gather your data | 0:42:01 | 0:42:05 | |
on your consumer behaviour. | 0:42:05 | 0:42:07 | |
I mean, we are talking about politics | 0:42:07 | 0:42:08 | |
and we're talking about shopping. Are they really the same thing? | 0:42:08 | 0:42:12 | |
The technology is the same. | 0:42:12 | 0:42:14 | |
In the next ten years, | 0:42:14 | 0:42:15 | |
the sheer volumes of data that are going to be available, | 0:42:15 | 0:42:18 | |
that are going to be driving all sorts of things | 0:42:18 | 0:42:20 | |
including marketing and communications, | 0:42:20 | 0:42:23 | |
is going to be a paradigm shift from where we are now | 0:42:23 | 0:42:25 | |
and it's going to be a revolution, | 0:42:25 | 0:42:27 | |
and that is the way the world is moving. | 0:42:27 | 0:42:29 | |
And, you know, I think, | 0:42:29 | 0:42:31 | |
whether you like it or not, it, it... | 0:42:31 | 0:42:35 | |
it is an inevitable fact. | 0:42:35 | 0:42:37 | |
To the new data barons, this is all just business. | 0:42:39 | 0:42:42 | |
But to the rest of us, it's more than that. | 0:42:42 | 0:42:46 | |
By the time of Donald Trump's inauguration, | 0:42:49 | 0:42:52 | |
it was accepted that his mastery of data and social media | 0:42:52 | 0:42:55 | |
had made him the most powerful man in the world. | 0:42:55 | 0:42:59 | |
We will make America great again. | 0:42:59 | 0:43:04 | |
The election of Donald Trump | 0:43:04 | 0:43:06 | |
was greeted with barely concealed fury in Silicon Valley. | 0:43:06 | 0:43:10 | |
But Facebook and other tech companies had made | 0:43:10 | 0:43:13 | |
millions of dollars by helping to make it happen. | 0:43:13 | 0:43:16 | |
Their power as advertising platforms | 0:43:16 | 0:43:19 | |
had been exploited by a politician | 0:43:19 | 0:43:21 | |
with a very different view of the world. | 0:43:21 | 0:43:24 | |
But Facebook's problems were only just beginning. | 0:43:27 | 0:43:30 | |
Another phenomenon of the election | 0:43:32 | 0:43:34 | |
was plunging the tech titan into crisis. | 0:43:34 | 0:43:37 | |
Fake news, often targeting Hillary Clinton, | 0:43:43 | 0:43:46 | |
had dominated the election campaign. | 0:43:46 | 0:43:49 | |
Now, the departing President turned on the social media giant | 0:43:52 | 0:43:55 | |
he had once embraced. | 0:43:55 | 0:43:58 | |
In an age where, uh... | 0:43:59 | 0:44:01 | |
there is so much active misinformation, | 0:44:01 | 0:44:06 | |
and it is packaged very well, | 0:44:06 | 0:44:08 | |
and it looks the same when you see it on a Facebook page, | 0:44:08 | 0:44:11 | |
or you turn on your television, | 0:44:11 | 0:44:13 | |
if everything, uh... | 0:44:13 | 0:44:17 | |
seems to be the same, | 0:44:17 | 0:44:20 | |
and no distinctions are made, | 0:44:20 | 0:44:22 | |
then we won't know what to protect. | 0:44:22 | 0:44:25 | |
We won't know what to fight for. | 0:44:25 | 0:44:27 | |
Fake news had provoked a storm of criticism | 0:44:33 | 0:44:36 | |
over Facebook's impact on democracy. | 0:44:36 | 0:44:39 | |
Its founder, Zuck, claimed it was extremely unlikely | 0:44:39 | 0:44:42 | |
fake news had changed the election's outcome. | 0:44:42 | 0:44:46 | |
But he didn't address why it had spread like wildfire | 0:44:46 | 0:44:49 | |
across the platform. | 0:44:49 | 0:44:51 | |
-Jamie. -Hi, Jamie. | 0:44:51 | 0:44:53 | |
VOICEOVER: Meet Jeff Hancock, | 0:44:53 | 0:44:55 | |
a psychologist who has investigated a hidden aspect of Facebook | 0:44:55 | 0:44:58 | |
that helps explain how the platform became weaponised in this way. | 0:44:58 | 0:45:03 | |
It turns out the power of Facebook to affect our emotions is key, | 0:45:06 | 0:45:11 | |
something that had been uncovered | 0:45:11 | 0:45:13 | |
in an experiment the company itself had run in 2012. | 0:45:13 | 0:45:18 | |
It was one of the earlier, you know, | 0:45:18 | 0:45:19 | |
what we would call big data, social science-type studies. | 0:45:19 | 0:45:22 | |
The newsfeeds of nearly 700,000 users were secretly manipulated | 0:45:26 | 0:45:30 | |
so they would see fewer positive or negative posts. | 0:45:30 | 0:45:35 | |
Jeff helped interpret the results. | 0:45:35 | 0:45:38 | |
So, what did you actually find? | 0:45:38 | 0:45:40 | |
We found that if you were one of those people | 0:45:40 | 0:45:43 | |
that were seeing less negative emotion words in their posts, | 0:45:43 | 0:45:46 | |
then you would write with less negative emotion in your own posts, | 0:45:46 | 0:45:49 | |
and more positive emotion. | 0:45:49 | 0:45:51 | |
-This is emotional contagion. -And what about positive posts? | 0:45:51 | 0:45:55 | |
Did that have the same kind of effect? | 0:45:55 | 0:45:57 | |
Yeah, we saw the same effect | 0:45:57 | 0:45:59 | |
when positive emotion worded posts were decreased. | 0:45:59 | 0:46:03 | |
We saw the same thing. So I would produce fewer positive | 0:46:03 | 0:46:06 | |
emotion words, and more negative emotion words. | 0:46:06 | 0:46:08 | |
And that is consistent with emotional contagion theory. | 0:46:08 | 0:46:11 | |
Basically, we were showing that people were writing in a way | 0:46:11 | 0:46:15 | |
that was matching the emotion that they were seeing | 0:46:15 | 0:46:18 | |
in the Facebook news feed. | 0:46:18 | 0:46:21 | |
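Mechanically, the 2012 experiment was a filter over the news feed driven by counts of emotion words (the study used LIWC-style word lists). A toy sketch with tiny, made-up word lists; the real study suppressed posts probabilistically rather than removing them outright:

```python
POSITIVE = {"great", "happy", "love", "wonderful"}
NEGATIVE = {"sad", "angry", "awful", "terrible"}

def emotion(post):
    """Classify a post by which emotion words it contains."""
    words = set(post.lower().split())
    if words & NEGATIVE:
        return "negative"
    if words & POSITIVE:
        return "positive"
    return "neutral"

feed = ["What a wonderful day", "Terrible news again",
        "Lunch was fine", "I love this song"]

# One condition of the study: show a user fewer negative posts.
reduced_negative = [p for p in feed if emotion(p) != "negative"]
print(reduced_negative)
```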
Emotion draws people to fake news, and then supercharges its spread. | 0:46:21 | 0:46:27 | |
So the more emotional the content, | 0:46:27 | 0:46:29 | |
the more likely it is to spread online. | 0:46:29 | 0:46:31 | |
-Is that true? -Yeah, the more intense the emotion in content, | 0:46:31 | 0:46:34 | |
the more likely it is to spread, to go viral. | 0:46:34 | 0:46:38 | |
It doesn't matter whether it is sad or happy, like negative or positive, | 0:46:38 | 0:46:42 | |
the more important thing is how intense the emotion is. | 0:46:42 | 0:46:44 | |
The process of emotional contagion | 0:46:44 | 0:46:47 | |
helps explain why fake news has spread | 0:46:47 | 0:46:50 | |
so far across social media. | 0:46:50 | 0:46:52 | |
Advertisers have been well aware of how emotion can be used | 0:46:55 | 0:46:58 | |
to manipulate people's attention. | 0:46:58 | 0:46:59 | |
But now we are seeing this with a whole host of other actors, | 0:46:59 | 0:47:02 | |
some of them nefarious. | 0:47:02 | 0:47:04 | |
So, other state actors trying to influence your election, | 0:47:04 | 0:47:06 | |
people trying to manipulate the media | 0:47:06 | 0:47:08 | |
are using emotional contagion, | 0:47:08 | 0:47:10 | |
and also using those original platforms like Facebook | 0:47:10 | 0:47:14 | |
to accomplish other objectives, like sowing distrust, | 0:47:14 | 0:47:19 | |
or creating false beliefs. | 0:47:19 | 0:47:21 | |
You see it sort of, maybe weaponised | 0:47:21 | 0:47:23 | |
and being used at scale. | 0:47:23 | 0:47:26 | |
I'm in Germany, | 0:47:35 | 0:47:37 | |
where the presence of more than a million new refugees | 0:47:37 | 0:47:39 | |
has caused tension across the political spectrum. | 0:47:39 | 0:47:43 | |
Earlier this year, | 0:47:43 | 0:47:44 | |
a story appeared on the American alt-right news site Breitbart | 0:47:44 | 0:47:49 | |
about the torching of a church in Dortmund by a North African mob. | 0:47:49 | 0:47:54 | |
It was widely shared on social media - | 0:47:54 | 0:47:57 | |
but it wasn't true. | 0:47:57 | 0:47:59 | |
The problem with social networks | 0:48:04 | 0:48:06 | |
is that all information is treated equally. | 0:48:06 | 0:48:08 | |
So you have good, honest, accurate information | 0:48:08 | 0:48:12 | |
sitting alongside and treated equally to | 0:48:12 | 0:48:15 | |
lies and propaganda. | 0:48:15 | 0:48:18 | |
And the difficulty for citizens is that it can be very hard | 0:48:18 | 0:48:21 | |
to tell the difference between the two. | 0:48:21 | 0:48:24 | |
But one data scientist here has discovered an even darker side | 0:48:24 | 0:48:30 | |
to the way Facebook is being manipulated. | 0:48:30 | 0:48:32 | |
We created a network of all the likes | 0:48:33 | 0:48:37 | |
in the refugee debate on Facebook. | 0:48:37 | 0:48:40 | |
Professor Simon Hegelich has found evidence the debate | 0:48:42 | 0:48:46 | |
about refugees on Facebook | 0:48:46 | 0:48:47 | |
is being skewed by anonymous political forces. | 0:48:47 | 0:48:52 | |
So you are trying to understand who is liking pages? | 0:48:52 | 0:48:56 | |
Exactly. | 0:48:56 | 0:48:58 | |
Among the many signals Facebook uses to rank stories in | 0:48:58 | 0:49:01 | |
your news feed is the number of likes they get. | 0:49:01 | 0:49:04 | |
The red area of this chart shows a network of people liking | 0:49:04 | 0:49:08 | |
anti-refugee posts on Facebook. | 0:49:08 | 0:49:12 | |
Most of the likes are sent by just a handful of people - 25 people are... | 0:49:12 | 0:49:18 | |
..responsible for more than 90% of all these likes. | 0:49:20 | 0:49:24 | |
25 Facebook accounts each liked | 0:49:24 | 0:49:26 | |
more than 30,000 comments over six months. | 0:49:26 | 0:49:30 | |
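To make the headline number concrete, here is a minimal sketch, on made-up data, of the concentration measure behind this finding: the smallest set of accounts responsible for a given share of all likes. The account names and the like log are hypothetical.

```python
# A minimal sketch of like concentration: given one log entry per
# like event, find the smallest set of accounts that together sent
# at least `share` of all likes. Data below is a hypothetical stand-in.

from collections import Counter

like_log = ["acct_01"] * 8 + ["acct_02", "acct_03"]  # stand-in data

def accounts_covering(log, share=0.9):
    """Smallest set of accounts responsible for `share` of all likes."""
    counts = Counter(log)
    target = share * len(log)
    covered, core = 0, []
    for account, n in counts.most_common():  # biggest likers first
        core.append(account)
        covered += n
        if covered >= target:
            break
    return core

print(accounts_covering(like_log))  # ['acct_01', 'acct_02']
```

Applied to Hegelich's real network data, a measure like this is what yields the finding that 25 accounts cover more than 90% of the likes.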
These hyperactive accounts could be run by real people, or software. | 0:49:30 | 0:49:36 | |
I think the rationale behind this | 0:49:36 | 0:49:38 | |
is that they tried to make their content viral. | 0:49:38 | 0:49:42 | |
They think if we like all this stuff, then Facebook, | 0:49:42 | 0:49:46 | |
the algorithm of Facebook, | 0:49:46 | 0:49:47 | |
will pick up our content and show it to other users, | 0:49:47 | 0:49:51 | |
and then the whole world sees that refugees are bad | 0:49:51 | 0:49:54 | |
and that they shouldn't come to Germany. | 0:49:54 | 0:49:57 | |
This is evidence the number of likes on Facebook can be easily gamed | 0:49:57 | 0:50:02 | |
as part of an effort to try to influence the prominence | 0:50:02 | 0:50:05 | |
of anti-refugee content on the site. | 0:50:05 | 0:50:08 | |
Does this worry you, though? | 0:50:08 | 0:50:10 | |
It's definitely changing the structure of public opinion. | 0:50:10 | 0:50:14 | |
Democracy is built on public opinion, | 0:50:14 | 0:50:18 | |
so such a change definitely has to change the way democracy works. | 0:50:18 | 0:50:23 | |
Facebook told us they are working to disrupt the economic incentives | 0:50:26 | 0:50:30 | |
behind false news, removing tens of thousands of fake accounts, | 0:50:30 | 0:50:34 | |
and building new products | 0:50:34 | 0:50:36 | |
to identify and limit the spread of false news. | 0:50:36 | 0:50:39 | |
Zuckerberg is trying to hold the line that his company is not a publisher, | 0:50:39 | 0:50:44 | |
based on that obscure legal clause from the 1990s. | 0:50:44 | 0:50:49 | |
You know, Facebook is a new kind of platform. | 0:50:49 | 0:50:51 | |
You know, it's not a traditional technology company, | 0:50:51 | 0:50:54 | |
it's not a traditional media company. | 0:50:54 | 0:50:57 | |
Um, we don't write the news that people... | 0:50:57 | 0:50:59 | |
that people read on the platform. | 0:50:59 | 0:51:01 | |
In Germany, one man is taking on the tech gods over their responsibility | 0:51:03 | 0:51:08 | |
for what appears on their sites. | 0:51:08 | 0:51:10 | |
Ulrich Kelber is a minister in Germany's Justice Ministry. | 0:51:12 | 0:51:17 | |
He was once called a "Jewish pig" in a post on Facebook. | 0:51:17 | 0:51:20 | |
He reported it to the company, who refused to delete it. | 0:51:20 | 0:51:24 | |
IN GERMAN: | 0:51:24 | 0:51:26 | |
Under a new law, | 0:51:43 | 0:51:45 | |
Facebook and other social networking sites | 0:51:45 | 0:51:47 | |
could be fined up to 50 million euros | 0:51:47 | 0:51:50 | |
if they fail to take down illegal hate speech posts. | 0:51:50 | 0:51:55 | |
Is this too much? | 0:51:55 | 0:51:56 | |
Is this a little bit Draconian? | 0:51:56 | 0:51:59 | |
This is the first time a government has challenged | 0:52:20 | 0:52:23 | |
the principles underlying a Silicon Valley platform. | 0:52:23 | 0:52:27 | |
Once this thread has been pulled, | 0:52:27 | 0:52:29 | |
their whole world could start to unravel. | 0:52:29 | 0:52:32 | |
They must find you a pain. | 0:52:32 | 0:52:35 | |
I mean, this must be annoying for them. | 0:52:35 | 0:52:37 | |
It must be. | 0:52:37 | 0:52:38 | |
Facebook told us they share the goal of fighting hate speech, | 0:52:47 | 0:52:51 | |
and they have made substantial progress | 0:52:51 | 0:52:54 | |
in removing illegal content, | 0:52:54 | 0:52:56 | |
adding 3,000 people to their community operations team. | 0:52:56 | 0:53:00 | |
Facebook now connects more than two billion people around the world, | 0:53:04 | 0:53:08 | |
including more and more voters in the West. | 0:53:08 | 0:53:12 | |
When you think of what Facebook has become, | 0:53:12 | 0:53:14 | |
in such a short space of time, it's actually pretty bizarre. | 0:53:14 | 0:53:18 | |
I mean, this was just a platform | 0:53:18 | 0:53:20 | |
for sharing photos or chatting to friends, | 0:53:20 | 0:53:23 | |
but in less than a decade, it has become a platform that has | 0:53:23 | 0:53:27 | |
dramatic implications for how our democracy works. | 0:53:27 | 0:53:30 | |
Old structures of power are falling away. | 0:53:32 | 0:53:35 | |
Social media is giving ordinary people access to huge audiences. | 0:53:35 | 0:53:40 | |
And politics is changing as a result. | 0:53:40 | 0:53:43 | |
Significant numbers of voters were motivated | 0:53:46 | 0:53:47 | |
through social media to vote for Brexit. | 0:53:47 | 0:53:50 | |
Jeremy Corbyn lost the general election, | 0:53:52 | 0:53:54 | |
but enjoyed unexpected gains, | 0:53:54 | 0:53:58 | |
an achievement he put down in part to the power of social media. | 0:53:58 | 0:54:03 | |
One influential force in this new world is found here in Bristol. | 0:54:06 | 0:54:12 | |
The Canary is an online political news outlet. | 0:54:12 | 0:54:15 | |
During the election campaign, | 0:54:15 | 0:54:18 | |
their stories got more than 25 million hits on a tiny budget. | 0:54:18 | 0:54:22 | |
So, how much of your readership comes through Facebook? | 0:54:24 | 0:54:27 | |
It's really high, it's about 80%. | 0:54:27 | 0:54:30 | |
So it is an enormously important distribution mechanism. | 0:54:30 | 0:54:33 | |
And free. | 0:54:33 | 0:54:35 | |
The Canary's presentation of its pro-Corbyn news | 0:54:35 | 0:54:39 | |
is tailored to social media. | 0:54:39 | 0:54:43 | |
I have noticed that a lot of your headlines, or your pictures, | 0:54:43 | 0:54:45 | |
they are quite emotional. | 0:54:45 | 0:54:47 | |
They are sort of pictures of sad Theresa May, | 0:54:47 | 0:54:50 | |
or delighted Jeremy Corbyn. | 0:54:50 | 0:54:52 | |
Is that part of the purpose of it all? | 0:54:52 | 0:54:54 | |
Yeah, it has to be. We are out there trying to have a conversation | 0:54:54 | 0:54:57 | |
with a lot of people, so it is on us to be compelling. | 0:54:57 | 0:55:00 | |
Human beings work on facts, but they also work on gut instinct, | 0:55:00 | 0:55:04 | |
they work on emotions, feelings, and fidelity and community. | 0:55:04 | 0:55:09 | |
All of these issues. | 0:55:09 | 0:55:10 | |
Social media enables those with few resources to compete | 0:55:10 | 0:55:15 | |
with the mainstream media for the attention of millions of us. | 0:55:15 | 0:55:20 | |
You put up quite sort of clickbait-y stories, you know, | 0:55:20 | 0:55:23 | |
the headlines are there to get clicks. | 0:55:23 | 0:55:26 | |
-Yeah. -Um... Is that a fair criticism? | 0:55:26 | 0:55:28 | |
Of course they're there to get clicks. | 0:55:28 | 0:55:30 | |
We don't want to have a conversation with ten people. | 0:55:30 | 0:55:32 | |
You can't change the world talking to ten people. | 0:55:32 | 0:55:34 | |
The tech gods are giving all of us the power to influence the world. | 0:55:42 | 0:55:48 | |
Connectivity and access... Connect everyone in the world... | 0:55:48 | 0:55:50 | |
Make the world more open and connected... | 0:55:50 | 0:55:51 | |
You connect people over time... You get people connectivity... | 0:55:51 | 0:55:54 | |
That's the mission. That's what I care about. | 0:55:54 | 0:55:57 | |
Social media's unparalleled power to persuade, | 0:55:57 | 0:56:01 | |
first developed for advertisers, | 0:56:01 | 0:56:03 | |
is now being exploited by political forces of all kinds. | 0:56:03 | 0:56:06 | |
Grassroots movements are regaining their power, | 0:56:06 | 0:56:11 | |
challenging political elites. | 0:56:11 | 0:56:13 | |
Extremists are discovering new ways to stoke hatred and spread lies. | 0:56:13 | 0:56:19 | |
And wealthy political parties are developing the ability | 0:56:19 | 0:56:21 | |
to manipulate our thoughts and feelings | 0:56:21 | 0:56:24 | |
using powerful psychological tools. | 0:56:24 | 0:56:27 | |
All of this is leading to a world of unexpected political opportunity, | 0:56:27 | 0:56:32 | |
and turbulence. | 0:56:32 | 0:56:33 | |
I think the people that connected the world really believed that | 0:56:33 | 0:56:37 | |
somehow, just by us being connected, our politics would be better. | 0:56:37 | 0:56:42 | |
But the world is changing in ways that they never imagined, | 0:56:42 | 0:56:48 | |
and in ways they are probably no longer happy about. | 0:56:48 | 0:56:51 | |
But in truth, | 0:56:51 | 0:56:53 | |
they are no more in charge of this technology than any of us are now. | 0:56:53 | 0:56:56 | |
Silicon Valley's philosophy is called disruption: | 0:56:56 | 0:57:00 | |
breaking down the way we do things | 0:57:00 | 0:57:02 | |
and using technology to improve the world. | 0:57:02 | 0:57:06 | |
In this series, I have seen how | 0:57:06 | 0:57:08 | |
sharing platforms like Uber and Airbnb | 0:57:08 | 0:57:11 | |
are transforming our cities. | 0:57:11 | 0:57:14 | |
And how automation and artificial intelligence | 0:57:18 | 0:57:21 | |
threaten to destroy millions of jobs. | 0:57:21 | 0:57:24 | |
Within 30 years, half of humanity won't have a job. | 0:57:24 | 0:57:27 | |
It could get ugly, there could be revolution. | 0:57:27 | 0:57:29 | |
Now, the technology to connect the world, unleashed by a few billionaire | 0:57:29 | 0:57:34 | |
entrepreneurs, is having a dramatic influence on our politics. | 0:57:34 | 0:57:39 | |
The people who are responsible for building this technology, | 0:57:39 | 0:57:42 | |
for unleashing this disruption onto all of us, | 0:57:42 | 0:57:47 | |
don't ever feel like they are responsible for the consequences | 0:57:47 | 0:57:51 | |
of any of that. | 0:57:51 | 0:57:53 | |
They retain this absolute religious faith that technology and | 0:57:53 | 0:57:58 | |
connectivity is always going to make things turn out for the best. | 0:57:58 | 0:58:03 | |
And it doesn't matter what happens, | 0:58:03 | 0:58:05 | |
it doesn't matter how much that's proven not to be the case, | 0:58:05 | 0:58:08 | |
they still believe. | 0:58:08 | 0:58:10 | |
How did Silicon Valley become so influential? | 0:58:22 | 0:58:25 | |
The Open University has produced an interactive timeline | 0:58:25 | 0:58:28 | |
exploring the history of this place. | 0:58:28 | 0:58:30 | |
To find out more, visit... | 0:58:30 | 0:58:32 | |
..and follow the links to the Open University. | 0:58:36 | 0:58:40 |