Inside the Dark Web (Horizon)



Transcript



It's just 25 years since the World Wide Web was created.

0:00:020:00:05

It now touches all of our lives, our personal information

0:00:050:00:09

and data swirling through the internet on a daily basis.

0:00:090:00:13

Yet it's now caught in the greatest controversy of its life -

0:00:130:00:17

surveillance.

0:00:170:00:18

This is a spy master's dream.

0:00:180:00:21

No spy of the previous generations could have imagined that we

0:00:210:00:25

would all volunteer for the world's best tracking device.

0:00:250:00:28

The revelations of US intelligence contractor Edward Snowden

0:00:280:00:32

have led many to ask if the Web we love has been turned against us...

0:00:320:00:37

They don't just want your search data or your e-mail.

0:00:370:00:40

They want everything.

0:00:400:00:42

And, as you are being surveilled 24/7, you are more under control.

0:00:420:00:47

You are less free.

0:00:470:00:49

..leading to soul-searching amongst those responsible for the Web itself.

0:00:490:00:54

I used to think that in some countries you worry about the

0:00:540:00:57

government and in some countries you worry about the corporations.

0:00:570:01:00

I realise now that that was naive.

0:01:000:01:03

But thanks to a collection of brilliant thinkers

0:01:030:01:06

and researchers, science has been fighting back...

0:01:060:01:08

It's really no surprise that the privacy issue has unfolded the way it has.

0:01:080:01:15

..developing technology to defeat surveillance, protecting activists...

0:01:150:01:20

They tried to intimidate me. I knew that they don't know anything.

0:01:200:01:24

..and in the process coming into conflict with global power.

0:01:240:01:28

They are detaining me at airports, threatening me.

0:01:280:01:33

But now, thanks to a new digital currency...

0:01:330:01:36

If you're buying less than half a Bitcoin, you'll have to go to market price.

0:01:360:01:40

..this technology is sending law makers into a panic,

0:01:400:01:43

due to the growth of a new black market in the Dark Web.

0:01:430:01:47

It was like a buffet dinner for narcotics.

0:01:470:01:51

Our detection rate is dropping. It's risk-free crime.

0:01:510:01:55

This is the story of a battle to shape the technology which now

0:01:550:01:59

defines our world, and whoever wins will influence not only the

0:01:590:02:03

future of the internet but the very idea of what it means to be free.

0:02:030:02:09

The sickness that befalls the internet is something that

0:02:090:02:11

befalls the whole world.

0:02:110:02:13

Where does the outside world stop and private space begin?

0:02:240:02:29

What details of your life are you willing to share with strangers?

0:02:290:02:33

Take this house.

0:02:360:02:38

Every morning, lights come on.

0:02:400:02:44

Coffee brews - automatically.

0:02:440:02:48

Technology just like this is increasingly being

0:02:500:02:53

installed into millions of homes across the world

0:02:530:02:56

and it promises to change the way we live for ever.

0:02:560:03:01

So there are sensors of all kinds,

0:03:010:03:04

there's lights and locks and thermostats

0:03:040:03:05

and once they're connected

0:03:050:03:07

then our platform can make them do whatever you want them to do.

0:03:070:03:10

So as an example,

0:03:100:03:11

if I wake up in the morning, the house knows that I'm waking up.

0:03:110:03:15

It can wake up with me.

0:03:150:03:16

When we walk in the kitchen, it will play the local news

0:03:160:03:20

and sort of greet us into the day, tell us the weather forecast

0:03:200:03:23

so we know how to dress for the day and so on.

0:03:230:03:26

This technology is known as the internet of things,

0:03:260:03:30

where the objects in our houses - kitchen appliances,

0:03:300:03:33

anything electronic - can be connected to the internet.

0:03:330:03:37

But for it to be useful, we're going to have to share intimate details of our private life.

0:03:370:03:43

So this is my things app.

0:03:430:03:46

It can run on your mobile phone or on a tablet or something like that.

0:03:460:03:49

I can do things like look at the comings

0:03:490:03:51

and goings of family members.

0:03:510:03:52

It can automatically detect when we come and go

0:03:520:03:54

based on our mobile phones or you can have it detect your presence

0:03:540:03:58

with a little sensor, you can put it in your car or something like that.

0:03:580:04:01

So when we leave the house, and there's no-one home, that's when it'll lock up

0:04:010:04:04

and shut down all the electricity used and so on.

0:04:040:04:07
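
A minimal sketch of the automation rule just described, in Python. The device names and the single presence counter are invented for illustration; a real platform such as SmartThings exposes far richer APIs than this.

```python
# Invented devices; a toy "last one out" rule, not a real platform API.
home = {"phones_present": 2, "door_locked": False, "power_on": True}

def on_presence_change(phones_present):
    home["phones_present"] = phones_present
    if phones_present == 0:        # the last person has left
        home["door_locked"] = True
        home["power_on"] = False   # lock up and shut the electricity down

on_presence_change(0)
print(home)  # {'phones_present': 0, 'door_locked': True, 'power_on': False}
```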

For most of us, this is deeply private information.

0:04:070:04:11

Yet once we hand it over,

0:04:110:04:12

we have to trust a company to keep it confidential.

0:04:120:04:14

The consumer really owns 100% of their own data so they're opting in.

0:04:140:04:18

It's not something where that data would ever be shared

0:04:180:04:21

without their giving their permission.

0:04:210:04:23

This house represents a new normal where even the movements

0:04:280:04:33

within our own home are documented and stored.

0:04:330:04:36

It's a new frontier.

0:04:360:04:38

The internet is asking us to redefine what we consider private.

0:04:400:04:45

To understand the enormous changes taking place,

0:04:540:04:58

it's necessary to come here.

0:04:580:05:00

Almost 150 years ago, this hut was on the frontier of the world's

0:05:000:05:03

first information revolution - the telegraph.

0:05:030:05:08

It was a crucial hub for a global network of wires -

0:05:100:05:13

a role that is just as important today.

0:05:130:05:16

Seeing the cables in a room like this shows that the physical

0:05:180:05:21

infrastructure needed to move information

0:05:210:05:24

around the world hasn't changed very much in the past hundred years.

0:05:240:05:28

I think that the internet,

0:05:280:05:30

we tend to think of as a cloud floating somewhere off in cyberspace,

0:05:300:05:33

but they're physical wires, physical cables

0:05:330:05:35

and sometimes wireless signals that are communicating with each other.

0:05:350:05:40

Cornwall, where the telegraph cables come ashore,

0:05:400:05:44

still remains crucial for today's internet.

0:05:440:05:48

25% of all traffic passes through here.

0:05:480:05:52

Running from the United States

0:05:520:05:54

and other places to the United Kingdom are a large number of the

0:05:540:05:57

most significant fibre-optic cables that carry huge amounts of data.

0:05:570:06:03

Alongside this information super-highway is a site

0:06:030:06:06

belonging to the UK Government, GCHQ Bude.

0:06:060:06:10

We now know that this listening station has been gathering

0:06:120:06:15

and analysing everything that comes across these wires.

0:06:150:06:20

Any data that passes across the internet could theoretically come

0:06:200:06:23

down these cables, so that's e-mails, websites, the BitTorrent downloads,

0:06:230:06:28

the films that you're accessing through Netflix and online services.

0:06:280:06:33

The sheer amount of data captured here is almost impossible to comprehend.

0:06:330:06:38

In terms of what GCHQ were looking at, we know

0:06:380:06:41

from internal documents that, in 2011, they were tapping 200

0:06:410:06:45

ten-gigabit cables coming into Cornwall.

0:06:450:06:48

To give a rough idea of how much data that is, if you were to

0:06:480:06:52

digitise the entire contents of the British Library, then you

0:06:520:06:55

could transfer it down that set of cables in about 40 seconds.

0:06:550:07:00
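
The arithmetic behind that comparison is simple, as the sketch below shows. The roughly 10-terabyte figure for the digitised British Library is an assumption implied by the claim, not stated in the programme.

```python
# 200 cables at 10 gigabits per second each, moved for 40 seconds.
cables = 200
gbit_per_cable = 10
gbyte_per_second = cables * gbit_per_cable / 8   # 250 GB/s in total
print(gbyte_per_second * 40 / 1000, "TB moved in 40 seconds")  # 10.0 TB
```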

Tapping the wires is surprisingly simple.

0:07:000:07:03

The data carried by the fibre-optic cable just needs to be diverted.

0:07:110:07:17

A fibre-optic cable signal is a beam of light

0:07:170:07:20

travelling down a cable made from glass.

0:07:200:07:25

Pulses of light represent the pieces of information

0:07:250:07:28

travelling across the internet, which is the e-mails, the web pages,

0:07:280:07:31

everything that's going over the internet.

0:07:310:07:35

Every 50 miles or so, that signal becomes sufficiently

0:07:350:07:38

weak that it needs to be repeated, and this is the weak spot.

0:07:380:07:42

And it's very easy to insert an optical tap at that point.

0:07:430:07:48

And that's just what GCHQ did.

0:07:480:07:52

A device was placed into the beam of data which created a mirror

0:07:520:07:56

image of the millions of e-mails, web searches

0:07:560:07:59

and internet traffic passing through the cables every second.

0:07:590:08:02

What you effectively get is two copies of the signal, one going

0:08:050:08:09

off to the GCHQ and one carrying on in its original destination.

0:08:090:08:13
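
A sketch of that "mirror image" idea: a passive tap hands a copy of every chunk to a monitor while forwarding the original untouched. Purely illustrative; a real optical splitter works on light, below any software.

```python
def tap(stream, monitor):
    for chunk in stream:
        monitor.append(chunk)   # the copy that goes to the listening station
        yield chunk             # the original, carrying on to its destination

captured = []
traffic = ["email-1", "web-page-2", "search-3"]
delivered = list(tap(traffic, captured))
print(delivered == traffic, captured)  # traffic unaffected; copy retained
```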

All of the information going over those cables is able to be

0:08:220:08:25

replayed over the course of three days so you can rewind

0:08:250:08:29

and see what was going over the internet at a particular moment.

0:08:290:08:32

Analysing this amount of data is an impressive achievement

0:08:320:08:35

but it also attracts criticism.

0:08:350:08:38

When you see the capacity and the potential for that technology,

0:08:380:08:42

and the fact that it is being used without transparency

0:08:420:08:45

and without very high levels of accountability, it's incredibly

0:08:450:08:49

concerning because the power of that data to predict and analyse what

0:08:490:08:53

we're going to do is very, very high

0:08:530:08:55

and giving that power to somebody else,

0:08:550:08:57

regardless of the original or stated intentions, is very worrying.

0:08:570:09:01

We only know about the GCHQ project thanks to documents released

0:09:090:09:13

by US whistle-blower Edward Snowden, and the revelation has begun

0:09:130:09:18

to change the way many people think about privacy and the internet -

0:09:180:09:23

among them the inventor of the World Wide Web, Tim Berners-Lee.

0:09:230:09:27

Information is a funny sort of power.

0:09:270:09:30

The way a government can use it to

0:09:300:09:34

keep control of its citizens

0:09:340:09:35

is insidious and sneaky, nasty in some cases.

0:09:350:09:39

We have to all learn more about it

0:09:390:09:43

and we have to in a way rethink, rebase a lot of our philosophy.

0:09:430:09:49

Part of changing that philosophy is to understand that our lives are

0:09:500:09:54

not analysed in the first instance by people,

0:09:540:09:58

but by computer programs.

0:09:580:10:00

Some people just don't have an understanding about what's possible.

0:10:010:10:05

I've heard people say, "Well, we know nobody's reading our e-mails

0:10:050:10:08

"because they don't have enough people."

0:10:080:10:10

Actually, hello, it's not... These e-mails are not being read by people.

0:10:100:10:13

They're being read by machines.

0:10:130:10:15

They're being read by machines which can do the sorts of things

0:10:170:10:20

that search engines do and can look at all the e-mails

0:10:200:10:23

and at all the social connections and can watch.

0:10:230:10:27

Sometimes these machines learn how to spot trends

0:10:270:10:31

and can build systems which will just watch a huge amount of data

0:10:310:10:36

and start to pick things out and then will suggest to the

0:10:360:10:40

security agency, well, "These people need to be investigated."

0:10:400:10:44

But then the thing could be wrong. Boy, we have to have a protection.

0:10:460:10:50

The Snowden revelations have generated greater interest

0:10:520:10:55

than ever in how the internet is being used for the purposes

0:10:550:10:58

of surveillance.

0:10:580:11:01

But watching isn't just done by governments.

0:11:010:11:04

The most detailed documenting of our lives is done by technology companies.

0:11:080:11:13

Two years ago, tech researcher Julia Angwin decided to investigate

0:11:130:11:17

how much these companies track our behaviour daily.

0:11:170:11:21

Her findings give us one of the best pictures yet of just who is

0:11:210:11:25

watching us online every minute of every day.

0:11:250:11:29

When I talk about the underbelly of the information revolution,

0:11:290:11:32

I'm really talking about the unseen downside

0:11:320:11:37

of the information revolution.

0:11:370:11:38

You know, we obviously have seen all the benefits of having

0:11:380:11:41

all this information at our fingertips.

0:11:410:11:43

But we're just awakening to the fact that we're also being

0:11:430:11:46

surveilled all the time.

0:11:460:11:48

We're being monitored in ways that were never before possible.

0:11:480:11:52

Every time we browse the internet,

0:11:550:11:57

what we do can be collated and sold to advertisers.

0:11:570:12:02

So, basically, online there are hundreds of companies that

0:12:020:12:05

sort of install invisible tracking technology on websites.

0:12:050:12:09

They've installed basically a serial number on your computer

0:12:130:12:16

and they watch you whenever they see you across the Web

0:12:160:12:19

and build a dossier about your reading habits,

0:12:190:12:21

your shopping habits, whatever they can obtain.

0:12:210:12:24

And then there's a real market for that data.

0:12:240:12:26

They buy and sell it

0:12:260:12:27

and there's an online auction bidding for information about you.

0:12:270:12:31

And so the people who know your browsing habits can also discover

0:12:310:12:35

your deepest secrets, and then sell them to the highest bidder.

0:12:350:12:40
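
A minimal sketch of the tracking mechanism being described: a pseudonymous "serial number" set once per browser, then a dossier keyed by it. The browser and site names are invented for illustration.

```python
import uuid
from collections import defaultdict

cookies = {}                    # browser -> tracker ID
dossiers = defaultdict(list)    # tracker ID -> browsing history

def track(browser, site):
    tracker_id = cookies.setdefault(browser, str(uuid.uuid4()))
    dossiers[tracker_id].append(site)

track("alice-laptop", "news.example")
track("alice-laptop", "pregnancy-test-reviews.example")
print(dossiers[cookies["alice-laptop"]])  # the dossier up for auction
```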

So let's say a woman takes a pregnancy test

0:12:420:12:44

and finds out she's pregnant.

0:12:440:12:46

Then she might go look for something online.

0:12:460:12:49

By making pregnancy-related searches, this woman has become a

0:12:490:12:53

hot property for the people looking for a good target for advertising.

0:12:530:12:58

The process of selling then begins.

0:12:580:13:00

Within seconds, she is identified by the companies watching her.

0:13:020:13:07

Her profile is now sold multiple times.

0:13:070:13:09

She will then find herself bombarded with ads relating to pregnancy.

0:13:110:13:15

She may well find that she's been followed around the Web by ads

0:13:170:13:20

and that's really a result of that auction house.

0:13:200:13:23

Now they will say to you that they know, they know she's pregnant but they don't know her name

0:13:230:13:27

but what's happening is now that,

0:13:270:13:29

more and more, that information is really not that anonymous.

0:13:290:13:32

You know it's sort of like if they know you're pregnant,

0:13:320:13:34

and they know where you live, yes maybe they don't know your name.

0:13:340:13:37

That's just because they haven't bothered to look it up.

0:13:370:13:40

We continually create new information about ourselves and this

0:13:400:13:43

information gives a continual window into our behaviours and habits.

0:13:430:13:48

The next stage of surveillance comes from the offices

0:13:480:13:51

and buildings around us,

0:13:510:13:55

sending out Wi-Fi signals across the city.

0:13:550:13:59

OK, so let's see how many signals we have right here, Wi-Fi.

0:13:590:14:03

So we have one, two - 16, 17, 18, 19,

0:14:030:14:08

20, 21, oh, and a few more just added themselves,

0:14:080:14:11

so we're around 25 right now.

0:14:110:14:13

Right there at this point we have 25 signals that

0:14:130:14:15

are reaching out, basically

0:14:150:14:16

sending a little signal to my phone saying, "I'm here,"

0:14:160:14:18

and my phone is sending a signal back saying, "I'm also here."

0:14:180:14:21

As long as your phone's Wi-Fi connection is on

0:14:210:14:23

and connected to a signal, you can be tracked.

0:14:230:14:28

Google, Apple, other big companies are racing to map the whole

0:14:280:14:31

world using Wi-Fi signals and then, whenever your phone is somewhere,

0:14:310:14:35

they know exactly how far you are from the closest Wi-Fi

0:14:350:14:38

signal and they can map you much more precisely even than GPS.

0:14:380:14:41
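
A minimal sketch of Wi-Fi positioning. The log-distance path-loss model, the pre-surveyed map of access points and the signal readings are all assumptions for illustration; the real Google and Apple systems are far more elaborate.

```python
def rssi_to_metres(rssi_dbm, tx_power_dbm=-40, path_loss_exp=2.5):
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

aps = {"cafe": (0, 0), "office": (30, 0), "shop": (0, 40)}  # surveyed (x, y), metres
readings = {"cafe": -55, "office": -70, "shop": -65}        # phone's RSSI, dBm

# Weighted centroid: stronger (nearer) access points pull the estimate harder.
weights = {name: 1 / rssi_to_metres(rssi) for name, rssi in readings.items()}
total = sum(weights.values())
x = sum(aps[name][0] * w for name, w in weights.items()) / total
y = sum(aps[name][1] * w for name, w in weights.items()) / total
print(f"estimated position: ({x:.1f}, {y:.1f}) m")
```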

And we now know that this data has been seized by governments

0:14:410:14:45

as part of their internet surveillance operations.

0:14:450:14:50

The NSA was like, "Oh, that's an awesome way to track people.

0:14:500:14:53

"Let's scoop up that information too."

0:14:530:14:57

So we have seen that the governments find all this data irresistible.

0:14:570:15:02

But is this simply a matter of principle?

0:15:050:15:08

Is losing privacy ultimately the price

0:15:150:15:18

we pay for peaceful streets and freedom from terror attacks?

0:15:180:15:22

This man thinks we should look deeper.

0:15:260:15:28

Bruce Schneier is a leading internet security expert.

0:15:310:15:34

He was part of the team which first analysed the Snowden documents.

0:15:340:15:39

They don't just want your search data

0:15:390:15:41

or your e-mail. They want everything.

0:15:410:15:43

They want to tie it to your real world behaviours - location

0:15:430:15:46

data from your cellphone - and it's these correlations.

0:15:460:15:49

And as you are being surveilled 24/7, you are more under control.

0:15:490:15:56

Right? You are less free, you are less autonomous.

0:15:560:16:00

Schneier believes the ultimate result from all this

0:16:000:16:03

surveillance may be a loss of freedom.

0:16:030:16:05

What data does is give someone control over you.

0:16:050:16:09

The reason Google and Facebook are collecting this

0:16:090:16:12

is for psychological manipulation.

0:16:120:16:15

That's their stated business purpose, right? Advertising.

0:16:150:16:20

They want to convince you

0:16:200:16:22

to buy things you might not want to buy otherwise.

0:16:220:16:26

And so that data is all about control.

0:16:260:16:28

Governments collect it also for control. Right?

0:16:300:16:33

They want to control their population.

0:16:330:16:35

Maybe they're concerned about dissidents,

0:16:350:16:37

maybe they're concerned about criminals.

0:16:370:16:40

It's hard to imagine that this data can exert such

0:16:430:16:46

a level of potential control over us but looking for the patterns

0:16:460:16:51

in the data can give anyone analysing it huge power.

0:16:510:16:57

What's become increasingly understood is how much is

0:16:590:17:03

revealed by meta-data, by the information about who is

0:17:030:17:06

sending messages and how often they're sending them to each other.

0:17:060:17:09

This reveals information about our social networks, it can

0:17:090:17:12

reveal information about the places that we go on a day-to-day basis.

0:17:120:17:17

All this meta-data stacks up to allow a very detailed

0:17:170:17:20

view into our lives and increasingly what

0:17:200:17:23

we find is that these patterns of communication can even be

0:17:230:17:26

used in a predictive sense to determine factors about our lives.

0:17:260:17:31
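
A sketch of traffic analysis on metadata alone: no message content, just who contacted whom and how often. The records are invented for illustration.

```python
from collections import Counter

records = [("alice", "bob"), ("alice", "bob"), ("alice", "clinic"),
           ("bob", "carol"), ("alice", "bob"), ("alice", "clinic")]

for (src, dst), n in Counter(records).most_common():
    print(f"{src} -> {dst}: {n} contacts")
# Repeated alice -> clinic contacts hint at something private
# without reading a single message body.
```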

And what some people fear is what the spread of this analysis could lead to.

0:17:310:17:36

Ultimately the concern of this is that, in a dystopian scenario,

0:17:360:17:40

you have a situation where every facet of your life is

0:17:400:17:44

something that is open to analysis by whoever has access to the data.

0:17:440:17:48

They can look at the likelihood that you should be given health insurance

0:17:480:17:53

because you come from a family that has a history of heart disease.

0:17:530:17:57

They can look at the likelihood that you're going to get

0:17:570:18:00

Alzheimer's disease because of the amount of active intellectual

0:18:000:18:04

entertainment you take part in on a day-to-day basis.

0:18:040:18:06

And when you start giving this level of minute control over

0:18:060:18:10

people's lives, then you allow far too much power over individuals.

0:18:100:18:14

The picture painted is certainly dark but there is another way.

0:18:170:18:23

It comes from the insights of a scientist whose work once

0:18:270:18:31

seemed like a footnote in the history of the internet - until now.

0:18:310:18:37

The story begins in the late '70s in Berkeley, California.

0:18:420:18:46

Governments and companies had begun to harness the power of computing.

0:18:530:18:57

They seemed to promise a future of efficiency,

0:19:000:19:03

of problems being overcome thanks to technology.

0:19:030:19:06

Yet not everyone was so convinced.

0:19:090:19:12

David Chaum was a computer scientist at Berkeley.

0:19:120:19:16

For him, a world in which we would become increasingly joined

0:19:160:19:19

together by machines in a network held grave dangers.

0:19:190:19:23

As computing advanced, he grew increasingly

0:19:260:19:29

convinced of the threats these networks could pose.

0:19:290:19:32

David Chaum was very far ahead of his time.

0:19:330:19:36

He predicted in the early 1980s concerns that would

0:19:360:19:40

arise on the internet 15 or 20 years later -

0:19:400:19:43

the whole field of traffic analysis

0:19:430:19:45

that allows you to predict the behaviours of individuals,

0:19:450:19:47

not by looking at the contents of their e-mails but by looking at the patterns of communication.

0:19:470:19:52

David Chaum to some extent foresaw that and solved the problem.

0:19:520:19:56

Well, it's sad to me but it is really no surprise

0:20:010:20:06

that the privacy issue has unfolded the way it has.

0:20:060:20:10

I spelled it out in the early publications in the '80s.

0:20:100:20:16

Chaum's papers explained that in a future world where we would

0:20:160:20:19

increasingly use computers, it would be easy

0:20:190:20:22

to conduct mass surveillance.

0:20:220:20:25

Chaum wanted to find a way to stop it.

0:20:250:20:29

I always had a deep feeling that privacy is intimately tied to

0:20:290:20:34

human potential and

0:20:340:20:38

that it's an extraordinarily important aspect of democracy.

0:20:380:20:43

Chaum focused on the new technology of e-mails.

0:20:460:20:48

Anyone watching the network through which these messages

0:20:500:20:53

travelled could find out enormous amounts about that person.

0:20:530:20:57

He wanted to make this more difficult.

0:20:570:20:59

Well, I was driving from Berkeley to Santa Barbara

0:21:010:21:05

along the coastline in my VW Camper van and

0:21:050:21:09

out of nowhere... You know, it was

0:21:090:21:11

beautiful scenery, I was just driving along and it occurred to me

0:21:110:21:14

how to solve this problem I'd been trying to solve for a long time.

0:21:140:21:18

Yeah, it was a kind of a, you know, a eureka-moment type.

0:21:180:21:21

I felt like, "Hey, this is it."

0:21:210:21:23

Chaum's focus was the pattern of communications that

0:21:260:21:29

a computer made on the network.

0:21:290:21:31

If that pattern could be disguised using cryptography, it would

0:21:310:21:35

be harder for anyone watching to identify individuals and carry out

0:21:350:21:39

effective surveillance,

0:21:390:21:42

and Chaum's system had a twist.

0:21:420:21:46

Cryptography has traditionally been used to provide

0:21:460:21:50

secrecy for message content and so I used this message secrecy

0:21:500:21:57

technology of encryption to actually protect the meta-data of who

0:21:570:22:03

talks to who and when, and that was quite a paradigm shift.

0:22:030:22:09

Chaum had realised something about surveillance.

0:22:090:22:12

Who we talk to and when is just as important as what we say.

0:22:120:22:17

The key to avoiding this type of traffic analysis was to

0:22:190:22:23

render the user effectively anonymous.

0:22:230:22:25

But he realised that wasn't enough.

0:22:300:22:34

He wanted to build a secure network and to do this

0:22:340:22:39

he needed more anonymous users.

0:22:390:22:41

One cannot be anonymous alone.

0:22:440:22:46

One can only be anonymous relative to a set of people.

0:22:460:22:52

The more anonymous users you can gather together in a network,

0:22:520:22:56

the harder it becomes for someone watching to keep track of them,

0:22:560:23:00

especially if they're mixed up.

0:23:000:23:03

And so a whole batch of input messages from different

0:23:030:23:07

people are shuffled and then sent to another computer, then shuffled

0:23:070:23:14

again and so forth, and you can't tell as an observer of the network

0:23:140:23:19

which item that went in corresponds to which item coming out.

0:23:190:23:24
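
A toy version of that shuffling step: take in a batch of messages, change their appearance (a stand-in for the re-encryption a real Chaum mix performs with public-key cryptography), shuffle, and forward.

```python
import random

def mix(batch):
    changed = [f"re-encrypted({m})" for m in batch]  # alter appearance
    random.shuffle(changed)                          # destroy the ordering
    return changed

print(mix(["msg-from-alice", "msg-from-bob", "msg-from-carol"]))
# An observer of the wire cannot match an output to the input it came from.
```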

David Chaum was trying to provide protection against a world in which

0:23:280:23:33

our communications would be analysed and potentially used against us.

0:23:330:23:37

Chaum's response to this was to say, in order to have a free society,

0:23:370:23:43

we need to have freedom from analysis of our behaviours and our communications.

0:23:430:23:48

But Chaum's system didn't take off because communication using

0:23:480:23:51

e-mail was still the preserve of a few academics and technicians.

0:23:510:23:55

Yet his insights weren't forgotten.

0:23:580:24:00

Within a decade, the arrival of the World Wide Web took

0:24:030:24:07

communication increasingly online.

0:24:070:24:10

The US Government understood the importance of protecting

0:24:160:24:19

its own online communications from surveillance.

0:24:190:24:22

It began to put money into research.

0:24:220:24:24

At the US Naval Research Laboratory,

0:24:290:24:32

a team led by scientist Paul Syverson got to work.

0:24:320:24:37

Suppose we wanted to have a system where people could communicate

0:24:410:24:46

back to their home office or with each other over the internet

0:24:460:24:51

but without people being able to associate source and destination.

0:24:510:24:58

Syverson soon came across the work of David Chaum.

0:24:580:25:02

The first work which is associated with this area is

0:25:020:25:07

the work of David Chaum.

0:25:070:25:08

A Chaum mix basically gets its security

0:25:090:25:13

because it takes in a bunch of messages

0:25:130:25:16

and then re-orders them and changes their appearance and spews them out.

0:25:160:25:22

But this was now the age of the World Wide Web.

0:25:220:25:26

The Navy wanted to develop anonymous communications for this new era.

0:25:260:25:30

So Syverson and his colleagues set to work on building a system

0:25:340:25:38

that could be used by operatives across the world.

0:25:380:25:41

You have enough of a network with enough distribution that it's

0:25:450:25:51

going to be very hard for an adversary to be in all

0:25:510:25:54

the places and to see all the traffic wherever it is.

0:25:540:25:59

Syverson's system was called the Tor network.

0:25:590:26:03

Tor stands for "the onion router". It works like this.

0:26:030:26:07

A user wants to visit a website but doesn't want to

0:26:100:26:13

reveal their IP address, the marker that identifies their computer.

0:26:130:26:18

As they send the request, three layers of encryption

0:26:180:26:21

are placed around it like the layers of an onion.

0:26:210:26:25

The message is then sent through a series of computers which

0:26:250:26:28

have volunteered to act as relay points.

0:26:280:26:31

As the message passes from computer to computer,

0:26:330:26:36

a layer of encryption is removed.

0:26:360:26:39

Each time it is removed, all the relay computer can see

0:26:390:26:43

is an order which tells it to pass the message on.

0:26:430:26:46

The final computer relay decrypts the innermost

0:26:460:26:49

layer of encryption, revealing the content of the communication.

0:26:490:26:53

However - importantly - the identity of the user is hidden.

0:26:530:26:57
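
A toy rendering of those three layers in Python, using Fernet from the `cryptography` package as a stand-in cipher. Real Tor builds telescoping circuits over TLS rather than wrapping a message this way; the sketch only shows the layer-peeling idea.

```python
from cryptography.fernet import Fernet

relays = [Fernet(Fernet.generate_key()) for _ in range(3)]

request = b"GET http://example.com/"
for relay in reversed(relays):          # innermost layer is for the exit relay
    request = relay.encrypt(request)

for i, relay in enumerate(relays, 1):   # each hop peels exactly one layer
    request = relay.decrypt(request)
    print(f"relay {i} peeled a layer")
print(request)  # only the exit sees the content; no relay sees both ends
```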

Somebody who wants to look at things around the Web and not necessarily

0:27:000:27:06

have people know what he's interested in. It might just be the

0:27:060:27:10

local internet service provider, he doesn't want them to know

0:27:100:27:13

which things he's looking at, but it might be also the destination.

0:27:130:27:17

Syverson's system worked.

0:27:200:27:21

It was now possible to surf the net without being watched.

0:27:240:27:28

As David Chaum had observed, the more anonymous people,

0:27:300:27:34

the better the security.

0:27:340:27:36

The Navy had what they believed was a smart way of achieving that...

0:27:360:27:40

..open the network out to everyone.

0:27:450:27:47

It's not enough for a government system to carry traffic just for the government.

0:27:500:27:55

It also has to carry traffic for other people.

0:27:550:27:59

Part of anonymity is having a large number of people who are also

0:27:590:28:02

anonymous because you can't be anonymous on your own.

0:28:020:28:06

What Syverson and his team had done, building on the work of David Chaum,

0:28:060:28:11

would begin to revolutionise the way that people could operate online.

0:28:110:28:15

Over the coming years,

0:28:190:28:21

the Tor network expanded as more people volunteered to become

0:28:210:28:25

relay computers, the points through which the messages could be relayed.

0:28:250:28:30

What Tor did was it made a useable system for people.

0:28:300:28:33

People wanted to protect what they were

0:28:330:28:35

looking at on the internet from being watched by their ISP or

0:28:350:28:39

their government or the company that they're working for at the time.

0:28:390:28:42

But Tor's success wasn't just down to the Navy.

0:28:420:28:46

In the mid-2000s, they handed the network over to a non-profit

0:28:500:28:54

organisation who overhauled the system.

0:28:540:28:57

Now the network would be represented by people like this -

0:28:590:29:03

Jake Appelbaum...

0:29:030:29:05

..researchers dedicated to the opportunities

0:29:070:29:10

they felt Tor could give for free speech.

0:29:100:29:12

We work with the research community

0:29:160:29:18

all around the world, the academic community, the hacker community.

0:29:180:29:21

It's a free software project

0:29:210:29:23

so that means that all the source code is available.

0:29:230:29:25

That means that anyone can look at it and see how it works,

0:29:250:29:27

and that means everybody that does and shares it with us helps improve

0:29:270:29:31

the program for everyone else on the planet and the network as a whole.

0:29:310:29:35

Appelbaum now travels the world promoting the use of the software.

0:29:360:29:43

The Tor network gives each person the ability to read without

0:29:430:29:46

creating a data trail that will later be used against them.

0:29:460:29:50

It gives every person a voice.

0:29:500:29:54

Every person has the right to read and to speak freely,

0:29:540:29:56

not one human excluded.

0:29:560:29:58

And one place Tor has become important is the Middle East.

0:29:580:30:02

During the Arab Spring,

0:30:050:30:06

as disturbances spread across the region, it became a vital

0:30:060:30:10

tool for dissidents...

0:30:100:30:11

..especially in places like Syria.

0:30:130:30:16

One of those who used it from the beginning was opposition activist Reem al Assil.

0:30:180:30:22

I found out first about Tor back in 2011.

0:30:260:30:29

Surveillance in Syria is a very big problem for activists

0:30:320:30:37

or for anyone even,

0:30:370:30:39

because the Syrian regime are trying all the time to get into people's

0:30:390:30:44

e-mails and Facebook to see what they are up to, what they are doing.

0:30:440:30:50

By the time the Syrian uprising happened,

0:30:500:30:52

the Tor project had developed a browser which made

0:30:520:30:56

downloading the software very simple.

0:30:560:30:59

You basically go and download Tor in your computer

0:30:590:31:02

and once it's installed, whenever you want to browse the Web,

0:31:020:31:07

you go and click on it just like Internet Explorer or Google Chrome.

0:31:070:31:13

Reem had personal experience of the protection offered by Tor

0:31:130:31:17

when she was arrested by the secret police.

0:31:170:31:19

I denied having any relation with any opposition work, you know,

0:31:220:31:26

or anything and they tried to intimidate me

0:31:260:31:29

and they said, "Well, see, we have... we know everything about you so now,

0:31:290:31:34

"just we need you to tell us," but I knew that they don't know anything.

0:31:340:31:39

By using Tor, I was an anonymous user so they couldn't tell

0:31:390:31:43

that Reem is doing so-and-so, is watching so-and-so.

0:31:430:31:48

So that's why Tor protected me in this way.

0:31:480:31:52

Syria was not the only place where Tor was vital.

0:31:540:31:57

It's used in China and in Iran.

0:32:010:32:06

In any country where internet access is restricted,

0:32:060:32:09

Tor can be used by citizens to avoid the gaze of the authorities.

0:32:090:32:16

China, for example, regularly attacks and blocks the Tor network

0:32:160:32:19

and they don't attack us directly

0:32:190:32:22

so much as they actually attack people in China using Tor.

0:32:220:32:25

They stop them from using the Tor network.

0:32:250:32:27

But Tor wasn't just helping inside repressive regimes.

0:32:290:32:34

It was now being used for whistle-blowing in the West,

0:32:340:32:39

through WikiLeaks,

0:32:390:32:42

founded by Julian Assange.

0:32:420:32:44

Assange has spent the last two years under

0:32:490:32:52

the protection of the Ecuadorian Embassy in London.

0:32:520:32:55

He is fighting extradition to Sweden on sexual assault charges,

0:32:550:32:59

charges he denies.

0:32:590:33:01

I'd been involved in cryptography and anonymous communications

0:33:020:33:05

for almost 20 years, since the early 1990s.

0:33:050:33:09

Cryptographic anonymity didn't come from nowhere.

0:33:090:33:12

It was a long-standing quest which had a Holy Grail,

0:33:120:33:14

which is to be able to communicate

0:33:140:33:17

individual-to-individual freely and anonymously.

0:33:170:33:20

Tor was the first protocol,

0:33:200:33:23

first anonymous protocol,

0:33:230:33:25

that got the balance right.

0:33:250:33:27

From its early years, people who wanted to submit documents

0:33:290:33:32

anonymously to WikiLeaks could use Tor.

0:33:320:33:35

Tor was and is one of the mechanisms

0:33:370:33:40

through which we have received important documents, yes.

0:33:400:33:42

One man who provided a link between WikiLeaks

0:33:450:33:48

and the Tor project was Jake Appelbaum.

0:33:480:33:51

Sources that want to leak documents

0:33:530:33:56

need to be able to communicate with WikiLeaks

0:33:560:33:58

and it has always been the case

0:33:580:34:00

that they have offered a Tor-hidden service and that Tor-hidden service

0:34:000:34:05

allows people to reach the WikiLeaks submission engine.

0:34:050:34:09

In 2010, what could be achieved

0:34:100:34:13

when web activism met anonymity was revealed to the world.

0:34:130:34:17

WikiLeaks received a huge leak

0:34:220:34:24

of confidential US government material,

0:34:240:34:27

mainly relating to the wars in Afghanistan and Iraq.

0:34:270:34:30

The first release was this footage

0:34:360:34:38

which showed a US Apache helicopter attack in Iraq.

0:34:380:34:41

Among those dead were two journalists from Reuters,

0:34:490:34:53

Saeed Chmagh and Namir Noor-Eldeen.

0:34:530:34:55

The Americans investigated,

0:34:560:34:58

but say there was no wrong-doing

0:34:580:35:01

yet this footage would never have become public if Chelsea Manning,

0:35:010:35:05

a US Army intelligence analyst in Iraq, hadn't leaked it.

0:35:050:35:08

Chelsea Manning has said that to the court,

0:35:100:35:13

that he used Tor amongst a number of other things

0:35:130:35:16

to submit documents to WikiLeaks.

0:35:160:35:19

Obviously, we can't comment on that,

0:35:190:35:20

because we have an obligation to protect our sources.

0:35:200:35:23

The release of the documents provided by Manning,

0:35:260:35:29

which culminated in 250,000 cables,

0:35:290:35:32

seemed to reveal the power of anonymity through encryption.

0:35:320:35:35

Because with encryption,

0:35:380:35:39

two people can come together to communicate privately.

0:35:390:35:43

The full might of a superpower cannot break that encryption

0:35:430:35:47

if it is properly implemented and that's an extraordinary thing,

0:35:470:35:50

where individuals are given a certain type of freedom of action

0:35:500:35:54

that is equivalent to the freedom of action that a superpower has.

0:35:540:35:57

But it wasn't the technology that let Manning down.

0:35:590:36:02

She confessed what she had done to a contact and was arrested.

0:36:040:36:08

Those who had used anonymity to leak secrets were now under fire.

0:36:110:36:15

WikiLeaks had already been criticised for the release

0:36:190:36:22

of un-redacted documents revealing the names of Afghans

0:36:220:36:25

who had assisted the US.

0:36:250:36:27

The US government was also on the attack.

0:36:280:36:31

The United States strongly condemns the illegal

0:36:310:36:35

disclosure of classified information.

0:36:350:36:38

It puts people's lives in danger, threatens our national security...

0:36:380:36:43

Meanwhile, the Tor project, started by the US government,

0:36:450:36:49

was becoming a target.

0:36:490:36:51

It's very funny, right, because on the one hand,

0:36:530:36:56

these people are funding Tor because they say they believe in anonymity.

0:36:560:36:59

And on the other hand, they're detaining me at airports,

0:36:590:37:02

threatening me and doing things like that.

0:37:020:37:04

And they've even said to me, "We love what you do in Iran

0:37:040:37:07

"and in China, in helping Tibetan people.

0:37:070:37:09

"We love all the stuff that you're doing,

0:37:090:37:11

"but why do you have to do it here?"

0:37:110:37:13

Thanks to Edward Snowden,

0:37:150:37:16

we now know that this culminated in the Tor network

0:37:160:37:19

being the focus of failed attacks by America's National Security Agency.

0:37:190:37:23

They revealed their frustration

0:37:250:37:27

in a confidential PowerPoint presentation called Tor Stinks,

0:37:270:37:31

which set out the ways in which the NSA had tried to crack the network.

0:37:310:37:36

They think Tor stinks because they want to attack people

0:37:360:37:40

and sometimes technology makes that harder.

0:37:400:37:43

It is because the users have something which bothers them,

0:37:430:37:47

which is real autonomy.

0:37:470:37:49

It gives them true privacy and security.

0:37:490:37:52

Tor, invented and funded by the US government,

0:37:520:37:55

was now used by activists, journalists,

0:37:550:37:58

anybody who wanted to communicate anonymously,

0:37:580:38:02

and it wasn't long before its potential

0:38:020:38:04

began to attract a darker type of user.

0:38:040:38:07

It began here in Washington.

0:38:120:38:13

Jon Iadonisi is a former Navy SEAL

0:38:140:38:17

turned advisor on cyber operations to government.

0:38:170:38:20

There were some tips that came in out of Baltimore,

0:38:220:38:26

two federal agents saying,

0:38:260:38:28

"You are a police officer,

0:38:280:38:30

"you really should take a look at this website

0:38:300:38:32

"and, oh, by the way, the only way you get to it

0:38:320:38:35

"is if you anonymise yourself

0:38:350:38:37

"through something called the Tor router."

0:38:370:38:39

What they found was a website called Silk Road.

0:38:390:38:44

And they were amazed when they were able to download this plug-in

0:38:440:38:48

on their browser, go into the Silk Road

0:38:480:38:51

and then from there, see, literally they can make a purchase,

0:38:510:38:55

it was like a buffet dinner for narcotics.

0:38:550:38:58

Silk Road was a global drugs marketplace

0:39:010:39:04

which brought together anonymous buyers and sellers

0:39:040:39:07

from around the world.

0:39:070:39:08

And what they found was people aren't just buying

0:39:100:39:13

one or two instances of designer drugs

0:39:130:39:15

but they're buying massive quantity wholesale.

0:39:150:39:18

In London, the tech community was watching closely.

0:39:180:39:23

Thomas Olofsson was one of many interested

0:39:230:39:26

in how much money the site was making.

0:39:260:39:28

Well, in Silk Road, they have something called an "escrow" system

0:39:280:39:32

so if you want to buy drugs, you pay money into Silk Road

0:39:320:39:36

as a facilitator

0:39:360:39:37

and they keep the money in escrow until you sign off

0:39:370:39:40

that you have had your drugs delivered.

0:39:400:39:43

They will then release your money to the drug dealer.

0:39:430:39:48

We're talking about several millions a day in trade.

0:39:480:39:51
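
A minimal sketch of the escrow flow just described: the marketplace holds the buyer's coins until delivery is confirmed. All names and amounts are invented.

```python
class Escrow:
    def __init__(self):
        self.held = {}  # order id -> (buyer, seller, amount in BTC)

    def deposit(self, order, buyer, seller, amount):
        self.held[order] = (buyer, seller, amount)   # paid in, not yet to seller

    def confirm_delivery(self, order):
        buyer, seller, amount = self.held.pop(order)  # buyer signs off
        return f"released {amount} BTC to {seller}"

e = Escrow()
e.deposit("order-1", "buyer-42", "vendor-9", 0.8)
print(e.confirm_delivery("order-1"))
```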

The extraordinary success of Silk Road

0:39:550:39:57

attracted new customers to new illegal sites.

0:39:570:40:01

This part of the internet even had a new name - the Dark Web.

0:40:020:40:06

A dark website is impossible to shut down

0:40:070:40:10

because you don't know where a dark website is hosted

0:40:100:40:13

or even where it's physically located or who's behind it.

0:40:130:40:17

And there was one other thing which made Silk Road

0:40:210:40:24

and its imitators difficult to stop.

0:40:240:40:26

You paid with a new currency that only exists online, called Bitcoin.

0:40:260:40:33

Before, even if you had anonymity as a user,

0:40:330:40:36

you could still track the transactions,

0:40:360:40:39

the money flowing between persons,

0:40:390:40:42

because if you use your Visa card, your ATM, bank transfer,

0:40:420:40:46

Western Union, there is always... Money leaves a mark.

0:40:460:40:49

This is the first time that you can anonymously

0:40:490:40:52

move money between two persons.

0:40:520:40:54

Bitcoin is no longer an underground phenomenon.

0:41:020:41:06

Buy Bitcoin, 445.

0:41:060:41:09

Sellers will sell at 460.

0:41:090:41:13

If you're buying less than half a Bitcoin,

0:41:130:41:15

you'll have to go to market price.

0:41:150:41:17

This Bitcoin event is taking place on Wall Street.

0:41:170:41:20

445 bid, 463 asked.

0:41:230:41:25

But how does the currency work?

0:41:280:41:30

One of the people best placed to explain is Peter Todd.

0:41:310:41:35

He's chief scientist for a number of Bitcoin companies

0:41:350:41:38

including one of the hottest, Dark Wallet.

0:41:380:41:42

So, what Bitcoin is, is it's virtual money.

0:41:440:41:47

I can give it to you electronically,

0:41:470:41:48

you can give it to someone else electronically.

0:41:480:41:51

The key thing to understand about Bitcoin

0:41:510:41:53

is that these two people trading here are making a deal

0:41:530:41:57

without any bank involvement.

0:41:570:41:59

It's a form of electronic cash

0:41:590:42:02

and that has massive implications.

0:42:020:42:05

So, of course, in normal electronic banking systems,

0:42:050:42:07

what I would say is, "Please transfer money from my account

0:42:070:42:10

"to someone else's account," but fundamentally,

0:42:100:42:13

who owns what money is recorded by the bank, by the intermediary.

0:42:130:42:17

What's really interesting about Bitcoin is this virtual money

0:42:190:42:23

is not controlled by a bank, it's not controlled by government.

0:42:230:42:26

It's controlled by an algorithm.

0:42:260:42:28

The algorithm in question is a triumph of mathematics.

0:42:280:42:31

This is a Bitcoin transaction in action.

0:42:320:42:35

To pay someone in Bitcoin, the transaction must be signed

0:42:380:42:41

using a cryptographic key which proves the parties agree to it.

0:42:410:42:46
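
A sketch of that signing step using the `cryptography` package; Bitcoin's actual transaction format and script rules are omitted, and the transaction text is invented.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

private_key = ec.generate_private_key(ec.SECP256K1())
tx = b"pay 0.5 BTC from alice-addr to bob-addr"
signature = private_key.sign(tx, ec.ECDSA(hashes.SHA256()))

# Anyone holding the public key can verify; verify() raises if forged.
private_key.public_key().verify(signature, tx, ec.ECDSA(hashes.SHA256()))
print("signature checks out")
```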

An unchangeable electronic record is then made of this transaction.

0:42:460:42:50

This electronic record contains every transaction

0:42:510:42:54

ever made on Bitcoin.

0:42:540:42:57

It's called the block chain

0:42:570:42:59

and it's stored in a distributed form by every user,

0:42:590:43:04

not by a bank or other authority.

0:43:040:43:07

The block chain is really the revolutionary part of Bitcoin.

0:43:070:43:10

What's really unique about it is it's all public

0:43:100:43:13

so you can run the Bitcoin algorithm on your computer

0:43:130:43:16

and your computer's inspecting every single transaction

0:43:160:43:19

to be sure that it actually followed the rules,

0:43:190:43:21

and the rules are really what Bitcoin is.

0:43:210:43:23

You know, that's the rules of the system, that's the algorithm.

0:43:230:43:26

It says things like, "You can only send money to one person at once,"

0:43:260:43:29

and we all agree to those rules.

0:43:290:43:32
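
A sketch of a hash-chained public ledger in the spirit described above. Mining and signatures are deliberately left out; the point is only that each block commits to the one before it.

```python
import hashlib, json

def make_block(prev_hash, txs):
    body = json.dumps({"prev": prev_hash, "txs": txs}, sort_keys=True)
    return {"prev": prev_hash, "txs": txs,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

genesis = make_block("0" * 64, ["alice pays bob 1 BTC"])
second = make_block(genesis["hash"], ["bob pays carol 0.5 BTC"])

# Rewriting history changes the hash, so every later link breaks.
genesis["txs"][0] = "alice pays bob 100 BTC"
body = json.dumps({"prev": genesis["prev"], "txs": genesis["txs"]}, sort_keys=True)
print(hashlib.sha256(body.encode()).hexdigest() == second["prev"])  # False
```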

And Bitcoin has one other characteristic

0:43:320:43:34

which it shares with cash.

0:43:340:43:37

It can be very hard to trace.

0:43:370:43:40

Well, what's controversial about Bitcoin

0:43:400:43:43

is that it goes back to something quite like cash.

0:43:430:43:46

It's not like a bank account where a government investigator

0:43:460:43:49

can just call up the bank and get all the records

0:43:490:43:51

of who I've ever transacted with without any effort at all.

0:43:510:43:55

You know, if they want to go and find out where I got my Bitcoins,

0:43:550:43:58

they're going to have to ask me.

0:43:580:43:59

They're going to have to investigate.

0:43:590:44:01

The emergence of Bitcoin and the growth of the Dark Web

0:44:050:44:08

was now leading law enforcement in Washington to take a close interest.

0:44:080:44:12

Transactions in the Dark Web were unbelievably more enabled

0:44:140:44:18

and in many cases could exist

0:44:180:44:21

because of this virtual currency known as Bitcoin.

0:44:210:44:24

So, as people started to take a look at Bitcoin

0:44:250:44:28

and understand, "How do we regulate this, how do we monitor this?

0:44:280:44:31

"Oh, my God! It's completely anonymous, we have no record,"

0:44:310:44:34

they started seeing transactions in the sort of the digital exhaust

0:44:340:44:38

that led them into Silk Road.

0:44:380:44:40

And so, now you had criminal grounds to start taking a look at this

0:44:400:44:44

coupled with the movement from the financial side and regulatory side

0:44:440:44:50

on the virtual currency.

0:44:500:44:52

So these two fronts began converging.

0:44:520:44:55

The FBI began to mount a complex plot

0:44:550:44:57

against the alleged lead administrator of Silk Road

0:44:570:45:01

who used the pseudonym "Dread Pirate Roberts".

0:45:010:45:04

They really started focusing on Dread Pirate Roberts

0:45:060:45:10

and his role as not just a leader but sort of the mastermind

0:45:100:45:14

in the whole ecosystem of the platform, right,

0:45:140:45:17

from management administratively to financial merchandising

0:45:170:45:22

to vendor placement, recruitment, etc.

0:45:220:45:25

Dread Pirate Roberts was believed to be Ross Ulbricht,

0:45:260:45:30

listed on his LinkedIn entry

0:45:300:45:32

as an investment advisor and entrepreneur from Austin, Texas.

0:45:320:45:37

Taking him down was really almost a story out of a Hollywood movie.

0:45:370:45:42

I mean, we had people that staged the death

0:45:420:45:45

of one of the potential informants.

0:45:450:45:48

We had undercover police officers acting as cocaine...

0:45:480:45:52

under-kingpins inside Silk Road.

0:45:520:45:54

So, at the conclusion of all these different elements,

0:45:540:45:58

they actually finally ended up bringing in Dread Pirate Roberts

0:45:580:46:03

and now are trying to move forward with his trial.

0:46:030:46:07

Whether or not he is Dread Pirate Roberts,

0:46:070:46:10

Ulbricht's arrest in 2013 brought down Silk Road.

0:46:100:46:14

In the process, the FBI seized $28.5 million

0:46:140:46:18

from the site's escrow account, money destined for drug dealers.

0:46:180:46:23

Yet the problem for the US authorities

0:46:260:46:28

was that their success was only temporary.

0:46:280:46:31

The site is now up and running again.

0:46:350:46:37

It re-emerged just two months later by some other guys

0:46:380:46:42

that took the same code base, the same,

0:46:420:46:44

actually the same site more or less,

0:46:440:46:46

just other people running the site,

0:46:460:46:48

because it's obviously a very, very profitable site to run.

0:46:480:46:53

And Silk Road has now been joined by a host of other sites.

0:46:530:46:57

One of the boom industries on the Dark Web is financial crime.

0:47:000:47:05

Yeah, this website is quite focused

0:47:060:47:08

on credit card numbers and stolen data.

0:47:080:47:11

On here, for instance, are credit card numbers

0:47:110:47:14

from around $5-6 per credit card number

0:47:140:47:17

paying the equivalent in Bitcoins, totally anonymous.

0:47:170:47:20

The cards are sold in something called a "dump".

0:47:210:47:25

I mean, it's quite a large number of entries, it's tens of thousands.

0:47:260:47:31

You get everything you need to be able to buy stuff online

0:47:310:47:34

with these credit cards - first name, last name, address,

0:47:340:47:39

the card number, everything, the CVV number,

0:47:390:47:42

everything but the PIN code, basically.

0:47:420:47:44

The Dark Web is now used for various criminal activities -

0:47:440:47:48

drugs and guns, financial crime,

0:47:480:47:51

and even child sexual exploitation.

0:47:510:47:54

So, is anonymity a genuine threat to society?

0:48:020:48:05

A nightmare that should haunt us?

0:48:080:48:10

This man who leads Europe's fight against cybercrime believes so.

0:48:150:48:20

The Tor network plays a role because it hides criminals.

0:48:200:48:24

I know it was not the intention,

0:48:240:48:26

but that's the outcome and this is my job,

0:48:260:48:28

to tell society what the trade-offs are here.

0:48:280:48:32

By having no possibilities to penetrate this,

0:48:320:48:35

we will then secure criminals

0:48:350:48:37

that they can continue their crimes on a global network.

0:48:370:48:40

And despite the success over Silk Road and others,

0:48:420:48:45

he is worried for the future.

0:48:450:48:47

Our detection rate is dropping. It's very simple.

0:48:500:48:53

The business model of the criminal is to make profit with low risk

0:48:530:48:57

and, here, you actually eliminate the risk

0:48:570:49:00

because there is no risk to get identified

0:49:000:49:02

so either you have to screw up or be very unlucky

0:49:020:49:05

as somebody rats you out.

0:49:050:49:07

Otherwise, you're secure. So, if you run a tight operation,

0:49:070:49:10

it's very, very difficult for the police to penetrate

0:49:100:49:13

so it's risk-free crime.

0:49:130:49:15

So, does the anonymity offered by the Tor network

0:49:150:49:18

encourage crime or simply displace it?

0:49:180:49:22

Those who work for the project are well aware of the charges it faces.

0:49:220:49:26

There is often asserted certain narratives about anonymity

0:49:260:49:30

and, of course, one of the narratives is that anonymity creates crime

0:49:300:49:35

so you hear about things like the Silk Road and you hear,

0:49:350:49:38

"Oh, it's terrible, someone can do something illegal on the internet."

0:49:380:49:41

Well, welcome to the internet.

0:49:410:49:43

It is a reflection of human society

0:49:430:49:45

where there is sometimes illegal behaviour.

0:49:450:49:48

These arguments aside,

0:49:480:49:50

for users wanting to avoid surveillance,

0:49:500:49:53

Tor has limitations -

0:49:530:49:55

it's slow, content isn't automatically encrypted

0:49:550:49:59

on exiting the network

0:49:590:50:00

and some have claimed a bug can de-anonymise users.

0:50:000:50:04

Tor say they have a fix for this problem.

0:50:040:50:06

Whilst the search for a solution to bulk surveillance continues,

0:50:070:50:11

this man has a different approach.

0:50:110:50:13

He is Eugene Kaspersky, CEO of one of the world's

0:50:160:50:19

fastest-growing internet security companies.

0:50:190:50:23

300 million users now rely on its software.

0:50:230:50:26

For Kaspersky, widespread anonymity is not a solution.

0:50:270:50:32

In fact, he thinks we need the absolute opposite,

0:50:320:50:36

a form of online passport.

0:50:360:50:38

My idea is that all the services in the internet, they must be split.

0:50:400:50:45

So, the non-critical - your personal e-mails,

0:50:470:50:51

the news from the internet,

0:50:510:50:53

your chatting with your family,

0:50:530:50:55

what else? - leave them alone, so you don't need any ID.

0:50:550:51:01

It's a place of freedom.

0:51:010:51:02

And there are critical services, like banking services,

0:51:020:51:06

financial services, banks, booking their tickets,

0:51:060:51:10

booking the hotels or what else?

0:51:100:51:12

So please present your ID if you do this.

0:51:120:51:16

So, it's a kind of balance -

0:51:180:51:21

freedom and security,

0:51:210:51:23

anonymity and wearing the badge, wearing your ID.

0:51:230:51:26

In his view, this shouldn't be a problem,

0:51:260:51:29

as privacy has pretty much disappeared online anyway.

0:51:290:51:32

The reality is that we don't have too much privacy in the cyber space.

0:51:340:51:38

If you want to travel, if you want to pay with your credit card,

0:51:380:51:42

if you want to access internet, forget about privacy.

0:51:420:51:45

Yet, the idea that we could soon inhabit a world

0:51:490:51:52

where our lives are ever more transparent is very unpopular.

0:51:520:51:56

Some people say, "Privacy is over, get over it",

0:51:570:52:02

because basically we're trending towards the idea

0:52:020:52:05

of just completely transparent lives.

0:52:050:52:08

I think that's nonsense because information boundaries are important

0:52:080:52:13

so that means that we've got to have systems which respect them

0:52:130:52:16

so we've got to have the technology

0:52:160:52:19

which can produce privacy.

0:52:190:52:20

For those of us who might wish to resist surveillance,

0:52:260:52:29

the hunt for that technology is now on

0:52:290:52:32

and it is to cryptographers that we must look for answers.

0:52:320:52:36

If you want a demonstration of their importance to today's internet,

0:52:380:52:42

you only have to come to this bunker in Virginia.

0:52:420:52:45

Today, an unusual ceremony is taking place.

0:52:450:52:49

COMPUTER: 'Please centre your eyes in the mirror.'

0:52:520:52:55

The internet is being upgraded.

0:52:550:52:57

'Thank you, your identity has been verified.'

0:52:570:52:59

Right, we're in.

0:53:000:53:01

This is the biggest security upgrade to the internet

0:53:010:53:04

in over 20 years.

0:53:040:53:07

A group of tech specialists have been summoned by Steve Crocker,

0:53:070:53:11

one of the godfathers of the internet.

0:53:110:53:15

We discovered that there were some vulnerabilities

0:53:150:53:18

in the basic domain name system structure

0:53:180:53:20

that can lead to having a domain name hijacked

0:53:200:53:25

or having someone directed to a false site and, from there,

0:53:250:53:29

passwords can be detected and accounts can be cleaned out.

0:53:290:53:32

At the heart of this upgrade is complex cryptography.

0:53:340:53:37

Work began on adding cryptographically strong signatures

0:53:390:53:42

to every entry in the domain name system

0:53:420:53:45

in order to make it impossible to spoof or plant false information.

0:53:450:53:51

To make sure these cryptographic codes remain secure,

0:53:550:53:58

they are reset every three months.

0:53:580:54:00

Three trusted experts from around the world have been summoned

0:54:020:54:06

with three keys, keys that open these safety deposit boxes.

0:54:060:54:10

So, now we are opening box 1-2-4-0.

0:54:100:54:13

Inside are smart cards.

0:54:130:54:16

When the three are put together,

0:54:160:54:18

a master code can be validated which resets the system

0:54:180:54:22

for millions of websites.

0:54:220:54:24
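
A sketch of why all three card holders must combine their keys: a 3-of-3 XOR split of a master secret. Real key ceremonies use hardware security modules and proper threshold schemes; this only shows the combining idea.

```python
import os

master = os.urandom(32)
share1, share2 = os.urandom(32), os.urandom(32)
share3 = bytes(m ^ a ^ b for m, a, b in zip(master, share1, share2))

recombined = bytes(a ^ b ^ c for a, b, c in zip(share1, share2, share3))
print(recombined == master)  # True; any two shares alone reveal nothing
```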

We are very lucky because every one of these people

0:54:240:54:26

are respected members of the technical community,

0:54:260:54:29

technical internet community,

0:54:290:54:31

and that's what we were doing.

0:54:310:54:32

Just like the internet itself was formed from the bottom up,

0:54:320:54:35

high techies from around the world.

0:54:350:54:37

So, step 22.

0:54:370:54:39

A complex set of instructions is followed

0:54:400:54:43

to make sure the system is ready to operate.

0:54:430:54:46

And now we need to verify the KSR.

0:54:460:54:49

This is the key step.

0:54:490:54:51

So, this is it. When I hit "yes", it will be signed.

0:54:530:54:57

There you go, thank you very much.

0:54:590:55:01

APPLAUSE

0:55:010:55:03

These scientists want to use cryptography to make

0:55:060:55:09

the architecture of the internet more resilient.

0:55:090:55:13

But, for users, cryptography has another purpose.

0:55:130:55:17

We can use it to encrypt our message content.

0:55:170:55:21

So, if we want to change the way that mass surveillance is done,

0:55:210:55:24

encryption, it turns out,

0:55:240:55:26

is one of the ways that we do that.

0:55:260:55:28

When we encrypt our data,

0:55:280:55:30

we change the value that mass surveillance presents.

0:55:300:55:33

When phone calls are end-to-end encrypted

0:55:330:55:35

such that no-one else can decipher their content,

0:55:350:55:38

thanks to the science of mathematics, of cryptography.

0:55:380:55:41

And those who understand surveillance from the inside

0:55:430:55:46

agree that it's the only way to protect our communications.

0:55:460:55:50

However there's a problem.

0:56:210:56:23

It's still very difficult to encrypt content in a user-friendly way.

0:56:230:56:27

One of the best ways to protect your privacy is to use encryption,

0:56:280:56:32

but encryption is so incredibly hard to use.

0:56:320:56:35

I am a technology person

0:56:350:56:37

and I struggle all the time to get my encryption products to work.

0:56:370:56:41

And that's the next challenge,

0:56:410:56:44

developing an internet where useable encryption

0:56:440:56:47

makes bulk surveillance,

0:56:470:56:49

for those who want to avoid it, more difficult.

0:56:490:56:52

At the moment, the systems we're using are fairly insecure

0:56:530:56:57

in lots of ways. I think the programmers out there

0:56:570:56:59

have got to help build tools to make that easier.

0:56:590:57:02

If encryption is to become more commonplace,

0:57:050:57:07

the pressure will likely come from the market.

0:57:070:57:10

The result of the revelations about the National Security Agency

0:57:120:57:15

is that people are becoming quite paranoid.

0:57:150:57:18

It's important to not be paralysed by that paranoia,

0:57:180:57:21

but rather to express the desire for privacy

0:57:210:57:25

and by expressing the desire for privacy,

0:57:250:57:28

the market will fill the demand.

0:57:280:57:30

By making it harder, you protect yourself,

0:57:300:57:33

you protect your family and you also protect other people

0:57:330:57:36

in just making it more expensive to surveil everyone all the time.

0:57:360:57:39

In the end, encryption is all about mathematics

0:57:390:57:43

and, for those who want more privacy,

0:57:430:57:46

the numbers work in their favour.

0:57:460:57:48

It turns out that it's easier in this universe

0:57:490:57:52

to encrypt information, much easier, than it is to decrypt it

0:57:520:57:56

if you're someone watching from the outside.

0:57:560:57:59

The universe fundamentally favours privacy.

0:57:590:58:02
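
That asymmetry in rough numbers: encrypting once is one cheap operation, while guessing a 128-bit key means trying up to 2**128 of them. The guess rate below is a generous assumption.

```python
keyspace = 2 ** 128
guesses_per_second = 10 ** 12          # a trillion guesses per second
seconds_per_year = 60 * 60 * 24 * 365
print(f"{keyspace / guesses_per_second / seconds_per_year:.2e} years")  # ~1.1e19
```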

We have reached a critical moment

0:58:040:58:06

when the limits of privacy

0:58:060:58:08

could be defined for a generation.

0:58:080:58:11

We are beginning to grasp the shape of this new world.

0:58:110:58:14

It's time to decide whether we are happy to accept it.

0:58:160:58:19

'Thank you. Your identity has been verified.'

0:58:450:58:48
