Inside the Dark Web

Transcript

0:00:02 > 0:00:05It's just 25 years since the World Wide Web was created.

0:00:05 > 0:00:09It now touches all of our lives, our personal information

0:00:09 > 0:00:13and data swirling through the internet on a daily basis.

0:00:13 > 0:00:17Yet it's now caught in the greatest controversy of its life -

0:00:17 > 0:00:18surveillance.

0:00:18 > 0:00:21This is a spy master's dream.

0:00:21 > 0:00:25No spy of the previous generations could have imagined that we

0:00:25 > 0:00:28would all volunteer for the world's best tracking device.

0:00:28 > 0:00:32The revelations of US intelligence contractor Edward Snowden

0:00:32 > 0:00:37have led many to ask if the Web we love has been turned against us...

0:00:37 > 0:00:40They don't just want your search data or your e-mail.

0:00:40 > 0:00:42They want everything.

0:00:42 > 0:00:47And, as you are being surveilled 24/7, you are more under control.

0:00:47 > 0:00:49You are less free.

0:00:49 > 0:00:54..leading to soul-searching amongst those responsible for the Web itself.

0:00:54 > 0:00:57I used to think that in some countries you worry about the

0:00:57 > 0:01:00government and in some countries you worry about the corporations.

0:01:00 > 0:01:03I realise now that that was naive.

0:01:03 > 0:01:06But thanks to a collection of brilliant thinkers

0:01:06 > 0:01:08and researchers, science has been fighting back...

0:01:08 > 0:01:15It's really no surprise that the privacy issue has unfolded the way it has.

0:01:15 > 0:01:20..developing technology to defeat surveillance, protecting activists...

0:01:20 > 0:01:24They tried to intimidate me. I knew that they don't know anything.

0:01:24 > 0:01:28..and in the process coming into conflict with global power.

0:01:28 > 0:01:33They are detaining me at airports, threatening me.

0:01:33 > 0:01:36But now, thanks to a new digital currency...

0:01:36 > 0:01:40If you're buying less than half a Bitcoin, you'll have to go to market price.

0:01:40 > 0:01:43..this technology is sending law makers into a panic,

0:01:43 > 0:01:47due to the growth of a new black market in the Dark Web.

0:01:47 > 0:01:51It was like a buffet dinner for narcotics.

0:01:51 > 0:01:55Our detection rate is dropping. It's risk-free crime.

0:01:55 > 0:01:59This is the story of a battle to shape the technology which now

0:01:59 > 0:02:03defines our world, and whoever wins will influence not only the

0:02:03 > 0:02:09future of the internet but the very idea of what it means to be free.

0:02:09 > 0:02:11The sickness that befalls the internet is something that

0:02:11 > 0:02:13befalls the whole world.

0:02:24 > 0:02:29Where does the outside world stop and private space begin?

0:02:29 > 0:02:33What details of your life are you willing to share with strangers?

0:02:36 > 0:02:38Take this house.

0:02:40 > 0:02:44Every morning, lights come on.

0:02:44 > 0:02:48Coffee brews - automatically.

0:02:50 > 0:02:53Technology just like this is increasingly being

0:02:53 > 0:02:56installed into millions of homes across the world

0:02:56 > 0:03:01and it promises to change the way we live for ever.

0:03:01 > 0:03:04So there are sensors of all kinds,

0:03:04 > 0:03:05there's lights and locks and thermostats

0:03:05 > 0:03:07and once they're connected

0:03:07 > 0:03:10then our platform can make them do whatever you want them to do.

0:03:10 > 0:03:11So as an example,

0:03:11 > 0:03:15if I wake up in the morning, the house knows that I'm waking up.

0:03:15 > 0:03:16It can wake up with me.

0:03:16 > 0:03:20When we walk in the kitchen, it will play the local news

0:03:20 > 0:03:23and sort of greet us into the day, tell us the weather forecast

0:03:23 > 0:03:26so we know how to dress for the day and so on.

0:03:26 > 0:03:30This technology is known as the internet of things,

0:03:30 > 0:03:33where the objects in our houses - kitchen appliances,

0:03:33 > 0:03:37anything electronic - can be connected to the internet.

0:03:37 > 0:03:43But for it to be useful, we're going to have to share intimate details of our private life.

0:03:43 > 0:03:46So this is my things app.

0:03:46 > 0:03:49It can run on your mobile phone or on a tablet or something like that.

0:03:49 > 0:03:51I can do things like look at the comings

0:03:51 > 0:03:52and goings of family members.

0:03:52 > 0:03:54It can automatically detect when we come and go

0:03:54 > 0:03:58based on our mobile phones or you can have it detect your presence

0:03:58 > 0:04:01with a little sensor, you can put it in your car or something like that.

0:04:01 > 0:04:04So when we leave the house, and there's no-one home, that's when it'll lock up

0:04:04 > 0:04:07and shut down all the electricity used and so on.

0:04:07 > 0:04:11For most of us, this is deeply private information.

0:04:11 > 0:04:12Yet once we hand it over,

0:04:12 > 0:04:14we have to trust a company to keep it confidential.

0:04:14 > 0:04:18The consumer really owns 100% of their own data so they're opting in.

0:04:18 > 0:04:21It's not something where that data would ever be shared

0:04:21 > 0:04:23without their giving their permission.

0:04:28 > 0:04:33This house represents a new normal where even the movements

0:04:33 > 0:04:36within our own home are documented and stored.

0:04:36 > 0:04:38It's a new frontier.

0:04:40 > 0:04:45The internet is asking us to redefine what we consider private.

0:04:54 > 0:04:58To understand the enormous changes taking place,

0:04:58 > 0:05:00it's necessary to come here.

0:05:00 > 0:05:03Almost 150 years ago, this hut was on the frontier of the world's

0:05:03 > 0:05:08first information revolution - the telegraph.

0:05:10 > 0:05:13It was a crucial hub for a global network of wires -

0:05:13 > 0:05:16a role that is just as important today.

0:05:18 > 0:05:21Seeing the cables in a room like this shows that the physical

0:05:21 > 0:05:24infrastructure needed to move information

0:05:24 > 0:05:28around the world hasn't changed very much in the past hundred years.

0:05:28 > 0:05:30I think that the internet,

0:05:30 > 0:05:33we tend to think of as a cloud floating somewhere off in cyberspace,

0:05:33 > 0:05:35but they're physical wires, physical cables

0:05:35 > 0:05:40and sometimes wireless signals that are communicating with each other.

0:05:40 > 0:05:44Cornwall, where the telegraph cables come ashore,

0:05:44 > 0:05:48remains crucial for today's internet.

0:05:48 > 0:05:5225% of all traffic passes through here.

0:05:52 > 0:05:54Running from the United States

0:05:54 > 0:05:57and other places to the United Kingdom are a large number of the

0:05:57 > 0:06:03most significant fibre-optic cables that carry huge amounts of data.

0:06:03 > 0:06:06Alongside this information super-highway is a site

0:06:06 > 0:06:10belonging to the UK Government, GCHQ Bude.

0:06:12 > 0:06:15We now know that this listening station has been gathering

0:06:15 > 0:06:20and analysing everything that comes across these wires.

0:06:20 > 0:06:23Any data that passes across the internet could theoretically come

0:06:23 > 0:06:28down these cables, so that's e-mails, websites, the BitTorrent downloads,

0:06:28 > 0:06:33the films that you're accessing through Netflix and online services.

0:06:33 > 0:06:38The sheer amount of data captured here is almost impossible to comprehend.

0:06:38 > 0:06:41In terms of what the GCHQ were looking at, we've got

0:06:41 > 0:06:45from internal documents that, in 2011, they were tapping 200

0:06:45 > 0:06:48ten-gigabit cables coming into Cornwall.

0:06:48 > 0:06:52To give a rough idea of how much data that is, if you were to

0:06:52 > 0:06:55digitise the entire contents of the British Library, then you

0:06:55 > 0:07:00could transfer it down that set of cables in about 40 seconds.
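
To sanity-check that figure, here is a rough back-of-the-envelope calculation. The cable numbers come from the passage above; the roughly 10-terabyte size assumed for the digitised text of the British Library is an illustrative assumption, so treat the result as order-of-magnitude only.

```python
# Rough order-of-magnitude check of the "British Library in ~40 seconds" claim.
# The 200 x 10 Gbit/s figures are from the passage above; the ~10 TB size for
# the digitised British Library is an assumption made for illustration.
cables = 200
gbits_per_cable = 10
capacity_bytes_per_second = cables * gbits_per_cable * 1e9 / 8   # ~2.5e11 B/s (~250 GB/s)
library_bytes = 10e12                                            # assumed ~10 terabytes
print(library_bytes / capacity_bytes_per_second)                 # -> 40.0 seconds
```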

0:07:00 > 0:07:03Tapping the wires is surprisingly simple.

0:07:11 > 0:07:17The data carried by the fibre-optic cable just needs to be diverted.

0:07:17 > 0:07:20A fibre-optic cable signal is a beam of light

0:07:20 > 0:07:25travelling down a cable made from glass.

0:07:25 > 0:07:28Pulses of light represent the pieces of information

0:07:28 > 0:07:31travelling across the internet, which is the e-mails, the web pages,

0:07:31 > 0:07:35everything that's going over the internet.

0:07:35 > 0:07:38Every 50 miles or so, that signal becomes sufficiently

0:07:38 > 0:07:42weak that it needs to be repeated, and this is the weak spot.

0:07:43 > 0:07:48And it's very easy to insert an optical tap at that point.

0:07:48 > 0:07:52And that's just what GCHQ did.

0:07:52 > 0:07:56A device was placed into the beam of data which created a mirror

0:07:56 > 0:07:59image of the millions of e-mails, web searches

0:07:59 > 0:08:02and internet traffic passing through the cables every second.

0:08:05 > 0:08:09What you effectively get is two copies of the signal, one going

0:08:09 > 0:08:13off to the GCHQ and one carrying on to its original destination.

0:08:22 > 0:08:25All of the information going over those cables is able to be

0:08:25 > 0:08:29replayed over the course of three days so you can rewind

0:08:29 > 0:08:32and see what was going over the internet at a particular moment.

0:08:32 > 0:08:35Analysing this amount of data is an impressive achievement

0:08:35 > 0:08:38but it also attracts criticism.

0:08:38 > 0:08:42When you see the capacity and the potential for that technology,

0:08:42 > 0:08:45and the fact that it is being used without transparency

0:08:45 > 0:08:49and without very high levels of accountability, it's incredibly

0:08:49 > 0:08:53concerning because the power of that data to predict and analyse what

0:08:53 > 0:08:55we're going to do is very, very high

0:08:55 > 0:08:57and giving that power to somebody else,

0:08:57 > 0:09:01regardless of the original or stated intentions, is very worrying.

0:09:09 > 0:09:13We only know about the GCHQ project thanks to documents released

0:09:13 > 0:09:18by US whistle-blower Edward Snowden, and the revelation has begun

0:09:18 > 0:09:23to change the way many people think about privacy and the internet -

0:09:23 > 0:09:27among them the inventor of the World Wide Web, Tim Berners-Lee.

0:09:27 > 0:09:30Information is a funny sort of power.

0:09:30 > 0:09:34The way a government can use it to

0:09:34 > 0:09:35keep control of its citizens

0:09:35 > 0:09:39is insidious and sneaky, nasty in some cases.

0:09:39 > 0:09:43We have to all learn more about it

0:09:43 > 0:09:49and we have to in a way rethink, rebase a lot of our philosophy.

0:09:50 > 0:09:54Part of changing that philosophy is to understand that our lives are

0:09:54 > 0:09:58not analysed in the first instance by people,

0:09:58 > 0:10:00but by computer programmes.

0:10:01 > 0:10:05Some people just don't have an understanding about what's possible.

0:10:05 > 0:10:08I've heard people say, "Well, we know nobody's reading our e-mails

0:10:08 > 0:10:10"because they don't have enough people."

0:10:10 > 0:10:13Actually, hello, it's not... These e-mails are not being read by people.

0:10:13 > 0:10:15They're being read by machines.

0:10:17 > 0:10:20They're being read by machines which can do the sorts of things

0:10:20 > 0:10:23that search engines do and can look at all the e-mails

0:10:23 > 0:10:27and at all the social connections and can watch.

0:10:27 > 0:10:31Sometimes these machines learn how to spot trends

0:10:31 > 0:10:36and can build systems which will just watch a huge amount of data

0:10:36 > 0:10:40and start to pick things out and then will suggest to the

0:10:40 > 0:10:44security agency, well, "These people need to be investigated."

0:10:46 > 0:10:50But then the thing could be wrong. Boy, we have to have a protection.

0:10:52 > 0:10:55The Snowden revelations have generated greater interest

0:10:55 > 0:10:58than ever in how the internet is being used for the purposes

0:10:58 > 0:11:01of surveillance.

0:11:01 > 0:11:04But watching isn't just done by governments.

0:11:08 > 0:11:13The most detailed documenting of our lives is done by technology companies.

0:11:13 > 0:11:17Two years ago, tech researcher Julia Angwin decided to investigate

0:11:17 > 0:11:21how much these companies track our behaviour daily.

0:11:21 > 0:11:25Her findings give us one of the best pictures yet of just who is

0:11:25 > 0:11:29watching us online every minute of every day.

0:11:29 > 0:11:32When I talk about the underbelly of the information revolution,

0:11:32 > 0:11:37I'm really talking about the unseen downside

0:11:37 > 0:11:38of the information revolution.

0:11:38 > 0:11:41You know, we obviously have seen all the benefits of having

0:11:41 > 0:11:43all this information at our fingertips.

0:11:43 > 0:11:46But we're just awakening to the fact that we're also being

0:11:46 > 0:11:48surveilled all the time.

0:11:48 > 0:11:52We're being monitored in ways that were never before possible.

0:11:55 > 0:11:57Every time we browse the internet,

0:11:57 > 0:12:02what we do can be collated and sold to advertisers.

0:12:02 > 0:12:05So, basically, online there are hundreds of companies that

0:12:05 > 0:12:09sort of install invisible tracking technology on websites.

0:12:13 > 0:12:16They've installed basically a serial number on your computer

0:12:16 > 0:12:19and they watch you whenever they see you across the Web

0:12:19 > 0:12:21and build a dossier about your reading habits,

0:12:21 > 0:12:24your shopping habits, whatever they can obtain.
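
As a rough illustration of how such a "serial number" works, here is a minimal sketch of a tracker that assigns each browser a random identifier the first time it is seen and files every later page visit under that same identifier. The names and data structures are invented for illustration; real trackers rely on cookies, fingerprinting and server-side profile databases.

```python
# Minimal sketch of third-party tracking: assign each browser a random ID the
# first time it is seen, then file every later page view under that same ID.
# Purely illustrative - real trackers use cookies, fingerprinting and
# server-side profile databases, not an in-memory dict.
import uuid
from collections import defaultdict

cookie_jar: dict[str, str] = {}        # browser -> tracking ID ("serial number")
dossiers = defaultdict(list)           # tracking ID -> pages seen

def record_visit(browser: str, page: str) -> None:
    tracker_id = cookie_jar.setdefault(browser, uuid.uuid4().hex)
    dossiers[tracker_id].append(page)

record_visit("alices-laptop", "shoe-shop.example")
record_visit("alices-laptop", "pregnancy-advice.example")
print(dict(dossiers))                  # one dossier, keyed by the tracking ID
```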

0:12:24 > 0:12:26And then there's a real market for that data.

0:12:26 > 0:12:27They buy and sell it

0:12:27 > 0:12:31and there's an online auction bidding for information about you.

0:12:31 > 0:12:35And so the people who know your browsing habits can also discover

0:12:35 > 0:12:40your deepest secrets, and then sell them to the highest bidder.

0:12:42 > 0:12:44So let's say a woman takes a pregnancy test

0:12:44 > 0:12:46and finds out she's pregnant.

0:12:46 > 0:12:49Then she might go look for something online.

0:12:49 > 0:12:53By making pregnancy-related searches, this woman has become a

0:12:53 > 0:12:58hot property for the people looking for a good target for advertising.

0:12:58 > 0:13:00The process of selling then begins.

0:13:02 > 0:13:07Within seconds, she is identified by the companies watching her.

0:13:07 > 0:13:09Her profile is now sold multiple times.

0:13:11 > 0:13:15She will then find herself bombarded with ads relating to pregnancy.

0:13:17 > 0:13:20She may well find that she's been followed around the Web by ads

0:13:20 > 0:13:23and that's really a result of that auction house.

0:13:23 > 0:13:27Now they will say to you that they know, they know she's pregnant but they don't know her name

0:13:27 > 0:13:29but what's happening is now that,

0:13:29 > 0:13:32more and more, that information is really not that anonymous.

0:13:32 > 0:13:34You know it's sort of like if they know you're pregnant,

0:13:34 > 0:13:37and they know where you live, yes maybe they don't know your name.

0:13:37 > 0:13:40That's just because they haven't bothered to look it up.

0:13:40 > 0:13:43We continually create new information about ourselves and this

0:13:43 > 0:13:48information gives a continual window into our behaviours and habits.

0:13:48 > 0:13:51The next stage of surveillance comes from the offices

0:13:51 > 0:13:55and buildings around us,

0:13:55 > 0:13:59sending out Wi-Fi signals across the city.

0:13:59 > 0:14:03OK, so let's see how many signals we have right here, Wi-Fi.

0:14:03 > 0:14:08So we have one, two - 16, 17, 18, 19,

0:14:08 > 0:14:1120, 21, oh, and a few more just added themselves,

0:14:11 > 0:14:13so we're around 25 right now.

0:14:13 > 0:14:15Right there at this point we have 25 signals that

0:14:15 > 0:14:16are reaching out, basically

0:14:16 > 0:14:18sending a little signal to my phone saying, "I'm here,"

0:14:18 > 0:14:21and my phone is sending a signal back saying, "I'm also here."

0:14:21 > 0:14:23As long as your phone's Wi-Fi connection is on

0:14:23 > 0:14:28and connected to a signal, you can be tracked.

0:14:28 > 0:14:31Google, Apple, other big companies are racing to map the whole

0:14:31 > 0:14:35world using Wi-Fi signals and then, whenever your phone is somewhere,

0:14:35 > 0:14:38they know exactly how far you are from the closest Wi-Fi

0:14:38 > 0:14:41signal and they can map you much more precisely even than GPS.

0:14:41 > 0:14:45And we now know that this data has been seized by governments

0:14:45 > 0:14:50as part of their internet surveillance operations.

0:14:50 > 0:14:53The NSA was like, "Oh, that's an awesome way to track people.

0:14:53 > 0:14:57"Let's scoop up that information too."

0:14:57 > 0:15:02So we have seen that the governments find all this data irresistible.

0:15:05 > 0:15:08But is this simply a matter of principle?

0:15:15 > 0:15:18Is losing privacy ultimately the price

0:15:18 > 0:15:22we pay for peaceful streets and freedom from terror attacks?

0:15:26 > 0:15:28This man thinks we should look deeper.

0:15:31 > 0:15:34Bruce Schneier is a leading internet security expert.

0:15:34 > 0:15:39He was part of the team which first analysed the Snowden documents.

0:15:39 > 0:15:41They don't just want your search data

0:15:41 > 0:15:43or your e-mail. They want everything.

0:15:43 > 0:15:46They want to tie it to your real world behaviours - location

0:15:46 > 0:15:49data from your cellphone - and it's these correlations.

0:15:49 > 0:15:56And as you are being surveilled 24/7, you are more under control.

0:15:56 > 0:16:00Right? You are less free, you are less autonomous.

0:16:00 > 0:16:03Schneier believes the ultimate result from all this

0:16:03 > 0:16:05surveillance may be a loss of freedom.

0:16:05 > 0:16:09What data does is give someone control over you.

0:16:09 > 0:16:12The reason Google and Facebook are collecting this

0:16:12 > 0:16:15is for psychological manipulation.

0:16:15 > 0:16:20That's their stated business purpose, right? Advertising.

0:16:20 > 0:16:22They want to convince you

0:16:22 > 0:16:26to buy things you might not want to buy otherwise.

0:16:26 > 0:16:28And so that data is all about control.

0:16:30 > 0:16:33Governments collect it also for control. Right?

0:16:33 > 0:16:35They want to control their population.

0:16:35 > 0:16:37Maybe they're concerned about dissidents,

0:16:37 > 0:16:40maybe they're concerned about criminals.

0:16:43 > 0:16:46It's hard to imagine that this data can exert such

0:16:46 > 0:16:51a level of potential control over us but looking for the patterns

0:16:51 > 0:16:57in the data can give anyone analysing it huge power.

0:16:59 > 0:17:03What's become increasingly understood is how much is

0:17:03 > 0:17:06revealed by meta-data, by the information about who is

0:17:06 > 0:17:09sending messages and how often they're sending them to each other.

0:17:09 > 0:17:12This reveals information about our social networks, it can

0:17:12 > 0:17:17reveal information about the places that we go on a day-to-day basis.

0:17:17 > 0:17:20All this meta-data stacks up to allow a very detailed

0:17:20 > 0:17:23view into our lives and increasingly what

0:17:23 > 0:17:26we find is that these patterns of communication can even be

0:17:26 > 0:17:31used in a predictive sense to determine factors about our lives.

0:17:31 > 0:17:36And what some people fear is what the spread of this analysis could lead to.

0:17:36 > 0:17:40Ultimately the concern of this is that, in a dystopian scenario,

0:17:40 > 0:17:44you have a situation where every facet of your life is

0:17:44 > 0:17:48something that is open to analysis by whoever has access to the data.

0:17:48 > 0:17:53They can look at the likelihood that you should be given health insurance

0:17:53 > 0:17:57because you come from a family that has a history of heart disease.

0:17:57 > 0:18:00They can look at the likelihood that you're going to get

0:18:00 > 0:18:04Alzheimer's disease because of the amount of active intellectual

0:18:04 > 0:18:06entertainment you take part in on a day-to-day basis.

0:18:06 > 0:18:10And when you start giving this level of minute control over

0:18:10 > 0:18:14people's lives, then you allow far too much power over individuals.

0:18:17 > 0:18:23The picture painted is certainly dark but there is another way.

0:18:27 > 0:18:31It comes from the insights of a scientist whose work once

0:18:31 > 0:18:37seemed like a footnote in the history of the internet - until now.

0:18:42 > 0:18:46The story begins in the late '70s in Berkeley, California.

0:18:53 > 0:18:57Governments and companies had begun to harness the power of computing.

0:19:00 > 0:19:03They seemed to promise a future of efficiency,

0:19:03 > 0:19:06of problems being overcome thanks to technology.

0:19:09 > 0:19:12Yet not everyone was so convinced.

0:19:12 > 0:19:16David Chaum was a computer scientist at Berkeley.

0:19:16 > 0:19:19For him, a world in which we would become increasingly joined

0:19:19 > 0:19:23together by machines in a network held grave dangers.

0:19:26 > 0:19:29As computing advanced, he grew increasingly

0:19:29 > 0:19:32convinced of the threats these networks could pose.

0:19:33 > 0:19:36David Chaum was very far ahead of his time.

0:19:36 > 0:19:40He predicted in the early 1980s concerns that would

0:19:40 > 0:19:43arise on the internet 15 or 20 years later -

0:19:43 > 0:19:45the whole field of traffic analysis

0:19:45 > 0:19:47that allows you to predict the behaviours of individuals,

0:19:47 > 0:19:52not by looking at the contents of their e-mails but by looking at the patterns of communication.

0:19:52 > 0:19:56David Chaum to some extent foresaw that and solved the problem.

0:20:01 > 0:20:06Well, it's sad to me but it is really no surprise

0:20:06 > 0:20:10that the privacy issue has unfolded the way it has.

0:20:10 > 0:20:16I spelled it out in the early publications in the '80s.

0:20:16 > 0:20:19Chaum's papers explained that in a future world where we would

0:20:19 > 0:20:22increasingly use computers, it would be easy

0:20:22 > 0:20:25to conduct mass surveillance.

0:20:25 > 0:20:29Chaum wanted to find a way to stop it.

0:20:29 > 0:20:34I always had a deep feeling that privacy is intimately tied to

0:20:34 > 0:20:38human potential and

0:20:38 > 0:20:43that it's an extraordinarily important aspect of democracy.

0:20:46 > 0:20:48Chaum focused on the new technology of e-mails.

0:20:50 > 0:20:53Anyone watching the network through which these messages

0:20:53 > 0:20:57travelled could find out enormous amounts about that person.

0:20:57 > 0:20:59He wanted to make this more difficult.

0:21:01 > 0:21:05Well, I was driving from Berkeley to Santa Barbara

0:21:05 > 0:21:09along the coastline in my VW Camper van and

0:21:09 > 0:21:11out of nowhere... You know, it was

0:21:11 > 0:21:14beautiful scenery, I was just driving along and it occurred to me

0:21:14 > 0:21:18how to solve this problem I'd been trying to solve for a long time.

0:21:18 > 0:21:21Yeah, it was a kind of a, you know, a eureka-moment type.

0:21:21 > 0:21:23I felt like, "Hey, this is it."

0:21:26 > 0:21:29Chaum's focus was the pattern of communications that

0:21:29 > 0:21:31a computer made on the network.

0:21:31 > 0:21:35If that pattern could be disguised using cryptography, it would

0:21:35 > 0:21:39be harder for anyone watching to identify individuals and carry out

0:21:39 > 0:21:42effective surveillance,

0:21:42 > 0:21:46and Chaum's system had a twist.

0:21:46 > 0:21:50Cryptography has traditionally been used to provide

0:21:50 > 0:21:57secrecy for message content and so I used this message secrecy

0:21:57 > 0:22:03technology of encryption to actually protect the meta-data of who

0:22:03 > 0:22:09talks to who and when, and that was quite a paradigm shift.

0:22:09 > 0:22:12Chaum had realised something about surveillance.

0:22:12 > 0:22:17Who we talk to and when is just as important as what we say.

0:22:19 > 0:22:23The key to avoiding this type of traffic analysis was to

0:22:23 > 0:22:25render the user effectively anonymous.

0:22:30 > 0:22:34But he realised that wasn't enough.

0:22:34 > 0:22:39He wanted to build a secure network and to do this

0:22:39 > 0:22:41he needed more anonymous users.

0:22:44 > 0:22:46One cannot be anonymous alone.

0:22:46 > 0:22:52One can only be anonymous relative to a set of people.

0:22:52 > 0:22:56The more anonymous users you can gather together in a network,

0:22:56 > 0:23:00the harder it becomes for someone watching to keep track of them,

0:23:00 > 0:23:03especially if they're mixed up.

0:23:03 > 0:23:07And so a whole batch of input messages from different

0:23:07 > 0:23:14people are shuffled and then sent to another computer, then shuffled

0:23:14 > 0:23:19again and so forth, and you can't tell as an observer of the network

0:23:19 > 0:23:24which item that went in corresponds to which item coming out.
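
A minimal sketch of that idea is shown below, assuming a single mix node and using symmetric Fernet keys as a stand-in for the public-key encryption a real Chaum mix would use: each sender wraps a message in a layer only the mix can remove; the mix strips that layer, shuffles the whole batch, and forwards it, so an observer cannot match inputs to outputs.

```python
# A minimal sketch of a Chaum-style mix node (illustrative only).
# Real mix networks use public-key cryptography; the shared Fernet key here
# is a stand-in so the example stays short and runnable.
import random
from cryptography.fernet import Fernet

mix_key = Fernet(Fernet.generate_key())   # the mix node's own key

def sender_prepare(message: bytes) -> bytes:
    # Each sender wraps their message in a layer only the mix can remove.
    return mix_key.encrypt(message)

def mix_node(batch: list[bytes]) -> list[bytes]:
    # Strip one layer of encryption from every message in the batch...
    stripped = [mix_key.decrypt(m) for m in batch]
    # ...then shuffle, so an observer cannot match inputs to outputs.
    random.shuffle(stripped)
    return stripped

incoming = [sender_prepare(m) for m in [b"from alice", b"from bob", b"from carol"]]
print(mix_node(incoming))   # same messages, unlinkable ordering
```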

0:23:28 > 0:23:33David Chaum was trying to provide protection against a world in which

0:23:33 > 0:23:37our communications would be analysed and potentially used against us.

0:23:37 > 0:23:43Chaum's response to this was to say, in order to have a free society,

0:23:43 > 0:23:48we need to have freedom from analysis of our behaviours and our communications.

0:23:48 > 0:23:51But Chaum's system didn't take off because communication using

0:23:51 > 0:23:55e-mail was still the preserve of a few academics and technicians.

0:23:58 > 0:24:00Yet his insights weren't forgotten.

0:24:03 > 0:24:07Within a decade, the arrival of the World Wide Web took

0:24:07 > 0:24:10communication increasingly online.

0:24:16 > 0:24:19The US Government understood the importance of protecting

0:24:19 > 0:24:22its own online communications from surveillance.

0:24:22 > 0:24:24It began to put money into research.

0:24:29 > 0:24:32At the US Naval Research Laboratory,

0:24:32 > 0:24:37a team led by scientist Paul Syverson got to work.

0:24:41 > 0:24:46Suppose we wanted to have a system where people could communicate

0:24:46 > 0:24:51back to their home office or with each other over the internet

0:24:51 > 0:24:58but without people being able to associate source and destination.

0:24:58 > 0:25:02Syverson soon came across the work of David Chaum.

0:25:02 > 0:25:07The first work which is associated with this area is

0:25:07 > 0:25:08the work of David Chaum.

0:25:09 > 0:25:13A Chaum mix basically gets its security

0:25:13 > 0:25:16because it takes in a bunch of messages

0:25:16 > 0:25:22and then re-orders them and changes their appearance and spews them out.

0:25:22 > 0:25:26But this was now the age of the World Wide Web.

0:25:26 > 0:25:30The Navy wanted to develop anonymous communications for this new era.

0:25:34 > 0:25:38So Syverson and his colleagues set to work on building a system

0:25:38 > 0:25:41that could be used by operatives across the world.

0:25:45 > 0:25:51You have enough of a network with enough distribution that it's

0:25:51 > 0:25:54going to be very hard for an adversary to be in all

0:25:54 > 0:25:59the places and to see all the traffic wherever it is.

0:25:59 > 0:26:03Syverson's system was called the Tor network.

0:26:03 > 0:26:07Tor stands for "the onion router". It works like this.

0:26:10 > 0:26:13A user wants to visit a website but doesn't want to

0:26:13 > 0:26:18reveal their IP address, the marker that identifies their computer.

0:26:18 > 0:26:21As they send the request, three layers of encryption

0:26:21 > 0:26:25are placed around it like the layers of an onion.

0:26:25 > 0:26:28The message is then sent through a series of computers which

0:26:28 > 0:26:31have volunteered to act as relay points.

0:26:33 > 0:26:36As the message passes from computer to computer,

0:26:36 > 0:26:39a layer of encryption is removed.

0:26:39 > 0:26:43Each time it is removed, all the relay computer can see

0:26:43 > 0:26:46is an order which tells it to pass the message on.

0:26:46 > 0:26:49The final computer relay decrypts the innermost

0:26:49 > 0:26:53layer of encryption, revealing the content of the communication.

0:26:53 > 0:26:57However - importantly - the identity of the user is hidden.
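
Here is a minimal sketch of that layering, assuming three relays and, again, symmetric Fernet keys standing in for the circuit negotiation and public-key cryptography real Tor uses: the client wraps the request three times, each relay peels exactly one layer, and only the exit relay ever sees the request itself.

```python
# A minimal sketch of the onion-routing idea described above (illustrative
# only; real Tor builds circuits with key exchange and public-key crypto).
from cryptography.fernet import Fernet

# One key per relay in the circuit: entry, middle, exit.
relay_keys = [Fernet(Fernet.generate_key()) for _ in range(3)]

def build_onion(request: bytes) -> bytes:
    # Wrap the request in three layers; the innermost layer is for the exit relay.
    onion = request
    for key in reversed(relay_keys):
        onion = key.encrypt(onion)
    return onion

def pass_through_circuit(onion: bytes) -> bytes:
    # Each relay peels exactly one layer and forwards what remains;
    # only the exit relay ever sees the plaintext request.
    for key in relay_keys:
        onion = key.decrypt(onion)
    return onion

packet = build_onion(b"GET http://example.com/")
print(pass_through_circuit(packet))   # b'GET http://example.com/'
```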

0:27:00 > 0:27:06Somebody who wants to look at things around the Web and not necessarily

0:27:06 > 0:27:10have people know what he's interested in. It might just be the

0:27:10 > 0:27:13local internet service provider, he doesn't want them to know

0:27:13 > 0:27:17which things he's looking at, but it might also be the destination.

0:27:20 > 0:27:21Syverson's system worked.

0:27:24 > 0:27:28It was now possible to surf the net without being watched.

0:27:30 > 0:27:34As David Chaum had observed, the more anonymous people,

0:27:34 > 0:27:36the better the security.

0:27:36 > 0:27:40The Navy had what they believed was a smart way of achieving that...

0:27:45 > 0:27:47..open the network out to everyone.

0:27:50 > 0:27:55It's not enough for a government system to carry traffic just for the government.

0:27:55 > 0:27:59It also has to carry traffic for other people.

0:27:59 > 0:28:02Part of anonymity is having a large number of people who are also

0:28:02 > 0:28:06anonymous because you can't be anonymous on your own.

0:28:06 > 0:28:11What Syverson and his team had done, building on the work of David Chaum,

0:28:11 > 0:28:15would begin to revolutionise the way that people could operate online.

0:28:19 > 0:28:21Over the coming years,

0:28:21 > 0:28:25the Tor network expanded as more people volunteered to become

0:28:25 > 0:28:30relay computers, the points through which the messages could be relayed.

0:28:30 > 0:28:33What Tor did was it made a useable system for people.

0:28:33 > 0:28:35People wanted to protect what they were

0:28:35 > 0:28:39looking at on the internet from being watched by their ISP or

0:28:39 > 0:28:42their government or the company that they're working for at the time.

0:28:42 > 0:28:46But Tor's success wasn't just down to the Navy.

0:28:50 > 0:28:54In the mid-2000s, they handed the network over to a non-profit

0:28:54 > 0:28:57organisation who overhauled the system.

0:28:59 > 0:29:03Now the network would be represented by people like this -

0:29:03 > 0:29:05Jake Appelbaum...

0:29:07 > 0:29:10..researchers dedicated to the opportunities

0:29:10 > 0:29:12they felt Tor could give for free speech.

0:29:16 > 0:29:18We work with the research community

0:29:18 > 0:29:21all around the world, the academic community, the hacker community.

0:29:21 > 0:29:23It's a free software project

0:29:23 > 0:29:25so that means that all the source code is available.

0:29:25 > 0:29:27That means that anyone can look at it and see how it works,

0:29:27 > 0:29:31and that means everybody that does and shares it with us helps improve

0:29:31 > 0:29:35the programme for everyone else on the planet and the network as a whole.

0:29:36 > 0:29:43Appelbaum now travels the world promoting the use of the software.

0:29:43 > 0:29:46The Tor network gives each person the ability to read without

0:29:46 > 0:29:50creating a data trail that will later be used against them.

0:29:50 > 0:29:54It gives every person a voice.

0:29:54 > 0:29:56Every person has the right to read and to speak freely,

0:29:56 > 0:29:58not one human excluded.

0:29:58 > 0:30:02And one place Tor has become important is the Middle East.

0:30:05 > 0:30:06During the Arab Spring,

0:30:06 > 0:30:10as disturbances spread across the region, it became a vital

0:30:10 > 0:30:11tool for dissidents...

0:30:13 > 0:30:16..especially in places like Syria.

0:30:18 > 0:30:22One of those who used it from the beginning was opposition activist Reem al Assil.

0:30:26 > 0:30:29I found out first about Tor back in 2011.

0:30:32 > 0:30:37Surveillance in Syria is a very big problem for activists

0:30:37 > 0:30:39or for anyone even,

0:30:39 > 0:30:44because the Syrian regime are trying all the time to get into people's

0:30:44 > 0:30:50e-mails and Facebook to see what they are up to, what they are doing.

0:30:50 > 0:30:52By the time the Syrian uprising happened,

0:30:52 > 0:30:56the Tor project had developed a browser which made

0:30:56 > 0:30:59downloading the software very simple.

0:30:59 > 0:31:02You basically go and download Tor in your computer

0:31:02 > 0:31:07and once it's installed, whenever you want to browse the Web,

0:31:07 > 0:31:13you go and click on it just like Internet Explorer or Google Chrome.

0:31:13 > 0:31:17Reem had personal experience of the protection offered by Tor

0:31:17 > 0:31:19when she was arrested by the secret police.

0:31:22 > 0:31:26I denied having any relation with any opposition work, you know,

0:31:26 > 0:31:29or anything and they tried to intimidate me

0:31:29 > 0:31:34and they said, "Well, see, we have... we know everything about you so now,

0:31:34 > 0:31:39"just we need you to tell us," but I knew that they don't know anything.

0:31:39 > 0:31:43By using Tor, I was an anonymous user so they couldn't tell

0:31:43 > 0:31:48that Reem is doing so-and-so, is watching so-and-so.

0:31:48 > 0:31:52So that's why Tor protected me in this way.

0:31:54 > 0:31:57Syria was not the only place where Tor was vital.

0:32:01 > 0:32:06It's used in China and in Iran.

0:32:06 > 0:32:09In any country where internet access is restricted,

0:32:09 > 0:32:16Tor can be used by citizens to avoid the gaze of the authorities.

0:32:16 > 0:32:19China, for example, regularly attacks and blocks the Tor network

0:32:19 > 0:32:22and they don't attack us directly

0:32:22 > 0:32:25so much as they actually attack people in China using Tor.

0:32:25 > 0:32:27They stop them from using the Tor network.

0:32:29 > 0:32:34But Tor wasn't just helping inside repressive regimes.

0:32:34 > 0:32:39It was now being used for whistle-blowing in the West,

0:32:39 > 0:32:42through WikiLeaks,

0:32:42 > 0:32:44founded by Julian Assange.

0:32:49 > 0:32:52Assange has spent the last two years under

0:32:52 > 0:32:55the protection of the Ecuadorian Embassy in London.

0:32:55 > 0:32:59He is fighting extradition to Sweden over sexual assault allegations,

0:32:59 > 0:33:01allegations he denies.

0:33:02 > 0:33:05I'd been involved in cryptography and anonymous communications

0:33:05 > 0:33:09for almost 20 years, since the early 1990s.

0:33:09 > 0:33:12Cryptographic anonymity didn't come from nowhere.

0:33:12 > 0:33:14It was a long-standing quest which had a Holy Grail,

0:33:14 > 0:33:17which is to be able to communicate

0:33:17 > 0:33:20individual-to-individual freely and anonymously.

0:33:20 > 0:33:23Tor was the first protocol,

0:33:23 > 0:33:25first anonymous protocol,

0:33:25 > 0:33:27that got the balance right.

0:33:29 > 0:33:32From its early years, people who wanted to submit documents

0:33:32 > 0:33:35anonymously to WikiLeaks could use Tor.

0:33:37 > 0:33:40Tor was and is one of the mechanisms

0:33:40 > 0:33:42which we have received important documents, yes.

0:33:45 > 0:33:48One man who provided a link between WikiLeaks

0:33:48 > 0:33:51and the Tor project was Jake Appelbaum.

0:33:53 > 0:33:56Sources that want to leak documents

0:33:56 > 0:33:58need to be able to communicate with WikiLeaks

0:33:58 > 0:34:00and it has always been the case

0:34:00 > 0:34:05that they have offered a Tor-hidden service and that Tor-hidden service

0:34:05 > 0:34:09allows people to reach the WikiLeaks submission engine.

0:34:10 > 0:34:13In 2010, what could be achieved

0:34:13 > 0:34:17when web activism met anonymity was revealed to the world.

0:34:22 > 0:34:24WikiLeaks received a huge leak

0:34:24 > 0:34:27of confidential US government material,

0:34:27 > 0:34:30mainly relating to the wars in Afghanistan and Iraq.

0:34:36 > 0:34:38The first release was this footage

0:34:38 > 0:34:41which showed a US Apache helicopter attack in Iraq.

0:34:49 > 0:34:53Among those dead were two journalists from Reuters,

0:34:53 > 0:34:55Saeed Chmagh and Namir Noor-Eldeen.

0:34:56 > 0:34:58The Americans investigated,

0:34:58 > 0:35:01but say there was no wrong-doing

0:35:01 > 0:35:05yet this footage would never have become public if Chelsea Manning,

0:35:05 > 0:35:08a US Army intelligence analyst in Iraq, hadn't leaked it.

0:35:10 > 0:35:13Chelsea Manning has said that to the court,

0:35:13 > 0:35:16that he used Tor amongst a number of other things

0:35:16 > 0:35:19to submit documents to WikiLeaks.

0:35:19 > 0:35:20Obviously, we can't comment on that,

0:35:20 > 0:35:23because we have an obligation to protect our sources.

0:35:26 > 0:35:29The release of the documents provided by Manning,

0:35:29 > 0:35:32which culminated in 250,000 cables,

0:35:32 > 0:35:35seemed to reveal the power of anonymity through encryption.

0:35:38 > 0:35:39Because with encryption,

0:35:39 > 0:35:43two people can come together to communicate privately.

0:35:43 > 0:35:47The full might of a superpower cannot break that encryption

0:35:47 > 0:35:50if it is properly implemented and that's an extraordinary thing,

0:35:50 > 0:35:54where individuals are given a certain type of freedom of action

0:35:54 > 0:35:57that is equivalent to the freedom of action that a superpower has.

0:35:59 > 0:36:02But it wasn't the technology that let Manning down.

0:36:04 > 0:36:08He confessed what he had done to a contact and was arrested.

0:36:11 > 0:36:15Those who had used anonymity to leak secrets were now under fire.

0:36:19 > 0:36:22WikiLeaks had already been criticised for the release

0:36:22 > 0:36:25of un-redacted documents revealing the names of Afghans

0:36:25 > 0:36:27who had assisted the US.

0:36:28 > 0:36:31The US government was also on the attack.

0:36:31 > 0:36:35The United States strongly condemns the illegal

0:36:35 > 0:36:38disclosure of classified information.

0:36:38 > 0:36:43It puts people's lives in danger, threatens our national security...

0:36:45 > 0:36:49Meanwhile, the Tor project, started by the US government,

0:36:49 > 0:36:51was becoming a target.

0:36:53 > 0:36:56It's very funny, right, because on the one hand,

0:36:56 > 0:36:59these people are funding Tor because they say they believe in anonymity.

0:36:59 > 0:37:02And on the other hand, they're detaining me at airports,

0:37:02 > 0:37:04threatening me and doing things like that.

0:37:04 > 0:37:07And they've even said to me, "We love what you do in Iran

0:37:07 > 0:37:09"and in China, in helping Tibetan people.

0:37:09 > 0:37:11"We love all the stuff that you're doing,

0:37:11 > 0:37:13"but why do you have to do it here?"

0:37:15 > 0:37:16Thanks to Edward Snowden,

0:37:16 > 0:37:19we now know that this culminated in the Tor network

0:37:19 > 0:37:23being the focus of failed attacks by America's National Security Agency.

0:37:25 > 0:37:27They revealed their frustration

0:37:27 > 0:37:31in a confidential PowerPoint presentation called Tor Stinks,

0:37:31 > 0:37:36which set out the ways in which the NSA had tried to crack the network.

0:37:36 > 0:37:40They think Tor stinks because they want to attack people

0:37:40 > 0:37:43and sometimes technology makes that harder.

0:37:43 > 0:37:47It is because the users have something which bothers them,

0:37:47 > 0:37:49which is real autonomy.

0:37:49 > 0:37:52It gives them true privacy and security.

0:37:52 > 0:37:55Tor, invented and funded by the US government,

0:37:55 > 0:37:58was now used by activists, journalists,

0:37:58 > 0:38:02anybody who wanted to communicate anonymously,

0:38:02 > 0:38:04and it wasn't long before its potential

0:38:04 > 0:38:07began to attract a darker type of user.

0:38:12 > 0:38:13It began here in Washington.

0:38:14 > 0:38:17Jon Iadonisi is a former Navy SEAL

0:38:17 > 0:38:20turned advisor on cyber operations to government.

0:38:22 > 0:38:26There were some tips that came in out of Baltimore,

0:38:26 > 0:38:28two federal agents saying,

0:38:28 > 0:38:30"You are a police officer,

0:38:30 > 0:38:32"you really should take a look at this website

0:38:32 > 0:38:35"and, oh, by the way, the only way you get to it

0:38:35 > 0:38:37"is if you anonymise yourself

0:38:37 > 0:38:39"through something called the Tor router."

0:38:39 > 0:38:44What they found was a website called Silk Road.

0:38:44 > 0:38:48And they were amazed when they were able to download this plug-in

0:38:48 > 0:38:51on their browser, go into the Silk Road

0:38:51 > 0:38:55and then from there, see, literally they can make a purchase,

0:38:55 > 0:38:58it was like a buffet dinner for narcotics.

0:39:01 > 0:39:04Silk Road was a global drugs marketplace

0:39:04 > 0:39:07which brought together anonymous buyers and sellers

0:39:07 > 0:39:08from around the world.

0:39:10 > 0:39:13And what they found was people aren't just buying

0:39:13 > 0:39:15one or two instances of designer drugs

0:39:15 > 0:39:18but they're buying massive quantities wholesale.

0:39:18 > 0:39:23In London, the tech community was watching closely.

0:39:23 > 0:39:26Thomas Olofsson was one of many interested

0:39:26 > 0:39:28in how much money the site was making.

0:39:28 > 0:39:32Well, in Silk Road, they have something called an "escrow" system

0:39:32 > 0:39:36so if you want to buy drugs, you pay money into Silk Road

0:39:36 > 0:39:37as a facilitator

0:39:37 > 0:39:40and they keep the money in escrow until you sign off

0:39:40 > 0:39:43that you have had your drugs delivered.

0:39:43 > 0:39:48They will then release your money to the drug dealer.

0:39:48 > 0:39:51We're talking about several millions a day in trade.

0:39:55 > 0:39:57The extraordinary success of Silk Road

0:39:57 > 0:40:01attracted new customers to new illegal sites.

0:40:02 > 0:40:06This part of the internet even had a new name - the Dark Web.

0:40:07 > 0:40:10A dark website is impossible to shut down

0:40:10 > 0:40:13because you don't know where a dark website is hosted

0:40:13 > 0:40:17or even where it's physically located or who's behind it.

0:40:21 > 0:40:24And there was one other thing which made Silk Road

0:40:24 > 0:40:26and its imitators difficult to stop.

0:40:26 > 0:40:33You paid with a new currency that only exists online, called Bitcoin.

0:40:33 > 0:40:36Before, even if you had anonymity as a user,

0:40:36 > 0:40:39you could still track the transactions,

0:40:39 > 0:40:42the money flowing between persons,

0:40:42 > 0:40:46because if you use your Visa card, your ATM, bank transfer,

0:40:46 > 0:40:49Western Union, there is always... Money leaves a mark.

0:40:49 > 0:40:52This is the first time that you can anonymously

0:40:52 > 0:40:54move money between two persons.

0:41:02 > 0:41:06Bitcoin is no longer an underground phenomenon.

0:41:06 > 0:41:09Buy Bitcoin, 445.

0:41:09 > 0:41:13Sellers will sell at 460.

0:41:13 > 0:41:15If you're buying less than half a Bitcoin,

0:41:15 > 0:41:17you'll have to go to market price.

0:41:17 > 0:41:20This Bitcoin event is taking place on Wall Street.

0:41:23 > 0:41:2545 bid, 463 asked.

0:41:28 > 0:41:30But how does the currency work?

0:41:31 > 0:41:35One of the people best placed to explain is Peter Todd.

0:41:35 > 0:41:38He's chief scientist for a number of Bitcoin companies

0:41:38 > 0:41:42including one of the hottest, Dark Wallet.

0:41:44 > 0:41:47So, what Bitcoin is, is it's virtual money.

0:41:47 > 0:41:48I can give it to you electronically,

0:41:48 > 0:41:51you can give it to someone else electronically.

0:41:51 > 0:41:53The key thing to understand about Bitcoin

0:41:53 > 0:41:57is that these two people trading here are making a deal

0:41:57 > 0:41:59without any bank involvement.

0:41:59 > 0:42:02It's a form of electronic cash

0:42:02 > 0:42:05and that has massive implications.

0:42:05 > 0:42:07So, of course, in normal electronic banking systems,

0:42:07 > 0:42:10what I would say is, "Please transfer money from my account

0:42:10 > 0:42:13"to someone else's account," but fundamentally,

0:42:13 > 0:42:17who owns what money is recorded by the bank, by the intermediary.

0:42:19 > 0:42:23What's really interesting about Bitcoin is this virtual money

0:42:23 > 0:42:26is not controlled by a bank, it's not controlled by government.

0:42:26 > 0:42:28It's controlled by an algorithm.

0:42:28 > 0:42:31The algorithm in question is a triumph of mathematics.

0:42:32 > 0:42:35This is a Bitcoin transaction in action.

0:42:38 > 0:42:41To pay someone in Bitcoin, the transaction must be signed

0:42:41 > 0:42:46using a cryptographic key which proves the parties agree to it.

0:42:46 > 0:42:50An unchangeable electronic record is then made of this transaction.

0:42:51 > 0:42:54This electronic record contains every transaction

0:42:54 > 0:42:57ever made on Bitcoin.

0:42:57 > 0:42:59It's called the block chain

0:42:59 > 0:43:04and it's stored in a distributed form by every user,

0:43:04 > 0:43:07not by a bank or other authority.

0:43:07 > 0:43:10The block chain is really the revolutionary part of Bitcoin.

0:43:10 > 0:43:13What's really unique about it is it's all public

0:43:13 > 0:43:16so you can run the Bitcoin algorithm on your computer

0:43:16 > 0:43:19and your computer's inspecting every single transaction

0:43:19 > 0:43:21to be sure that it actually followed the rules,

0:43:21 > 0:43:23and the rules are really what Bitcoin is.

0:43:23 > 0:43:26You know, that's the rules of the system, that's the algorithm.

0:43:26 > 0:43:29It says things like, "You can only send money to one person at once,"

0:43:29 > 0:43:32and we all agree to those rules.
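
A toy sketch of those two ideas - a transaction signed with the sender's private key, and a public ledger in which each block commits to the previous one by hash - is shown below. It is illustrative only: real Bitcoin uses ECDSA over secp256k1, proof-of-work mining and a far richer rule set, and the field names here are invented.

```python
# Toy sketch: a signed transaction plus a hash-chained public ledger that
# anyone can re-check. Illustrative only - not how real Bitcoin is built.
import hashlib, json
from cryptography.hazmat.primitives.asymmetric import ed25519

alice_key = ed25519.Ed25519PrivateKey.generate()

def make_transaction(sender_key, recipient: str, amount: float) -> dict:
    tx = {"to": recipient, "amount": amount}
    payload = json.dumps(tx, sort_keys=True).encode()
    tx["signature"] = sender_key.sign(payload).hex()   # proves the sender agreed
    return tx

def add_block(chain: list, tx: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"tx": tx, "prev": prev_hash}, sort_keys=True)
    # Each block commits to the previous one, so history cannot be rewritten
    # without every later hash changing - and every user can verify that.
    chain.append({"tx": tx, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

ledger: list = []
add_block(ledger, make_transaction(alice_key, "bob", 0.5))
print(ledger)
```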

0:43:32 > 0:43:34And Bitcoin has one other characteristic

0:43:34 > 0:43:37which it shares with cash.

0:43:37 > 0:43:40It can be very hard to trace.

0:43:40 > 0:43:43Well, what's controversial about Bitcoin

0:43:43 > 0:43:46is that it goes back to something quite like cash.

0:43:46 > 0:43:49It's not like a bank account where a government investigator

0:43:49 > 0:43:51can just call up the bank and get all the records

0:43:51 > 0:43:55of who I've ever transacted with without any effort at all.

0:43:55 > 0:43:58You know, if they want to go and find out where I got my Bitcoins,

0:43:58 > 0:43:59they're going to have to ask me.

0:43:59 > 0:44:01They're going to have to investigate.

0:44:05 > 0:44:08The emergence of Bitcoin and the growth of the Dark Web

0:44:08 > 0:44:12was now leading law enforcement in Washington to take a close interest.

0:44:14 > 0:44:18Transactions in the Dark Web were unbelievably more enabled

0:44:18 > 0:44:21and in many cases could exist

0:44:21 > 0:44:24because of this virtual currency known as Bitcoin.

0:44:25 > 0:44:28So, as people started to take a look at Bitcoin

0:44:28 > 0:44:31and understand, "How do we regulate this, how do we monitor this?

0:44:31 > 0:44:34"Oh, my God! It's completely anonymous, we have no record,"

0:44:34 > 0:44:38they started seeing transactions in the sort of the digital exhaust

0:44:38 > 0:44:40that led them into Silk Road.

0:44:40 > 0:44:44And so, now you had criminal grounds to start taking a look at this

0:44:44 > 0:44:50coupled with the movement from the financial side and regulatory side

0:44:50 > 0:44:52on the virtual currency.

0:44:52 > 0:44:55So these two fronts began converging.

0:44:55 > 0:44:57The FBI began to mount a complex plot

0:44:57 > 0:45:01against the alleged lead administrator of Silk Road

0:45:01 > 0:45:04who used the pseudonym "Dread Pirate Roberts".

0:45:06 > 0:45:10They really started focusing on Dread Pirate Roberts

0:45:10 > 0:45:14and his role as not just a leader but sort of the mastermind

0:45:14 > 0:45:17in the whole ecosystem of the platform, right,

0:45:17 > 0:45:22from management administratively to financial merchandising

0:45:22 > 0:45:25to vendor placement, recruitment, etc.

0:45:26 > 0:45:30Dread Pirate Roberts was believed to be Ross Ulbricht,

0:45:30 > 0:45:32listed on his LinkedIn entry

0:45:32 > 0:45:37as an investment advisor and entrepreneur from Austin, Texas.

0:45:37 > 0:45:42Taking him down was really almost a story out of a Hollywood movie.

0:45:42 > 0:45:45I mean, we had people that staged the death

0:45:45 > 0:45:48of one of the potential informants.

0:45:48 > 0:45:52We had undercover police officers acting as cocaine...

0:45:52 > 0:45:54under-kingpins inside Silk Road.

0:45:54 > 0:45:58So, at the conclusion of all these different elements,

0:45:58 > 0:46:03they actually finally ended up bringing in Dread Pirate Roberts

0:46:03 > 0:46:07and now are trying to move forward with his trial.

0:46:07 > 0:46:10Whether or not he is Dread Pirate Roberts,

0:46:10 > 0:46:14Ulbricht's arrest in 2013 brought down Silk Road.

0:46:14 > 0:46:18In the process, the FBI seized $28.5 million

0:46:18 > 0:46:23from the site's escrow account, money destined for drug dealers.

0:46:26 > 0:46:28Yet the problem for the US authorities

0:46:28 > 0:46:31was that their success was only temporary.

0:46:35 > 0:46:37The site is now up and running again.

0:46:38 > 0:46:42It re-emerged just two months later by some other guys

0:46:42 > 0:46:44that took the same code base, the same,

0:46:44 > 0:46:46actually the same site more or less,

0:46:46 > 0:46:48just other people running the site,

0:46:48 > 0:46:53because it's obviously a very, very profitable site to run.

0:46:53 > 0:46:57And Silk Road has now been joined by a host of other sites.

0:47:00 > 0:47:05One of the boom industries on the Dark Web is financial crime.

0:47:06 > 0:47:08Yeah, this website is quite focused

0:47:08 > 0:47:11on credit card numbers and stolen data.

0:47:11 > 0:47:14On here, for instance, are credit card numbers

0:47:14 > 0:47:17from around 5-6 per credit card number,

0:47:17 > 0:47:20paying the equivalent in Bitcoins, totally anonymously.

0:47:21 > 0:47:25The cards are sold in something called a "dump".

0:47:26 > 0:47:31I mean, it's quite a large number of entries, it's tens of thousands.

0:47:31 > 0:47:34You get everything you need to be able to buy stuff online

0:47:34 > 0:47:39with these credit cards - first name, last name, address,

0:47:39 > 0:47:42the card number, everything, the CVV number,

0:47:42 > 0:47:44everything but the PIN code, basically.

0:47:44 > 0:47:48The Dark Web is now used for various criminal activities -

0:47:48 > 0:47:51drugs and guns, financial crime,

0:47:51 > 0:47:54and even child sexual exploitation.

0:48:02 > 0:48:05So, is anonymity a genuine threat to society?

0:48:08 > 0:48:10A nightmare that should haunt us?

0:48:15 > 0:48:20This man who leads Europe's fight against cybercrime believes so.

0:48:20 > 0:48:24The Tor network plays a role because it hides criminals.

0:48:24 > 0:48:26I know it was not the intention,

0:48:26 > 0:48:28but that's the outcome and this is my job,

0:48:28 > 0:48:32to tell society what the trade-offs are here.

0:48:32 > 0:48:35By having no possibilities to penetrate this,

0:48:35 > 0:48:37we will then secure criminals

0:48:37 > 0:48:40that they can continue their crimes on a global network.

0:48:42 > 0:48:45And despite the success over Silk Road and others,

0:48:45 > 0:48:47he is worried for the future.

0:48:50 > 0:48:53Our detection rate is dropping. It's very simple.

0:48:53 > 0:48:57The business model of the criminal is to make profit with low risk

0:48:57 > 0:49:00and, here, you actually eliminate the risk

0:49:00 > 0:49:02because there is no risk to get identified

0:49:02 > 0:49:05so either you have to screw up or be very unlucky

0:49:05 > 0:49:07and somebody rats you out.

0:49:07 > 0:49:10Otherwise, you're secure. So, if you run a tight operation,

0:49:10 > 0:49:13it's very, very difficult for the police to penetrate

0:49:13 > 0:49:15so it's risk-free crime.

0:49:15 > 0:49:18So, does the anonymity offered by the Tor network

0:49:18 > 0:49:22encourage crime or simply displace it?

0:49:22 > 0:49:26Those who work for the project are well aware of the charges it faces.

0:49:26 > 0:49:30There is often asserted certain narratives about anonymity

0:49:30 > 0:49:35and, of course, one of the narratives is that anonymity creates crime

0:49:35 > 0:49:38so you hear about things like the Silk Road and you hear,

0:49:38 > 0:49:41"Oh, it's terrible, someone can do something illegal on the internet."

0:49:41 > 0:49:43Well, welcome to the internet.

0:49:43 > 0:49:45It is a reflection of human society

0:49:45 > 0:49:48where there is sometimes illegal behaviour.

0:49:48 > 0:49:50These arguments aside,

0:49:50 > 0:49:53for users wanting to avoid surveillance,

0:49:53 > 0:49:55Tor has limitations -

0:49:55 > 0:49:59it's slow, content isn't automatically encrypted

0:49:59 > 0:50:00on exiting the network

0:50:00 > 0:50:04and some have claimed a bug can de-anonymise users.

0:50:04 > 0:50:06Tor say they have a fix for this problem.

0:50:07 > 0:50:11Whilst the search for a solution to bulk surveillance continues,

0:50:11 > 0:50:13this man has a different approach.

0:50:16 > 0:50:19He is Eugene Kaspersky, CEO of one of the world's

0:50:19 > 0:50:23fastest-growing internet security companies.

0:50:23 > 0:50:26300 million users now rely on its software.

0:50:27 > 0:50:32For Kaspersky, widespread anonymity is not a solution.

0:50:32 > 0:50:36In fact, he thinks we need the absolute opposite,

0:50:36 > 0:50:38a form of online passport.

0:50:40 > 0:50:45My idea is that all the services in the internet, they must be split.

0:50:47 > 0:50:51So, the non-critical - your personal e-mails,

0:50:51 > 0:50:53the news from the internet,

0:50:53 > 0:50:55your chatting with your family,

0:50:55 > 0:51:01what else? - leave them alone, so you don't need any ID.

0:51:01 > 0:51:02It's a place of freedom.

0:51:02 > 0:51:06And there are critical services, like banking services,

0:51:06 > 0:51:10financial services, banks, booking their tickets,

0:51:10 > 0:51:12booking the hotels or what else?

0:51:12 > 0:51:16So please present your ID if you do this.

0:51:18 > 0:51:21So, it's a kind of balance -

0:51:21 > 0:51:23freedom and security,

0:51:23 > 0:51:26anonymity and wearing the badge, wearing your ID.

0:51:26 > 0:51:29In his view, this shouldn't be a problem,

0:51:29 > 0:51:32as privacy has pretty much disappeared online anyway.

0:51:34 > 0:51:38The reality is that we don't have too much privacy in the cyber space.

0:51:38 > 0:51:42If you want to travel, if you want to pay with your credit card,

0:51:42 > 0:51:45if you want to access internet, forget about privacy.

0:51:49 > 0:51:52Yet, the idea that we could soon inhabit a world

0:51:52 > 0:51:56where our lives are ever more transparent is very unpopular.

0:51:57 > 0:52:02Some people say, "Privacy is over, get over it",

0:52:02 > 0:52:05because basically we're trending towards the idea

0:52:05 > 0:52:08of just completely transparent lives.

0:52:08 > 0:52:13I think that's nonsense because information boundaries are important

0:52:13 > 0:52:16so that means that we've got to have systems which respect them

0:52:16 > 0:52:19so we've got to have the technology

0:52:19 > 0:52:20which can produce privacy.

0:52:26 > 0:52:29For those of us who might wish to resist surveillance,

0:52:29 > 0:52:32the hunt for that technology is now on

0:52:32 > 0:52:36and it is to cryptographers that we must look for answers.

0:52:38 > 0:52:42If you want a demonstration of their importance to today's internet,

0:52:42 > 0:52:45you only have to come to this bunker in Virginia.

0:52:45 > 0:52:49Today, an unusual ceremony is taking place.

0:52:52 > 0:52:55COMPUTER: 'Please centre your eyes in the mirror.'

0:52:55 > 0:52:57The internet is being upgraded.

0:52:57 > 0:52:59'Thank you, your identity has been verified.'

0:53:00 > 0:53:01Right, we're in.

0:53:01 > 0:53:04This is the biggest security upgrade to the internet

0:53:04 > 0:53:07in over 20 years.

0:53:07 > 0:53:11A group of tech specialists have been summoned by Steve Crocker,

0:53:11 > 0:53:15one of the godfathers of the internet.

0:53:15 > 0:53:18We discovered that there were some vulnerabilities

0:53:18 > 0:53:20in the basic domain name system structure

0:53:20 > 0:53:25that can lead to having a domain name hijacked

0:53:25 > 0:53:29or having someone directed to a false site and, from there,

0:53:29 > 0:53:32passwords can be detected and accounts can be cleaned out.

0:53:34 > 0:53:37At the heart of this upgrade is complex cryptography.

0:53:39 > 0:53:42Work began on adding cryptographically strong signatures

0:53:42 > 0:53:45to every entry in the domain name system

0:53:45 > 0:53:51in order to make it impossible to spoof or plant false information.
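
The idea behind signing a directory entry can be sketched very simply, as below. This is illustrative only: real DNSSEC uses DNSKEY and RRSIG records, its own algorithms and a chain of trust rooted in the key ceremony described here, not this simplified scheme.

```python
# Minimal sketch of signing a directory-style record so a spoofed answer can
# be detected (illustrative only; not the actual DNSSEC record format).
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

zone_key = ed25519.Ed25519PrivateKey.generate()      # held by the zone operator
verify_key = zone_key.public_key()                   # published for everyone

record = b"example.com. A 93.184.216.34"
signature = zone_key.sign(record)                    # signed once by the zone

def resolver_checks(record: bytes, signature: bytes) -> bool:
    # A resolver that knows the zone's public key can detect a spoofed answer:
    # any tampering with the record invalidates the signature.
    try:
        verify_key.verify(signature, record)
        return True
    except InvalidSignature:
        return False

print(resolver_checks(record, signature))                       # True
print(resolver_checks(b"example.com. A 6.6.6.6", signature))    # False
```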

0:53:55 > 0:53:58To make sure these cryptographic codes remain secure,

0:53:58 > 0:54:00they are reset every three months.

0:54:02 > 0:54:06Three trusted experts from around the world have been summoned

0:54:06 > 0:54:10with three keys, keys that open these safety deposit boxes.

0:54:10 > 0:54:13So, now we are opening box 1-2-4-0.

0:54:13 > 0:54:16Inside are smart cards.

0:54:16 > 0:54:18When the three are put together,

0:54:18 > 0:54:22a master code can be validated which resets the system

0:54:22 > 0:54:24for millions of websites.

0:54:24 > 0:54:26We are very lucky because every one of these people

0:54:26 > 0:54:29are respected members of the technical community,

0:54:29 > 0:54:31technical internet community,

0:54:31 > 0:54:32and that's what we were doing.

0:54:32 > 0:54:35Just like the internet itself was formed from the bottom up,

0:54:35 > 0:54:37by techies from around the world.

0:54:37 > 0:54:39So, step 22.

0:54:40 > 0:54:43A complex set of instructions is followed

0:54:43 > 0:54:46to make sure the system is ready to operate.

0:54:46 > 0:54:49And now we need to verify the KSR.

0:54:49 > 0:54:51This is the key step.

0:54:53 > 0:54:57So, this is it. When I hit "yes", it will be signed.

0:54:59 > 0:55:01There you go, thank you very much.

0:55:01 > 0:55:03APPLAUSE

0:55:06 > 0:55:09These scientists want to use cryptography to make

0:55:09 > 0:55:13the architecture of the internet more resilient.

0:55:13 > 0:55:17But, for users, cryptography has another purpose.

0:55:17 > 0:55:21We can use it to encrypt our message content.

0:55:21 > 0:55:24So, if we want to change the way that mass surveillance is done,

0:55:24 > 0:55:26encryption, it turns out,

0:55:26 > 0:55:28is one of the ways that we do that.

0:55:28 > 0:55:30When we encrypt our data,

0:55:30 > 0:55:33we change the value that mass surveillance presents.

0:55:33 > 0:55:35Phone calls can be end-to-end encrypted

0:55:35 > 0:55:38such that no-one else can decipher their content,

0:55:38 > 0:55:41thanks to the science of mathematics, of cryptography.

0:55:43 > 0:55:46And those who understand surveillance from the inside

0:55:46 > 0:55:50agree that it's the only way to protect our communications.

0:56:21 > 0:56:23However, there's a problem.

0:56:23 > 0:56:27It's still very difficult to encrypt content in a user-friendly way.

0:56:28 > 0:56:32One of the best ways to protect your privacy is to use encryption,

0:56:32 > 0:56:35but encryption is so incredibly hard to use.

0:56:35 > 0:56:37I am a technology person

0:56:37 > 0:56:41and I struggle all the time to get my encryption products to work.

0:56:41 > 0:56:44And that's the next challenge,

0:56:44 > 0:56:47developing an internet where useable encryption

0:56:47 > 0:56:49makes bulk surveillance,

0:56:49 > 0:56:52for those who want to avoid it, more difficult.

0:56:53 > 0:56:57At the moment, the systems we're using are fairly insecure

0:56:57 > 0:56:59in lots of ways. I think the programmers out there

0:56:59 > 0:57:02have got to help build tools to make that easier.

0:57:05 > 0:57:07If encryption is to become more commonplace,

0:57:07 > 0:57:10the pressure will likely come from the market.

0:57:12 > 0:57:15The result of the revelations about the National Security Agency

0:57:15 > 0:57:18is that people are becoming quite paranoid.

0:57:18 > 0:57:21It's important to not be paralysed by that paranoia,

0:57:21 > 0:57:25but rather to express the desire for privacy

0:57:25 > 0:57:28and by expressing the desire for privacy,

0:57:28 > 0:57:30the market will fill the demand.

0:57:30 > 0:57:33By making it harder, you protect yourself,

0:57:33 > 0:57:36you protect your family and you also protect other people

0:57:36 > 0:57:39in just making it more expensive to surveil everyone all the time.

0:57:39 > 0:57:43In the end, encryption is all about mathematics

0:57:43 > 0:57:46and, for those who want more privacy,

0:57:46 > 0:57:48the numbers work in their favour.

0:57:49 > 0:57:52It turns out that it's easier in this universe

0:57:52 > 0:57:56to encrypt information, much easier, than it is to decrypt it

0:57:56 > 0:57:59if you're someone watching from the outside.

0:57:59 > 0:58:02The universe fundamentally favours privacy.
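
That asymmetry can be seen in miniature in the sketch below: producing the ciphertext takes one key and one cheap operation, while an outsider without the key faces, in effect, the whole key space. The numbers are illustrative of a 128-bit key, and Fernet here stands in for any well-implemented modern cipher.

```python
# Encrypting is one cheap operation for the key holder; an outsider without
# the key faces, in effect, a search of the entire key space.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
token = Fernet(key).encrypt(b"a private message")   # instant for the sender
print(len(token))                                   # ciphertext, readable only with the key
print(2 ** 128)                                     # keys a brute-force attacker must consider
```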

0:58:04 > 0:58:06We have reached a critical moment

0:58:06 > 0:58:08when the limits of privacy

0:58:08 > 0:58:11could be defined for a generation.

0:58:11 > 0:58:14We are beginning to grasp the shape of this new world.

0:58:16 > 0:58:19It's time to decide whether we are happy to accept it.

0:58:45 > 0:58:48'Thank you. Your identity has been verified.'