Line | From | To | |
---|---|---|---|
It's just 25 years since the World Wide Web was created. | 0:00:02 | 0:00:05 | |
It now touches all of our lives, our personal information | 0:00:05 | 0:00:09 | |
and data swirling through the internet on a daily basis. | 0:00:09 | 0:00:13 | |
Yet it's now caught in the greatest controversy of its life - | 0:00:13 | 0:00:17 | |
surveillance. | 0:00:17 | 0:00:18 | |
This is a spy master's dream. | 0:00:18 | 0:00:21 | |
No spy of the previous generations could have imagined that we | 0:00:21 | 0:00:25 | |
would all volunteer for the world's best tracking device. | 0:00:25 | 0:00:28 | |
The revelations of US intelligence contractor Edward Snowden | 0:00:28 | 0:00:32 | |
have led many to ask if the Web we love has been turned against us... | 0:00:32 | 0:00:37 | |
They don't just want your search data or your e-mail. | 0:00:37 | 0:00:40 | |
They want everything. | 0:00:40 | 0:00:42 | |
And, as you are being surveilled 24/7, you are more under control. | 0:00:42 | 0:00:47 | |
You are less free. | 0:00:47 | 0:00:49 | |
..leading to soul-searching amongst those responsible for the Web itself. | 0:00:49 | 0:00:54 | |
I used to think that in some countries you worry about the | 0:00:54 | 0:00:57 | |
government and in some countries you worry about the corporations. | 0:00:57 | 0:01:00 | |
I realise now that that was naive. | 0:01:00 | 0:01:03 | |
But thanks to a collection of brilliant thinkers | 0:01:03 | 0:01:06 | |
and researchers, science has been fighting back... | 0:01:06 | 0:01:08 | |
It's really no surprise that the privacy issue has unfolded the way it has. | 0:01:08 | 0:01:15 | |
..developing technology to defeat surveillance, protecting activists... | 0:01:15 | 0:01:20 | |
They tried to intimidate me. I knew that they don't know anything. | 0:01:20 | 0:01:24 | |
..and in the process coming into conflict with global power. | 0:01:24 | 0:01:28 | |
They are detaining me at airports, threatening me. | 0:01:28 | 0:01:33 | |
But now, thanks to a new digital currency... | 0:01:33 | 0:01:36 | |
If you're buying less than half a Bitcoin, you'll have to go to market price. | 0:01:36 | 0:01:40 | |
..this technology is sending law makers into a panic, | 0:01:40 | 0:01:43 | |
due to the growth of a new black market in the Dark Web. | 0:01:43 | 0:01:47 | |
It was like a buffet dinner for narcotics. | 0:01:47 | 0:01:51 | |
Our detection rate is dropping. It's risk-free crime. | 0:01:51 | 0:01:55 | |
This is the story of a battle to shape the technology which now | 0:01:55 | 0:01:59 | |
defines our world, and whoever wins will influence not only the | 0:01:59 | 0:02:03 | |
future of the internet but the very idea of what it means to be free. | 0:02:03 | 0:02:09 | |
The sickness that befalls the internet is something that | 0:02:09 | 0:02:11 | |
befalls the whole world. | 0:02:11 | 0:02:13 | |
Where does the outside world stop and private space begin? | 0:02:24 | 0:02:29 | |
What details of your life are you willing to share with strangers? | 0:02:29 | 0:02:33 | |
Take this house. | 0:02:36 | 0:02:38 | |
Every morning, lights come on. | 0:02:40 | 0:02:44 | |
Coffee brews - automatically. | 0:02:44 | 0:02:48 | |
Technology just like this is increasingly being | 0:02:50 | 0:02:53 | |
installed into millions of homes across the world | 0:02:53 | 0:02:56 | |
and it promises to change the way we live for ever. | 0:02:56 | 0:03:01 | |
So there are sensors of all kinds, | 0:03:01 | 0:03:04 | |
there's lights and locks and thermostats | 0:03:04 | 0:03:05 | |
and once they're connected | 0:03:05 | 0:03:07 | |
then our platform can make them do whatever you want them to do. | 0:03:07 | 0:03:10 | |
So as an example, | 0:03:10 | 0:03:11 | |
if I wake up in the morning, the house knows that I'm waking up. | 0:03:11 | 0:03:15 | |
It can wake up with me. | 0:03:15 | 0:03:16 | |
When we walk in the kitchen, it will play the local news | 0:03:16 | 0:03:20 | |
and sort of greet us into the day, tell us the weather forecast | 0:03:20 | 0:03:23 | |
so we know how to dress for the day and so on. | 0:03:23 | 0:03:26 | |
This technology is known as the internet of things, | 0:03:26 | 0:03:30 | |
where the objects in our houses - kitchen appliances, | 0:03:30 | 0:03:33 | |
anything electronic - can be connected to the internet. | 0:03:33 | 0:03:37 | |
But for it to be useful, we're going to have to share intimate details of our private life. | 0:03:37 | 0:03:43 | |
So this is my things app. | 0:03:43 | 0:03:46 | |
It can run on your mobile phone or on a tablet or something like that. | 0:03:46 | 0:03:49 | |
I can do things like look at the comings | 0:03:49 | 0:03:51 | |
and goings of family members. | 0:03:51 | 0:03:52 | |
It can automatically detect when we come and go | 0:03:52 | 0:03:54 | |
based on our mobile phones or you can have it detect your presence | 0:03:54 | 0:03:58 | |
with a little sensor, you can put it in your car or something like that. | 0:03:58 | 0:04:01 | |
So when we leave the house, and there's no-one home, that's when it'll lock up | 0:04:01 | 0:04:04 | |
and shut down all the electricity used and so on. | 0:04:04 | 0:04:07 | |
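As a minimal sketch of the presence rule just described (all class and device names here are hypothetical, not a real vendor API):

```python
# Toy presence rule: when the last tracked phone leaves,
# lock the doors and cut non-essential power.
class House:
    def __init__(self):
        self.locked = False
        self.power = "on"

    def apply_presence(self, phones_home):
        # Platform rule: no-one home -> lock up and shut down electricity.
        if not phones_home:
            self.locked = True
            self.power = "off"

house = House()
house.apply_presence(phones_home=[])   # everyone's phone has left
print(house.locked, house.power)       # True off
```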
For most of us, this is deeply private information. | 0:04:07 | 0:04:11 | |
Yet once we hand it over, | 0:04:11 | 0:04:12 | |
we have to trust a company to keep it confidential. | 0:04:12 | 0:04:14 | |
The consumer really owns 100% of their own data so they're opting in. | 0:04:14 | 0:04:18 | |
It's not something where that data would ever be shared | 0:04:18 | 0:04:21 | |
without their giving their permission. | 0:04:21 | 0:04:23 | |
This house represents a new normal where even the movements | 0:04:28 | 0:04:33 | |
within our own home are documented and stored. | 0:04:33 | 0:04:36 | |
It's a new frontier. | 0:04:36 | 0:04:38 | |
The internet is asking us to redefine what we consider private. | 0:04:40 | 0:04:45 | |
To understand the enormous changes taking place, | 0:04:54 | 0:04:58 | |
it's necessary to come here. | 0:04:58 | 0:05:00 | |
Almost 150 years ago, this hut was on the frontier of the world's | 0:05:00 | 0:05:03 | |
first information revolution - the telegraph. | 0:05:03 | 0:05:08 | |
It was a crucial hub for a global network of wires - | 0:05:10 | 0:05:13 | |
a role that is just as important today. | 0:05:13 | 0:05:16 | |
Seeing the cables in a room like this shows that the physical | 0:05:18 | 0:05:21 | |
infrastructure needed to move information | 0:05:21 | 0:05:24 | |
around the world hasn't changed very much in the past hundred years. | 0:05:24 | 0:05:28 | |
I think that the internet, | 0:05:28 | 0:05:30 | |
we tend to think of as a cloud floating somewhere off in cyberspace, | 0:05:30 | 0:05:33 | |
but they're physical wires, physical cables | 0:05:33 | 0:05:35 | |
and sometimes wireless signals that are communicating with each other. | 0:05:35 | 0:05:40 | |
Cornwall, where the telegraph cables come ashore, | 0:05:40 | 0:05:44 | |
remains crucial for today's internet. | 0:05:44 | 0:05:48 | |
25% of all traffic passes through here. | 0:05:48 | 0:05:52 | |
Running from the United States | 0:05:52 | 0:05:54 | |
and other places to the United Kingdom are a large number of the | 0:05:54 | 0:05:57 | |
most significant fibre-optic cables that carry huge amounts of data. | 0:05:57 | 0:06:03 | |
Alongside this information super-highway is a site | 0:06:03 | 0:06:06 | |
belonging to the UK Government, GCHQ Bude. | 0:06:06 | 0:06:10 | |
We now know that this listening station has been gathering | 0:06:12 | 0:06:15 | |
and analysing everything that comes across these wires. | 0:06:15 | 0:06:20 | |
Any data that passes across the internet could theoretically come | 0:06:20 | 0:06:23 | |
down these cables, so that's e-mails, websites, BitTorrent downloads, | 0:06:23 | 0:06:28 | |
the films that you're accessing through Netflix and online services. | 0:06:28 | 0:06:33 | |
The sheer amount of data captured here is almost impossible to comprehend. | 0:06:33 | 0:06:38 | |
In terms of what GCHQ were looking at, we know | 0:06:38 | 0:06:41 | |
from internal documents that, in 2011, they were tapping 200 | 0:06:41 | 0:06:45 | |
ten-gigabit cables coming into Cornwall. | 0:06:45 | 0:06:48 | |
To give a rough idea of how much data that is, if you were to | 0:06:48 | 0:06:52 | |
digitise the entire contents of the British Library, then you | 0:06:52 | 0:06:55 | |
could transfer it down that set of cables in about 40 seconds. | 0:06:55 | 0:07:00 | |
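A back-of-envelope check of that claim, sketched in Python under one assumption the film doesn't state: a digitised British Library of roughly 10 terabytes.

```python
# 200 cables x 10 Gbit/s, converted to bytes per second.
rate_bytes = 200 * 10e9 / 8        # 2.5e11 B/s, i.e. 250 GB/s
library_bytes = 10e12              # assumed ~10 TB for the British Library
print(library_bytes / rate_bytes)  # -> 40.0 seconds, matching the claim
```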
Tapping the wires is surprisingly simple. | 0:07:00 | 0:07:03 | |
The data carried by the fibre-optic cable just needs to be diverted. | 0:07:11 | 0:07:17 | |
A fibre-optic cable signal is a beam of light | 0:07:17 | 0:07:20 | |
travelling down a cable made from glass. | 0:07:20 | 0:07:25 | |
Pulses of light represent the pieces of information | 0:07:25 | 0:07:28 | |
travelling across the internet, which is the e-mails, the web pages, | 0:07:28 | 0:07:31 | |
everything that's going over the internet. | 0:07:31 | 0:07:35 | |
Every 50 miles or so, that signal becomes sufficiently | 0:07:35 | 0:07:38 | |
weak that it needs to be repeated, and this is the weak spot. | 0:07:38 | 0:07:42 | |
And it's very easy to insert an optical tap at that point. | 0:07:43 | 0:07:48 | |
And that's just what GCHQ did. | 0:07:48 | 0:07:52 | |
A device was placed into the beam of data which created a mirror | 0:07:52 | 0:07:56 | |
image of the millions of e-mails, web searches | 0:07:56 | 0:07:59 | |
and internet traffic passing through the cables every second. | 0:07:59 | 0:08:02 | |
What you effectively get is two copies of the signal, one going | 0:08:05 | 0:08:09 | |
off to GCHQ and one carrying on to its original destination. | 0:08:09 | 0:08:13 | |
All of the information going over those cables can be | 0:08:22 | 0:08:25 | |
replayed over the course of three days, so you can rewind | 0:08:25 | 0:08:29 | |
and see what was going over the internet at a particular moment. | 0:08:29 | 0:08:32 | |
Analysing this amount of data is an impressive achievement | 0:08:32 | 0:08:35 | |
but it also attracts criticism. | 0:08:35 | 0:08:38 | |
When you see the capacity and the potential for that technology, | 0:08:38 | 0:08:42 | |
and the fact that it is being used without transparency | 0:08:42 | 0:08:45 | |
and without very high levels of accountability, it's incredibly | 0:08:45 | 0:08:49 | |
concerning because the power of that data to predict and analyse what | 0:08:49 | 0:08:53 | |
we're going to do is very, very high | 0:08:53 | 0:08:55 | |
and giving that power to somebody else, | 0:08:55 | 0:08:57 | |
regardless of the original or stated intentions, is very worrying. | 0:08:57 | 0:09:01 | |
We only know about the GCHQ project thanks to documents released | 0:09:09 | 0:09:13 | |
by US whistle-blower Edward Snowden, and the revelation has begun | 0:09:13 | 0:09:18 | |
to change the way many people think about privacy and the internet - | 0:09:18 | 0:09:23 | |
among them the inventor of the World Wide Web, Tim Berners-Lee. | 0:09:23 | 0:09:27 | |
Information is a funny sort of power. | 0:09:27 | 0:09:30 | |
The way a government can use it to | 0:09:30 | 0:09:34 | |
keep control of its citizens | 0:09:34 | 0:09:35 | |
is insidious and sneaky, nasty in some cases. | 0:09:35 | 0:09:39 | |
We have to all learn more about it | 0:09:39 | 0:09:43 | |
and we have to in a way rethink, rebase a lot of our philosophy. | 0:09:43 | 0:09:49 | |
Part of changing that philosophy is to understand that our lives are | 0:09:50 | 0:09:54 | |
not analysed in the first instance by people, | 0:09:54 | 0:09:58 | |
but by computer programs. | 0:09:58 | 0:10:00 | |
Some people just don't have an understanding about what's possible. | 0:10:01 | 0:10:05 | |
I've heard people say, "Well, we know nobody's reading our e-mails | 0:10:05 | 0:10:08 | |
"because they don't have enough people." | 0:10:08 | 0:10:10 | |
Actually, hello, it's not... These e-mails are not being read by people. | 0:10:10 | 0:10:13 | |
They're being read by machines. | 0:10:13 | 0:10:15 | |
They're being read by machines which can do the sorts of things | 0:10:17 | 0:10:20 | |
that search engines do and can look at all the e-mails | 0:10:20 | 0:10:23 | |
and at all the social connections and can watch. | 0:10:23 | 0:10:27 | |
Sometimes these machines learn how to spot trends | 0:10:27 | 0:10:31 | |
and can build systems which will just watch a huge amount of data | 0:10:31 | 0:10:36 | |
and start to pick things out and then will suggest to the | 0:10:36 | 0:10:40 | |
security agency, well, "These people need to be investigated." | 0:10:40 | 0:10:44 | |
But then the thing could be wrong. Boy, we have to have a protection. | 0:10:46 | 0:10:50 | |
The Snowden revelations have generated greater interest | 0:10:52 | 0:10:55 | |
than ever in how the internet is being used for the purposes | 0:10:55 | 0:10:58 | |
of surveillance. | 0:10:58 | 0:11:01 | |
But watching isn't just done by governments. | 0:11:01 | 0:11:04 | |
The most detailed documenting of our lives is done by technology companies. | 0:11:08 | 0:11:13 | |
Two years ago, tech researcher Julia Angwin decided to investigate | 0:11:13 | 0:11:17 | |
how much these companies track our behaviour daily. | 0:11:17 | 0:11:21 | |
Her findings give us one of the best pictures yet of just who is | 0:11:21 | 0:11:25 | |
watching us online every minute of every day. | 0:11:25 | 0:11:29 | |
When I talk about the underbelly of the information revolution, | 0:11:29 | 0:11:32 | |
I'm really talking about the unseen downside | 0:11:32 | 0:11:37 | |
of the information revolution. | 0:11:37 | 0:11:38 | |
You know, we obviously have seen all the benefits of having | 0:11:38 | 0:11:41 | |
all this information at our fingertips. | 0:11:41 | 0:11:43 | |
But we're just awakening to the fact that we're also being | 0:11:43 | 0:11:46 | |
surveilled all the time. | 0:11:46 | 0:11:48 | |
We're being monitored in ways that were never before possible. | 0:11:48 | 0:11:52 | |
Every time we browse the internet, | 0:11:55 | 0:11:57 | |
what we do can be collated and sold to advertisers. | 0:11:57 | 0:12:02 | |
So, basically, online there are hundreds of companies that | 0:12:02 | 0:12:05 | |
sort of install invisible tracking technology on websites. | 0:12:05 | 0:12:09 | |
They've installed basically a serial number on your computer | 0:12:13 | 0:12:16 | |
and they watch you whenever they see you across the Web | 0:12:16 | 0:12:19 | |
and build a dossier about your reading habits, | 0:12:19 | 0:12:21 | |
your shopping habits, whatever they can obtain. | 0:12:21 | 0:12:24 | |
And then there's a real market for that data. | 0:12:24 | 0:12:26 | |
They buy and sell it | 0:12:26 | 0:12:27 | |
and there's an online auction bidding for information about you. | 0:12:27 | 0:12:31 | |
And so the people who know your browsing habits can also discover | 0:12:31 | 0:12:35 | |
your deepest secrets, and then sell them to the highest bidder. | 0:12:35 | 0:12:40 | |
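A minimal sketch of the mechanism Angwin describes, with hypothetical names throughout: a third-party tracker plants a random ID (the "serial number") the first time it sees a browser, then files every page view it witnesses under that ID.

```python
import uuid
from collections import defaultdict

cookies = {}                    # simulated per-browser cookie jar
dossiers = defaultdict(list)    # tracker database: ID -> reading habits

def track(browser, url):
    # First visit: plant a random ID; afterwards: recognise and log.
    tracker_id = cookies.setdefault(browser, str(uuid.uuid4()))
    dossiers[tracker_id].append(url)

track("alices_laptop", "news.example/politics")
track("alices_laptop", "shop.example/pregnancy-tests")
print(dict(dossiers))   # no name attached, yet a saleable profile
```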
So let's say a woman takes a pregnancy test | 0:12:42 | 0:12:44 | |
and finds out she's pregnant. | 0:12:44 | 0:12:46 | |
Then she might go look for something online. | 0:12:46 | 0:12:49 | |
By making pregnancy-related searches, this woman has become a | 0:12:49 | 0:12:53 | |
hot property for the people looking for a good target for advertising. | 0:12:53 | 0:12:58 | |
The process of selling then begins. | 0:12:58 | 0:13:00 | |
Within seconds, she is identified by the companies watching her. | 0:13:02 | 0:13:07 | |
Her profile is now sold multiple times. | 0:13:07 | 0:13:09 | |
She will then find herself bombarded with ads relating to pregnancy. | 0:13:11 | 0:13:15 | |
She may well find that she's been followed around the Web by ads | 0:13:17 | 0:13:20 | |
and that's really a result of that auction house. | 0:13:20 | 0:13:23 | |
Now, they will say to you that they know she's pregnant but they don't know her name | 0:13:23 | 0:13:27 | |
but what's happening is now that, | 0:13:27 | 0:13:29 | |
more and more, that information is really not that anonymous. | 0:13:29 | 0:13:32 | |
You know it's sort of like if they know you're pregnant, | 0:13:32 | 0:13:34 | |
and they know where you live, yes maybe they don't know your name. | 0:13:34 | 0:13:37 | |
That's just because they haven't bothered to look it up. | 0:13:37 | 0:13:40 | |
We continually create new information about ourselves and this | 0:13:40 | 0:13:43 | |
information gives a continual window into our behaviours and habits. | 0:13:43 | 0:13:48 | |
The next stage of surveillance comes from the offices | 0:13:48 | 0:13:51 | |
and buildings around us, | 0:13:51 | 0:13:55 | |
sending out Wi-Fi signals across the city. | 0:13:55 | 0:13:59 | |
OK, so let's see how many signals we have right here, Wi-Fi. | 0:13:59 | 0:14:03 | |
So we have one, two - 16, 17, 18, 19, | 0:14:03 | 0:14:08 | |
20, 21, oh, and a few more just added themselves, | 0:14:08 | 0:14:11 | |
so we're around 25 right now. | 0:14:11 | 0:14:13 | |
Right there at this point we have 25 signals that | 0:14:13 | 0:14:15 | |
are reaching out, basically | 0:14:15 | 0:14:16 | |
sending a little signal to my phone saying, "I'm here," | 0:14:16 | 0:14:18 | |
and my phone is sending a signal back saying, "I'm also here." | 0:14:18 | 0:14:21 | |
As long as your phone's Wi-Fi connection is on | 0:14:21 | 0:14:23 | |
and connected to a signal, you can be tracked. | 0:14:23 | 0:14:28 | |
Google, Apple, other big companies are racing to map the whole | 0:14:28 | 0:14:31 | |
world using Wi-Fi signals and then, whenever your phone is somewhere, | 0:14:31 | 0:14:35 | |
they know exactly how far you are from the closest Wi-Fi | 0:14:35 | 0:14:38 | |
signal and they can map you much more precisely even than GPS. | 0:14:38 | 0:14:41 | |
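One way such a Wi-Fi map can be used, sketched under simplifying assumptions: real services fit signal-propagation models, while this toy just takes a signal-strength-weighted centroid, and every access point here is invented.

```python
# Hypothetical surveyed access points: BSSID -> (latitude, longitude).
AP_MAP = {
    "aa:bb:cc:01": (51.5007, -0.1246),
    "aa:bb:cc:02": (51.5011, -0.1251),
    "aa:bb:cc:03": (51.5003, -0.1253),
}

def locate(scan):
    """Estimate position as a signal-strength-weighted centroid of APs."""
    total = sum(scan.values())
    lat = sum(AP_MAP[b][0] * s for b, s in scan.items()) / total
    lon = sum(AP_MAP[b][1] * s for b, s in scan.items()) / total
    return lat, lon

# The phone reports which networks it sees and how strongly (linear units).
print(locate({"aa:bb:cc:01": 0.9, "aa:bb:cc:02": 0.4, "aa:bb:cc:03": 0.2}))
```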
And we now know that this data has been seized by governments | 0:14:41 | 0:14:45 | |
as part of their internet surveillance operations. | 0:14:45 | 0:14:50 | |
The NSA was like, "Oh, that's an awesome way to track people. | 0:14:50 | 0:14:53 | |
"Let's scoop up that information too." | 0:14:53 | 0:14:57 | |
So we have seen that the governments find all this data irresistible. | 0:14:57 | 0:15:02 | |
But is this simply a matter of principle? | 0:15:05 | 0:15:08 | |
Is losing privacy ultimately the price | 0:15:15 | 0:15:18 | |
we pay for peaceful streets and freedom from terror attacks? | 0:15:18 | 0:15:22 | |
This man thinks we should look deeper. | 0:15:26 | 0:15:28 | |
Bruce Schneier is a leading internet security expert. | 0:15:31 | 0:15:34 | |
He was part of the team which first analysed the Snowden documents. | 0:15:34 | 0:15:39 | |
They don't just want your search data | 0:15:39 | 0:15:41 | |
or your e-mail. They want everything. | 0:15:41 | 0:15:43 | |
They want to tie it to your real world behaviours - location | 0:15:43 | 0:15:46 | |
data from your cellphone - and it's these correlations. | 0:15:46 | 0:15:49 | |
And as you are being surveilled 24/7, you are more under control. | 0:15:49 | 0:15:56 | |
Right? You are less free, you are less autonomous. | 0:15:56 | 0:16:00 | |
Schneier believes the ultimate result of all this | 0:16:00 | 0:16:03 | |
surveillance may be a loss of freedom. | 0:16:03 | 0:16:05 | |
What data does is give someone control over you. | 0:16:05 | 0:16:09 | |
The reason Google and Facebook are collecting this | 0:16:09 | 0:16:12 | |
is for psychological manipulation. | 0:16:12 | 0:16:15 | |
That's their stated business purpose, right? Advertising. | 0:16:15 | 0:16:20 | |
They want to convince you | 0:16:20 | 0:16:22 | |
to buy things you might not want to buy otherwise. | 0:16:22 | 0:16:26 | |
And so that data is all about control. | 0:16:26 | 0:16:28 | |
Governments collect it also for control. Right? | 0:16:30 | 0:16:33 | |
They want to control their population. | 0:16:33 | 0:16:35 | |
Maybe they're concerned about dissidents, | 0:16:35 | 0:16:37 | |
maybe they're concerned about criminals. | 0:16:37 | 0:16:40 | |
It's hard to imagine that this data can exert such | 0:16:43 | 0:16:46 | |
a level of potential control over us but looking for the patterns | 0:16:46 | 0:16:51 | |
in the data can give anyone analysing it huge power. | 0:16:51 | 0:16:57 | |
What's become increasingly understood is how much is | 0:16:59 | 0:17:03 | |
revealed by meta-data, by the information about who is | 0:17:03 | 0:17:06 | |
sending messages and how often they're sending them to each other. | 0:17:06 | 0:17:09 | |
This reveals information about our social networks, it can | 0:17:09 | 0:17:12 | |
reveal information about the places that we go on a day-to-day basis. | 0:17:12 | 0:17:17 | |
All this meta-data stacks up to allow a very detailed | 0:17:17 | 0:17:20 | |
view into our lives and increasingly what | 0:17:20 | 0:17:23 | |
we find is that these patterns of communication can even be | 0:17:23 | 0:17:26 | |
used in a predictive sense to determine factors about our lives. | 0:17:26 | 0:17:31 | |
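A small illustration of that point: even toy (sender, recipient, hour) records, with no content at all, surface a social graph and a daily routine. The records below are invented.

```python
from collections import Counter

records = [                      # invented metadata: sender, recipient, hour
    ("alice", "bob", 8), ("alice", "bob", 9), ("alice", "clinic", 9),
    ("bob", "alice", 21), ("alice", "bob", 8),
]
ties = Counter((s, r) for s, r, _ in records)
hours = Counter(h for _, _, h in records)
print(ties.most_common(1))   # strongest relationship, no message read
print(hours.most_common(1))  # most active hour of the day
```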
And what some people fear is what the spread of this analysis could lead to. | 0:17:31 | 0:17:36 | |
Ultimately the concern of this is that, in a dystopian scenario, | 0:17:36 | 0:17:40 | |
you have a situation where every facet of your life is | 0:17:40 | 0:17:44 | |
something that is open to analysis by whoever has access to the data. | 0:17:44 | 0:17:48 | |
They can look at the likelihood that you should be given health insurance | 0:17:48 | 0:17:53 | |
because you come from a family that has a history of heart disease. | 0:17:53 | 0:17:57 | |
They can look at the likelihood that you're going to get | 0:17:57 | 0:18:00 | |
Alzheimer's disease because of the amount of active intellectual | 0:18:00 | 0:18:04 | |
entertainment you take part in on a day-to-day basis. | 0:18:04 | 0:18:06 | |
And when you start giving this level of minute control over | 0:18:06 | 0:18:10 | |
people's lives, then you allow far too much power over individuals. | 0:18:10 | 0:18:14 | |
The picture painted is certainly dark but there is another way. | 0:18:17 | 0:18:23 | |
It comes from the insights of a scientist whose work once | 0:18:27 | 0:18:31 | |
seemed like a footnote in the history of the internet - until now. | 0:18:31 | 0:18:37 | |
The story begins in the late '70s in Berkeley, California. | 0:18:42 | 0:18:46 | |
Governments and companies had begun to harness the power of computing. | 0:18:53 | 0:18:57 | |
They seemed to promise a future of efficiency, | 0:19:00 | 0:19:03 | |
of problems being overcome thanks to technology. | 0:19:03 | 0:19:06 | |
Yet not everyone was so convinced. | 0:19:09 | 0:19:12 | |
David Chaum was a computer scientist at Berkeley. | 0:19:12 | 0:19:16 | |
For him, a world in which we would become increasingly joined | 0:19:16 | 0:19:19 | |
together by machines in a network held grave dangers. | 0:19:19 | 0:19:23 | |
As computing advanced, he grew increasingly | 0:19:26 | 0:19:29 | |
convinced of the threats these networks could pose. | 0:19:29 | 0:19:32 | |
David Chaum was very far ahead of his time. | 0:19:33 | 0:19:36 | |
He predicted in the early 1980s concerns that would | 0:19:36 | 0:19:40 | |
arise on the internet 15 or 20 years later - | 0:19:40 | 0:19:43 | |
the whole field of traffic analysis | 0:19:43 | 0:19:45 | |
that allows you to predict the behaviours of individuals, | 0:19:45 | 0:19:47 | |
not by looking at the contents of their e-mails but by looking at the patterns of communication. | 0:19:47 | 0:19:52 | |
David Chaum to some extent foresaw that and solved the problem. | 0:19:52 | 0:19:56 | |
Well, it's sad to me but it is really no surprise | 0:20:01 | 0:20:06 | |
that the privacy issue has unfolded the way it has. | 0:20:06 | 0:20:10 | |
I spelled it out in the early publications in the '80s. | 0:20:10 | 0:20:16 | |
Chaum's papers explained that in a future world where we would | 0:20:16 | 0:20:19 | |
increasingly use computers, it would be easy | 0:20:19 | 0:20:22 | |
to conduct mass surveillance. | 0:20:22 | 0:20:25 | |
Chaum wanted to find a way to stop it. | 0:20:25 | 0:20:29 | |
I always had a deep feeling that privacy is intimately tied to | 0:20:29 | 0:20:34 | |
human potential and | 0:20:34 | 0:20:38 | |
that it's an extraordinarily important aspect of democracy. | 0:20:38 | 0:20:43 | |
Chaum focused on the new technology of e-mails. | 0:20:46 | 0:20:48 | |
Anyone watching the network through which these messages | 0:20:50 | 0:20:53 | |
travelled could find out enormous amounts about that person. | 0:20:53 | 0:20:57 | |
He wanted to make this more difficult. | 0:20:57 | 0:20:59 | |
Well, I was driving from Berkeley to Santa Barbara | 0:21:01 | 0:21:05 | |
along the coastline in my VW Camper van and | 0:21:05 | 0:21:09 | |
out of nowhere... You know, it was | 0:21:09 | 0:21:11 | |
beautiful scenery, I was just driving along and it occurred to me | 0:21:11 | 0:21:14 | |
how to solve this problem I'd been trying to solve for a long time. | 0:21:14 | 0:21:18 | |
Yeah, it was a kind of a, you know, a eureka-moment type. | 0:21:18 | 0:21:21 | |
I felt like, "Hey, this is it." | 0:21:21 | 0:21:23 | |
Chaum's focus was the pattern of communications that | 0:21:26 | 0:21:29 | |
a computer made on the network. | 0:21:29 | 0:21:31 | |
If that pattern could be disguised using cryptography, it would | 0:21:31 | 0:21:35 | |
be harder for anyone watching to identify individuals and carry out | 0:21:35 | 0:21:39 | |
effective surveillance, | 0:21:39 | 0:21:42 | |
and Chaum's system had a twist. | 0:21:42 | 0:21:46 | |
Cryptography has traditionally been used to provide | 0:21:46 | 0:21:50 | |
secrecy for message content and so I used this message secrecy | 0:21:50 | 0:21:57 | |
technology of encryption to actually protect the meta-data of who | 0:21:57 | 0:22:03 | |
talks to who and when, and that was quite a paradigm shift. | 0:22:03 | 0:22:09 | |
Chaum had realised something about surveillance. | 0:22:09 | 0:22:12 | |
Who we talk to and when is just as important as what we say. | 0:22:12 | 0:22:17 | |
The key to avoiding this type of traffic analysis was to | 0:22:19 | 0:22:23 | |
render the user effectively anonymous. | 0:22:23 | 0:22:25 | |
But he realised that wasn't enough. | 0:22:30 | 0:22:34 | |
He wanted to build a secure network and to do this | 0:22:34 | 0:22:39 | |
he needed more anonymous users. | 0:22:39 | 0:22:41 | |
One cannot be anonymous alone. | 0:22:44 | 0:22:46 | |
One can only be anonymous relative to a set of people. | 0:22:46 | 0:22:52 | |
The more anonymous users you can gather together in a network, | 0:22:52 | 0:22:56 | |
the harder it becomes for someone watching to keep track of them, | 0:22:56 | 0:23:00 | |
especially if they're mixed up. | 0:23:00 | 0:23:03 | |
And so a whole batch of input messages from different | 0:23:03 | 0:23:07 | |
people are shuffled and then sent to another computer, then shuffled | 0:23:07 | 0:23:14 | |
again and so forth, and you can't tell as an observer of the network | 0:23:14 | 0:23:19 | |
which item that went in corresponds to which item coming out. | 0:23:19 | 0:23:24 | |
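A toy version of that shuffle, as a sketch only: a real mix changes each message's appearance with public-key decryption, which is faked here with a wrapper string.

```python
import random

def peel(ciphertext):
    # Stand-in for decrypting this mix's layer of encryption.
    return ciphertext[len("enc("):-1]

def mix_node(batch):
    out = [peel(m) for m in batch]   # every message changes appearance...
    random.shuffle(out)              # ...and leaves in an unrelated order
    return out

incoming = [f"enc(note {i})" for i in range(5)]
print(mix_node(incoming))  # an observer can't match inputs to outputs
```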
David Chaum was trying to provide protection against a world in which | 0:23:28 | 0:23:33 | |
our communications would be analysed and potentially used against us. | 0:23:33 | 0:23:37 | |
Chaum's response to this was to say, in order to have a free society, | 0:23:37 | 0:23:43 | |
we need to have freedom from analysis of our behaviours and our communications. | 0:23:43 | 0:23:48 | |
But Chaum's system didn't take off because communication using | 0:23:48 | 0:23:51 | |
e-mail was still the preserve of a few academics and technicians. | 0:23:51 | 0:23:55 | |
Yet his insights weren't forgotten. | 0:23:58 | 0:24:00 | |
Within a decade, the arrival of the World Wide Web took | 0:24:03 | 0:24:07 | |
communication increasingly online. | 0:24:07 | 0:24:10 | |
The US Government understood the importance of protecting | 0:24:16 | 0:24:19 | |
its own online communications from surveillance. | 0:24:19 | 0:24:22 | |
It began to put money into research. | 0:24:22 | 0:24:24 | |
At the US Naval Research Laboratory, | 0:24:29 | 0:24:32 | |
a team led by scientist Paul Syverson got to work. | 0:24:32 | 0:24:37 | |
Suppose we wanted to have a system where people could communicate | 0:24:41 | 0:24:46 | |
back to their home office or with each other over the internet | 0:24:46 | 0:24:51 | |
but without people being able to associate source and destination. | 0:24:51 | 0:24:58 | |
Syverson soon came across the work of David Chaum. | 0:24:58 | 0:25:02 | |
The first work which is associated with this area is | 0:25:02 | 0:25:07 | |
the work of David Chaum. | 0:25:07 | 0:25:08 | |
A Chaum mix basically gets its security | 0:25:09 | 0:25:13 | |
because it takes in a bunch of messages | 0:25:13 | 0:25:16 | |
and then re-orders them and changes their appearance and spews them out. | 0:25:16 | 0:25:22 | |
But this was now the age of the World Wide Web. | 0:25:22 | 0:25:26 | |
The Navy wanted to develop anonymous communications for this new era. | 0:25:26 | 0:25:30 | |
So Syverson and his colleagues set to work on building a system | 0:25:34 | 0:25:38 | |
that could be used by operatives across the world. | 0:25:38 | 0:25:41 | |
You have enough of a network with enough distribution that it's | 0:25:45 | 0:25:51 | |
going to be very hard for an adversary to be in all | 0:25:51 | 0:25:54 | |
the places and to see all the traffic wherever it is. | 0:25:54 | 0:25:59 | |
Syverson's system was called the Tor network. | 0:25:59 | 0:26:03 | |
Tor stands for "the onion router". It works like this. | 0:26:03 | 0:26:07 | |
A user wants to visit a website but doesn't want to | 0:26:10 | 0:26:13 | |
reveal their IP address, the marker that identifies their computer. | 0:26:13 | 0:26:18 | |
As they send the request, three layers of encryption | 0:26:18 | 0:26:21 | |
are placed around it like the layers of an onion. | 0:26:21 | 0:26:25 | |
The message is then sent through a series of computers which | 0:26:25 | 0:26:28 | |
have volunteered to act as relay points. | 0:26:28 | 0:26:31 | |
As the message passes from computer to computer, | 0:26:33 | 0:26:36 | |
a layer of encryption is removed. | 0:26:36 | 0:26:39 | |
Each time it is removed, all the relay computer can see | 0:26:39 | 0:26:43 | |
is an order which tells it to pass the message on. | 0:26:43 | 0:26:46 | |
The final computer relay decrypts the innermost | 0:26:46 | 0:26:49 | |
layer of encryption, revealing the content of the communication. | 0:26:49 | 0:26:53 | |
However - importantly - the identity of the user is hidden. | 0:26:53 | 0:26:57 | |
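The layering can be sketched with off-the-shelf symmetric encryption, here the `cryptography` package's Fernet. This is an illustration, not Tor's actual protocol: real Tor negotiates a separate key with each relay, which the client below is simply assumed to hold already.

```python
from cryptography.fernet import Fernet

relay_keys = [Fernet.generate_key() for _ in range(3)]  # entry, middle, exit

def wrap_onion(message: bytes, keys) -> bytes:
    # Innermost layer belongs to the exit relay, so wrap in reverse order.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

onion = wrap_onion(b"fetch example.com", relay_keys)

# Each relay peels exactly one layer and passes the rest along; only the
# exit sees the request, and only the entry ever saw the user.
for key in relay_keys:
    onion = Fernet(key).decrypt(onion)
print(onion)
```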
Somebody who wants to look at things around the Web and not necessarily | 0:27:00 | 0:27:06 | |
have people know what he's interested in. It might just be the | 0:27:06 | 0:27:10 | |
local internet service provider, he doesn't want them to know | 0:27:10 | 0:27:13 | |
which things he's looking at, but it might be also the destination. | 0:27:13 | 0:27:17 | |
Syverson's system worked. | 0:27:20 | 0:27:21 | |
It was now possible to surf the net without being watched. | 0:27:24 | 0:27:28 | |
As David Chaum had observed, the more anonymous people, | 0:27:30 | 0:27:34 | |
the better the security. | 0:27:34 | 0:27:36 | |
The Navy had what they believed was a smart way of achieving that... | 0:27:36 | 0:27:40 | |
..open the network out to everyone. | 0:27:45 | 0:27:47 | |
It's not enough for a government system to carry traffic just for the government. | 0:27:50 | 0:27:55 | |
It also has to carry traffic for other people. | 0:27:55 | 0:27:59 | |
Part of anonymity is having a large number of people who are also | 0:27:59 | 0:28:02 | |
anonymous because you can't be anonymous on your own. | 0:28:02 | 0:28:06 | |
What Syverson and his team had done, building on the work of David Chaum, | 0:28:06 | 0:28:11 | |
would begin to revolutionise the way that people could operate online. | 0:28:11 | 0:28:15 | |
Over the coming years, | 0:28:19 | 0:28:21 | |
the Tor network expanded as more people volunteered to become | 0:28:21 | 0:28:25 | |
relay computers, the points through which the messages could be relayed. | 0:28:25 | 0:28:30 | |
What Tor did was it made a useable system for people. | 0:28:30 | 0:28:33 | |
People wanted to protect what they were | 0:28:33 | 0:28:35 | |
looking at on the internet from being watched by their ISP or | 0:28:35 | 0:28:39 | |
their government or the company that they're working for at the time. | 0:28:39 | 0:28:42 | |
But Tor's success wasn't just down to the Navy. | 0:28:42 | 0:28:46 | |
In the mid-2000s, they handed the network over to a non-profit | 0:28:50 | 0:28:54 | |
organisation which overhauled the system. | 0:28:54 | 0:28:57 | |
Now the network would be represented by people like this - | 0:28:59 | 0:29:03 | |
Jake Appelbaum... | 0:29:03 | 0:29:05 | |
..researchers dedicated to the opportunities | 0:29:07 | 0:29:10 | |
they felt Tor could give for free speech. | 0:29:10 | 0:29:12 | |
We work with the research community | 0:29:16 | 0:29:18 | |
all around the world, the academic community, the hacker community. | 0:29:18 | 0:29:21 | |
It's a free software project | 0:29:21 | 0:29:23 | |
so that means that all the source code is available. | 0:29:23 | 0:29:25 | |
That means that anyone can look at it and see how it works, | 0:29:25 | 0:29:27 | |
and that means everybody that does and shares it with us helps improve | 0:29:27 | 0:29:31 | |
the program for everyone else on the planet and the network as a whole. | 0:29:31 | 0:29:35 | |
Appelbaum now travels the world promoting the use of the software. | 0:29:36 | 0:29:43 | |
The Tor network gives each person the ability to read without | 0:29:43 | 0:29:46 | |
creating a data trail that will later be used against them. | 0:29:46 | 0:29:50 | |
It gives every person a voice. | 0:29:50 | 0:29:54 | |
Every person has the right to read and to speak freely, | 0:29:54 | 0:29:56 | |
not one human excluded. | 0:29:56 | 0:29:58 | |
And one place Tor has become important is the Middle East. | 0:29:58 | 0:30:02 | |
During the Arab Spring, | 0:30:05 | 0:30:06 | |
as disturbances spread across the region, it became a vital | 0:30:06 | 0:30:10 | |
tool for dissidents... | 0:30:10 | 0:30:11 | |
..especially in places like Syria. | 0:30:13 | 0:30:16 | |
One of those who used it from the beginning was opposition activist Reem al Assil. | 0:30:18 | 0:30:22 | |
I found out first about Tor back in 2011. | 0:30:26 | 0:30:29 | |
Surveillance in Syria is a very big problem for activists | 0:30:32 | 0:30:37 | |
or for anyone even, | 0:30:37 | 0:30:39 | |
because the Syrian regime are trying all the time to get into people's | 0:30:39 | 0:30:44 | |
e-mails and Facebook to see what they are up to, what they are doing. | 0:30:44 | 0:30:50 | |
By the time the Syrian uprising happened, | 0:30:50 | 0:30:52 | |
the Tor project had developed a browser which made | 0:30:52 | 0:30:56 | |
downloading the software very simple. | 0:30:56 | 0:30:59 | |
You basically go and download Tor on your computer | 0:30:59 | 0:31:02 | |
and once it's installed, whenever you want to browse the Web, | 0:31:02 | 0:31:07 | |
you go and click on it just like Internet Explorer or Google Chrome. | 0:31:07 | 0:31:13 | |
Reem had personal experience of the protection offered by Tor | 0:31:13 | 0:31:17 | |
when she was arrested by the secret police. | 0:31:17 | 0:31:19 | |
I denied having any relation with any opposition work, you know, | 0:31:22 | 0:31:26 | |
or anything and they tried to intimidate me | 0:31:26 | 0:31:29 | |
and they said, "Well, see, we have... we know everything about you so now, | 0:31:29 | 0:31:34 | |
"just we need you to tell us," but I knew that they don't know anything. | 0:31:34 | 0:31:39 | |
By using Tor, I was an anonymous user so they couldn't tell | 0:31:39 | 0:31:43 | |
that Reem is doing so-and-so, is watching so-and-so. | 0:31:43 | 0:31:48 | |
So that's why Tor protected me in this way. | 0:31:48 | 0:31:52 | |
Syria was not the only place where Tor was vital. | 0:31:54 | 0:31:57 | |
It's used in China and in Iran. | 0:32:01 | 0:32:06 | |
In any country where internet access is restricted, | 0:32:06 | 0:32:09 | |
Tor can be used by citizens to avoid the gaze of the authorities. | 0:32:09 | 0:32:16 | |
China, for example, regularly attacks and blocks the Tor network | 0:32:16 | 0:32:19 | |
and they don't attack us directly | 0:32:19 | 0:32:22 | |
so much as they actually attack people in China using Tor. | 0:32:22 | 0:32:25 | |
They stop them from using the Tor network. | 0:32:25 | 0:32:27 | |
But Tor wasn't just helping inside repressive regimes. | 0:32:29 | 0:32:34 | |
It was now being used for whistle-blowing in the West, | 0:32:34 | 0:32:39 | |
through WikiLeaks, | 0:32:39 | 0:32:42 | |
founded by Julian Assange. | 0:32:42 | 0:32:44 | |
Assange has spent the last two years under | 0:32:49 | 0:32:52 | |
the protection of the Ecuadorian Embassy in London. | 0:32:52 | 0:32:55 | |
He is fighting extradition to Sweden on sexual assault charges, | 0:32:55 | 0:32:59 | |
charges he denies. | 0:32:59 | 0:33:01 | |
I'd been involved in cryptography and anonymous communications | 0:33:02 | 0:33:05 | |
for almost 20 years, since the early 1990s. | 0:33:05 | 0:33:09 | |
Cryptographic anonymity didn't come from nowhere. | 0:33:09 | 0:33:12 | |
It was a long-standing quest which had a Holy Grail, | 0:33:12 | 0:33:14 | |
which is to be able to communicate | 0:33:14 | 0:33:17 | |
individual-to-individual freely and anonymously. | 0:33:17 | 0:33:20 | |
Tor was the first protocol, | 0:33:20 | 0:33:23 | |
first anonymous protocol, | 0:33:23 | 0:33:25 | |
that got the balance right. | 0:33:25 | 0:33:27 | |
From its early years, people who wanted to submit documents | 0:33:29 | 0:33:32 | |
anonymously to WikiLeaks could use Tor. | 0:33:32 | 0:33:35 | |
Tor was and is one of the mechanisms | 0:33:37 | 0:33:40 | |
through which we have received important documents, yes. | 0:33:40 | 0:33:42 | |
One man who provided a link between WikiLeaks | 0:33:45 | 0:33:48 | |
and the Tor project was Jake Appelbaum. | 0:33:48 | 0:33:51 | |
Sources that want to leak documents | 0:33:53 | 0:33:56 | |
need to be able to communicate with WikiLeaks | 0:33:56 | 0:33:58 | |
and it has always been the case | 0:33:58 | 0:34:00 | |
that they have offered a Tor hidden service and that Tor hidden service | 0:34:00 | 0:34:05 | |
allows people to reach the WikiLeaks submission engine. | 0:34:05 | 0:34:09 | |
In 2010, what could be achieved | 0:34:10 | 0:34:13 | |
when web activism met anonymity was revealed to the world. | 0:34:13 | 0:34:17 | |
WikiLeaks received a huge leak | 0:34:22 | 0:34:24 | |
of confidential US government material, | 0:34:24 | 0:34:27 | |
mainly relating to the wars in Afghanistan and Iraq. | 0:34:27 | 0:34:30 | |
The first release was this footage | 0:34:36 | 0:34:38 | |
which showed a US Apache helicopter attack in Iraq. | 0:34:38 | 0:34:41 | |
Among those dead were two journalists from Reuters, | 0:34:49 | 0:34:53 | |
Saeed Chmagh and Namir Noor-Eldeen. | 0:34:53 | 0:34:55 | |
The Americans investigated, | 0:34:56 | 0:34:58 | |
but say there was no wrong-doing | 0:34:58 | 0:35:01 | |
yet this footage would never have become public if Chelsea Manning, | 0:35:01 | 0:35:05 | |
a US soldier in Iraq, hadn't leaked it. | 0:35:05 | 0:35:08 | |
Chelsea Manning has said to the court | 0:35:10 | 0:35:13 | |
that he used Tor, amongst a number of other things, | 0:35:13 | 0:35:16 | |
to submit documents to WikiLeaks. | 0:35:16 | 0:35:19 | |
Obviously, we can't comment on that, | 0:35:19 | 0:35:20 | |
because we have an obligation to protect our sources. | 0:35:20 | 0:35:23 | |
The release of the documents provided by Manning, | 0:35:26 | 0:35:29 | |
which culminated in 250,000 cables, | 0:35:29 | 0:35:32 | |
seemed to reveal the power of anonymity through encryption. | 0:35:32 | 0:35:35 | |
Because with encryption, | 0:35:38 | 0:35:39 | |
two people can come together to communicate privately. | 0:35:39 | 0:35:43 | |
The full might of a superpower cannot break that encryption | 0:35:43 | 0:35:47 | |
if it is properly implemented and that's an extraordinary thing, | 0:35:47 | 0:35:50 | |
where individuals are given a certain type of freedom of action | 0:35:50 | 0:35:54 | |
that is equivalent to the freedom of action that a superpower has. | 0:35:54 | 0:35:57 | |
But it wasn't the technology that let Manning down. | 0:35:59 | 0:36:02 | |
He confessed what he had done to a contact and was arrested. | 0:36:04 | 0:36:08 | |
Those who had used anonymity to leak secrets were now under fire. | 0:36:11 | 0:36:15 | |
WikiLeaks had already been criticised for the release | 0:36:19 | 0:36:22 | |
of un-redacted documents revealing the names of Afghans | 0:36:22 | 0:36:25 | |
who had assisted the US. | 0:36:25 | 0:36:27 | |
The US government was also on the attack. | 0:36:28 | 0:36:31 | |
The United States strongly condemns the illegal | 0:36:31 | 0:36:35 | |
disclosure of classified information. | 0:36:35 | 0:36:38 | |
It puts people's lives in danger, threatens our national security... | 0:36:38 | 0:36:43 | |
Meanwhile, the Tor project, started by the US government, | 0:36:45 | 0:36:49 | |
was becoming a target. | 0:36:49 | 0:36:51 | |
It's very funny, right, because on the one hand, | 0:36:53 | 0:36:56 | |
these people are funding Tor because they say they believe in anonymity. | 0:36:56 | 0:36:59 | |
And on the other hand, they're detaining me at airports, | 0:36:59 | 0:37:02 | |
threatening me and doing things like that. | 0:37:02 | 0:37:04 | |
And they've even said to me, "We love what you do in Iran | 0:37:04 | 0:37:07 | |
"and in China, in helping Tibetan people. | 0:37:07 | 0:37:09 | |
"We love all the stuff that you're doing, | 0:37:09 | 0:37:11 | |
"but why do you have to do it here?" | 0:37:11 | 0:37:13 | |
Thanks to Edward Snowden, | 0:37:15 | 0:37:16 | |
we now know that this culminated in the Tor network | 0:37:16 | 0:37:19 | |
being the focus of failed attacks by America's National Security Agency. | 0:37:19 | 0:37:23 | |
They revealed their frustration | 0:37:25 | 0:37:27 | |
in a confidential PowerPoint presentation called Tor Stinks, | 0:37:27 | 0:37:31 | |
which set out the ways in which the NSA had tried to crack the network. | 0:37:31 | 0:37:36 | |
They think Tor stinks because they want to attack people | 0:37:36 | 0:37:40 | |
and sometimes technology makes that harder. | 0:37:40 | 0:37:43 | |
It is because the users have something which bothers them, | 0:37:43 | 0:37:47 | |
which is real autonomy. | 0:37:47 | 0:37:49 | |
It gives them true privacy and security. | 0:37:49 | 0:37:52 | |
Tor, invented and funded by the US government, | 0:37:52 | 0:37:55 | |
was now used by activists, journalists, | 0:37:55 | 0:37:58 | |
anybody who wanted to communicate anonymously, | 0:37:58 | 0:38:02 | |
and it wasn't long before its potential | 0:38:02 | 0:38:04 | |
began to attract a darker type of user. | 0:38:04 | 0:38:07 | |
It began here in Washington. | 0:38:12 | 0:38:13 | |
Jon Iadonisi is a former Navy SEAL | 0:38:14 | 0:38:17 | |
turned advisor on cyber operations to government. | 0:38:17 | 0:38:20 | |
There were some tips that came in out of Baltimore, | 0:38:22 | 0:38:26 | |
two federal agents saying, | 0:38:26 | 0:38:28 | |
"You are a police officer, | 0:38:28 | 0:38:30 | |
"you really should take a look at this website | 0:38:30 | 0:38:32 | |
"and, oh, by the way, the only way you get to it | 0:38:32 | 0:38:35 | |
"is if you anonymise yourself | 0:38:35 | 0:38:37 | |
"through something called the Tor router." | 0:38:37 | 0:38:39 | |
What they found was a website called Silk Road. | 0:38:39 | 0:38:44 | |
And they were amazed when they were able to download this plug-in | 0:38:44 | 0:38:48 | |
on their browser, go into the Silk Road | 0:38:48 | 0:38:51 | |
and then from there, see, literally they can make a purchase, | 0:38:51 | 0:38:55 | |
it was like a buffet dinner for narcotics. | 0:38:55 | 0:38:58 | |
Silk Road was a global drugs marketplace | 0:39:01 | 0:39:04 | |
which brought together anonymous buyers and sellers | 0:39:04 | 0:39:07 | |
from around the world. | 0:39:07 | 0:39:08 | |
And what they found was people aren't just buying | 0:39:10 | 0:39:13 | |
one or two instances of designer drugs | 0:39:13 | 0:39:15 | |
but they're buying massive quantities wholesale. | 0:39:15 | 0:39:18 | |
In London, the tech community was watching closely. | 0:39:18 | 0:39:23 | |
Thomas Olofsson was one of many interested | 0:39:23 | 0:39:26 | |
in how much money the site was making. | 0:39:26 | 0:39:28 | |
Well, in Silk Road, they have something called an "escrow" system | 0:39:28 | 0:39:32 | |
so if you want to buy drugs, you pay money into Silk Road | 0:39:32 | 0:39:36 | |
as a facilitator | 0:39:36 | 0:39:37 | |
and they keep the money in escrow until you sign off | 0:39:37 | 0:39:40 | |
that you have had your drugs delivered. | 0:39:40 | 0:39:43 | |
They will then release your money to the drug dealer. | 0:39:43 | 0:39:48 | |
We're talking about several million a day in trade. | 0:39:48 | 0:39:51 | |
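The escrow flow he describes reduces to a small state machine. This is a generic sketch of marketplace escrow, not Silk Road's actual code.

```python
class Escrow:
    """Marketplace holds the buyer's payment until delivery is confirmed."""
    def __init__(self, amount_btc):
        self.amount_btc = amount_btc
        self.state = "HELD"          # funds sit with the site, not the seller

    def confirm_delivery(self):
        if self.state == "HELD":
            self.state = "RELEASED"  # only now does the seller get paid
        return self.state

deal = Escrow(amount_btc=2.5)
print(deal.state)                # HELD
print(deal.confirm_delivery())   # RELEASED
```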
The extraordinary success of Silk Road | 0:39:55 | 0:39:57 | |
attracted new customers to new illegal sites. | 0:39:57 | 0:40:01 | |
This part of the internet even had a new name - the Dark Web. | 0:40:02 | 0:40:06 | |
A dark website is impossible to shut down | 0:40:07 | 0:40:10 | |
because you don't know where a dark website is hosted | 0:40:10 | 0:40:13 | |
or even where it's physically located or who's behind it. | 0:40:13 | 0:40:17 | |
And there was one other thing which made Silk Road | 0:40:21 | 0:40:24 | |
and its imitators difficult to stop. | 0:40:24 | 0:40:26 | |
You paid with a new currency that only exists online, called Bitcoin. | 0:40:26 | 0:40:33 | |
Before, even if you had anonymity as a user, | 0:40:33 | 0:40:36 | |
you could still track the transactions, | 0:40:36 | 0:40:39 | |
the money flowing between persons, | 0:40:39 | 0:40:42 | |
because if you use your Visa card, your ATM, bank transfer, | 0:40:42 | 0:40:46 | |
Western Union, there is always... Money leaves a mark. | 0:40:46 | 0:40:49 | |
This is the first time that you can anonymously | 0:40:49 | 0:40:52 | |
move money between two persons. | 0:40:52 | 0:40:54 | |
Bitcoin is no longer an underground phenomenon. | 0:41:02 | 0:41:06 | |
Buy Bitcoin, 445. | 0:41:06 | 0:41:09 | |
Sellers will sell at 460. | 0:41:09 | 0:41:13 | |
If you're buying less than half a Bitcoin, | 0:41:13 | 0:41:15 | |
you'll have to go to market price. | 0:41:15 | 0:41:17 | |
This Bitcoin event is taking place on Wall Street. | 0:41:17 | 0:41:20 | |
445 bid, 463 asked. | 0:42:23 | 0:42:25 | |
But how does the currency work? | 0:41:28 | 0:41:30 | |
One of the people best placed to explain is Peter Todd. | 0:41:31 | 0:41:35 | |
He's chief scientist for a number of Bitcoin companies | 0:41:35 | 0:41:38 | |
including one of the hottest, Dark Wallet. | 0:41:38 | 0:41:42 | |
So, what Bitcoin is, is it's virtual money. | 0:41:44 | 0:41:47 | |
I can give it to you electronically, | 0:41:47 | 0:41:48 | |
you can give it to someone else electronically. | 0:41:48 | 0:41:51 | |
The key thing to understand about Bitcoin | 0:41:51 | 0:41:53 | |
is that these two people trading here are making a deal | 0:41:53 | 0:41:57 | |
without any bank involvement. | 0:41:57 | 0:41:59 | |
It's a form of electronic cash | 0:41:59 | 0:42:02 | |
and that has massive implications. | 0:42:02 | 0:42:05 | |
So, of course, in normal electronic banking systems, | 0:42:05 | 0:42:07 | |
what I would say is, "Please transfer money from my account | 0:42:07 | 0:42:10 | |
"to someone else's account," but fundamentally, | 0:42:10 | 0:42:13 | |
who owns what money is recorded by the bank, by the intermediary. | 0:42:13 | 0:42:17 | |
What's really interesting about Bitcoin is this virtual money | 0:42:19 | 0:42:23 | |
is not controlled by a bank, it's not controlled by government. | 0:42:23 | 0:42:26 | |
It's controlled by an algorithm. | 0:42:26 | 0:42:28 | |
The algorithm in question is a triumph of mathematics. | 0:42:28 | 0:42:31 | |
This is a Bitcoin transaction in action. | 0:42:32 | 0:42:35 | |
To pay someone in Bitcoin, the transaction must be signed | 0:42:38 | 0:42:41 | |
using a cryptographic key which proves the parties agree to it. | 0:42:41 | 0:42:46 | |
An unchangeable electronic record is then made of this transaction. | 0:42:46 | 0:42:50 | |
This electronic record contains every transaction | 0:42:51 | 0:42:54 | |
ever made on Bitcoin. | 0:42:54 | 0:42:57 | |
It's called the block chain | 0:42:57 | 0:42:59 | |
and it's stored in a distributed form by every user, | 0:42:59 | 0:43:04 | |
not by a bank or other authority. | 0:43:04 | 0:43:07 | |
The block chain is really the revolutionary part of Bitcoin. | 0:43:07 | 0:43:10 | |
What's really unique about it is it's all public | 0:43:10 | 0:43:13 | |
so you can run the Bitcoin algorithm on your computer | 0:43:13 | 0:43:16 | |
and your computer's inspecting every single transaction | 0:43:16 | 0:43:19 | |
to be sure that it actually followed the rules, | 0:43:19 | 0:43:21 | |
and the rules are really what Bitcoin is. | 0:43:21 | 0:43:23 | |
You know, that's the rules of the system, that's the algorithm. | 0:43:23 | 0:43:26 | |
It says things like, "You can only send money to one person at once," | 0:43:26 | 0:43:29 | |
and we all agree to those rules. | 0:43:29 | 0:43:32 | |
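A toy version of that rule-checking, as a sketch: real nodes also verify signatures, amounts and proof-of-work, while this only shows the hash chain that makes the shared record unchangeable.

```python
import hashlib, json

def block_hash(block):
    # Deterministic digest of a block's contents.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def validate(chain):
    # Every node replays the public ledger and checks each link.
    return all(curr["prev"] == block_hash(prev)
               for prev, curr in zip(chain, chain[1:]))

genesis = {"prev": None, "tx": ["alice -> bob: 1.0 BTC"]}
block2 = {"prev": block_hash(genesis), "tx": ["bob -> carol: 0.4 BTC"]}
print(validate([genesis, block2]))        # True
genesis["tx"] = ["alice -> bob: 100 BTC"] # attempt to rewrite history
print(validate([genesis, block2]))        # False: tampering is detected
```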
And Bitcoin has one other characteristic | 0:43:32 | 0:43:34 | |
which it shares with cash. | 0:43:34 | 0:43:37 | |
It can be very hard to trace. | 0:43:37 | 0:43:40 | |
Well, what's controversial about Bitcoin | 0:43:40 | 0:43:43 | |
is that it goes back to something quite like cash. | 0:43:43 | 0:43:46 | |
It's not like a bank account where a government investigator | 0:43:46 | 0:43:49 | |
can just call up the bank and get all the records | 0:43:49 | 0:43:51 | |
of who I've ever transacted with without any effort at all. | 0:43:51 | 0:43:55 | |
You know, if they want to go and find out where I got my Bitcoins, | 0:43:55 | 0:43:58 | |
they're going to have to ask me. | 0:43:58 | 0:43:59 | |
They're going to have to investigate. | 0:43:59 | 0:44:01 | |
The emergence of Bitcoin and the growth of the Dark Web | 0:44:05 | 0:44:08 | |
was now leading law enforcement in Washington to take a close interest. | 0:44:08 | 0:44:12 | |
Transactions in the Dark Web were unbelievably more enabled | 0:44:14 | 0:44:18 | |
and in many cases could exist | 0:44:18 | 0:44:21 | |
because of this virtual currency known as Bitcoin. | 0:44:21 | 0:44:24 | |
So, as people started to take a look at Bitcoin | 0:44:25 | 0:44:28 | |
and understand, "How do we regulate this, how do we monitor this? | 0:44:28 | 0:44:31 | |
"Oh, my God! It's completely anonymous, we have no record," | 0:44:31 | 0:44:34 | |
they started seeing transactions in the sort of the digital exhaust | 0:44:34 | 0:44:38 | |
that led them into Silk Road. | 0:44:38 | 0:44:40 | |
And so, now you had criminal grounds to start taking a look at this | 0:44:40 | 0:44:44 | |
coupled with the movement from the financial side and regulatory side | 0:44:44 | 0:44:50 | |
on the virtual currency. | 0:44:50 | 0:44:52 | |
So these two fronts began converging. | 0:44:52 | 0:44:55 | |
The FBI began to mount a complex plot | 0:44:55 | 0:44:57 | |
against the alleged lead administrator of Silk Road | 0:44:57 | 0:45:01 | |
who used the pseudonym "Dread Pirate Roberts". | 0:45:01 | 0:45:04 | |
They really started focusing on Dread Pirate Roberts | 0:45:06 | 0:45:10 | |
and his role as not just a leader but sort of the mastermind | 0:45:10 | 0:45:14 | |
in the whole ecosystem of the platform, right, | 0:45:14 | 0:45:17 | |
from management administratively to financial merchandising | 0:45:17 | 0:45:22 | |
to vendor placement, recruitment, etc. | 0:45:22 | 0:45:25 | |
Dread Pirate Roberts was believed to be Ross Ulbricht, | 0:45:26 | 0:45:30 | |
listed on his LinkedIn entry | 0:45:30 | 0:45:32 | |
as an investment advisor and entrepreneur from Austin, Texas. | 0:45:32 | 0:45:37 | |
Taking him down was really almost a story out of a Hollywood movie. | 0:45:37 | 0:45:42 | |
I mean, we had people that staged the death | 0:45:42 | 0:45:45 | |
of one of the potential informants. | 0:45:45 | 0:45:48 | |
We had undercover police officers acting as cocaine... | 0:45:48 | 0:45:52 | |
under-kingpins inside Silk Road. | 0:45:52 | 0:45:54 | |
So, at the conclusion of all these different elements, | 0:45:54 | 0:45:58 | |
they actually finally ended up bringing in Dread Pirate Roberts | 0:45:58 | 0:46:03 | |
and now are trying to move forward with his trial. | 0:46:03 | 0:46:07 | |
Whether or not he is Dread Pirate Roberts, | 0:46:07 | 0:46:10 | |
Ulbricht's arrest in 2013 brought down Silk Road. | 0:46:10 | 0:46:14 | |
In the process, the FBI seized $28.5 million | 0:46:14 | 0:46:18 | |
from the site's escrow account, money destined for drug dealers. | 0:46:18 | 0:46:23 | |
Yet the problem for the US authorities | 0:46:26 | 0:46:28 | |
was that their success was only temporary. | 0:46:28 | 0:46:31 | |
The site is now up and running again. | 0:46:35 | 0:46:37 | |
It re-emerged just two months later, run by some other guys | 0:46:38 | 0:46:42 | |
who took the same code base, the same, | 0:46:42 | 0:46:44 | |
actually the same site more or less, | 0:46:44 | 0:46:46 | |
just other people running the site, | 0:46:46 | 0:46:48 | |
because it's obviously a very, very profitable site to run. | 0:46:48 | 0:46:53 | |
And Silk Road has now been joined by a host of other sites. | 0:46:53 | 0:46:57 | |
One of the boom industries on the Dark Web is financial crime. | 0:47:00 | 0:47:05 | |
Yeah, this website is quite focused | 0:47:06 | 0:47:08 | |
on credit card numbers and stolen data. | 0:47:08 | 0:47:11 | |
On here, for instance, are credit card numbers | 0:47:11 | 0:47:14 | |
at around 5-6 per credit card number, | 0:47:14 | 0:47:17 | |
paid for in the equivalent in Bitcoins, totally anonymous. | 0:47:17 | 0:47:20 | |
The cards are sold in something called a "dump". | 0:47:21 | 0:47:25 | |
I mean, it's quite a large number of entries, it's tens of thousands. | 0:47:26 | 0:47:31 | |
You get everything you need to be able to buy stuff online | 0:47:31 | 0:47:34 | |
with these credit cards - first name, last name, address, | 0:47:34 | 0:47:39 | |
the card number, everything, the CVV number, | 0:47:39 | 0:47:42 | |
everything but the PIN code, basically. | 0:47:42 | 0:47:44 | |
The Dark Web is now used for various criminal activities - | 0:47:44 | 0:47:48 | |
drugs and guns, financial crime, | 0:47:48 | 0:47:51 | |
and even child sexual exploitation. | 0:47:51 | 0:47:54 | |
So, is anonymity a genuine threat to society? | 0:48:02 | 0:48:05 | |
A nightmare that should haunt us? | 0:48:08 | 0:48:10 | |
This man who leads Europe's fight against cybercrime believes so. | 0:48:15 | 0:48:20 | |
The Tor network plays a role because it hides criminals. | 0:48:20 | 0:48:24 | |
I know it was not the intention, | 0:48:24 | 0:48:26 | |
but that's the outcome and this is my job, | 0:48:26 | 0:48:28 | |
to tell society what the trade-offs are here. | 0:48:28 | 0:48:32 | |
By having no possibility of penetrating this, | 0:48:32 | 0:48:35 | |
we then assure criminals | 0:48:35 | 0:48:37 | |
that they can continue their crimes on a global network. | 0:48:37 | 0:48:40 | |
And despite the success against Silk Road and others, | 0:48:42 | 0:48:45 | |
he is worried for the future. | 0:48:45 | 0:48:47 | |
Our detection rate is dropping. It's very simple. | 0:48:50 | 0:48:53 | |
The business model of the criminal is to make profit with low risk | 0:48:53 | 0:48:57 | |
and, here, you actually eliminate the risk | 0:48:57 | 0:49:00 | |
because there is no risk to get identified | 0:49:00 | 0:49:02 | |
so either you have to screw up or be very unlucky, | 0:49:02 | 0:49:05 | |
as when somebody rats you out. | 0:49:05 | 0:49:07 | |
Otherwise, you're secure. So, if you run a tight operation, | 0:49:07 | 0:49:10 | |
it's very, very difficult for the police to penetrate | 0:49:10 | 0:49:13 | |
so it's risk-free crime. | 0:49:13 | 0:49:15 | |
So, does the anonymity offered by the Tor network | 0:49:15 | 0:49:18 | |
encourage crime or simply displace it? | 0:49:18 | 0:49:22 | |
Those who work for the project are well aware of the charges it faces. | 0:49:22 | 0:49:26 | |
There are certain narratives often asserted about anonymity | 0:49:26 | 0:49:30 | |
and, of course, one of the narratives is that anonymity creates crime | 0:49:30 | 0:49:35 | |
so you hear about things like the Silk Road and you hear, | 0:49:35 | 0:49:38 | |
"Oh, it's terrible, someone can do something illegal on the internet." | 0:49:38 | 0:49:41 | |
Well, welcome to the internet. | 0:49:41 | 0:49:43 | |
It is a reflection of human society | 0:49:43 | 0:49:45 | |
where there is sometimes illegal behaviour. | 0:49:45 | 0:49:48 | |
These arguments aside, | 0:49:48 | 0:49:50 | |
for users wanting to avoid surveillance, | 0:49:50 | 0:49:53 | |
Tor has limitations - | 0:49:53 | 0:49:55 | |
it's slow, content isn't automatically encrypted | 0:49:55 | 0:49:59 | |
on exiting the network | 0:49:59 | 0:50:00 | |
and some have claimed a bug can de-anonymise users. | 0:50:00 | 0:50:04 | |
Tor say they have a fix for this problem. | 0:50:04 | 0:50:06 | |
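The reason exit traffic is exposed follows from how onion routing works: the client wraps its message in one layer of encryption per relay, and each relay peels exactly one layer. A minimal Python sketch of that layering, using the `cryptography` library's Fernet cipher purely as a stand-in for Tor's real circuit protocol:

```python
# Toy onion routing: one encryption layer per relay in a three-hop circuit.
# A sketch of the idea only - real Tor negotiates fresh keys per circuit.
from cryptography.fernet import Fernet

keys = [Fernet.generate_key() for _ in range(3)]  # guard, middle, exit

def wrap(message: bytes) -> bytes:
    # Innermost layer for the exit, outermost for the guard.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def traverse_circuit(onion: bytes) -> bytes:
    # Each relay peels exactly one layer; the exit recovers the payload.
    for key in keys:
        onion = Fernet(key).decrypt(onion)
    return onion

payload = b"GET / HTTP/1.1"   # plaintext at the exit unless HTTPS is used too
assert traverse_circuit(wrap(payload)) == payload
```

The final decryption leaves the raw payload in the exit relay's hands, which is why, as noted above, content isn't automatically encrypted on leaving the network.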
Whilst the search for a solution to bulk surveillance continues, | 0:50:07 | 0:50:11 | |
this man has a different approach. | 0:50:11 | 0:50:13 | |
He is Eugene Kaspersky, CEO of one of the world's | 0:50:16 | 0:50:19 | |
fastest-growing internet security companies. | 0:50:19 | 0:50:23 | |
300 million users now rely on its software. | 0:50:23 | 0:50:26 | |
For Kaspersky, widespread anonymity is not a solution. | 0:50:27 | 0:50:32 | |
In fact, he thinks we need the absolute opposite, | 0:50:32 | 0:50:36 | |
a form of online passport. | 0:50:36 | 0:50:38 | |
My idea is that all the services on the internet must be split. | 0:50:40 | 0:50:45 | |
So, the non-critical - your personal e-mails, | 0:50:47 | 0:50:51 | |
the news from the internet, | 0:50:51 | 0:50:53 | |
chatting with your family, | 0:50:53 | 0:50:55 | |
and so on - leave them alone, you don't need any ID. | 0:50:55 | 0:51:01 | |
It's a place of freedom. | 0:51:01 | 0:51:02 | |
And there are critical services, like banking services, | 0:51:02 | 0:51:06 | |
financial services, booking tickets, | 0:51:06 | 0:51:10 | |
booking hotels and so on. | 0:51:10 | 0:51:12 | |
So, please present your ID if you use these. | 0:51:12 | 0:51:16 | |
So, it's a kind of balance - | 0:51:18 | 0:51:21 | |
freedom and security, | 0:51:21 | 0:51:23 | |
anonymity and wearing the badge, wearing your ID. | 0:51:23 | 0:51:26 | |
In his view, this shouldn't be a problem, | 0:51:26 | 0:51:29 | |
as privacy has pretty much disappeared online anyway. | 0:51:29 | 0:51:32 | |
The reality is that we don't have much privacy in cyberspace. | 0:51:34 | 0:51:38 | |
If you want to travel, if you want to pay with your credit card, | 0:51:38 | 0:51:42 | |
if you want to access internet, forget about privacy. | 0:51:42 | 0:51:45 | |
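As a sketch, the split Kaspersky describes amounts to a simple access policy; the service categories and the ID check below are invented placeholders, not any real design of his:

```python
# Minimal sketch of the proposed split: anonymous access to non-critical
# services, verified ID for critical ones. Categories are illustrative only.
CRITICAL_SERVICES = {"banking", "payments", "ticket-booking", "hotel-booking"}

def may_access(service: str, has_verified_id: bool) -> bool:
    """Non-critical services need no ID; critical services require one."""
    return service not in CRITICAL_SERVICES or has_verified_id

assert may_access("email", has_verified_id=False)        # place of freedom
assert not may_access("banking", has_verified_id=False)  # present your ID
```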
Yet, the idea that we could soon inhabit a world | 0:51:49 | 0:51:52 | |
where our lives are ever more transparent is very unpopular. | 0:51:52 | 0:51:56 | |
Some people say, "Privacy is over, get over it", | 0:51:57 | 0:52:02 | |
because basically we're trending towards the idea | 0:52:02 | 0:52:05 | |
of just completely transparent lives. | 0:52:05 | 0:52:08 | |
I think that's nonsense, because information boundaries are important, | 0:52:08 | 0:52:13 | |
which means that we've got to have systems which respect them, | 0:52:13 | 0:52:16 | |
so we've got to have the technology | 0:52:16 | 0:52:19 | |
which can produce privacy. | 0:52:19 | 0:52:20 | |
For those of us who might wish to resist surveillance, | 0:52:26 | 0:52:29 | |
the hunt for that technology is now on | 0:52:29 | 0:52:32 | |
and it is to cryptographers that we must look for answers. | 0:52:32 | 0:52:36 | |
If you want a demonstration of their importance to today's internet, | 0:52:38 | 0:52:42 | |
you only have to come to this bunker in Virginia. | 0:52:42 | 0:52:45 | |
Today, an unusual ceremony is taking place. | 0:52:45 | 0:52:49 | |
COMPUTER: 'Please centre your eyes in the mirror.' | 0:52:52 | 0:52:55 | |
The internet is being upgraded. | 0:52:55 | 0:52:57 | |
'Thank you, your identity has been verified.' | 0:52:57 | 0:52:59 | |
Right, we're in. | 0:53:00 | 0:53:01 | |
This is the biggest security upgrade to the internet | 0:53:01 | 0:53:04 | |
in over 20 years. | 0:53:04 | 0:53:07 | |
A group of tech specialists have been summoned by Steve Crocker, | 0:53:07 | 0:53:11 | |
one of the godfathers of the internet. | 0:53:11 | 0:53:15 | |
We discovered that there were some vulnerabilities | 0:53:15 | 0:53:18 | |
in the basic domain name system structure | 0:53:18 | 0:53:20 | |
that can lead to having a domain name hijacked | 0:53:20 | 0:53:25 | |
or having someone directed to a false site and, from there, | 0:53:25 | 0:53:29 | |
passwords can be detected and accounts can be cleaned out. | 0:53:29 | 0:53:32 | |
At the heart of this upgrade is complex cryptography. | 0:53:34 | 0:53:37 | |
Work began on adding cryptographically strong signatures | 0:53:39 | 0:53:42 | |
to every entry in the domain name system | 0:53:42 | 0:53:45 | |
in order to make it impossible to spoof or plant false information. | 0:53:45 | 0:53:51 | |
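A minimal sketch of that idea, assuming Python's `cryptography` library and Ed25519 (one of the signature algorithms DNSSEC supports); real DNSSEC publishes keys and signatures as DNSKEY and RRSIG records rather than raw bytes:

```python
# Toy version of signing a DNS entry so that tampering is detectable.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

zone_key = Ed25519PrivateKey.generate()          # the zone's signing key
record = b"example.com. 3600 IN A 93.184.216.34"

signature = zone_key.sign(record)                # published beside the record

# A resolver checks the record against the zone's public key; any change
# to the record makes verify() raise InvalidSignature instead of returning.
zone_key.public_key().verify(signature, record)
```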
To make sure these cryptographic codes remain secure, | 0:53:55 | 0:53:58 | |
they are reset every three months. | 0:53:58 | 0:54:00 | |
Three trusted experts from around the world have been summoned | 0:54:02 | 0:54:06 | |
with three keys, keys that open these safety deposit boxes. | 0:54:06 | 0:54:10 | |
So, now we are opening box 1-2-4-0. | 0:54:10 | 0:54:13 | |
Inside are smart cards. | 0:54:13 | 0:54:16 | |
When the three are put together, | 0:54:16 | 0:54:18 | |
a master code can be validated which resets the system | 0:54:18 | 0:54:22 | |
for millions of websites. | 0:54:22 | 0:54:24 | |
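The principle at work is splitting trust so that no single keyholder can act alone. A minimal sketch of an all-shares-required split using XOR; the actual ceremony relies on hardware security modules and smart cards, not anything this simple:

```python
# Toy 3-of-3 secret split: all shares are needed; any two reveal nothing.
import secrets

def split(master: bytes, holders: int = 3) -> list[bytes]:
    shares = [secrets.token_bytes(len(master)) for _ in range(holders - 1)]
    final = master
    for share in shares:                  # final = master XOR all other shares
        final = bytes(a ^ b for a, b in zip(final, share))
    return shares + [final]

def combine(shares: list[bytes]) -> bytes:
    secret = bytes(len(shares[0]))        # start from all-zero bytes
    for share in shares:
        secret = bytes(a ^ b for a, b in zip(secret, share))
    return secret

master_key = secrets.token_bytes(32)
assert combine(split(master_key)) == master_key   # all three recover it
```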
We are very lucky because every one of these people | 0:54:24 | 0:54:26 | |
is a respected member of the technical community, | 0:54:26 | 0:54:29 | |
the technical internet community, | 0:54:29 | 0:54:31 | |
and that's what we were doing - | 0:54:31 | 0:54:32 | |
just like the internet itself was formed from the bottom up, | 0:54:32 | 0:54:35 | |
by technical people from around the world. | 0:54:35 | 0:54:37 | |
So, step 22. | 0:54:37 | 0:54:39 | |
A complex set of instructions is followed | 0:54:40 | 0:54:43 | |
to make sure the system is ready to operate. | 0:54:43 | 0:54:46 | |
And now we need to verify the KSR - the key signing request. | 0:54:46 | 0:54:49 | |
This is the key step. | 0:54:49 | 0:54:51 | |
So, this is it. When I hit "yes", it will be signed. | 0:54:53 | 0:54:57 | |
There you go, thank you very much. | 0:54:59 | 0:55:01 | |
APPLAUSE | 0:55:01 | 0:55:03 | |
These scientists want to use cryptography to make | 0:55:06 | 0:55:09 | |
the architecture of the internet more resilient. | 0:55:09 | 0:55:13 | |
But, for users, cryptography has another purpose. | 0:55:13 | 0:55:17 | |
We can use it to encrypt our message content. | 0:55:17 | 0:55:21 | |
So, if we want to change the way that mass surveillance is done, | 0:55:21 | 0:55:24 | |
encryption, it turns out, | 0:55:24 | 0:55:26 | |
is one of the ways that we do that. | 0:55:26 | 0:55:28 | |
When we encrypt our data, | 0:55:28 | 0:55:30 | |
we change the value that mass surveillance presents. | 0:55:30 | 0:55:33 | |
Phone calls can be end-to-end encrypted | 0:55:33 | 0:55:35 | |
such that no-one else can decipher their content, | 0:55:35 | 0:55:38 | |
thanks to the science of mathematics, of cryptography. | 0:55:38 | 0:55:41 | |
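A minimal sketch of that end-to-end property, assuming the PyNaCl library; real systems such as Signal add key verification and forward secrecy on top, but the core point is that no relay in the middle ever holds a decryption key:

```python
# Toy end-to-end encryption: only the two endpoints can read the message.
from nacl.public import PrivateKey, Box

alice, bob = PrivateKey.generate(), PrivateKey.generate()

# Each side pairs its own private key with the other's public key.
alice_to_bob = Box(alice, bob.public_key)
bob_from_alice = Box(bob, alice.public_key)

ciphertext = alice_to_bob.encrypt(b"meet at noon")   # what the network sees
assert bob_from_alice.decrypt(ciphertext) == b"meet at noon"
```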
And those who understand surveillance from the inside | 0:55:43 | 0:55:46 | |
agree that it's the only way to protect our communications. | 0:55:46 | 0:55:50 | |
However, there's a problem. | 0:56:21 | 0:56:23 | |
It's still very difficult to encrypt content in a user-friendly way. | 0:56:23 | 0:56:27 | |
One of the best ways to protect your privacy is to use encryption, | 0:56:28 | 0:56:32 | |
but encryption is so incredibly hard to use. | 0:56:32 | 0:56:35 | |
I am a technology person | 0:56:35 | 0:56:37 | |
and I struggle all the time to get my encryption products to work. | 0:56:37 | 0:56:41 | |
And that's the next challenge, | 0:56:41 | 0:56:44 | |
developing an internet where useable encryption | 0:56:44 | 0:56:47 | |
makes bulk surveillance, | 0:56:47 | 0:56:49 | |
for those who want to avoid it, more difficult. | 0:56:49 | 0:56:52 | |
At the moment, the systems we're using are fairly insecure | 0:56:53 | 0:56:57 | |
in lots of ways. I think the programmers out there | 0:56:57 | 0:56:59 | |
have got to help build tools to make that easier. | 0:56:59 | 0:57:02 | |
If encryption is to become more commonplace, | 0:57:05 | 0:57:07 | |
the pressure will likely come from the market. | 0:57:07 | 0:57:10 | |
The result of the revelations about the National Security Agency | 0:57:12 | 0:57:15 | |
is that people are becoming quite paranoid. | 0:57:15 | 0:57:18 | |
It's important to not be paralysed by that paranoia, | 0:57:18 | 0:57:21 | |
but rather to express the desire for privacy | 0:57:21 | 0:57:25 | |
and by expressing the desire for privacy, | 0:57:25 | 0:57:28 | |
the market will fill the demand. | 0:57:28 | 0:57:30 | |
By making it harder, you protect yourself, | 0:57:30 | 0:57:33 | |
you protect your family and you also protect other people | 0:57:33 | 0:57:36 | |
by simply making it more expensive to surveil everyone all the time. | 0:57:36 | 0:57:39 | |
In the end, encryption is all about mathematics | 0:57:39 | 0:57:43 | |
and, for those who want more privacy, | 0:57:43 | 0:57:46 | |
the numbers work in their favour. | 0:57:46 | 0:57:48 | |
It turns out that it's easier in this universe | 0:57:49 | 0:57:52 | |
to encrypt information, much easier, than it is to decrypt it | 0:57:52 | 0:57:56 | |
if you're someone watching from the outside. | 0:57:56 | 0:57:59 | |
The universe fundamentally favours privacy. | 0:57:59 | 0:58:02 | |
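Back-of-the-envelope arithmetic makes that asymmetry vivid: encrypting with a 128-bit key takes microseconds, while an outside observer guessing the key faces 2^128 possibilities. A quick sketch, assuming an implausibly fast attacker at a trillion guesses per second:

```python
# The defender encrypts once; the attacker must search the whole keyspace.
keyspace = 2 ** 128                   # possible 128-bit keys
guesses_per_second = 10 ** 12         # a very generous brute-force rate
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keyspace / (guesses_per_second * seconds_per_year)
print(f"about {years_to_exhaust:.1e} years to try every key")  # ~1.1e+19
```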
We have reached a critical moment | 0:58:04 | 0:58:06 | |
when the limits of privacy | 0:58:06 | 0:58:08 | |
could be defined for a generation. | 0:58:08 | 0:58:11 | |
We are beginning to grasp the shape of this new world. | 0:58:11 | 0:58:14 | |
It's time to decide whether we are happy to accept it. | 0:58:16 | 0:58:19 | |
'Thank you. Your identity has been verified.' | 0:58:45 | 0:58:48 |