Line | From | To | |
---|---|---|---|
On the morning of May 12th, | 2:52:24 | 2:52:26 | |
NHS staff were about to be confronted by a major outbreak... | 2:52:26 | 2:52:30 | |
..as an epidemic swept like wildfire across the country. | 2:52:34 | 2:52:38 | |
But the disease didn't infect patients, and it wasn't biological. | 2:52:43 | 2:52:47 | |
Instead it attacked the central nervous system of the NHS itself. | 2:52:49 | 2:52:53 | |
Across the country, | 2:52:58 | 2:52:59 | |
computer systems were knocked out by a highly contagious computer virus. | 2:52:59 | 2:53:03 | |
Hello, can I speak to IT, please? | 2:53:05 | 2:53:07 | |
It became known as WannaCry. | 2:53:07 | 2:53:10 | |
There's a message on my screen, it says my files have been encrypted. | 2:53:10 | 2:53:13 | |
This is the story of a uniquely challenging day | 2:53:13 | 2:53:15 | |
for the National Health Service. | 2:53:15 | 2:53:18 | |
A day when the NHS itself became a patient. | 2:53:18 | 2:53:21 | |
It was attacked by a particularly vicious piece of computer code | 2:53:21 | 2:53:25 | |
which took down its networks, | 2:53:25 | 2:53:26 | |
its computers and anything attached to them. | 2:53:26 | 2:53:29 | |
And that meant patient record systems, CT scanners, | 2:53:29 | 2:53:32 | |
even MRI machines, | 2:53:32 | 2:53:34 | |
putting not just data but also patients' lives at risk. | 2:53:34 | 2:53:37 | |
The surgeon looked very forlorn and very sorry, | 2:53:39 | 2:53:42 | |
and that was when he then told me that he couldn't do the operation. | 2:53:42 | 2:53:46 | |
We were unable to book appointments, | 2:53:46 | 2:53:48 | |
we were unable to see who would be coming in tomorrow, | 2:53:48 | 2:53:51 | |
so we were really paralysed and at a loss of what to do. | 2:53:51 | 2:53:56 | |
Horizon unpicks the science behind the recent widespread cyber attack | 2:53:56 | 2:54:00 | |
that hit our National Health Service. | 2:54:00 | 2:54:02 | |
And, in his first television interview, we meet the 22-year-old | 2:54:03 | 2:54:07 | |
cyber security specialist who stopped it in its tracks. | 2:54:07 | 2:54:11 | |
I checked the message board. | 2:54:11 | 2:54:13 | |
There were maybe 16, 17 reports of different NHS, sort of, | 2:54:13 | 2:54:17 | |
organisations being hit. | 2:54:17 | 2:54:19 | |
And that was sort of the point where I decided, "My holiday's over, | 2:54:19 | 2:54:22 | |
"I've got to look into this." | 2:54:22 | 2:54:24 | |
The outbreak exposed a vulnerability at the heart of the NHS. | 2:54:24 | 2:54:30 | |
I am a doctor, and all of this is a worry. | 2:54:30 | 2:54:33 | |
I want to know what happens, I want to know why it happens, | 2:54:33 | 2:54:36 | |
and I want to know how I can protect my patients | 2:54:36 | 2:54:39 | |
from this new strain of infectious disease. | 2:54:39 | 2:54:42 | |
I found out about the attacks the way most people did, | 2:55:02 | 2:55:05 | |
through news reports. | 2:55:05 | 2:55:07 | |
Now, mercifully, the hospital that I work for wasn't affected, | 2:55:07 | 2:55:10 | |
but as details emerged, it became clear that colleagues all | 2:55:10 | 2:55:13 | |
over the NHS were getting into work that day, | 2:55:13 | 2:55:16 | |
setting up their computers | 2:55:16 | 2:55:19 | |
and being greeted with a screen that looks like this. | 2:55:19 | 2:55:22 | |
Now it's very polite - it tells you what it's done, it's encrypted | 2:55:22 | 2:55:25 | |
all of your data, tells you what you have to do, which is pay some money, | 2:55:25 | 2:55:28 | |
and it tells you that if you pay the money now, | 2:55:28 | 2:55:31 | |
you won't have to pay quite so much. | 2:55:31 | 2:55:32 | |
Otherwise you're going to lose everything. | 2:55:32 | 2:55:35 | |
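The escalating demand described here matches WannaCry's widely reported schedule: roughly $300 in bitcoin, doubling after three days, with the note claiming the files are gone for good after seven. A minimal sketch of that pricing logic (figures as reported in press coverage, not extracted from the malware itself):

```python
def ransom_due(days_elapsed, base_usd=300):
    """Illustrative WannaCry-style demand schedule (figures as widely
    reported): pay early and pay less; wait and the demand doubles;
    wait a week and the note claims the files are unrecoverable."""
    if days_elapsed >= 7:
        return None          # note claims the files are gone
    if days_elapsed >= 3:
        return base_usd * 2  # demand has doubled
    return base_usd          # "pay the money now" price
```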
On 12th May 2017, the cyber attack wrought havoc across the NHS. | 2:55:38 | 2:55:44 | |
It hit many hospital trusts, | 2:55:44 | 2:55:46 | |
and some A&E departments even closed their doors to ambulances. | 2:55:46 | 2:55:50 | |
Operations were cancelled. | 2:55:50 | 2:55:52 | |
Patients were diverted. | 2:55:52 | 2:55:54 | |
But the story of the virus itself | 2:55:55 | 2:55:58 | |
goes back far further than the events of that day. | 2:55:58 | 2:56:01 | |
With all outbreaks, there's always a point of origin. | 2:56:26 | 2:56:30 | |
A moment when the virus first emerges. | 2:56:30 | 2:56:33 | |
DRAMATIC MUSIC PLAYS | 2:56:41 | 2:56:44 | |
Down! Down! Hands on your head! | 2:57:07 | 2:57:10 | |
Down, down, down! | 2:57:10 | 2:57:12 | |
Cuff him! | 2:57:13 | 2:57:15 | |
For over 20 years, | 2:57:21 | 2:57:22 | |
Harold Martin worked as a contractor for US government intelligence. | 2:57:22 | 2:57:26 | |
On the day of his arrest, agents found stolen drives | 2:57:35 | 2:57:38 | |
containing more than 50 terabytes of classified data... | 2:57:38 | 2:57:42 | |
..allegedly including top-secret hacking tools | 2:57:44 | 2:57:47 | |
stockpiled by the National Security Agency. | 2:57:47 | 2:57:50 | |
Harold Martin's arrest followed a tweet | 2:58:00 | 2:58:03 | |
by a mysterious group calling themselves the Shadow Brokers. | 2:58:03 | 2:58:06 | |
They were offering National Security Agency hacking tools | 2:58:10 | 2:58:13 | |
to anyone prepared to pay the $580 million asking price. | 2:58:13 | 2:58:18 | |
According to reports, | 2:58:20 | 2:58:22 | |
once they found out about the Shadow Brokers' demands, | 2:58:22 | 2:58:24 | |
the NSA triggered an internal investigation and, | 2:58:24 | 2:58:28 | |
just a couple of weeks later, Harold Martin was arrested. | 2:58:28 | 2:58:31 | |
Now, there's no evidence at all | 2:58:31 | 2:58:33 | |
that he passed on information to the Shadow Brokers, | 2:58:33 | 2:58:36 | |
but, interestingly, on the hard drives in his home, | 2:58:36 | 2:58:39 | |
was found the hacking tool, Eternal Blue. | 2:58:39 | 2:58:42 | |
Now, Eternal Blue is a kind of key that allows you to prise open | 2:58:42 | 2:58:46 | |
the Windows 7 operating system, and it is that which allowed hackers to | 2:58:46 | 2:58:49 | |
cause havoc across organisations all over the world, including the NHS. | 2:58:49 | 2:58:56 | |
When it comes to attribution, | 2:59:02 | 2:59:03 | |
in other words identifying the true source of attacks, | 2:59:03 | 2:59:06 | |
the world in cyber is a lot more difficult than, | 2:59:06 | 2:59:11 | |
say for example, physical, because, you know, you can make your attack | 2:59:11 | 2:59:14 | |
appear to come from anywhere in the world. | 2:59:14 | 2:59:16 | |
So, Shadow Brokers is an anonymous entity, | 2:59:17 | 2:59:20 | |
we don't really know who's behind Shadow Brokers. | 2:59:20 | 2:59:22 | |
It's generally assumed in the security research community | 2:59:24 | 2:59:27 | |
that the Shadow Brokers are, in effect, an arm of the Russian state. | 2:59:27 | 2:59:32 | |
35 days before the cyber attack, | 2:59:40 | 2:59:42 | |
it was business as usual across the NHS. | 2:59:42 | 2:59:44 | |
But at this moment, the Shadow Brokers made a fateful decision. | 2:59:47 | 2:59:51 | |
With no buyer coming forward, | 2:59:52 | 2:59:54 | |
they dumped their trove of stolen cyber-weapons online, for free. | 2:59:54 | 2:59:59 | |
They were now available for anyone to use. | 3:00:01 | 3:00:04 | |
Cal Leeming is someone with unique insight into the cyber underworld. | 3:00:08 | 3:00:13 | |
He taught himself to hack, and he started young. | 3:00:13 | 3:00:16 | |
When I was about nine years old, | 3:00:16 | 3:00:18 | |
my grandparents got me my first computer. | 3:00:18 | 3:00:22 | |
A proper computer. | 3:00:22 | 3:00:23 | |
My eyes were opened when I started using these chatrooms | 3:00:23 | 3:00:27 | |
and started talking to this wider audience. | 3:00:27 | 3:00:29 | |
People were talking about being able to share PlayStation games. | 3:00:29 | 3:00:34 | |
They were sharing credit card information. | 3:00:34 | 3:00:37 | |
Attracted to free games as an escape from his hard upbringing, | 3:00:37 | 3:00:40 | |
he soon graduated to something more serious. | 3:00:40 | 3:00:44 | |
There wasn't much money at all. | 3:00:44 | 3:00:45 | |
So I found myself using credit cards that I had got from hacking | 3:00:45 | 3:00:51 | |
to send food deliveries to the house. | 3:00:51 | 3:00:53 | |
So it was a mixture of 50% just utter curiosity | 3:00:54 | 3:00:58 | |
and wanting to learn more, | 3:00:58 | 3:01:00 | |
and the other 50% survival. | 3:01:00 | 3:01:02 | |
At the age of just 12, Cal was arrested. | 3:01:04 | 3:01:07 | |
He became the UK's youngest ever cybercriminal. | 3:01:07 | 3:01:11 | |
It was very, very traumatic. | 3:01:11 | 3:01:12 | |
And they sat me down and said, | 3:01:12 | 3:01:14 | |
"Cal, do you understand what you have done was against the law?" | 3:01:14 | 3:01:18 | |
My answer to them was, "All I've done was typed on a keyboard." | 3:01:19 | 3:01:23 | |
Because that's my mind-set, at the time. | 3:01:23 | 3:01:25 | |
I was like, "Why is it that I'm typing on the keyboard to | 3:01:25 | 3:01:27 | |
"survive and I'm now getting arrested?" | 3:01:27 | 3:01:30 | |
And I thought that was very unfair at the time. | 3:01:30 | 3:01:33 | |
Cal continued to hack until 2005, | 3:01:33 | 3:01:36 | |
when he was caught again for using over 10,000 stolen identities | 3:01:36 | 3:01:40 | |
to purchase goods worth £750,000. | 3:01:40 | 3:01:45 | |
Eventually, when I was 18, I handed myself in, | 3:01:46 | 3:01:51 | |
and the arresting officer in my case gave me a chance | 3:01:51 | 3:01:55 | |
to turn my life around in exchange for going to prison | 3:01:55 | 3:01:58 | |
for a little bit. | 3:01:58 | 3:01:59 | |
I owe that guy a lot. | 3:02:00 | 3:02:02 | |
After serving a 15-month jail sentence, he changed sides, | 3:02:05 | 3:02:09 | |
and now runs a cyber security firm. | 3:02:09 | 3:02:12 | |
Why do hackers do what they do? Why do hackers hack? | 3:02:14 | 3:02:17 | |
People have their own motivations for wanting to get into hacking. | 3:02:17 | 3:02:21 | |
Sometimes it is financial, other times criminal, | 3:02:21 | 3:02:24 | |
and sometimes it's just pure curiosity. | 3:02:24 | 3:02:26 | |
Right now we don't know who started this attack, at least not for sure. | 3:02:26 | 3:02:31 | |
Do you think, at any level, the people who carried out this attack | 3:02:31 | 3:02:34 | |
would have felt slightly appalled that this attack spilt over | 3:02:34 | 3:02:38 | |
into the National Health Service? | 3:02:38 | 3:02:40 | |
That's a difficult one to answer, | 3:02:40 | 3:02:43 | |
because it's not a single group that does all hacking in the world, | 3:02:43 | 3:02:46 | |
it's lots and lots of very tiny groups, | 3:02:46 | 3:02:48 | |
sometimes a single person, sometimes lots of people, | 3:02:48 | 3:02:50 | |
and with each group, within each environment, | 3:02:50 | 3:02:52 | |
you have your own set of rules, | 3:02:52 | 3:02:54 | |
conditions and social etiquette and all these things. | 3:02:54 | 3:02:57 | |
So, in some cases, yes, there are going to be some people | 3:02:57 | 3:03:01 | |
that are outraged, even on the criminal side, that they've... | 3:03:01 | 3:03:04 | |
That it went this far. | 3:03:04 | 3:03:05 | |
And in other cases, they might have purposefully wanted it | 3:03:05 | 3:03:08 | |
to go that far. It depends on the individual. | 3:03:08 | 3:03:12 | |
Whatever their motivation, what we know for sure is that someone | 3:03:14 | 3:03:19 | |
did use the alleged NSA exploit Eternal Blue | 3:03:19 | 3:03:23 | |
to create a devastating cyber-weapon. | 3:03:23 | 3:03:25 | |
Within four weeks of Eternal Blue being released, | 3:03:26 | 3:03:29 | |
the attack was ready. | 3:03:29 | 3:03:31 | |
Eternal Blue was mashed together with other pieces of malicious code | 3:03:31 | 3:03:35 | |
and then unleashed on the world, and it was given a name. | 3:03:35 | 3:03:38 | |
A security patch against Eternal Blue | 3:03:48 | 3:03:51 | |
had been made available by Microsoft. | 3:03:51 | 3:03:54 | |
But on the night before the cyber attack, | 3:03:54 | 3:03:57 | |
any machine that hadn't installed the update was still vulnerable... | 3:03:57 | 3:04:02 | |
including many in the NHS. | 3:04:02 | 3:04:04 | |
Infection was now just a matter of time. | 3:04:09 | 3:04:12 | |
On the morning of the cyber-attack, | 3:04:21 | 3:04:23 | |
22-year-old Marcus Hutchins was in the middle of his holiday. | 3:04:23 | 3:04:28 | |
If there was any surf, I might have been surfing. | 3:04:28 | 3:04:30 | |
It's so dynamic, the waves are never the same on two days. | 3:04:30 | 3:04:34 | |
Marcus works remotely for an LA-based cyber intelligence company. | 3:04:35 | 3:04:40 | |
I track malware. I track malicious code that affects users, | 3:04:40 | 3:04:43 | |
and I find ways to track and stop it. | 3:04:43 | 3:04:46 | |
And despite being on leave, | 3:04:47 | 3:04:48 | |
he was still monitoring the global malware outbreak. | 3:04:48 | 3:04:52 | |
I woke up, I checked the message board, there were a couple of | 3:04:52 | 3:04:55 | |
reports of ransomware infections, but I didn't think much of it. | 3:04:55 | 3:04:59 | |
From his home in Devon, | 3:04:59 | 3:05:00 | |
his curiosity would play a crucial role as the day's events unfolded. | 3:05:00 | 3:05:05 | |
In London, Patrick Ward had spent the night | 3:05:09 | 3:05:11 | |
in St Bartholomew's Hospital. | 3:05:11 | 3:05:14 | |
Like thousands of others, | 3:05:15 | 3:05:16 | |
in operating theatres across the country, | 3:05:16 | 3:05:19 | |
he was in for planned surgery, | 3:05:19 | 3:05:21 | |
in his case to correct a serious heart problem. | 3:05:21 | 3:05:24 | |
They woke me at six o'clock, | 3:05:24 | 3:05:26 | |
as they do in hospital, | 3:05:26 | 3:05:29 | |
and one of the nurses came round and shaved my chest, ready for, | 3:05:29 | 3:05:36 | |
obviously, the opening of the chest cavity. | 3:05:36 | 3:05:38 | |
I was nervous, but I was very excited, very... | 3:05:39 | 3:05:43 | |
confident about the operation and what was going to happen. | 3:05:43 | 3:05:49 | |
I'd...yeah, mentally got myself in the right place | 3:05:49 | 3:05:52 | |
to have open heart surgery, | 3:05:52 | 3:05:55 | |
and was, yeah, fantastic, ready to go. | 3:05:55 | 3:05:57 | |
PHONE RINGS | 3:05:57 | 3:06:00 | |
The condition I have is hypertrophic cardiomyopathy, | 3:06:00 | 3:06:04 | |
which is an enlarged heart. | 3:06:04 | 3:06:08 | |
It means I struggle to do normal things - walk, | 3:06:08 | 3:06:10 | |
I can't do any sporting activities, lifting heavy objects | 3:06:10 | 3:06:13 | |
obviously puts a big strain on the heart. | 3:06:13 | 3:06:16 | |
It makes me feel extremely useless. | 3:06:16 | 3:06:19 | |
I've had some very dark moments over the last couple of years, | 3:06:19 | 3:06:24 | |
so I'd like to, yeah, get back to leading | 3:06:24 | 3:06:27 | |
a normal fit and healthy life. | 3:06:27 | 3:06:30 | |
But before surgery could start, Patrick needed some tests. | 3:06:30 | 3:06:34 | |
They wanted to check out my arteries, | 3:06:34 | 3:06:36 | |
so they sent me down for a cardio angiogram in the morning. | 3:06:36 | 3:06:40 | |
So after having the angiogram and some drugs, I was very... | 3:06:40 | 3:06:43 | |
I was even more relaxed and ready for the afternoon operation. | 3:06:43 | 3:06:48 | |
While Patrick waited for theatre, | 3:06:50 | 3:06:52 | |
in Devon, Marcus was keeping an eye out for global cyber-attacks. | 3:06:52 | 3:06:57 | |
I checked the message board. There were maybe 16, 17 reports | 3:06:59 | 3:07:04 | |
of different NHS, sort of, organisations being hit. | 3:07:04 | 3:07:07 | |
And that was the point where I decided my holiday is over. | 3:07:07 | 3:07:11 | |
By late morning, the attack had begun. | 3:07:15 | 3:07:17 | |
Somehow, a worm had got into the NHS. | 3:07:17 | 3:07:21 | |
And on the other side of the world, | 3:07:21 | 3:07:22 | |
somebody was tracking the progress | 3:07:22 | 3:07:25 | |
of the outbreak. | 3:07:25 | 3:07:26 | |
Marcin Kleczynski runs a cyber security firm in California. | 3:07:29 | 3:07:33 | |
Their software is installed on machines across the world. | 3:07:33 | 3:07:37 | |
Every time we disinfect a machine, it pings that information back | 3:07:37 | 3:07:41 | |
to the labs teams. Real-time information was streaming in, | 3:07:41 | 3:07:44 | |
regarding these specific attacks. | 3:07:44 | 3:07:46 | |
We were able to actually create a live map, | 3:07:46 | 3:07:49 | |
where the infection is spreading. Very similar to a human infection | 3:07:49 | 3:07:53 | |
spreading worldwide. We were able to do that from a computer perspective. | 3:07:53 | 3:07:57 | |
So, we started detecting the attack. | 3:07:57 | 3:07:59 | |
Actually, our first detection was, according to this, Thursday. | 3:07:59 | 3:08:03 | |
We call that, kind of, day minus one, day one. | 3:08:03 | 3:08:06 | |
And one of the first computers that we disinfected was in Russia, | 3:08:06 | 3:08:10 | |
which was very interesting for us to see. | 3:08:10 | 3:08:12 | |
But then, you look at Friday and Saturday | 3:08:17 | 3:08:19 | |
and through the rest of the weekend, | 3:08:19 | 3:08:22 | |
the map just completely explodes. | 3:08:22 | 3:08:24 | |
We see infections all over the world, predominantly in Europe, | 3:08:24 | 3:08:28 | |
but also in the US and they do not relent. | 3:08:28 | 3:08:32 | |
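The "live map" described here is, at heart, an aggregation of disinfection pings. A toy sketch, assuming each endpoint reports a (day, country) pair when it cleans an infection; the field names and sample telemetry are invented for illustration:

```python
from collections import Counter

def build_live_map(pings):
    """Aggregate disinfection pings -- (day, country) pairs sent home
    by endpoint agents -- into counts that could be plotted as a
    spreading-infection map."""
    return Counter(pings)

# Invented sample telemetry: a first detection on Thursday in Russia,
# then the map "explodes" through the weekend.
pings = [("Thu", "RU"),
         ("Fri", "GB"), ("Fri", "GB"), ("Fri", "DE"),
         ("Sat", "US"), ("Sat", "GB"), ("Sat", "FR")]
counts = build_live_map(pings)
```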
They were witnessing the largest and fastest-spreading outbreak | 3:08:34 | 3:08:37 | |
anyone had seen in recent years. | 3:08:37 | 3:08:40 | |
The threat spread so quickly | 3:08:41 | 3:08:42 | |
that we actually would have to go | 3:08:42 | 3:08:44 | |
down to the milliseconds | 3:08:44 | 3:08:45 | |
to see when it first appeared | 3:08:45 | 3:08:46 | |
in the UK. | 3:08:46 | 3:08:47 | |
We think it is sometime Friday morning. | 3:08:47 | 3:08:49 | |
But we really have to slow this down | 3:08:51 | 3:08:52 | |
and look at the millions of data points we have here to isolate | 3:08:52 | 3:08:55 | |
the day we saw it in the UK first. | 3:08:55 | 3:08:57 | |
The first outbreak Marcin detected in London | 3:08:58 | 3:09:01 | |
showed up in the afternoon at 18 minutes past one. | 3:09:01 | 3:09:05 | |
Across the country, hospitals like this found themselves | 3:09:09 | 3:09:12 | |
either in the grip of the attack or desperately trying to switch off | 3:09:12 | 3:09:15 | |
systems in an attempt to prevent possible infection. | 3:09:15 | 3:09:19 | |
One of London's largest, most capable hospital trusts, | 3:09:19 | 3:09:22 | |
St Bartholomew's and the Royal London, | 3:09:22 | 3:09:24 | |
found itself amongst the most severely affected. | 3:09:24 | 3:09:27 | |
So, NHS staff put into place contingency plans, | 3:09:27 | 3:09:30 | |
working tirelessly to keep everything running. | 3:09:30 | 3:09:33 | |
But there were consequences. | 3:09:33 | 3:09:35 | |
The surgeon, he had been to see me, to say, "Pat, I'll be with you | 3:09:38 | 3:09:41 | |
"at one o'clock-ish, after I've done my rounds." | 3:09:41 | 3:09:44 | |
He then came back again and said, "How are you doing? Everything OK?" | 3:09:44 | 3:09:48 | |
I said, "Yeah, fine. I'm here, ready and waiting. | 3:09:48 | 3:09:51 | |
"I'm not going anywhere." And he said, "Great. We're all ready. | 3:09:51 | 3:09:54 | |
"Everybody is getting organised for you down in theatre. | 3:09:54 | 3:09:56 | |
"The team are there, they are looking forward to meeting you." | 3:09:56 | 3:09:59 | |
This was 10 o'clock, 12 o'clock | 3:09:59 | 3:10:01 | |
and then, at half past one, | 3:10:01 | 3:10:02 | |
he turned up again and looked very, yeah, forlorn | 3:10:02 | 3:10:09 | |
and very sorry. And that was when he then told me | 3:10:09 | 3:10:12 | |
that he couldn't do the operation. | 3:10:12 | 3:10:14 | |
With computer systems down, the surgeon was unable to access | 3:10:15 | 3:10:18 | |
Patrick's angiogram and blood results. | 3:10:18 | 3:10:21 | |
Without them, the operation could not go ahead. | 3:10:21 | 3:10:24 | |
I was numb. It is the only way I can describe it. | 3:10:25 | 3:10:29 | |
Yeah, I just felt nothing. I was absolutely... | 3:10:29 | 3:10:32 | |
I couldn't believe it. I was just absolutely flabbergasted. | 3:10:32 | 3:10:35 | |
It wasn't until the Monday, really, | 3:10:39 | 3:10:41 | |
that the realisation of "What do I do?" | 3:10:41 | 3:10:45 | |
I didn't have any idea as to whether I'd have to wait another year | 3:10:45 | 3:10:48 | |
for the operation. There was just no information available. | 3:10:48 | 3:10:52 | |
It's very frustrating. | 3:10:52 | 3:10:54 | |
Speak to my wife, she will tell you how grumpy I have been | 3:10:54 | 3:10:58 | |
since the operation was cancelled. Not having a date, | 3:10:58 | 3:11:01 | |
something to aim for. So it was extremely, extremely frustrating. | 3:11:01 | 3:11:06 | |
This is what makes me angriest about this whole thing. | 3:11:09 | 3:11:11 | |
This cyber attack isn't about an abstract piece of technology, | 3:11:11 | 3:11:15 | |
it's not about ransoms or ransomware. | 3:11:15 | 3:11:17 | |
It's not about firewalls or patches. It's about people and their lives | 3:11:17 | 3:11:21 | |
and how it affects them. It is about being forced, as a doctor, | 3:11:21 | 3:11:24 | |
to look someone like Patrick in the eye and to let him down | 3:11:24 | 3:11:27 | |
at the worst possible moment. | 3:11:27 | 3:11:29 | |
And Patrick wasn't alone. | 3:11:32 | 3:11:33 | |
The cyber attack had become national news. | 3:11:33 | 3:11:36 | |
The NHS is the victim of a major cyber attack. | 3:11:36 | 3:11:40 | |
At least 25 hospital trusts and GP surgeries have been affected. | 3:11:40 | 3:11:44 | |
Routine operations at some hospitals are being cancelled, | 3:11:44 | 3:11:47 | |
ambulances diverted and patients sent home. | 3:11:47 | 3:11:50 | |
I went out to lunch. I got back. | 3:11:53 | 3:11:55 | |
I then saw lots of reports from different sectors of the NHS. | 3:11:55 | 3:12:00 | |
They were all just simultaneously saying, "We're being hit." | 3:12:00 | 3:12:02 | |
I thought, "This one thing is hitting all these sectors, | 3:12:06 | 3:12:08 | |
"so it's got to be something pretty big", | 3:12:08 | 3:12:10 | |
so I went and I looked into it. | 3:12:10 | 3:12:12 | |
I asked a friend of mine in the industry if he had a sample | 3:12:15 | 3:12:17 | |
of the actual malware that was going around | 3:12:17 | 3:12:19 | |
and he sent it to me. | 3:12:19 | 3:12:21 | |
I use virtualisation software, which basically makes a computer | 3:12:21 | 3:12:24 | |
within your computer, so that it wouldn't affect me | 3:12:24 | 3:12:28 | |
and I saw what it did. | 3:12:28 | 3:12:29 | |
Marcus wasn't alone. | 3:12:33 | 3:12:34 | |
Cal, too, set to work examining the malware. | 3:12:36 | 3:12:39 | |
I wanted to find out from him what made this cyber attack | 3:12:41 | 3:12:45 | |
so ruthlessly effective. | 3:12:45 | 3:12:46 | |
So, what we've got is a machine | 3:12:49 | 3:12:51 | |
that is going to effectively act as patient zero. | 3:12:51 | 3:12:54 | |
We've got a second machine to reconstruct how this | 3:12:54 | 3:12:58 | |
particular variant of WannaCry spreads across multiple machines. | 3:12:58 | 3:13:03 | |
In here is what I have dubbed, "The internet in a box." | 3:13:03 | 3:13:08 | |
To make the malware reveal itself, we have to make it believe | 3:13:08 | 3:13:12 | |
these computers are connected to the real internet | 3:13:12 | 3:13:15 | |
and this box provides the necessary dummy signals, whilst protecting | 3:13:15 | 3:13:20 | |
the outside world from harm. | 3:13:20 | 3:13:21 | |
What we're going to do now is run the WannaCry ransomware. | 3:13:22 | 3:13:27 | |
There you go. | 3:13:32 | 3:13:33 | |
And that's the screen of doom. | 3:13:33 | 3:13:34 | |
-So, this is this machine out of action. -Exactly. | 3:13:36 | 3:13:39 | |
With the files locked up, | 3:13:44 | 3:13:45 | |
the clock is ticking. | 3:13:45 | 3:13:47 | |
But as the victim decides whether or not to pay, | 3:13:49 | 3:13:51 | |
the malware is already planning its next attacks. | 3:13:51 | 3:13:53 | |
This particular strain has two components. | 3:13:55 | 3:13:57 | |
It has the ransomware itself, | 3:13:57 | 3:13:59 | |
which is what we see here, and it has the worm component, | 3:13:59 | 3:14:01 | |
which was taken from Eternal Blue, | 3:14:01 | 3:14:05 | |
which is a government weapons-grade exploit. | 3:14:05 | 3:14:09 | |
This machine here is actually giving us a bit of insight. | 3:14:09 | 3:14:12 | |
And what this is showing us is that it is trying | 3:14:12 | 3:14:15 | |
to spread across the network. | 3:14:15 | 3:14:17 | |
You don't really think about it, do you? All the output from | 3:14:17 | 3:14:21 | |
a machine isn't just what you see on your screen. | 3:14:21 | 3:14:23 | |
-There is a lot of silent chatter going on in the background. -Exactly. | 3:14:23 | 3:14:26 | |
If you imagine a big room of people and you shout out, "Who's here?!" | 3:14:26 | 3:14:29 | |
And everyone puts their hand up. That is effectively what | 3:14:29 | 3:14:31 | |
these machines are doing. It shouts out and says, "Who's here?!" | 3:14:31 | 3:14:34 | |
and then, the machines reply. What it then tries to do is hit | 3:14:34 | 3:14:38 | |
each of those machines with this payload. This worm is now spreading | 3:14:38 | 3:14:42 | |
out across the network and in an instance where you have got... | 3:14:42 | 3:14:45 | |
-There we go. -And as you can see, it's now spread onto this machine. | 3:14:45 | 3:14:50 | |
Eternal Blue had been expertly designed to silently move | 3:14:54 | 3:14:58 | |
from one machine to another across a local area network or LAN. | 3:14:58 | 3:15:03 | |
Groups of computers joined together inside a business or a hospital. | 3:15:03 | 3:15:07 | |
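The "who's here?!" demonstration above can be sketched as a toy propagation loop: each infected machine discovers its LAN peers and tries to hand them the payload, and only unpatched machines accept it. This is a pure simulation of the spreading pattern, with no real networking or exploit code; the machine records are invented.

```python
def discover_peers(network):
    """The 'who's here?!' broadcast: every machine on the LAN replies."""
    return [m for m in network if m["on_lan"]]

def spread(network):
    """Toy worm loop: each newly infected machine scans the LAN and
    delivers the payload to any unpatched peer, which then scans too."""
    queue = [m for m in network if m["infected"]]
    while queue:
        queue.pop()  # take one infected machine and let it scan
        for peer in discover_peers(network):
            if not peer["infected"] and not peer["patched"]:
                peer["infected"] = True  # payload lands
                queue.append(peer)       # new machine joins the scanning
    return sum(m["infected"] for m in network)

# Invented LAN: one patient zero, two unpatched peers, two patched ones.
lan = [{"on_lan": True, "infected": True,  "patched": False},
       {"on_lan": True, "infected": False, "patched": False},
       {"on_lan": True, "infected": False, "patched": True},
       {"on_lan": True, "infected": False, "patched": False},
       {"on_lan": True, "infected": False, "patched": True}]
total_infected = spread(lan)  # only the unpatched machines succumb
```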
With the LAN infected, it spread to the internet. | 3:15:09 | 3:15:12 | |
If you imagine you have got your big internet cloud down here | 3:15:16 | 3:15:20 | |
and each dot represents a machine and there is billions | 3:15:20 | 3:15:23 | |
of these machines, OK? And what it does is | 3:15:23 | 3:15:27 | |
the attack will make a direct connection to your machine | 3:15:27 | 3:15:30 | |
and if you are exposing this port to the internet, someone could | 3:15:30 | 3:15:35 | |
infect your machine without needing to have local access to it | 3:15:35 | 3:15:41 | |
or be on the same network. | 3:15:41 | 3:15:43 | |
What is even more disturbing from there is, | 3:15:43 | 3:15:47 | |
if you look at the research tools that actually analyse the internet, | 3:15:47 | 3:15:51 | |
you can go and query today, right now, how many of these machines | 3:15:51 | 3:15:54 | |
on the internet have got this vulnerable service open. | 3:15:54 | 3:15:57 | |
Through the internet, anyone can go and try and exploit them | 3:15:57 | 3:16:00 | |
and there are hundreds and hundreds and hundreds of thousands | 3:16:00 | 3:16:03 | |
of these machines. | 3:16:03 | 3:16:05 | |
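The vulnerable service referred to here is SMB, which listens on TCP port 445 - the port Eternal Blue targeted. A minimal, non-intrusive sketch of what "exposing this port to the internet" means in practice (a plain reachability check only, sending no SMB traffic; only ever point it at machines you own):

```python
import socket

def is_port_open(host, port, timeout=1.0):
    """Return True if a plain TCP connection to host:port succeeds.
    This tests reachability only; it sends no protocol traffic."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# e.g. is_port_open("127.0.0.1", 445) -- a machine answering on 445
# and reachable from the internet was exactly what the worm hunted for.
```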
The malware sought out these weaknesses | 3:16:07 | 3:16:09 | |
and wormed its way into all manner of networks. | 3:16:09 | 3:16:11 | |
From companies like Nissan in the UK to Renault in France, | 3:16:11 | 3:16:16 | |
from a postal service in Russia to a German railway operator. | 3:16:16 | 3:16:20 | |
And to be clear, this does not depend upon any human interaction? | 3:16:24 | 3:16:29 | |
It's automatic propagation. There is no human interaction here required | 3:16:29 | 3:16:34 | |
at all. And that is why the ransomware itself was | 3:16:34 | 3:16:39 | |
relatively low-key, to be fair. There wasn't anything particularly | 3:16:39 | 3:16:43 | |
special about it, but when combined with | 3:16:43 | 3:16:47 | |
a government weapons-grade exploit, the impact has been devastating. | 3:16:47 | 3:16:53 | |
No-one needed to click on a link or open a dodgy e-mail. | 3:16:57 | 3:17:01 | |
The worm spread all by itself, | 3:17:01 | 3:17:04 | |
exploding across networks in a matter of hours. | 3:17:04 | 3:17:08 | |
Across the country, the surprisingly virulent attack meant that | 3:17:20 | 3:17:24 | |
several hospitals were beginning to struggle. | 3:17:24 | 3:17:26 | |
And wherever the ransomware was found, they would switch off | 3:17:26 | 3:17:29 | |
machines in an attempt to contain the outbreak. | 3:17:29 | 3:17:32 | |
Nevertheless, some of those networks went dark. | 3:17:32 | 3:17:36 | |
Now, even that was not a complete disaster, | 3:17:36 | 3:17:38 | |
because in the NHS, we have contingency plans | 3:17:38 | 3:17:40 | |
for almost every conceivable emergency, | 3:17:40 | 3:17:43 | |
from power outages, terrorist attacks, | 3:17:43 | 3:17:46 | |
even a cyber attack of this kind. | 3:17:46 | 3:17:48 | |
So, what was it that forced some accident and emergency departments | 3:17:52 | 3:17:56 | |
to close their doors that day? A&E relies upon support | 3:17:56 | 3:18:00 | |
from state-of-the-art technologies and specialities. | 3:18:00 | 3:18:05 | |
And these were some of the hardest hit, | 3:18:05 | 3:18:08 | |
among them, doctors and their systems in radiology. | 3:18:08 | 3:18:11 | |
It is packed with the latest kit. | 3:18:11 | 3:18:14 | |
X-rays, MRI scanners and CT machines that allow doctors | 3:18:14 | 3:18:19 | |
to investigate the hidden extent of injury inside the body. | 3:18:19 | 3:18:23 | |
When time is critical, such as with a stroke, | 3:18:23 | 3:18:26 | |
radiologists like Navin Ramachandran help us to make quick, accurate, | 3:18:26 | 3:18:30 | |
life-saving decisions. | 3:18:30 | 3:18:32 | |
When a patient comes in, | 3:18:34 | 3:18:36 | |
they turn up with typical symptoms, | 3:18:36 | 3:18:38 | |
you can see they may not be able to feel an area, they may not be able | 3:18:38 | 3:18:41 | |
to move an area, they may not | 3:18:41 | 3:18:42 | |
be able to speak. That gives us an idea that there is something | 3:18:42 | 3:18:44 | |
going on in the brain, but it doesn't necessarily tell us | 3:18:44 | 3:18:47 | |
what the underlying cause is. | 3:18:47 | 3:18:49 | |
So, it could be, if we look at this case, | 3:18:49 | 3:18:51 | |
where a vessel to a part of the brain has got blocked off | 3:18:51 | 3:18:56 | |
by a clot and that area is the part that has been | 3:18:56 | 3:18:59 | |
-deprived of blood currently. -The treatment is to give | 3:18:59 | 3:19:02 | |
a clot-busting drug as fast as possible, | 3:19:02 | 3:19:04 | |
but there is jeopardy involved. You have to be sure precisely | 3:19:04 | 3:19:08 | |
what type of stroke you're dealing with. | 3:19:08 | 3:19:10 | |
The one thing you have to be aware of is that, once in a while, | 3:19:11 | 3:19:14 | |
patients that come in with exactly the same symptoms, | 3:19:14 | 3:19:16 | |
they are getting the same symptoms not because of the blocked vessel, | 3:19:16 | 3:19:19 | |
but because of a bleeding vessel. In this case, this vessel | 3:19:19 | 3:19:22 | |
has bled. With this patient, if you give them the clot-busting drug, | 3:19:22 | 3:19:26 | |
that is catastrophic and can lead to death. | 3:19:26 | 3:19:30 | |
And these two patients would look very similar at presentation? | 3:19:30 | 3:19:33 | |
Without doing these scans, you really wouldn't know the difference? | 3:19:33 | 3:19:36 | |
Exactly. The only thing that makes it possible is having access | 3:19:36 | 3:19:39 | |
to these scans, to allow others to triage people into the right | 3:19:39 | 3:19:42 | |
-treatment pathway. -The same is true for the whole | 3:19:42 | 3:19:45 | |
of emergency medicine, from car accidents to cancer. | 3:19:45 | 3:19:49 | |
Radiology is an essential front line asset. | 3:19:49 | 3:19:51 | |
The whole department relies on computers. | 3:19:53 | 3:19:55 | |
They run the scanning machines, display the images | 3:19:55 | 3:19:58 | |
and send them on to doctors in A&E. | 3:19:58 | 3:20:00 | |
If these computers were infected, | 3:20:01 | 3:20:03 | |
hospital managers would have little choice but to close A&E. | 3:20:03 | 3:20:08 | |
It simply wouldn't be safe to stay open. | 3:20:08 | 3:20:09 | |
We were very lucky in that it didn't hit our services at all. | 3:20:11 | 3:20:14 | |
We have had fully digital systems for over 10-15 years, | 3:20:14 | 3:20:18 | |
whereas most of the rest of the hospital still uses paper. | 3:20:18 | 3:20:21 | |
But we were completely unaffected. | 3:20:21 | 3:20:23 | |
No change to the day. | 3:20:23 | 3:20:25 | |
Some hospitals, like mine, UCLH, got away unscathed, | 3:20:25 | 3:20:30 | |
but for those unlucky enough to be affected, | 3:20:30 | 3:20:33 | |
there was still enough flex in the system to compensate. | 3:20:33 | 3:20:36 | |
Nevertheless, patients were on the move, | 3:20:36 | 3:20:38 | |
being transferred from hospital to hospital. | 3:20:38 | 3:20:41 | |
The infection continued to spread | 3:20:44 | 3:20:47 | |
and began to show up in GP surgeries across the country. | 3:20:47 | 3:20:51 | |
So, this is one of the consulting rooms we are going into now. | 3:21:00 | 3:21:03 | |
Dr George Farrelly is a GP working at a surgery in Tower Hamlets. | 3:21:03 | 3:21:07 | |
This is our standard desktop PC and so on. | 3:21:07 | 3:21:10 | |
Each consulting room has one of these. | 3:21:10 | 3:21:12 | |
We have 15 machines. We do consultations with this. | 3:21:15 | 3:21:17 | |
We access people's notes, we are able to make appointments, | 3:21:17 | 3:21:21 | |
we send prescriptions to the chemist and plan care. | 3:21:21 | 3:21:24 | |
So, this is our reception area. | 3:21:27 | 3:21:28 | |
A lot happens here. This is like the information hub of the practice. | 3:21:31 | 3:21:36 | |
We take our computer system a little bit for granted, I think, | 3:21:36 | 3:21:39 | |
and only realised | 3:21:39 | 3:21:40 | |
how reliant we are on it when we lose it. | 3:21:40 | 3:21:43 | |
On Friday, 12th of May, we got a phone call | 3:21:51 | 3:21:53 | |
from a neighbouring practice and they told us that they had been hit | 3:21:53 | 3:21:56 | |
by some virus. | 3:21:56 | 3:21:59 | |
So, we printed out the appointment for that day, | 3:21:59 | 3:22:03 | |
which would give us some information, just in case | 3:22:03 | 3:22:05 | |
we had the same problem. | 3:22:05 | 3:22:06 | |
I was in a meeting with some colleagues discussing patients | 3:22:10 | 3:22:14 | |
and the PC we were using suddenly blanked out. | 3:22:14 | 3:22:19 | |
We had to shut all our computers down, to hopefully stop any more | 3:22:23 | 3:22:27 | |
of them becoming infected. | 3:22:27 | 3:22:30 | |
It was complete paralysis. | 3:22:30 | 3:22:31 | |
Along with the hospitals, some GP surgeries were now struggling, too. | 3:22:33 | 3:22:37 | |
They connect with the rest of the NHS via a network known as N3. | 3:22:38 | 3:22:43 | |
N3 is the NHS's national broadband network, | 3:22:45 | 3:22:48 | |
connecting all NHS locations | 3:22:48 | 3:22:51 | |
and its 1.3 million employees across England. | 3:22:51 | 3:22:54 | |
It's one of the largest networks in Europe, | 3:22:56 | 3:22:58 | |
with in excess of 51,000 connections. | 3:22:58 | 3:23:01 | |
N3 allows us to communicate with our colleagues | 3:23:03 | 3:23:05 | |
who we share patient care with. | 3:23:05 | 3:23:07 | |
For example, when we send e-mails to each other | 3:23:07 | 3:23:11 | |
from our NHS net e-mail account, it's more secure. | 3:23:11 | 3:23:14 | |
Our security antivirus and so on is done centrally, | 3:23:14 | 3:23:19 | |
it's not something we worry about. | 3:23:19 | 3:23:21 | |
We never have to do patches ourselves. | 3:23:21 | 3:23:23 | |
They didn't know it at the time, | 3:23:27 | 3:23:30 | |
but the N3 network was actually unaffected. | 3:23:30 | 3:23:34 | |
However, Windows 7 machines without the patch WERE going down. | 3:23:34 | 3:23:38 | |
So some teams disconnected their computers... | 3:23:39 | 3:23:42 | |
..cutting off access to essential clinical systems, | 3:23:43 | 3:23:46 | |
deepening the disruption. | 3:23:46 | 3:23:48 | |
The people who've done this | 3:23:50 | 3:23:52 | |
don't understand the implications of what they're doing. | 3:23:52 | 3:23:54 | |
They hadn't thought them through. | 3:23:55 | 3:23:58 | |
My guess is their project is to make money | 3:23:58 | 3:24:01 | |
and they just send this stuff out and it lands wherever it lands | 3:24:01 | 3:24:04 | |
and they don't give any thought to it. | 3:24:04 | 3:24:06 | |
What they DID give some thought to is how they got paid. | 3:24:10 | 3:24:13 | |
With the ransomware hitting thousands of computers, | 3:24:16 | 3:24:19 | |
the hackers needed a secure, globally accepted form of payment | 3:24:19 | 3:24:23 | |
that ideally would be untraceable. | 3:24:23 | 3:24:25 | |
They decided to use Bitcoin - | 3:24:27 | 3:24:30 | |
an entirely electronic form of so-called cryptocurrency. | 3:24:30 | 3:24:34 | |
I've never used Bitcoin. | 3:24:36 | 3:24:38 | |
But it's easy enough to buy some on a phone. | 3:24:39 | 3:24:42 | |
And once loaded, you can spend it in all manner of places. | 3:24:43 | 3:24:46 | |
-So, can I get a flat white and a mint tea, please? -Sure. | 3:24:52 | 3:24:55 | |
I've come to a cafe in east London to meet Sarah Meiklejohn, | 3:24:55 | 3:24:59 | |
an expert in Bitcoin, | 3:24:59 | 3:25:01 | |
to find out why it's such an attractive currency for hackers. | 3:25:01 | 3:25:05 | |
-Perfect. Can I pay with Bitcoin? -Sure. | 3:25:07 | 3:25:11 | |
-OK. And I just... -£3.50, please. You just scan this. | 3:25:11 | 3:25:15 | |
OK, I'll lean over and scan that. | 3:25:15 | 3:25:17 | |
-That's it. -And it's as easy as that. -That's it. | 3:25:18 | 3:25:22 | |
-Perfect, thank you very much. -Thank you. -Thank you. | 3:25:22 | 3:25:24 | |
Marvellous, right. | 3:25:24 | 3:25:25 | |
Explain to me, then, as a complete non-initiate, | 3:25:27 | 3:25:29 | |
what Bitcoin is and how it works. | 3:25:29 | 3:25:32 | |
Right, so, Bitcoin is basically a purely digital form of currency. | 3:25:32 | 3:25:36 | |
So it's just a currency, like the dollar, the pound. | 3:25:36 | 3:25:40 | |
The main differences are that it's not backed by any government, | 3:25:40 | 3:25:44 | |
there's no central bank involved in generating Bitcoins | 3:25:44 | 3:25:47 | |
and you don't need a bank account to use it. | 3:25:47 | 3:25:50 | |
If I want to use Bitcoin, you know, | 3:25:50 | 3:25:52 | |
I want to send people Bitcoins, | 3:25:52 | 3:25:54 | |
I'm going to download a piece of software, | 3:25:54 | 3:25:56 | |
and in doing that, | 3:25:56 | 3:25:57 | |
I'm going to join Bitcoin's peer-to-peer network. | 3:25:57 | 3:26:00 | |
So this network is basically collectively responsible for | 3:26:00 | 3:26:04 | |
playing all the traditional roles | 3:26:04 | 3:26:06 | |
that we're used to in traditional banking. | 3:26:06 | 3:26:08 | |
The recent WannaCry attack, which affected many organisations, | 3:26:08 | 3:26:11 | |
including the National Health Service, | 3:26:11 | 3:26:14 | |
was conducted using Bitcoin as the currency of ransom. | 3:26:14 | 3:26:19 | |
Why did they use Bitcoin? | 3:26:19 | 3:26:21 | |
Opening a Bitcoin wallet, saying we're open for business, | 3:26:21 | 3:26:24 | |
we can accept Bitcoins, takes very little time and effort, | 3:26:24 | 3:26:27 | |
and then getting paid in Bitcoin equally takes very little effort. | 3:26:27 | 3:26:31 | |
If I want to pay someone on the other side of the world, | 3:26:31 | 3:26:34 | |
I can do that using Bitcoin | 3:26:34 | 3:26:35 | |
and they'll get the payment instantaneously. | 3:26:35 | 3:26:38 | |
It's the convenience and speed that makes it easy for hackers | 3:26:39 | 3:26:43 | |
to gather their ransom. | 3:26:43 | 3:26:44 | |
But as cyber security expert Mikko Hypponen explains, | 3:26:44 | 3:26:49 | |
Bitcoin also offers a certain level of anonymity. | 3:26:49 | 3:26:53 | |
The only thing we can see is that someone is sending money | 3:26:55 | 3:26:57 | |
from one address to another address, and these addresses are | 3:26:57 | 3:27:01 | |
these long lists of numbers and letters which look really random. | 3:27:01 | 3:27:04 | |
They are tied to a user, but we have no idea who these users are. | 3:27:04 | 3:27:09 | |
What was invented to ensure an individual's privacy | 3:27:09 | 3:27:13 | |
had unforeseen consequences. | 3:27:13 | 3:27:16 | |
So we very quickly started seeing Bitcoin being used in online crime. | 3:27:16 | 3:27:21 | |
First, in online drug trade, | 3:27:21 | 3:27:23 | |
cos when you're buying illegal drugs online, | 3:27:23 | 3:27:25 | |
you don't want to use your credit card | 3:27:25 | 3:27:27 | |
because the credit card will lead back to you and Bitcoins don't. | 3:27:27 | 3:27:31 | |
And then we started seeing ransom attacks. | 3:27:31 | 3:27:35 | |
Ransomware has been around for years and years, | 3:27:35 | 3:27:37 | |
way before Bitcoin. | 3:27:37 | 3:27:39 | |
But the megatrend which really made ransomware | 3:27:39 | 3:27:41 | |
such a big problem is cryptocurrencies, like Bitcoin. | 3:27:41 | 3:27:44 | |
By allowing transactions to take place between pseudonyms | 3:27:44 | 3:27:48 | |
rather than real identities, | 3:27:48 | 3:27:50 | |
Bitcoin became the go-to currency for cyber crime. | 3:27:50 | 3:27:53 | |
But it turns out that the details of Bitcoin's original design | 3:27:56 | 3:28:00 | |
could, for some criminals, actually be their undoing. | 3:28:00 | 3:28:03 | |
Bitcoin was invented by a figure called Satoshi Nakamoto | 3:28:05 | 3:28:09 | |
around six years ago. | 3:28:09 | 3:28:11 | |
It's based on an innovation called blockchain, | 3:28:11 | 3:28:14 | |
and blockchain basically means a public ledger of transactions. | 3:28:14 | 3:28:19 | |
When a transaction is made between two Bitcoin users, | 3:28:20 | 3:28:23 | |
the details of that transaction are locked into a permanent ledger, | 3:28:23 | 3:28:27 | |
known as the blockchain. | 3:28:27 | 3:28:30 | |
And the blockchain data isn't kept on a single computer or server - | 3:28:31 | 3:28:35 | |
it's distributed across the entire network. | 3:28:35 | 3:28:38 | |
Which means, even if an individual machine goes down, | 3:28:40 | 3:28:43 | |
it can never be erased. | 3:28:43 | 3:28:45 | |
So the entire history of every Bitcoin transaction | 3:28:47 | 3:28:51 | |
is accessible to all users now and for ever. | 3:28:51 | 3:28:55 | |
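The ledger idea described above can be sketched in a few lines of Python: each block stores the hash of the block before it, so altering any historical entry breaks every later link. This is an illustrative toy under simplified assumptions (fixed timestamps, no mining or signatures), not Bitcoin's actual data format.

```python
import hashlib
import json

def _block_hash(block):
    # Hash only the payload fields, in a stable order, so the result is reproducible.
    payload = {k: block[k] for k in ("prev_hash", "transactions", "timestamp")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, transactions):
    """Bundle transactions together with the hash of the previous block."""
    block = {"prev_hash": prev_hash, "transactions": transactions, "timestamp": 0}
    block["hash"] = _block_hash(block)
    return block

def verify_chain(chain):
    """Recompute every hash and check the links; any edit to history fails."""
    for i, block in enumerate(chain):
        if block["hash"] != _block_hash(block):
            return False                      # block contents were tampered with
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                      # link to the previous block is broken
    return True

# A tiny two-block ledger: valid as built, invalid after rewriting history.
genesis = make_block("0" * 64, [{"from": "A", "to": "B", "btc": 1.0}])
chain = [genesis, make_block(genesis["hash"], [{"from": "B", "to": "C", "btc": 0.5}])]
print(verify_chain(chain))                    # True
chain[0]["transactions"][0]["btc"] = 100.0    # try to rewrite an old transaction...
print(verify_chain(chain))                    # ...and verification fails: False
```

Because every participant holds a copy of this chain and can run the same check, no single party can quietly rewrite a past payment.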
Until this point, | 3:28:56 | 3:28:57 | |
what I understood by Bitcoin was that it was fully anonymous | 3:28:57 | 3:29:00 | |
and therefore it's the perfect currency | 3:29:00 | 3:29:03 | |
in which the underworld can operate. | 3:29:03 | 3:29:06 | |
Is that not true? | 3:29:06 | 3:29:07 | |
No, it's definitely not true. | 3:29:07 | 3:29:09 | |
Bitcoin exchanges are what's responsible for trading Bitcoin | 3:29:09 | 3:29:13 | |
with traditional, government-backed currencies. | 3:29:13 | 3:29:16 | |
But the second you send your Bitcoins to this exchange, | 3:29:16 | 3:29:19 | |
you've created a link between your activities in the Bitcoin network | 3:29:19 | 3:29:24 | |
and your identity as a real person. | 3:29:24 | 3:29:27 | |
The second I know that a given pseudonym | 3:29:27 | 3:29:30 | |
belongs to a criminal or belongs to anyone, | 3:29:30 | 3:29:32 | |
I can then start trying to understand what that user | 3:29:32 | 3:29:36 | |
has done with that money. | 3:29:36 | 3:29:37 | |
We've seen in the past | 3:29:37 | 3:29:39 | |
that attackers have stolen Bitcoins | 3:29:39 | 3:29:41 | |
and then they've sat on them for years, | 3:29:41 | 3:29:44 | |
probably because they don't really know what to do with them next. | 3:29:44 | 3:29:47 | |
Attribution is hard, | 3:29:47 | 3:29:49 | |
this could have been anybody in the world | 3:29:49 | 3:29:51 | |
carrying out this attack. | 3:29:51 | 3:29:52 | |
If you're looking for my opinion, | 3:29:52 | 3:29:54 | |
it's some script kiddie in a basement somewhere, | 3:29:54 | 3:29:56 | |
not a government agency. | 3:29:56 | 3:29:58 | |
And if he's got any sense whatsoever, | 3:29:58 | 3:30:00 | |
he'll take his hard disk, smash it up with a sledgehammer | 3:30:00 | 3:30:04 | |
and burn it in a bonfire. | 3:30:04 | 3:30:06 | |
And he will not, whatever he does, | 3:30:06 | 3:30:08 | |
go and try to spend any of those Bitcoins that ended up in his wallets, | 3:30:08 | 3:30:11 | |
cos if he does, there are quite a number of governments | 3:30:11 | 3:30:14 | |
that would like to offer him some hospitality | 3:30:14 | 3:30:16 | |
for quite a long period of his life. | 3:30:16 | 3:30:18 | |
As the ransomware continued to spread, | 3:30:23 | 3:30:26 | |
thousands of people faced the same dilemma - | 3:30:26 | 3:30:29 | |
should they pay the ransom or not? | 3:30:29 | 3:30:32 | |
It's a question that Moti Cristal has given a lot of thought. | 3:30:34 | 3:30:38 | |
I'm a negotiator, by profession. | 3:30:44 | 3:30:47 | |
I started my career in the political negotiations | 3:30:47 | 3:30:49 | |
between Israel and the Arab world. | 3:30:49 | 3:30:51 | |
And later on, I do hostage negotiations | 3:30:51 | 3:30:53 | |
in high-intensity conflicts. | 3:30:53 | 3:30:55 | |
In a hostage situation, you negotiate with a person | 3:31:02 | 3:31:04 | |
but if you have the opportunity | 3:31:04 | 3:31:07 | |
to talk him into coming to the window and then shoot him in the head | 3:31:07 | 3:31:11 | |
because he just killed three kids, you will do it, | 3:31:11 | 3:31:14 | |
and without any moral hesitation. | 3:31:14 | 3:31:17 | |
But in the cyber world, you cannot do that. | 3:31:17 | 3:31:21 | |
The reliance on talk is | 3:31:21 | 3:31:24 | |
significantly more important. | 3:31:24 | 3:31:27 | |
Extortionists, like the people behind WannaCry, | 3:31:27 | 3:31:30 | |
are increasingly abandoning the real world and moving online. | 3:31:30 | 3:31:35 | |
It's lower risk and more profitable. | 3:31:35 | 3:31:38 | |
But whilst the setting may have changed, | 3:31:38 | 3:31:40 | |
Moti's job remains the same, | 3:31:40 | 3:31:43 | |
and much of his work is now in cyber crime. | 3:31:43 | 3:31:47 | |
There's always a human being behind the keyboard. | 3:31:47 | 3:31:50 | |
So at the end of this ransomware attack, | 3:31:50 | 3:31:53 | |
there are people that have feelings, | 3:31:53 | 3:31:57 | |
logic, emotions... | 3:31:57 | 3:32:00 | |
There's always a human being | 3:32:00 | 3:32:02 | |
to whom you can, and you should try to, connect. | 3:32:02 | 3:32:05 | |
No-one has been able to reach out to those behind WannaCry. | 3:32:08 | 3:32:12 | |
But perhaps Moti can help shed light | 3:32:12 | 3:32:14 | |
on how these criminal organisations think. | 3:32:14 | 3:32:18 | |
In October 2015, he was called in | 3:32:18 | 3:32:21 | |
to negotiate for a financial institution | 3:32:21 | 3:32:23 | |
that had been attacked by another piece of malware. | 3:32:23 | 3:32:26 | |
The hackers attempted to portray themselves | 3:32:27 | 3:32:30 | |
as an arm of the Russian state, APT28. | 3:32:30 | 3:32:34 | |
Moti reached out to them. | 3:32:36 | 3:32:38 | |
You know, I teased them. | 3:32:40 | 3:32:42 | |
I said, "Are you really APT28, | 3:32:42 | 3:32:45 | |
"the Russian...proclaimed Russian team?" | 3:32:45 | 3:32:48 | |
"Yes, correct." | 3:32:48 | 3:32:49 | |
And I said, "If you are APT28, | 3:32:49 | 3:32:52 | |
"why you start to do this low stuff of extortion | 3:32:52 | 3:32:58 | |
"instead of the very fascinating cool government stuff?" | 3:32:58 | 3:33:02 | |
Through this kind of engagement, over many months, | 3:33:03 | 3:33:06 | |
Moti created a dialogue with the attackers. | 3:33:06 | 3:33:10 | |
We already start moving towards a deal | 3:33:10 | 3:33:12 | |
and they write to me. | 3:33:12 | 3:33:14 | |
"The way we can do it..." | 3:33:14 | 3:33:17 | |
Pay attention to the language - | 3:33:17 | 3:33:19 | |
"the way WE can do it," we're already a team. | 3:33:19 | 3:33:22 | |
"..is two equal payments. | 3:33:22 | 3:33:24 | |
"After the first one, we tell you exactly how you were breached | 3:33:24 | 3:33:29 | |
"and which systems are most vulnerable." | 3:33:29 | 3:33:32 | |
So suddenly, after the first payment, | 3:33:32 | 3:33:34 | |
they start actually to be my consultant, my advisers. | 3:33:34 | 3:33:38 | |
They start to tell me how my system was breached, | 3:33:38 | 3:33:41 | |
which is very valuable information. | 3:33:41 | 3:33:43 | |
"This is something we never do. | 3:33:43 | 3:33:46 | |
"But consider it as a gesture..." | 3:33:46 | 3:33:50 | |
And then I immediately reply, | 3:33:50 | 3:33:52 | |
"I never recommend moving forward | 3:33:52 | 3:33:56 | |
"based on a virtual contract," | 3:33:56 | 3:33:58 | |
I'm telling them. | 3:33:58 | 3:33:59 | |
"But with you, I feel we have this otnoshenya." | 3:33:59 | 3:34:03 | |
The Russian word for relationship. | 3:34:03 | 3:34:05 | |
To signal them that, "we are on the same page, | 3:34:05 | 3:34:08 | |
"I do appreciate this." | 3:34:08 | 3:34:10 | |
Though the ransom was paid, by negotiating with the hackers, | 3:34:11 | 3:34:15 | |
Moti successfully ensured that the company's data were not released. | 3:34:15 | 3:34:20 | |
But for those facing the ransom on the 12th of May attack, | 3:34:22 | 3:34:25 | |
was paying the right thing to do? | 3:34:25 | 3:34:28 | |
There are several costs involved when you pay the ransomware. | 3:34:28 | 3:34:32 | |
And I do think, most important, | 3:34:32 | 3:34:33 | |
is that you feel bad | 3:34:33 | 3:34:35 | |
that, actually, you surrendered | 3:34:35 | 3:34:38 | |
to this type of criminal. | 3:34:38 | 3:34:40 | |
So if you pay, you feel bad. | 3:34:40 | 3:34:43 | |
And there's another risk to paying. | 3:34:44 | 3:34:46 | |
You open yourself up to further cyber attacks. | 3:34:46 | 3:34:49 | |
I do believe, in the darknet, | 3:34:49 | 3:34:51 | |
deep in the darknet, | 3:34:51 | 3:34:53 | |
people do exchange lists | 3:34:53 | 3:34:55 | |
of people who paid. | 3:34:55 | 3:34:56 | |
Why? Because that's, again, | 3:34:56 | 3:34:58 | |
a human pattern. | 3:34:58 | 3:34:59 | |
If you've paid once, you might pay again and again. | 3:34:59 | 3:35:03 | |
Ransoms paid in Bitcoin, hostage negotiators... | 3:35:13 | 3:35:17 | |
It's all fine if you're a high-net-worth individual | 3:35:17 | 3:35:19 | |
or a private mega-corporation, | 3:35:19 | 3:35:21 | |
but none of that is going to work in the NHS. | 3:35:21 | 3:35:24 | |
Even if it could pay - | 3:35:24 | 3:35:25 | |
which it can't, because there's no money - | 3:35:25 | 3:35:27 | |
it wouldn't be allowed to pay. | 3:35:27 | 3:35:29 | |
The best you can hope for in that situation as a hacker | 3:35:29 | 3:35:32 | |
is that you don't inadvertently kill somebody | 3:35:32 | 3:35:35 | |
and, instead of the local cyber crime division, | 3:35:35 | 3:35:38 | |
suddenly find the murder squad kicking down your front door. | 3:35:38 | 3:35:41 | |
Those hospitals and GPs that had been infected | 3:35:43 | 3:35:45 | |
had no option but to keep their computers off | 3:35:45 | 3:35:48 | |
and hope that something could stop the spread. | 3:35:48 | 3:35:51 | |
And incredibly, an answer was found, | 3:35:53 | 3:35:56 | |
thanks to a bit of luck and Marcus's inquisitive nature. | 3:35:56 | 3:35:59 | |
By late afternoon, | 3:36:06 | 3:36:07 | |
he'd spotted something curious in the malware's code. | 3:36:07 | 3:36:10 | |
It was trying to connect to one specific web address. | 3:36:10 | 3:36:14 | |
A domain. | 3:36:14 | 3:36:16 | |
I saw this domain was not registered, | 3:36:18 | 3:36:20 | |
so my first idea was to just go and reserve it, just in case. | 3:36:20 | 3:36:24 | |
By registering it, we could track the infection across the globe. | 3:36:24 | 3:36:29 | |
Straight after registering the domain, | 3:36:29 | 3:36:31 | |
we were seeing thousands of queries per second. | 3:36:31 | 3:36:33 | |
Maybe 100,000 unique infections within the first hour. | 3:36:33 | 3:36:38 | |
It was sort of, like, a bingo moment. | 3:36:38 | 3:36:40 | |
He didn't yet realise it, but by registering the domain, | 3:36:41 | 3:36:45 | |
at a cost of just $10, | 3:36:45 | 3:36:47 | |
Marcus wasn't just tracking the infection - | 3:36:47 | 3:36:49 | |
he was also preventing it from spreading. | 3:36:49 | 3:36:52 | |
The plan was to track it and then look for a way to stop it, | 3:36:52 | 3:36:55 | |
but it actually turned out that tracking it was stopping it. | 3:36:55 | 3:36:58 | |
It was like finding a vaccine. | 3:37:02 | 3:37:05 | |
For now, WannaCry could do no further damage. | 3:37:05 | 3:37:09 | |
The NHS didn't realise it yet | 3:37:11 | 3:37:13 | |
and were still relying on emergency systems, | 3:37:13 | 3:37:16 | |
but the cyber attack was over, | 3:37:16 | 3:37:19 | |
the malware defeated. | 3:37:19 | 3:37:22 | |
"Kill switch" was, sort of, the term the media ran with. | 3:37:22 | 3:37:25 | |
It sort of makes a lot of sense, cos it is a kill switch. | 3:37:25 | 3:37:27 | |
It stops the malware. | 3:37:27 | 3:37:28 | |
It seems silly that simply registering a domain | 3:37:28 | 3:37:31 | |
would stop a global cyber attack, but it happened. | 3:37:31 | 3:37:34 | |
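The mechanism Marcus stumbled on can be sketched roughly as follows. The real malware made a web request to one hard-coded address; this sketch simplifies that to a DNS lookup, and the domain name here is a made-up stand-in, not the actual kill-switch domain.

```python
import socket

# Stand-in for the real kill-switch domain (a long, random-looking string);
# the actual name is deliberately not reproduced here.
KILL_SWITCH_DOMAIN = "unregistered-example-domain.invalid"

def malware_should_run(domain=KILL_SWITCH_DOMAIN, resolve=socket.gethostbyname):
    """Before spreading, the worm tried to reach its one hard-coded domain.

    While nobody owned the domain, the check failed and the worm carried on.
    Once Marcus registered it, the check succeeded everywhere at once and
    every new infection stood itself down.
    """
    try:
        resolve(domain)
    except OSError:
        return True    # lookup failed: domain unregistered, keep spreading
    return False       # lookup succeeded: domain is live, exit quietly
```

As a side effect, every infected machine phoning this domain left a record, which is why registering it also let Marcus count roughly 100,000 unique infections in the first hour.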
In the days following the cyber attack, | 3:37:35 | 3:37:38 | |
the NHS slowly came back online. | 3:37:38 | 3:37:40 | |
Machines were given the patch, | 3:37:40 | 3:37:43 | |
backup data was used to restore the encrypted files, | 3:37:43 | 3:37:47 | |
and news of Marcus's cure spread. | 3:37:47 | 3:37:49 | |
Well, as we've been hearing, | 3:37:49 | 3:37:51 | |
the global cyber attack was halted almost by accident. | 3:37:51 | 3:37:54 | |
It was a 22-year-old in the UK who checked the code | 3:37:54 | 3:37:57 | |
and found a reference to an unregistered website name. | 3:37:57 | 3:38:01 | |
With systems restored, | 3:38:01 | 3:38:03 | |
Patrick finally got the news he was waiting for. | 3:38:03 | 3:38:06 | |
I'd gone back to work, then I had a phone call to say | 3:38:06 | 3:38:10 | |
that they had managed to get an operation date for me | 3:38:10 | 3:38:15 | |
for next week, which... | 3:38:15 | 3:38:18 | |
I was with a customer and I was, yeah, absolutely delighted. | 3:38:18 | 3:38:22 | |
I can't describe the people who did the ransomware. | 3:38:25 | 3:38:29 | |
I'm sure that wasn't in their thought process, | 3:38:29 | 3:38:32 | |
to attack individual people, | 3:38:32 | 3:38:36 | |
but that's the result of exactly what's happened. | 3:38:36 | 3:38:40 | |
In a detached sort of way, | 3:38:45 | 3:38:47 | |
you've got to have at least a bit of respect for the malware. | 3:38:47 | 3:38:50 | |
As poorly constructed as it was, | 3:38:50 | 3:38:52 | |
it still did a lot of damage. | 3:38:52 | 3:38:54 | |
That's not unlike a real infection. | 3:38:54 | 3:38:57 | |
Real viruses have a lot of flaws, | 3:38:57 | 3:38:59 | |
and yet still go on to wreak havoc. | 3:38:59 | 3:39:01 | |
Like a real infection, the malware was able to hide, | 3:39:01 | 3:39:06 | |
evade natural defences, | 3:39:06 | 3:39:07 | |
avoid surveillance, go dormant, | 3:39:07 | 3:39:10 | |
and then go on to cause | 3:39:10 | 3:39:11 | |
all of that chaos. | 3:39:11 | 3:39:13 | |
But like a real infection, | 3:39:13 | 3:39:14 | |
there was, in the end, | 3:39:14 | 3:39:16 | |
a way to fight it, | 3:39:16 | 3:39:17 | |
and so the NHS survived... | 3:39:17 | 3:39:19 | |
At least this time. | 3:39:20 | 3:39:21 | |
WannaCry soon disappeared from the front pages, | 3:39:33 | 3:39:36 | |
but at a gathering of cyber security experts | 3:39:36 | 3:39:40 | |
a fortnight after the attack, it was still making waves. | 3:39:40 | 3:39:44 | |
WHISPERS: This is a long-planned cyber security conference. | 3:39:47 | 3:39:50 | |
It predates the NHS cyber attack by many months, | 3:39:50 | 3:39:53 | |
but it's clearly dominating the agenda here. | 3:39:53 | 3:39:56 | |
Every single speaker has mentioned it. | 3:39:56 | 3:39:58 | |
I wanted to know why, in this country, | 3:39:58 | 3:40:01 | |
it was the NHS that seemed to bear the brunt | 3:40:01 | 3:40:04 | |
of the ransomware infection. | 3:40:04 | 3:40:06 | |
Thank you. I'm Kevin Fong. I'm a doctor in the NHS. | 3:40:06 | 3:40:11 | |
We still can't quite understand how worried we should be | 3:40:11 | 3:40:16 | |
or how vulnerable we continue to be. | 3:40:16 | 3:40:19 | |
We had the person responsible | 3:40:19 | 3:40:21 | |
for one of the trusts | 3:40:21 | 3:40:23 | |
talking about her experiences and day-to-day life | 3:40:23 | 3:40:27 | |
of running IT in the NHS. | 3:40:27 | 3:40:29 | |
It really stuck with me and resonated that, actually, | 3:40:29 | 3:40:34 | |
the amount of budget that she had to protect the IT | 3:40:34 | 3:40:39 | |
was vanishingly small. | 3:40:39 | 3:40:40 | |
They have one support person for 1,000 machines | 3:40:40 | 3:40:44 | |
and things like that. | 3:40:44 | 3:40:45 | |
That's just not a sustainable investment. | 3:40:45 | 3:40:49 | |
I think the NHS really does need to think about | 3:40:49 | 3:40:51 | |
its balance of investment. | 3:40:51 | 3:40:53 | |
It must put more money into this. | 3:40:53 | 3:40:55 | |
It's always a hard trade-off, to think patients versus IT, | 3:40:55 | 3:41:01 | |
you know, but actually, you've got to have that infrastructure | 3:41:01 | 3:41:04 | |
to be able to do a good job on the patients, I would say. | 3:41:04 | 3:41:07 | |
Spending varies across the NHS, but it's been reported that in 2015, | 3:41:08 | 3:41:13 | |
seven trusts spent nothing at all on IT security. | 3:41:13 | 3:41:18 | |
If this is true, surely this needs urgent attention, | 3:41:18 | 3:41:22 | |
now that weaknesses have been exposed by the WannaCry attack. | 3:41:22 | 3:41:26 | |
I was shocked by what happened to the NHS. | 3:41:26 | 3:41:29 | |
I think the shock is more in the vulnerability of the hospitals | 3:41:29 | 3:41:32 | |
than it was in the way that the attack was executed. | 3:41:32 | 3:41:36 | |
We are always afraid of the next attack | 3:41:37 | 3:41:40 | |
hitting critical infrastructures, | 3:41:40 | 3:41:42 | |
so now that health care systems have been hit, | 3:41:42 | 3:41:47 | |
we are afraid that the electricity, the water departments, | 3:41:47 | 3:41:51 | |
you know, those types of infrastructures being hit... | 3:41:51 | 3:41:56 | |
That didn't happen, but it can happen, | 3:41:56 | 3:41:59 | |
so I think that this is what we're kind of waiting for. | 3:41:59 | 3:42:02 | |
I think there has to be a recognition that it's not an IT | 3:42:05 | 3:42:08 | |
or a computer issue, this is about everyday life now. | 3:42:08 | 3:42:11 | |
In a world where everything's online and where there are ever more | 3:42:11 | 3:42:14 | |
online threats and where government agencies involved in security | 3:42:14 | 3:42:19 | |
are much more interested in adding to the threat level | 3:42:19 | 3:42:22 | |
than in adding to the defence level, | 3:42:22 | 3:42:24 | |
there's an awful lot of conflicts there | 3:42:24 | 3:42:26 | |
that we're going to have to manage. | 3:42:26 | 3:42:28 | |
This attack affected Russian banks, Chinese universities, | 3:42:30 | 3:42:34 | |
Spanish telecoms companies, even FedEx. | 3:42:34 | 3:42:38 | |
The vulnerabilities were there for all of us | 3:42:38 | 3:42:41 | |
across countries and continents, private and public sector, | 3:42:41 | 3:42:44 | |
all walks of life. | 3:42:44 | 3:42:46 | |
The NHS was simply one in a long list of casualties. | 3:42:46 | 3:42:50 | |
Collateral damage in a global cyber war. | 3:42:50 | 3:42:54 | |
The new reality is that we're all at risk. | 3:42:59 | 3:43:02 | |
It's not only businesses and governments - | 3:43:02 | 3:43:04 | |
anyone who's connected could be a target. | 3:43:04 | 3:43:07 | |
As the world of network technology gets ever more complex, | 3:43:10 | 3:43:14 | |
it opens up whole new realms of vulnerability. | 3:43:14 | 3:43:17 | |
It's no longer just our computers that are at risk. | 3:43:19 | 3:43:23 | |
Our homes and offices are now filled with devices | 3:43:23 | 3:43:26 | |
that are online and ripe for hacking. | 3:43:26 | 3:43:29 | |
-Which one are you pinning our hopes on being...? -The... Yeah, that one. | 3:43:30 | 3:43:35 | |
Ken Munro leads a team of ethical hackers | 3:43:35 | 3:43:38 | |
that test the security of internet-enabled household devices, | 3:43:38 | 3:43:42 | |
the so-called internet of things, | 3:43:42 | 3:43:44 | |
to find out where their weak spots are | 3:43:44 | 3:43:46 | |
and to see how much havoc they could wreak. | 3:43:46 | 3:43:48 | |
This is kind of the most fundamental aspect of hacking. | 3:43:48 | 3:43:52 | |
You're in there at the nitty-gritty, at the level of the circuit board. | 3:43:52 | 3:43:56 | |
Yeah, so that's what's different about the internet of things. | 3:43:56 | 3:43:59 | |
Unlike, say, an eCommerce site, | 3:43:59 | 3:44:01 | |
which is safely hosted in a data centre on a server somewhere, | 3:44:01 | 3:44:04 | |
with the internet of things, you can go and buy the kit, | 3:44:04 | 3:44:07 | |
you can dismantle it. | 3:44:07 | 3:44:08 | |
You can find the chips and the hardware and then connect to it. | 3:44:08 | 3:44:12 | |
So literally put logic probes, | 3:44:12 | 3:44:13 | |
electric wires onto the circuit cables | 3:44:13 | 3:44:15 | |
and then pull off the software and reverse-engineer | 3:44:15 | 3:44:18 | |
how it works from 1s and 0s. | 3:44:18 | 3:44:20 | |
Once you've got that, you can find security flaws. | 3:44:20 | 3:44:23 | |
As Ken discovered, some devices are far easier to hack than others. | 3:44:25 | 3:44:30 | |
This is your hackable shop of horrors. | 3:44:30 | 3:44:34 | |
What have you got here? | 3:44:34 | 3:44:36 | |
Probably the first one we look at, this is My Friend Cayla, | 3:44:36 | 3:44:39 | |
she's an interactive kids' doll. | 3:44:39 | 3:44:42 | |
She works over Bluetooth with an app, | 3:44:42 | 3:44:43 | |
but the manufacturer forgot to put security | 3:44:43 | 3:44:46 | |
on the Bluetooth connection, so, as a result, | 3:44:46 | 3:44:48 | |
it means that someone could be sat on the street outside, | 3:44:48 | 3:44:50 | |
could be listening to what's going on in the room, | 3:44:50 | 3:44:53 | |
so snooping on your child, | 3:44:53 | 3:44:54 | |
or potentially speaking to the child through the speaker. | 3:44:54 | 3:44:57 | |
Our interest was we wanted to see | 3:44:57 | 3:44:58 | |
if we could bypass her protection measures. | 3:44:58 | 3:45:01 | |
You can't make her swear. | 3:45:01 | 3:45:02 | |
But, of course, we discovered you could hack her, | 3:45:02 | 3:45:05 | |
and she swears like a docker now. | 3:45:05 | 3:45:06 | |
-RECORDED MESSAGE: -Hey, calm down or I will kick the shit out of you. | 3:45:06 | 3:45:10 | |
Creepy, but it's a really serious issue. | 3:45:11 | 3:45:13 | |
The German telecommunications regulator | 3:45:13 | 3:45:15 | |
has now classified her as a covert bugging device | 3:45:15 | 3:45:19 | |
and has banned her. | 3:45:19 | 3:45:20 | |
It's illegal to own her in Germany now. | 3:45:20 | 3:45:22 | |
All right, OK. So this is a wireless kettle, | 3:45:22 | 3:45:25 | |
but I don't actually care if someone hacks my kettle. | 3:45:25 | 3:45:29 | |
I mean, what can they possibly do with that? | 3:45:29 | 3:45:31 | |
This is a Wi-Fi kettle, though. | 3:45:31 | 3:45:32 | |
How else would you boil a kettle from the car on the way home? | 3:45:32 | 3:45:35 | |
So this is the scary bit. This is the Wi-Fi Module. | 3:45:37 | 3:45:39 | |
We're going to show you how we managed to hack that. | 3:45:39 | 3:45:42 | |
Imagine I'm outside your house. | 3:45:42 | 3:45:44 | |
If I want to get your Wi-Fi key from your kettle, | 3:45:44 | 3:45:46 | |
it's really surprisingly easy. | 3:45:46 | 3:45:47 | |
All I need to do, I'm going to connect to it. | 3:45:47 | 3:45:49 | |
I need to put a password in. | 3:45:49 | 3:45:51 | |
You think, "Password - great security." | 3:45:51 | 3:45:53 | |
Unfortunately, the password on these kettles is, | 3:45:53 | 3:45:55 | |
believe it or not, six zeros. | 3:45:55 | 3:45:57 | |
Once I connect to it, all I have to do is send one command, | 3:45:57 | 3:45:59 | |
-and I can retrieve your wireless network encryption key. -No! | 3:45:59 | 3:46:02 | |
That's the key that secures all of your traffic on your Wi-Fi network. | 3:46:02 | 3:46:05 | |
So if I was a malicious hacker on your network, | 3:46:05 | 3:46:08 | |
I can now intercept everything you do on your home wireless network. | 3:46:08 | 3:46:12 | |
Online banking, your social media - | 3:46:12 | 3:46:15 | |
everything you do, we can see, | 3:46:15 | 3:46:17 | |
because we've got your wireless network key. | 3:46:17 | 3:46:19 | |
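The exchange described above - authenticate with the factory-default PIN of six zeros, then send a single command to retrieve the Wi-Fi key - can be sketched as a mock session. This is purely illustrative: the command name, responses, and key value here are invented stand-ins, not the real device's protocol.

```python
# Hypothetical model of the kettle attack described above.
# DEFAULT_PIN mirrors the "six zeros" default from the programme;
# the command name "GET_KEY" and the stored key are made up.

DEFAULT_PIN = "000000"               # factory default, as described
STORED_WIFI_KEY = "example-wpa-key"  # stands in for the home WPA key

def kettle_session(pin: str, command: str) -> str:
    """Model the two-step exchange: authenticate, then one command."""
    if pin != DEFAULT_PIN:
        return "ERR: bad PIN"
    if command == "GET_KEY":         # hypothetical command name
        return STORED_WIFI_KEY
    return "ERR: unknown command"

# An attacker outside the house needs only two public facts:
# the unchanged default PIN and the single key-retrieval command.
print(kettle_session("000000", "GET_KEY"))
```

The point the demonstration makes is that a fixed, well-known default PIN reduces the "password" to no protection at all - the attacker's very first guess succeeds.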
I can see a thermostat over here. | 3:46:19 | 3:46:21 | |
I think I have something similar in my house. | 3:46:21 | 3:46:24 | |
What's the problem with a wireless thermostat? | 3:46:24 | 3:46:27 | |
Unfortunately, we found some pretty shocking security | 3:46:27 | 3:46:29 | |
on some brands of Smart thermostat. | 3:46:29 | 3:46:32 | |
This one, we actually managed to hold to ransom. | 3:46:32 | 3:46:34 | |
So just like you've heard with the NHS ransomware issue - | 3:46:34 | 3:46:37 | |
holding critical devices to ransom - | 3:46:37 | 3:46:40 | |
we've found you can even hold your SmartStat to ransom, | 3:46:40 | 3:46:43 | |
-locking you out of your heating unless you pay up. -So... | 3:46:43 | 3:46:47 | |
That would be quite unpleasant, but in the end, | 3:46:47 | 3:46:49 | |
surely you just take it off the wall and reset it. | 3:46:49 | 3:46:51 | |
I'm not so worried about that. | 3:46:51 | 3:46:53 | |
What I'm more worried about is actually taking control | 3:46:53 | 3:46:56 | |
of lots of Smart thermostats. | 3:46:56 | 3:46:58 | |
Imagine you've got several hundred thousand of these | 3:46:58 | 3:47:00 | |
and someone finds a way to compromise them, which we have. | 3:47:00 | 3:47:03 | |
They could switch them on and off, synchronously. | 3:47:03 | 3:47:06 | |
You can create unexpected power spikes | 3:47:06 | 3:47:08 | |
using people's thermostats. | 3:47:08 | 3:47:10 | |
So, in theory, you could knock out the grid on a bad day, | 3:47:10 | 3:47:14 | |
if you wanted to. | 3:47:14 | 3:47:15 | |
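The back-of-envelope arithmetic behind that claim can be made concrete. Both figures below are illustrative assumptions - a typical per-home electric heating draw and a fleet size matching the "several hundred thousand" mentioned above - not measurements of any real deployment.

```python
# Rough model of the synchronised-switching scenario described above.
# All numbers are illustrative assumptions, not measurements.

HEATING_LOAD_KW = 3.0           # assumed per-home electric heating draw
COMPROMISED_DEVICES = 200_000   # "several hundred thousand" thermostats

def synchronous_step_mw(devices: int, load_kw: float) -> float:
    """Step change in grid demand (MW) if every device switches at once."""
    return devices * load_kw / 1000.0

step = synchronous_step_mw(COMPROMISED_DEVICES, HEATING_LOAD_KW)
print(f"{step:.0f} MW")  # prints "600 MW" with these assumed figures
```

Under those assumptions, one synchronised switch produces a demand swing on the scale of a large power station - the kind of sudden step that grid operators plan reserves around, which is why an attacker-controlled fleet is the worrying case.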
So, I mean, that's fascinating and terrifying. | 3:47:15 | 3:47:17 | |
This is not about what it does to the individual. | 3:47:17 | 3:47:20 | |
This is about what it might do to an entire nation's power grid. | 3:47:20 | 3:47:23 | |
Damn right. | 3:47:23 | 3:47:24 | |
Imagine you were a foreign power and you wanted to soften up | 3:47:24 | 3:47:27 | |
a country on a particular day. | 3:47:27 | 3:47:29 | |
I don't know, maybe an election day. You knock out the power. | 3:47:29 | 3:47:32 | |
That's going to influence the outcome of an election. | 3:47:32 | 3:47:35 | |
All right. | 3:47:36 | 3:47:37 | |
The internet of things has also arrived in health care. | 3:47:42 | 3:47:46 | |
Devices that regulate drug dosages | 3:47:46 | 3:47:49 | |
can now be operated over the internet, | 3:47:49 | 3:47:52 | |
and some of the latest pacemakers are controlled by Bluetooth. | 3:47:52 | 3:47:56 | |
A recent study revealed that there might be thousands of exploits. | 3:47:56 | 3:48:01 | |
Do you think this fundamentally limits how useful | 3:48:01 | 3:48:05 | |
the digital revolution might be in health care? | 3:48:05 | 3:48:08 | |
Well, I think we've got things out of step. | 3:48:08 | 3:48:10 | |
I think we've got amazing technical advances, | 3:48:10 | 3:48:14 | |
fantastic technological steps forward, which are brilliant, | 3:48:14 | 3:48:16 | |
which allow us to do cool stuff, | 3:48:16 | 3:48:18 | |
that allows us much better diagnostics - brilliant. | 3:48:18 | 3:48:20 | |
But we've got that out of step with the security. | 3:48:20 | 3:48:23 | |
We're in a catch-up game. | 3:48:23 | 3:48:24 | |
Once the security has caught up with the technological advances, great - | 3:48:24 | 3:48:28 | |
we get fantastic medical benefits. | 3:48:28 | 3:48:30 | |
But until then, it's all a little bit dangerous to me. | 3:48:30 | 3:48:33 | |
We can't go back to the Stone Age. | 3:48:41 | 3:48:43 | |
We need digital technology and all of its promise | 3:48:43 | 3:48:46 | |
to push back the frontiers of medicine, | 3:48:46 | 3:48:49 | |
so we have to learn how to protect ourselves. | 3:48:49 | 3:48:51 | |
But there is hope. | 3:48:51 | 3:48:53 | |
Hope, because there are people on our side in this fight. | 3:48:53 | 3:48:56 | |
We've met some of them. | 3:48:56 | 3:48:57 | |
Hope too because of all professions, | 3:48:57 | 3:49:00 | |
medicine should be able to learn how to deal with this, | 3:49:00 | 3:49:03 | |
because this is the feat of host immunity - | 3:49:03 | 3:49:06 | |
of taking the hit from an infection, recognising it, | 3:49:06 | 3:49:10 | |
and then continually evolving your defences until, eventually, | 3:49:10 | 3:49:14 | |
you're impervious. | 3:49:14 | 3:49:16 | |
Hope as well because, despite reports, | 3:49:16 | 3:49:19 | |
the NHS never stopped. | 3:49:19 | 3:49:21 | |
Yes, parts of its network were severely affected, | 3:49:21 | 3:49:25 | |
but it kept doing what it always does. | 3:49:25 | 3:49:28 | |
If the last few terrible weeks have taught us anything, | 3:49:28 | 3:49:32 | |
it's that the NHS can take whatever you throw at it. | 3:49:32 | 3:49:36 | |
It has a plan, it will learn | 3:49:36 | 3:49:38 | |
and it will be ready for the next time. | 3:49:38 | 3:49:41 |