Zero Days: Nuclear Cyber Sabotage - subtitles
Line | From | To | |
---|---|---|---|
This programme contains some strong language. | 0:00:02 | 0:00:09 | |
-DISTORTED MALE VOICE: -Through the darkness | 0:00:19 | 0:00:23 | |
of the pathways that we march, | 0:00:23 | 0:00:25 | |
evil and good live side by side, | 0:00:25 | 0:00:29 | |
and this is the nature of life. | 0:00:29 | 0:00:31 | |
We are in an unbalanced and un-equivalent confrontation | 0:00:46 | 0:00:51 | |
between democracies who are obliged to play by the rules | 0:00:51 | 0:00:55 | |
and entities who think democracy is a joke. | 0:00:55 | 0:00:59 | |
You can't convince fanatics by saying, | 0:01:01 | 0:01:04 | |
"Hey, hatred paralyses you, love releases you." | 0:01:04 | 0:01:10 | |
There are different rules that we have to play by. | 0:01:10 | 0:01:15 | |
-NEWS REPORT: -'Today, two of Iran's top nuclear scientists | 0:01:29 | 0:01:32 | |
'were targeted by hit squads...' | 0:01:32 | 0:01:34 | |
'Bomb attacks in the capital, Tehran...' | 0:01:34 | 0:01:36 | |
'The latest in a string of attacks...' | 0:01:36 | 0:01:37 | |
'Today's attack has all the hallmarks | 0:01:37 | 0:01:40 | |
'of major strategic sabotage...' | 0:01:40 | 0:01:41 | |
'Iran immediately accused the US and Israel | 0:01:41 | 0:01:44 | |
'of trying to damage its nuclear programme...' | 0:01:44 | 0:01:46 | |
I want to categorically deny any United States involvement | 0:01:55 | 0:02:01 | |
in any kind of active violence inside Iran. | 0:02:01 | 0:02:06 | |
Covert actions can help, can assist. | 0:02:06 | 0:02:11 | |
They are needed. They are not all the time essentials. | 0:02:11 | 0:02:15 | |
And they in no way can replace political wisdom. | 0:02:15 | 0:02:19 | |
INTERVIEWER: Were the assassinations in Iran | 0:02:19 | 0:02:22 | |
related to the Stuxnet computer attacks? | 0:02:22 | 0:02:25 | |
Er, next question, please. | 0:02:25 | 0:02:27 | |
-NEWS REPORT: -'Iran's infrastructure is being targeted | 0:02:27 | 0:02:30 | |
'by a new and dangerously powerful cyber worm. | 0:02:30 | 0:02:34 | |
'The so-called Stuxnet worm is specifically designed, it seems, | 0:02:34 | 0:02:37 | |
'to infiltrate and sabotage real world power plants | 0:02:37 | 0:02:40 | |
'and factories and refineries...' | 0:02:40 | 0:02:41 | |
'It's not trying to steal information | 0:02:41 | 0:02:43 | |
'or grab your credit card, | 0:02:43 | 0:02:44 | |
'it's trying to get into some sort of industrial plant and wreak havoc, | 0:02:44 | 0:02:47 | |
'try to blow up an engine...' | 0:02:47 | 0:02:49 | |
'No-one knows who's behind the worm | 0:03:05 | 0:03:07 | |
'and the exact nature of its mission, | 0:03:07 | 0:03:08 | |
'but there are fears Iran will hold Israel or America responsible | 0:03:08 | 0:03:13 | |
'and seek retaliation.' | 0:03:13 | 0:03:14 | |
'It's not impossible that some group of hackers did it, | 0:03:14 | 0:03:17 | |
'but the security experts that are studying this | 0:03:17 | 0:03:19 | |
'really think this required the resources of a nation state.' | 0:03:19 | 0:03:22 | |
-OK? And speaking. -OK, ready. -OK, good. Here we go. | 0:03:28 | 0:03:31 | |
INTERVIEWER: What impact, ultimately, | 0:03:32 | 0:03:34 | |
did the Stuxnet attack have? Can you say? | 0:03:34 | 0:03:37 | |
Er, I don't want to get into the details. | 0:03:37 | 0:03:39 | |
Since the event has already happened, | 0:03:39 | 0:03:41 | |
why can't we talk more openly and publicly about Stuxnet? | 0:03:41 | 0:03:45 | |
Yeah. I mean, my answer's "Because it's classified." | 0:03:45 | 0:03:49 | |
I won't acknowledge... | 0:03:49 | 0:03:50 | |
Knowingly offer up anything I consider classified. | 0:03:50 | 0:03:54 | |
I know that you can't talk much about Stuxnet, | 0:03:54 | 0:03:56 | |
because Stuxnet is officially classified. | 0:03:56 | 0:03:59 | |
You're right on both those counts. | 0:03:59 | 0:04:00 | |
People might find it frustrating not to be able to talk about it | 0:04:00 | 0:04:03 | |
when it's in the public domain, but... | 0:04:03 | 0:04:06 | |
-I find it frustrating. -Yeah, I'm sure you do. | 0:04:06 | 0:04:09 | |
-I don't answer that question. -Unfortunately, I can't comment. | 0:04:09 | 0:04:12 | |
I do not know how to answer that. | 0:04:12 | 0:04:13 | |
Two answers before you even get started, I don't know and if I did, | 0:04:13 | 0:04:17 | |
-we wouldn't talk about it anyway. -How can you have a debate | 0:04:17 | 0:04:19 | |
-if everything's secret? -I think, right now, that's just where we are. | 0:04:19 | 0:04:22 | |
No-one wants to... | 0:04:22 | 0:04:23 | |
Countries aren't happy about confessing or owning up | 0:04:23 | 0:04:27 | |
to what they did, because they're not quite sure | 0:04:27 | 0:04:29 | |
where they want the system to go. | 0:04:29 | 0:04:31 | |
And so, whoever was behind Stuxnet hasn't admitted they were behind it. | 0:04:31 | 0:04:35 | |
Asking officials about Stuxnet was frustrating and surreal. | 0:04:38 | 0:04:42 | |
Like asking the Emperor about his new clothes. | 0:04:42 | 0:04:45 | |
Even after the cyber weapon had penetrated computers | 0:04:45 | 0:04:48 | |
all over the world, no-one was willing to admit that it was loose | 0:04:48 | 0:04:52 | |
or talk about the dangers it posed. | 0:04:52 | 0:04:54 | |
What was it about the Stuxnet operation | 0:04:54 | 0:04:56 | |
that was hiding in plain sight? | 0:04:56 | 0:04:59 | |
Maybe there was a way the computer code could speak for itself. | 0:05:00 | 0:05:04 | |
Stuxnet first surfaced in Belarus. | 0:05:04 | 0:05:07 | |
I started with a call to the man who discovered it | 0:05:07 | 0:05:10 | |
when his clients in Iran began to panic | 0:05:10 | 0:05:12 | |
over an epidemic of computer shutdowns. | 0:05:12 | 0:05:15 | |
Had you ever seen anything quite so sophisticated before? | 0:05:15 | 0:05:19 | |
On a daily basis, basically, | 0:06:39 | 0:06:40 | |
we are sifting through a massive haystack, | 0:06:40 | 0:06:43 | |
looking for that proverbial needle. | 0:06:43 | 0:06:47 | |
We get millions of pieces of new malicious threats | 0:06:47 | 0:06:49 | |
and there are millions of attacks going on every single day. | 0:06:49 | 0:06:52 | |
And not only are we trying to protect people and their computers, | 0:06:52 | 0:06:55 | |
and their systems, and countries' infrastructure | 0:06:55 | 0:06:58 | |
from being taken down by those attacks, | 0:06:58 | 0:07:01 | |
but, more importantly, we have to find attacks that matter. | 0:07:01 | 0:07:04 | |
When you're talking about that many, impact is extremely important. | 0:07:04 | 0:07:09 | |
20 years ago, the antivirus companies, | 0:07:20 | 0:07:21 | |
they were hunting for computer viruses | 0:07:21 | 0:07:23 | |
because there were not so many. | 0:07:23 | 0:07:25 | |
So, we had, like, tens or dozens a month | 0:07:25 | 0:07:28 | |
and they were just in little numbers. | 0:07:28 | 0:07:31 | |
Now, we collect millions of unique attacks every month. | 0:07:31 | 0:07:34 | |
This room we call a woodpeckers' room, or a virus lab, | 0:07:36 | 0:07:39 | |
and this is where virus analysts sit. | 0:07:39 | 0:07:41 | |
We call them "woodpeckers" because they are pecking the worms, | 0:07:41 | 0:07:44 | |
network worms and viruses. | 0:07:44 | 0:07:47 | |
We see, like, three different groups of actors | 0:07:47 | 0:07:50 | |
behind cyber attacks. They are traditional cybercriminals. | 0:07:50 | 0:07:53 | |
Those guys are interested only in illegal profit - | 0:07:53 | 0:07:57 | |
quick and dirty money. | 0:07:57 | 0:07:59 | |
Activists or hacktivists, they are hacking for fun, | 0:07:59 | 0:08:02 | |
or hacking to push some political message. | 0:08:02 | 0:08:04 | |
And the third group is nation states. | 0:08:04 | 0:08:07 | |
They're interested in high-quality intelligence or sabotage activity. | 0:08:07 | 0:08:11 | |
Security companies not only share information, | 0:08:12 | 0:08:15 | |
but we also share binary samples. | 0:08:15 | 0:08:16 | |
So, when this threat was found by a Belarusian security company | 0:08:16 | 0:08:20 | |
on one of their customer's machines in Iran, | 0:08:20 | 0:08:22 | |
the sample was shared amongst the security community. | 0:08:22 | 0:08:25 | |
When we try to name threats, we just try to pick some sort of string, | 0:08:25 | 0:08:28 | |
some sort of words, that are inside of the binary. | 0:08:28 | 0:08:32 | |
In this case, there was a couple of words in there. | 0:08:32 | 0:08:35 | |
We took pieces of each and that forms "Stuxnet". | 0:08:35 | 0:08:38 | |
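The naming convention Chien describes - stitching a name together from words found inside the binary - relies on pulling printable text out of raw bytes. A minimal sketch in Python, in the spirit of the Unix `strings` tool (the sample blob below is invented, not taken from the real binary; the fragments merely echo file names widely reported to have inspired the name):

```python
import re

def extract_strings(data: bytes, min_len: int = 4) -> list[str]:
    """Pull runs of printable ASCII out of a binary, like the Unix `strings` tool."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, data)]

# Invented blob with two embedded name fragments; not the real binary.
sample = b"\x00\x01.stub\x00\x02mrxnet.sys\x00\xff"
print(extract_strings(sample))  # ['.stub', 'mrxnet.sys']
```

Analysts skim lists like this for distinctive words; pieces of two such strings, combined, gave the threat its name.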
I got the news about Stuxnet from one of my engineers. | 0:08:40 | 0:08:43 | |
He came to my office, opened the door, | 0:08:43 | 0:08:46 | |
and he said, "So, Eugene, | 0:08:46 | 0:08:47 | |
"of course, you know what we're waiting for? Something really bad? | 0:08:47 | 0:08:52 | |
"It happened." | 0:08:52 | 0:08:54 | |
Give me some sense of what it was like in the lab at that time. | 0:08:59 | 0:09:02 | |
Was there a palpable sense of amazement | 0:09:02 | 0:09:04 | |
that you had something really different there? | 0:09:04 | 0:09:06 | |
Well, I wouldn't call it amazement. | 0:09:06 | 0:09:08 | |
It was kind of a shock. | 0:09:08 | 0:09:11 | |
It went beyond our worst fears, our worst nightmares. | 0:09:11 | 0:09:14 | |
And this continued the more we analysed, | 0:09:14 | 0:09:17 | |
the more we researched, | 0:09:17 | 0:09:19 | |
the more bizarre the whole story got. | 0:09:19 | 0:09:22 | |
We look at so much malware every day | 0:09:22 | 0:09:24 | |
that we can just look at the code and say, | 0:09:24 | 0:09:25 | |
"OK, there's something bad going on here | 0:09:25 | 0:09:27 | |
"and I need to investigate that." | 0:09:27 | 0:09:29 | |
That's the way it was when we looked at Stuxnet for the first time. | 0:09:29 | 0:09:32 | |
We opened it up, and there was just bad things everywhere. | 0:09:32 | 0:09:34 | |
Like, "OK, this is bad and that's bad, and, you know, | 0:09:34 | 0:09:37 | |
"we need to investigate this." | 0:09:37 | 0:09:38 | |
Suddenly, we had, like, 100 questions straightaway. | 0:09:38 | 0:09:41 | |
The most interesting thing that we do is the detective work | 0:09:43 | 0:09:45 | |
where we try to track down who's behind a threat. | 0:09:45 | 0:09:47 | |
What are they doing? What's their motivation? | 0:09:47 | 0:09:49 | |
And try to really stop it at the root. | 0:09:49 | 0:09:51 | |
It is kind of all-consuming. | 0:09:51 | 0:09:53 | |
You get this new puzzle and it's very difficult to put it down. | 0:09:53 | 0:09:56 | |
You know, work until, like, 4:00am in the morning | 0:09:56 | 0:09:58 | |
and figure these things out. | 0:09:58 | 0:10:00 | |
I was in that zone where I was very consumed by this, | 0:10:00 | 0:10:02 | |
very excited about it, | 0:10:02 | 0:10:03 | |
very interested to know what was happening. | 0:10:03 | 0:10:05 | |
And Eric was also in that same sort of zone. | 0:10:05 | 0:10:08 | |
So, the two of us were, like, back and forth all the time. | 0:10:08 | 0:10:11 | |
Liam and I continued to grind at the code. | 0:10:11 | 0:10:14 | |
Sharing pieces, comparing notes, | 0:10:14 | 0:10:16 | |
bouncing ideas off of each other. | 0:10:16 | 0:10:18 | |
We realised that we needed to do what we call "deep analysis" - | 0:10:18 | 0:10:21 | |
pick apart the threat, every single byte, every single zero-one, | 0:10:21 | 0:10:25 | |
and understand everything that was inside of it. | 0:10:25 | 0:10:28 | |
I'll just give you some context. | 0:10:28 | 0:10:29 | |
We can go through and understand every line of code | 0:10:29 | 0:10:31 | |
for the average threat in minutes. | 0:10:31 | 0:10:33 | |
And here we are one month into this threat | 0:10:33 | 0:10:35 | |
and we're just starting to discover | 0:10:35 | 0:10:37 | |
what we call the "payload", or its whole purpose. | 0:10:37 | 0:10:39 | |
When looking at the Stuxnet code, | 0:10:41 | 0:10:43 | |
it's 20 times the size of the average piece of code | 0:10:43 | 0:10:46 | |
but contains almost no bugs inside of it and that's extremely rare. | 0:10:46 | 0:10:49 | |
Malicious code always has bugs inside of it. | 0:10:49 | 0:10:52 | |
This wasn't the case with Stuxnet. | 0:10:52 | 0:10:53 | |
It's dense and every piece of code does something | 0:10:53 | 0:10:56 | |
and does something right in order to conduct its attack. | 0:10:56 | 0:10:59 | |
One of the things that surprised us was that Stuxnet utilised | 0:11:00 | 0:11:03 | |
what's called a "zero-day exploit". | 0:11:03 | 0:11:05 | |
Or, basically, a piece of code | 0:11:05 | 0:11:07 | |
that allows it to spread without you having to do anything. | 0:11:07 | 0:11:10 | |
You don't have to, for example, download a file and run it. | 0:11:10 | 0:11:13 | |
A zero-day exploit is an exploit | 0:11:13 | 0:11:15 | |
that nobody knows about except the attacker. | 0:11:15 | 0:11:17 | |
So there's no protection against it, there's been no patch released. | 0:11:17 | 0:11:20 | |
There's been zero days' protection, you know, against it. | 0:11:20 | 0:11:24 | |
That's what attackers value | 0:11:24 | 0:11:26 | |
because they know 100%, if they have this zero-day exploit, | 0:11:26 | 0:11:29 | |
they can get in wherever they want. | 0:11:29 | 0:11:31 | |
They're actually very valuable. You can sell these on the underground | 0:11:31 | 0:11:34 | |
for hundreds of thousands of dollars. | 0:11:34 | 0:11:36 | |
Then we became more worried because we discovered more zero-days. | 0:11:36 | 0:11:40 | |
And, again, these zero-days are extremely rare. | 0:11:40 | 0:11:42 | |
Inside Stuxnet we had, you know, four zero-days, | 0:11:42 | 0:11:44 | |
and for the entire rest of the year | 0:11:44 | 0:11:47 | |
we only saw 12 zero-days used. | 0:11:47 | 0:11:48 | |
It blows everything else out of the water. | 0:11:48 | 0:11:50 | |
We've never seen this before. We've never seen it since, either. | 0:11:50 | 0:11:53 | |
We've seen one in a malware you could understand, | 0:11:53 | 0:11:56 | |
because the malware authors are making money, | 0:11:56 | 0:11:57 | |
they're stealing people's credit cards. | 0:11:57 | 0:11:59 | |
They're making money, so it's worth their while to use it. | 0:11:59 | 0:12:02 | |
But seeing four zero-days could be worth 500,000 right there, | 0:12:02 | 0:12:05 | |
used in one piece of malware. | 0:12:05 | 0:12:06 | |
This is not your ordinary criminal gang who's doing this. | 0:12:06 | 0:12:09 | |
This is someone bigger. | 0:12:09 | 0:12:10 | |
It's definitely not traditional crime, not hacktivists. | 0:12:10 | 0:12:14 | |
Who else? | 0:12:14 | 0:12:17 | |
It was evident at a very early stage that, | 0:12:17 | 0:12:20 | |
just given the sophistication of this malware, | 0:12:20 | 0:12:24 | |
it suggested that there must have been a nation state involved - | 0:12:24 | 0:12:28 | |
at least one nation state involved in the development. | 0:12:28 | 0:12:31 | |
When we look at code that's coming from | 0:12:31 | 0:12:33 | |
what appears to be a state attacker, or state-sponsored attacker, | 0:12:33 | 0:12:36 | |
usually they're scrubbed clean. | 0:12:36 | 0:12:37 | |
They don't leave little bits behind. | 0:12:37 | 0:12:39 | |
They don't leave little hints behind. | 0:12:39 | 0:12:41 | |
But in Stuxnet there were actually a few hints left behind. | 0:12:41 | 0:12:44 | |
One was that in order to get lower level access to Microsoft Windows, | 0:12:46 | 0:12:50 | |
Stuxnet needed to use a digital certificate | 0:12:50 | 0:12:52 | |
which certifies that this piece of code came from a particular company. | 0:12:52 | 0:12:57 | |
Now, those attackers obviously couldn't go to Microsoft and say, | 0:12:57 | 0:13:00 | |
"Hey, test our code out for us and give us a digital certificate." | 0:13:00 | 0:13:04 | |
So they essentially stole them... | 0:13:04 | 0:13:06 | |
..from two companies in Taiwan. | 0:13:07 | 0:13:08 | |
And these two companies have nothing to do with each other except | 0:13:08 | 0:13:11 | |
for their close proximity in the exact same business park. | 0:13:11 | 0:13:14 | |
Digital certificates are guarded very, very closely, | 0:13:16 | 0:13:19 | |
behind multiple doors, | 0:13:19 | 0:13:21 | |
and they require multiple people to unlock. | 0:13:21 | 0:13:24 | |
And they need to provide both biometrics, | 0:13:24 | 0:13:26 | |
and, as well, pass phrases. | 0:13:26 | 0:13:28 | |
It wasn't like those certificates were just sitting on some machine | 0:13:28 | 0:13:31 | |
connected to the internet. Some human assets had to be involved. | 0:13:31 | 0:13:34 | |
-O'MURCHU: -Spies, like a cleaner who comes in at night | 0:13:34 | 0:13:37 | |
and has stolen these certificates from these companies. | 0:13:37 | 0:13:40 | |
It did feel like walking onto the set of this James Bond movie | 0:13:43 | 0:13:47 | |
and you've been embroiled in this thing that, | 0:13:47 | 0:13:49 | |
you know, you'd never expected. | 0:13:49 | 0:13:52 | |
We continued to search, and we continued to search in the code, | 0:13:54 | 0:13:56 | |
and, eventually, we found some other breadcrumbs left | 0:13:56 | 0:13:59 | |
that we were able to follow. | 0:13:59 | 0:14:01 | |
It was doing something with Siemens. | 0:14:01 | 0:14:03 | |
Siemens software, possibly Siemens hardware. | 0:14:03 | 0:14:05 | |
We'd never, ever seen that in any malware before, | 0:14:05 | 0:14:07 | |
something targeting Siemens. | 0:14:07 | 0:14:09 | |
We didn't even know why they would be doing that. | 0:14:09 | 0:14:11 | |
But after googling, very quickly we understood | 0:14:12 | 0:14:15 | |
it was targeting Siemens PLCs. | 0:14:15 | 0:14:17 | |
Stuxnet was targeting a very specific hardware device, | 0:14:17 | 0:14:20 | |
something called a PLC, or a programmable logic controller. | 0:14:20 | 0:14:24 | |
-LANGNER: -The PLC is kind of a very small computer | 0:14:24 | 0:14:27 | |
attached to physical equipment, | 0:14:27 | 0:14:30 | |
like pumps, like valves, like motors. | 0:14:30 | 0:14:33 | |
So, this little box is running a digital program | 0:14:33 | 0:14:38 | |
and the actions of this program | 0:14:38 | 0:14:40 | |
turns that motor on, off or sets a specific speed. | 0:14:40 | 0:14:44 | |
Those programmable logic controllers | 0:14:44 | 0:14:46 | |
control things like power plants, power grids. | 0:14:46 | 0:14:48 | |
This is used in factories, it's used in critical infrastructure. | 0:14:48 | 0:14:53 | |
Critical infrastructure, it's everywhere around us. | 0:14:53 | 0:14:56 | |
Transportation, telecommunication, financial services, health care... | 0:14:56 | 0:15:01 | |
So, the payload of Stuxnet | 0:15:01 | 0:15:02 | |
was designed to attack some very important part of our world. | 0:15:02 | 0:15:08 | |
The payload is going to be important. | 0:15:08 | 0:15:10 | |
What happens there could be very dangerous. | 0:15:10 | 0:15:12 | |
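The little control program Langner describes - a box that turns a motor on, off, or to a set speed based on its inputs - can be sketched as a toy scan cycle in Python (every tag name, threshold and speed band here is invented for illustration):

```python
# Toy PLC scan cycle: read inputs, evaluate the logic, write outputs.
# Tag names, thresholds and the safe speed band are all invented.

def scan(inputs: dict) -> dict:
    """One pass of ladder-style logic for a single pump motor."""
    outputs = {"motor_on": False, "motor_speed_hz": 0.0}
    if inputs["start_button"] and not inputs["overpressure"]:
        outputs["motor_on"] = True
        # Run at the requested setpoint, clamped to a safe band.
        outputs["motor_speed_hz"] = min(max(inputs["setpoint_hz"], 10.0), 50.0)
    return outputs

print(scan({"start_button": True, "overpressure": False, "setpoint_hz": 60.0}))
# {'motor_on': True, 'motor_speed_hz': 50.0}
```

A real PLC runs logic like this continuously, thousands of times per second, which is why rewriting that logic lets an attacker reach the physical equipment behind it.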
-LANGNER: -The next very big surprise came | 0:15:14 | 0:15:17 | |
when we infected our lab system. | 0:15:17 | 0:15:20 | |
We figured out that the malware was probing the controls. | 0:15:20 | 0:15:24 | |
It was quite picky on its target. | 0:15:24 | 0:15:27 | |
It didn't try to manipulate any given control | 0:15:27 | 0:15:30 | |
in a network that it would see. | 0:15:30 | 0:15:32 | |
It went through several checks | 0:15:32 | 0:15:34 | |
and when those checks failed, it would not implement the attack. | 0:15:34 | 0:15:39 | |
It was obviously probing for a specific target. | 0:15:41 | 0:15:44 | |
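The gating Langner describes - probe the controller, run several checks, and abort unless every one matches - can be sketched in a few lines of Python (the field names and expected values are hypothetical stand-ins, not the real fingerprint):

```python
# Illustrative target fingerprint: every field name and value is hypothetical,
# standing in for the checks Stuxnet ran against the plant configuration.
EXPECTED = {
    "plc_model": "S7-315",
    "device_count": 33,
    "magic_id": 0x9500,
}

def should_attack(observed: dict) -> bool:
    """Deploy the payload only if every probe matches the expected target."""
    return all(observed.get(key) == value for key, value in EXPECTED.items())

# A non-matching plant is left alone; only the exact target would trigger.
print(should_attack({"plc_model": "S7-417", "device_count": 10, "magic_id": 0}))       # False
print(should_attack({"plc_model": "S7-315", "device_count": 33, "magic_id": 0x9500}))  # True
```

The design choice matters: by failing closed on any mismatch, the malware could spread widely while harming only the one facility it was built for.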
You've got to put this in context that, at the time, | 0:15:46 | 0:15:48 | |
we already knew, "Well, this was the most sophisticated piece of malware | 0:15:48 | 0:15:52 | |
"that we have ever seen." | 0:15:52 | 0:15:54 | |
So, it's kind of strange. | 0:15:54 | 0:15:56 | |
Somebody takes that huge effort | 0:15:56 | 0:15:59 | |
to hit that one specific target? | 0:15:59 | 0:16:01 | |
Well, that must be quite a significant target. | 0:16:01 | 0:16:04 | |
-CHIEN: -At Symantec, we have probes on networks all over the world | 0:16:07 | 0:16:10 | |
watching for malicious activity. | 0:16:10 | 0:16:13 | |
-O'MURCHU: -We'd seen infections of Stuxnet all over the world. | 0:16:13 | 0:16:15 | |
In the US, in Australia, in the UK, | 0:16:15 | 0:16:18 | |
France, Germany, all over Europe. | 0:16:18 | 0:16:20 | |
It spread to any Windows machine in the entire world. | 0:16:20 | 0:16:23 | |
You know, we had these organisations inside the United States. | 0:16:23 | 0:16:26 | |
They were in charge of industrial control facilities saying, | 0:16:26 | 0:16:29 | |
"We're infected, what's going to happen?" | 0:16:29 | 0:16:31 | |
We didn't know if there was a deadline coming up | 0:16:31 | 0:16:33 | |
where this threat would trigger and suddenly would, like, | 0:16:33 | 0:16:36 | |
turn off all electricity plants around the world | 0:16:36 | 0:16:38 | |
or it would start shutting things down or launching some attack. | 0:16:38 | 0:16:42 | |
We knew that Stuxnet could have very dire consequences. | 0:16:42 | 0:16:46 | |
And we were very worried about what the payload contained | 0:16:46 | 0:16:49 | |
and there was an imperative speed | 0:16:49 | 0:16:52 | |
that we had to race and try and beat this ticking bomb. | 0:16:52 | 0:16:56 | |
Eventually, we were able to refine this a little bit | 0:16:56 | 0:16:58 | |
and we saw that Iran was the number one infected country in the world. | 0:16:58 | 0:17:01 | |
That immediately raised our eyebrows. | 0:17:01 | 0:17:04 | |
We have never seen a threat before | 0:17:04 | 0:17:06 | |
where it was predominantly in Iran. | 0:17:06 | 0:17:08 | |
And so we began to follow what was going on in the geopolitical world. | 0:17:08 | 0:17:12 | |
What was happening in the general news. And, at that time, | 0:17:12 | 0:17:15 | |
there were actually multiple explosions of gas pipelines | 0:17:15 | 0:17:18 | |
going in and out of Iran. | 0:17:18 | 0:17:20 | |
Unexplained explosions. | 0:17:20 | 0:17:21 | |
And, of course, we did notice that, at the time, | 0:17:23 | 0:17:26 | |
there have been assassinations of nuclear scientists, | 0:17:26 | 0:17:28 | |
so that was worrying. | 0:17:28 | 0:17:30 | |
We knew there was something bad happening. | 0:17:30 | 0:17:33 | |
Did you get concerned for yourself? | 0:17:33 | 0:17:35 | |
Did you begin start looking over your shoulder from time to time? | 0:17:35 | 0:17:38 | |
Yeah, definitely looking over my shoulder | 0:17:38 | 0:17:40 | |
and being careful about what I spoke about on the phone. | 0:17:40 | 0:17:43 | |
Um... I was... | 0:17:43 | 0:17:45 | |
pretty confident my conversations on the phone were being listened to. | 0:17:45 | 0:17:48 | |
We were only half joking, | 0:17:48 | 0:17:50 | |
when we would look at each other and tell each other things like, | 0:17:50 | 0:17:54 | |
"Look, I'm not suicidal. | 0:17:54 | 0:17:56 | |
"If I drop dead on Monday, it wasn't me." | 0:17:56 | 0:18:00 | |
We'd been publishing information about Stuxnet | 0:18:08 | 0:18:11 | |
all through that summer. | 0:18:11 | 0:18:13 | |
And then, in November, | 0:18:13 | 0:18:14 | |
the industrial control systems expert in Holland contacted us. | 0:18:14 | 0:18:18 | |
And he said, "All of these devices | 0:18:18 | 0:18:21 | |
"that would be inside of an industrial control system | 0:18:21 | 0:18:23 | |
"hold a unique identifier number | 0:18:23 | 0:18:26 | |
"that identifies the make and model of that device." | 0:18:26 | 0:18:28 | |
And we actually had a couple of these numbers in the code, | 0:18:30 | 0:18:33 | |
except we didn't know what they were. | 0:18:33 | 0:18:36 | |
And so we realised maybe what he was referring to | 0:18:36 | 0:18:38 | |
was the magic numbers we had. | 0:18:38 | 0:18:39 | |
And when we searched for those magic numbers in that context, | 0:18:39 | 0:18:42 | |
we saw that what had to be connected to this industrial control system | 0:18:42 | 0:18:45 | |
that was being targeted | 0:18:45 | 0:18:46 | |
were something called "frequency converters" | 0:18:46 | 0:18:49 | |
from two specific manufacturers. One of which was in Iran. | 0:18:49 | 0:18:52 | |
And so, at this time, we absolutely knew | 0:18:52 | 0:18:54 | |
that the facility that was being targeted had to be in Iran, | 0:18:54 | 0:18:58 | |
and it had equipment made from Iranian manufacturers. | 0:18:58 | 0:19:01 | |
When we looked up those frequency converters, | 0:19:01 | 0:19:04 | |
we immediately found out that they were actually export controlled | 0:19:04 | 0:19:07 | |
by the Nuclear Regulatory Commission. | 0:19:07 | 0:19:08 | |
And that immediately led us, then, to some nuclear facility. | 0:19:08 | 0:19:13 | |
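The breakthrough Chien describes - matching the "magic numbers" in the code against a table of device identifiers - amounts to a simple lookup. A Python sketch (the ID-to-device table echoes values reported in published analyses of Stuxnet, but treat it as illustrative rather than authoritative):

```python
# Hypothetical lookup table for device ID words found in the code. The two IDs
# echo values reported in published Stuxnet analyses; treat as illustrative.
DEVICE_IDS = {
    0x7050: "Fararo Paya frequency converter (Iran)",
    0x9500: "Vacon frequency converter (Finland)",
}

def identify(magic_numbers: list[int]) -> list[str]:
    """Resolve otherwise unexplained constants to known hardware, if any."""
    return [DEVICE_IDS[n] for n in magic_numbers if n in DEVICE_IDS]

print(identify([0x7050, 0x9500, 0x1234]))
```

Once the constants resolved to frequency converters - and those converters turned out to be export controlled - the target's nature followed almost immediately.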
This was more than a computer story, | 0:19:29 | 0:19:31 | |
so I left the world of the antivirus detectives | 0:19:31 | 0:19:34 | |
and sought out journalist David Sanger, | 0:19:34 | 0:19:36 | |
who specialised in the strange intersection of cyber, | 0:19:36 | 0:19:39 | |
nuclear weapons and espionage. | 0:19:39 | 0:19:41 | |
The emergence of the code is what put me on alert | 0:19:42 | 0:19:45 | |
that an attack was underway. | 0:19:45 | 0:19:47 | |
And because of the covert nature of the operation, | 0:19:48 | 0:19:51 | |
not only were official government spokesmen unable to talk about it, | 0:19:51 | 0:19:56 | |
they didn't even KNOW about it. | 0:19:56 | 0:19:58 | |
Eventually, the more I dug into it, | 0:19:58 | 0:20:01 | |
the more I began to find individuals | 0:20:01 | 0:20:04 | |
who had been involved in some piece of it | 0:20:04 | 0:20:07 | |
or who had witnessed some piece of it. | 0:20:07 | 0:20:10 | |
And that meant talking to Americans, | 0:20:10 | 0:20:12 | |
talking to Israelis, talking to Europeans, | 0:20:12 | 0:20:15 | |
because this was, obviously, the first, biggest | 0:20:15 | 0:20:19 | |
and most sophisticated example | 0:20:19 | 0:20:22 | |
of a state or two states using a cyber weapon for offensive purposes. | 0:20:22 | 0:20:26 | |
I came to this with a fair bit of history - | 0:20:29 | 0:20:32 | |
understanding the Iranian nuclear programme. | 0:20:32 | 0:20:36 | |
How did Iran get its first nuclear reactor? | 0:20:36 | 0:20:40 | |
We gave it to them under the Shah, | 0:20:40 | 0:20:43 | |
because the Shah was considered an American ally. | 0:20:43 | 0:20:47 | |
APPLAUSE | 0:20:47 | 0:20:49 | |
-SAMORE: -But the revolution which overthrew the Shah in '79 | 0:20:49 | 0:20:53 | |
really curtailed the programme | 0:20:53 | 0:20:55 | |
before it ever got any head of steam going. | 0:20:55 | 0:20:58 | |
Part of our policy against Iran after the revolution | 0:20:58 | 0:21:02 | |
was to deny them nuclear technology, | 0:21:02 | 0:21:05 | |
so most of the period, when I was involved, in the '80s and the '90s, | 0:21:05 | 0:21:10 | |
was the US running around the world | 0:21:10 | 0:21:13 | |
and persuading potential nuclear suppliers | 0:21:13 | 0:21:16 | |
not to provide even peaceful nuclear technology to Iran. | 0:21:16 | 0:21:19 | |
And what we missed was the clandestine transfer | 0:21:19 | 0:21:22 | |
in the mid-1980s from Pakistan to Iran. | 0:21:22 | 0:21:26 | |
-MOWATT-LARSSEN: -Abdul Qadeer Khan is what we would call | 0:21:29 | 0:21:32 | |
the father of the Pakistan nuclear programme. | 0:21:32 | 0:21:35 | |
He had the full authority and confidence | 0:21:35 | 0:21:37 | |
of the Pakistan Government from its inception | 0:21:37 | 0:21:39 | |
to the production of nuclear weapons. | 0:21:39 | 0:21:42 | |
The AQ Khan network is so notable | 0:21:44 | 0:21:46 | |
because, aside from | 0:21:46 | 0:21:48 | |
building the Pakistani programme | 0:21:48 | 0:21:50 | |
for decades, | 0:21:50 | 0:21:52 | |
it also was the means by which other countries | 0:21:52 | 0:21:56 | |
were able to develop nuclear weapons - including Iran. | 0:21:56 | 0:21:59 | |
-SAMORE: -By 2006, the Iranians had started producing | 0:21:59 | 0:22:02 | |
low-enriched uranium, producing more centrifuges, installing them | 0:22:02 | 0:22:06 | |
at the large-scale underground enrichment facility at Natanz. | 0:22:06 | 0:22:09 | |
How many times have you visited Natanz? | 0:23:00 | 0:23:02 | |
Not that many, because I left a few years ago already, IAEA, | 0:23:02 | 0:23:05 | |
but I was there quite a few times. | 0:23:05 | 0:23:08 | |
Natanz is in the middle of the desert. | 0:23:11 | 0:23:14 | |
When they were building it in secret, | 0:23:16 | 0:23:18 | |
they were calling it a "desert irrigation facility". | 0:23:18 | 0:23:22 | |
There is a lot of artillery and air force. | 0:23:24 | 0:23:27 | |
It's better protected against attack from the air | 0:23:27 | 0:23:31 | |
than any other nuclear installation I have seen. | 0:23:31 | 0:23:34 | |
And so, all the monitoring activities of the IAEA, | 0:23:38 | 0:23:41 | |
they follow a basic principle - you want to see what goes in, what goes out, | 0:23:41 | 0:23:44 | |
and then, on top of that, | 0:23:44 | 0:23:46 | |
you make sure that it produces low-enriched uranium. | 0:23:46 | 0:23:49 | |
And that it's nothing to do with the higher enrichments | 0:23:49 | 0:23:52 | |
and nuclear-weapon-grade uranium. | 0:23:52 | 0:23:54 | |
Iran's nuclear facilities are under 24-hour watch | 0:24:00 | 0:24:03 | |
of the United Nations nuclear watchdog, the IAEA, | 0:24:03 | 0:24:07 | |
the International Atomic Energy Agency. | 0:24:07 | 0:24:10 | |
Every single gram of Iranian fissile material... | 0:24:10 | 0:24:14 | |
..is accounted for. | 0:24:16 | 0:24:17 | |
-HEINONEN: -When you look at the uranium which was there in Natanz, | 0:24:20 | 0:24:24 | |
it was a very special uranium. | 0:24:24 | 0:24:26 | |
This was called isotope 236. | 0:24:26 | 0:24:29 | |
And that was a puzzle to us, | 0:24:29 | 0:24:31 | |
because you only see this sort of uranium | 0:24:31 | 0:24:34 | |
in states which have nuclear weapons. | 0:24:34 | 0:24:37 | |
We realised that they had cheated us. | 0:24:38 | 0:24:41 | |
This sort of equipment has been bought from | 0:24:41 | 0:24:44 | |
what they call a black market. | 0:24:44 | 0:24:47 | |
They never point it out to... | 0:24:47 | 0:24:48 | |
They were caught at that point in time. | 0:24:48 | 0:24:51 | |
What surprised me was the sophistication | 0:24:51 | 0:24:55 | |
and the quality control. | 0:24:55 | 0:24:56 | |
The way they have the manufacturing, it was really professional. | 0:24:56 | 0:25:00 | |
It was not something, you know, | 0:25:00 | 0:25:01 | |
you just create in a few months' time. | 0:25:01 | 0:25:04 | |
This was the result of a long process. | 0:25:04 | 0:25:07 | |
The centrifuge. You feed uranium gas | 0:25:13 | 0:25:16 | |
in and you have a cascade, thousands of centrifuges, | 0:25:16 | 0:25:19 | |
and from the other end, you get enriched uranium out. | 0:25:19 | 0:25:23 | |
It separates uranium based on spinning the rotor, | 0:25:23 | 0:25:26 | |
it spins so fast. | 0:25:26 | 0:25:28 | |
300 metres per second. | 0:25:28 | 0:25:30 | |
The same as the velocity of sound. | 0:25:30 | 0:25:33 | |
These are tremendous forces and, as a result, | 0:25:34 | 0:25:37 | |
the rotor, it twists and looks like a banana at one point of time. | 0:25:37 | 0:25:41 | |
So, it has to be in balance, | 0:25:41 | 0:25:44 | |
because any small vibration, it would blow up. | 0:25:44 | 0:25:47 | |
This is what makes them very difficult to manufacture. | 0:25:47 | 0:25:49 | |
You can model it, you can calculate it, but at the very end, | 0:25:49 | 0:25:53 | |
it's actually based on practice and experience, | 0:25:53 | 0:25:57 | |
so it's a piece of art, so to say. | 0:25:57 | 0:26:00 | |
'Ahmadinejad came into his presidency saying that,' | 0:26:14 | 0:26:18 | |
"If the international community wants to derail us, | 0:26:18 | 0:26:20 | |
"we will stand up to it. | 0:26:20 | 0:26:22 | |
"If they want us to sign more inspections | 0:26:22 | 0:26:25 | |
"and more additional protocols and other measures, no, we will not. | 0:26:25 | 0:26:29 | |
"We will fight for our right. | 0:26:29 | 0:26:31 | |
"Iran is a signatory to the nuclear Non-Proliferation Treaty. | 0:26:31 | 0:26:35 | |
"And under that treaty, Iran has the right to a nuclear programme. | 0:26:35 | 0:26:39 | |
"We can have enrichment. | 0:26:39 | 0:26:40 | |
"Who are you, world powers, | 0:26:40 | 0:26:42 | |
"to come and tell us that we cannot have enrichment?" | 0:26:42 | 0:26:45 | |
This was his mantra. | 0:26:45 | 0:26:47 | |
And it galvanised the public. | 0:26:47 | 0:26:51 | |
By 2007, 2008, | 0:26:54 | 0:26:56 | |
the US Government was in a very bad place with the Iranian programme. | 0:26:56 | 0:27:00 | |
President Bush recognised | 0:27:01 | 0:27:03 | |
that he could not even come out in public and declare | 0:27:03 | 0:27:05 | |
that the Iranians were building a nuclear weapon | 0:27:05 | 0:27:07 | |
because, by this time, | 0:27:07 | 0:27:09 | |
he had gone through the entire WMD fiasco in Iraq. | 0:27:09 | 0:27:13 | |
He could not really take military action. | 0:27:13 | 0:27:16 | |
Condoleezza Rice said to him at one point, | 0:27:16 | 0:27:18 | |
"You know, Mr President, | 0:27:18 | 0:27:19 | |
"I think you've invaded your last Muslim country, | 0:27:19 | 0:27:22 | |
"even for the best of reasons." | 0:27:22 | 0:27:25 | |
He didn't want to let the Israelis conduct the military operation. | 0:27:26 | 0:27:31 | |
'It's 1938,' | 0:27:31 | 0:27:33 | |
and Iran is Germany and it's racing | 0:27:33 | 0:27:37 | |
to arm itself with atomic bombs. | 0:27:37 | 0:27:40 | |
Iran's nuclear ambitions must be stopped and have to be stopped. | 0:27:40 | 0:27:46 | |
We all have to stop it now. | 0:27:46 | 0:27:48 | |
That's the one message I have for you today. | 0:27:48 | 0:27:51 | |
-Thank you. -APPLAUSE | 0:27:51 | 0:27:53 | |
Israel was saying they were going to bomb Iran. | 0:27:53 | 0:27:56 | |
And the government here in Washington | 0:27:56 | 0:27:58 | |
did all sorts of scenarios about what would happen | 0:27:58 | 0:28:01 | |
if that Israeli attack occurred. | 0:28:01 | 0:28:04 | |
They were all very ugly scenarios. | 0:28:04 | 0:28:06 | |
Our belief was that, if they went on their own, | 0:28:06 | 0:28:09 | |
knowing their limitations... | 0:28:09 | 0:28:10 | |
They have a very good air force, all right, | 0:28:10 | 0:28:12 | |
but it's small and the distances are great | 0:28:12 | 0:28:15 | |
and the targets dispersed and hardened. | 0:28:15 | 0:28:17 | |
If they would have attempted a raid on a military plane, | 0:28:17 | 0:28:23 | |
we would have been assuming that they were assuming | 0:28:23 | 0:28:26 | |
we would finish that which they started. | 0:28:26 | 0:28:28 | |
In other words, there would be many of us in government | 0:28:28 | 0:28:31 | |
thinking that the purpose of the raid | 0:28:31 | 0:28:33 | |
wasn't to destroy the Iranian nuclear system, | 0:28:33 | 0:28:35 | |
but the purpose of the raid was to put us at war with Iran. | 0:28:35 | 0:28:38 | |
The two countries agreed on the goal. | 0:28:40 | 0:28:43 | |
There is no gap between us | 0:28:43 | 0:28:46 | |
that Iran should not have a nuclear military capability. | 0:28:46 | 0:28:51 | |
There are some differences on how to achieve it | 0:28:51 | 0:28:56 | |
and when action is needed. | 0:28:56 | 0:28:58 | |
We are taking very seriously leaders of countries | 0:29:07 | 0:29:10 | |
who call for the destruction and annihilation of our people. | 0:29:10 | 0:29:14 | |
-SAMORE: -The Israelis believe that the Iranian leadership | 0:29:14 | 0:29:18 | |
has already made the decision to build nuclear weapons | 0:29:18 | 0:29:21 | |
when they think they can get away with it. The view in the US | 0:29:21 | 0:29:24 | |
is that the Iranians haven't made that final decision yet. | 0:29:24 | 0:29:29 | |
To me, that doesn't make any difference. | 0:29:29 | 0:29:31 | |
I mean, it really doesn't make any difference, | 0:29:31 | 0:29:33 | |
and it's probably unknowable. | 0:29:33 | 0:29:35 | |
Unless you can put Supreme Leader Khamenei on the couch | 0:29:35 | 0:29:38 | |
and interview him, I think, from our standpoint, | 0:29:38 | 0:29:41 | |
stopping Iran from getting the threshold capacity | 0:29:41 | 0:29:45 | |
is the primary policy objective. | 0:29:45 | 0:29:48 | |
Once they had the fissile material, | 0:29:49 | 0:29:50 | |
once they had the capacity to produce nuclear weapons, | 0:29:50 | 0:29:53 | |
then the game is lost. | 0:29:53 | 0:29:55 | |
-HAYDEN: -President Bush once said to me, he said, | 0:30:00 | 0:30:02 | |
"Mike, I don't want any president ever to be faced | 0:30:02 | 0:30:05 | |
"with only two options - bombing or the bomb." Right? | 0:30:05 | 0:30:09 | |
He wanted options that... | 0:30:09 | 0:30:11 | |
made it... | 0:30:11 | 0:30:14 | |
made it far less likely he or his successor, or successors, | 0:30:14 | 0:30:18 | |
would ever get to that point where that's all you've got. | 0:30:18 | 0:30:20 | |
The intelligence cooperation between Israel and the United States | 0:30:20 | 0:30:24 | |
is very, very good. | 0:30:24 | 0:30:26 | |
And, therefore, the Israelis went to the Americans and said, | 0:30:26 | 0:30:29 | |
"OK, guys, you don't want us to bomb Iran. | 0:30:29 | 0:30:32 | |
"OK, let's do it differently." | 0:30:32 | 0:30:36 | |
One day a group of intelligence and military officials showed up | 0:30:36 | 0:30:41 | |
in President Bush's office and said, "Sir, we have an idea. | 0:30:41 | 0:30:46 | |
"It's a big risk, it might not work, but here it is." | 0:30:46 | 0:30:50 | |
-LANGNER: -Moving forward in my analysis of the code, | 0:30:57 | 0:31:00 | |
I took a closer look at the photographs | 0:31:00 | 0:31:03 | |
that had been published by the Iranians themselves | 0:31:03 | 0:31:08 | |
in a press tour from 2008 | 0:31:08 | 0:31:11 | |
of Ahmadinejad and the shiny centrifuges. | 0:31:11 | 0:31:14 | |
The photographs of Ahmadinejad going through the centrifuges at Natanz | 0:31:15 | 0:31:20 | |
provided some very important clues. | 0:31:20 | 0:31:24 | |
There was a huge amount to be learned. | 0:31:24 | 0:31:26 | |
First of all, those photographs | 0:31:34 | 0:31:36 | |
showed many of the individuals | 0:31:36 | 0:31:38 | |
who were guiding Ahmadinejad through the programme. | 0:31:38 | 0:31:41 | |
And there's one very famous photograph | 0:31:41 | 0:31:43 | |
that shows Ahmadinejad being shown something. | 0:31:43 | 0:31:46 | |
You see his face, you can't see what's on the computer. | 0:31:46 | 0:31:48 | |
And one of the scientists who was behind him | 0:31:48 | 0:31:52 | |
was assassinated a few months later. | 0:31:52 | 0:31:55 | |
In one of those photographs, | 0:31:58 | 0:32:00 | |
you could see parts of a computer screen. | 0:32:00 | 0:32:03 | |
We refer to that as a "stata screen". | 0:32:03 | 0:32:06 | |
The stata system is basically a piece of software | 0:32:06 | 0:32:08 | |
running on a computer. | 0:32:08 | 0:32:10 | |
It enables the operators to monitor the process. | 0:32:10 | 0:32:13 | |
What you could see... | 0:32:14 | 0:32:16 | |
when you look close enough | 0:32:16 | 0:32:19 | |
was a more detailed view of the configuration. | 0:32:19 | 0:32:23 | |
There were these six groups of centrifuges | 0:32:23 | 0:32:27 | |
and each group had 164 entries. | 0:32:27 | 0:32:29 | |
And guess what? | 0:32:31 | 0:32:32 | |
That was a perfect match to what we saw in the attack code. | 0:32:32 | 0:32:36 | |
It was absolutely clear that this piece of code | 0:32:37 | 0:32:40 | |
was attacking an array of six different groups of, | 0:32:40 | 0:32:44 | |
let's just say "thingies", physical objects, | 0:32:44 | 0:32:48 | |
and in those six groups, | 0:32:48 | 0:32:50 | |
there were 164 elements. | 0:32:50 | 0:32:53 | |
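Langner's "perfect match" can be sketched in a few lines. This is purely illustrative, not the actual attack code: only the layout numbers (six cascade groups of 164 centrifuges) come from the film, and all names and structure here are invented.

```python
# Illustrative only, not Stuxnet's actual code. Only the layout numbers
# (6 cascade groups of 164 centrifuges) come from the film; the names
# and structure here are invented.
CASCADE_GROUPS = 6
UNITS_PER_GROUP = 164

def matches_target_layout(groups):
    """True if an observed plant layout matches the attack code's array."""
    return (len(groups) == CASCADE_GROUPS
            and all(len(g) == UNITS_PER_GROUP for g in groups))

# The monitoring screen in the Natanz photographs showed exactly this shape:
observed = [["centrifuge"] * UNITS_PER_GROUP for _ in range(CASCADE_GROUPS)]
print(matches_target_layout(observed))  # True
```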
Were you able to do any actual physical tests? | 0:32:57 | 0:32:59 | |
Or was it all just code analysis? | 0:32:59 | 0:33:01 | |
So, we couldn't set up our own nuclear enrichment facility. | 0:33:01 | 0:33:06 | |
So, what we did was obtain some PLCs, the exact models. | 0:33:06 | 0:33:09 | |
We then ordered an air pump. | 0:33:16 | 0:33:18 | |
And that's what we used sort of as our proof of concept. | 0:33:18 | 0:33:21 | |
-O'MURCHU: -We needed a visual demonstration | 0:33:21 | 0:33:23 | |
to show people what we discovered. | 0:33:23 | 0:33:25 | |
So, we thought of different things that we could do | 0:33:25 | 0:33:27 | |
and we settled on blowing up a balloon. | 0:33:27 | 0:33:29 | |
We were able to write a program that would inflate a balloon | 0:33:33 | 0:33:36 | |
and it was set to stop after five seconds. | 0:33:36 | 0:33:38 | |
So, we would inflate the balloon to a certain size, | 0:33:47 | 0:33:50 | |
but we wouldn't burst the balloon, and it was all safe. | 0:33:50 | 0:33:52 | |
And we showed everybody, "This is the code that's on the PLC." | 0:33:52 | 0:33:55 | |
And the timer says, "Stop after five seconds". | 0:33:55 | 0:33:58 | |
We know that's what's going to happen. | 0:33:58 | 0:34:00 | |
And then we would infect the computer with Stuxnet | 0:34:00 | 0:34:03 | |
and we would run the test again. | 0:34:03 | 0:34:05 | |
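Symantec's balloon demonstration can be mimicked with a toy simulation. This sketch is an invented stand-in for the real PLC logic, with made-up numbers: the clean controller honours its five-second timer, while the "infected" one ignores it and keeps pumping.

```python
# Toy simulation of the balloon demo described above, not real PLC code.
# All values are invented. The clean logic stops the pump after 5 seconds;
# the "infected" logic ignores the timer, so the balloon eventually bursts.
BURST_PRESSURE = 10.0  # hypothetical units

def run_pump(seconds, infected=False, stop_after=5):
    pressure = 0.0
    for t in range(seconds):
        if not infected and t >= stop_after:
            break                    # clean PLC honours the timer
        pressure += 1.0              # infected PLC keeps pumping
        if pressure >= BURST_PRESSURE:
            return "burst"
    return f"stopped at {pressure:.0f}"

print(run_pump(20))                  # clean run stops safely
print(run_pump(20, infected=True))   # infected run bursts the balloon
```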
Here is a piece of software that should only exist in the cyber realm | 0:34:35 | 0:34:39 | |
and it is able to infect physical equipment in a plant or factory | 0:34:39 | 0:34:44 | |
and cause physical damage. | 0:34:44 | 0:34:46 | |
Real-world physical destruction. | 0:34:46 | 0:34:48 | |
At that time, things became very scary to us. | 0:34:51 | 0:34:53 | |
Here you had malware potentially killing people | 0:34:53 | 0:34:56 | |
and that was something that was always Hollywood-esque to us, | 0:34:56 | 0:34:59 | |
that we would always laugh at, when people make that kind of assertion. | 0:34:59 | 0:35:02 | |
At this point, you had to have started developing theories | 0:35:06 | 0:35:10 | |
as to who had built Stuxnet. | 0:35:10 | 0:35:14 | |
It wasn't lost on us | 0:35:14 | 0:35:15 | |
that there were probably only a few countries in the world | 0:35:15 | 0:35:20 | |
that would want and have the motivation | 0:35:20 | 0:35:22 | |
to sabotage the Iranians' nuclear enrichment facility. | 0:35:22 | 0:35:25 | |
The US Government would be up there. | 0:35:25 | 0:35:27 | |
The Israeli government, certainly, would be up there. | 0:35:27 | 0:35:29 | |
You know, maybe UK, France, Germany, those sorts of countries, | 0:35:29 | 0:35:32 | |
but we never found any information | 0:35:32 | 0:35:35 | |
that would tie it back 100% to those countries. | 0:35:35 | 0:35:38 | |
There are no telltale signs. | 0:35:38 | 0:35:40 | |
The attackers don't leave a message inside saying, | 0:35:40 | 0:35:42 | |
you know, "It was me!" | 0:35:42 | 0:35:44 | |
And even if they did, | 0:35:44 | 0:35:46 | |
all of that stuff can be faked. | 0:35:46 | 0:35:48 | |
So, it's very, very difficult | 0:35:48 | 0:35:50 | |
to do attribution when looking at computer code. | 0:35:50 | 0:35:53 | |
Subsequent work that's been done | 0:35:53 | 0:35:55 | |
leads us to believe that this was the work of a collaboration | 0:35:55 | 0:35:58 | |
between Israel and the United States. | 0:35:58 | 0:36:00 | |
Did you have any evidence | 0:36:00 | 0:36:01 | |
in terms of your analysis that would lead you | 0:36:01 | 0:36:03 | |
to believe that that's correct, also? | 0:36:03 | 0:36:05 | |
Nothing that I could talk about on camera. | 0:36:05 | 0:36:08 | |
-INTERVIEWER CHUCKLES Can I ask why? -No. | 0:36:09 | 0:36:12 | |
Well, you can, but I won't answer. | 0:36:13 | 0:36:15 | |
BOTH LAUGH | 0:36:15 | 0:36:18 | |
But even in the case of nation states, one of the concerns... | 0:36:18 | 0:36:20 | |
'This was beginning to really piss me off. | 0:36:20 | 0:36:23 | |
'Even civilians with an interest in telling the Stuxnet story | 0:36:23 | 0:36:26 | |
'were refusing to address the role of Tel Aviv and Washington. | 0:36:26 | 0:36:31 | |
'But, luckily for me, | 0:36:31 | 0:36:32 | |
'whilst DC is a city of secrets, | 0:36:32 | 0:36:34 | |
'it is also a city of leaks. | 0:36:34 | 0:36:37 | |
'They're as regular as a heartbeat and just as hard to stop. | 0:36:37 | 0:36:41 | |
'That's what I was counting on.' | 0:36:41 | 0:36:43 | |
'Finally, after speaking to a number of people on background, | 0:36:47 | 0:36:51 | |
'I did find a way of confirming, on the record, | 0:36:51 | 0:36:53 | |
'the American role in Stuxnet. | 0:36:53 | 0:36:55 | |
'In exchange for details of the operation, | 0:36:55 | 0:36:58 | |
'I had to agree to find a way | 0:36:58 | 0:37:00 | |
'to disguise the source of the information.' | 0:37:00 | 0:37:03 | |
-We're good? -We're on. | 0:37:03 | 0:37:05 | |
So, the first question I have to ask you is about secrecy. | 0:37:05 | 0:37:09 | |
I mean, at this point, everyone knows about Stuxnet. | 0:37:09 | 0:37:12 | |
Why can't we talk about it? | 0:37:12 | 0:37:14 | |
-DISTORTED WOMAN'S VOICE: -It's a covert operation. | 0:37:14 | 0:37:16 | |
Not any more. We know what happened, we know who did it. | 0:37:16 | 0:37:19 | |
Well, maybe you don't know as much as we think you know. | 0:37:19 | 0:37:22 | |
I'm talking to you because I want to get the story right. | 0:37:24 | 0:37:26 | |
That's the same reason I'm talking to you. | 0:37:26 | 0:37:28 | |
Even though it's a covert operation? | 0:37:31 | 0:37:32 | |
Well, this is not a Snowden kind of thing. | 0:37:34 | 0:37:37 | |
OK? I think what he did was wrong. He went too far. | 0:37:37 | 0:37:40 | |
He gave away too much. | 0:37:40 | 0:37:42 | |
Unlike Snowden, who was a contractor, I was in the NSA. | 0:37:42 | 0:37:46 | |
I believe in the agency, so what I'm willing to give you will be limited, | 0:37:46 | 0:37:49 | |
but we're talking because everyone's getting the story wrong | 0:37:49 | 0:37:52 | |
and we have to get it right. We have to understand these new weapons. | 0:37:52 | 0:37:55 | |
-The stakes are too high. -What do you mean? | 0:37:55 | 0:37:57 | |
We did Stuxnet. | 0:37:59 | 0:38:02 | |
It's a fact. | 0:38:02 | 0:38:04 | |
You know, we came so fucking close to disaster, | 0:38:04 | 0:38:07 | |
and we're still on the edge. | 0:38:07 | 0:38:10 | |
It was a huge multinational inter-agency operation. | 0:38:10 | 0:38:15 | |
In the US, it was CIA, | 0:38:16 | 0:38:19 | |
NSA, and the military, Cyber Command. | 0:38:19 | 0:38:23 | |
From Britain, we used Iran intel out of GCHQ. | 0:38:23 | 0:38:27 | |
But the main partner was Israel. | 0:38:27 | 0:38:29 | |
Over there, Mossad ran the show | 0:38:29 | 0:38:31 | |
and the technical work was done by Unit 8200. | 0:38:31 | 0:38:34 | |
Israel is really the key to the story. | 0:38:34 | 0:38:37 | |
Our traffic in Israel is so unpredictable... | 0:38:41 | 0:38:44 | |
Yossi, how did you get into this Stuxnet story? | 0:38:46 | 0:38:50 | |
I have been covering the Israeli intelligence, in general, | 0:38:50 | 0:38:54 | |
and the Mossad in particular | 0:38:54 | 0:38:56 | |
for nearly 30 years. | 0:38:56 | 0:38:59 | |
I knew that Israel is trying to slow down Iran's nuclear programme | 0:38:59 | 0:39:04 | |
and, therefore, I came to the conclusion | 0:39:04 | 0:39:06 | |
that if there was a virus affecting Iran's computers, | 0:39:06 | 0:39:10 | |
it's one more element in this larger picture. | 0:39:10 | 0:39:15 | |
Amos Yadlin, General Yadlin, | 0:39:16 | 0:39:19 | |
he was the head of the military intelligence. | 0:39:19 | 0:39:22 | |
The biggest unit within that organisation is Unit 8200. | 0:39:22 | 0:39:27 | |
They bug telephones, they bug faxes, they break into computers. | 0:39:27 | 0:39:32 | |
A decade ago, when Yadlin became the Chief Of Military Intelligence, | 0:39:34 | 0:39:38 | |
there was no cyber warfare unit in 8200. | 0:39:38 | 0:39:43 | |
So, they started recruiting very talented people, hackers, | 0:39:46 | 0:39:50 | |
either from the military or outside the military | 0:39:50 | 0:39:53 | |
that can contribute to the project of building a cyber warfare unit. | 0:39:53 | 0:39:57 | |
It's another kind of weapon, and it has unlimited range, | 0:39:59 | 0:40:02 | |
very high speed and a very low signature. | 0:40:02 | 0:40:07 | |
So this gives you a huge opportunity, | 0:40:07 | 0:40:10 | |
and the superpowers have to change the way we think about warfare. | 0:40:10 | 0:40:15 | |
Finally, we are transforming our military for a new kind of war | 0:40:17 | 0:40:20 | |
that we're fighting now... | 0:40:20 | 0:40:22 | |
..and for wars of tomorrow. | 0:40:23 | 0:40:25 | |
-SANGER: -Back in the end of the Bush administration, | 0:40:28 | 0:40:31 | |
people in the US Government | 0:40:31 | 0:40:33 | |
were just beginning to convince President Bush to pour money | 0:40:33 | 0:40:36 | |
into offensive cyber weapons. | 0:40:36 | 0:40:39 | |
Stuxnet started off in the Defense Department. | 0:40:39 | 0:40:43 | |
Then Robert Gates, the Secretary of Defense, | 0:40:43 | 0:40:45 | |
reviewed this program and he said, | 0:40:45 | 0:40:47 | |
"This program shouldn't be in the Defense Department. | 0:40:47 | 0:40:50 | |
"This should be under the covert authorities | 0:40:50 | 0:40:52 | |
"over in the intelligence world." | 0:40:52 | 0:40:55 | |
So, the CIA was very deeply involved in this operation, | 0:40:55 | 0:41:00 | |
while much of the coding work | 0:41:00 | 0:41:01 | |
was done by the National Security Agency and Unit 8200 - | 0:41:01 | 0:41:06 | |
its Israeli equivalent - working together | 0:41:06 | 0:41:09 | |
with a newly created military position called US Cyber Command. | 0:41:09 | 0:41:14 | |
And, interestingly, the Director of the National Security Agency | 0:41:14 | 0:41:19 | |
would also have a second role | 0:41:19 | 0:41:21 | |
as the Commander of US Cyber Command. | 0:41:21 | 0:41:25 | |
And US Cyber Command is located at Fort Meade, | 0:41:25 | 0:41:30 | |
in the same building as the NSA. | 0:41:30 | 0:41:33 | |
-HAYDEN: -NSA has no legal authority to attack. | 0:41:33 | 0:41:35 | |
It's never had it, I doubt that it ever will. | 0:41:35 | 0:41:38 | |
It might explain why US Cyber Command is sitting out of Fort Meade | 0:41:38 | 0:41:41 | |
on top of the National Security Agency. | 0:41:41 | 0:41:43 | |
Because NSA has the abilities to do these things. | 0:41:43 | 0:41:46 | |
Cyber Command has the AUTHORITY to do these things, | 0:41:46 | 0:41:49 | |
and "these things" here refer to the cyber attack. | 0:41:49 | 0:41:52 | |
This is a huge change for the nature of the intelligence agencies. | 0:41:52 | 0:41:58 | |
The NSA is supposed to be a code-making | 0:41:58 | 0:42:01 | |
and code-breaking operation, | 0:42:01 | 0:42:04 | |
to monitor the communications of foreign powers | 0:42:04 | 0:42:07 | |
and American adversaries | 0:42:07 | 0:42:09 | |
in the defence of the United States. | 0:42:09 | 0:42:11 | |
But creating a Cyber Command | 0:42:11 | 0:42:14 | |
meant using the same technology to do offensive work. | 0:42:14 | 0:42:18 | |
Once you get inside an adversary's computer networks, | 0:42:20 | 0:42:24 | |
you put an implant in that network, | 0:42:24 | 0:42:27 | |
and we have tens of thousands of foreign computers and networks | 0:42:27 | 0:42:30 | |
that the United States has put implants in. | 0:42:30 | 0:42:33 | |
You can use it to monitor what's going across that network | 0:42:33 | 0:42:36 | |
and you can use it to insert cyber weapons, malware. | 0:42:36 | 0:42:41 | |
If you can spy on a network, you can manipulate it. | 0:42:41 | 0:42:45 | |
It's already included. The only thing you need is an act of will. | 0:42:45 | 0:42:50 | |
-DISTORTED FEMALE VOICE: -I played a role in Iraq. | 0:42:53 | 0:42:56 | |
I can't tell you whether it was military or not, | 0:42:56 | 0:42:58 | |
but I can tell you NSA had combat support teams in the country | 0:42:58 | 0:43:02 | |
and, for the first time, | 0:43:02 | 0:43:04 | |
units in the field had direct access to NSA intel. | 0:43:04 | 0:43:07 | |
Over time, we thought more about offence than defence. | 0:43:10 | 0:43:13 | |
More about attacking than intelligence. | 0:43:13 | 0:43:16 | |
In the old days, units would try to track radios, | 0:43:16 | 0:43:19 | |
but through NSA in Iraq, | 0:43:19 | 0:43:21 | |
we had access to all the networks going in and out of the country. | 0:43:21 | 0:43:25 | |
We hoovered up every text message, e-mail and phone call. | 0:43:25 | 0:43:29 | |
The complete surveillance state. | 0:43:29 | 0:43:31 | |
We could find the bad guys. | 0:43:31 | 0:43:33 | |
Say, a gang making IEDs - | 0:43:33 | 0:43:36 | |
map their networks and follow them in real-time. | 0:43:36 | 0:43:40 | |
We could lock into cellphones, | 0:43:40 | 0:43:41 | |
even when they were off, send a fake text message from a friend, | 0:43:41 | 0:43:45 | |
suggest a meeting place and then capture... | 0:43:45 | 0:43:48 | |
-SOLDIER: -'You're clear to fire.' | 0:43:48 | 0:43:50 | |
..or kill. | 0:43:50 | 0:43:51 | |
I was in TAO S321, the ROC. | 0:43:53 | 0:43:57 | |
OK, the TAO? The ROC? | 0:43:57 | 0:44:00 | |
Right, sorry, TAO is Tailored Access Operations. | 0:44:00 | 0:44:03 | |
It's where NSA's hackers work. Of course, we didn't call them that. | 0:44:03 | 0:44:06 | |
What did you call them? | 0:44:06 | 0:44:08 | |
On-net operators. They're the only people at NSA | 0:44:08 | 0:44:11 | |
allowed to break in or attack on the internet. | 0:44:11 | 0:44:14 | |
Inside TAO headquarters is the ROC - "Remote Operations Center". | 0:44:14 | 0:44:18 | |
If the US Government wants to get in somewhere, | 0:44:18 | 0:44:23 | |
it goes to the ROC. | 0:44:23 | 0:44:25 | |
I mean, we were flooded with requests. | 0:44:25 | 0:44:27 | |
So many that we could only do about 30% of the missions | 0:44:27 | 0:44:32 | |
that were requested of us at any one time. | 0:44:32 | 0:44:33 | |
Through the web, but also by hijacking shipments of parts. | 0:44:33 | 0:44:38 | |
You know, sometimes the CIA | 0:44:38 | 0:44:40 | |
would assist in putting implants in machines. | 0:44:40 | 0:44:44 | |
So, once inside a target network, | 0:44:44 | 0:44:47 | |
we could just...watch... | 0:44:47 | 0:44:51 | |
..or we could attack. | 0:44:52 | 0:44:54 | |
Inside NSA was a strange kind of culture - | 0:44:58 | 0:45:01 | |
like two parts macho military | 0:45:01 | 0:45:03 | |
and two parts cyber geek. | 0:45:03 | 0:45:06 | |
I mean, I came from Iraq, so I was used to, "Yes, sir!" "No, sir!" | 0:45:06 | 0:45:09 | |
but for the weapons programmers, | 0:45:09 | 0:45:11 | |
we needed more "think outside the box" types. | 0:45:11 | 0:45:14 | |
Were they all working on Stuxnet? | 0:45:14 | 0:45:17 | |
We never called it Stuxnet. | 0:45:17 | 0:45:19 | |
That was the name invented by the anti-virus guys. | 0:45:19 | 0:45:22 | |
When it hit the papers - we're not allowed | 0:45:22 | 0:45:24 | |
to read about classified operations even if it's in the New York Times - | 0:45:24 | 0:45:27 | |
we went out of our way to avoid the term. | 0:45:27 | 0:45:29 | |
I mean, saying "Stuxnet" out loud | 0:45:29 | 0:45:30 | |
was like saying "Voldemort" in Harry Potter - | 0:45:30 | 0:45:32 | |
the Name That Shall Not Be Spoken. | 0:45:32 | 0:45:34 | |
What did you call it, then? | 0:45:34 | 0:45:36 | |
The Natanz attack, and this is out there already, | 0:45:43 | 0:45:48 | |
was called Olympic Games or OG. | 0:45:48 | 0:45:51 | |
There was a huge operation to test the code | 0:45:54 | 0:45:58 | |
on PLCs here at Fort Meade, | 0:45:58 | 0:46:00 | |
and at Sandia, in New Mexico. | 0:46:00 | 0:46:02 | |
Remember during the Bush era, | 0:46:04 | 0:46:06 | |
when Libya turned over all of its centrifuges? | 0:46:06 | 0:46:08 | |
Those were the same models the Iranians got from AQ Khan, P1s. | 0:46:08 | 0:46:12 | |
We took them to Oak Ridge and used them to test the code, | 0:46:13 | 0:46:17 | |
which demolished the insides. | 0:46:17 | 0:46:20 | |
At Dimona, the Israelis also tested on the P1s. | 0:46:20 | 0:46:24 | |
Then, probably by using our intel on Iran, | 0:46:25 | 0:46:28 | |
we got the plans for the newer models, the IR2s. | 0:46:28 | 0:46:31 | |
We tried out different attack vectors. | 0:46:31 | 0:46:34 | |
We ended up focusing on ways to destroy the rotor tubes. | 0:46:34 | 0:46:39 | |
In the tests we ran, we blew them apart. | 0:46:39 | 0:46:42 | |
They swept up the pieces, they put it on an aeroplane, | 0:46:44 | 0:46:46 | |
they flew to Washington, they stuck it in a truck, | 0:46:46 | 0:46:49 | |
they drove it through the gates of the White House, | 0:46:49 | 0:46:52 | |
and dumped the shards out | 0:46:52 | 0:46:53 | |
on the conference room table in the Situation Room, | 0:46:53 | 0:46:56 | |
and then they invited President Bush to come down and take a look. | 0:46:56 | 0:47:00 | |
And when he could pick up the shard of a piece of centrifuge, | 0:47:00 | 0:47:04 | |
he was convinced this might be worth it, | 0:47:04 | 0:47:07 | |
and he said, "Go ahead and try." | 0:47:07 | 0:47:09 | |
Was there a legal concern inside the Bush administration | 0:47:09 | 0:47:12 | |
that this might be an act of undeclared war? | 0:47:12 | 0:47:16 | |
If there were concerns, I haven't found them. | 0:47:16 | 0:47:19 | |
That doesn't mean that they didn't exist | 0:47:20 | 0:47:23 | |
and that some lawyers somewhere were concerned about it, | 0:47:23 | 0:47:27 | |
but this was entirely new territory. | 0:47:27 | 0:47:30 | |
At the time, there were very few people who had expertise | 0:47:30 | 0:47:34 | |
specifically on the law of war and cyber. | 0:47:34 | 0:47:36 | |
And what we did was, looking at, | 0:47:36 | 0:47:38 | |
"OK, here's our broad direction. | 0:47:38 | 0:47:40 | |
"Now let's look, technically, | 0:47:40 | 0:47:42 | |
"what can we do to facilitate this broad direction?" | 0:47:42 | 0:47:46 | |
After that, maybe the... | 0:47:46 | 0:47:47 | |
I would come in, or one of my lawyers would come in and say, | 0:47:47 | 0:47:51 | |
"OK, this is what we may do." | 0:47:51 | 0:47:54 | |
OK? There are many things we CAN do but we are not ALLOWED to do them. | 0:47:54 | 0:47:59 | |
And then, after that, there's still a final level that we look at, | 0:47:59 | 0:48:02 | |
and that's, what should we do? | 0:48:02 | 0:48:03 | |
Because there are many things that would be technically possible | 0:48:03 | 0:48:07 | |
and technically legal, but a bad idea. | 0:48:07 | 0:48:10 | |
For Natanz, it was a CIA-led operation, | 0:48:10 | 0:48:13 | |
so we had to have agency sign-off. | 0:48:13 | 0:48:16 | |
Really? | 0:48:16 | 0:48:18 | |
Someone from the agency... | 0:48:18 | 0:48:21 | |
stood behind the operator and the analyst, | 0:48:21 | 0:48:23 | |
and gave the order to launch every attack. | 0:48:23 | 0:48:26 | |
Before they even started this attack, | 0:48:33 | 0:48:35 | |
they put inside of the code the kill date, | 0:48:35 | 0:48:37 | |
a date at which it would stop operating. | 0:48:37 | 0:48:39 | |
Cut-off dates, we don't normally see that in other threats, | 0:48:39 | 0:48:42 | |
and you have to think, "Well, why is there a cut-off date in there?" | 0:48:42 | 0:48:46 | |
When you realise that a section of it | 0:48:46 | 0:48:47 | |
was probably written by a government, | 0:48:47 | 0:48:49 | |
and that there are laws regarding | 0:48:49 | 0:48:51 | |
how you can use this sort of software, | 0:48:51 | 0:48:53 | |
that there may have been a legal team who said, | 0:48:53 | 0:48:56 | |
"No, you need to have a cut-off date in there, | 0:48:56 | 0:48:58 | |
"you can only do this and you can only go that far, | 0:48:58 | 0:49:00 | |
"and we need to check if this is legal or not." | 0:49:00 | 0:49:02 | |
That date is a few days before Obama's inauguration. | 0:49:04 | 0:49:07 | |
So, the theory is that | 0:49:07 | 0:49:09 | |
this was an operation that needed to be stopped at a certain time, | 0:49:09 | 0:49:13 | |
because there was going to be a handover | 0:49:13 | 0:49:15 | |
and that more approval was needed. | 0:49:15 | 0:49:18 | |
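The cut-off mechanism described above amounts to a single date comparison. A minimal sketch, assuming a placeholder date around the January 2009 handover mentioned above; the dates actually hard-coded into the binary varied across variants and are not given here.

```python
from datetime import date

# Hypothetical sketch of a hard-coded kill date. The date below is a
# placeholder chosen only to echo the "few days before the inauguration"
# detail above, not the value actually baked into the binary.
KILL_DATE = date(2009, 1, 15)

def should_operate(today):
    # Past the kill date, the code simply goes quiet.
    return today < KILL_DATE

print(should_operate(date(2008, 6, 1)))   # True: still active
print(should_operate(date(2009, 2, 1)))   # False: past the cut-off
```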
-Are you prepared to take the oath, Senator? -I am. | 0:49:21 | 0:49:24 | |
I, Barack Hussein Obama... | 0:49:24 | 0:49:26 | |
-I, Barack... -..do solemnly swear. | 0:49:26 | 0:49:27 | |
I, Barack Hussein Obama, do solemnly swear... | 0:49:27 | 0:49:30 | |
-SANGER: -Olympic Games was reauthorised by President Obama | 0:49:30 | 0:49:33 | |
in his first year in office, 2009. | 0:49:33 | 0:49:35 | |
It was fascinating because it was the first year | 0:49:39 | 0:49:42 | |
of the Obama administration | 0:49:42 | 0:49:43 | |
and they would talk to you ENDLESSLY about cyber defence. | 0:49:43 | 0:49:46 | |
-OBAMA: -We count on computer networks to deliver our oil and gas, | 0:49:46 | 0:49:49 | |
our power and our water. | 0:49:49 | 0:49:51 | |
We rely on them for public transportation | 0:49:51 | 0:49:54 | |
and air-traffic control. | 0:49:54 | 0:49:56 | |
But just as we failed in the past to invest | 0:49:56 | 0:49:58 | |
in our physical infrastructure, our roads, | 0:49:58 | 0:50:02 | |
our bridges and rails, we've failed to invest | 0:50:02 | 0:50:04 | |
in the security of our digital infrastructure. | 0:50:04 | 0:50:07 | |
But when you asked questions | 0:50:07 | 0:50:09 | |
about the use of offensive cyber weapons, | 0:50:09 | 0:50:12 | |
everything went dead. | 0:50:12 | 0:50:14 | |
No cooperation. White House wouldn't help. Pentagon wouldn't help. | 0:50:14 | 0:50:17 | |
NSA wouldn't help. Nobody would talk to you about it. | 0:50:17 | 0:50:20 | |
But when you dug into the budget for cyber spending | 0:50:20 | 0:50:23 | |
during the Obama administration, | 0:50:23 | 0:50:26 | |
what you discovered was | 0:50:26 | 0:50:27 | |
much of it was being spent on offensive cyber weapons. | 0:50:27 | 0:50:31 | |
You'd see phrases like "Title 10 CNO". | 0:50:32 | 0:50:37 | |
"Title 10" means "operations for the US Military", | 0:50:37 | 0:50:40 | |
and "CNO" means "computer network operations". | 0:50:40 | 0:50:45 | |
There is considerable evidence that Stuxnet was just the opening wedge | 0:50:45 | 0:50:50 | |
of what is a much broader US Government effort now | 0:50:50 | 0:50:53 | |
to develop an entire new class of weapons. | 0:50:53 | 0:50:57 | |
-CHIEN: -Stuxnet wasn't just an evolution - | 0:51:02 | 0:51:05 | |
it was really a revolution in the threat landscape. | 0:51:05 | 0:51:07 | |
In the past, the vast majority of threats that we saw were always | 0:51:09 | 0:51:12 | |
controlled by an operator somewhere. | 0:51:12 | 0:51:13 | |
They would infect your machines, | 0:51:13 | 0:51:15 | |
but they would have what's called a "call-back" | 0:51:15 | 0:51:17 | |
or "command and control channel". | 0:51:17 | 0:51:18 | |
The threats would actually contact the operator and say, | 0:51:18 | 0:51:21 | |
"What do you want me to do next?" The operator would send commands | 0:51:21 | 0:51:23 | |
and say, maybe, "Search through this directory, find these folders, | 0:51:23 | 0:51:26 | |
"find these files, upload these files to me. | 0:51:26 | 0:51:28 | |
"Spread to this other machine." | 0:51:28 | 0:51:29 | |
Things of that nature. | 0:51:29 | 0:51:30 | |
But Stuxnet couldn't have a command and control channel, | 0:51:30 | 0:51:34 | |
because once it got inside of Natanz, | 0:51:34 | 0:51:37 | |
it would not have been able to reach back out to the attackers. | 0:51:37 | 0:51:40 | |
The Natanz network is completely air-gapped | 0:51:40 | 0:51:42 | |
from the rest of the internet. It's not connected to the internet. | 0:51:42 | 0:51:45 | |
It's its own isolated network. | 0:51:45 | 0:51:46 | |
Getting across an air gap is one of the more difficult challenges | 0:51:46 | 0:51:49 | |
that attackers will face, | 0:51:49 | 0:51:50 | |
just because of the fact that | 0:51:50 | 0:51:52 | |
everything is in place to prevent that. | 0:51:52 | 0:51:53 | |
You know, the policies and procedures | 0:51:53 | 0:51:56 | |
and the physical network that's in place is specifically designed | 0:51:56 | 0:52:00 | |
to prevent you crossing the air gap. | 0:52:00 | 0:52:01 | |
But there is no truly air-gapped network | 0:52:01 | 0:52:03 | |
in these real-world production environments. | 0:52:03 | 0:52:06 | |
People have got to get new code into Natanz. | 0:52:06 | 0:52:08 | |
People have to get log files off of this network in Natanz. | 0:52:08 | 0:52:11 | |
People have to upgrade equipment. People have to upgrade computers. | 0:52:11 | 0:52:14 | |
This highlights one of the major security issues | 0:52:14 | 0:52:18 | |
that we have in the field. | 0:52:18 | 0:52:20 | |
If you think, "Well, nobody can attack this power plant | 0:52:20 | 0:52:23 | |
"or this chemical plant because it's not connected to the internet," | 0:52:23 | 0:52:27 | |
that's a bizarre illusion. | 0:52:27 | 0:52:29 | |
-DISTORTED FEMALE VOICE: -And the first time we introduced the code | 0:52:32 | 0:52:36 | |
into Natanz, we used human assets. | 0:52:36 | 0:52:39 | |
Maybe CIA - more likely, Mossad - but... | 0:52:39 | 0:52:43 | |
our team was kept in the dark about the tradecraft. | 0:52:43 | 0:52:45 | |
We heard rumours in Moscow, | 0:52:45 | 0:52:48 | |
an Iranian laptop infected by a phoney Siemens technician | 0:52:48 | 0:52:52 | |
with a flash drive. | 0:52:52 | 0:52:54 | |
A double agent in Iran with access to Natanz. | 0:52:55 | 0:52:58 | |
But I don't really know. | 0:52:58 | 0:53:00 | |
What we had to focus on was to write the code | 0:53:00 | 0:53:03 | |
so that, once inside, the worm acted on its own. | 0:53:03 | 0:53:07 | |
They built in all the code and all the logic into the threat | 0:53:07 | 0:53:10 | |
to be able to operate all by itself. | 0:53:10 | 0:53:12 | |
It had the ability to spread by itself. | 0:53:12 | 0:53:14 | |
It had the ability to figure out, "Do I have the right PLCs? | 0:53:14 | 0:53:17 | |
"Have I arrived in Natanz? | 0:53:17 | 0:53:19 | |
"Am I at the target?" | 0:53:19 | 0:53:20 | |
-LANGNER: -And when it's on target, it executes autonomously. | 0:53:20 | 0:53:24 | |
That also means you... you cannot call off the attack. | 0:53:24 | 0:53:28 | |
It was definitely the type of attack where someone had decided that this | 0:53:28 | 0:53:32 | |
is what they wanted to do. There was no turning back | 0:53:32 | 0:53:35 | |
once Stuxnet was released. | 0:53:35 | 0:53:37 | |
When it began to actually execute its payload, | 0:53:42 | 0:53:44 | |
you would have a whole bunch of centrifuges | 0:53:44 | 0:53:46 | |
in a huge array of cascades, | 0:53:46 | 0:53:48 | |
sitting in a big hall, and then, just off that hall, | 0:53:48 | 0:53:51 | |
you would have an operators' room, the control panels in front of them, | 0:53:51 | 0:53:53 | |
a big window where they could see into the hall. | 0:53:53 | 0:53:56 | |
Computers monitor the activities of all these centrifuges. | 0:53:56 | 0:54:01 | |
So, a centrifuge, | 0:54:01 | 0:54:02 | |
it's driven by an electrical motor, | 0:54:02 | 0:54:05 | |
and the speed of this electrical motor | 0:54:05 | 0:54:08 | |
is controlled by another PLC, | 0:54:08 | 0:54:11 | |
by another programmable logic controller. | 0:54:11 | 0:54:13 | |
Stuxnet would wait for 13 days before doing anything. | 0:54:15 | 0:54:19 | |
Those 13 days are about the time it takes to actually fill | 0:54:19 | 0:54:23 | |
an entire cascade of centrifuges with uranium. | 0:54:23 | 0:54:27 | |
They didn't want to attack when the centrifuges were empty | 0:54:27 | 0:54:29 | |
or at the beginning of the enrichment process. | 0:54:29 | 0:54:31 | |
What Stuxnet did | 0:54:31 | 0:54:34 | |
was it actually would sit there during the 13 days | 0:54:34 | 0:54:36 | |
and basically record all of the normal activities | 0:54:36 | 0:54:39 | |
that were happening, and save it. | 0:54:39 | 0:54:41 | |
And once they saw them spinning for 13 days, then the attack occurred. | 0:54:41 | 0:54:45 | |
Centrifuges spin at incredible speeds, at about 1,000 hertz. | 0:54:46 | 0:54:50 | |
They have a safe operating speed - | 0:54:50 | 0:54:53 | |
63,000 revolutions per minute. | 0:54:53 | 0:54:56 | |
Stuxnet caused the uranium enrichment centrifuges | 0:54:56 | 0:54:58 | |
to spin up to 1,400 hertz. | 0:54:58 | 0:55:00 | |
Up to 80,000 revolutions per minute. | 0:55:00 | 0:55:02 | |
What would happen was those centrifuges would go through | 0:55:07 | 0:55:09 | |
what's called a "resonance frequency". | 0:55:09 | 0:55:11 | |
It would go through a frequency | 0:55:11 | 0:55:12 | |
at which the metal would basically vibrate uncontrollably, | 0:55:12 | 0:55:15 | |
and essentially shatter. There'd be uranium gas everywhere. | 0:55:15 | 0:55:19 | |
And then the second attack they attempted | 0:55:19 | 0:55:22 | |
was they actually tried to lower it to two hertz. | 0:55:22 | 0:55:24 | |
They were slowed down... | 0:55:24 | 0:55:26 | |
to almost standstill. | 0:55:26 | 0:55:27 | |
And at two hertz, an opposite effect occurs. | 0:55:27 | 0:55:31 | |
You can imagine a toy top that you spin, | 0:55:31 | 0:55:33 | |
and as the top begins to slow down, it begins to wobble. | 0:55:33 | 0:55:35 | |
That's what happened to these centrifuges - | 0:55:35 | 0:55:37 | |
they would begin to wobble and essentially shatter and fall apart. | 0:55:37 | 0:55:40 | |
And instead of sending back to the computer what was really happening, | 0:55:44 | 0:55:47 | |
it would send back that old data that it had recorded. | 0:55:47 | 0:55:50 | |
So, the computer's sitting there thinking, | 0:55:50 | 0:55:52 | |
"Yup, running at 1,000 hertz, everything's fine. | 0:55:52 | 0:55:54 | |
"Running at 1,000 hertz, everything's fine." | 0:55:54 | 0:55:56 | |
But those centrifuges are spinning up wildly. | 0:55:56 | 0:55:58 | |
A huge noise would occur. It'd be like, you know, a jet engine. | 0:55:58 | 0:56:02 | |
JETS POWERING UP | 0:56:02 | 0:56:05 | |
The operators would know, "Whoa, something is going wrong here." | 0:56:05 | 0:56:08 | |
They might look at their monitors and say, "It says it's 1,000 hertz." | 0:56:08 | 0:56:11 | |
But they would hear that, in the room, | 0:56:11 | 0:56:13 | |
something gravely bad was happening. | 0:56:13 | 0:56:14 | |
Not only are the operators fooled into thinking everything's normal, | 0:56:14 | 0:56:19 | |
but also any kind of automated protective logic is fooled. | 0:56:19 | 0:56:25 | |
You can't just turn these centrifuges off. | 0:56:25 | 0:56:28 | |
They have to be brought down in a very controlled manner. | 0:56:28 | 0:56:31 | |
And so they would hit, literally, | 0:56:31 | 0:56:32 | |
the big red button to initiate a graceful shutdown. | 0:56:32 | 0:56:35 | |
And Stuxnet intercepts that code, so you would have these operators | 0:56:35 | 0:56:38 | |
slamming on that button over and over again, | 0:56:38 | 0:56:40 | |
and nothing would happen. | 0:56:40 | 0:56:42 | |
-YADLIN: -If your cyber weapon is good enough, | 0:56:43 | 0:56:46 | |
if your enemy is not aware of it, | 0:56:46 | 0:56:49 | |
it is an ideal weapon, | 0:56:49 | 0:56:52 | |
because the enemy doesn't understand what is happening to them. | 0:56:52 | 0:56:54 | |
Maybe, even better, the enemy begins to doubt their own capability? | 0:56:54 | 0:56:58 | |
Absolutely. | 0:56:58 | 0:56:59 | |
Certainly, | 0:56:59 | 0:57:01 | |
one must conclude that what happened at Natanz | 0:57:01 | 0:57:04 | |
must have driven the engineers crazy. | 0:57:04 | 0:57:07 | |
Because the worst thing that can happen to a maintenance engineer | 0:57:07 | 0:57:11 | |
is not being able to figure out | 0:57:11 | 0:57:13 | |
what the cause of the specific trouble is, | 0:57:13 | 0:57:16 | |
so they must have been analysing themselves to death. | 0:57:16 | 0:57:19 | |
-SANGER: -Through 2009, it was going pretty smoothly. | 0:57:24 | 0:57:27 | |
Centrifuges were blowing up. The International Atomic Energy Agency | 0:57:27 | 0:57:30 | |
inspectors would go into Natanz and they would see | 0:57:30 | 0:57:33 | |
that whole sections of the centrifuges had been removed. | 0:57:33 | 0:57:36 | |
The United States knew from its intelligence channels | 0:57:38 | 0:57:41 | |
that some Iranian scientists and engineers were being fired, | 0:57:41 | 0:57:45 | |
because the centrifuges were blowing up, | 0:57:45 | 0:57:47 | |
and the Iranians had assumed that this was because | 0:57:47 | 0:57:50 | |
they had been making errors, | 0:57:50 | 0:57:52 | |
there were manufacturing mistakes, | 0:57:52 | 0:57:53 | |
clearly this was somebody's fault. | 0:57:53 | 0:57:56 | |
So, the program was doing exactly what it was supposed to be doing, | 0:57:56 | 0:58:00 | |
which was, it was blowing up centrifuges | 0:58:00 | 0:58:03 | |
and it was leaving no trace, | 0:58:03 | 0:58:06 | |
and leaving the Iranians to wonder what they got hit by. | 0:58:06 | 0:58:10 | |
This was the brilliance of Olympic Games. | 0:58:10 | 0:58:12 | |
You know, as a former director | 0:58:12 | 0:58:14 | |
of a couple of big three-letter agencies, | 0:58:14 | 0:58:16 | |
slowing down 1,000 centrifuges in Natanz? | 0:58:16 | 0:58:19 | |
An unalloyed good. | 0:58:19 | 0:58:20 | |
There was a need for, for buying time. | 0:58:20 | 0:58:23 | |
There was a need for slowing them down. | 0:58:23 | 0:58:25 | |
There was a need to try and push them to the negotiating table. | 0:58:25 | 0:58:28 | |
I mean, there were a lot of variables at play here. | 0:58:28 | 0:58:31 | |
-SANGER: -President Obama would go down into the Situation Room | 0:58:35 | 0:58:39 | |
and he would have laid out in front of him | 0:58:39 | 0:58:41 | |
what they call the horse blanket, | 0:58:41 | 0:58:43 | |
which was a giant schematic | 0:58:43 | 0:58:46 | |
of the Natanz nuclear enrichment plant. | 0:58:46 | 0:58:49 | |
And the designers of Olympic Games | 0:58:49 | 0:58:52 | |
would describe to him what kind of progress they made, | 0:58:52 | 0:58:55 | |
and look for him for the authorisation | 0:58:55 | 0:58:57 | |
to move on ahead to the next attack. | 0:58:57 | 0:59:00 | |
And at one point during those discussions, | 0:59:01 | 0:59:04 | |
he said to a number of his aides, | 0:59:04 | 0:59:05 | |
"You know, I have some concerns, | 0:59:05 | 0:59:07 | |
"because once word of this gets out..." | 0:59:07 | 0:59:09 | |
And he knew it would get out. | 0:59:09 | 0:59:11 | |
"..the Chinese may use it as an excuse for their attacks on us, | 0:59:11 | 0:59:14 | |
"the Russians might, or others." | 0:59:14 | 0:59:17 | |
So, he clearly had some misgivings, | 0:59:17 | 0:59:19 | |
but they weren't big enough to stop him | 0:59:19 | 0:59:21 | |
from going ahead with the programme. | 0:59:21 | 0:59:23 | |
And then, in 2010, | 0:59:24 | 0:59:27 | |
a decision was made to change the code. | 0:59:27 | 0:59:30 | |
Our human assets weren't always able to get code updates into Natanz, | 0:59:36 | 0:59:41 | |
and we weren't told exactly why, but... | 0:59:41 | 0:59:45 | |
we were told we had to have a cyber solution | 0:59:45 | 0:59:48 | |
for delivering the code. | 0:59:48 | 0:59:50 | |
But the delivery systems were tricky. | 0:59:50 | 0:59:52 | |
If they weren't aggressive enough, they wouldn't get in. | 0:59:52 | 0:59:55 | |
If they were too aggressive, | 0:59:55 | 0:59:57 | |
it could spread and be discovered. | 0:59:57 | 0:59:59 | |
-CHIEN: -When we got the first sample, | 1:00:01 | 1:00:03 | |
there was some configuration information inside of it, | 1:00:03 | 1:00:05 | |
and one of the pieces in there was a version number, 1.1. | 1:00:05 | 1:00:09 | |
And that made us realise, | 1:00:09 | 1:00:10 | |
"Well, look, this likely isn't the only copy." | 1:00:10 | 1:00:12 | |
We went back to our databases, | 1:00:12 | 1:00:14 | |
looking for anything that looked similar to Stuxnet. | 1:00:14 | 1:00:17 | |
As we began to collect more samples, | 1:00:19 | 1:00:21 | |
we found a few earlier versions of Stuxnet. | 1:00:21 | 1:00:23 | |
And when we analysed that code, | 1:00:23 | 1:00:24 | |
we saw that versions previous to 1.1 | 1:00:24 | 1:00:27 | |
were a lot less aggressive. | 1:00:27 | 1:00:29 | |
The earlier version of Stuxnet, | 1:00:29 | 1:00:31 | |
it, basically, required humans to do a little bit of double-clicking | 1:00:31 | 1:00:34 | |
in order for it to spread from one computer to another. | 1:00:34 | 1:00:37 | |
And so, what we believe, after looking at that code, is two things. | 1:00:37 | 1:00:40 | |
One, either they didn't get into Natanz with that earlier version | 1:00:40 | 1:00:44 | |
because it simply wasn't aggressive enough, | 1:00:44 | 1:00:46 | |
wasn't able to jump over that air gap. | 1:00:46 | 1:00:48 | |
And/or two, that payload, as well, didn't work properly. | 1:00:48 | 1:00:52 | |
It didn't work to their satisfaction. | 1:00:52 | 1:00:54 | |
Maybe it was not explosive enough. | 1:00:54 | 1:00:57 | |
There were slightly different versions | 1:00:57 | 1:00:59 | |
which were aimed at different parts of the centrifuge cascade. | 1:00:59 | 1:01:02 | |
But the guys at Symantec figured you changed the code because | 1:01:02 | 1:01:06 | |
the first variations couldn't get in and didn't work right. | 1:01:06 | 1:01:08 | |
Bullshit. We always found a way to get across the air gap. | 1:01:08 | 1:01:13 | |
At TAO, we laughed when people | 1:01:13 | 1:01:14 | |
thought they were protected by an air gap. | 1:01:14 | 1:01:17 | |
And for OG, the early versions of the payload did work. | 1:01:17 | 1:01:20 | |
But what NSA did... | 1:01:20 | 1:01:22 | |
..was always low-key | 1:01:23 | 1:01:25 | |
and subtle. | 1:01:25 | 1:01:27 | |
The problem was that Unit 8200, the Israelis, | 1:01:27 | 1:01:30 | |
kept pushing us to be more aggressive. | 1:01:30 | 1:01:33 | |
The later version of Stuxnet, 1.1 - | 1:01:34 | 1:01:36 | |
that version had multiple ways of spreading. | 1:01:36 | 1:01:38 | |
It had the four zero-days inside of it, for example, | 1:01:38 | 1:01:41 | |
that allowed it to spread all by itself, without you doing anything. | 1:01:41 | 1:01:43 | |
It could spread via network shares. It could spread via USB keys. | 1:01:43 | 1:01:47 | |
It was able to spread via network exploits. | 1:01:47 | 1:01:49 | |
That's the sample that introduces the stolen digital certificates. | 1:01:49 | 1:01:53 | |
That is the sample that, all of a sudden, | 1:01:53 | 1:01:55 | |
became so noisy | 1:01:55 | 1:01:57 | |
and caught the attention of the antivirus guys. | 1:01:57 | 1:02:00 | |
In the first sample, we don't find that. | 1:02:00 | 1:02:03 | |
And this is very strange | 1:02:05 | 1:02:07 | |
because it tells us that, | 1:02:07 | 1:02:10 | |
in the process of this development, | 1:02:10 | 1:02:13 | |
the attackers were less concerned with operational security. | 1:02:13 | 1:02:17 | |
Stuxnet actually kept a log inside of itself | 1:02:23 | 1:02:25 | |
of all the machines that had been infected along the way, | 1:02:25 | 1:02:28 | |
as it jumped from one machine to another to another to another. | 1:02:28 | 1:02:32 | |
And we were able to gather up | 1:02:32 | 1:02:33 | |
all of the samples that we could acquire, | 1:02:33 | 1:02:35 | |
tens of thousands of samples, and we extracted all of those logs. | 1:02:35 | 1:02:38 | |
We can see the exact path that Stuxnet took. | 1:02:38 | 1:02:42 | |
Eventually we were able to trace back this version of Stuxnet | 1:02:43 | 1:02:46 | |
to ground zero - to the first five infections in the world. | 1:02:46 | 1:02:50 | |
The first five infections were all outside of Natanz plant, | 1:02:50 | 1:02:54 | |
all inside of organisations inside of Iran. | 1:02:54 | 1:02:57 | |
All organisations that are involved in industrial control systems, | 1:02:57 | 1:03:00 | |
and construction of industrial control facilities. | 1:03:00 | 1:03:03 | |
Clearly contractors who were working on the Natanz facility, | 1:03:03 | 1:03:07 | |
and the attackers knew that. | 1:03:07 | 1:03:08 | |
They're electrical companies. They're piping companies. | 1:03:08 | 1:03:11 | |
They're, you know, these sorts of companies. | 1:03:11 | 1:03:13 | |
And they knew that technicians from those companies would visit Natanz. | 1:03:13 | 1:03:17 | |
So, they would infect these companies and then technicians | 1:03:17 | 1:03:20 | |
would take their computer or their laptop or their USB... | 1:03:20 | 1:03:23 | |
That operator then goes down to Natanz and he plugs in his USB key | 1:03:23 | 1:03:26 | |
which has some code that he needs to update into Natanz, | 1:03:26 | 1:03:28 | |
into the Natanz network, and now Stuxnet is able | 1:03:28 | 1:03:30 | |
to get inside Natanz and conduct its attack. | 1:03:30 | 1:03:33 | |
These five companies were specifically targeted | 1:03:34 | 1:03:36 | |
to spread Stuxnet into Natanz, | 1:03:36 | 1:03:38 | |
and it wasn't that Stuxnet escaped out of Natanz | 1:03:38 | 1:03:41 | |
and then spread all over the world, and it was this big mistake and, | 1:03:41 | 1:03:44 | |
"Oh, it wasn't meant to spread that far but it really did." | 1:03:44 | 1:03:47 | |
No, that's not the way we see it. The way we see it is that | 1:03:47 | 1:03:49 | |
they wanted it to spread far so that they could get it into Natanz. | 1:03:49 | 1:03:53 | |
Someone decided that we're going to create something new, | 1:03:53 | 1:03:57 | |
something evolved, that's going to be far, far, far more aggressive. | 1:03:57 | 1:04:02 | |
And we're OK, frankly, | 1:04:02 | 1:04:04 | |
with it spreading all over the world to innocent machines, | 1:04:04 | 1:04:07 | |
in order to go after our target. | 1:04:07 | 1:04:09 | |
The Mossad had the role, | 1:04:14 | 1:04:17 | |
had the assignment, | 1:04:17 | 1:04:20 | |
to deliver the virus, | 1:04:20 | 1:04:23 | |
to make sure that Stuxnet | 1:04:23 | 1:04:26 | |
would be put in place in Natanz to affect the centrifuges. | 1:04:26 | 1:04:31 | |
Meir Dagan, the head of Mossad, | 1:04:32 | 1:04:34 | |
was under growing pressure from the Prime Minister, Benjamin Netanyahu, | 1:04:34 | 1:04:39 | |
to produce results. | 1:04:39 | 1:04:41 | |
Inside the ROC, we were furious. | 1:04:42 | 1:04:45 | |
The Israelis took our code for the delivery system and changed it. | 1:04:47 | 1:04:51 | |
Then, on their own, without our agreement, | 1:04:52 | 1:04:55 | |
they just fucking launched it. | 1:04:55 | 1:04:57 | |
2010, around the same time they started killing Iranian scientists. | 1:04:57 | 1:05:01 | |
And they fucked up the code. | 1:05:01 | 1:05:03 | |
Instead of hiding, the code started shutting down computers. | 1:05:03 | 1:05:07 | |
So, naturally, people noticed. | 1:05:07 | 1:05:09 | |
Because they were in a hurry, they opened Pandora's Box, | 1:05:11 | 1:05:14 | |
they let it out, and it spread... | 1:05:14 | 1:05:17 | |
all over the world. | 1:05:17 | 1:05:19 | |
The worm spread quickly, | 1:05:24 | 1:05:25 | |
but somehow it remained unseen until it was identified in Belarus. | 1:05:25 | 1:05:30 | |
Soon after, Israeli intelligence confirmed | 1:05:30 | 1:05:32 | |
that it had made its way into the hands | 1:05:32 | 1:05:34 | |
of the Russian Federal Security Service, the successor to the KGB. | 1:05:34 | 1:05:38 | |
And so it happened that the formula for a secret cyber weapon | 1:05:40 | 1:05:43 | |
designed by the US and Israel | 1:05:43 | 1:05:44 | |
fell into the hands of Russia | 1:05:44 | 1:05:46 | |
and the very country it was meant to attack. | 1:05:46 | 1:05:49 | |
ANGRY CHANTING | 1:06:09 | 1:06:10 | |
-KIYAEI: -In international law, | 1:06:10 | 1:06:12 | |
when some country, or a coalition of countries, | 1:06:12 | 1:06:15 | |
targets a nuclear facility, | 1:06:15 | 1:06:18 | |
it's an act of war. | 1:06:18 | 1:06:20 | |
Please, let's be frank here. | 1:06:20 | 1:06:24 | |
If it wasn't Iran, | 1:06:24 | 1:06:27 | |
let's say a nuclear facility in the United States | 1:06:27 | 1:06:30 | |
was targeted in the same way... | 1:06:30 | 1:06:33 | |
..the American Government would not sit by and let this go. | 1:06:34 | 1:06:40 | |
Stuxnet is an attack in peacetime on critical infrastructure. | 1:06:40 | 1:06:44 | |
Yes, it is. Look, when I read about it, | 1:06:44 | 1:06:47 | |
all right, I go, | 1:06:47 | 1:06:48 | |
"Whoa, this is a big deal!" Yeah. | 1:06:48 | 1:06:51 | |
-SANGER: -The people who were running this program, | 1:06:52 | 1:06:55 | |
including Leon Panetta, the director of the CIA at the time, | 1:06:55 | 1:06:59 | |
had to go down into the Situation Room and face President Obama | 1:06:59 | 1:07:03 | |
and Vice President Biden | 1:07:03 | 1:07:05 | |
and explain that this program was suddenly on the loose. | 1:07:05 | 1:07:10 | |
Vice President Biden at one point during this discussion, sort of, | 1:07:11 | 1:07:16 | |
exploded in Biden-esque fashion | 1:07:16 | 1:07:18 | |
and blamed the Israelis. He said, "It must have been the Israelis | 1:07:18 | 1:07:22 | |
"who made a change in the code that enabled it to get out." | 1:07:22 | 1:07:26 | |
President Obama said to the senior leadership, | 1:07:28 | 1:07:30 | |
"You told me it wouldn't get out of the network. It did. | 1:07:30 | 1:07:32 | |
"You told me Iranians would never figure out | 1:07:32 | 1:07:34 | |
"it was the United States. They did. | 1:07:34 | 1:07:37 | |
"You told me it would have a huge effect on their nuclear programme, | 1:07:37 | 1:07:41 | |
"and it didn't." | 1:07:41 | 1:07:43 | |
The Natanz plant is inspected every couple of weeks | 1:07:44 | 1:07:47 | |
by the International Atomic Energy Agency inspectors, | 1:07:47 | 1:07:51 | |
and if you line up what you know about the attacks | 1:07:51 | 1:07:53 | |
with the inspection reports, you can see the effects. | 1:07:53 | 1:07:57 | |
-HEINONEN: -If you go to the IAEA reports, | 1:07:58 | 1:08:00 | |
we really saw that a lot of centrifuges were switched off, | 1:08:00 | 1:08:03 | |
and they were removed. | 1:08:03 | 1:08:06 | |
As much as almost a couple of thousand got compromised. | 1:08:06 | 1:08:09 | |
When you put this all together, | 1:08:09 | 1:08:11 | |
I wouldn't be surprised if their programme | 1:08:11 | 1:08:13 | |
got delayed by one year. | 1:08:13 | 1:08:15 | |
But go, then, to year 2012-13, and look, you know, | 1:08:15 | 1:08:19 | |
how the centrifuges started to come up again. | 1:08:19 | 1:08:22 | |
-KIYAEI: -So, ironically, cyber warfare, | 1:08:25 | 1:08:28 | |
assassination of its nuclear scientists, | 1:08:28 | 1:08:32 | |
economic sanctions, | 1:08:32 | 1:08:33 | |
political isolation... | 1:08:33 | 1:08:35 | |
Iran has gone through A to Z of every coercive policy that the US, | 1:08:36 | 1:08:42 | |
Israel and those who ally with them | 1:08:42 | 1:08:46 | |
have placed on Iran, | 1:08:46 | 1:08:48 | |
and they have actually made Iran's nuclear programme | 1:08:48 | 1:08:51 | |
more advanced today than it was ever before. | 1:08:51 | 1:08:54 | |
CHANTING IN ARABIC | 1:08:54 | 1:08:56 | |
-DISTORTED MALE VOICE: -This is a very, very dangerous minefield | 1:08:57 | 1:09:01 | |
that we are walking, and the nations who decide | 1:09:01 | 1:09:05 | |
to take these covert actions should be | 1:09:05 | 1:09:09 | |
taking into consideration all the effects, | 1:09:09 | 1:09:14 | |
including the moral effects. | 1:09:14 | 1:09:17 | |
I would say that this is the price that we have to pay in this... | 1:09:17 | 1:09:23 | |
in this world, and our blade of righteousness shouldn't be so sharp. | 1:09:23 | 1:09:29 | |
In Israel and in the United States, | 1:09:34 | 1:09:37 | |
the blade of righteousness cut both ways, | 1:09:37 | 1:09:39 | |
wounding the targets and the attackers. | 1:09:39 | 1:09:42 | |
Once Stuxnet infected American computers, | 1:09:42 | 1:09:45 | |
the Department of Homeland Security, | 1:09:45 | 1:09:47 | |
unaware of the cyber weapons launched by the NSA, | 1:09:47 | 1:09:50 | |
devoted enormous resources trying to protect Americans | 1:09:50 | 1:09:53 | |
from their own government. | 1:09:53 | 1:09:55 | |
We had met the enemy and it was us. | 1:09:55 | 1:09:58 | |
Yep, absolutely. | 1:10:09 | 1:10:11 | |
We'll be more than happy to discuss that. | 1:10:11 | 1:10:13 | |
Early July of 2010, I received a call | 1:10:13 | 1:10:16 | |
that said that this piece of malware was discovered, | 1:10:16 | 1:10:19 | |
and could we take a look at it? | 1:10:19 | 1:10:22 | |
When we first started the analysis, there was that, "Oh, crap" moment. | 1:10:22 | 1:10:25 | |
You know, where we sat there and said, "This is something | 1:10:25 | 1:10:27 | |
"that's significant. It's impacting industrial control. | 1:10:27 | 1:10:30 | |
"It can disrupt it to the point where it could cause harm, | 1:10:30 | 1:10:32 | |
"and not only damage to the equipment, | 1:10:32 | 1:10:35 | |
"but potentially harm or loss of life." | 1:10:35 | 1:10:36 | |
We were very concerned, | 1:10:36 | 1:10:38 | |
because Stuxnet was something that we had not seen before, | 1:10:38 | 1:10:41 | |
so there wasn't a lot of sleep at night. | 1:10:41 | 1:10:43 | |
Basically, light up the phones, call everybody we know, | 1:10:43 | 1:10:46 | |
inform the Secretary, inform the White House, | 0:46:00 | 0:46:00 | |
inform the other departments and agencies, | 1:10:48 | 1:10:51 | |
wake up the world and figure out what's going on | 1:10:51 | 1:10:54 | |
with this particular malware. | 1:10:54 | 1:10:55 | |
Did anybody ever give you an indication | 1:10:55 | 1:10:58 | |
that it was something that they already knew about? | 1:10:58 | 1:11:01 | |
No, at no time did I get the impression from someone that, | 1:11:01 | 1:11:04 | |
"That's OK," you know, | 1:11:04 | 1:11:05 | |
get a little pat on the head and scooted out the door. | 1:11:05 | 1:11:07 | |
I never received a stand down order. | 1:11:07 | 1:11:09 | |
I never... No-one ever asked, "Stop looking at this." | 1:11:09 | 1:11:13 | |
Sean McGurk, the Director of Cyber | 1:11:13 | 1:11:14 | |
for the Department of Homeland Security, | 1:11:14 | 1:11:16 | |
testified before the Senate | 1:11:16 | 1:11:18 | |
about how he thought Stuxnet was a terrifying threat | 1:11:18 | 1:11:21 | |
-to the United States. Is that not a problem? -No, no... | 1:11:21 | 1:11:24 | |
How do you mean? That, that, that the Stuxnet thing was a bad idea? | 1:11:24 | 1:11:28 | |
No, no, just that before he knew what it was and what it attacks... | 1:11:28 | 1:11:32 | |
Oh, I get it. That, that... | 1:11:32 | 1:11:33 | |
Yeah, that he was responding to something that... | 1:11:33 | 1:11:36 | |
He thought was a threat to critical infrastructure in the United States. | 1:11:36 | 1:11:39 | |
Yeah. "The worm is loose!" | 1:11:39 | 1:11:40 | |
The worm is loose, I understand. | 1:11:40 | 1:11:42 | |
But there's a... | 1:11:42 | 1:11:44 | |
There is a further theory having to do with whether or not, | 1:11:44 | 1:11:47 | |
-following up on David Sanger's... -I got the subplot. And who did that? | 1:11:47 | 1:11:50 | |
Was it the Israelis? And, yeah, I... | 1:11:50 | 1:11:53 | |
I truly don't know and, even though I don't know, | 1:11:53 | 1:11:56 | |
I still can't talk about it. All right? | 1:11:56 | 1:11:58 | |
Stuxnet was somebody's covert action, all right? | 1:11:58 | 1:12:01 | |
And the definition of covert action | 1:12:01 | 1:12:03 | |
is an activity in which | 1:12:03 | 1:12:04 | |
you want to have the hand of the actor forever hidden. | 1:12:04 | 1:12:08 | |
So, by definition, it's going to end up | 1:12:08 | 1:12:10 | |
in this "we don't talk about these things" box. | 1:12:10 | 1:12:13 | |
-SANGER: -To this day, the United States Government | 1:12:18 | 1:12:21 | |
has never acknowledged | 1:12:21 | 1:12:23 | |
conducting any offensive cyber attack anywhere in the world. | 1:12:23 | 1:12:28 | |
But, thanks to Mr Snowden, we know that, in 2012, | 1:12:30 | 1:12:34 | |
President Obama issued an Executive Order | 1:12:34 | 1:12:37 | |
that laid out some of the conditions | 1:12:37 | 1:12:39 | |
under which cyber weapons can be used, | 1:12:39 | 1:12:42 | |
and, interestingly, every use of a cyber weapon | 1:12:42 | 1:12:45 | |
requires presidential sign-off. | 1:12:45 | 1:12:48 | |
That is only true, in the physical world, for nuclear weapons. | 1:12:49 | 1:12:55 | |
-CLARKE: -Nuclear war and nuclear weapons are vastly different | 1:13:05 | 1:13:07 | |
from cyber war and cyber weapons. | 1:13:07 | 1:13:10 | |
Having said that, there are some similarities. | 1:13:10 | 1:13:13 | |
And in the early 1960s, the United States Government | 1:13:13 | 1:13:15 | |
suddenly realised it had thousands of nuclear weapons, | 1:13:15 | 1:13:19 | |
big ones and little ones, weapons on Jeeps, | 1:13:19 | 1:13:21 | |
weapons on submarines, and it really didn't have a doctrine. | 1:13:21 | 1:13:25 | |
It really didn't have a strategy. | 1:13:25 | 1:13:27 | |
It really didn't have an understanding, at the policy level, | 1:13:27 | 1:13:30 | |
about how it was going to use all of these things. | 1:13:30 | 1:13:33 | |
And so academics started publishing unclassified documents | 1:13:33 | 1:13:38 | |
about nuclear war | 1:13:38 | 1:13:40 | |
and nuclear weapons. | 1:13:40 | 1:13:42 | |
And the result was more than 20 years in the United States | 1:13:44 | 1:13:48 | |
of very vigorous national debates | 1:13:48 | 1:13:51 | |
about how we want to go use nuclear weapons. | 1:13:51 | 1:13:55 | |
And not only did that cause the Congress, | 1:13:57 | 1:13:59 | |
and people in the executive branch in Washington, | 1:13:59 | 1:14:02 | |
to think about these things, | 1:14:02 | 1:14:04 | |
it caused the Russians to think about these things. | 1:14:04 | 1:14:07 | |
And out of that grew nuclear doctrine - | 1:14:07 | 1:14:10 | |
mutual assured destruction, | 1:14:10 | 1:14:13 | |
all of that complicated set of nuclear dynamics. | 1:14:13 | 1:14:18 | |
Today, on this vital issue, at least, | 1:14:18 | 1:14:20 | |
we have seen what can be accomplished when we pull together. | 1:14:20 | 1:14:24 | |
We can't have a discussion, not in a sensible way right now, | 1:14:24 | 1:14:28 | |
about cyber war and cyber weapons, because everything is secret. | 1:14:28 | 1:14:33 | |
And when you get into a discussion | 1:14:33 | 1:14:35 | |
with people in the government, people still in the government, | 1:14:35 | 1:14:38 | |
people who have security clearances, you run into a brick wall. | 1:14:38 | 1:14:42 | |
Trying to stop Iran is really my number-one job, and I think... | 1:14:42 | 1:14:45 | |
Wait, can I ask you, in that context, | 1:14:45 | 1:14:48 | |
about the Stuxnet computer virus, potentially? | 1:14:48 | 1:14:50 | |
You can ask but I won't comment. | 1:14:50 | 1:14:52 | |
-Can you tell us anything? -No. | 1:14:52 | 1:14:54 | |
Look, for the longest time, I was in fear | 1:14:54 | 1:14:56 | |
that I couldn't actually say the phrase "computer network attack". | 1:14:56 | 1:15:00 | |
This stuff is hideously over-classified, | 1:15:00 | 1:15:02 | |
and it gets into the way of a... of a mature, public discussion | 1:15:02 | 1:15:07 | |
as to what it is we, as a democracy, | 1:15:07 | 1:15:09 | |
want our nation to be doing up here in the cyber domain. | 1:15:09 | 1:15:13 | |
Now, this is a former director of NSA and CIA | 1:15:13 | 1:15:16 | |
saying this stuff is over-classified. | 1:15:16 | 1:15:18 | |
One of the reasons this is as highly classified as it is, | 1:15:18 | 1:15:21 | |
this is a peculiar weapons system. | 1:15:21 | 1:15:23 | |
This is the weapons system that's come out of the espionage community, | 1:15:23 | 1:15:26 | |
and so those people have a HABIT of secrecy. | 1:15:26 | 1:15:29 | |
While most government officials refuse to acknowledge the operation, | 1:15:29 | 1:15:33 | |
at least one key insider did leak parts of the story to the press. | 1:15:33 | 1:15:38 | |
In 2012, David Sanger wrote a detailed account of Olympic Games | 1:15:38 | 1:15:42 | |
that unmasked the extensive joint operation | 1:15:42 | 1:15:44 | |
between the US and Israel | 1:15:44 | 1:15:46 | |
to launch cyber attacks on Natanz. | 1:15:46 | 1:15:49 | |
The publication of this story, | 1:15:49 | 1:15:51 | |
coming at a time that there were a number of other unrelated | 1:15:51 | 1:15:54 | |
national security stories being published, led to the announcement | 1:15:54 | 1:15:58 | |
of investigations by the Attorney General. | 1:15:58 | 1:16:01 | |
Into the...? Into the press and into the leaks? | 1:16:01 | 1:16:03 | |
Into the press and into the leaks. | 1:16:03 | 1:16:07 | |
When Stuxnet hit the media, they polygraphed everyone in our office, | 1:16:07 | 1:16:10 | |
including people who didn't know shit. | 1:16:10 | 1:16:12 | |
You know, they poly'd the interns, for God's sake. | 1:16:12 | 1:16:14 | |
These are criminal acts when they release information like this, | 1:16:14 | 1:16:18 | |
and we will conduct thorough investigations, | 1:16:18 | 1:16:21 | |
as we have in the past. | 1:16:21 | 1:16:24 | |
The administration never filed charges, | 1:16:25 | 1:16:28 | |
possibly afraid that a prosecution | 1:16:28 | 1:16:29 | |
would reveal classified details about Stuxnet. | 1:16:29 | 1:16:33 | |
To this day, no-one in the US or Israeli Governments | 1:16:33 | 1:16:36 | |
has officially acknowledged the existence of the joint operation. | 1:16:36 | 1:16:40 | |
I would never compromise ongoing operations in the field, | 1:16:42 | 1:16:45 | |
but we should be able to talk about capability. | 1:16:45 | 1:16:49 | |
We can talk about our... | 1:16:50 | 1:16:53 | |
bunker busters - why not our cyber weapons? | 1:16:53 | 1:16:56 | |
The secrecy of the operation has been blown. | 1:16:56 | 1:16:58 | |
Our friends in Israel took a weapon | 1:17:00 | 1:17:01 | |
that we jointly developed - | 1:17:01 | 1:17:03 | |
in part to keep Israel from doing something crazy - | 1:17:03 | 1:17:05 | |
and then used it on their own in a way that blew the cover | 1:17:05 | 1:17:08 | |
of the operation and could have led to war, | 1:17:08 | 1:17:09 | |
and we can't talk about that? | 1:17:09 | 1:17:11 | |
There is a way to talk about Stuxnet. | 1:17:15 | 1:17:18 | |
It happened. That... | 1:17:18 | 1:17:20 | |
To deny that it happened is foolish, | 1:17:20 | 1:17:23 | |
so the fact it happened is really what we're talking about here. | 1:17:23 | 1:17:26 | |
What are the implications of the fact | 1:17:26 | 1:17:27 | |
that we now are in a post-Stuxnet world? | 1:17:27 | 1:17:30 | |
What I said to David Sanger was, | 1:17:30 | 1:17:32 | |
I understand the difference in destruction is dramatic, | 1:17:32 | 1:17:35 | |
but this has the whiff of August 1945. | 1:17:35 | 1:17:38 | |
Somebody just used a new weapon, | 1:17:38 | 1:17:41 | |
and this weapon will not be put back into the box. | 1:17:41 | 1:17:43 | |
I know no operational details, | 1:17:43 | 1:17:46 | |
and don't know what anyone did or didn't do | 1:17:46 | 1:17:48 | |
before someone decided to use the weapon, all right? | 1:17:48 | 1:17:52 | |
I do know this - if we go out and do something, | 1:17:52 | 1:17:55 | |
most of the rest of the world now thinks that's the new standard | 1:17:55 | 1:17:59 | |
and it's something that they now feel legitimated to do, as well. | 1:17:59 | 1:18:02 | |
But the rules of engagement, | 1:18:02 | 1:18:04 | |
international norms, treaty standards, | 1:18:04 | 1:18:07 | |
they don't exist right now. | 1:18:07 | 1:18:09 | |
-SANGER: -For nuclear, we have these extensive inspection regimes. | 1:18:12 | 1:18:15 | |
The Russians come and look at our silos. | 1:18:15 | 1:18:17 | |
We go and look at their silos. | 1:18:17 | 1:18:19 | |
Bad as things get between the two countries, | 1:18:19 | 1:18:21 | |
those inspection regimes have held up. | 1:18:21 | 1:18:24 | |
But working that out for...for cyber would be virtually impossible. | 1:18:24 | 1:18:28 | |
Where do you send your inspector? | 1:18:28 | 1:18:30 | |
Inside the laptop of, you know... | 1:18:30 | 1:18:32 | |
How many laptops are there in the United States and Russia? | 1:18:32 | 1:18:35 | |
It's much more difficult in the cyber area | 1:18:35 | 1:18:37 | |
to construct an international regime | 1:18:37 | 1:18:39 | |
based on treaty commitments and rules of the road and so forth. | 1:18:39 | 1:18:43 | |
Although we've tried to have discussions | 1:18:43 | 1:18:45 | |
with the Chinese and Russians and so forth about that, | 1:18:45 | 1:18:48 | |
but it's very difficult. | 1:18:48 | 1:18:50 | |
-BROWN: -Right now, the norm in cyberspace is... | 1:18:50 | 1:18:54 | |
do whatever you can get away with. | 1:18:54 | 1:18:56 | |
That's not a good norm, but it's the norm that we have. | 1:18:56 | 1:18:59 | |
That's the norm that is preferred by states | 1:18:59 | 1:19:01 | |
that are engaging in lots of different kinds of activities | 1:19:01 | 1:19:04 | |
that they feel are benefiting their national security. | 1:19:04 | 1:19:06 | |
-YADLIN: -Those who excel in cyber | 1:19:06 | 1:19:09 | |
are trying to slow down the process of creating regulation. | 1:19:09 | 1:19:14 | |
Those who are victims | 1:19:14 | 1:19:16 | |
would like the regulation to be in the open as soon as possible. | 1:19:16 | 1:19:21 | |
International law in this area is written by custom, | 1:19:23 | 1:19:26 | |
and customary law requires a nation to say, | 1:19:26 | 1:19:29 | |
"This is what we did, this is why we did it." | 1:19:29 | 1:19:31 | |
And the US doesn't want to push the law in that direction, | 1:19:31 | 1:19:34 | |
and so it chooses not to disclose its involvement. | 1:19:34 | 1:19:37 | |
And one of the reasons that I thought it was important | 1:19:37 | 1:19:40 | |
to tell the story of Olympic Games | 1:19:40 | 1:19:42 | |
was not simply because it's a cool spy story - it is - | 1:19:42 | 1:19:45 | |
but it's because, as a nation, | 1:19:45 | 1:19:49 | |
we need to have a debate about how we want to use cyber weapons, | 1:19:49 | 1:19:53 | |
because we are the most vulnerable nation on Earth | 1:19:53 | 1:19:56 | |
to cyber attack ourselves. | 1:19:56 | 1:19:58 | |
Let's say you took over the control system of a railway - | 1:20:00 | 1:20:03 | |
you could switch tracks. | 1:20:03 | 1:20:05 | |
You could cause derailments of trains carrying explosive materials. | 1:20:05 | 1:20:10 | |
What if you were in the control system of gas pipelines | 1:20:10 | 1:20:13 | |
and when a valve was supposed to be open, it was closed, | 1:20:13 | 1:20:17 | |
and the pressure built up and the pipeline exploded? | 1:20:17 | 1:20:21 | |
There are companies that run electric power generation | 1:20:21 | 1:20:25 | |
or electric power distribution - | 1:20:25 | 1:20:27 | |
that we know have been hacked by foreign entities - | 1:20:27 | 1:20:30 | |
that have the ability to shut down the power grid. | 1:20:30 | 1:20:33 | |
-NEWS REPORT: -'According to the officials, | 1:20:35 | 1:20:37 | |
'Iran is the first country ever in the Middle East | 1:20:37 | 1:20:40 | |
'to be engaged in a cyber war with the United States and Israel. | 1:20:40 | 1:20:44 | |
'If anything, they said the recent cyber attacks | 1:20:44 | 1:20:47 | |
'were what encouraged them to plan to set up the Cyber Army, | 1:20:47 | 1:20:51 | |
'which will gather computer scientists, | 1:20:51 | 1:20:53 | |
'programmers, software engineers...' | 1:20:53 | 1:20:56 | |
-KIYAEI: -If you are a youth and you see | 1:20:56 | 1:20:58 | |
assassination of a nuclear scientist, | 1:20:58 | 1:21:00 | |
and your nuclear facilities are getting attacked, | 1:21:00 | 1:21:03 | |
wouldn't you join your national Cyber Army? | 1:21:03 | 1:21:07 | |
Well, many did, and that's why, today, | 1:21:07 | 1:21:11 | |
Iran has one of the largest cyber armies in the world. | 1:21:11 | 1:21:16 | |
So, whoever initiated this, | 1:21:16 | 1:21:18 | |
and was very proud of themselves to see that little dip | 1:21:18 | 1:21:21 | |
in Iran's centrifuge numbers, | 1:21:21 | 1:21:24 | |
should look back now | 1:21:24 | 1:21:26 | |
and acknowledge that it was a major mistake. | 1:21:26 | 1:21:29 | |
Very quickly, Iran sent a message to the United States, | 1:21:29 | 1:21:34 | |
a very sophisticated message, | 1:21:34 | 1:21:36 | |
and they did that with two attacks. | 1:21:36 | 1:21:39 | |
First, they attacked Saudi Aramco, | 1:21:39 | 1:21:42 | |
the biggest oil company in the world, | 1:21:42 | 1:21:45 | |
and wiped out every piece of software, every line of code, | 1:21:45 | 1:21:49 | |
on 30,000 computer devices. | 1:21:49 | 1:21:53 | |
Then Iran did a surge attack on the American banks. | 1:21:53 | 1:21:59 | |
The most extensive attack on American banks ever, | 1:21:59 | 1:22:01 | |
launched from the Middle East, happening right now. | 1:22:01 | 1:22:04 | |
When Iran hit our banks, | 1:22:06 | 1:22:08 | |
we could've shut down their botnet, | 1:22:08 | 1:22:10 | |
but the State Department got nervous, | 1:22:10 | 1:22:12 | |
because the servers weren't actually in Iran, | 1:22:12 | 1:22:15 | |
so until there was a diplomatic solution, | 1:22:15 | 1:22:18 | |
Obama let the private sector deal with the problem. | 1:22:18 | 1:22:21 | |
I imagine that in the White House Situation Room, | 1:22:21 | 1:22:24 | |
people sat around and said... | 1:22:24 | 1:22:27 | |
Let me be clear, I don't imagine I know. | 1:22:27 | 1:22:30 | |
People sat around in the White House Situation Room and said, | 1:22:30 | 1:22:34 | |
"The Iranians have sent us a message, which is essentially - | 1:22:34 | 1:22:37 | |
"stop attacking us in cyberspace the way you did at Natanz with Stuxnet. | 1:22:37 | 1:22:43 | |
"We can do it, too." | 1:22:43 | 1:22:44 | |
There are unintended consequences of the Stuxnet attack. | 1:22:46 | 1:22:50 | |
You wanted to cause confusion and damage to the other side, | 1:22:50 | 1:22:54 | |
but then the other side can do the same to you. | 1:22:54 | 1:22:57 | |
The monster turned against its creator, | 1:22:57 | 1:23:00 | |
and now everyone is in this game. | 1:23:00 | 1:23:03 | |
They did a good job in showing the world, including the bad guys, | 1:23:03 | 1:23:08 | |
what you would need to do in order to cause serious trouble | 1:23:08 | 1:23:11 | |
that could lead to injuries and death. | 1:23:11 | 1:23:14 | |
I mean, you've been focusing on Stuxnet, | 1:23:14 | 1:23:16 | |
but that was just a small part | 1:23:16 | 1:23:18 | |
of the much larger Iranian mission. | 1:23:18 | 1:23:20 | |
There was a larger Iranian mission? | 1:23:20 | 1:23:22 | |
Nitro Zeus, | 1:23:25 | 1:23:27 | |
NZ. | 1:23:27 | 1:23:29 | |
We spent hundreds of millions - maybe billions - on it. | 1:23:30 | 1:23:34 | |
In the event the Israelis did attack Iran, | 1:23:37 | 1:23:40 | |
we assumed we would be drawn into the conflict. | 1:23:40 | 1:23:43 | |
We built in attacks on Iran's command and control system | 1:23:44 | 1:23:47 | |
so the Iranians couldn't talk to each other in a fight. | 1:23:47 | 1:23:49 | |
We infiltrated their IADS, military air defence systems, | 1:23:49 | 1:23:53 | |
so they couldn't shoot down our planes if we flew over. | 1:23:53 | 1:23:56 | |
We also went after their civilian support systems, power grids, | 1:23:56 | 1:24:00 | |
transportation, communications, | 1:24:00 | 1:24:03 | |
financial systems... | 1:24:03 | 1:24:05 | |
We were inside, waiting, watching, | 1:24:05 | 1:24:08 | |
ready to disrupt, degrade and destroy those systems | 1:24:08 | 1:24:11 | |
with cyber attacks. | 1:24:11 | 1:24:13 | |
In comparison, Stuxnet was a back-alley operation. | 1:24:16 | 1:24:20 | |
NZ was the plan for a full-scale cyber war with no attribution. | 1:24:21 | 1:24:27 | |
We need an entirely new way of thinking | 1:24:27 | 1:24:29 | |
about how we're going to solve this problem. | 1:24:29 | 1:24:31 | |
You're not going to get an entirely new way of solving this problem | 1:24:31 | 1:24:35 | |
until you begin to have an open acknowledgement | 1:24:35 | 1:24:38 | |
that we have cyber weapons, as well, | 1:24:38 | 1:24:41 | |
and that we may have to agree to some limits on their use | 1:24:41 | 1:24:44 | |
if we're going to get other nations to limit their use. | 1:24:44 | 1:24:47 | |
It's not going to be a one-way street. | 1:24:47 | 1:24:49 | |
I'm old enough to have worked on nuclear arms control, | 1:24:49 | 1:24:52 | |
and biological weapons arms control, | 1:24:52 | 1:24:54 | |
and chemical weapons arms control. | 1:24:54 | 1:24:56 | |
And I was told in each of those types of arms control, | 1:24:57 | 1:25:02 | |
when we were beginning, "It's too hard. There are all these problems. | 1:25:02 | 1:25:06 | |
"It's technical. There's engineering. | 1:25:06 | 1:25:08 | |
"There's science involved. | 1:25:08 | 1:25:10 | |
"There are real verification difficulties. | 1:25:10 | 1:25:12 | |
"You'll never get there." | 1:25:12 | 1:25:14 | |
Well, it took 20, 30 years in some cases, | 1:25:14 | 1:25:17 | |
but we have a biological weapons treaty that's pretty damn good. | 1:25:17 | 1:25:20 | |
We have a chemical weapons treaty that's pretty damn good. | 1:25:20 | 1:25:23 | |
We've got three or four nuclear weapons treaties. | 1:25:23 | 1:25:25 | |
Yes, it may be hard and it may take 20 or 30 years, | 1:25:25 | 1:25:29 | |
but it'll never happen unless you get serious about it, | 1:25:29 | 1:25:32 | |
and it'll never happen unless you start it. | 1:25:32 | 1:25:35 | |
Today, after two years of negotiations, | 1:25:40 | 1:25:43 | |
the United States, together with our international partners, | 1:25:43 | 1:25:46 | |
has achieved something that decades of animosity has not - | 1:25:46 | 1:25:51 | |
a comprehensive, long-term deal with Iran | 1:25:51 | 1:25:53 | |
that will prevent it from obtaining a nuclear weapon. | 1:25:53 | 1:25:57 | |
It is a deal in which Iran | 1:25:57 | 1:25:59 | |
will cut its installed centrifuges | 1:25:59 | 1:26:02 | |
by more than two thirds. | 1:26:02 | 1:26:04 | |
Iran will not enrich uranium with its advanced centrifuges | 1:26:04 | 1:26:07 | |
for at least the next ten years. | 1:26:07 | 1:26:09 | |
It will make our country, our allies, and our world safer. | 1:26:09 | 1:26:14 | |
70 years after the murder of 6 million Jews, | 1:26:14 | 1:26:18 | |
Iran's rulers | 1:26:18 | 1:26:20 | |
promise to destroy my country, and the response | 1:26:20 | 1:26:24 | |
from nearly every one of the governments represented here | 1:26:24 | 1:26:28 | |
has been utter silence. | 1:26:28 | 1:26:31 | |
Deafening silence. | 1:26:31 | 1:26:33 | |
Perhaps you can now understand | 1:26:40 | 1:26:43 | |
why Israel is not joining you in celebrating this deal. | 1:26:43 | 1:26:47 | |
History shows that America must lead | 1:26:47 | 1:26:49 | |
not just with our might, but with our principles. | 1:26:49 | 1:26:53 | |
It shows we are stronger | 1:26:53 | 1:26:55 | |
not when we are alone but when we bring the world together. | 1:26:55 | 1:27:00 | |
Today's announcement marks one more chapter | 1:27:00 | 1:27:03 | |
in this pursuit of a safer and a | 1:27:03 | 1:27:06 | |
more hopeful world. | 1:27:06 | 1:27:08 | |
Thank you. God bless you and God bless the United States of America. | 1:27:08 | 1:27:13 | |
-DISTORTED FEMALE VOICE: -Everyone I know is thrilled | 1:27:18 | 1:27:20 | |
with the Iran deal. Sanctions and diplomacy worked, | 1:27:20 | 1:27:24 | |
but behind that deal was a lot of confidence in our cyber capability. | 1:27:24 | 1:27:27 | |
We were everywhere inside Iran, still are. | 1:27:28 | 1:27:32 | |
I'm not going to tell you the operational details | 1:27:32 | 1:27:34 | |
of what we can do, going forward, or where... | 1:27:34 | 1:27:38 | |
but the science-fiction cyber war scenario is here, | 1:27:38 | 1:27:41 | |
and that's Nitro Zeus. | 1:27:41 | 1:27:43 | |
But my concern, and the reason I'm talking... | 1:27:45 | 1:27:47 | |
..is because when you shut down a country's power grid... | 1:27:48 | 1:27:53 | |
it doesn't just pop back up. | 1:27:53 | 1:27:55 | |
You know, it's more like Humpty Dumpty, | 1:27:55 | 1:27:58 | |
and if all the king's men can't turn the lights back on | 1:27:58 | 1:28:02 | |
or filter the water for weeks, | 1:28:02 | 1:28:04 | |
then lots of people die. | 1:28:04 | 1:28:06 | |
And something we can do to others, they can do to us, too. | 1:28:08 | 1:28:12 | |
Is that something that we should keep quiet | 1:28:14 | 1:28:16 | |
or should we talk about it? | 1:28:16 | 1:28:18 | |
I've gone to many people on this film, even friends of mine, | 1:28:18 | 1:28:22 | |
who won't talk to me about the NSA and Stuxnet, even off the record, | 1:28:22 | 1:28:24 | |
for fear of going to jail. | 1:28:24 | 1:28:26 | |
Is that fear protecting us? | 1:28:26 | 1:28:29 | |
No. But it protects me. | 1:28:29 | 1:28:32 | |
Or should I say "we"? | 1:28:32 | 1:28:34 | |
-NO VOICE DISTORTION: -I'm an actor playing a role, | 1:28:35 | 1:28:37 | |
written from the testimony of a small number of people | 1:28:37 | 1:28:40 | |
from NSA and CIA - all of whom are angry about the secrecy, | 1:28:40 | 1:28:43 | |
but too scared to come forward. | 1:28:43 | 1:28:45 | |
Now, we're forward. | 1:28:45 | 1:28:47 | |
Well... | 1:28:47 | 1:28:49 | |
"forward-leaning". | 1:28:49 | 1:28:51 |