

Zero Days: Nuclear Cyber Sabotage

A Storyville documentary: How the American and Israeli intelligence agencies allegedly unleashed self-replicating computer malware to destroy part of an Iranian nuclear facility.



Transcript


Line | From | To

This programme contains some strong language.

0:00:020:00:09

-DISTORTED MALE VOICE:

-Through the darkness

0:00:190:00:23

of the pathways that we march,

0:00:230:00:25

evil and good live side by side,

0:00:250:00:29

and this is the nature of life.

0:00:290:00:31

We are in an unbalanced and un-equivalent confrontation

0:00:460:00:51

between democracies who are obliged to play by the rules

0:00:510:00:55

and entities who think democracy is a joke.

0:00:550:00:59

You can't convince fanatics by saying,

0:01:010:01:04

"Hey, hatred paralyses you, love releases you."

0:01:040:01:10

There are different rules that we have to play by.

0:01:100:01:15

-NEWS REPORT:

-'Today, two of Iran's top nuclear scientists

0:01:290:01:32

'were targeted by hit squads...'

0:01:320:01:34

'Bomb attacks in the capital, Tehran...'

0:01:340:01:36

'The latest in a string of attacks...'

0:01:360:01:37

'Today's attack has all the hallmarks

0:01:370:01:40

'of major strategic sabotage...'

0:01:400:01:41

'Iran immediately accused the US and Israel

0:01:410:01:44

'of trying to damage its nuclear programme...'

0:01:440:01:46

I want to categorically deny any United States involvement

0:01:550:02:01

in any kind of active violence inside Iran.

0:02:010:02:06

Covert actions can help, can assist.

0:02:060:02:11

They are needed. They are not essential all the time.

0:02:110:02:15

And they in no way can replace political wisdom.

0:02:150:02:19

INTERVIEWER: Were the assassinations in Iran

0:02:190:02:22

related to the Stuxnet computer attacks?

0:02:220:02:25

Er, next question, please.

0:02:250:02:27

-NEWS REPORT:

-'Iran's infrastructure is being targeted

0:02:270:02:30

'by a new and dangerously powerful cyber worm.

0:02:300:02:34

'The so-called Stuxnet worm is specifically designed, it seems,

0:02:340:02:37

'to infiltrate and sabotage real world power plants

0:02:370:02:40

'and factories and refineries...'

0:02:400:02:41

'It's not trying to steal information

0:02:410:02:43

'or grab your credit card,

0:02:430:02:44

'it's trying to get into some sort of industrial plant and wreak havoc,

0:02:440:02:47

'try to blow up an engine...'

0:02:470:02:49

'No-one knows who's behind the worm

0:03:050:03:07

'and the exact nature of its mission,

0:03:070:03:08

'but there are fears Iran will hold Israel or America responsible

0:03:080:03:13

'and seek retaliation.'

0:03:130:03:14

'It's not impossible that some group of hackers did it,

0:03:140:03:17

'but the security experts that are studying this

0:03:170:03:19

'really think this required the resources of a nation state.'

0:03:190:03:22

-OK? And speaking.

-OK, ready.

-OK, good. Here we go.

0:03:280:03:31

INTERVIEWER: What impact, ultimately,

0:03:320:03:34

did the Stuxnet attack have? Can you say?

0:03:340:03:37

Er, I don't want to get into the details.

0:03:370:03:39

Since the event has already happened,

0:03:390:03:41

why can't we talk more openly and publicly about Stuxnet?

0:03:410:03:45

Yeah. I mean, my answer's "Because it's classified."

0:03:450:03:49

I won't acknowledge...

0:03:490:03:50

Knowingly offer up anything I consider classified.

0:03:500:03:54

I know that you can't talk much about Stuxnet,

0:03:540:03:56

because Stuxnet is officially classified.

0:03:560:03:59

You're right on both those counts.

0:03:590:04:00

People might find it frustrating not to be able to talk about it

0:04:000:04:03

when it's in the public domain, but...

0:04:030:04:06

-I find it frustrating.

-Yeah, I'm sure you do.

0:04:060:04:09

-I don't answer that question.

-Unfortunately, I can't comment.

0:04:090:04:12

I do not know how to answer that.

0:04:120:04:13

Two answers before you even get started: I don't know, and if I did,

0:04:130:04:17

-we wouldn't talk about it anyway.

-How can you have a debate

0:04:170:04:19

-if everything's secret?

-I think, right now, that's just where we are.

0:04:190:04:22

No-one wants to...

0:04:220:04:23

Countries aren't happy about confessing or owning up

0:04:230:04:27

to what they did, because they're not quite sure

0:04:270:04:29

where they want the system to go.

0:04:290:04:31

And so, whoever was behind Stuxnet hasn't admitted they were behind it.

0:04:310:04:35

Asking officials about Stuxnet was frustrating and surreal.

0:04:380:04:42

Like asking the Emperor about his new clothes.

0:04:420:04:45

Even after the cyber weapon had penetrated computers

0:04:450:04:48

all over the world, no-one was willing to admit that it was loose

0:04:480:04:52

or talk about the dangers it posed.

0:04:520:04:54

What was it about the Stuxnet operation

0:04:540:04:56

that was hiding in plain sight?

0:04:560:04:59

Maybe there was a way the computer code could speak for itself.

0:05:000:05:04

Stuxnet first surfaced in Belarus.

0:05:040:05:07

I started with a call to the man who discovered it

0:05:070:05:10

when his clients in Iran began to panic

0:05:100:05:12

over an epidemic of computer shutdowns.

0:05:120:05:15

Had you ever seen anything quite so sophisticated before?

0:05:150:05:19

On a daily basis, basically,

0:06:390:06:40

we are sifting through a massive haystack,

0:06:400:06:43

looking for that proverbial needle.

0:06:430:06:47

We get millions of pieces of new malicious threats

0:06:470:06:49

and there are millions of attacks going on every single day.

0:06:490:06:52

And not only are we trying to protect people and their computers,

0:06:520:06:55

and their systems, and countries' infrastructure

0:06:550:06:58

from being taken down by those attacks,

0:06:580:07:01

but, more importantly, we have to find attacks that matter.

0:07:010:07:04

When you're talking about that many, impact is extremely important.

0:07:040:07:09

20 years ago, the antivirus companies,

0:07:200:07:21

they were hunting for computer viruses

0:07:210:07:23

because there were not so many.

0:07:230:07:25

So, we had, like, tens or dozens a month

0:07:250:07:28

and they were just in small numbers.

0:07:280:07:31

Now, we collect millions of unique attacks every month.

0:07:310:07:34

This room we call a woodpeckers' room, or a virus lab,

0:07:360:07:39

and this is where virus analysts sit.

0:07:390:07:41

We call them "woodpeckers" because they are pecking the worms,

0:07:410:07:44

network worms and viruses.

0:07:440:07:47

We see, like, three different groups of actors

0:07:470:07:50

behind cyber attacks. They are traditional cybercriminals.

0:07:500:07:53

Those guys are interested only in illegal profit -

0:07:530:07:57

quick and dirty money.

0:07:570:07:59

Activists or hacktivists, they are hacking for fun,

0:07:590:08:02

or hacking to push some political message.

0:08:020:08:04

And the third group is nation states.

0:08:040:08:07

They're interested in high-quality intelligence or sabotage activity.

0:08:070:08:11

Security companies not only share information,

0:08:120:08:15

but we also share binary samples.

0:08:150:08:16

So, when this threat was found by a Belarusian security company

0:08:160:08:20

on one of their customer's machines in Iran,

0:08:200:08:22

the sample was shared amongst the security community.

0:08:220:08:25

When we try to name threats, we just try to pick some sort of string,

0:08:250:08:28

some sort of words, that are inside of the binary.

0:08:280:08:32

In this case, there were a couple of words in there.

0:08:320:08:35

We took pieces of each and that forms "Stuxnet".

0:08:350:08:38

I got the news about Stuxnet from one of my engineers.

0:08:400:08:43

He came to my office, opened the door,

0:08:430:08:46

and he said, "So, Eugene,

0:08:460:08:47

"of course, you know what we're waiting for? Something really bad?

0:08:470:08:52

"It happened."

0:08:520:08:54

Give me some sense of what it was like in the lab at that time.

0:08:590:09:02

Was there a palpable sense of amazement

0:09:020:09:04

that you had something really different there?

0:09:040:09:06

Well, I wouldn't call it amazement.

0:09:060:09:08

It was kind of a shock.

0:09:080:09:11

It went beyond our worst fears, our worst nightmares.

0:09:110:09:14

And this continued the more we analysed,

0:09:140:09:17

the more we researched,

0:09:170:09:19

the more bizarre the whole story got.

0:09:190:09:22

We look at so much malware every day

0:09:220:09:24

that we can just look at the code and say,

0:09:240:09:25

"OK, there's something bad going on here

0:09:250:09:27

"and I need to investigate that."

0:09:270:09:29

That's the way it was when we looked at Stuxnet for the first time.

0:09:290:09:32

We opened it up, and there was just bad things everywhere.

0:09:320:09:34

Like, "OK, this is bad and that's bad, and, you know,

0:09:340:09:37

"we need to investigate this."

0:09:370:09:38

Suddenly, we had, like, 100 questions straightaway.

0:09:380:09:41

The most interesting thing that we do is the detective work

0:09:430:09:45

where we try to track down who's behind a threat.

0:09:450:09:47

What are they doing? What's their motivation?

0:09:470:09:49

And try to really stop it at the root.

0:09:490:09:51

It is kind of all-consuming.

0:09:510:09:53

You get this new puzzle and it's very difficult to put it down.

0:09:530:09:56

You know, work until, like, 4:00am in the morning

0:09:560:09:58

and figure these things out.

0:09:580:10:00

I was in that zone where I was very consumed by this,

0:10:000:10:02

very excited about it,

0:10:020:10:03

very interested to know what was happening.

0:10:030:10:05

And Eric was also in that same sort of zone.

0:10:050:10:08

So, the two of us were, like, back and forth all the time.

0:10:080:10:11

Liam and I continued to grind at the code.

0:10:110:10:14

Sharing pieces, comparing notes,

0:10:140:10:16

bouncing ideas off of each other.

0:10:160:10:18

We realised that we needed to do what we call "deep analysis" -

0:10:180:10:21

pick apart the threat, every single byte, every single zero-one,

0:10:210:10:25

and understand everything that was inside of it.

0:10:250:10:28

I'll just give you some context.

0:10:280:10:29

We can go through and understand every line of code

0:10:290:10:31

for the average threat in minutes.

0:10:310:10:33

And here we are one month into this threat

0:10:330:10:35

and we're just starting to discover

0:10:350:10:37

what we call the "payload", or its whole purpose.

0:10:370:10:39

When looking at the Stuxnet code,

0:10:410:10:43

it's 20 times the size of the average piece of code

0:10:430:10:46

but contains almost no bugs inside of it and that's extremely rare.

0:10:460:10:49

Malicious code always has bugs inside of it.

0:10:490:10:52

This wasn't the case with Stuxnet.

0:10:520:10:53

It's dense and every piece of code does something

0:10:530:10:56

and does something right in order to conduct its attack.

0:10:560:10:59

One of the things that surprised us was that Stuxnet utilised

0:11:000:11:03

what's called a "zero-day exploit".

0:11:030:11:05

Or, basically, a piece of code

0:11:050:11:07

that allows it to spread without you having to do anything.

0:11:070:11:10

You don't have to, for example, download a file and run it.

0:11:100:11:13

A zero-day exploit is an exploit

0:11:130:11:15

that nobody knows about except the attacker.

0:11:150:11:17

So there's no protection against it, there's been no patch released.

0:11:170:11:20

There's been zero days' protection, you know, against it.

0:11:200:11:24

That's what attackers value

0:11:240:11:26

because they know 100%, if they have this zero-day exploit,

0:11:260:11:29

they can get in wherever they want.

0:11:290:11:31

They're actually very valuable. You can sell these on the underground

0:11:310:11:34

for hundreds of thousands of dollars.

0:11:340:11:36

Then we became more worried because we discovered more zero-days.

0:11:360:11:40

And, again, these zero-days are extremely rare.

0:11:400:11:42

Inside Stuxnet we had, you know, four zero-days,

0:11:420:11:44

and for the entire rest of the year

0:11:440:11:47

we only saw 12 zero-days used.

0:11:470:11:48

It blows everything else out of the water.

0:11:480:11:50

We've never seen this before. We've never seen it since, either.

0:11:500:11:53

We've seen one in a malware you could understand,

0:11:530:11:56

because the malware authors are making money,

0:11:560:11:57

they're stealing people's credit cards.

0:11:570:11:59

They're making money, so it's worth their while to use it.

0:11:590:12:02

But seeing four zero-days - which could be worth $500,000 right there -

0:12:020:12:05

used in one piece of malware.

0:12:050:12:06

This is not your ordinary criminal gang who's doing this.

0:12:060:12:09

This is someone bigger.

0:12:090:12:10

It's definitely not traditional crime, not hacktivists.

0:12:100:12:14

Who else?

0:12:140:12:17

It was evident at a very early stage that,

0:12:170:12:20

just given the sophistication of this malware,

0:12:200:12:24

it suggested that there must have been a nation state involved -

0:12:240:12:28

at least one nation state involved in the development.

0:12:280:12:31

When we look at code that's coming from

0:12:310:12:33

what appears to be a state attacker, or state-sponsored attacker,

0:12:330:12:36

usually they're scrubbed clean.

0:12:360:12:37

They don't leave little bits behind.

0:12:370:12:39

They don't leave little hints behind.

0:12:390:12:41

But in Stuxnet there were actually a few hints left behind.

0:12:410:12:44

One was that in order to get lower level access to Microsoft Windows,

0:12:460:12:50

Stuxnet needed to use a digital certificate

0:12:500:12:52

which certifies that this piece of code came from a particular company.

0:12:520:12:57

Now, those attackers obviously couldn't go to Microsoft and say,

0:12:570:13:00

"Hey, test our code out for us and give us a digital certificate."

0:13:000:13:04

So they essentially stole them...

0:13:040:13:06

..from two companies in Taiwan.

0:13:070:13:08

And these two companies have nothing to do with each other except

0:13:080:13:11

for their close proximity in the exact same business park.

0:13:110:13:14

Digital certificates are guarded very, very closely,

0:13:160:13:19

behind multiple doors,

0:13:190:13:21

and they require multiple people to unlock.

0:13:210:13:24

And they need to provide biometrics,

0:23:240:23:26

as well as passphrases.

0:13:260:13:28

It wasn't like those certificates were just sitting on some machine

0:13:280:13:31

connected to the internet. Some human assets had to be involved.

0:13:310:13:34

-O'MURCHU:

-Spies, like a cleaner who comes in at night

0:13:340:13:37

and has stolen these certificates from these companies.

0:13:370:13:40

It did feel like walking onto the set of this James Bond movie

0:13:430:13:47

and you've been embroiled in this thing that,

0:13:470:13:49

you know, you'd never expected.

0:13:490:13:52

We continued to search, and we continued to search in the code,

0:13:540:13:56

and, eventually, we found some other breadcrumbs left

0:13:560:13:59

that we were able to follow.

0:13:590:14:01

It was doing something with Siemens.

0:14:010:14:03

Siemens software, possibly Siemens hardware.

0:14:030:14:05

We'd never, ever seen that in any malware before,

0:14:050:14:07

something targeting Siemens.

0:14:070:14:09

We didn't even know why they would be doing that.

0:14:090:14:11

But after googling, very quickly we understood

0:14:120:14:15

it was targeting Siemens PLCs.

0:14:150:14:17

Stuxnet was targeting a very specific hardware device,

0:14:170:14:20

something called a PLC, or a programmable logic controller.

0:14:200:14:24

-LANGNER:

-The PLC is kind of a very small computer

0:14:240:14:27

attached to physical equipment,

0:14:270:14:30

like pumps, like valves, like motors.

0:14:300:14:33

So, this little box is running a digital program

0:14:330:14:38

and the actions of this program

0:14:380:14:40

turn that motor on or off, or set a specific speed.

0:14:400:14:44

Those programmable logic controllers

0:14:440:14:46

control things like power plants, power grids.

0:14:460:14:48

This is used in factories, it's used in critical infrastructure.

0:14:480:14:53

Critical infrastructure, it's everywhere around us.

0:14:530:14:56

Transportation, telecommunication, financial services, health care...

0:14:560:15:01

So, the payload of Stuxnet

0:15:010:15:02

was designed to attack some very important part of our world.

0:15:020:15:08

The payload is going to be important.

0:15:080:15:10

What happens there could be very dangerous.

0:15:100:15:12

-LANGNER:

-The next very big surprise came

0:15:140:15:17

when we infected our lab system.

0:15:170:15:20

We figured out that the malware was probing the controls.

0:15:200:15:24

It was quite picky on its target.

0:15:240:15:27

It didn't try to manipulate any given control

0:15:270:15:30

in a network that it would see.

0:15:300:15:32

It went through several checks

0:15:320:15:34

and when those checks failed, it would not implement the attack.

0:15:340:15:39

It was obviously probing for a specific target.

0:15:410:15:44

You've got to put this in context that, at the time,

0:15:460:15:48

we already knew, "Well, this was the most sophisticated piece of malware

0:15:480:15:52

"that we have ever seen."

0:15:520:15:54

So, it's kind of strange.

0:15:540:15:56

Somebody takes that huge effort

0:15:560:15:59

to hit that one specific target?

0:15:590:16:01

Well, that must be quite a significant target.

0:16:010:16:04

-CHIEN:

-At Symantec, we have probes on networks all over the world

0:16:070:16:10

watching for malicious activity.

0:16:100:16:13

-O'MURCHU:

-We'd seen infections of Stuxnet all over the world.

0:16:130:16:15

In the US, in Australia, in the UK,

0:16:150:16:18

France, Germany, all over Europe.

0:16:180:16:20

It spread to any Windows machine in the entire world.

0:16:200:16:23

You know, we had these organisations inside the United States.

0:16:230:16:26

They were in charge of industrial control facilities saying,

0:16:260:16:29

"We're infected, what's going to happen?"

0:16:290:16:31

We didn't know if there was a deadline coming up

0:16:310:16:33

where this threat would trigger and suddenly would, like,

0:16:330:16:36

turn off all electricity plants around the world

0:16:360:16:38

or it would start shutting things down or launching some attack.

0:16:380:16:42

We knew that Stuxnet could have very dire consequences.

0:16:420:16:46

And we were very worried about what the payload contained

0:16:460:16:49

and there was an imperative of speed:

0:16:490:16:52

we had to race to try and beat this ticking bomb.

0:16:520:16:56

Eventually, we were able to refine this a little bit

0:16:560:16:58

and we saw that Iran was the number one infected country in the world.

0:16:580:17:01

That immediately raised our eyebrows.

0:17:010:17:04

We have never seen a threat before

0:17:040:17:06

where it was predominantly in Iran.

0:17:060:17:08

And so we began to follow what was going on in the geopolitical world.

0:17:080:17:12

What was happening in the general news. And, at that time,

0:17:120:17:15

there were actually multiple explosions of gas pipelines

0:17:150:17:18

going in and out of Iran.

0:17:180:17:20

Unexplained explosions.

0:17:200:17:21

And, of course, we did notice that, at the time,

0:17:230:17:26

there had been assassinations of nuclear scientists,

0:17:260:17:28

so that was worrying.

0:17:280:17:30

We knew there was something bad happening.

0:17:300:17:33

Did you get concerned for yourself?

0:17:330:17:35

Did you start looking over your shoulder from time to time?

0:17:350:17:38

Yeah, definitely looking over my shoulder

0:17:380:17:40

and being careful about what I spoke about on the phone.

0:17:400:17:43

Um... I was...

0:17:430:17:45

pretty confident my conversations on the phone were being listened to.

0:17:450:17:48

We were only half joking,

0:17:480:17:50

when we would look at each other and tell each other things like,

0:17:500:17:54

"Look, I'm not suicidal.

0:17:540:17:56

"If I drop dead on Monday, it wasn't me."

0:17:560:18:00

We'd been publishing information about Stuxnet

0:18:080:18:11

all through that summer.

0:18:110:18:13

And then, in November,

0:18:130:18:14

an industrial control systems expert in Holland contacted us.

0:18:140:18:18

And he said, "All of these devices

0:18:180:18:21

"that would be inside of an industrial control system

0:18:210:18:23

"hold a unique identifier number

0:18:230:18:26

"that identified the make and model of that device."

0:18:260:18:28

And we actually had a couple of these numbers in the code,

0:18:300:18:33

except we didn't know what they were.

0:18:330:18:36

And so we realised maybe what he was referring to

0:18:360:18:38

was the magic numbers we had.

0:18:380:18:39

And when we searched for those magic numbers in that context,

0:18:390:18:42

we saw that what had to be connected to this industrial control system

0:18:420:18:45

that was being targeted

0:18:450:18:46

were something called "frequency converters"

0:18:460:18:49

from two specific manufacturers, one of which was in Iran.

0:18:490:18:52

And so, at this time, we absolutely knew

0:18:520:18:54

that the facility that was being targeted had to be in Iran,

0:18:540:18:58

and it had equipment made by Iranian manufacturers.

0:18:580:19:01

When we looked up those frequency converters,

0:19:010:19:04

we immediately found out that they were actually export controlled

0:19:040:19:07

by the Nuclear Regulatory Commission.

0:19:070:19:08

And that immediately led us, then, to some nuclear facility.

0:19:080:19:13

This was more than a computer story,

0:19:290:19:31

so I left the world of the antivirus detectives

0:19:310:19:34

and sought out journalist David Sanger,

0:19:340:19:36

who specialised in the strange intersection of cyber,

0:19:360:19:39

nuclear weapons and espionage.

0:19:390:19:41

The emergence of the code is what put me on alert

0:19:420:19:45

that an attack was underway.

0:19:450:19:47

And because of the covert nature of the operation,

0:19:480:19:51

not only were official government spokesmen unable to talk about it,

0:19:510:19:56

they didn't even KNOW about it.

0:19:560:19:58

Eventually, the more I dug into it,

0:19:580:20:01

the more I began to find individuals

0:20:010:20:04

who had been involved in some piece of it

0:20:040:20:07

or who had witnessed some piece of it.

0:20:070:20:10

And that meant talking to Americans,

0:20:100:20:12

talking to Israelis, talking to Europeans,

0:20:120:20:15

because this was, obviously, the first, biggest

0:20:150:20:19

and most sophisticated example

0:20:190:20:22

of a state or two states using a cyber weapon for offensive purposes.

0:20:220:20:26

I came to this with a fair bit of history -

0:20:290:20:32

understanding the Iranian nuclear programme.

0:20:320:20:36

How did Iran get its first nuclear reactor?

0:20:360:20:40

We gave it to them under the Shah,

0:20:400:20:43

because the Shah was considered an American ally.

0:20:430:20:47

APPLAUSE

0:20:470:20:49

-SAMORE:

-But the revolution which overthrew the Shah in '79

0:20:490:20:53

really curtailed the programme

0:20:530:20:55

before it ever got any head of steam going.

0:20:550:20:58

Part of our policy against Iran after the revolution

0:20:580:21:02

was to deny them nuclear technology,

0:21:020:21:05

so most of the period, when I was involved, in the '80s and the '90s,

0:21:050:21:10

was the US running around the world

0:21:100:21:13

and persuading potential nuclear suppliers

0:21:130:21:16

not to provide even peaceful nuclear technology to Iran.

0:21:160:21:19

And what we missed was the clandestine transfer

0:21:190:21:22

in the mid-1980s from Pakistan to Iran.

0:21:220:21:26

-MOWATT-LARSSEN:

-Abdul Qadeer Khan is what we would call

0:21:290:21:32

the father of the Pakistan nuclear programme.

0:21:320:21:35

He had the full authority and confidence

0:21:350:21:37

of the Pakistan Government from its inception

0:21:370:21:39

to the production of nuclear weapons.

0:21:390:21:42

The AQ Khan network is so notable

0:21:440:21:46

because, aside from

0:21:460:21:48

building the Pakistani programme

0:21:480:21:50

for decades,

0:21:500:21:52

it also was the means by which other countries

0:21:520:21:56

were able to develop nuclear weapons - including Iran.

0:21:560:21:59

-SAMORE:

-By 2006, the Iranians had started producing

0:21:590:22:02

low-enriched uranium, producing more centrifuges, installing them

0:22:020:22:06

at the large-scale underground enrichment facility at Natanz.

0:22:060:22:09

How many times have you visited Natanz?

0:23:000:23:02

Not that many, because I left the IAEA a few years ago already,

0:23:020:23:05

but I was there quite a few times.

0:23:050:23:08

Natanz is in the middle of the desert.

0:23:110:23:14

When they were building it in secret,

0:23:160:23:18

they were calling it a "desert irrigation facility".

0:23:180:23:22

There is a lot of artillery and air force.

0:23:240:23:27

It's better protected against attack from the air

0:23:270:23:31

than any other nuclear installation I have seen.

0:23:310:23:34

And so, all the monitoring activities of the IAEA,

0:23:380:23:41

they follow a basic principle - you want to see what goes in, what goes out,

0:23:410:23:44

and then, on top of that,

0:23:440:23:46

you make sure that it produces low-enriched uranium.

0:23:460:23:49

Does it have anything to do with the higher enrichments

0:23:490:23:52

and nuclear-weapon-grade uranium?

0:23:520:23:54

Iran's nuclear facilities are under 24-hour watch

0:24:000:24:03

of the United Nations nuclear watchdog, the IAEA,

0:24:030:24:07

the International Atomic Energy Agency.

0:24:070:24:10

Every single gram of Iranian fissile material...

0:24:100:24:14

..is accounted for.

0:24:160:24:17

-HEINONEN:

-When you look at the uranium which was there in Natanz,

0:24:200:24:24

it was a very special uranium.

0:24:240:24:26

It contained the isotope uranium-236.

0:24:260:24:29

And that was a puzzle to us,

0:24:290:24:31

because you only see this sort of uranium

0:24:310:24:34

in states which have nuclear weapons.

0:24:340:24:37

We realised that they had cheated us.

0:24:380:24:41

This sort of equipment has been bought from

0:24:410:24:44

what they call a black market.

0:24:440:24:47

They never point it out to...

0:24:470:24:48

They were caught at that point in time.

0:24:480:24:51

What surprised me was the sophistication

0:24:510:24:55

and the quality control.

0:24:550:24:56

The way they have the manufacturing, it was really professional.

0:24:560:25:00

It was not something, you know,

0:25:000:25:01

you just create in a few months' time.

0:25:010:25:04

This was the result of a long process.

0:25:040:25:07

The centrifuge. You feed uranium gas

0:25:130:25:16

in and you have a cascade, thousands of centrifuges,

0:25:160:25:19

and from the other end, you get enriched uranium out.

0:25:190:25:23

It separates uranium based on spinning the rotor,

0:25:230:25:26

it spins so fast.

0:25:260:25:28

300 metres per second.

0:25:280:25:30

The same as the velocity of sound.

0:25:300:25:33

These are tremendous forces and, as a result,

0:25:340:25:37

the rotor, it twists and looks like a banana at one point of time.

0:25:370:25:41

So, it has to be in balance,

0:25:410:25:44

because any small vibration, it would blow up.

0:25:440:25:47

This is what makes them very difficult to manufacture.

0:25:470:25:49

You can model it, you can calculate it, but at the very end,

0:25:490:25:53

it's actually based on practice and experience,

0:25:530:25:57

so it's a piece of art, so to say.

0:25:570:26:00

'Ahmadinejad came into his presidency saying that,'

0:26:140:26:18

"If the international community wants to derail us,

0:26:180:26:20

"we will stand up to it.

0:26:200:26:22

"If they want us to sign more inspections

0:26:220:26:25

"and more additional protocols and other measures, no, we will not.

0:26:250:26:29

"We will fight for our right.

0:26:290:26:31

"Iran is a signatory to the nuclear Non-Proliferation Treaty.

0:26:310:26:35

"And under that treaty, Iran has the right to a nuclear programme.

0:26:350:26:39

"We can have enrichment.

0:26:390:26:40

"Who are you, world powers,

0:26:400:26:42

"to come and tell us that we cannot have enrichment?"

0:26:420:26:45

This was his mantra.

0:26:450:26:47

And it galvanised the public.

0:26:470:26:51

By 2007, 2008,

0:26:540:26:56

the US Government was in a very bad place with the Iranian programme.

0:26:560:27:00

President Bush recognised

0:27:010:27:03

that he could not even come out in public and declare

0:27:030:27:05

that the Iranians were building a nuclear weapon

0:27:050:27:07

because, by this time,

0:27:070:27:09

he had gone through the entire WMD fiasco in Iraq.

0:27:090:27:13

He could not really take military action.

0:27:130:27:16

Condoleezza Rice said to him at one point,

0:27:160:27:18

"You know, Mr President,

0:27:180:27:19

"I think you've invaded your last Muslim country,

0:27:190:27:22

"even for the best of reasons."

0:27:220:27:25

He didn't want to let the Israelis conduct the military operation.

0:27:260:27:31

'It's 1938,'

0:27:310:27:33

and Iran is Germany and it's racing

0:27:330:27:37

to arm itself with atomic bombs.

0:27:370:27:40

Iran's nuclear ambitions must be stopped and have to be stopped.

0:27:400:27:46

We all have to stop it now.

0:27:460:27:48

That's the one message I have for you today.

0:27:480:27:51

-Thank you.

-APPLAUSE

0:27:510:27:53

Israel was saying they were going to bomb Iran.

0:27:530:27:56

And the government here in Washington

0:27:560:27:58

did all sorts of scenarios about what would happen

0:27:580:28:01

if that Israeli attack occurred.

0:28:010:28:04

They were all very ugly scenarios.

0:28:040:28:06

Our belief was that, if they went on their own,

0:28:060:28:09

knowing their limitations...

0:28:090:28:10

They have a very good air force, all right,

0:28:100:28:12

but it's small and the distances are great

0:28:120:28:15

and the targets dispersed and hardened.

0:28:150:28:17

If they would have attempted a raid on a military plane,

0:28:170:28:23

we would have been assuming that they were assuming

0:28:230:28:26

we would finish that which they started.

0:28:260:28:28

In other words, there would be many of us in government

0:28:280:28:31

thinking that the purpose of the raid

0:28:310:28:33

wasn't to destroy the Iranian nuclear system,

0:28:330:28:35

but the purpose of the raid was to put us at war with Iran.

0:28:350:28:38

The two countries agreed on the goal.

0:28:400:28:43

There is not a page between us

0:28:430:28:46

that Iran should not have a nuclear military capability.

0:28:460:28:51

There are some differences on how to achieve it

0:28:510:28:56

and when action is needed.

0:28:560:28:58

We are taking very seriously leaders of countries

0:29:070:29:10

who call for the destruction and annihilation of our people.

0:29:100:29:14

-SAMORE:

-The Israelis believe that the Iranian leadership

0:29:140:29:18

has already made the decision to build nuclear weapons

0:29:180:29:21

when they think they can get away with it. The view in the US

0:29:210:29:24

is that the Iranians haven't made that final decision yet.

0:29:240:29:29

To me, that doesn't make any difference.

0:29:290:29:31

I mean, it really doesn't make any difference,

0:29:310:29:33

and it's probably unknowable.

0:29:330:29:35

Unless you can put Supreme Leader Khamenei on the couch

0:29:350:29:38

and interview him, I think, from our standpoint,

0:29:380:29:41

stopping Iran from getting the threshold capacity

0:29:410:29:45

is the primary policy objective.

0:29:450:29:48

Once they had the fissile material,

0:29:490:29:50

once they had the capacity to produce nuclear weapons,

0:29:500:29:53

then the game is lost.

0:29:530:29:55

-HAYDEN:

-President Bush once said to me, he said,

0:30:000:30:02

"Mike, I don't want any president ever to be faced

0:30:020:30:05

"with only two options - bombing or the bomb." Right?

0:30:050:30:09

He wanted options that...

0:30:090:30:11

made it...

0:30:110:30:14

made it far less likely he or his successor, or successors,

0:30:140:30:18

would ever get to that point where that's all you've got.

0:30:180:30:20

The intelligence cooperation between Israel and the United States

0:30:200:30:24

is very, very good.

0:30:240:30:26

And, therefore, the Israelis went to the Americans and said,

0:30:260:30:29

"OK, guys, you don't want us to bomb Iran.

0:30:290:30:32

"OK, let's do it differently."

0:30:320:30:36

One day a group of intelligence and military officials showed up

0:30:360:30:41

in President Bush's office and said, "Sir, we have an idea.

0:30:410:30:46

"It's a big risk, it might not work, but here it is."

0:30:460:30:50

-LANGNER:

-Moving forward in my analysis of the code,

0:30:570:31:00

I took a closer look at the photographs

0:31:000:31:03

that have been published by the Iranians themselves

0:31:030:31:08

in a press tour from 2008

0:31:080:31:11

of Ahmadinejad and the shiny centrifuges.

0:31:110:31:14

The photographs of Ahmadinejad going through the centrifuges at Natanz

0:31:150:31:20

provided some very important clues.

0:31:200:31:24

There was a huge amount to be learned.

0:31:240:31:26

First of all, those photographs

0:31:340:31:36

showed many of the individuals

0:31:360:31:38

who were guiding Ahmadinejad through the programme.

0:31:380:31:41

And there's one very famous photograph

0:31:410:31:43

that shows Ahmadinejad being shown something.

0:31:430:31:46

You see his face, you can't see what's on the computer.

0:31:460:31:48

And one of the scientists who was behind him

0:31:480:31:52

was assassinated a few months later.

0:31:520:31:55

In one of those photographs,

0:31:580:32:00

you could see parts of a computer screen.

0:32:000:32:03

We refer to that as a "SCADA screen".

0:32:030:32:06

The SCADA system is basically a piece of software

0:32:060:32:08

running on a computer.

0:32:080:32:10

It enables the operators to monitor the process.

0:32:100:32:13

What you could see...

0:32:140:32:16

when you look close enough

0:32:160:32:19

was a more detailed view of the configuration.

0:32:190:32:23

There were these six groups of centrifuges

0:32:230:32:27

and each group had 164 entries.

0:32:270:32:29

And guess what?

0:32:310:32:32

That was a perfect match to what we saw in the attack code.

0:32:320:32:36

It was absolutely clear that this piece of code

0:32:370:32:40

was attacking an array of six different groups of,

0:32:400:32:44

let's just say "thingies", physical objects,

0:32:440:32:48

and in those six groups,

0:32:480:32:50

there were 164 elements.

0:32:500:32:53
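The configuration match Langner describes can be sketched in a few lines. This is an editorial illustration, not the attack code itself: it simply shows that the layout targeted by the code (six groups of 164 elements) is exactly the cascade layout visible on the published screen photograph.

```python
# A minimal sketch of the configuration match described above: the attack
# code addresses six groups ("cascades") of 164 elements each, the same
# layout read off the computer screen in the published photographs.
CASCADES = 6
PER_CASCADE = 164

attack_code_layout = [PER_CASCADE] * CASCADES
screen_layout = [164, 164, 164, 164, 164, 164]  # as seen in the photo

assert attack_code_layout == screen_layout  # a perfect match
print(sum(attack_code_layout))              # 984 centrifuges addressed in all
```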

Were you able to do any actual physical tests?

0:32:570:32:59

Or was it all just code analysis?

0:32:590:33:01

So, we couldn't set up our own nuclear enrichment facility.

0:33:010:33:06

So, what we did was obtain some PLCs, the exact models.

0:33:060:33:09

We then ordered an air pump.

0:33:160:33:18

And that's what we used sort of as our proof of concept.

0:33:180:33:21

-O'MURCHU:

-We needed a visual demonstration

0:33:210:33:23

to show people what we discovered.

0:33:230:33:25

So, we thought of different things that we could do

0:33:250:33:27

and we settled on blowing up a balloon.

0:33:270:33:29

We were able to write a program that would inflate a balloon

0:33:330:33:36

and it was set to stop after five seconds.

0:33:360:33:38

So, we would inflate the balloon to a certain size,

0:33:470:33:50

but we wouldn't burst the balloon, and it was all safe.

0:33:500:33:52

And we showed everybody, "This is the code that's on the PLC."

0:33:520:33:55

And the timer says, "Stop after five seconds".

0:33:550:33:58

We know that's what's going to happen.

0:33:580:34:00

And then we would infect the computer with Stuxnet

0:34:000:34:03

and we would run the test again.

0:34:030:34:05
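The logic of the balloon demonstration can be sketched as follows. This is a toy model, not real PLC ladder logic: the legitimate program stops the air pump when the five-second timer expires, while the infected variant ignores the timer check, so the pump keeps inflating.

```python
# A minimal sketch of the balloon demo: clean PLC logic honours the timer,
# infected logic intercepts the timer check and never stops the pump.
def run_pump(limit_ticks: int, infected: bool, hard_stop: int = 300) -> int:
    ticks = 0
    while ticks < hard_stop:  # hard_stop only keeps this sketch finite
        if not infected and ticks >= limit_ticks:
            break             # the timer check the malware intercepts
        ticks += 1            # one more tick of inflation
    return ticks

clean = run_pump(50, infected=False)     # stops at the timer limit
infected = run_pump(50, infected=True)   # blows straight past it
assert clean < infected
```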

Here is a piece of software that should only exist in the cyber realm

0:34:350:34:39

and it is able to infect physical equipment in a plant or factory

0:34:390:34:44

and cause physical damage.

0:34:440:34:46

Real-world physical destruction.

0:34:460:34:48

At that time, things became very scary to us.

0:34:510:34:53

Here you had malware potentially killing people

0:34:530:34:56

and that was something that was always Hollywood-esque to us,

0:34:560:34:59

that we would always laugh at, when people make that kind of assertion.

0:34:590:35:02

At this point, you had to have started developing theories

0:35:060:35:10

as to who had built Stuxnet.

0:35:100:35:14

It wasn't lost on us

0:35:140:35:15

that there were probably only a few countries in the world

0:35:150:35:20

that would want and have the motivation

0:35:200:35:22

to sabotage the Iranians' nuclear enrichment facility.

0:35:220:35:25

The US Government would be up there.

0:35:250:35:27

The Israeli government, certainly, would be up there.

0:35:270:35:29

You know, maybe UK, France, Germany, those sorts of countries,

0:35:290:35:32

but we never found any information

0:35:320:35:35

that would tie it back 100% to those countries.

0:35:350:35:38

There are no telltale signs.

0:35:380:35:40

The attackers don't leave a message inside saying,

0:35:400:35:42

you know, "It was me!"

0:35:420:35:44

And even if they did,

0:35:440:35:46

all of that stuff can be faked.

0:35:460:35:48

So, it's very, very difficult

0:35:480:35:50

to do attribution when looking at computer code.

0:35:500:35:53

Subsequent work that's been done

0:35:530:35:55

leads us to believe that this was the work of a collaboration

0:35:550:35:58

between Israel and the United States.

0:35:580:36:00

Did you have any evidence

0:36:000:36:01

in terms of your analysis that would lead you

0:36:010:36:03

to believe that that's correct, also?

0:36:030:36:05

Nothing that I could talk about on camera.

0:36:050:36:08

INTERVIEWER CHUCKLES

-Can I ask why?

-No.

0:36:090:36:12

Well, you can, but I won't answer.

0:36:130:36:15

BOTH LAUGH

0:36:150:36:18

But even in the case of nation states, one of the concerns...

0:36:180:36:20

'This was beginning to really piss me off.

0:36:200:36:23

'Even civilians with an interest in telling the Stuxnet story

0:36:230:36:26

'were refusing to address the role of Tel Aviv and Washington.

0:36:260:36:31

'But, luckily for me,

0:36:310:36:32

'whilst DC is a city of secrets,

0:36:320:36:34

'it is also a city of leaks.

0:36:340:36:37

'They're as regular as a heartbeat and just as hard to stop.

0:36:370:36:41

'That's what I was counting on.'

0:36:410:36:43

'Finally, after speaking to a number of people on background,

0:36:470:36:51

'I did find a way of confirming, on the record,

0:36:510:36:53

'the American role in Stuxnet.

0:36:530:36:55

'In exchange for details of the operation,

0:36:550:36:58

'I had to agree to find a way

0:36:580:37:00

'to disguise the source of the information.'

0:37:000:37:03

-We're good?

-We're on.

0:37:030:37:05

So, the first question I have to ask you is about secrecy.

0:37:050:37:09

I mean, at this point, everyone knows about Stuxnet.

0:37:090:37:12

Why can't we talk about it?

0:37:120:37:14

-DISTORTED WOMAN'S VOICE:

-It's a covert operation.

0:37:140:37:16

Not any more. We know what happened, we know who did it.

0:37:160:37:19

Well, maybe you don't know as much as we think you know.

0:37:190:37:22

I'm talking to you because I want to get the story right.

0:37:240:37:26

That's the same reason I'm talking to you.

0:37:260:37:28

Even though it's a covert operation?

0:37:310:37:32

Well, this is not a Snowden kind of thing.

0:37:340:37:37

OK? I think what he did was wrong. He went too far.

0:37:370:37:40

He gave away too much.

0:37:400:37:42

Unlike Snowden, who was a contractor, I was in the NSA.

0:37:420:37:46

I believe in the agency, so what I'm willing to give you will be limited,

0:37:460:37:49

but we're talking because everyone's getting the story wrong

0:37:490:37:52

and we have to get it right. We have to understand these new weapons.

0:37:520:37:55

-The stakes are too high.

-What do you mean?

0:37:550:37:57

We did Stuxnet.

0:37:590:38:02

It's a fact.

0:38:020:38:04

You know, we came so fucking close to disaster,

0:38:040:38:07

and we're still on the edge.

0:38:070:38:10

It was a huge multinational inter-agency operation.

0:38:100:38:15

In the US, it was CIA,

0:38:160:38:19

NSA, and the military, Cyber Command.

0:38:190:38:23

From Britain, we used Iran intel out of GCHQ.

0:38:230:38:27

But the main partner was Israel.

0:38:270:38:29

Over there, Mossad ran the show

0:38:290:38:31

and the technical work was done by Unit 8200.

0:38:310:38:34

Israel is really the key to the story.

0:38:340:38:37

Our traffic in Israel is so unpredictable...

0:38:410:38:44

Yossi, how did you get into this Stuxnet story?

0:38:460:38:50

I have been covering the Israeli intelligence, in general,

0:38:500:38:54

and the Mossad in particular

0:38:540:38:56

for nearly 30 years.

0:38:560:38:59

I knew that Israel is trying to slow down Iran's nuclear programme

0:38:590:39:04

and, therefore, I came to the conclusion

0:39:040:39:06

that if there was a virus affecting Iran's computers,

0:39:060:39:10

it's one more element in this larger picture.

0:39:100:39:15

Amos Yadlin, General Yadlin,

0:39:160:39:19

he was the head of the military intelligence.

0:39:190:39:22

The biggest unit within that organisation is Unit 8200.

0:39:220:39:27

They bug telephones, they bug faxes, they break into computers.

0:39:270:39:32

A decade ago, when Yadlin became the Chief Of Military Intelligence,

0:39:340:39:38

there was no cyber warfare unit in 8200.

0:39:380:39:43

So, they started recruiting very talented people, hackers,

0:39:460:39:50

either from the military or outside the military

0:39:500:39:53

that can contribute to the project of building a cyber warfare unit.

0:39:530:39:57

It's another kind of weapon and it's for unlimited range,

0:39:590:40:02

in a very high speed and in a very low signature.

0:40:020:40:07

So this gives you a huge opportunity,

0:40:070:40:10

and the superpowers have to change the way we think about warfare.

0:40:100:40:15

Finally, we are transforming our military for a new kind of war

0:40:170:40:20

that we're fighting now...

0:40:200:40:22

..and for wars of tomorrow.

0:40:230:40:25

-SANGER:

-Back in the end of the Bush administration,

0:40:280:40:31

people in the US Government

0:40:310:40:33

were just beginning to convince President Bush to pour money

0:40:330:40:36

into offensive cyber weapons.

0:40:360:40:39

Stuxnet started off in the Defense Department.

0:40:390:40:43

Then Robert Gates, the Secretary of Defense,

0:40:430:40:45

reviewed this program and he said,

0:40:450:40:47

"This program shouldn't be in the Defense Department.

0:40:470:40:50

"This should be under the covert authorities

0:40:500:40:52

"over in the intelligence world."

0:40:520:40:55

So, the CIA was very deeply involved in this operation,

0:40:550:41:00

while much of the coding work

0:41:000:41:01

was done by the National Security Agency and Unit 8200 -

0:41:010:41:06

its Israeli equivalent - working together

0:41:060:41:09

with a newly created military position called US Cyber Command.

0:41:090:41:14

And, interestingly, the Director of the National Security Agency

0:41:140:41:19

would also have a second role

0:41:190:41:21

as the Commander of US Cyber Command.

0:41:210:41:25

And US Cyber Command is located at Fort Meade,

0:41:250:41:30

in the same building as the NSA.

0:41:300:41:33

-HAYDEN:

-NSA has no legal authority to attack.

0:41:330:41:35

It's never had it, I doubt that it ever will.

0:41:350:41:38

It might explain why US Cyber Command is sitting out of Fort Meade

0:41:380:41:41

on top of the National Security Agency.

0:41:410:41:43

Because NSA has the abilities to do these things.

0:41:430:41:46

Cyber Command has the AUTHORITY to do these things,

0:41:460:41:49

and "these things" here refer to the cyber attack.

0:41:490:41:52

This is a huge change for the nature of the intelligence agencies.

0:41:520:41:58

The NSA is supposed to be a code-making

0:41:580:42:01

and code-breaking operation,

0:42:010:42:04

to monitor the communications of foreign powers

0:42:040:42:07

and American adversaries

0:42:070:42:09

in the defence of the United States.

0:42:090:42:11

But creating a Cyber Command

0:42:110:42:14

meant using the same technology to do offensive work.

0:42:140:42:18

Once you get inside an adversary's computer networks,

0:42:200:42:24

you put an implant in that network,

0:42:240:42:27

and we have tens of thousands of foreign computers and networks

0:42:270:42:30

that the United States has put implants in.

0:42:300:42:33

You can use it to monitor what's going across that network

0:42:330:42:36

and you can use it to insert cyber weapons, malware.

0:42:360:42:41

If you can spy on a network, you can manipulate it.

0:42:410:42:45

It's already included. The only thing you need is an act of will.

0:42:450:42:50

-DISTORTED FEMALE VOICE:

-I played a role in Iraq.

0:42:530:42:56

I can't tell you whether it was military or not,

0:42:560:42:58

but I can tell you NSA had combat support teams in the country

0:42:580:43:02

and, for the first time,

0:43:020:43:04

units in the field had direct access to NSA intel.

0:43:040:43:07

Over time, we thought more about offence than defence.

0:43:100:43:13

More about attacking than intelligence.

0:43:130:43:16

In the old days, units would try to track radios,

0:43:160:43:19

but through NSA in Iraq,

0:43:190:43:21

we had access to all the networks going in and out of the country.

0:43:210:43:25

We hoovered up every text message, e-mail and phone call.

0:43:250:43:29

The complete surveillance state.

0:43:290:43:31

We could find the bad guys.

0:43:310:43:33

Say, a gang making IEDs -

0:43:330:43:36

map their networks and follow them in real-time.

0:43:360:43:40

We could lock into cellphones,

0:43:400:43:41

even when they were off, send a fake text message from a friend,

0:43:410:43:45

suggest a meeting place and then capture...

0:43:450:43:48

-SOLDIER:

-'You're clear to fire.'

0:43:480:43:50

..or kill.

0:43:500:43:51

I was in TAO S321, the ROC.

0:43:530:43:57

OK, the TAO? The ROC?

0:43:570:44:00

Right, sorry, TAO is Tailored Access Operations.

0:44:000:44:03

It's where NSA's hackers work. Of course, we didn't call them that.

0:44:030:44:06

What did you call them?

0:44:060:44:08

On-net operators. They're the only people at NSA

0:44:080:44:11

allowed to break in or attack on the internet.

0:44:110:44:14

Inside TAO headquarters is the ROC - "Remote Operations Center".

0:44:140:44:18

If the US Government wants to get in somewhere,

0:44:180:44:23

it goes to the ROC.

0:44:230:44:25

I mean, we were flooded with requests.

0:44:250:44:27

So many that we could only do about 30% of the missions

0:44:270:44:32

that were requested of us at any one time.

0:44:320:44:33

Through the web, but also by hijacking shipments of parts.

0:44:330:44:38

You know, sometimes the CIA

0:44:380:44:40

would assist in putting implants in machines.

0:44:400:44:44

So, once inside a target network,

0:44:440:44:47

we could just...watch...

0:44:470:44:51

..or we could attack.

0:44:520:44:54

Inside NSA was a strange kind of culture -

0:44:580:45:01

like two parts macho military

0:45:010:45:03

and two parts cyber geek.

0:45:030:45:06

I mean, I came from Iraq, so I was used to, "Yes, sir!" "No, sir!"

0:45:060:45:09

but for the weapons programmers,

0:45:090:45:11

we needed more "think outside the box" types.

0:45:110:45:14

Were they all working on Stuxnet?

0:45:140:45:17

We never called it Stuxnet.

0:45:170:45:19

That was the name invented by the anti-virus guys.

0:45:190:45:22

When it hit the papers - we're not allowed

0:45:220:45:24

to read about classified operations even if it's in the New York Times -

0:45:240:45:27

we went out of our way to avoid the term.

0:45:270:45:29

I mean, saying "Stuxnet" out loud

0:45:290:45:30

was like saying "Voldemort" in Harry Potter -

0:45:300:45:32

the Name That Shall Not Be Spoken.

0:45:320:45:34

What did you call it, then?

0:45:340:45:36

The Natanz attack, and this is out there already,

0:45:430:45:48

was called Olympic Games or OG.

0:45:480:45:51

There was a huge operation to test the code

0:45:540:45:58

on PLCs here at Fort Meade,

0:45:580:46:00

and in Sandia, New Mexico.

0:46:000:46:02

Remember during the Bush era,

0:46:040:46:06

when Libya turned over all of its centrifuges?

0:46:060:46:08

Those were the same models the Iranians got from AQ Khan, P1s.

0:46:080:46:12

We took them to Oak Ridge and used them to test the code,

0:46:130:46:17

which demolished the insides.

0:46:170:46:20

At Dimona, the Israelis also tested on the P1s.

0:46:200:46:24

Then, probably by using our intel on Iran,

0:46:250:46:28

we got the plans for the newer models, the IR2s.

0:46:280:46:31

We tried out different attack vectors.

0:46:310:46:34

We ended up focusing on ways to destroy the rotor tubes.

0:46:340:46:39

In the tests we ran, we blew them apart.

0:46:390:46:42

They swept up the pieces, they put it on an aeroplane,

0:46:440:46:46

they flew to Washington, they stuck it in a truck,

0:46:460:46:49

they drove it through the gates of the White House,

0:46:490:46:52

and dumped the shards out

0:46:520:46:53

on the conference room table in the Situation Room,

0:46:530:46:56

and then they invited President Bush to come down and take a look.

0:46:560:47:00

And when he could pick up the shard of a piece of centrifuge,

0:47:000:47:04

he was convinced this might be worth it,

0:47:040:47:07

and he said, "Go ahead and try."

0:47:070:47:09

Was there a legal concern inside the Bush administration

0:47:090:47:12

that this might be an act of undeclared war?

0:47:120:47:16

If there were concerns, I haven't found them.

0:47:160:47:19

That doesn't mean that they didn't exist

0:47:200:47:23

and that some lawyers somewhere were concerned about it,

0:47:230:47:27

but this was an entirely new territory.

0:47:270:47:30

At the time, there were only very few people who had expertise

0:47:300:47:34

specifically on the law of war and cyber.

0:47:340:47:36

And what we did was, looking at,

0:47:360:47:38

"OK, here's our broad direction.

0:47:380:47:40

"Now let's look, technically,

0:47:400:47:42

"what can we do to facilitate this broad direction?"

0:47:420:47:46

After that, maybe the...

0:47:460:47:47

I would come in, or one of my lawyers would come in and say,

0:47:470:47:51

"OK, this is what we may do."

0:47:510:47:54

OK? There are many things we CAN do but we are not ALLOWED to do them.

0:47:540:47:59

And then, after that, there's still a final level that we look at,

0:47:590:48:02

and that's, what should we do?

0:48:020:48:03

Because there are many things that would be technically possible

0:48:030:48:07

and technically legal, but a bad idea.

0:48:070:48:10

For Natanz, it was a CIA-led operation,

0:48:100:48:13

so we had to have agency sign-off.

0:48:130:48:16

Really?

0:48:160:48:18

Someone from the agency...

0:48:180:48:21

stood behind the operator and the analyst,

0:48:210:48:23

and gave the order to launch every attack.

0:48:230:48:26

Before they even started this attack,

0:48:330:48:35

they put inside of the code the kill date,

0:48:350:48:37

a date at which it would stop operating.

0:48:370:48:39

Cut-off dates, we don't normally see that in other threats,

0:48:390:48:42

and you have to think, "Well, why is there a cut-off date in there?"

0:48:420:48:46

When you realise that a section of it

0:48:460:48:47

was probably written by Government,

0:48:470:48:49

and that there are laws regarding

0:48:490:48:51

how you can use this sort of software,

0:48:510:48:53

that there may have been a legal team who said,

0:48:530:48:56

"No, you need to have a cut-off date in there,

0:48:560:48:58

"you can only do this and you can only go that far,

0:48:580:49:00

"and we need to check if this is legal or not."

0:49:000:49:02
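The cut-off logic the analysts infer can be sketched like this. It is a hedged illustration only: the kill date below is an arbitrary placeholder, not the real date embedded in the code.

```python
from datetime import date

# A sketch of a kill-date guard: before doing anything, the payload
# compares the host clock against a hard-coded cut-off and goes inert
# once that date has passed. KILL_DATE is a placeholder, not the real one.
KILL_DATE = date(2009, 1, 15)

def should_run(today: date) -> bool:
    return today < KILL_DATE

assert should_run(date(2008, 6, 1))       # before the cut-off: active
assert not should_run(date(2009, 2, 1))   # after the cut-off: inert
```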

That date is a few days before Obama's inauguration.

0:49:040:49:07

So, the theory is that

0:49:070:49:09

this was an operation that needed to be stopped at a certain time,

0:49:090:49:13

because there was going to be a handover

0:49:130:49:15

and that more approval was needed.

0:49:150:49:18

-Are you prepared to take the oath, Senator?

-I am.

0:49:210:49:24

I, Barack Hussein Obama...

0:49:240:49:26

-I, Barack...

-..do solemnly swear.

0:49:260:49:27

I, Barack Hussein Obama, do solemnly swear...

0:49:270:49:30

-SANGER:

-Olympic Games was reauthorised by President Obama

0:49:300:49:33

in his first year in office, 2009.

0:49:330:49:35

It was fascinating because it was the first year

0:49:390:49:42

of the Obama administration

0:49:420:49:43

and they would talk to you ENDLESSLY about cyber defence.

0:49:430:49:46

-OBAMA:

-We count on computer networks to deliver our oil and gas,

0:49:460:49:49

our power and our water.

0:49:490:49:51

We rely on them for public transportation

0:49:510:49:54

and air-traffic control.

0:49:540:49:56

But just as we failed in the past to invest

0:49:560:49:58

in our physical infrastructure, our roads,

0:49:580:50:02

our bridges and rails, we've failed to invest

0:50:020:50:04

in the security of our digital infrastructure.

0:50:040:50:07

But when you asked questions

0:50:070:50:09

about the use of offensive cyber weapons,

0:50:090:50:12

everything went dead.

0:50:120:50:14

No cooperation. White House wouldn't help. Pentagon wouldn't help.

0:50:140:50:17

NSA wouldn't help. Nobody would talk to you about it.

0:50:170:50:20

But when you dug into the budget for cyber spending

0:50:200:50:23

during the Obama administration,

0:50:230:50:26

what you discovered was

0:50:260:50:27

much of it was being spent on offensive cyber weapons.

0:50:270:50:31

You'd see phrases like "Title 10 CNO".

0:50:320:50:37

"Title 10" means "operations for the US Military",

0:50:370:50:40

and "CNO" means "computer network operations".

0:50:400:50:45

There is considerable evidence that Stuxnet was just the opening wedge

0:50:450:50:50

of what is a much broader US Government effort now

0:50:500:50:53

to develop an entire new class of weapons.

0:50:530:50:57

-CHIEN:

-Stuxnet wasn't just an evolution -

0:51:020:51:05

it was really a revolution in the threat landscape.

0:51:050:51:07

In the past, the vast majority of threats that we saw were always

0:51:090:51:12

controlled by an operator somewhere.

0:51:120:51:13

They wouldn't infect your machines,

0:51:130:51:15

but they would have what's called a "call-back"

0:51:150:51:17

or "command and control channel".

0:51:170:51:18

The threats would actually contact the operator and say,

0:51:180:51:21

"What do you want me to do next?" The operator would send commands

0:51:210:51:23

and say, maybe, "Search through this directory, find these folders,

0:51:230:51:26

"find these files, upload these files to me.

0:51:260:51:28

"Spread to this other machine."

0:51:280:51:29

Things of that nature.

0:51:290:51:30

But Stuxnet couldn't have a command and control channel,

0:51:300:51:34

because once it got inside of Natanz,

0:51:340:51:37

it would not have been able to reach back out to the attackers.

0:51:370:51:40

The Natanz network is completely air-gapped

0:51:400:51:42

from the rest of the internet. It's not connected to the internet.

0:51:420:51:45

It's its own isolated network.

0:51:450:51:46

Getting across an air gap is one of the more difficult challenges

0:51:460:51:49

that attackers will face,

0:51:490:51:50

just because of the fact that

0:51:500:51:52

everything is in place to prevent that.

0:51:520:51:53

You know, everything... You know, the policies and procedures

0:51:530:51:56

and the physical network that's in place is specifically designed

0:51:560:52:00

to prevent you crossing the air gap.

0:52:000:52:01

But there is no truly air-gapped network

0:52:010:52:03

in these real-world production environments.

0:52:030:52:06

People have got to get new code into Natanz.

0:52:060:52:08

People have to get log files off of this network in Natanz.

0:52:080:52:11

People have to upgrade equipment. People have to upgrade computers.

0:52:110:52:14

This highlights one of the major security issues

0:52:140:52:18

that we have in the field.

0:52:180:52:20

If you think, "Well, nobody can attack this power plant

0:52:200:52:23

"or this chemical plant because it's not connected to the internet,"

0:52:230:52:27

that's a bizarre illusion.

0:52:270:52:29

-DISTORTED FEMALE VOICE:

-And the first time we introduced the code

0:52:320:52:36

into Natanz, we used human assets.

0:52:360:52:39

Maybe CIA - more likely, Mossad - but...

0:52:390:52:43

our team was kept in the dark about the tradecraft.

0:52:430:52:45

We heard rumours in Moscow,

0:52:450:52:48

an Iranian laptop infected by a phoney Siemens technician

0:52:480:52:52

with a flash drive.

0:52:520:52:54

A double agent in Iran with access to Natanz.

0:52:550:52:58

But I don't really know.

0:52:580:53:00

What we had to focus on was to write the code

0:53:000:53:03

so that, once inside, the worm acted on its own.

0:53:030:53:07

They built in all the code and all the logic into the threat

0:53:070:53:10

to be able to operate all by itself.

0:53:100:53:12

It had the ability to spread by itself.

0:53:120:53:14

It had the ability to figure out, "Do I have the right PLCs?

0:53:140:53:17

"Have I arrived in Natanz?

0:53:170:53:19

"Am I at the target?"

0:53:190:53:20
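The autonomous target check can be sketched as a simple fingerprint test. This is an editorial illustration: the PLC model string and the six-by-164 layout follow public analyses of the code, but the function itself is invented. With no command-and-control channel reachable from inside an air-gapped plant, the payload has to decide locally whether it is on target.

```python
# A sketch of the "Am I at the target?" check: activate only if the local
# PLC model and cascade layout match the expected fingerprint.
EXPECTED_PLC = "S7-315"        # model cited in public analyses
EXPECTED_LAYOUT = [164] * 6    # six cascades of 164 centrifuges

def on_target(plc_model: str, layout: list) -> bool:
    return plc_model == EXPECTED_PLC and layout == EXPECTED_LAYOUT

assert on_target("S7-315", [164] * 6)       # Natanz-like: activate
assert not on_target("S7-315", [160] * 6)   # wrong plant: stay dormant
```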

-LANGNER:

-And when it's on target, it executes autonomously.

0:53:200:53:24

That also means you... you cannot call off the attack.

0:53:240:53:28

It was definitely the type of attack where someone had decided that this

0:53:280:53:32

is what they wanted to do. There was no turning back

0:53:320:53:35

once Stuxnet was released.

0:53:350:53:37

When it began to actually execute its payload,

0:53:420:53:44

you would have a whole bunch of centrifuges

0:53:440:53:46

in a huge array of cascades,

0:53:460:53:48

sitting in a big hall, and then, just off that hall,

0:53:480:53:51

you would have an operators' room, the control panels in front of them,

0:53:510:53:53

a big window where they could see into the hall.

0:53:530:53:56

Computers monitor the activities of all these centrifuges.

0:53:560:54:01

So, a centrifuge,

0:54:010:54:02

it's driven by an electrical motor,

0:54:020:54:05

and the speed of this electrical motor

0:54:050:54:08

is controlled by another PLC,

0:54:080:54:11

by another programmable logic controller.

0:54:110:54:13

Stuxnet would wait for 13 days before doing anything.

0:54:150:54:19

These 13 days is about the time it takes to actually fill

0:54:190:54:23

an entire cascade of centrifuges with uranium.

0:54:230:54:27

They didn't want to attack when the centrifuges were empty

0:54:270:54:29

or at the beginning of the enrichment process.

0:54:290:54:31

What Stuxnet did

0:54:310:54:34

was it actually would sit there during the 13 days

0:54:340:54:36

and basically record all of the normal activities

0:54:360:54:39

that were happening, and save it.

0:54:390:54:41

And once they saw them spinning for 13 days, then the attack occurred.

0:54:410:54:45

Centrifuges spin at incredible speeds, at about 1,000 hertz.

0:54:460:54:50

They have a safe operating speed -

0:54:500:54:53

63,000 revolutions per minute.

0:54:530:54:56

Stuxnet caused the uranium enrichment centrifuges

0:54:560:54:58

to spin up to 1,400 hertz.

0:54:580:55:00

Up to 80,000 revolutions per minute.

0:55:000:55:02
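The relationship between the two sets of figures is a simple conversion: a rotor spinning at f hertz completes f revolutions per second, so rpm = 60 × f. The rpm figures quoted in the film are rounded; the exact products for the frequencies mentioned are:

```python
# Hertz to revolutions per minute: one hertz is one revolution per second.
def hz_to_rpm(hz: float) -> float:
    return 60.0 * hz

assert hz_to_rpm(1000) == 60_000   # "about 1,000 hertz" operating speed
assert hz_to_rpm(1400) == 84_000   # the over-speed attack frequency
```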

What would happen was those centrifuges would go through

0:55:070:55:09

what's called a "resonance frequency".

0:55:090:55:11

It would go through a frequency

0:55:110:55:12

at which the metal would basically vibrate uncontrollably,

0:55:120:55:15

and essentially shatter. There'd be uranium gas everywhere.

0:55:150:55:19

And then the second attack they attempted

0:55:190:55:22

was they actually tried to lower it to two hertz.

0:55:220:55:24

They were slowed down...

0:55:240:55:26

to almost standstill.

0:55:260:55:27

And at two hertz, an opposite effect occurs.

0:55:270:55:31

You can imagine a toy top that you spin,

0:55:310:55:33

and as the top begins to slow down, it begins to wobble.

0:55:330:55:35

That's what happened to these centrifuges -

0:55:350:55:37

they would begin to wobble and essentially shatter and fall apart.

0:55:370:55:40

And instead of sending back to the computer what was really happening,

0:55:440:55:47

it would send back that old data that it had recorded.

0:55:470:55:50

So, the computer's sitting there thinking,

0:55:500:55:52

"Yup, running at 1,000 hertz, everything's fine.

0:55:520:55:54

"Running at 1,000 hertz, everything's fine."

0:55:540:55:56
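The replay trick described here can be sketched in miniature. This is an illustration only, not the real code: during the quiet recording period the malware saves normal sensor frames, and once the attack begins the operators' console is fed the looped recording instead of live data.

```python
from collections import deque

# Frames (in Hz) captured during the 13-day recording phase.
recorded = deque([1000, 1001, 999, 1000])

def console_reading(live_hz: int, attacking: bool) -> int:
    if not attacking:
        recorded.append(live_hz)   # record phase: save and pass through
        return live_hz
    frame = recorded.popleft()     # attack phase: replay an old frame
    recorded.append(frame)         # loop the recording
    return frame

# Rotors really at 1,400 Hz, but the console still shows ~1,000 Hz.
assert console_reading(1400, attacking=True) in (999, 1000, 1001)
```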

But those centrifuges are spinning up wildly.

0:55:560:55:58

A huge noise would occur. It'd be like, you know, a jet engine.

0:55:580:56:02

JETS POWERING UP

0:56:020:56:05

The operators would know, "Whoa, something is going wrong here."

0:56:050:56:08

They might look at their monitors and say, "It says it's 1,000 hertz."

0:56:080:56:11

But they would hear that, in the room,

0:56:110:56:13

something gravely bad was happening.

0:56:130:56:14

Not only are the operators fooled into thinking everything's normal,

0:56:140:56:19

but also any kind of automated protective logic is fooled.

0:56:190:56:25

You can't just turn these centrifuges off.

0:56:250:56:28

They have to be brought down in a very controlled manner.

0:56:280:56:31

And so they would hit, literally,

0:56:310:56:32

the big red button to initiate a graceful shutdown.

0:56:320:56:35

And Stuxnet intercepts that code, so you would have these operators

0:56:350:56:38

slamming on that button over and over again,

0:56:380:56:40

and nothing would happen.

0:56:400:56:42

-YADLIN:

-If your cyber weapon is good enough,

0:56:430:56:46

if your enemy is not aware of it,

0:56:460:56:49

it is an ideal weapon,

0:56:490:56:52

because the enemy don't understand what is happening to them.

0:56:520:56:54

Maybe, even better, the enemy begins to doubt their own capability?

0:56:540:56:58

Absolutely.

0:56:580:56:59

Certainly,

0:56:590:57:01

one must conclude that what happened at Natanz

0:57:010:57:04

must have driven the engineers crazy.

0:57:040:57:07

Because the worst thing that can happen to a maintenance engineer

0:57:070:57:11

is not being able to figure out

0:57:110:57:13

what the cause of the specific trouble is,

0:57:130:57:16

so they must have been analysing themselves to death.

0:57:160:57:19

-SANGER:

-Through 2009, it was going pretty smoothly.

0:57:240:57:27

Centrifuges were blowing up. The International Atomic Energy Agency

0:57:270:57:30

inspectors would go into Natanz and they would see

0:57:300:57:33

that whole sections of the centrifuges had been removed.

0:57:330:57:36

The United States knew from its intelligence channels

0:57:380:57:41

that some Iranian scientists and engineers were being fired,

0:57:410:57:45

because the centrifuges were blowing up,

0:57:450:57:47

and the Iranians had assumed that this was because

0:57:470:57:50

they would have been making errors,

0:57:500:57:52

there were manufacturing mistakes,

0:57:520:57:53

clearly this was somebody's fault.

0:57:530:57:56

So, the program was doing exactly what it was supposed to be doing,

0:57:560:58:00

which was, it was blowing up centrifuges

0:58:000:58:03

and it was leaving no trace,

0:58:030:58:06

and leaving the Iranians to wonder what they got hit by.

0:58:060:58:10

This was the brilliance of Olympic Games.

0:58:100:58:12

You know, as a former director

0:58:120:58:14

of a couple of big three-letter agencies,

0:58:140:58:16

slowing down 1,000 centrifuges in Natanz?

0:58:160:58:19

An unalloyed good.

0:58:190:58:20

There was a need for, for buying time.

0:58:200:58:23

There was a need for slowing them down.

0:58:230:58:25

There was a need to try and push them to the negotiating table.

0:58:250:58:28

I mean, there were a lot of variables at play here.

0:58:280:58:31

-SANGER:

-President Obama would go down into the Situation Room

0:58:350:58:39

and he would have laid out in front of him

0:58:390:58:41

what they call the horse blanket,

0:58:410:58:43

which was a giant schematic

0:58:430:58:46

of the Natanz nuclear enrichment plant.

0:58:460:58:49

And the designers of Olympic Games

0:58:490:58:52

would describe to him what kind of progress they made,

0:58:520:58:55

and look to him for the authorisation

0:58:550:58:57

to move on ahead to the next attack.

0:58:570:59:00

And at one point during those discussions,

0:59:010:59:04

he said to a number of his aides,

0:59:040:59:05

"You know, I have some concerns,

0:59:050:59:07

"because once word of this gets out..."

0:59:070:59:09

And he knew it would get out.

0:59:090:59:11

"..the Chinese may use it as an excuse for their attacks on us,

0:59:110:59:14

"the Russians might, or others."

0:59:140:59:17

So, he clearly had some misgivings,

0:59:170:59:19

but they weren't big enough to stop him

0:59:190:59:21

from going ahead with the programme.

0:59:210:59:23

And then, in 2010,

0:59:240:59:27

a decision was made to change the code.

0:59:270:59:30

Our human assets weren't always able to get code updates into Natanz,

0:59:360:59:41

and we weren't told exactly why, but...

0:59:410:59:45

we were told we had to have a cyber solution

0:59:450:59:48

for delivering the code.

0:59:480:59:50

But the delivery systems were tricky.

0:59:500:59:52

If they weren't aggressive enough, they wouldn't get in.

0:59:520:59:55

If they were too aggressive,

0:59:550:59:57

it could spread and be discovered.

0:59:570:59:59

-CHIEN:

-When we got the first sample,

1:00:011:00:03

there was some configuration information inside of it,

1:00:031:00:05

and one of the pieces in there was a version number, 1.1.

1:00:051:00:09

And that made us realise,

1:00:091:00:10

"Well, look, this likely isn't the only copy."

1:00:101:00:12

We went back to our databases,

1:00:121:00:14

looking for anything that looked similar to Stuxnet.

1:00:141:00:17

As we began to collect more samples,

1:00:191:00:21

we found a few earlier versions of Stuxnet.

1:00:211:00:23

And when we analysed that code,

1:00:231:00:24

we saw that versions previous to 1.1

1:00:241:00:27

were a lot less aggressive.

1:00:271:00:29

The earlier version of Stuxnet,

1:00:291:00:31

it, basically, required humans to do a little bit of double-clicking

1:00:311:00:34

in order for it to spread from one computer to another.

1:00:341:00:37

And so, what we believe, after looking at that code, is two things.

1:00:371:00:40

One, either they didn't get into Natanz with that earlier version

1:00:401:00:44

because it simply wasn't aggressive enough,

1:00:441:00:46

wasn't able to jump over that air gap.

1:00:461:00:48

And/or two, that payload, as well, didn't work properly.

1:00:481:00:52

It didn't work to their satisfaction.

1:00:521:00:54

Maybe it was not explosive enough.

1:00:541:00:57

There were slightly different versions

1:00:571:00:59

which were aimed at different parts of the centrifuge cascade.

1:00:591:01:02

But the guys at Symantec figured you changed the code because

1:01:021:01:06

the first variations couldn't get in and didn't work right.

1:01:061:01:08

Bullshit. We always found a way to get across the air gap.

1:01:081:01:13

At TAO, we laughed when people

1:01:131:01:14

thought they were protected by an air gap.

1:01:141:01:17

And for OG, the early versions of the payload did work.

1:01:171:01:20

But what NSA did...

1:01:201:01:22

..was always low-key

1:01:231:01:25

and subtle.

1:01:251:01:27

The problem was that Unit 8200, the Israelis,

1:01:271:01:30

kept pushing us to be more aggressive.

1:01:301:01:33

The later version of Stuxnet, 1.1 -

1:01:341:01:36

that version had multiple ways of spreading.

1:01:361:01:38

It had the four zero-days inside of it, for example,

1:01:381:01:41

that allowed it to spread all by itself, without you doing anything.

1:01:411:01:43

It could spread via network shares. It could spread via USB keys.

1:01:431:01:47

It was able to spread via network exploits.

1:01:471:01:49

That's the sample that introduces the stolen digital certificates.

1:01:491:01:53

That is the sample that, all of a sudden,

1:01:531:01:55

became so noisy

1:01:551:01:57

and caught the attention of the antivirus guys.

1:01:571:02:00

In the first sample, we don't find that.

1:02:001:02:03

And this is very strange

1:02:051:02:07

because it tells us that,

1:02:071:02:10

in the process of this development,

1:02:101:02:13

the attackers were less concerned with operational security.

1:02:131:02:17

Stuxnet actually kept a log inside of itself

1:02:231:02:25

of all the machines that had been infected along the way,

1:02:251:02:28

as it jumped from one machine to another to another to another.

1:02:281:02:32

And we were able to gather up

1:02:321:02:33

all of the samples that we could acquire,

1:02:331:02:35

tens of thousands of samples, and we extracted all of those logs.

1:02:351:02:38

We can see the exact path that Stuxnet took.

1:02:381:02:42
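The tracing technique described above reduces to a simple idea: if every sample carries a log of the machines it passed through, merging many logs lets you walk each chain back to its first hop. A minimal sketch, with invented data and hypothetical names:

```python
# Hypothetical sketch of tracing infections back to "ground zero":
# each log is the ordered list of machines one sample hopped through,
# so the first entry of each chain is an original infection point.

sample_logs = [
    ["contractorA", "laptop1", "natanz-pc7"],
    ["contractorA", "laptop2"],
    ["contractorB", "eng-ws3", "natanz-pc2"],
]

def ground_zero(logs):
    """Collect the distinct first machines across all chains."""
    return sorted({log[0] for log in logs})

print(ground_zero(sample_logs))  # ['contractorA', 'contractorB']
```

With tens of thousands of real samples, the same merge-and-walk-back step is what let analysts identify the first five infected organisations.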

Eventually we were able to trace back this version of Stuxnet

1:02:431:02:46

to ground zero - to the first five infections in the world.

1:02:461:02:50

The first five infections were all outside of Natanz plant,

1:02:501:02:54

all inside of organisations inside of Iran.

1:02:541:02:57

All organisations that are involved in industrial control systems,

1:02:571:03:00

and construction of industrial control facilities.

1:03:001:03:03

Clearly contractors who were working on the Natanz facility,

1:03:031:03:07

and the attackers knew that.

1:03:071:03:08

They're electrical companies. They're piping companies.

1:03:081:03:11

They're, you know, these sorts of companies.

1:03:111:03:13

And they knew that technicians from those companies would visit Natanz.

1:03:131:03:17

So, they would infect these companies and then technicians

1:03:171:03:20

would take their computer or their laptop or their USB...

1:03:201:03:23

That operator then goes down to Natanz and he plugs in his USB key

1:03:231:03:26

which has some code that he needs to update into Natanz,

1:03:261:03:28

into the Natanz network, and now Stuxnet is able

1:03:281:03:30

to get inside Natanz and conduct its attack.

1:03:301:03:33

These five companies were specifically targeted

1:03:341:03:36

to spread Stuxnet into Natanz,

1:03:361:03:38

and it wasn't that Stuxnet escaped out of Natanz

1:03:381:03:41

and then spread all over the world, and it was this big mistake and,

1:03:411:03:44

"Oh, it wasn't meant to spread that far but it really did."

1:03:441:03:47

No, that's not the way we see it. The way we see it is that

1:03:471:03:49

they wanted it to spread far so that they could get it into Natanz.

1:03:491:03:53

Someone decided that we're going to create something new,

1:03:531:03:57

something evolved, that's going to be far, far, far more aggressive.

1:03:571:04:02

And we're OK, frankly,

1:04:021:04:04

with it spreading all over the world to innocent machines,

1:04:041:04:07

in order to go after our target.

1:04:071:04:09

The Mossad had the role,

1:04:141:04:17

had the assignment,

1:04:171:04:20

to deliver the virus,

1:04:201:04:23

to make sure that Stuxnet

1:04:231:04:26

would be put in place in Natanz to affect the centrifuges.

1:04:261:04:31

Meir Dagan, the head of Mossad,

1:04:321:04:34

was under growing pressure from the Prime Minister, Benjamin Netanyahu,

1:04:341:04:39

to produce results.

1:04:391:04:41

Inside the ROC, we were furious.

1:04:421:04:45

The Israelis took our code for the delivery system and changed it.

1:04:471:04:51

Then, on their own, without our agreement,

1:04:521:04:55

they just fucking launched it.

1:04:551:04:57

2010, around the same time they started killing Iranian scientists.

1:04:571:05:01

And they fucked up the code.

1:05:011:05:03

Instead of hiding, the code started shutting down computers.

1:05:031:05:07

So, naturally, people noticed.

1:05:071:05:09

Because they were in a hurry, they opened Pandora's Box,

1:05:111:05:14

they let it out, and it spread...

1:05:141:05:17

all over the world.

1:05:171:05:19

The worm spread quickly,

1:05:241:05:25

but somehow it remained unseen until it was identified in Belarus.

1:05:251:05:30

Soon after, Israeli intelligence confirmed

1:05:301:05:32

that it had made its way into the hands

1:05:321:05:34

of the Russian Federal Security Service, the successor to the KGB.

1:05:341:05:38

And so it happened that the formula for a secret cyber weapon

1:05:401:05:43

designed by the US and Israel

1:05:431:05:44

fell into the hands of Russia

1:05:441:05:46

and the very country it was meant to attack.

1:05:461:05:49

ANGRY CHANTING

1:06:091:06:10

-KIYAEI:

-In international law,

1:06:101:06:12

when some country, or a coalition of countries,

1:06:121:06:15

targets a nuclear facility,

1:06:151:06:18

it's an act of war.

1:06:181:06:20

Please, let's be frank here.

1:06:201:06:24

If it wasn't Iran,

1:06:241:06:27

let's say a nuclear facility in the United States

1:06:271:06:30

was targeted in the same way...

1:06:301:06:33

..the American Government would not sit by and let this go.

1:06:341:06:40

Stuxnet is an attack in peacetime on critical infrastructure.

1:06:401:06:44

Yes, it is. Look, when I read about it,

1:06:441:06:47

all right, I go,

1:06:471:06:48

"Whoa, this is a big deal!" Yeah.

1:06:481:06:51

-SANGER:

-The people who were running this program,

1:06:521:06:55

including Leon Panetta, the director of the CIA at the time,

1:06:551:06:59

had to go down into the Situation Room and face President Obama

1:06:591:07:03

and Vice President Biden

1:07:031:07:05

and explain that this program was suddenly on the loose.

1:07:051:07:10

Vice President Biden at one point during this discussion, sort of,

1:07:111:07:16

exploded in Biden-esque fashion

1:07:161:07:18

and blamed the Israelis. He said, "It must have been the Israelis

1:07:181:07:22

"who made a change in the code that enabled it to get out."

1:07:221:07:26

President Obama said to the senior leadership,

1:07:281:07:30

"You told me it wouldn't get out of the network. It did.

1:07:301:07:32

"You told me Iranians would never figure out

1:07:321:07:34

"it was the United States. They did.

1:07:341:07:37

"You told me it would have a huge effect on their nuclear programme,

1:07:371:07:41

"and it didn't."

1:07:411:07:43

The Natanz plant is inspected every couple of weeks

1:07:441:07:47

by the International Atomic Energy Agency inspectors,

1:07:471:07:51

and if you line up what you know about the attacks

1:07:511:07:53

with the inspection reports, you can see the effects.

1:07:531:07:57

-HEINONEN:

-If you go to the IAEA reports,

1:07:581:08:00

we really saw that a lot of centrifuges were switched off,

1:08:001:08:03

and they were removed.

1:08:031:08:06

As much as almost a couple of thousand got compromised.

1:08:061:08:09

When you put this all together,

1:08:091:08:11

I wouldn't be surprised if their programme

1:08:111:08:13

got delayed by the one year.

1:08:131:08:15

But go, then, to year 2012-13, and look, you know,

1:08:151:08:19

how the centrifuges started to come up again.

1:08:191:08:22

-KIYAEI:

-So, ironically, cyber warfare,

1:08:251:08:28

assassination of its nuclear scientists,

1:08:281:08:32

economic sanctions,

1:08:321:08:33

political isolation...

1:08:331:08:35

Iran has gone through, A to Z, every coercive policy that the US,

1:08:361:08:42

Israel and those who ally with them

1:08:421:08:46

have placed on Iran,

1:08:461:08:48

and they have actually made Iran's nuclear programme

1:08:481:08:51

more advanced today than it was ever before.

1:08:511:08:54

CHANTING IN ARABIC

1:08:541:08:56

-DISTORTED MALE VOICE:

-This is a very, very dangerous minefield

1:08:571:09:01

that we are walking, and the nations who decide

1:09:011:09:05

to take these covert actions should be

1:09:051:09:09

taking into consideration all the effects,

1:09:091:09:14

including the moral effects.

1:09:141:09:17

I would say that this is the price that we have to pay in this...

1:09:171:09:23

in this world, and our blade of righteousness shouldn't be so sharp.

1:09:231:09:29

In Israel and in the United States,

1:09:341:09:37

the blade of righteousness cut both ways,

1:09:371:09:39

wounding the targets and the attackers.

1:09:391:09:42

Once Stuxnet infected American computers,

1:09:421:09:45

the Department of Homeland Security,

1:09:451:09:47

unaware of the cyber weapons launched by the NSA,

1:09:471:09:50

devoted enormous resources trying to protect Americans

1:09:501:09:53

from their own government.

1:09:531:09:55

We had met the enemy and it was us.

1:09:551:09:58

Yep, absolutely.

1:10:091:10:11

We'll be more than happy to discuss that.

1:10:111:10:13

Early July of 2010, I received a call

1:10:131:10:16

that said that this piece of malware was discovered,

1:10:161:10:19

and could we take a look at it?

1:10:191:10:22

When we first started the analysis, there was that, "Oh, crap" moment.

1:10:221:10:25

You know, where we sat there and said, "This is something

1:10:251:10:27

"that's significant. It's impacting industrial control.

1:10:271:10:30

"It can disrupt it to the point where it could cause harm,

1:10:301:10:32

"and not only damage to the equipment,

1:10:321:10:35

"but potentially harm or loss of life."

1:10:351:10:36

We were very concerned,

1:10:361:10:38

because Stuxnet was something that we had not seen before,

1:10:381:10:41

so there wasn't a lot of sleep at night.

1:10:411:10:43

Basically, light up the phones, call everybody we know,

1:10:431:10:46

inform the Secretary, inform the White House

1:10:461:10:48

inform the other departments and agencies,

1:10:481:10:51

wake up the world and figure out what's going on

1:10:511:10:54

with this particular malware.

1:10:541:10:55

Did anybody ever give you an indication

1:10:551:10:58

that it was something that they already knew about?

1:10:581:11:01

No, at no time did I get the impression from someone that,

1:11:011:11:04

"That's OK," you know,

1:11:041:11:05

get a little pat on the head and scooted out the door.

1:11:051:11:07

I never received a stand down order.

1:11:071:11:09

I never... No-one ever asked, "Stop looking at this."

1:11:091:11:13

Sean McGurk, the Director of Cyber

1:11:131:11:14

for the Department of Homeland Security,

1:11:141:11:16

testified before the Senate

1:11:161:11:18

about how he thought Stuxnet was a terrifying threat

1:11:181:11:21

-to the United States. Is that not a problem?

-No, no...

1:11:211:11:24

How do you mean? That, that, that the Stuxnet thing was a bad idea?

1:11:241:11:28

No, no, just that before he knew what it was and what it attacks...

1:11:281:11:32

Oh, I get it. That, that...

1:11:321:11:33

Yeah, that he was responding to something that...

1:11:331:11:36

He thought was a threat to critical infrastructure in the United States.

1:11:361:11:39

Yeah. "The worm is loose!"

1:11:391:11:40

The worm is loose, I understand.

1:11:401:11:42

But there's a...

1:11:421:11:44

There is a further theory having to do with whether or not,

1:11:441:11:47

-following up on David Sanger's...

-I got the subplot. And who did that?

1:11:471:11:50

Was it the Israelis? And, yeah, I...

1:11:501:11:53

I truly don't know and, even though I don't know,

1:11:531:11:56

I still can't talk about it. All right?

1:11:561:11:58

Stuxnet was somebody's covert action, all right?

1:11:581:12:01

And the definition of covert action

1:12:011:12:03

is an activity in which

1:12:031:12:04

you want to have the hand of the actor forever hidden.

1:12:041:12:08

So, by definition, it's going to end up

1:12:081:12:10

in this "we don't talk about these things" box.

1:12:101:12:13

-SANGER:

-To this day, the United States Government

1:12:181:12:21

has never acknowledged

1:12:211:12:23

conducting any offensive cyber attack anywhere in the world.

1:12:231:12:28

But, thanks to Mr Snowden, we know that, in 2012,

1:12:301:12:34

President Obama issued an Executive Order

1:12:341:12:37

that laid out some of the conditions

1:12:371:12:39

under which cyber weapons can be used,

1:12:391:12:42

and, interestingly, every use of a cyber weapon

1:12:421:12:45

requires presidential sign-off.

1:12:451:12:48

That is only true, in the physical world, for nuclear weapons.

1:12:491:12:55

-CLARKE:

-Nuclear war and nuclear weapons are vastly different

1:13:051:13:07

from cyber war and cyber weapons.

1:13:071:13:10

Having said that, there are some similarities.

1:13:101:13:13

And in the early 1960s, the United States Government

1:13:131:13:15

suddenly realised it had thousands of nuclear weapons,

1:13:151:13:19

big ones and little ones, weapons on Jeeps,

1:13:191:13:21

weapons on submarines, and it really didn't have a doctrine.

1:13:211:13:25

It really didn't have a strategy.

1:13:251:13:27

It really didn't have an understanding, at the policy level,

1:13:271:13:30

about how it was going to use all of these things.

1:13:301:13:33

And so academics started publishing unclassified documents

1:13:331:13:38

about nuclear war

1:13:381:13:40

and nuclear weapons.

1:13:401:13:42

And the result was more than 20 years in the United States

1:13:441:13:48

of very vigorous national debates

1:13:481:13:51

about how we want to go use nuclear weapons.

1:13:511:13:55

And not only did that cause the Congress,

1:13:571:13:59

and people in the executive branch in Washington,

1:13:591:14:02

to think about these things,

1:14:021:14:04

it caused the Russians to think about these things.

1:14:041:14:07

And out of that grew nuclear doctrine -

1:14:071:14:10

mutual assured destruction,

1:14:101:14:13

all of that complicated set of nuclear dynamics.

1:14:131:14:18

Today, on this vital issue, at least,

1:14:181:14:20

we have seen what can be accomplished when we pull together.

1:14:201:14:24

We can't have a discussion, not in a sensible way right now,

1:14:241:14:28

about cyber war and cyber weapons, because everything is secret.

1:14:281:14:33

And when you get into a discussion

1:14:331:14:35

with people in the government, people still in the government,

1:14:351:14:38

people who have security clearances, you run into a brick wall.

1:14:381:14:42

Trying to stop Iran is really my number-one job, and I think...

1:14:421:14:45

Wait, can I ask you, in that context,

1:14:451:14:48

about the Stuxnet computer virus, potentially?

1:14:481:14:50

You can ask but I won't comment.

1:14:501:14:52

-Can you tell us anything?

-No.

1:14:521:14:54

Look, for the longest time, I was in fear

1:14:541:14:56

that I couldn't actually say the phrase "computer network attack".

1:14:561:15:00

This stuff is hideously over-classified,

1:15:001:15:02

and it gets in the way of a... of a mature, public discussion

1:15:021:15:07

as to what it is we, as a democracy,

1:15:071:15:09

want our nation to be doing up here in the cyber domain.

1:15:091:15:13

Now, this is a former director of NSA and CIA

1:15:131:15:16

saying this stuff is over-classified.

1:15:161:15:18

One of the reasons this is as highly classified as it is,

1:15:181:15:21

this is a peculiar weapons system.

1:15:211:15:23

This is the weapons system that's come out of the espionage community,

1:15:231:15:26

and so those people have a HABIT of secrecy.

1:15:261:15:29

While most government officials refuse to acknowledge the operation,

1:15:291:15:33

at least one key insider did leak parts of the story to the press.

1:15:331:15:38

In 2012, David Sanger wrote a detailed account of Olympic Games

1:15:381:15:42

that unmasked the extensive joint operation

1:15:421:15:44

between the US and Israel

1:15:441:15:46

to launch cyber attacks on Natanz.

1:15:461:15:49

The publication of this story,

1:15:491:15:51

coming at a time that there were a number of other unrelated

1:15:511:15:54

national security stories being published, led to the announcement

1:15:541:15:58

of investigations by the Attorney General.

1:15:581:16:01

Into the...? Into the press and into the leaks?

1:16:011:16:03

Into the press and into the leaks.

1:16:031:16:07

When Stuxnet hit the media, they polygraphed everyone in our office,

1:16:071:16:10

including people who didn't know shit.

1:16:101:16:12

You know, they poly'd the interns, for God's sake.

1:16:121:16:14

These are criminal acts when they release information like this,

1:16:141:16:18

and we will conduct thorough investigations,

1:16:181:16:21

as we have in the past.

1:16:211:16:24

The administration never filed charges,

1:16:251:16:28

possibly afraid that a prosecution

1:16:281:16:29

would reveal classified details about Stuxnet.

1:16:291:16:33

To this day, no-one in the US or Israeli Governments

1:16:331:16:36

has officially acknowledged the existence of the joint operation.

1:16:361:16:40

I would never compromise ongoing operations in the field,

1:16:421:16:45

but we should be able to talk about capability.

1:16:451:16:49

We can talk about our...

1:16:501:16:53

bunker busters - why not our cyber weapons?

1:16:531:16:56

The secrecy of the operation has been blown.

1:16:561:16:58

Our friends in Israel took a weapon

1:17:001:17:01

that we jointly developed -

1:17:011:17:03

in part to keep Israel from doing something crazy -

1:17:031:17:05

and then used it on their own in a way that blew the cover

1:17:051:17:08

of the operation and could have led to war,

1:17:081:17:09

and we can't talk about that?

1:17:091:17:11

There is a way to talk about Stuxnet.

1:17:151:17:18

It happened. That...

1:17:181:17:20

To deny that it happened is foolish,

1:17:201:17:23

so the fact it happened is really what we're talking about here.

1:17:231:17:26

What are the implications of the fact

1:17:261:17:27

that we now are in a post-Stuxnet world?

1:17:271:17:30

What I said to David Sanger was,

1:17:301:17:32

I understand the difference in destruction is dramatic,

1:17:321:17:35

but this has the whiff of August 1945.

1:17:351:17:38

Somebody just used a new weapon,

1:17:381:17:41

and this weapon will not be put back into the box.

1:17:411:17:43

I know no operational details,

1:17:431:17:46

and don't know what anyone did or didn't do

1:17:461:17:48

before someone decided to use the weapon, all right?

1:17:481:17:52

I do know this - if we go out and do something,

1:17:521:17:55

most of the rest of the world now thinks that's the new standard

1:17:551:17:59

and it's something that they now feel legitimated to do, as well.

1:17:591:18:02

But the rules of engagement,

1:18:021:18:04

international norms, treaty standards,

1:18:041:18:07

they don't exist right now.

1:18:071:18:09

-SANGER:

-For nuclear, we have these extensive inspection regimes.

1:18:121:18:15

The Russians come and look at our silos.

1:18:151:18:17

We go and look at their silos.

1:18:171:18:19

Bad as things get between the two countries,

1:18:191:18:21

those inspection regimes have held up.

1:18:211:18:24

But working that out for...for cyber would be virtually impossible.

1:18:241:18:28

Where do you send your inspector?

1:18:281:18:30

Inside the laptop of, you know...

1:18:301:18:32

How many laptops are there in the United States and Russia?

1:18:321:18:35

It's much more difficult in the cyber area

1:18:351:18:37

to construct an international regime

1:18:371:18:39

based on treaty commitments and rules of the road and so forth.

1:18:391:18:43

Although we've tried to have discussions

1:18:431:18:45

with the Chinese and Russians and so forth about that,

1:18:451:18:48

but it's very difficult.

1:18:481:18:50

-BROWN:

-Right now, the norm in cyberspace is...

1:18:501:18:54

do whatever you can get away with.

1:18:541:18:56

That's not a good norm, but it's the norm that we have.

1:18:561:18:59

That's the norm that is preferred by states

1:18:591:19:01

that are engaging in lots of different kinds of activities

1:19:011:19:04

that they feel are benefiting their national security.

1:19:041:19:06

-YADLIN:

-Those who excel in cyber

1:19:061:19:09

are trying to slow down the process of creating regulation.

1:19:091:19:14

Those who are victims

1:19:141:19:16

would like the regulation to be in the open as soon as possible.

1:19:161:19:21

International law in this area is written by custom,

1:19:231:19:26

and customary law requires a nation to say,

1:19:261:19:29

"This is what we did, this is why we did it."

1:19:291:19:31

And the US doesn't want to push the law in that direction,

1:19:311:19:34

and so it chooses not to disclose its involvement.

1:19:341:19:37

And one of the reasons that I thought it was important

1:19:371:19:40

to tell the story of Olympic Games

1:19:401:19:42

was not simply because it's a cool spy story - it is -

1:19:421:19:45

but it's because, as a nation,

1:19:451:19:49

we need to have a debate about how we want to use cyber weapons,

1:19:491:19:53

because we are the most vulnerable nation on Earth

1:19:531:19:56

to cyber attack ourselves.

1:19:561:19:58

Let's say you took over the control system of a railway -

1:20:001:20:03

you could switch tracks.

1:20:031:20:05

You could cause derailments of trains carrying explosive materials.

1:20:051:20:10

What if you were in the control system of gas pipelines

1:20:101:20:13

and when a valve was supposed to be open, it was closed,

1:20:131:20:17

and the pressure built up and the pipeline exploded?

1:20:171:20:21

There are companies that run electric power generation

1:20:211:20:25

or electric power distribution -

1:20:251:20:27

that we know have been hacked by foreign entities -

1:20:271:20:30

that have the ability to shut down the power grid.

1:20:301:20:33

-NEWS REPORT:

-'According to the officials,

1:20:351:20:37

'Iran is the first country ever in the Middle East

1:20:371:20:40

'to be engaged in a cyber war with the United States and Israel.

1:20:401:20:44

'If anything, they said the recent cyber attacks

1:20:441:20:47

'were what encouraged them to plan to set up the Cyber Army,

1:20:471:20:51

'which will gather computer scientists,

1:20:511:20:53

'programmers, software engineers...'

1:20:531:20:56

-KIYAEI:

-If you are a youth and you see

1:20:561:20:58

assassination of a nuclear scientist,

1:20:581:21:00

and your nuclear facilities are getting attacked,

1:21:001:21:03

wouldn't you join your national Cyber Army?

1:21:031:21:07

Well, many did, and that's why, today,

1:21:071:21:11

Iran has one of the largest cyber armies in the world.

1:21:111:21:16

So, whoever initiated this,

1:21:161:21:18

and was very proud of themselves to see that little dip

1:21:181:21:21

in Iran's centrifuge numbers,

1:21:211:21:24

should look back now

1:21:241:21:26

and acknowledge that it was a major mistake.

1:21:261:21:29

Very quickly, Iran sent a message to the United States,

1:21:291:21:34

a very sophisticated message,

1:21:341:21:36

and they did that with two attacks.

1:21:361:21:39

First, they attacked Saudi Aramco,

1:21:391:21:42

the biggest oil company in the world,

1:21:421:21:45

and wiped out every piece of software, every line of code,

1:21:451:21:49

on 30,000 computer devices.

1:21:491:21:53

Then Iran did a surge attack on the American banks.

1:21:531:21:59

The most extensive attack on American banks ever,

1:21:591:22:01

launched from the Middle East, happening right now.

1:22:011:22:04

When Iran hit our banks,

1:22:061:22:08

we could've shut down their botnet,

1:22:081:22:10

but the State Department got nervous,

1:22:101:22:12

because the servers weren't actually in Iran,

1:22:121:22:15

so until there was a diplomatic solution,

1:22:151:22:18

Obama let the private sector deal with the problem.

1:22:181:22:21

I imagine that in the White House Situation Room,

1:22:211:22:24

people sat around and said...

1:22:241:22:27

Let me be clear, I don't imagine I know.

1:22:271:22:30

People sat around in the White House Situation Room and said,

1:22:301:22:34

"The Iranians have sent us a message, which is essentially -

1:22:341:22:37

"stop attacking us in cyberspace the way you did at Natanz with Stuxnet.

1:22:371:22:43

"We can do it, too."

1:22:431:22:44

There are unintended consequences of the Stuxnet attack.

1:22:461:22:50

You wanted to cause confusion and damage to the other side,

1:22:501:22:54

but then the other side can do the same to you.

1:22:541:22:57

The monster turned against its creator,

1:22:571:23:00

and now everyone is in this game.

1:23:001:23:03

They did a good job in showing the world, including the bad guys,

1:23:031:23:08

what you would need to do in order to cause serious trouble

1:23:081:23:11

that could lead to injuries and death.

1:23:111:23:14

I mean, you've been focusing on Stuxnet,

1:23:141:23:16

but that was just a small part

1:23:161:23:18

of the much larger Iranian mission.

1:23:181:23:20

There was a larger Iranian mission?

1:23:201:23:22

Nitro Zeus,

1:23:251:23:27

NZ.

1:23:271:23:29

We spent hundreds of millions - maybe billions - on it.

1:23:301:23:34

In the event the Israelis did attack Iran,

1:23:371:23:40

we assumed we would be drawn into the conflict.

1:23:401:23:43

We built in attacks on Iran's command and control system

1:23:441:23:47

so the Iranians couldn't talk to each other in a fight.

1:23:471:23:49

We infiltrated their IADS, military air defence systems,

1:23:491:23:53

so they couldn't shoot down our planes if we flew over.

1:23:531:23:56

We also went after their civilian support systems, power grids,

1:23:561:24:00

transportation, communications,

1:24:001:24:03

financial systems...

1:24:031:24:05

We were inside, waiting, watching,

1:24:051:24:08

ready to disrupt, degrade and destroy those systems

1:24:081:24:11

with cyber attacks.

1:24:111:24:13

In comparison, Stuxnet was a back-alley operation.

1:24:161:24:20

NZ was the plan for a full-scale cyber war with no attribution.

1:24:211:24:27

We need an entirely new way of thinking

1:24:27 - 1:24:29

about how we're going to solve this problem.

1:24:29 - 1:24:31

You're not going to get an entirely new way of solving this problem

1:24:31 - 1:24:35

until you begin to have an open acknowledgement

1:24:35 - 1:24:38

that we have cyber weapons, as well,

1:24:38 - 1:24:41

and that we may have to agree to some limits on their use

1:24:41 - 1:24:44

if we're going to get other nations to limit their use.

1:24:44 - 1:24:47

It's not going to be a one-way street.

1:24:47 - 1:24:49

I'm old enough to have worked on nuclear arms control,

1:24:49 - 1:24:52

and biological weapons arms control,

1:24:52 - 1:24:54

and chemical weapons arms control.

1:24:54 - 1:24:56

And I was told in each of those types of arms control,

1:24:57 - 1:25:02

when we were beginning, "It's too hard. There are all these problems.

1:25:02 - 1:25:06

"It's technical. There's engineering.

1:25:06 - 1:25:08

"There's science involved.

1:25:08 - 1:25:10

"There are real verification difficulties.

1:25:10 - 1:25:12

"You'll never get there."

1:25:12 - 1:25:14

Well, it took 20, 30 years in some cases,

1:25:14 - 1:25:17

but we have a biological weapons treaty that's pretty damn good.

1:25:17 - 1:25:20

We have a chemical weapons treaty that's pretty damn good.

1:25:20 - 1:25:23

We've got three or four nuclear weapons treaties.

1:25:23 - 1:25:25

Yes, it may be hard and it may take 20 or 30 years,

1:25:25 - 1:25:29

but it'll never happen unless you get serious about it,

1:25:29 - 1:25:32

and it'll never happen unless you start it.

1:25:32 - 1:25:35

Today, after two years of negotiations,

1:25:40 - 1:25:43

the United States, together with our international partners,

1:25:43 - 1:25:46

has achieved something that decades of animosity has not -

1:25:46 - 1:25:51

a comprehensive, long-term deal with Iran

1:25:51 - 1:25:53

that will prevent it from obtaining a nuclear weapon.

1:25:53 - 1:25:57

It is a deal in which Iran

1:25:57 - 1:25:59

will cut its installed centrifuges

1:25:59 - 1:26:02

by more than two thirds.

1:26:02 - 1:26:04

Iran will not enrich uranium with its advanced centrifuges

1:26:04 - 1:26:07

for at least the next ten years.

1:26:07 - 1:26:09

It will make our country, our allies, and our world safer.

1:26:09 - 1:26:14

70 years after the murder of 6 million Jews,

1:26:14 - 1:26:18

Iran's rulers

1:26:18 - 1:26:20

promise to destroy my country, and the response

1:26:20 - 1:26:24

from nearly every one of the governments represented here

1:26:24 - 1:26:28

has been utter silence.

1:26:28 - 1:26:31

Deafening silence.

1:26:31 - 1:26:33

Perhaps you can now understand

1:26:40 - 1:26:43

why Israel is not joining you in celebrating this deal.

1:26:43 - 1:26:47

History shows that America must lead

1:26:47 - 1:26:49

not just with our might, but with our principles.

1:26:49 - 1:26:53

It shows we are stronger

1:26:53 - 1:26:55

not when we are alone but when we bring the world together.

1:26:55 - 1:27:00

Today's announcement marks one more chapter

1:27:00 - 1:27:03

in this pursuit of a safer and a more helpful,

1:27:03 - 1:27:06

more hopeful world.

1:27:06 - 1:27:08

Thank you. God bless you and God bless the United States of America.

1:27:08 - 1:27:13

-DISTORTED FEMALE VOICE:

-Everyone I know is thrilled

1:27:18 - 1:27:20

with the Iran deal. Sanctions and diplomacy worked,

1:27:20 - 1:27:24

but behind that deal was a lot of confidence in our cyber capability.

1:27:24 - 1:27:27

We were everywhere inside Iran, still are.

1:27:28 - 1:27:32

I'm not going to tell you the operational details

1:27:32 - 1:27:34

of what we can do, going forward, or where...

1:27:34 - 1:27:38

but the science-fiction cyber war scenario is here,

1:27:38 - 1:27:41

and that's Nitro Zeus.

1:27:41 - 1:27:43

But my concern, and the reason I'm talking...

1:27:45 - 1:27:47

..is because when you shut down a country's power grid...

1:27:48 - 1:27:53

it doesn't just pop back up.

1:27:53 - 1:27:55

You know, it's more like Humpty Dumpty,

1:27:55 - 1:27:58

and if all the king's men can't turn the lights back on

1:27:58 - 1:28:02

or filter the water for weeks,

1:28:02 - 1:28:04

then lots of people die.

1:28:04 - 1:28:06

And something we can do to others, they can do to us, too.

1:28:08 - 1:28:12

Is that something that we should keep quiet

1:28:14 - 1:28:16

or should we talk about it?

1:28:16 - 1:28:18

I've gone to many people on this film, even friends of mine,

1:28:18 - 1:28:22

who won't talk to me about the NSA and Stuxnet, even off the record,

1:28:22 - 1:28:24

for fear of going to jail.

1:28:24 - 1:28:26

Is that fear protecting us?

1:28:26 - 1:28:29

No. But it protects me.

1:28:29 - 1:28:32

Or should I say "we"?

1:28:32 - 1:28:34

-NO VOICE DISTORTION:

-I'm an actor playing a role,

1:28:35 - 1:28:37

written from the testimony of a small number of people

1:28:37 - 1:28:40

from NSA and CIA - all of whom are angry about the secrecy,

1:28:40 - 1:28:43

but too scared to come forward.

1:28:43 - 1:28:45

Now, we're forward.

1:28:45 - 1:28:47

Well...

1:28:47 - 1:28:49

"forward-leaning".

1:28:49 - 1:28:51

Documentary thriller about warfare in a world without rules - the world of cyberwar. It tells the story of Stuxnet, self-replicating computer malware, known as a 'worm' for its ability to burrow from computer to computer on its own. In a covert operation, the American and Israeli intelligence agencies allegedly unleashed Stuxnet to destroy a key part of an Iranian nuclear facility. Ultimately the 'worm' spread beyond its intended target.

Zero Days is the most comprehensive account to date of how a clandestine mission forever opened the Pandora's box of cyber warfare. A cautionary tale of technology, politics, unintended consequences, morality, and the dangers of secrecy.

