Spy Special Click



Transcript


Line / From / To

Now on BBC News, Click.

0:00:020:00:05

This week, international espionage.

0:00:050:00:08

Shark fishing in Vegas.

0:00:080:00:09

Spooked out in London.

0:00:090:00:12

The way you walk can be used against you.

0:00:120:00:42

And, I spy in Hong Kong.

0:00:420:00:43

conventions have just taken place.

0:00:430:00:45

We've seen ATMs conned into spewing out cash and solar panels controlled

0:00:450:00:48

from the other side of the world.

0:00:480:00:50

And while Black Hat is the conference for the security

0:00:500:00:52

professionals, the hobbyist hackers hang out at Def Con,

0:00:520:00:55

and they're not all good guys.

0:00:550:01:03

There are all night parties and challenges to hack

0:01:030:01:09

into visitors' phones or laptops with the victim's details posted up

0:01:090:01:11

here, on the notorious Wall of Sheep.

0:01:110:01:23

What these big events expose is the vulnerability of anything

0:01:230:01:26

that's connected to the internet.

0:01:260:01:27

And, of course, we are moving towards a world where everything

0:01:270:01:30

will be connected to the internet from the infrastructure,

0:01:300:01:32

to your car, to your home appliances, to your clothes.

0:01:320:01:35

Yet there's always something new in the world of hacking and this

0:01:350:01:38

week we'll show you some of it.

0:01:380:01:45

But first, Dan Simmons is off gambling.

0:01:450:01:50

When Click was offered a rare opportunity to film how a top casino

0:01:500:01:53

has used its own tech to catch the cheats,

0:01:530:01:56

we sent him in with a simple brief - get the story and come home safely.

0:01:560:02:17

But with Sin City this full of bad guys, it didn't quite go to plan.

0:02:170:02:20

I'm on the inside, but who can I trust?

0:02:200:02:22

A big company boss was caught cheating at the tables only

0:02:220:02:25

last month here.

0:02:250:02:50

In the old days, they'd have had agents casually observing suspects

0:02:500:02:53

on the gaming floor or even snooping through a ceiling made of glass.

0:02:530:02:56

Today, this $20 million security set-up is all about the cameras,

0:02:560:02:59

and this casino has some new ones.

0:02:590:03:00

Can you see it?

0:03:000:03:02

There.

0:03:020:03:02

He's got it on his finger.

0:03:020:03:03

Watch how he handles the cards, and that's him putting the mark.

0:03:030:03:06

It's that subtle.

0:03:060:03:07

It's just that little scratch on the back.

0:03:070:03:09

There.

0:03:090:03:09

I can actually see it.

0:03:090:03:11

He put it on pretty heavy, so we can see it here.

0:03:110:03:13

That's why they call it a visible mark.

0:03:130:03:16

And the idea is, when he sees the card -

0:03:160:03:18

a Queen, King, Ace - come up, he's going to bet as much

0:03:180:03:21

as he can.

0:03:210:03:22

So this is just the perfect example.

0:03:220:03:24

Let's see when the cards come over.

0:03:240:03:26

There's an Ace and a King.

0:03:260:03:27

Of course it's appropriate to bet the most he can when he gets

0:03:270:03:30

those good cards.

0:03:300:03:31

The biggest thing that's happened is the addition

0:03:310:03:33

of the 360 degree cameras.

0:03:330:03:43

They take the place of 400 pan-tilt-zoom cameras.

0:03:430:03:45

I can replace 400 of those with 50 Oncam 360 cameras.

0:03:450:03:47

I can literally follow the bad guy every step he took in this building.

0:03:470:03:51

Look at that, how far that thing goes.

0:03:510:03:55

This is two megapixels, the new ones are 12.

0:03:550:03:57

So, from this shot here, I would be able to identify that person.

0:03:570:04:00
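Ted's two-versus-twelve-megapixel comparison comes down to how many pixels land on the subject. As a rough sketch (the field-of-view and resolution figures here are illustrative assumptions, not the casino's actual specs), pixel density at a given distance follows from simple lens geometry:

```python
import math

def pixels_per_metre(h_pixels, hfov_degrees, distance_m):
    """Approximate horizontal pixel density on a subject at a given distance:
    the sensor's width in pixels divided by the width of the scene it sees."""
    scene_width = 2 * distance_m * math.tan(math.radians(hfov_degrees) / 2)
    return h_pixels / scene_width
```

At the same 90-degree field of view and 10 metres, a 12 MP sensor (roughly 4,000 pixels wide) lays down about 200 pixels per metre, 2.5 times what a 2 MP sensor (roughly 1,600 pixels wide) manages, which is the difference between a blur and an identifiable face.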

Ted says reliable, automatic facial recognition is just a year

0:04:000:04:03

or two away.

0:04:030:04:04

The cameras can already analyse and report the general movements

0:04:040:04:06

of people, but it's still the skill of the operators in catching a cheat

0:04:060:04:10

in the act that's his trump card.

0:04:100:04:11

Ted's enthusiasm has led him to hack together some unique

0:04:110:04:14

and revealing tech himself.

0:04:140:04:15

You can see right here this card, if you're looking at it with your naked

0:04:150:04:18

eye, you cannot see through it.

0:04:180:04:20

But once you put it up to the right light and the right lens,

0:04:200:04:23

you can see right through it, like it's...

0:04:230:04:25

Look, he's got - nine, nine.

0:04:250:04:28

So this is a mark that I had printed on this card,

0:04:280:04:30

but I could apply this type of ink on the fly while I'm sitting

0:04:300:04:34

on a game.

0:04:340:04:36

The thing is, all you need is the right light source

0:04:360:04:38

and the right lens.

0:04:380:04:39

This device here has nine different lenses in it

0:04:390:04:42

which I can cycle through.

0:04:420:04:43

I don't know if your camera can catch it, but I can.

0:04:430:04:51

This lens is in the glasses or in their contacts.

0:04:510:04:53

So they don't need a camera like we have, they can do this

0:04:530:04:56

with manual devices, just like my glasses.

0:04:560:04:58

So this is just a cheap phone whose camera I've had modified.

0:04:580:05:01

Let me see if I can find a good one.

0:05:010:05:04

You can see through that card with this phone.

0:05:040:05:07

When casinos buy cards, they have to be careful not to buy the cheapest card

0:05:070:05:10

because this is what you get sometimes.

0:05:100:05:12

Our casinos would never have a card like this on the floor.

0:05:120:05:24

Our cheat is at it again.

0:05:250:05:26

This time he's gambled too much on what turns out to be a poor hand.

0:05:260:05:37

That was a great move.

0:05:370:05:41

So it looked as if he's tucking the cards and he's actually knocking

0:05:410:05:44

the chip back into his hand and therefore he lost less money

0:05:440:05:47

than he was going to lose.

0:05:470:05:49

This is the most popular way to con the casino,

0:05:490:05:51

but the cameras have caught it, and Ted's seen enough.

0:05:510:05:54

Excuse me, sir, how are you doing?

0:05:540:05:55

I need you to step away from the table and come with us.

0:05:550:06:00

Our cheat's a plant.

0:06:000:06:01

For legal reasons we can't show you an actual pick-up but,

0:06:010:06:03

while we were filming, Ted caught one guy doing exactly

0:06:030:06:07

the same cheats we've just shown you.

0:06:070:06:09

And that's payday for the good guys.

0:06:090:06:10

But you can't catch them all.

0:06:100:06:33

Drones can be used as a threat vector against not only stationary

0:06:540:07:10

targets, like industrial wireless, but also against moving

0:07:100:07:12

targets as well.

0:07:120:07:12

My name is Jeff Melrose.

0:07:120:07:14

For about 16 years I worked on designing secure systems

0:07:140:07:16

for the US Department of Defense as well as the US intelligence

0:07:160:07:19

community.

0:07:190:07:19

Jeff's research, revealed last week in Las Vegas,

0:07:190:07:21

shows a drone carrying an electromagnetic disrupter

0:07:210:07:36

could create what he calls "a cone of silence" around a moving target.

0:07:360:07:39

The drop off co-ordinates are one, three, seven,

0:07:390:07:41

four, seven, two, nine.

0:07:410:07:46

It locks on automatically without the need for an operator

0:07:460:07:49

and is a threat, he believes, many aren't ready for.

0:07:490:07:53

The ability to provide persistent degradation of navigation signals

0:07:530:07:59

as well as communications is something that would be quite

0:07:590:08:01

useful if you wanted to redirect cargo, redirect a ship,

0:08:010:08:04

potentially causing an environmental disaster in a port, those

0:08:040:08:06

types of things.

0:08:060:08:29

As well as killing comms, interfering with the car's

0:08:290:08:31

telematics or GPS could cause serious problems, especially if it's

0:08:310:08:33

a self-driving vehicle.

0:08:330:08:56

Thankfully, Jeff is on my side.

0:08:560:08:57

OK, maybe making it blow up is a bit of a stretch!

0:08:570:09:00

What happens now, Jeff?

0:09:000:09:01

Just follow the drone.

0:09:010:09:02

What, all the way to London?

0:09:020:09:04

You could have sent a chopper.

0:09:040:09:12

Hello, and welcome to the Week in Tech.

0:09:230:09:25

This was the week that 1,007 miniature robots danced their way

0:09:250:09:29

into the record books.

0:09:290:09:32

Google revealed an experiment to try and combat trolling in virtual

0:09:320:09:35

reality, by making players look like dogs.

0:09:350:09:45

And in honour of International Cat Day, Facebook unveiled research

0:09:450:09:48

showing that cat people have fewer friends and are more likely to be

0:09:480:09:51

single than dog people.

0:09:510:09:52

Happy Cat Day everyone.

0:09:520:09:53

Facebook also announced it will begin bypassing ad blockers

0:09:530:09:55

used by some people on its network.

0:09:550:09:57

The company, which made $17 billion from selling ads last year,

0:09:570:10:00

said it will offer users more options to control what sort

0:10:000:10:03

of ads they see.

0:10:030:10:04

I'm pleased to bring you news that the flying bottom has finally

0:10:040:10:06

emerged from its lair.

0:10:070:10:07

Also known as the Airlander 10 by its creators, Click visited this

0:10:070:10:11

gigantic airship earlier in the year, when it was still

0:10:110:10:13

locked up indoors.

0:10:130:10:15

Now it's outside and, once tests are complete,

0:10:150:10:17

the plan is for it to fly continuously for up to two weeks

0:10:170:10:20

at a time.

0:10:200:10:21

NASA really treated us this week.

0:10:210:10:22

First, they gave us this spectacularly detailed footage

0:10:220:10:24

of a rocket burn, captured with their new high dynamic range

0:10:240:10:27

camera and then, sticking with the spectacular theme,

0:10:270:10:29

they released a smorgasbord of new Mars imagery.

0:10:290:10:33

And, finally...

0:10:330:10:36

Ah, the joy of cycling, the wind in your hair,

0:10:360:10:40

the sun on your back.

0:10:400:10:45

Games developer, Aaron Pewsey, is on a mission to bike the length

0:10:450:10:48

of Britain, all without leaving his front room.

0:10:480:10:50

By hooking up his exercise bike to a pair of virtual reality goggles

0:10:500:10:53

and Google's Street View imagery, he's hoping he'll spice

0:10:530:10:57

up his exercise routine and cycle 1,500 virtual kilometres

0:10:570:10:59

in the process.

0:10:590:11:03

If you've reset your Facebook or your Google password recently

0:11:030:11:07

or you've made a payment online, you may have received one of these.

0:11:070:11:14

It's a text message to confirm that you're not just some hacker trying

0:11:140:11:21

to access the account from somewhere else in the world.

0:11:210:11:23

The theory being that most hacking is done by someone thousands

0:11:230:11:26

of miles away from you, who doesn't actually know you.

0:11:260:11:29

So although they can try and log on as you,

0:11:290:11:34

they won't have access to your phone and can't see the text message.

0:11:340:11:38

This belt and braces approach, called two-factor authentication,

0:11:380:11:40

has worked very well at stopping malicious software from just

0:11:400:11:42

rinsing your bank accounts.

0:11:420:11:44
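The confirmation-code flow described above can be sketched in a few lines. This is a minimal illustration of the general pattern, not any real provider's implementation; the `send_sms` stub and the five-minute expiry are assumptions:

```python
import hmac
import secrets
import time

CODE_TTL = 300  # how long a code stays valid, in seconds (an assumed value)

_pending = {}   # phone number -> (code, expiry timestamp)

def send_sms(number, text):
    """Stub standing in for a real SMS gateway."""
    print(f"SMS to {number}: {text}")

def issue_code(number):
    """Generate a 6-digit one-time code and 'text' it to the user."""
    code = f"{secrets.randbelow(10 ** 6):06d}"
    _pending[number] = (code, time.time() + CODE_TTL)
    send_sms(number, f"Your confirmation code is {code}")
    return code  # returned only so the sketch is easy to exercise

def verify_code(number, attempt):
    """Single-use check: compare in constant time and enforce the expiry."""
    entry = _pending.pop(number, None)
    if entry is None:
        return False
    code, expiry = entry
    return time.time() < expiry and hmac.compare_digest(code, attempt)
```

The point the demonstration that follows makes is that the weakness is not in logic like this at all: an attacker who can redirect the SMS simply receives the code and passes the check.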

But it turns out it's not as secure as many think.

0:11:440:11:49

In fact, the US government's National Institute of Standards

0:11:490:11:51

and Technology has recently advised against services using SMS

0:11:510:11:53

confirmation codes in the future.

0:11:530:11:58

The UK's National Crime Agency agrees.

0:11:580:12:01

I'm about to be shown one of the reasons why,

0:12:010:12:05

at Positive Technologies in London.

0:12:060:12:22

It's a weakness in the global mobile phone network,

0:12:220:12:24

the part called SS7, which allows hackers to intercept

0:12:240:12:26

phone messages to a given phone number.

0:12:260:12:28

We can intercept phone calls.

0:12:280:12:29

We can redirect phone calls.

0:12:290:12:30

We can attack a particular subscriber, so that subscriber

0:12:300:12:33

will not be available to make phone calls.

0:12:330:12:35

We can attack a particular infrastructure.

0:12:350:12:42

Yeah, and not only can a hacker disrupt and listen to your calls,

0:12:420:12:46

they can also track you to the nearest phone tower, too.

0:12:460:12:48

It's been known about for a few years and the means to do it

0:12:480:12:52

are available for sale on the dark web for really not that much money.

0:12:520:12:55

But arguably the most powerful thing they can do

0:12:550:12:58

is redirect your text messages.

0:12:580:13:02

To show you how it works, a researcher somewhere else

0:13:020:13:06

in Europe, is about to intercept a text message and use it to gain

0:13:060:13:09

access to my Facebook account.

0:13:090:13:18

That's my brand new Facebook account, not the real one -

0:13:180:13:21

thanks very much.

0:13:210:13:21

Shall we mail?

0:13:210:13:26

And this is the important bit, I'm going to put in the mobile

0:13:260:13:30

number which Facebook can contact me on should I forget my password.

0:13:300:13:36

Now it's time to unleash our guy.

0:13:360:13:38

Now, we're watching his actions on this screen here.

0:13:380:13:40

He's going to tell Facebook he's me and pretend that I have

0:13:400:13:43

forgotten my password.

0:13:430:13:50

The way Facebook checks that it really is me who's asking is,

0:13:500:13:53

of course, by sending a code to my phone and asking me to enter

0:13:530:13:57

that onto the website.

0:13:570:14:02

Facebook has sent a text to that phone over there.

0:14:020:14:04

Apart from, it hasn't...

0:14:040:14:11

That text has been redirected to our guy's phone elsewhere

0:14:110:14:13

in the world.

0:14:130:14:17

In fact, there's the text message on our guy's computer screen.

0:14:200:14:23

He now has all he needs to reset my password and wreak havoc

0:14:230:14:27

on my account.

0:14:270:14:33

OK, let's see what's been done to me.

0:14:330:14:35

There I am.

0:14:350:14:42

I'm lucky it's just a flower that I've been changed to.

0:14:420:14:45

The thing is, this is actually nothing to do with Facebook.

0:14:450:14:48

Any service that uses texts is vulnerable.

0:14:480:14:57


That could be your bank.

0:15:130:15:14

I'll say that again - your bank!

0:15:140:15:16

That's where your money is, remember.

0:15:160:15:18

We asked Facebook whether it should rethink its use of SMS verification.

0:15:180:15:21

It said that the technique we witnessed requires significant

0:15:210:15:23

technical and financial investment and so it's very low-risk for most

0:15:230:15:26

people, although it does offer a more secure log-in feature

0:15:260:15:28

called log-in approvals.

0:15:280:15:34

We also asked the British intelligence and security

0:15:340:15:36

organisation, GCHQ, for advice and we were told that the SS7 system

0:15:360:15:39

is old and doesn't have modern security protections built in,

0:15:390:15:42

but they assured us they are working on building up its resilience.

0:15:420:15:50

But why is the SS7 network so open to abuse?

0:15:500:15:53

To find out, I met up with Professor Alan Woodward,

0:15:530:15:55

long time security adviser to Europol.

0:15:550:15:57

We chose a park bench because, well, it just felt a bit spy-like.

0:15:570:16:02

A lot of people talk about SS7 having a security flaw in it,

0:16:020:16:06

but it was never designed to be secure because at the time

0:16:060:16:08

it was designed, which was, you've got to remember,

0:16:080:16:11

nearly 40 years ago, the networks inherently

0:16:110:16:13

trusted each other.

0:16:130:16:13

So they didn't think they had to validate talking to each other,

0:16:130:16:16

they just naturally assumed that they were the only ones talking.

0:16:160:16:19

Can they fix it?

0:16:200:16:21

They could, but I don't think they will.

0:16:210:16:22

It's not even within the law enforcement agencies' gift to do

0:16:220:16:25

anything about it.

0:16:250:16:26

Nobody mandates this standard, it's all by agreement amongst

0:16:260:16:28

the telecommunication providers and I don't see them changing it

0:16:280:16:31

any time soon because it's very targeted, so somebody has

0:16:310:16:33

to want to come after you.

0:16:330:16:35

Plus, one of the reasons that some intelligence agencies won't want it

0:16:350:16:38

fixed is because SS7 allows you to be tracked.

0:16:380:16:40

I, personally, would recommend not using SMS because you have to assume

0:16:400:16:43

it could be intercepted and use something like e-mail only

0:16:430:16:45

or ideally move to one of the online services

0:16:450:16:48

which is encrypted.

0:16:480:17:04
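The alternative Professor Woodward alludes to is usually an authenticator app, which generates time-based one-time passwords on the device itself, so there is no message in transit to intercept. A minimal sketch of the standard TOTP algorithm (RFC 6238, built on RFC 4226 HOTP with HMAC-SHA-1 and 30-second steps):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret, counter, digits=6):
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                    # dynamic truncation
    value = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

def totp(secret, at=None, step=30, digits=6):
    """RFC 6238 time-based variant: the counter is the current 30-second window."""
    at = time.time() if at is None else at
    return hotp(secret, int(at // step), digits)
```

With the RFC 6238 test secret `b"12345678901234567890"`, `totp(secret, at=59, digits=8)` yields the published test vector `94287082`. Because both ends derive the code locally from a shared secret and the clock, nothing useful crosses the phone network at all.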

You're crazy, old man.

0:17:060:17:07

You have no idea.

0:17:070:17:08

Get ready.

0:17:080:17:10

Now, from hacking to tracking.

0:17:100:17:11

We wondered whether it was possible to escape state snooping

0:17:110:17:14

when the Intelligence Services are trying to track your every move.

0:17:140:17:16

Lara Lewington tracked down an ex-spook to find out how you can

0:17:160:17:20

shake off any unwanted attention.

0:17:200:17:30

Listen up.

0:17:300:17:31

The suspect's name is Special Agent Elizabeth Keen.

0:17:310:17:33

She was my partner but, as of today, she's a fugitive wanted...

0:17:330:17:39

OK, it may be a glamourised version, but this scene is only too familiar

0:17:390:17:42

to former MI5 officer Annie Machon, who went on the run after quitting

0:17:420:17:45

the secret service to go public on her experiences.

0:17:450:17:51

I suppose we were the sort of 1990s version of Edward Snowden.

0:17:510:17:54

Tracking someone's whereabouts may not always look as exciting as this,

0:17:540:17:57

but doing so has become easier than ever.

0:17:570:17:59

So how simply can we hide?

0:17:590:18:09

How easy is it, using modern technology, to find out

0:18:090:18:11

where someone is and what can we do to protect ourselves from that?

0:18:110:18:15

Now we're looking at programmes, such as TrapWire, which was one

0:18:150:18:18

of the disclosures via WikiLeaks a few years ago,

0:18:180:18:20

where you get meshing of your GPS, from your phone, facial recognition

0:18:200:18:23

technology via Google and also Facebook photo tagging,

0:18:230:18:25

which can be fed into the CCTV network around the UK.

0:18:250:18:47

So in live-time you can be tracked, you can be photographed,

0:18:470:18:49

you can be identified and then, the icing on the cake,

0:18:490:18:52

which is TrapWire, they've written algorithmic predictive programmes

0:18:520:18:54

and if you're loitering or walking in a suspicious manner,

0:18:540:18:57

and you've already been identified, then you might be

0:18:570:18:59

about to commit a crime.

0:18:590:19:00

You might be about to be a terrorist, so then they can swoop

0:19:000:19:03

and get you.

0:19:030:19:04

The way you walk can be used against you.

0:19:040:19:06

The only way to get around that, as well as disguising yourself,

0:19:060:19:09

is to perhaps put a stone in your shoe, so it forces

0:19:090:19:12

you to walk differently.

0:19:120:19:13

But you said you wanted to know when Mrs Hargrave

0:19:130:19:16

contacted your office.

0:19:160:19:17

Put her through.

0:19:170:19:17

Turn it up.

0:19:170:19:18

You're on with Mrs Hargrave.

0:19:180:19:19

Annie also provides us with a reminder as to how the spies

0:19:190:19:22

can snoop on phone calls and e-mails.

0:19:220:19:24

Since 1998 they have been able to do bulk data hacking,

0:19:240:19:27

which means they can gather all our information,

0:19:270:19:29

be it travel, be it finance, be it health, be it work,

0:19:290:19:32

be it social life on Facebook, whatever, and amalgamate it.

0:19:320:19:34

So our entire lives are just laid open to whoever wants to look

0:19:340:19:38

at that information.

0:19:380:19:38

Now as soon as you do something which might trigger an interest,

0:19:380:19:41

they will specifically look at you - if you get involved in a political

0:19:410:19:45

campaign, or whatever.

0:19:450:19:46

So this is very difficult to try and avoid.

0:19:460:19:48

There are certain tools you can use, the privacy tools.

0:19:480:19:50

In fact, Edward Snowden has said if you use all of these together,

0:19:500:19:54

and you use them effectively, you can protect your privacy.

0:19:540:19:56

If someone discovers they are being snooped on,

0:19:560:19:58

what should they actually do about it?

0:19:580:20:00

Yes, get a Faraday cage wallet and then lock their phone

0:20:000:20:03

in the fridge or in a biscuit tin overnight.

0:20:030:20:05

The other solution is to go back to the old tech.

0:20:050:20:08

You get pre-2008 computers, you get very old burner phones

0:20:080:20:10

with pre-paid SIMS, and that gives you a fighting chance.

0:20:100:20:15

In terms of the people that we communicate with,

0:20:150:20:17

obviously knowing someone's network and connections is actually

0:20:170:20:19

quite valuable information.

0:20:190:20:19

Hugely valuable.

0:20:190:20:20

I mean, it's probably the key information

0:20:200:20:22

for any intelligence agency.

0:20:220:20:23

We would take weeks, if not months, to try and pull together a full

0:20:230:20:26

picture of a target's networks and who they're in relationships

0:20:260:20:29

with, what their activities are and where they live,

0:20:290:20:31

where they work and where they go, and that was very,

0:20:310:20:34

very resource intensive.

0:20:340:20:35

It could involve human agents.

0:20:350:20:36

It could involve mobile surveillance, teams following them

0:20:360:20:38

around a city.

0:20:380:20:42

Now, of course, they can just do it by triangulating

0:20:420:20:45

information by hacking all our information on the internet,

0:20:450:20:48

and they've got us.

0:20:480:20:49

Only recently Edward Snowden has been involved in designing a phone

0:20:490:20:51

case aiming to track when data is being sent or received,

0:20:510:20:54

so a user can tell if it's happening unexpectedly.

0:20:540:20:56

This may all sound a bit extreme to most of us,

0:20:560:20:59

but if you ever do need to go under the radar,

0:20:590:21:02

at least now you may know how.

0:21:020:21:04

Um...very useful to know.

0:21:040:21:10

But then there are those people who've actually put themselves under

0:21:100:21:13

surveillance without realising they're on show to everyone.

0:21:130:21:15

This is backdoored.io, an exhibition filled with images

0:21:150:21:17

from webcams whose passwords haven't been changed from admin.

0:21:170:21:19

It's funny what people put security cameras

0:21:190:21:21

on, though, isn't it?

0:21:210:21:34

Yeah.

0:21:340:21:35

That's been one of the really interesting insights.

0:21:350:21:36

I mean, here people are watching their laundry,

0:21:360:21:39

so you can just imagine, you know - is my laundry dry?

0:21:390:21:42

I'll just go and check the security camera.

0:21:420:21:43

I've got one of plugs.

0:21:440:21:45

Is that plug still plugged in?

0:21:450:21:46

You know?

0:21:460:21:53

Artist Nye Thompson and curator Kosha Hussain have sifted

0:21:530:21:55

through 7,000 images which have been collected by search engines

0:21:550:21:57

from unprotected cameras.

0:21:570:21:58

Why have you got a whole area dedicated just to one street in Hong

0:21:580:22:02

Kong?

0:22:020:22:05

So when the images come in, they all come in with geolocation

0:22:050:22:08

data, so you can actually pinpoint where they are,

0:22:080:22:16

and I noticed there was a disproportionately large number

0:22:160:22:18

of images coming in from just this one location.

0:22:180:22:20

I think somebody just blanket installed surveillance cameras

0:22:200:22:22

in this tower block.

0:22:220:22:23

You know, the same person did all of them and they didn't really

0:22:230:22:26

know what they were doing.

0:22:260:22:39
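The geolocation data Nye mentions typically travels as EXIF GPS tags: degrees, minutes and seconds stored as rationals, plus a hemisphere letter. Turning those into plottable decimal coordinates is simple arithmetic (the example coordinates below are illustrative, not taken from the exhibition):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert EXIF-style degrees/minutes/seconds plus an N/S/E/W letter
    into signed decimal degrees (south and west are negative)."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value
```

For instance, `dms_to_decimal(22, 16, 51.6, "N")` comes out at about 22.281, a latitude consistent with Hong Kong, which is how a cluster of feeds can be pinpointed to one street.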

Many of us use webcams to monitor the things that we're anxious about,

0:22:390:22:43

but I wonder how anxious these people would be if they knew

0:22:430:22:45

who was really watching?

0:22:450:22:47

That's it from Wing Lee Street in Hong Kong.

0:22:470:22:57

On behalf of the residents, thank you for watching,

0:22:570:22:59

even though I don't think they realise.

0:22:590:23:01

And curiously, over the next two weeks, we are going back to China

0:23:010:23:04

with another chance to see two of our China programmes

0:23:040:23:07

from earlier this year, as we take a well-deserved summer break.

0:23:070:23:16

'Made in China' is becoming 'designed in China'.

0:23:170:23:19

That's going to cost him at least a minute.

0:23:190:23:22

Underneath that helmet, he's going to be absolutely fuming.

0:23:220:23:24

So, China, one and two, coming in the next fortnight.

0:23:240:23:27

In the meantime, follow us on Twitter, as always, @BBCClick.

0:23:270:23:29

Thanks for watching and we'll see you soon.

0:23:290:24:09

THEME PLAYS.

0:24:200:24:22
