13/02/2016

Transcript

:00:12. > :00:17.Welcome to London. It's a crime that is costing corporations dearly

:00:18. > :00:21.and it is growing. Cyber attacks are fast becoming the most common way

:00:22. > :00:27.to steal money and information and damage reputations. These invisible

:00:28. > :00:31.enemies are holding businesses to ransom with increasingly

:00:32. > :00:36.sophisticated tactics. What can be done to combat what many are calling

:00:37. > :00:37.the crime of this generation? That is what we will be discussing on

:00:38. > :01:06.talking business. In a high-tech world where

:01:07. > :01:10.everything is connected, cyber crime is big business. It's fast becoming

:01:11. > :01:15.the issue which is most on the minds of company bosses. That is not

:01:16. > :01:21.surprising given it is estimated to be costing global businesses around

:01:22. > :01:28.$450 billion a year. Internet access has become the cornerstone of modern

:01:29. > :01:32.life. The activity which drives business makes it vulnerable. Last

:01:33. > :01:38.year a number of high-profile breaches highlighted the

:01:39. > :01:43.reputational risk and cost posed by cyber crime. UK telecoms group

:01:44. > :01:48.TalkTalk fell victim to a hack attack which cost the company around $85

:01:49. > :01:54.million. Over 100,000 customers took their business elsewhere. The

:01:55. > :02:01.personal data of over 6 million children was exposed when Hong

:02:02. > :02:06.Kong-based VTech, which specialises in digital educational toys, fell

:02:07. > :02:11.victim to hackers. In the USA, infidelity website Ashley Madison

:02:12. > :02:14.which has over 30 million users, came under attack when hackers threatened

:02:15. > :02:21.to expose the names of adulterers unless the site was taken off-line.

:02:22. > :02:24.And in Ukraine, thousands of people were left without electricity, the

:02:25. > :02:31.first known result of hacking bringing down a power network. The

:02:32. > :02:35.digital world of the 21st century has created a breeding ground for

:02:36. > :02:41.cyber criminality affecting businesses. The challenge is how to

:02:42. > :02:45.keep one step ahead of the hackers. And to discuss how businesses might

:02:46. > :02:51.go about that I'm joined by three people on the front line in the

:02:52. > :02:56.fight against cyber crime. George Quigley is a cyber security partner in

:02:57. > :03:01.London for KPMG. He has over 20 years of experience in technology,

:03:02. > :03:06.risk and information security. James Chappell is the co-founder and chief

:03:07. > :03:09.technology officer with Digital Shadows, a company which analyses

:03:10. > :03:16.digital footprints to expose weaknesses. And the Professor of

:03:17. > :03:21.Cyber Security and Director of the Global Cyber Security Capacity

:03:22. > :03:27.Centre at Oxford University. Welcome to all of you. James, if I might

:03:28. > :03:32.start with you, first of all we need to define the risk. When we think

:03:33. > :03:35.about cyber security, as individuals we think of personal information

:03:36. > :03:42.being stolen or bank details but it is more complicated than that?

:03:43. > :03:45.Define the risk if you would? We often think about keeping our

:03:46. > :03:50.secrets secret or private information private but just as

:03:51. > :03:55.important is keeping businesses operating normally and keeping trust

:03:56. > :03:57.in information we use day-to-day, so we think about the stock market or

:03:58. > :04:06.we think about our bank website being available. When we hand over

:04:07. > :04:09.any information, just staying on the subject of personal information, we

:04:10. > :04:14.have of course lost control of it and that is part of the problem?

:04:15. > :04:19.Yes, you are relying on the company or somebody else to secure your data,

:04:20. > :04:23.so you are relying on them to understand fundamentally what data

:04:24. > :04:28.assets they hold that are perhaps your own and other people's data.

:04:29. > :04:32.And we don't ever have that conversation, how secure is it? We

:04:33. > :04:36.don't ask the question and actually I think there's a problem because

:04:37. > :04:40.there's a lack of information. I don't think we really understand how

:04:41. > :04:44.much of this information is leaking from organisations and how much is out

:04:45. > :04:48.there already. We don't ask the question when we give people the

:04:49. > :04:52.information and it is hard to find out if an organisation has properly

:04:53. > :04:55.secured it in the first place, what certification standard would I look

:04:56. > :05:00.at and how would I know if it is secure or not. There's an awful lot

:05:01. > :05:04.of trust and we have grown up in an environment of trust. And part of

:05:05. > :05:09.the problem is companies themselves are not able to secure the data in a

:05:10. > :05:14.way they would like to because cyber criminals are always one or more

:05:15. > :05:16.steps ahead? Certainly it is an increasingly complex challenge and

:05:17. > :05:22.it is also the nature of the business itself, which is constantly

:05:23. > :05:26.changing, the kinds of technology required to deliver that business,

:05:27. > :05:32.and the relationships with the supply chain and partners, and it is in the flow of data

:05:33. > :05:36.that a lot of the risk lies. How should we see these criminals? Should we

:05:37. > :05:41.see them as random individuals, perhaps mavericks, or are they much more

:05:42. > :05:46.organised than that and do we underestimate them if we do that? In

:05:47. > :05:52.the last ten years we have seen a maturing of the ecosystem that

:05:53. > :05:57.underpins the criminality, so yes, there will always be mavericks, but

:05:58. > :06:01.what we are seeing are different units and organisations coming together to

:06:02. > :06:07.offer different skills to enable crime of many different types. The

:06:08. > :06:11.criminals and particularly the serious organised gangs have found

:06:12. > :06:15.out that cyber crime is a good way to reduce risk. We have been looking recently

:06:16. > :06:21.at the human traffickers... Because it is anonymous? It is hard to find,

:06:22. > :06:26.hard to detect, with very few successful prosecutions and it is

:06:27. > :06:30.not seen as a massive problem. You seem to be suggesting it is the same

:06:31. > :06:35.group of people who may have conducted a bank robbery who are

:06:36. > :06:39.suddenly experts at it? I am calling it Crime plc, shadow

:06:40. > :06:44.organisations, and they have the same set-up, including a systems development

:06:45. > :06:47.team; that is one of the main things people are missing. The criminal

:06:48. > :06:53.organisations have discovered they can lower the risk massively by turning to

:06:54. > :06:56.cyber crime and it is quick and cheap and easy to do in many cases,

:06:57. > :06:59.because a lot of organisations are not strengthening their controls,

:07:00. > :07:04.and some of the attacks actually use very low levels of sophistication.

:07:05. > :07:11.Some are very high but the businesses I work with in general,

:07:12. > :07:14.and if you look at the example of TalkTalk, and I do not have any

:07:15. > :07:19.inside information but it was a 15-year-old. The technique has been

:07:20. > :07:23.around for ages and it should have been identified through the risk

:07:24. > :07:28.management process and they should have sorted that, so a lot of this

:07:29. > :07:32.is not very sophisticated, and we have kind of ceded ground to the

:07:33. > :07:36.individuals without responding as we need to, and our challenge

:07:37. > :07:41.from an organisational context is understanding what data we need to

:07:42. > :07:47.protect. And of course sometimes the threat is from within? And what we

:07:48. > :07:50.are discovering as we talk to organisations is that it is

:07:51. > :07:58.massively underreported. Not a surprise. Because it is humiliating?

:07:59. > :08:02.And some fear that it would not look as if they were in control of their

:08:03. > :08:06.organisations and what that might do to confidence in the brand and the

:08:07. > :08:13.consequences for share value. The other issue of course about insiders

:08:14. > :08:18.is they are inherently threats that have access to your systems, often

:08:19. > :08:24.over a lengthy period of time, so we call them threats in the community.

:08:25. > :08:27.That means they have the benefit of serendipity so they might enter the

:08:28. > :08:31.organisation after one particular thing, but because they sit on the

:08:32. > :08:36.systems for some time, they can move around and discover parts of

:08:37. > :08:40.the system they wouldn't have got to, and that means they can turn

:08:41. > :08:43.up data on organisations that is even more valuable than they had set

:08:44. > :08:50.out to get in the first place. That makes it very worrying, for us,

:08:51. > :08:56.because we are humans and we are used to building walls and locking

:08:57. > :09:02.doors, and in the security industry we have great perimeters but we can

:09:03. > :09:05.be very soft inside. And we talk much more about personal information

:09:06. > :09:11.but in fact it is a geopolitical threat? There are those, and that is

:09:12. > :09:16.not my area of expertise but when you look at what is going on across

:09:17. > :09:19.the globe, if you wanted to do something on a geopolitical space

:09:20. > :09:23.cyber is the way to go. We are all connected and there is lots of

:09:24. > :09:28.interesting information flowing around. It is actually quite

:09:29. > :09:31.valuable, and there is also the threat from terrorism and as yet we

:09:32. > :09:35.have not really seen anything major in terms of taking out a part of a

:09:36. > :09:43.national infrastructure although there have been some attempts. We

:09:44. > :09:47.certainly have seen terrorists use it to fund their activities, and

:09:48. > :09:50.I don't know if you can join this together on a global perspective

:09:51. > :09:55.unless something happens at a global level to try to fight this

:09:56. > :09:59.issue and I think it will continue. We saw the Ukrainian power network

:10:00. > :10:04.go down. That was the result of a cyber crime? And in fact if you look

:10:05. > :10:11.at the whole Ukrainian conflict, there have been a whole range of

:10:12. > :10:15.attacks, with cyber heavily involved, including attacks on the Ukrainian banks,

:10:16. > :10:22.so just for me it is another facet of the complex geopolitical world we

:10:23. > :10:26.live in. And thank you all for now. Later in the programme we will be

:10:27. > :10:31.looking at what role the legislation might play in forcing companies to

:10:32. > :10:35.be more transparent about cyber attacks in the future. But first

:10:36. > :10:40.let's hear some thoughts from our comedy consultant, who has gone

:10:41. > :10:41.underground to explore the world of the cyber criminal in this week's

:10:42. > :10:51.talking point. You can't discuss cyber crime properly

:10:52. > :10:57.without a number of cliches: the underground location.

:10:58. > :11:06.I am actually in a meeting room in the 200-year-old vault beneath Dogpatch

:11:07. > :11:11.Labs, here in Dublin. The reason I am here is to get into a

:11:12. > :11:14.tech frame of mind. A recent survey of security managers in North

:11:15. > :11:17.America and Europe suggested one of the biggest barriers to effective

:11:18. > :11:26.defence was employees' lack of awareness of the risks. I'll just

:11:27. > :11:37.get this. Hello? Right. A virus on my computer, that sounds serious. My

:11:38. > :11:45.password? Yes, it's 123. That'll fix it? Thank you so much. Two things

:11:46. > :11:50.scare me, the first is us: as a species, humans are not always the

:11:51. > :11:59.brightest. Our most popular password is still 123456, and the top

:12:00. > :12:06.20 is an embarrassing list of things like the word password. We need to

:12:07. > :12:12.be smarter. There is an enormous return from cyber criminal

:12:13. > :12:17.behaviour, and on that edge there are some very smart hackers who are

:12:18. > :12:22.understanding their target and creating ways to bypass

:12:23. > :12:26.that target. You must design with security in mind. It is easy to do

:12:27. > :12:30.but you have to put the systems in place to do that. There isn't much

:12:31. > :12:35.time and all I can say is we are in the middle of a cyber war and it is

:12:36. > :12:43.a race against time with the cyber criminals. Now if I can just...

:12:44. > :12:50.defibrillate the code, I should be able to reverse the effects of the

:12:51. > :12:59.automatic update and then I should be able to save changes to the

:13:00. > :13:05.Microsoft Word template. Done. Companies don't have a plan for a

:13:06. > :13:09.cyber attack and many companies can get attacked and it is as if they

:13:10. > :13:13.haven't a clue what to do whereas they should have had a plan in place

:13:14. > :13:17.for years in preparation and how they are going to get back in

:13:18. > :13:20.business and who they will inform, law enforcement, data protection, and

:13:21. > :13:25.how they get their systems back up and running. This will get more and more

:13:26. > :13:32.complex in future. Cyber crime will get more and more sophisticated; can

:13:33. > :13:36.the tech system keep up? New things keep coming out and people see

:13:37. > :13:40.how they can make money, and law enforcement has to learn how to

:13:41. > :13:45.detect a crime, but we will always have criminals who learn to break

:13:46. > :13:48.our systems. When it comes to cyber defence you are only as strong as

:13:49. > :13:53.your weakest link and I am sure at some point in the Internet I have

:13:54. > :13:58.signed up for a newsletter whose password is protected by the

:13:59. > :14:01.equivalent of a security guard on the verge of retirement, setting

:14:02. > :14:03.off a chain of identity theft so complete that next week's show will

:14:04. > :14:14.be presented by a 62-year-old woman. The many faces of identity theft,

:14:15. > :14:22.and remember you can see more of his short films on our website. Let's

:14:23. > :14:30.come back to our guests here in the studio. Remind us just how prevalent

:14:31. > :14:34.the insider risk is, because the difficulty with the inside is it is

:14:35. > :14:38.even harder to define, as we can take work home, and the next step of

:14:39. > :14:43.the Internet is we all want access to everything, the Internet of

:14:44. > :14:48.things, so the entity you are trying to protect is increasingly less

:14:49. > :14:51.well-defined? Yes, and I challenge anyone in any modern business to be

:14:52. > :14:56.able to draw a line around the perimeter. What we would consider

:14:57. > :15:00.the interface with the rest of the world is constantly shifting and any

:15:01. > :15:07.one of those interfaces is a front door into a business, any one of

:15:08. > :15:11.them, any one of us. What can we do as individuals who work for

:15:12. > :15:15.companies or institutions, what should we be aware of, those of us

:15:16. > :15:23.who have no IT knowledge, and I count myself amongst them? I think it is

:15:24. > :15:25.worth thinking about: we all value convenience, but we should

:15:26. > :15:29.consider security to be a part of each of our decisions to interact

:15:30. > :15:36.with information, so while we might enjoy a convenient

:15:37. > :15:40.experience on a device we have purchased, what am I putting on it and what

:15:41. > :15:45.is it connecting to and how is it protected? I get the impression

:15:46. > :15:49.small companies would have difficulty affording a lot of cyber

:15:50. > :15:52.security protection and perhaps therefore some form of collaboration

:15:53. > :15:57.is right around the corner, but you don't hear much about that? It is

:15:58. > :16:02.interesting because collaboration happens very much at the threat

:16:03. > :16:05.level, so the criminals are collaborating with each other day in

:16:06. > :16:09.and day out to get what they can out of the system, but yet as

:16:10. > :16:13.individuals we don't collaborate and don't want to share information.

:16:14. > :16:16.Because they are competitors? We don't want to be embarrassed and we

:16:17. > :16:20.don't want to be seen as looking stupid in front of our competitors

:16:21. > :16:25.and peers, and that is one of the big issues but we have to

:16:26. > :16:28.collaborate more. I also wondered if companies are tempted to buy

:16:29. > :16:37.solutions without understanding or analysing what the risks are. We all

:16:38. > :16:44.know about antivirus software and buy it off the shelf; do companies do

:16:45. > :16:48.that and buy it off the shelf? A lot of them invest in technologies

:16:49. > :16:51.thinking it is the answer, but it is the interaction of technology,

:16:52. > :16:56.people and business processes that is at the core of this problem, and

:16:57. > :16:58.you have to take a holistic view of all of those things if you want to

:16:59. > :17:04.stand a chance of defending against some of the sophisticated attacks. I

:17:05. > :17:07.feel we are a little bit behind the curve and you advise governments,

:17:08. > :17:14.would you say they are behind the curve? Gosh, that varies across the

:17:15. > :17:18.globe. We are lucky in the UK and are very mature in our ability to

:17:19. > :17:22.set effective policy and to engage with all of the relevant

:17:23. > :17:26.stakeholders. That is a good point, what about collaboration on an

:17:27. > :17:33.international level because what might be legal in the United States

:17:34. > :17:36.may not be in the UK or Europe? And that is incredibly challenging for

:17:37. > :17:39.large businesses because they find themselves operating in different

:17:40. > :17:44.parts of the world with different regulatory regimes with different

:17:45. > :17:47.penalties, for example, if they don't look after people's personal

:17:48. > :17:52.data, which actually when you stand back and look at it means

:17:53. > :17:57.different incentives around the world, so you can observe different

:17:58. > :18:03.behaviours. It is really challenging at the sharp end, so when things go

:18:04. > :18:06.wrong and there is a successful attack, so often it is emanating

:18:07. > :18:12.from around the world because that is one of the components for cyber

:18:13. > :18:15.criminals, whatever flavour they are, in that they can launch the

:18:16. > :18:19.attack from outside the jurisdiction, so if you then

:18:20. > :18:24.discover that, and a lot of it is not discovered, but the stuff we

:18:25. > :18:28.discover, going and collecting the evidence and holding the criminals

:18:29. > :18:33.to account in any way that might deter others from engaging in that

:18:34. > :18:35.criminality, that is challenging. It is certainly something the

:18:36. > :18:40.international community is constantly working on. Are we seeing

:18:41. > :18:47.a tightening legal obligation meaning companies have to have basic

:18:48. > :18:54.protection in place? For customers' information, for example, or other

:18:55. > :18:58.information? We're about to implement the General Data Protection Regulation.

:18:59. > :19:02.That is within Europe. You are seeing that on a more global basis

:19:03. > :19:05.at the moment but the challenge is, how do you actually get under the

:19:06. > :19:10.skin of this and get companies to disclose what has happened. If there is an

:19:11. > :19:13.organisation and if you look at the guidance, the Information

:19:14. > :19:18.Commissioner's Office says you must disclose to them any significant

:19:19. > :19:23.breach, which involves any loss of personal information. What is the

:19:24. > :19:26.definition of significant? There is a market failure in some of this and

:19:27. > :19:30.whether, rather than a purely legislative approach, you have to look to some

:19:31. > :19:35.sort of approach built on liability, and try to change it that way, I am

:19:36. > :19:38.not sure which way we are going to go. We talked about where the protection

:19:39. > :19:42.might go and collaboration is one answer to that, but every time you

:19:43. > :19:48.move to protect yourself, the risk moves somewhere else. Where do you see

:19:49. > :19:51.the next risks coming from? If you think about how we connect things

:19:52. > :19:58.like power stations or the German steel plant that had the big

:19:59. > :20:00.industrial accident, we're connecting these systems to the

:20:01. > :20:05.Internet and the ability to influence those can mean much bigger

:20:06. > :20:08.consequences for business. Would you see that as the big problem, that

:20:09. > :20:14.more and more things are getting connected to the Internet and from

:20:15. > :20:17.the... ? That is certainly a big problem and one I would imagine

:20:18. > :20:24.would affect businesses and people on the street:

:20:25. > :20:28.unpredictable roads where people have hacked the controls and maybe

:20:29. > :20:33.people can hack our cars and stop them, maybe we will see some

:20:34. > :20:39.physical harm. We are likely to see disruption across transport and we

:20:40. > :20:43.may have power going out. Initially that might hit the power suppliers

:20:44. > :20:48.first, but ultimately it will have knock-on effects, so second and

:20:49. > :20:54.third order risks as they flow down the chain. The power goes out and

:20:55. > :20:58.somebody has a crash, we couldn't perform the emergency surgery and

:20:59. > :21:01.people died. The power went out and our systems didn't handle it

:21:02. > :21:08.properly, we have forgotten how much money everybody had in the bank and

:21:09. > :21:15.so on. I think that is coming and I think the other thing that is coming

:21:16. > :21:19.is large-scale attacks against critical parts of the infrastructure

:21:20. > :21:24.that affect the nation in a more terrifying way. We have ransomware

:21:25. > :21:29.at the moment, where I can encrypt your files and you don't get them

:21:30. > :21:34.back unless you give me money. What if I could do the same and stop your

:21:35. > :21:37.pacemaker from working because it is connected to the Internet. I can

:21:38. > :21:41.make even more money and I don't think we really understand where the

:21:42. > :21:45.risks are going to come from. What I see and what I read in the press as

:21:46. > :21:49.a lot of the devices being built are not being built with security

:21:50. > :21:53.first and foremost in mind, and if you go back to the initial days of

:21:54. > :21:56.the Internet, the Internet was not designed for security, but where we

:21:57. > :22:00.are today we should be understanding that perhaps we do need to put

:22:01. > :22:04.security onto some of these devices and yet we are going to replicate

:22:05. > :22:09.the problem again of creating a whole load of insecure devices and

:22:10. > :22:14.who knows what people will do. Cars could be one, but who knows

:22:15. > :22:17.where it will go, and we need to think differently, we need to think

:22:18. > :22:21.about how could somebody who has got that intent, how could they use that

:22:22. > :22:25.and what could they do, and start putting some security in place to

:22:26. > :22:33.stop them from doing it. Thank you to all of you. That's it for talking

:22:34. > :22:39.business in London but do join us again next week. We will be in

:22:40. > :22:40.Australia exploring what role farmers might have in driving

:22:41. > :23:00.economic growth.