Browse content similar to 13/02/2016. Check below for episodes and series from the same categories and more!
Line | From | To | |
---|---|---|---|
Welcome to London. It's a crime that is costing corporations dearly | :00:12. | :00:17. | |
and it is growing. Cyber attacks are fast becoming the most common way | :00:18. | :00:21. | |
to steal money and information and damage reputations. These invisible | :00:22. | :00:27. | |
enemies are holding businesses to ransom with increasingly | :00:28. | :00:31. | |
sophisticated tactics. What can be done to combat what many are calling | :00:32. | :00:36. | |
the crime of this generation? That is what we will be discussing on | :00:37. | :00:37. | |
Talking Business. In a high-tech world where | :00:38. | :01:06. | |
everything is connected, cyber crime is big business. It's fast becoming | :01:07. | :01:10. | |
the issue which is most on the minds of company bosses. That is not | :01:11. | :01:15. | |
surprising given it is estimated to be costing global businesses around | :01:16. | :01:21. | |
$450 billion a year. Internet access has become the cornerstone of modern | :01:22. | :01:28. | |
life. The activity which drives business makes it vulnerable. Last | :01:29. | :01:32. | |
year a number of high-profile breaches highlighted the | :01:33. | :01:38. | |
reputational risk and cost posed by cyber crime. UK telecoms group | :01:39. | :01:43. | |
TalkTalk fell victim to a hack attack which cost the company around $85 | :01:44. | :01:48. | |
million. Over 100,000 customers took their business elsewhere. The | :01:49. | :01:54. | |
personal data of over 6 million children was exposed when Hong | :01:55. | :02:01. | |
Kong-based VTech, which specialises in digital educational toys, fell | :02:02. | :02:06. | |
victim to hackers. In the USA, infidelity website Ashley Madison | :02:07. | :02:11. | |
which has over 30 million users came under attack when hackers threatened | :02:12. | :02:14. | |
to expose the names of adulterers unless the site was taken off-line. | :02:15. | :02:21. | |
And in Ukraine, thousands of people were left without electricity, the | :02:22. | :02:24. | |
first known result of hacking bringing down a power network. The | :02:25. | :02:31. | |
digital world of the 21st century has created a breeding ground for | :02:32. | :02:35. | |
cyber criminality affecting businesses. The challenge is how to | :02:36. | :02:41. | |
keep one step ahead of the hackers. And to discuss how businesses might | :02:42. | :02:45. | |
go about that I'm joined by three people on the front line in the | :02:46. | :02:51. | |
fight against cyber crime. George Quigley is cyber security partner in | :02:52. | :02:56. | |
London for KPMG. He has over 20 years of experience in technology, | :02:57. | :03:01. | |
risk and information security. James Chappell is the co-founder and chief | :03:02. | :03:06. | |
technology officer of Digital Shadows, a company which analyses | :03:07. | :03:09. | |
digital footprints to expose weaknesses. And the Professor of | :03:10. | :03:16. | |
cyber security and director of the Global Cyber Security Capacity | :03:17. | :03:21. | |
Centre at Oxford University. Welcome to all of you. James, if I might | :03:22. | :03:27. | |
start with you, first of all we need to define the risk. When we think | :03:28. | :03:32. | |
about cyber security, as individuals we think of personal information | :03:33. | :03:35. | |
being stolen or bank details but it is more complicated than that? | :03:36. | :03:42. | |
Define the risk if you would? We often think about keeping our | :03:43. | :03:45. | |
secrets secret or private information private but just as | :03:46. | :03:50. | |
important is keeping businesses operating normally and keeping trust | :03:51. | :03:55. | |
in information we use day-to-day, so we think about the stock market or | :03:56. | :03:57. | |
we think about our bank website being available. When we hand over | :03:58. | :04:06. | |
any information, just staying on the subject of personal information, we | :04:07. | :04:09. | |
have of course lost control of it and that is part of the problem? | :04:10. | :04:14. | |
Yes, you are relying on the company or somebody else to secure your data, | :04:15. | :04:19. | |
so you are relying on them to understand fundamentally what data | :04:20. | :04:23. | |
assets they hold that are perhaps your own and other people's data. | :04:24. | :04:28. | |
And we don't ever have that conversation, how secure is it? We | :04:29. | :04:32. | |
don't ask the question and actually I think there's a problem because | :04:33. | :04:36. | |
there's a lack of information. I don't think we really understand how | :04:37. | :04:40. | |
much of this information is leaking from organisations and how much is | :04:41. | :04:44. | |
out there already. We don't ask the question when we give people the | :04:45. | :04:48. | |
information and it is hard to find out if any organisation has properly | :04:49. | :04:52. | |
secured it in the first place, what certification standard would I look | :04:53. | :04:55. | |
at and how would I know if it is secure or not. There's an awful lot | :04:56. | :05:00. | |
of trust and we have grown up in an environment of trust. And part of | :05:01. | :05:04. | |
the problem is that companies themselves are not able to secure the data in a | :05:05. | :05:09. | |
way they would like to because cyber criminals are always one or more | :05:10. | :05:14. | |
steps ahead? Certainly it is an increasingly complex challenge and | :05:15. | :05:16. | |
it is also the nature of the business itself, which is constantly | :05:17. | :05:22. | |
changing, along with the kinds of technology required to deliver that | :05:23. | :05:26. | |
business and the relationships to the supply chain and partners, and it | :05:27. | :05:32. | |
is in that flow of data that a lot of risk lies. How should we see these criminals? Should we | :05:33. | :05:36. | |
see them as random individuals, perhaps mavericks, or are they much more | :05:37. | :05:41. | |
organised than that, and do we underestimate them if we do that? In | :05:42. | :05:46. | |
the last ten years we have seen a maturing of the ecosystem that | :05:47. | :05:52. | |
underpins the criminality, so yes, there will always be mavericks, but | :05:53. | :05:57. | |
what we are seeing are different units and organisations coming together to | :05:58. | :06:01. | |
offer different skills to enable crime of many different types. The | :06:02. | :06:07. | |
criminals and particularly the serious organised gangs have found | :06:08. | :06:11. | |
out that cyber crime is a good way to reduce risk, so looking recently | :06:12. | :06:15. | |
at the human traffickers... Because it is anonymous? It is hard to find, | :06:16. | :06:21. | |
hard to detect, with very few successful prosecutions and it is | :06:22. | :06:26. | |
not seen as a massive problem. You seem to be suggesting it is the same | :06:27. | :06:30. | |
group of people who may have conducted a bank robbery who are | :06:31. | :06:35. | |
suddenly experts at IT? I am calling it Crime PLC, shadow | :06:36. | :06:39. | |
organisations, and they have the same setup, even a systems development | :06:40. | :06:44. | |
team; that is one of the main things people are missing. The criminal | :06:45. | :06:47. | |
organisations have discovered they can lower the risk massively by turning to | :06:48. | :06:53. | |
cyber crime and it is quick and cheap and easy to do in many cases, | :06:54. | :06:56. | |
because a lot of organisations are not strengthening their controls, | :06:57. | :06:59. | |
and some of the attacks actually use very low levels of sophistication. | :07:00. | :07:04. | |
Some are very high but, for the businesses I work with in general, | :07:05. | :07:11. | |
and if you look at the example of TalkTalk, and I do not have any | :07:12. | :07:14. | |
inside information but it was a 15-year-old. The technique has been | :07:15. | :07:19. | |
around for ages and it should have been identified through the risk | :07:20. | :07:23. | |
management process and they should have sorted that, so a lot of this | :07:24. | :07:28. | |
is not very sophisticated, and we have kind of ceded ground to these | :07:29. | :07:32. | |
individuals rather than responding as we need to, and our challenge | :07:33. | :07:36. | |
from an organisational context is understanding what data we need to | :07:37. | :07:41. | |
protect. And of course sometimes the threat is from within? And what we | :07:42. | :07:47. | |
are discovering as we talk to organisations is that it is | :07:48. | :07:50. | |
massively underreported. Not a surprise. Because it is humiliating? | :07:51. | :07:58. | |
And some fear that it would not look as if they were in control of their | :07:59. | :08:02. | |
organisations and what that might do to confidence in the brand and the | :08:03. | :08:06. | |
consequences for share value. The other issue of course about insiders | :08:07. | :08:13. | |
is they are inherently threats that have access to your systems, often | :08:14. | :08:18. | |
over a lengthy period of time, so we call them threats in the community. | :08:19. | :08:24. | |
That means they have the benefit of serendipity so they might enter the | :08:25. | :08:27. | |
organisation after one particular thing, but because they sit on the | :08:28. | :08:31. | |
systems for some time they can move around and discover parts of | :08:32. | :08:36. | |
the system they wouldn't have got to, and that means they can turn | :08:37. | :08:40. | |
up data on organisations that is even more valuable than they had set | :08:41. | :08:43. | |
out to get in the first place. That makes it very worrying, for us, | :08:44. | :08:50. | |
because we are humans and we are used to building walls and locking | :08:51. | :08:56. | |
doors, and in the security industry we have great perimeters but we can | :08:57. | :09:02. | |
be very soft inside. And we talk much more about personal information | :09:03. | :09:05. | |
but in fact it is a geopolitical threat? It is, and that is | :09:06. | :09:11. | |
not my area of expertise, but when you look at what is going on across | :09:12. | :09:16. | |
the globe, if you wanted to do something on a geopolitical space | :09:17. | :09:19. | |
cyber is the way to go. We are all connected and there is lots of | :09:20. | :09:23. | |
interesting information flowing around. It is actually quite | :09:24. | :09:28. | |
valuable, and there is also the threat from terrorism and as yet we | :09:29. | :09:31. | |
have not really seen anything major in terms of taking out a part of a | :09:32. | :09:35. | |
national infrastructure although there have been some attempts. We | :09:36. | :09:43. | |
certainly have seen terrorists use cyber crime to fund their activities, and | :09:44. | :09:47. | |
I don't know if you can tackle this from a global perspective | :09:48. | :09:50. | |
unless something happens at a global level to try to fight this | :09:51. | :09:55. | |
issue and I think it will continue. We saw the Ukrainian power network | :09:56. | :09:59. | |
go down. That was the result of a cyber crime? And in fact if you look | :10:00. | :10:04. | |
at the whole Ukrainian conflict, there have been a whole range of | :10:05. | :10:11. | |
attacks, with hackers heavily involved in some of the attacks on the Ukrainian banks, | :10:12. | :10:15. | |
so just for me it is another facet of the complex geopolitical world we | :10:16. | :10:22. | |
live in. And thank you all for now. Later in the programme we will be | :10:23. | :10:26. | |
looking at what role the legislation might play in forcing companies to | :10:27. | :10:31. | |
be more transparent about cyber attacks in the future. But first | :10:32. | :10:35. | |
let's hear some thoughts from our comedy consultant, who has gone | :10:36. | :10:40. | |
underground to explore the world of the cyber criminal in this week's | :10:41. | :10:41. | |
talking point. You can't discuss cyber crime properly | :10:42. | :10:51. | |
without a number of cliches, starting with the underground location. | :10:52. | :10:57. | |
I am actually in a meeting room in the 200-year-old vault beneath | :10:58. | :11:06. | |
Dogpatch Labs, here in Dublin. The reason I am here is to get into a | :11:07. | :11:11. | |
tech frame of mind. A recent survey of security managers in North | :11:12. | :11:14. | |
America and Europe suggested one of the biggest barriers to effective | :11:15. | :11:17. | |
defence was employees' lack of awareness of the risks. I'll just | :11:18. | :11:26. | |
get this. Hello? Right. A virus on my computer, that sounds serious. My | :11:27. | :11:37. | |
password? Yes, it's 123. That'll fix it? Thank you so much. Two things | :11:38. | :11:45. | |
scare me. The first is us: as a species, humans are not always the | :11:46. | :11:50. | |
brightest. Our most popular password is still 123456, and the top | :11:51. | :11:59. | |
20 is an embarrassing list of things like the word password. We need to | :12:00. | :12:06. | |
be smarter. There is an enormous return from cyber criminal | :12:07. | :12:12. | |
behaviour, and at that end there are some very smart hackers who are | :12:13. | :12:17. | |
studying their target and creating ways to bypass | :12:18. | :12:22. | |
that target's defences. You must design with security in mind. It is easy to do | :12:23. | :12:26. | |
but you have to put the systems in place to do that. There isn't much | :12:27. | :12:30. | |
time and all I can say is we are in the middle of a cyber war and it is | :12:31. | :12:35. | |
a race against time with the cyber criminals. Now if I can just... | :12:36. | :12:43. | |
defibrillate the code, I should be able to reverse the effects of the | :12:44. | :12:50. | |
automatic update and then I should be able to save changes to the | :12:51. | :12:59. | |
Microsoft Word template. Done. Companies don't have a plan for a | :13:00. | :13:05. | |
cyber attack and many companies can get attacked and it is as if they | :13:06. | :13:09. | |
haven't a clue what to do whereas they should have had a plan in place | :13:10. | :13:13. | |
for years in preparation and how they are going to get back in | :13:14. | :13:17. | |
business and who they will inform, law enforcement, data protection, and | :13:18. | :13:20. | |
how they get their systems back up and running. This will get more and more | :13:21. | :13:25. | |
complex in future. Cyber crime will get more and more sophisticated, can | :13:26. | :13:32. | |
the legal system keep up? New things keep coming out and people look to see | :13:33. | :13:36. | |
how they can make money from them, and law enforcement has to learn how to | :13:37. | :13:40. | |
detect the crime but we will always have criminals who learn to break | :13:41. | :13:45. | |
our systems. When it comes to cyber defence you are only as strong as | :13:46. | :13:48. | |
your weakest link and I am sure at some point in the Internet I have | :13:49. | :13:53. | |
signed up for a newsletter whose password is protected by the | :13:54. | :13:58. | |
equivalent of a security guard on the verge of retirement, setting | :13:59. | :14:01. | |
off a chain of identity theft so complete that next week's show will | :14:02. | :14:03. | |
be presented by a 62-year-old woman. The many faces of identity theft, | :14:04. | :14:14. | |
and remember you can see more of his short films on our website. Let's | :14:15. | :14:22. | |
come back to our guests here in the studio. Remind us just how prevalent | :14:23. | :14:30. | |
the insider risk is, because the difficulty with the insider is that it | :14:31. | :14:34. | |
is even harder to define, as we can take work home, and at the next step | :14:35. | :14:38. | |
of the Internet we all want access to everything, the Internet of | :14:39. | :14:43. | |
things, so the entity you are trying to protect is increasingly less | :14:44. | :14:48. | |
well-defined? Yes, and I challenge anyone in any modern business to be | :14:49. | :14:51. | |
able to draw a line around the perimeter. What we would consider | :14:52. | :14:56. | |
the interface with the rest of the world is constantly shifting and any | :14:57. | :15:00. | |
one of those interfaces is a front door into a business, any one of | :15:01. | :15:07. | |
them, any one of us. What can we do as individuals who work for | :15:08. | :15:11. | |
companies or institutions, what should we be aware of, those of us | :15:12. | :15:15. | |
who have no IT knowledge, and I count myself amongst them? I think it is | :15:16. | :15:23. | |
worth thinking about, we all value convenience, but we should consider | :15:24. | :15:25. | |
security to be part of each of our | :15:26. | :15:29. | |
decisions to interact with information, so while we might enjoy a convenient | :15:30. | :15:36. | |
experience when a device is purchased, what am I putting on it and what | :15:37. | :15:40. | |
is it connecting to and how is it protected? I get the impression | :15:41. | :15:45. | |
small companies would have difficulty affording a lot of cyber | :15:46. | :15:49. | |
security protection and perhaps therefore some form of collaboration | :15:50. | :15:52. | |
is right around the corner, but you don't hear much about that? It is | :15:53. | :15:57. | |
interesting because collaboration happens very much at the threat | :15:58. | :16:02. | |
level, so the criminals are collaborating with each other day in | :16:03. | :16:05. | |
and day out to get what they can out of the system, but yet as | :16:06. | :16:09. | |
individuals we don't collaborate and don't want to share information. | :16:10. | :16:13. | |
Because they are competitors? We don't want to be embarrassed and we | :16:14. | :16:16. | |
don't want to be seen as looking stupid in front of our competitors | :16:17. | :16:20. | |
and peers, and that is one of the big issues but we have to | :16:21. | :16:25. | |
collaborate more. I also wondered if companies are tempted to buy | :16:26. | :16:28. | |
solutions without understanding or analysing what the risks are. We all | :16:29. | :16:37. | |
know about antivirus software and buying it off the shelf; do companies do | :16:38. | :16:44. | |
that and buy it off the shelf? A lot of them invest in technologies | :16:45. | :16:48. | |
thinking it is the answer but it is the interaction of technology, | :16:49. | :16:51. | |
people and business processes that is at the core of this problem, and | :16:52. | :16:56. | |
you have to take a holistic view of all of those things if you want to | :16:57. | :16:58. | |
stand a chance of defending against some of the sophisticated attacks. I | :16:59. | :17:04. | |
feel we are a little bit behind the curve and you advise governments, | :17:05. | :17:07. | |
would you say they are behind the curve? Gosh, that varies across the | :17:08. | :17:14. | |
globe. We are lucky in the UK and are very mature in our ability to | :17:15. | :17:18. | |
set effective policy and to engage with all of the relevant | :17:19. | :17:22. | |
stakeholders. That is a good point, what about collaboration on an | :17:23. | :17:26. | |
international level because what might be legal in the United States | :17:27. | :17:33. | |
may not be in the UK or Europe? And that is incredibly challenging for | :17:34. | :17:36. | |
large businesses because they find themselves operating in different | :17:37. | :17:39. | |
parts of the world with different regulatory regimes with different | :17:40. | :17:44. | |
penalties, for example, if they don't look after people's personal | :17:45. | :17:47. | |
data, which actually when you stand back and look at it means | :17:48. | :17:52. | |
different incentives around the world, so you can observe different | :17:53. | :17:57. | |
behaviours. It is really challenging at the sharp end, so when things go | :17:58. | :18:03. | |
wrong and there is a successful attack, so often it is emanating | :18:04. | :18:06. | |
from around the world because that is one of the components for cyber | :18:07. | :18:12. | |
criminals, whatever flavour they are, in that they can launch the | :18:13. | :18:15. | |
attack from outside the jurisdiction, so if you then | :18:16. | :18:19. | |
discover that, and a lot of it is not discovered, but the stuff we | :18:20. | :18:24. | |
discover, going and collecting the evidence and holding the criminals | :18:25. | :18:28. | |
to account in a way that might deter others from engaging in that | :18:29. | :18:33. | |
criminality, that is challenging. It is certainly something the | :18:34. | :18:35. | |
international community is constantly working on. Are we seeing | :18:36. | :18:40. | |
a tightening legal obligation meaning companies have to have basic | :18:41. | :18:47. | |
protection in place? For customers' information for example or other | :18:48. | :18:54. | |
information? We are about to implement the General Data Protection Regulation. | :18:55. | :18:58. | |
That is within Europe. You are seeing that on a more global basis | :18:59. | :19:02. | |
at the moment but the challenge is, how do you actually get under the | :19:03. | :19:05. | |
skin of this so that organisations disclose what has happened. If | :19:06. | :19:10. | |
you look at the guidance, the Information | :19:11. | :19:13. | |
Commissioner's Office says you must disclose to them any significant | :19:14. | :19:18. | |
breach, which involves any loss of personal information. What is the | :19:19. | :19:23. | |
definition of significant? There is a market failure in some of this and | :19:24. | :19:26. | |
whether it is a legislative approach or you have to look to some | :19:27. | :19:30. | |
sort of approach built on liability, and try to change it that way, I am | :19:31. | :19:35. | |
not sure where we are going to go. We talked about where the protection | :19:36. | :19:38. | |
might go, and collaboration is one answer to that, but every time you | :19:39. | :19:42. | |
move to protect yourself, the risk moves somewhere else. Where do you see | :19:43. | :19:48. | |
the next risks coming from? If you think about how we connect things | :19:49. | :19:51. | |
like power stations or the German steel plant that had the big | :19:52. | :19:58. | |
industrial accident, we're connecting these systems to the | :19:59. | :20:00. | |
Internet and the ability to influence those can mean much bigger | :20:01. | :20:05. | |
consequences for business. Would you see that as the big problem, that | :20:06. | :20:08. | |
more and more things are getting connected to the Internet and from | :20:09. | :20:14. | |
the... ? That is certainly a big problem and one I would imagine | :20:15. | :20:17. | |
would affect businesses and people on the street: | :20:18. | :20:24. | |
unpredictable roads where people have hacked the controls, and maybe | :20:25. | :20:28. | |
people can hack our cars and stop them, maybe we will see some | :20:29. | :20:33. | |
physical harm. We are likely to see disruption across transport and we | :20:34. | :20:39. | |
may have power going out. Initially that might hit the power suppliers | :20:40. | :20:43. | |
first, but ultimately it will have knock-on effects, so second and | :20:44. | :20:48. | |
third order risks as they flow down the chain. The power goes out and | :20:49. | :20:54. | |
somebody has a crash, we can't perform emergency surgery and | :20:55. | :20:58. | |
people die. The power went out and our systems didn't shut down | :20:59. | :21:01. | |
properly, and we have forgotten how much money everybody had in the bank and | :21:02. | :21:08. | |
so on. I think that is coming and I think the other thing that is coming | :21:09. | :21:15. | |
is large-scale attacks against critical parts of the infrastructure | :21:16. | :21:19. | |
that affect the nation in a more terrifying way. We have | :21:20. | :21:24. | |
ransomware at the moment where I can encrypt your files and you don't get them | :21:25. | :21:29. | |
back unless you give me money. What if I could do the same and stop your | :21:30. | :21:34. | |
pacemaker from working because it is connected to the Internet. I can | :21:35. | :21:37. | |
make even more money and I don't think we really understand where the | :21:38. | :21:41. | |
risks are going to come from. What I see and what I read in the press is | :21:42. | :21:45. | |
that a lot of the devices being built are not being built with security | :21:46. | :21:49. | |
first and foremost in mind, and if you go back to the initial days of | :21:50. | :21:53. | |
the Internet, the Internet was not designed for security, but where we | :21:54. | :21:56. | |
are today we should be understanding that perhaps we do need to put | :21:57. | :22:00. | |
security onto some of these devices, and yet we are going to repeat | :22:01. | :22:04. | |
the folly again of creating a whole load of insecure devices and | :22:05. | :22:09. | |
who knows what people will do. Cars could be one but who knows | :22:10. | :22:14. | |
where it will go, and we need to think differently, we need to think | :22:15. | :22:17. | |
about how could somebody who has got that intent, how could they use that | :22:18. | :22:21. | |
and what harm they could do, and start putting some security in place to | :22:22. | :22:25. | |
stop them from doing it. Thank you to all of you. That's it for Talking | :22:26. | :22:33. | |
Business in London but do join us again next week. We will be in | :22:34. | :22:39. | |
Australia exploring what role farmers might have in driving | :22:40. | :22:40. | |
economic growth. It is already approaching -10 in the | :22:41. | :23:00. | |
Highlands of Scotland, it is cold out there and this is going to stay | :23:01. | :23:03. | |
cold for the next couple of days and nights. Some snow showers across the | :23:04. | :23:08. | |
hills of north-east England and these continue | :23:09. | :23:10. |