Cyber Security Committee


Transcript

:00:14. > :00:19.Thank you for coming. The committee is grateful to you for

:00:20. > :00:23.coming and giving evidence to us. We are fortunate to have a lot of

:00:24. > :00:29.written evidence as well, so we have quite a lot to chew on. If we can

:00:30. > :00:34.begin, what I'd like to do right at the beginning, because this is our

:00:35. > :00:41.enquiry, is to put on record at the

:00:42. > :00:48.start of the evidence taking some simple context-setting

:00:49. > :00:53.questions. You will know that the national security risk assessment

:00:54. > :01:04.characterises cyber as a top tier threat. If you think that is

:01:05. > :01:09.justified, what would you say is the nature of the threat to the United

:01:10. > :01:12.Kingdom and how might it be made manifest, and are there any threats

:01:13. > :01:15.to which either government or the private sector are particularly

:01:16. > :01:25.vulnerable, and if so, what are they? I can start. That is a fair

:01:26. > :01:31.assessment as to what the threat is today. Where it manifests itself,

:01:32. > :01:34.much is made of the nation state capability, and we do have

:01:35. > :01:40.aggressors that are targeting the UK and the private sector. Similarly we

:01:41. > :01:46.should not underestimate the force with which organised crime is

:01:47. > :01:54.embracing cyber capability, and for various reasons. So to gain direct

:01:55. > :02:01.economic benefit from their activities and, secondary to that, by proxy,

:02:02. > :02:07.supporting certain nation states by creating some clear water, shall we

:02:08. > :02:11.say. Anyone else? I am dubious about the idea that it is a tier one

:02:12. > :02:18.threat. I think it has been overhyped. We do have intense worry

:02:19. > :02:20.about cyber espionage and Internet warfare, but the probability of a

:02:21. > :02:23.critical infrastructure attack would

:02:24. > :02:28.only be very likely under the context of a global war, so that is

:02:29. > :02:33.something that is a bit beyond the normal consideration. Most cyber

:02:34. > :02:41.aggressors use it for information advantages, and that is the nature

:02:42. > :02:43.of the cyber threat right now. I would suggest it would be

:02:44. > :02:47.a tier one threat largely because it

:02:48. > :02:53.has so much impact on the other category one threats to the UK. I

:02:54. > :02:58.guess that cyber is something that can happen directly, but it can also

:02:59. > :03:02.have an impact on public health, on terrorism and it can be an enabler

:03:03. > :03:06.for military conflict, and because of that it would make sense that it

:03:07. > :03:11.would be classed alongside as a tier one threat. The other way of looking

:03:12. > :03:15.at this is that we often talk about what it means from a nation to nation

:03:16. > :03:20.perspective. We should not lose track of the fact there is a real

:03:21. > :03:26.threat to members of the population through ransomware, where we are now

:03:27. > :03:33.seeing organised crime units trying to monetise attacks by targeting end

:03:34. > :03:38.users, encrypting their laptops and having an impact that is probably

:03:39. > :03:46.seen much more frequently on a day-to-day basis. I would come at it

:03:47. > :03:50.from the consideration of exposure. We are a digital society, whether we

:03:51. > :03:54.like it or not, and now everything that is physical has

:03:55. > :03:58.digital processes behind it, so we are vulnerable to those digital

:03:59. > :04:03.processes and data flow being interfered with. Much of what we do

:04:04. > :04:06.in society depends on them; it could be physical things in

:04:07. > :04:12.hospitals, but there is scheduling, blood ordering, and all the things going on

:04:13. > :04:15.in the background that are digital processes, and without those digital

:04:16. > :04:18.processes a lot of the physical world will deteriorate. So I think

:04:19. > :04:23.it is absolutely a tier one threat, less because of the threat to date

:04:24. > :04:26.but because of the vulnerability and the possibility that a threat could materialise

:04:27. > :04:30.quickly. Having established that three of you think it is a tier one

:04:31. > :04:37.threat, although one of you has some doubts about it, without wanting to

:04:38. > :04:41.scaremonger, what is the worst-case scenario for cyber attack on this

:04:42. > :04:46.country, and how likely is it? As you all know, the whole point about

:04:47. > :04:53.the risk register is a mix of impact and

:04:54. > :04:59.likelihood. Getting into the specifics of what could happen, you

:05:00. > :05:05.could see a scenario where the ability of the financial

:05:06. > :05:11.markets to operate, and of much of the health system and the

:05:12. > :05:17.infrastructure to function, could be disabled. That could be

:05:18. > :05:23.severely impacted. Many of the adversaries who might want to do

:05:24. > :05:27.that have, until now, had other geopolitical boundaries around them that

:05:28. > :05:33.might cause them not to do so. But as capability becomes more and more

:05:34. > :05:41.accessible then I think we could see criminals and terrorists becoming

:05:42. > :05:45.more capable, and then you have fewer political and diplomatic bounds

:05:46. > :05:52.around behaviour and constraint. I would agree with that. If we look at

:05:53. > :05:56.the UK by virtue of our maturity, we adopted technology very early on,

:05:57. > :06:00.and by virtue of that we have some legacy systems which will run some

:06:01. > :06:07.quite key elements of our country, and that in itself provides some

:06:08. > :06:11.unique challenges where we have some ageing systems coupled with an

:06:12. > :06:15.increasing attacker capability; attackers might not intend to take them down,

:06:16. > :06:20.but through their own exploration and success in compromise have an

:06:21. > :06:24.inadvertent outcome on the systems. If the worst does happen it might

:06:25. > :06:30.not always be intentional as well, so that is something to bear in

:06:31. > :06:34.mind. My comment would be to use your imagination in terms of what it

:06:35. > :06:42.might entail. What do I mean by that? Anything that draws upon data,

:06:43. > :06:46.information or anything that touches currency and trading. I read an

:06:47. > :06:53.article last week on the BBC website about how even farmers are now using

:06:54. > :06:58.data to make decisions about how much fertiliser to put onto their

:06:59. > :07:02.crops. The suggestion was that if a hacker could get access to some of

:07:03. > :07:09.the systems and manipulate the information you could have a

:07:10. > :07:13.scenario where the levels of crops being produced might result in

:07:14. > :07:21.some form of famine. You might come up with another scenario that

:07:22. > :07:24.says many of the industrial control systems now touch data

:07:25. > :07:32.networks, so somebody could straddle a data network and from

:07:33. > :07:35.there get access to control systems. There are all sorts

:07:36. > :07:40.of different things the individual might be able to do. We have seen

:07:41. > :07:45.some of that both out in the Ukraine and in some other countries over the

:07:46. > :07:49.last two years. The problem is we cannot make policy based on

:07:50. > :07:53.worst-case scenarios; we need to do it on probabilities, on what is

:07:54. > :07:56.likely to happen. Aggressors have had the technology for 30 years and

:07:57. > :08:00.they have been generally restrained in using it for aggressive purposes.

:08:01. > :08:05.Terrorists are not so restrained, but they don't necessarily have the

:08:06. > :08:08.capabilities. So the real questions are: who is the aggressor, what

:08:09. > :08:13.is the target and what are the goals? We have to worry about a

:08:14. > :08:17.general societal programme to promote resilience in the state to

:08:18. > :08:20.recover from possible cyber attacks, but the probability of a dramatic

:08:21. > :08:25.cyber attack is only highly likely in the context of a major war with

:08:26. > :08:28.Russia or China and that's not necessarily on the table. Will it

:08:29. > :08:32.happen any time soon? That is something out of the imagination and

:08:33. > :08:37.that is not how we should govern. We need government based on what is

:08:38. > :08:40.likely to happen. We have not seen cyber technology used for massive

:08:41. > :08:44.effect so far and we are unlikely to. We have seen sabotage but that's

:08:45. > :08:50.generally been done by nation states to attack launch capabilities to

:08:51. > :08:53.prevent other countries from acquiring certain technologies but

:08:54. > :08:56.we've not seen any great devastation so far and I don't think we are

:08:57. > :09:04.likely to any time soon. If I may counter that slightly, one thing we

:09:05. > :09:08.cannot say from a policy perspective is say we can deal with it when it

:09:09. > :09:12.becomes an issue. We are building the digital society today. If in ten

:09:13. > :09:16.years' time we find ourselves in a conflict situation we won't rewind

:09:17. > :09:21.the clock ten years and say we can build a secure digital society. I

:09:22. > :09:28.think there is a policy imperative to build a secure society, without

:09:29. > :09:31.necessarily trying to anticipate what the threat may be or may become

:09:32. > :09:36.in the next five or ten years because we have to build today for

:09:37. > :09:41.the future. The word resilience was used there, and I think that is an

:09:42. > :09:46.important word. The free market, if left to its own devices will make

:09:47. > :09:52.money, which is a wonderful thing but security is, to a greater or

:09:53. > :09:57.lesser extent not seen by buyers as a key point at the moment in terms

:09:58. > :10:04.of their purchasing decision, so our resilience is probably not where it

:10:05. > :10:06.should be, and that is where the national cyber strategy outlines the

:10:07. > :10:11.levers and incentives, and this is where we need to take a very long,

:10:12. > :10:18.hard look at how we increase that resilience. Would I be exaggerating

:10:19. > :10:22.if I was to say, from what you have said, that if you had to say where

:10:23. > :10:27.the balance of investment should go, you would put it into resilience

:10:28. > :10:33.rather than the offence, stopping attacks? We recognise now that the

:10:34. > :10:38.concept that you are either secure or not secure as a binary state has

:10:39. > :10:41.long gone. You need to be able to deal with successful attacks of

:10:42. > :10:45.varying degrees of impact and be able to recover accordingly with a

:10:46. > :10:50.minimum level of disruption, which is why resilience is a key concept

:10:51. > :10:54.now in cyber security. We throw this term around without really defining

:10:55. > :10:59.it and without really saying what needs to be done. The key thing

:11:00. > :11:02.would not just be investing in the private and public partnership but

:11:03. > :11:06.investing in education from the ground up. The other thing we need

:11:07. > :11:11.to start to invest in is the public, promoting how they might react

:11:12. > :11:15.to a cyber threat or a cyber attack. We did

:11:16. > :11:18.that in the nuclear era but we are not developing these programmes to

:11:19. > :11:21.manage the emotional reactions of what happened because of the

:11:22. > :11:31.dependency on digital. It's a very good point. I will speak to Mr

:11:32. > :11:34.Cooper, then Lord West. We spoke about the cyber threat to the public

:11:35. > :11:38.infrastructure and the private sector, but in the light of the

:11:39. > :11:45.experience of the US elections, what is your assessment of whether or not

:11:46. > :11:47.there is any sort of cyber threat to British democracy, either to

:11:48. > :11:55.democratic institutions, electoral systems, political parties or

:11:56. > :11:59.Parliament and so on? If we look at what happened in the United States

:12:00. > :12:02.the primary means taken was manipulation of the message and

:12:03. > :12:07.trying to influence public understanding of the situation by

:12:08. > :12:12.virtue of having a free and open media, which in a highly connected

:12:13. > :12:15.society we are obviously susceptible to. To say they would be able to

:12:16. > :12:21.influence how somebody cast their ballot through pen and paper that we

:12:22. > :12:27.still employ, that would be going too far, but in terms of leaking or

:12:28. > :12:36.manipulation and propaganda, we have to accept that that is a practical

:12:37. > :12:41.means that they can employ. There is a really important general principle

:12:42. > :12:45.about how attackers think that is illustrated by the question around

:12:46. > :12:48.elections. We see whether it is online banking or all different

:12:49. > :12:54.attack scenarios, the attackers will not target the core system you are

:12:55. > :12:57.trying to defend itself, but attack the inputs, manipulating the inputs

:12:58. > :13:03.going into it and therefore getting a different result. The discussion

:13:04. > :13:06.around elections, and we know the head of the NCSC has reached out to

:13:07. > :13:10.political parties to talk about their security, illustrates a really

:13:11. > :13:16.important principle. You might not be able to influence the casting of

:13:17. > :13:19.votes itself, but if you can influence the data on which the

:13:20. > :13:22.decisions are made, you might be able to influence it, and that is an

:13:23. > :13:27.important principle for how the attackers attack in this space. If

:13:28. > :13:30.a political party's e-mails are obtained and then selectively leaked to

:13:31. > :13:34.present a coloured view of what the party stands for, or the individuals

:13:35. > :13:39.stand for, that is the kind of concept that is often used in cyber

:13:40. > :13:43.attacks. We have to be clear about the US attack. It was mainly done

:13:44. > :13:48.through third-party applications, third-party e-mails, people changing

:13:49. > :13:52.e-mail passwords. It was the least of the least in terms of cyber

:13:53. > :13:55.attacks. The real problem with the American electoral system is the

:13:56. > :13:58.lack of trust in the media and that's a question you have to ask

:13:59. > :14:05.yourselves internally. Do we trust the BBC and the media? In America

:14:06. > :14:09.they did not, and they trusted Russia Today quite a bit, to the tune of 9

:14:10. > :14:12.million views of a story about Hillary Clinton making money from a foundation,

:14:13. > :14:16.which wasn't true. It is a question of trust and where people get

:14:17. > :14:22.information from and how they are educated to take in information.

:14:23. > :14:25.We'd tell the story of the dramatic Russian hack, but it was simpler

:14:26. > :14:32.than that. Lord West and Lady Faulkner. I'm interested in this

:14:33. > :14:36.take. I put a huge amount of work in with the Americans on Y2K. An

:14:37. > :14:40.immense amount of work. And it never really quite came to what we

:14:41. > :14:42.thought, although some things we did were good. It was thousands and thousands

:14:43. > :14:54.of man-hours on it. We had some information when

:14:55. > :14:58.suddenly we lost cashpoints, which I think can

:14:59. > :15:08.be instructive in the context of what would happen in a cyber attack. My

:15:09. > :15:12.question is, resilience and recovery, do you believe there is

:15:13. > :15:18.sufficient money being spent by commercial companies in terms of

:15:19. > :15:24.their recovery for when things start to go wrong, and do you think there is any

:15:25. > :15:29.legislation that forces companies to have proper recovery mechanisms in

:15:30. > :15:34.place? There is a whole cyber security industry that has popped up, and that is what we are concerned about.

:15:35. > :15:41.How that is regulated at the

:15:42. > :15:45.moment, and whether it is encouraging good practices, is something we need to ask. We do

:15:46. > :15:50.not really talk about that, we do not talk about how companies use IT

:15:51. > :15:54.departments to handle their security when what they really need is a cyber

:15:55. > :15:58.security group. We need to understand the nature of the threat

:15:59. > :16:03.and how to respond to it. I worry that we are focusing too much on the

:16:04. > :16:10.offence and we are not thinking about the defence perspective first. Going

:16:11. > :16:17.back to the earlier point, the impression one got from you about

:16:18. > :16:23.the interference with democracy and electoral systems, the implication

:16:24. > :16:31.was that it was a manipulation rather than a direct impact. Angela

:16:32. > :16:35.Merkel has come out three times to warn of interference in the

:16:36. > :16:43.forthcoming German elections. Why is she so concerned? Any of you? The

:16:44. > :16:50.Russians have been engaging in this activity for a long time in Europe.

:16:51. > :16:54.She is making sure the population do not respond to these information

:16:55. > :16:59.attacks. In America there is a lack of trust in Government; hopefully

:17:00. > :17:05.Germany has more trust and they will fight back against the attacks that

:17:06. > :17:14.will come. The German intelligence agency said that

:17:15. > :17:18.they caught a Russian group trying to compromise various systems, which

:17:19. > :17:25.is where her concern will have come from. We are three companies

:17:26. > :17:32.that deal with threats and respond to incidents when they happen. It

:17:33. > :17:38.is real. You cannot deny the fact that there are people all over the

:17:39. > :17:45.world whose day job it is, and they work 9-5, and they go on holiday, it

:17:46. > :17:51.is their job to get into networks and compromise digital processes for

:17:52. > :17:54.whatever means it is: if they are criminals, for financial gain; if

:17:55. > :17:58.they are nation states, it is still a back door that they can exploit in the

:17:59. > :18:07.future to get information out, but there are armies of people out there

:18:08. > :18:10.for whom this is their job. We're focusing on the Government's

:18:11. > :18:17.national cyber Security strategy. What would you see as being the key

:18:18. > :18:20.lessons learnt from the 2011 strategy and whether it was

:18:21. > :18:27.effective, and do you think those lessons have been sufficiently

:18:28. > :18:31.addressed in the 2016 strategy? There are two key things that come out of

:18:32. > :18:39.the recent strategy that are important. One is that there may

:18:40. > :18:43.need to be regulation, or incentives as you might phrase it, and the other

:18:44. > :18:47.is capacity building in terms of skills. What we have already seen in

:18:48. > :18:56.terms of that policy being enacted around getting to the younger

:18:57. > :19:03.generation, we are talking 11-15, that will benefit us in six years

:19:04. > :19:09.when they enter the workforce, that is absolutely key. I think it did

:19:10. > :19:17.address a lot of the shortcomings. I was in the Cabinet Office in 2011.

:19:18. > :19:23.The thought process back then in 2011 was very much that the

:19:24. > :19:30.Government could encourage the private sector, upon which much of

:19:31. > :19:34.the dependency for action sits, to understand the risk and take

:19:35. > :19:38.action appropriately. What has become clear is that maybe the

:19:39. > :19:44.Government needs to be a bit more active and take a bit more

:19:45. > :19:47.intervention. Things like the establishment of the national cyber

:19:48. > :19:54.Security Centre, that is a very welcome initiative; it drives some

:19:55. > :19:59.innovation, working together with infrastructure providers in the UK

:20:00. > :20:03.to tackle a lot of the noise at source, remove a lot of the

:20:04. > :20:09.low-level activity that is going on so people can focus on the more

:20:10. > :20:14.sophisticated activity. I would agree with that. One of the

:20:15. > :20:18.challenges is that where cyber is left to market forces, sometimes you do not

:20:19. > :20:24.get the results that you might imagine. Often cyber security is

:20:25. > :20:28.regarded as a lemon market, that is, an asymmetry of information between the

:20:29. > :20:32.people who buy services and the people who sell them. If you are a seller you

:20:33. > :20:39.might understand the acronyms and the complex details, while if you

:20:40. > :20:44.are a buyer a lot of those things are confusing and difficult to

:20:45. > :20:48.understand. It is one of those areas where market forces alone do not come

:20:49. > :20:56.up with an appropriate level of response. We are seeing a greater

:20:57. > :21:06.need for some kind of regulation to try and define bars for people to

:21:07. > :21:11.strive towards. Where we talk about regulation, we do not see it as

:21:12. > :21:15.being universal. It has to be proportional to the criticality of

:21:16. > :21:20.the firm and the services it provides to the nation. As long as we take

:21:21. > :21:23.that into account, and we are seeing regulators take a lead on this and

:21:24. > :21:32.take a proportional approach, that is where we see a good response and

:21:33. > :21:37.an improvement. You are all saying the market has not delivered in this

:21:38. > :21:42.area, either because the vendors are selling people products they do not

:21:43. > :21:46.need or understand, or whatever else. Is that how we should

:21:47. > :21:51.interpret the emphasis on active cyber defence? Is it an admission

:21:52. > :21:58.that the existing strategy is not working or is not going to work? It

:21:59. > :22:07.addresses two problems. It helps those who cannot help themselves. It

:22:08. > :22:12.is a complex subject. It is an assumption that they have the

:22:13. > :22:16.skilled people. The majority of firms in the United Kingdom are SMEs and

:22:17. > :22:23.they cannot retain the skill set. Active cyber defence provides those firms

:22:24. > :22:31.with some meaningful defence against these actors in the real world. Can

:22:32. > :22:37.I ask more generally about whether or not the 2016 strategy is robust

:22:38. > :22:40.enough for the protection of the UK critical national infrastructure?

:22:41. > :22:48.We have seen recent attacks in Ukraine and in France. Has this latest

:22:49. > :22:53.strategy learnt the right lessons from those major attacks? You will

:22:54. > :23:00.probably have seen the recent US Defence Science Board report, which essentially

:23:01. > :23:05.said actors on behalf of the Russians and Chinese had effectively

:23:06. > :23:14.penetrated the US's key critical infrastructures to the extent that

:23:15. > :23:19.given the opportunity, they could switch off that infrastructure, which has

:23:20. > :23:25.big implications for American society. Are we facing the same level

:23:26. > :23:30.of threat? Has our infrastructure been penetrated to that extent and, if

:23:31. > :23:35.so, what can we do to deal with that, given that the US defence

:23:36. > :23:41.science board was saying that it would take at least ten years for

:23:42. > :23:46.the US to rectify that position? The honest answer to your question is

:23:47. > :23:53.that no one knows. You can take some confidence from things that happen,

:23:54. > :23:58.but with the US, they did not know what had happened until it became

:23:59. > :24:05.clear. Part of the challenge for the national strategy is that this is

:24:06. > :24:09.an area that moves very fast, and one thing that needs to be thought about

:24:10. > :24:15.in the context of the national strategy is the five-year cycle may

:24:16. > :24:20.not be rapid enough to deal with the world as it develops. Our

:24:21. > :24:24.dependencies in the digital world develop much faster than

:24:25. > :24:30.that five-year rhythm. We might need to think about a new more agile

:24:31. > :24:35.approach to revisiting the strategy. I have a different perspective which

:24:36. > :24:40.is that a lot of the things we are calling for, new regulation or active

:24:41. > :24:50.cyber defence, take a while to build. There will always be an

:24:51. > :24:55.anxiety. If you shorten the life cycle, you will not have data on

:24:56. > :25:01.whether they will work and deliver. That would be shrinking it to

:25:02. > :25:04.something like three years. Is our critical national infrastructure

:25:05. > :25:09.compromised to the same extent? I don't think the companies would be

:25:10. > :25:16.willing to comment on client relationships, but it is a certainty

:25:17. > :25:23.that most organisations, when facing that threat, will face some incursion.

:25:24. > :25:25.These actors have many cyber operatives and they will target

:25:26. > :25:31.everything, so we need to be prepared for that. I don't know if

:25:32. > :25:37.we are doing the stress tests we need to do to make sure we are

:25:38. > :25:41.prepared. We cannot survive a major rainstorm sometimes, so can we

:25:42. > :25:47.really survive these possible attacks? I do not think they are likely, but

:25:48. > :25:53.they are possible. We always have to come to a mindset where we accept

:25:54. > :25:57.that at some point we will be compromised, so the mindset needs to be

:25:58. > :26:03.looking at how you respond to those kinds of attacks. The aggressor

:26:04. > :26:08.only needs to succeed once when they enter an organisation; the defender has to

:26:09. > :26:15.defend against every single attack, so there is asymmetry there. When

:26:16. > :26:18.the threat landscape covers not just technology but also people and

:26:19. > :26:24.process, that is a really large landscape to try and protect. We

:26:25. > :26:30.are targeted as employees of the organisations that we work for, but

:26:31. > :26:35.also people in the community, our wives and children, are also

:26:36. > :26:43.targeted. If you take that as a starting point, assuming someone will

:26:44. > :26:50.surely one day get in, the focus would be on that ability to detect

:26:51. > :26:54.and respond. Those of you who are working closely with the private

:26:55. > :26:58.sector, some of whom are infrastructure providers and many

:26:59. > :27:02.of whom are suppliers to the infrastructure: what is your

:27:03. > :27:10.experience of the extent to which, without breaching confidentiality,

:27:11. > :27:15.you find they are already harbouring things

:27:16. > :27:24.within their systems that they may not know about and cannot do much about? Supply

:27:25. > :27:28.chains is where it gets very murky. The firm has to report to a regulator, but

:27:29. > :27:34.you can only report what you know to be true. What we are increasingly seeing is

:27:35. > :27:43.that they are looking at their supply chain as an area of risk to them.

:27:44. > :27:46.How do they have the confidence that the smaller firm they have engaged

:27:47. > :27:52.with connectivity into their environment will not launch an

:27:53. > :27:58.attack? Today, that is where it is found wanting in many suppliers; the

:27:59. > :28:02.reason is that they supply a magic widget, something very niche, but

:28:03. > :28:12.they do not have a fully fledged system because they are not a

:28:13. > :28:18.multidisciplinary firm. The supply chain is a worry and a

:28:19. > :28:21.lot of what we do comes down to a cost benefit analysis, and sometimes

:28:22. > :28:30.you need to spend a lot more to ensure security. We fall back

:28:31. > :28:33.on that quite often. To what extent are larger companies putting in criteria

:28:34. > :28:42.for small suppliers to meet their obligations? That is being used by some.

:28:43. > :28:51.There are contractual obligations but it is only the written word,

:28:52. > :28:54.and you can have an obligation to

:28:55. > :29:01.inform if you wish but if you do not know you cannot tell them. The

:29:02. > :29:06.reality is, you will not. You are outsourcing to them because they are

:29:07. > :29:08.cheaper, or they are very specialised; you are not going to impose your own

:29:09. > :29:17.requirements on them if that will erode the saving you may make. Some

:29:18. > :29:22.organisations are rethinking their organisational decisions. Cyber

:29:23. > :29:27.security may not be a technical issue so much as a question of how you structure

:29:28. > :29:33.your processes. The number of dependencies you have on others is a

:29:34. > :29:35.key part of your risk. Some of the outsourcing and supply chain

:29:36. > :29:40.decisions that have been made in the past, which have been driven by

:29:41. > :29:43.cost, some of those will need to be rethought and organisations will

:29:44. > :29:47.need to reconfigure themselves so they have more control over those

:29:48. > :29:53.processes end to end. They will never be able to eliminate supply

:29:54. > :29:58.risk, and a common feature of cyber attacks is that they go for

:29:59. > :30:00.the suppliers, and the suppliers to suppliers, so that can be a part of

:30:01. > :30:09.that thinking. What we've seen over the last three

:30:10. > :30:12.years is more intelligence led assurance frameworks, so rather than

:30:13. > :30:16.looking at organisations straight on, we try to look at the other

:30:17. > :30:19.interconnecting elements around the organisation. Maybe not as

:30:20. > :30:23.far-reaching as supply chain, but certainly looking outside

:30:24. > :30:29.conventional tech within the UK. We will also see a growing number of

:30:30. > :30:32.attacks that have relevance to the UK that target subsidiaries and

:30:33. > :30:38.other operating functions of multinational organisations that are

:30:39. > :30:44.outside of the UK. Doctor Morrison and Lord West. With General data

:30:45. > :30:48.protection regulations changing next year and the big fines that they

:30:49. > :30:53.will potentially introduce, will that affect the supply chain that

:30:54. > :31:02.has been discussed, and will it alter behaviour significantly or

:31:03. > :31:07.not? Yes, I think there are certain firms today that are very anxious

:31:08. > :31:17.about GDPR becoming active, and that will drive more of an invasive

:31:18. > :31:24.look at certain suppliers that process personally identifiable

:31:25. > :31:29.information, definitely. One of the things to watch, in general data

:31:30. > :31:35.protection, it provides a strong regulation of personal information,

:31:36. > :31:38.though many of the cyber attacks that may concern this committee from

:31:39. > :31:42.a national security perspective might be less about consumer

:31:43. > :31:45.information and more about attacking infrastructure and military

:31:46. > :31:50.processes and disrupting processes. So there is a strong regulatory

:31:51. > :31:56.regime around consumer information, and we need to watch that it does not

:31:57. > :32:01.skew our defences to over-focus on that and not think about

:32:02. > :32:04.disruption to processes. Would there be public health problems in

:32:05. > :32:11.accessing data? Absolutely there would be. I should declare an

:32:12. > :32:16.interest that I am chairman of a company that works in the cyber

:32:17. > :32:21.area. And I also speak a lot on cyber as well, for which I get a bit

:32:22. > :32:27.of loot, which is quite nice. The question Lord Harris was asking, and

:32:28. > :32:32.the report, and I read it as well, is actually saying there are lines

:32:33. > :32:42.of code already in masses of the software within whole areas of the

:32:43. > :32:46.US that can be triggered but we don't know how. What I want to run on from

:32:47. > :32:53.there is, how dangerous do you think it is that switching pieces of gear

:32:54. > :32:57.are incorporated in masses of systems and we don't know how the

:32:58. > :33:04.upgrades go unless we put people onto it to look at it closely, but

:33:05. > :33:08.now Chinese firms are taking over the largest data centre in the UK and may

:33:09. > :33:13.have taken over the largest CCTV company in Europe, and with the

:33:14. > :33:20.Internet of things, we know what you can do with CCTV remotely. How much

:33:21. > :33:23.of a threat or danger does the panel see that as, particularly when one

:33:24. > :33:28.takes into account these lines of code which we find at times, we go

:33:29. > :33:36.in and say we did not know it was there, so how much of a threat do we

:33:37. > :33:42.think this is? Poor software quality will outstrip that by an order of

:33:43. > :33:45.magnitude: for any back door being introduced into a line of code,

:33:46. > :33:48.the number of unintentional weaknesses present in software and

:33:49. > :33:56.hardware systems today far outweighs it, and that again is

:33:57. > :33:58.economics driving it, quality assurance, good architectural

:33:59. > :34:03.principles, all of these things in a lot of cases don't make sound

:34:04. > :34:07.commercial sense, so we can attribute that to malice, but

:34:08. > :34:11.sometimes we can attribute it to pure neglect. You did not quite

:34:12. > :34:15.answer my question about whether these companies doing this is an

:34:16. > :34:22.issue? I think it is an issue, but for those capable of bringing that to

:34:23. > :34:24.bear, those same systems will be vulnerable to many, many other

:34:25. > :34:29.things which were not introduced by those actors and I would be more

:34:30. > :34:33.worried about that, especially when we talk about the Internet of

:34:34. > :34:36.things, because those products we expect to have a half life of

:34:37. > :34:42.between three and ten years on the outside, at a very cheap price point.

:34:43. > :34:46.How do we think firms can produce to that price point? They don't have sound

:34:47. > :34:49.security engineering, which is why we are seeing almost a

:34:50. > :34:55.regression of what we have learned over the last 30 years in terms of

:34:56. > :34:59.security design being undone with the advent of IOT because we are

:35:00. > :35:02.moving away from three or four massive multinationals that can

:35:03. > :35:07.invest in cyber security implementation to many smaller

:35:08. > :35:11.manufacturers churning products out as quickly as they can. So there is

:35:12. > :35:14.not one company held to account and you are not going to cut off an

:35:15. > :35:21.entire country from supplying you today. I think you can assume there

:35:22. > :35:26.is definitely a threat from the lines of code as you describe it.

:35:27. > :35:30.But there is equally a threat from people and process. If you look at

:35:31. > :35:34.the worm that targeted a load of IOT equipment at the end of last year, a

:35:35. > :35:40.lot of what it did to propagate initially was just use weak

:35:41. > :35:45.credentials. So when we are faced with problems of people choosing bad

:35:46. > :35:48.passwords and users not putting any kind of security

:35:49. > :35:53.configuration onto devices, that is where we should be focusing. That is

:35:54. > :35:56.the low hanging fruit. We need to think about the code that powers

:35:57. > :36:02.the systems but it's more than technology alone. Why did the

:36:03. > :36:06.Americans say they would not use it? Why does Australia say that? If it is

:36:07. > :36:14.as you say, are they both buffoons who are not doing it correctly?

:36:15. > :36:19.The UK put in a compensating control by creating the secure site. We cut a

:36:20. > :36:26.commercial agreement with Huawei to have UK people do the diligence

:36:27. > :36:30.on the code. So instead of cutting them off

:36:31. > :36:35.entirely, we realised the benefits but there was a compensating risk

:36:36. > :36:37.control, by having those individuals, being British nationals, reviewing

:36:38. > :36:48.the products that they are producing. Just a very quick point.

:36:49. > :36:53.Do you think it is possible that American concerns about some Chinese

:36:54. > :36:57.companies like Huawei are caused by competition factors, and they don't

:36:58. > :37:04.really want Huawei in the market? I take silence as consent. There is a

:37:05. > :37:07.certain amount of prudence in avoiding the mistakes of the past

:37:08. > :37:11.that we have made in terms of intelligence and making

:37:12. > :37:15.sure we do not diversify too much. That has been the path forward, to

:37:16. > :37:19.be careful about what we let in, and I'm not so much worried about lines

:37:20. > :37:26.of code, I'm worried about the hardware, something we are reluctant

:37:27. > :37:29.to change through time. I think we need to recognise from a national

:37:30. > :37:33.security strategy perspective, in the UK this is an area where we are

:37:34. > :37:37.in a position of weakness because we don't have our own technology supply

:37:38. > :37:40.base in the UK manufacturing the core infrastructure we need, and

:37:41. > :37:50.other countries have the luxury of having that supply base. The

:37:51. > :37:54.strategy contains a list of categories of cyber threats, so

:37:55. > :38:00.cyber criminals, states and state-sponsored threats, terrorists,

:38:01. > :38:05.hacker activists and what I gather are known as script kiddies. I would

:38:06. > :38:09.be interested to hear the views of all of you as to whether it is

:38:10. > :38:16.useful to categorise the threats in that way. Yes, in so much as it

:38:17. > :38:20.makes it real for both businesses and users, I guess, and maybe to

:38:21. > :38:24.understand who they are trying to defend against and their motives. We

:38:25. > :38:31.need some form of categorisation. That relatively core set is a good

:38:32. > :38:36.measured balance from the archetypal teenager in their bedroom who is a

:38:37. > :38:38.bit bored but massively inquisitive, through to what we see in the

:38:39. > :38:43.Hollywood movies in terms of a nation state threat. It's really

:38:44. > :38:50.critical in terms of method and target. Each threat will have

:38:51. > :38:54.different methods and targets, so you could focus on critical

:38:55. > :38:57.infrastructure attack, and that will be their target, information and

:38:58. > :39:02.architecture. That division tends to be useful for parsing out what we

:39:03. > :39:06.need to protect and what is more critical and more important. Of

:39:07. > :39:11.course the nation state is a more capable actor than the terrorist,

:39:12. > :39:16.then the criminal, then the script kiddie, and the basic chaos person.

:39:17. > :39:21.The only caveat their eyes I would mention the nation state. When we

:39:22. > :39:25.communicate with clients we differ between established nation states

:39:26. > :39:29.that understand the international norms of using this capability and

:39:30. > :39:33.emerging nation states, those rapidly developing cyber capability,

:39:34. > :39:36.who maybe are not deterred by the sanctions or anything else being placed on them

:39:37. > :39:41.when they go too far, or by other sticks wielded at them.

:39:42. > :39:47.That is an important one, because without those disincentives, as we

:39:48. > :39:52.have seen in certain instances like Sony Pictures, they will be quite

:39:53. > :39:56.happy to use that capability to aggressive effect. The only thing I

:39:57. > :40:02.would caution against is seeing them as too distinct. This is an area where

:40:03. > :40:05.essentially it is a marketplace of adversaries, and if you were a

:40:06. > :40:09.nation state with a hugely sophisticated armoury and you wanted

:40:10. > :40:18.to get access to a company, the first thing you might do is go via

:40:19. > :40:26.access from a criminal forum. We see quite a lot of that potential

:40:27. > :40:30.activity happening. We need to recognise that this is a marketplace

:40:31. > :40:34.and the nation state can deploy a tool, and within days that tool can

:40:35. > :40:36.be available for criminals to buy and use. We saw that with poison ivy

:40:37. > :40:42.many years ago. That anticipates my

:40:43. > :40:45.next question and I'd be interested to hear from the rest of the panel

:40:46. > :40:50.as to where you think the key overlaps are between the categories.

:40:51. > :40:56.One of the things to pick up on an earlier point is the overlap is

:40:57. > :40:59.definitely significant. It's becoming more and more significant.

:41:00. > :41:01.The modus operandi might remain distinct, but the level of

:41:02. > :41:10.sophistication and capabilities continually grow. So the security

:41:11. > :41:14.strategy will make reference to different levels of capability, and

:41:15. > :41:16.you might say that organised crime units have a different level of

:41:17. > :41:25.capability compared to nation states. But the gulf between those

:41:26. > :41:29.is beginning to blur. What you might assume is that that will continue to

:41:30. > :41:34.blur down, and when we look at people like script kiddies, who

:41:35. > :41:37.today might be regarded as having unsophisticated tooling, they can

:41:38. > :41:44.still have a significant impact on infrastructure. If we look at one of

:41:45. > :41:47.the large breaches that took place against a telecoms provider a couple

:41:48. > :41:52.of years ago, that was disproportionately devastating for

:41:53. > :41:57.them, from somebody who would have been regarded as a script kiddie, and it

:41:58. > :42:01.had a large financial impact on that telecommunications provider. So I

:42:02. > :42:05.don't think we can just assume that nation states are the only types of

:42:06. > :42:10.entities that can inflict significant damage. The reality is,

:42:11. > :42:16.all of them can. I would agree. On a technical level, organised crime and

:42:17. > :42:21.the others are getting very close, if not the same in terms of what

:42:22. > :42:25.they can unleash. The differences are, generally, planning, coordination

:42:26. > :42:29.and capacity to launch; that is the difference when we talk about nation

:42:30. > :42:34.states above organised crime. But on a purely technical level, on any

:42:35. > :42:38.one engagement, they can be approximately level. They are not

:42:39. > :42:41.getting that close. The real issue with Russia and China is they will

:42:42. > :42:45.identify and utilise those people who have those capabilities and they

:42:46. > :42:48.will identify them early and pull them out and tell them, if you don't

:42:49. > :42:53.work for us, in addition to what you do in your day job, you will have

:42:54. > :42:56.trouble. That is the real thing. There is no mythical cyber criminal

:42:57. > :43:01.that just has these dramatic capabilities. Quite often what they

:43:02. > :43:06.do generally falls in line with what we call shrinkage, losses by

:43:07. > :43:09.revenue or volume. So for these dramatic attacks, looking at the American

:43:10. > :43:13.cases that happened in the last ten years, the only one that went beyond

:43:14. > :43:20.the shrinkage calculation was the Target attack. Can I stop you there?

:43:21. > :43:24.I'm afraid we have a division in the Commons. I do apologise. Order,

:43:25. > :43:31.order, I am adjourning the committee for 15 minutes. Mrs de Villiers, you

:43:32. > :43:38.are cut off, but had you almost finished your questions?

:43:39. > :43:44.Could I ask them briefly? A lot of the material I've seen so

:43:45. > :43:49.far indicates that terrorists, while they have malevolent intent have

:43:50. > :43:57.limited capacity in relation to cyber attacks. And yet we have these

:43:58. > :44:03.relatively low-tech script kiddie type categories. How come those kind

:44:04. > :44:08.of individuals, teenagers in their bedrooms can inflict harm that seems

:44:09. > :44:15.to elude terrorist groups? Is there more scope for them? Should we be

:44:16. > :44:20.worried about that overlap between the two categories in cyber security

:44:21. > :44:27.responses? Well, one reason historically attributed is

:44:28. > :44:30.stability. Terrorists have instability in their lives and

:44:31. > :44:33.physical environment which doesn't allow them to get access to the

:44:34. > :44:37.education required or the facilities to do these things. But

:44:38. > :44:44.you are right to highlight the risk. As we have seen with

:44:45. > :44:47.organised crime, young, capable individuals that are persuaded,

:44:48. > :44:51.cajoled and otherwise influenced to undertake these activities at somebody

:44:52. > :44:54.else's behest is a risk, and that is where we have to look at the

:44:55. > :44:57.National Crime Agency and what they have been doing there in terms of

:44:58. > :45:02.intervention, where they capture or identify young people that may be on

:45:03. > :45:06.the wrong path, and they sit down with the parents or send a letter to

:45:07. > :45:09.the parents to say, do you know what your son and daughter are up to, and

:45:10. > :45:14.maybe you need to be a better parent?

:45:15. > :45:21.If these individuals are successful, that usually says more about the defects

:45:22. > :45:28.in our defences than their skill. These enterprises by young people are

:45:29. > :45:35.almost like a competition, but the enduring nature of attack is

:45:36. > :45:40.not there; they are not committed terrorists. Terrorists have a problem with

:45:41. > :45:49.infrastructure and education. That is a difference in terms of modus

:45:50. > :45:51.operandi between these different groups. Often children want to attack

:45:52. > :46:00.systems, but they do not care what the system is, they just want to try

:46:01. > :46:04.it out to gain peer notoriety. That is slightly different to a terrorist

:46:05. > :46:09.group that may have a specific focus on a specific set of end goals. It

:46:10. > :46:18.is reasonable to assume that as technology develops and the people

:46:19. > :46:23.who are practising these attacks develop their techniques, we will

:46:24. > :46:28.see more terrorist attacks. One last question that takes you back to the

:46:29. > :46:33.first questions you were asked at the start of the session, which of the

:46:34. > :46:39.categories in the 2016 review poses the biggest threat? Is that the

:46:40. > :46:48.criminals, the state actors, or the other categories? In terms of

:46:49. > :46:51.availability and the long-term financial health of the United Kingdom, you have

:46:52. > :46:59.to take each of those on their merits. Nation states threaten the long-term

:47:00. > :47:02.viability of the nation, but it will be someone from left field that will

:47:03. > :47:10.cause the unforeseeable to happen. Nation states are the most

:47:11. > :47:18.capable but the probability of those people targeting us is very low. The

:47:19. > :47:25.main threat is cybercrime. What people are not talking about is what

:47:26. > :47:29.we will do after Brexit. That is a question we need to dive into and I

:47:30. > :47:43.have not seen a lot of positions on the impact of regulation, use of

:47:44. > :47:46.Interpol and things like that. Given that there are so many signatories

:47:47. > :47:57.to Interpol who are not members of the European Union, how would we

:47:58. > :48:08.be affected? I do not know what the impact of Brexit will be on Britain's cyber

:48:09. > :48:12.security. To what extent do you think it is possible to distinguish

:48:13. > :48:16.the source of cyber attacks? You have touched upon it already in some

:48:17. > :48:24.of the remarks you have made. It is obviously important for these sorts of

:48:25. > :48:33.interventions and has a bearing upon what we do collectively and

:48:34. > :48:38.otherwise, but it is very dangerous to rely on attributions in cyber

:48:39. > :48:47.for doing that. It is a very hard problem to solve. Today we have

:48:48. > :48:55.levels of confidence; we cannot say categorically that we know. They are

:48:56. > :49:04.attributed by these traits and modus operandi, but these are easy to

:49:05. > :49:09.copy. Just because these things are coming from a country, it doesn't

:49:10. > :49:16.mean it isn't just coming from a person. I would always caution about

:49:17. > :49:25.attribution. Assigning responsibility

:49:26. > :49:35.is difficult; evidence of the leader who ordered it will very rarely be there, and it is not

:49:36. > :49:40.always clear what it was. We have got better at attribution

:49:41. > :49:44.through the years. We are seeing more cyber attacks being used

:49:45. > :49:48.to signal; states want us to know what they are doing so they can

:49:49. > :49:56.push us in a certain way. I do not worry too much about

:49:57. > :50:02.attribution. Pinning it on individuals is very difficult, especially in Russia

:50:03. > :50:12.and China. It is very difficult to attribute attacks. It is easy to

:50:13. > :50:17.subvert a responder and point them in the wrong direction. If we look

:50:18. > :50:22.at the attacks that occurred this year, initially there was a

:50:23. > :50:26.narrative that it looked like it was Russian, and the rationale was that

:50:27. > :50:32.there were some Russian words in some of the code. The reality is

:50:33. > :50:39.that it is not difficult to insert Russian words into code to make it look like

:50:40. > :50:49.it came from another nation state. Things like

:50:50. > :50:55.Tor are used to hide where people are coming from, and proxies are common.

:50:56. > :50:59.Sometimes it is only possible to attribute an attack back because

:51:00. > :51:04.the hacker made a mistake. If they work effectively, it is very

:51:05. > :51:09.difficult. I would echo those comments and say that it is an

:51:10. > :51:16.intelligence activity that comes with varying degrees of confidence.

:51:17. > :51:24.None of you will be recommending that we extend article five to the

:51:25. > :51:28.cyber sphere? There are numerous issues that would need to be worked through

:51:29. > :51:35.and that is a hard area to work through. One possibility might be

:51:36. > :51:41.that you'd respond like for like. It could be made more explicit that

:51:42. > :51:46.anything one finds as a result of the intelligence that we have that

:51:47. > :51:50.we believe is reliable, showing that a particular state actor is

:51:51. > :51:54.responsible for something we have picked up. That is one possibility

:51:55. > :52:05.because then you avoid the danger of the spiral effect. The Americans

:52:06. > :52:08.respond with sanctions, they do not necessarily respond back with cyber

:52:09. > :52:21.attacks because that might open Pandora's box. We do not want to do

:52:22. > :52:33.that ourselves. I do not want to be too pessimistic about

:52:34. > :52:38.Article five. If a country was to continue its assault, then it is

:52:39. > :52:47.right for countries to get together. We can put on sanctions or other

:52:48. > :52:53.things. Including cyber in article five is very sensible, and I cannot see why

:52:54. > :52:58.there is such an objection to it. Another view to consider, from the

:52:59. > :53:04.British perspective, if a criminal organisation in the UK launches an

:53:05. > :53:08.attack against a foreign entity and the UK was held responsible for it

:53:09. > :53:13.rather than it being seen as criminal activity, then we might

:53:14. > :53:19.feel differently. I hope that we would feel very guilty and get to

:53:20. > :53:25.the bottom of who was doing it. In article five there should be a physical

:53:26. > :53:33.violence attack attached to it. If that was to happen then it would be

:53:34. > :53:43.worth doing; at the moment it is conjecture. One of the problems is

:53:44. > :53:48.that the whole of the Nato regime in general, and the UN Security Council

:53:49. > :54:00.regime, is built around more conventional warfare, though it has

:54:01. > :54:08.adapted. What I would like to hear from you is to what extent,

:54:09. > :54:19.alongside attribution, do you believe that the environment has

:54:20. > :54:22.become safer through the inclusion of collective defence under an

:54:23. > :54:28.enlarging Nato since the inclusion of the cyber domain, as came out

:54:29. > :54:40.of the Warsaw Summit last year? Collective defence leads to

:54:41. > :54:45.collective offence, which leads to escalation. It should be at the

:54:46. > :54:50.individual country level. That is a threshold I hope we do not get far

:54:51. > :55:00.enough to talk about. There is a case to be made because power is

:55:01. > :55:04.generated in other places off the island. A collective strategy makes

:55:05. > :55:12.sure that the country functions for certain things.

:55:13. > :55:17.Food and so forth can be impacted by cyber attack, so we

:55:18. > :55:22.should look to take a collective approach. I wonder if it would be

:55:23. > :55:28.possible to break down the different cyber threats into certain

:55:29. > :55:33.categories. I have come up with three, but this may be inadequate

:55:34. > :55:39.and I wonder if you think I'm missing anything here. The first

:55:40. > :55:48.category is using cyber for the gathering of intelligence, whether

:55:49. > :55:52.commercial or political or military. That is clearly something that

:55:53. > :55:58.requires defending against, making sure your systems are as well

:55:59. > :56:01.protected as they can be. The second is what was being talked about in

:56:02. > :56:08.terms of the US election, which is the use of cyber, alongside

:56:09. > :56:09.traditional methods, with regard to any communication system, for the

:56:10. > :56:19.dissemination of propaganda, disinformation

:56:20. > :56:24.and misinformation. Your best weapon there is to get your own

:56:25. > :56:30.messages out and expose the fact that your opponents have been doing

:56:31. > :56:35.dirty tricks. The third one, which is the one that most concerns us in

:56:36. > :56:39.relation to Nato, because the other two are bound to go on as long as

:56:40. > :56:45.there is a cyber sphere, is attacks that could paralyse societies, and

:56:46. > :56:51.for those, although you may try to prevent them, that is where the

:56:52. > :56:54.concept of deterrence and the threat of retaliation would come in and

:56:55. > :56:59.then I'm inclined to think that the doctor is right that if they got to that

:57:00. > :57:04.stage there would probably be conventional military action going

:57:05. > :57:08.on at the same time anyway. Have I missed anything in terms of the

:57:09. > :57:13.spectrum of threats that would not fall into one or other of these

:57:14. > :57:24.categories? I've been writing a book on this topic. We divided cyber into

:57:25. > :57:27.activities; you included cyber information, which is not

:57:28. > :57:35.traditionally a part of it. In terms of cyber tactics, we divided by

:57:36. > :57:44.espionage and manipulation of information. People can be looking

:57:45. > :57:47.to destroy or sabotage the targets. That seems to be pretty

:57:48. > :57:54.comprehensive in covering most nation state based cyber attacks. When you

:57:55. > :58:01.get into the criminal sphere you're going to have different impacts,

:58:02. > :58:04.equally large... I want to concentrate on deterrence because then

:58:05. > :58:10.the issue arises of how you do something like this and the only way

:58:11. > :58:19.to deter something is to have a threat of retaliation, which is both

:58:20. > :58:25.unacceptable and unavoidable. It is worth looking at the past, for

:58:26. > :58:30.example: between the two world wars it was thought that the great threat

:58:31. > :58:36.was gas warfare, and so a lot of resilience was put in and every

:58:37. > :58:40.household had a gas mask. It is accepted that why it did not happen

:58:41. > :58:46.in the end is that the threat of massive retaliation by this country

:58:47. > :58:56.was an effective deterrent. Can you envisage a threat of massive cyber

:58:57. > :59:00.attack that would be effective in deterring that third category that

:59:01. > :59:10.I talked about, attacks that are intended to paralyse societies? In a

:59:11. > :59:15.short word, no. Among academics there is a view that it has never worked. The

:59:16. > :59:19.problem is that you need to have credibility, you need to have some

:59:20. > :59:23.sort of assurance that if someone does this there will be a response.

:59:24. > :59:27.The main problem with cyber is that there is a lack of utility because

:59:28. > :59:36.those weapons can be used against you. That is the issue with

:59:37. > :59:41.retaliation. We can have cyber resiliency, we can have assurance

:59:42. > :59:46.that there would be a response to any attack, but to ever be able to

:59:47. > :59:51.prevent a cyber attack is almost impossible given those conditions

:59:52. > :59:57.and what deterrence as an academic term requires.

:59:58. > :00:03.Just on that last point: when you say there is a belief deterrence has

:00:04. > :00:06.never worked, I hope you're not suggesting that in any field whatsoever,

:00:07. > :00:13.because there is a strong belief that deterrence, in the

:00:14. > :00:16.form of Nato, was effective for half a century, so if what you're throwing out

:00:17. > :00:19.is that you don't believe in deterrence in any sphere at all, you

:00:20. > :00:28.are wasting your time with this audience. I hope that wasn't what

:00:29. > :00:41.you were doing. But in terms of dealing with this particular

:00:42. > :00:45.construct, surely the notion that a massive attack was taking place that

:00:46. > :00:53.was trying to bring down a country, if that was going hand-in-hand with

:00:54. > :00:57.conventional warfare, as you seem to think it would, then surely that

:00:58. > :01:01.solves, for this category, the problem of identifying where the

:01:02. > :01:06.massive attack is coming from, because if it is part and parcel of a

:01:07. > :01:09.conventional attack then the secrecy around where the cyber attack is

:01:10. > :01:15.coming from will presumably have disappeared because we will know

:01:16. > :01:21.whose tanks are simultaneously attacking. The real

:01:22. > :01:25.problem with the military is a lack of willingness to use cyber tools because of

:01:26. > :01:28.the risk involved. There's a low probability of using cyber tools in

:01:29. > :01:32.any war game because of the risk of using them. There is no assurance

:01:33. > :01:36.that the cyber tools will work, so there has been less utility

:01:37. > :01:43.in wanting to use them as a response to cyber attacks. Conventional

:01:44. > :01:47.responses are something that should be made very clear, that if you do

:01:48. > :01:50.this, there will be this sort of response, but no government I'm

:01:51. > :01:53.aware of has clearly said that this is what will happen if there is a

:01:54. > :01:57.massive cyber attack, and that's what needs to be done. If you want

:01:58. > :02:03.to establish a workable system of deterrence, that is. We have seen

:02:04. > :02:12.evidence of certain nation states launching attacks against particular

:02:13. > :02:15.sectors. There was a country that launched a financial attack against

:02:16. > :02:23.various countries, or they tried to. Sorry? Are you referring to Israel

:02:24. > :02:29.or Saudi Arabia? No, Iran. The concern we have when we think of the

:02:30. > :02:33.responses, we have heard the word asymmetry a lot. Our susceptibility

:02:34. > :02:37.versus the susceptibility of the source to a corresponding cyber

:02:38. > :02:41.attack might be far less, because their own dependence on technology

:02:42. > :02:46.might just be far less. That has to be a key consideration and that is

:02:47. > :02:49.why we would have to consider other responses, be they sanctions, etc.,

:02:50. > :02:57.because it simply won't work against the target. One final point on the

:02:58. > :03:01.attribution and deterrence: if you place yourself as someone who wanted

:03:02. > :03:07.to do harm to the UK you might imagine a scenario where you chip

:03:08. > :03:10.away and gradually increase the level of attacks without

:03:11. > :03:15.conventional attacks being part of it and you confuse the attribution,

:03:16. > :03:20.so you undermine and weaken before you use other attack

:03:21. > :03:26.mechanisms as well. Can I just say to everybody that we are getting

:03:27. > :03:29.short of time and there is likely to be another division in the Lords,

:03:30. > :03:33.and who knows if there will be another in the Commons, so could I

:03:34. > :03:39.urge colleagues to make their questions and witness replies as

:03:40. > :03:44.brief as possible? It is to continue with a point from Doctor Lewis. One

:03:45. > :03:48.of the problems, it seems to me, is that countries don't even know what their

:03:49. > :03:51.own vulnerability is. They are scared to get into the tit-for-tat

:03:52. > :03:55.with other countries because they haven't really calculated what it

:03:56. > :04:01.is, and the other factor, it seems to me, is that there is an intelligence imbalance.

:04:02. > :04:06.There are lots of countries in the world where we march at will through

:04:07. > :04:09.their computer world, allegedly, and we march at will through the whole

:04:10. > :04:12.thing and there is a massive intelligence imbalance, so there is a

:04:13. > :04:18.constant argument that if you don't know that bit you don't know what

:04:19. > :04:22.the president is saying to his Pope, so I do agree that those

:04:23. > :04:27.calculations are tricky. The first point is

:04:28. > :04:28.understanding the true vulnerability today is one of the biggest

:04:29. > :04:42.concerns. We've had a very interesting

:04:43. > :04:48.presentation from Doctor Mc Cormack about how cyber has become a

:04:49. > :04:57.military weapon, potentially. Leaving aside the various points

:04:58. > :05:02.made about the unlikelihood of cyber being a principal weapon,

:05:03. > :05:07.nevertheless by virtue of the fact that in 2013 the then Secretary of

:05:08. > :05:14.State for Defence, Philip Hammond, announced we would engage

:05:15. > :05:21.in offensive cyber, to what extent do you think this is something, as

:05:22. > :05:25.Doctor Mc Cormack suggests, should be a matter of public debate, bearing

:05:26. > :05:31.in mind that when the Secretary of State made his remarks to the

:05:32. > :05:36.Commons he said much of the action will not be in the public domain?

:05:37. > :05:40.And, indeed, it's been suggested that the necessary secrecy

:05:41. > :05:47.surrounding cyber weapons undermines their credibility. If it is to be a

:05:48. > :05:54.deterrent, if people out there, potential enemies out there

:05:55. > :05:58.understand we will retaliate with a cyber attack, it needs to be

:05:59. > :06:05.credible. To be credible, it has to be known about. How do policymakers

:06:06. > :06:12.manage to juggle this? It's quite interesting, does it have

:06:13. > :06:16.to be known about? Can you measure or infer a country's cyber

:06:17. > :06:20.capability by the health of its industry? If you look at Israel in

:06:21. > :06:23.terms of what they are amassing in terms of the private sector

:06:24. > :06:27.capability and you make the logical leap in terms of what makes the

:06:28. > :06:33.state capability, can you do something similar with the UK? So

:06:34. > :06:37.without having to disclose your hand entirely, by having a buoyant

:06:38. > :06:42.industry that can produce these things in the private sector you can

:06:43. > :06:46.probably infer what the adversarial capability is. Credibility is about

:06:47. > :06:50.willingness to use a weapon and that is a major problem we have had,

:06:51. > :06:54.because in the American experience they are considering using lawyers

:06:55. > :06:59.with every unit because of the immense implications for civilian

:07:00. > :07:03.space, so if we start to use cyber tools in the military, how much do

:07:04. > :07:07.we need to train these people in terms of the legal ramifications,

:07:08. > :07:10.but also the difficulties in training if the entire operation is

:07:11. > :07:13.secret. They don't know how to train because there are no qualified

:07:14. > :07:18.teachers to train the units at that level, so it becomes a huge problem.

:07:19. > :07:22.Credibility is more about willingness to use a weapon, and are we

:07:23. > :07:27.really willing to use it at this level? I'm not sure the West is.

:07:28. > :07:30.We also have to worry about proliferation. In terms of what we

:07:31. > :07:36.are talking about in terms of a weapon, it could be hundreds of

:07:37. > :07:41.thousands or millions of things on a USB stick, and we have

:07:42. > :07:44.seen some devastating leaks in the North American programmes. That will

:07:45. > :07:50.always be a consideration. As you lower the bar of entry in terms of

:07:51. > :07:54.who can use it in a defence force in a country, what safeguards are there

:07:55. > :08:02.around it so that they are not left open? I would just add that this is

:08:03. > :08:08.an area where as soon as you publish details of your technique, you have

:08:09. > :08:12.diminished the value of that technique, so there has to be a

:08:13. > :08:16.degree of secrecy around the cyber capability. And

:08:17. > :08:22.some of the leaks we have talked about earlier, whilst potentially

:08:23. > :08:24.very damaging to our security agencies, have reinforced the

:08:25. > :08:31.credibility of what their capability has been and, by

:08:32. > :08:36.extension, continues to be. I don't think the credibility of the capability is an

:08:37. > :08:40.issue. I would add, if you have a kinetic weapon, it can be here, or

:08:41. > :08:44.there, but not in both places at the same time. With a cyber weapon, in

:08:45. > :08:49.theory, it can exist in multiple places. We saw at the beginning of

:08:50. > :08:54.March the disclosure by WikiLeaks which talked about a whole load of

:08:55. > :09:02.cyber tools that had been collected allegedly by the US. And as much as

:09:03. > :09:04.they might have been able to use the arsenal against their adversaries,

:09:05. > :09:08.now that it has been disclosed, theoretically they could be used

:09:09. > :09:12.against them, and that is one of the challenges of cyber warfare. So your

:09:13. > :09:23.verdict is we should be having a debate about the management of the

:09:24. > :09:27.weapon system? If it is to be made more widely available, then yes.

:09:28. > :09:31.There is no division between military and civilian space. The use

:09:32. > :09:34.of this weapon is inherently difficult and problematic because of the

:09:35. > :09:43.involvement of civilians and the possibility of war crimes. That is

:09:44. > :09:50.the real issue here. We hear a lot about the Internet of

:09:51. > :09:55.things and we did have a briefing by Melissa Hathaway who has worked with

:09:56. > :10:01.President Obama and President Trump, I believe, but one of the things she

:10:02. > :10:05.said which was quite amazing to me was that by 2020 it is expected

:10:06. > :10:10.there will be 50 billion devices, globally, attached to the Internet.

:10:11. > :10:15.People I've spoken to think this is a massive understatement. I just

:10:16. > :10:21.wanted to get your reaction. Is this all hype? The prospect of our

:10:22. > :10:26.televisions being a Trojan horse to spy on us, or do you take it

:10:27. > :10:30.seriously? It's going to be a thing. We are seeing smart meters and other

:10:31. > :10:35.benefits in this country, so now that your boiler can report that it

:10:36. > :10:39.is starting to degrade in performance, so when the engineer

:10:40. > :10:42.comes out they just replace a part proactively, these are the upsides

:10:43. > :10:47.companies will see from the Internet of things. The connected vehicles,

:10:48. > :10:52.before autonomous vehicles, we are seeing a proliferation of these.

:10:53. > :10:57.There are all these functionalities, so we have to accept that our lives,

:10:58. > :11:04.infrastructures, cities and buildings will be connected to the

:11:05. > :11:07.Internet. I would add to that and say one thing that needs to be

:11:08. > :11:10.considered as part of the security strategy going forward is a consumer

:11:11. > :11:18.protection angle for these Internet of things devices. Much of what we bring

:11:19. > :11:21.into our homes has various consumer protection legislation around it,

:11:22. > :11:25.and how could it be extended to protect us as individuals

:11:26. > :11:28.from the connected things that are going to be within our houses, but

:11:29. > :11:37.also to protect society, which they connect, from the mass effect.

:11:38. > :11:40.One thing I can say is while there will be a proliferation of connected

:11:41. > :11:45.devices, there is something to be said about complexity breeding a

:11:46. > :11:48.problem and being able to attack it. Complexity can make us more safe in

:11:49. > :11:53.some ways. The more things that are connected, the less worried

:11:54. > :11:57.people are about any one thing being disconnected, so that's something to

:11:58. > :12:02.consider. It is inevitable, but it does not mean we are doomed. I would

:12:03. > :12:06.say the risk is very different. If you look at conventional PCs and

:12:07. > :12:10.laptops and smartphones, if one of those devices gets compromised there

:12:11. > :12:14.is a fair chance, because people interact with them on a daily basis, that they might

:12:15. > :12:17.notice it is doing things that are a bit strange. If you think about the

:12:18. > :12:23.connected device, a video recorder or a web camera, broadly speaking people

:12:24. > :12:26.only interact with them to do a specific function and if they are

:12:27. > :12:31.running a bit slow it doesn't make them think it has been compromised.

:12:32. > :12:34.What we see is consumers go out and buy these types of devices and buy

:12:35. > :12:38.them based on their functionality, not on the security that should in

:12:39. > :12:43.theory be built into them. So really what we have is a bit of a

:12:44. > :12:47.disconnect. You have manufacturers trying to produce something as

:12:48. > :12:50.quickly and cost effectively as they can, and users that really care

:12:51. > :12:55.about functionality, with security as an afterthought, so what you see is

:12:56. > :13:01.that the risk is passed on from the manufacturer onto the consumer. One

:13:02. > :13:04.of the things we see as an industry, and potentially government needs to

:13:05. > :13:09.look at, is to work out how we can manage the risk more effectively, whether

:13:10. > :13:16.it should be kite marks or standards that manufacturers of IoT devices

:13:17. > :13:19.need to strive towards. Before regulation catches up, what we

:13:20. > :13:23.manage to build in the intervening period gives us a chance. Things are

:13:24. > :13:33.connecting every day and there will be a legacy to contend with. Just to

:13:34. > :13:38.come back to something we discussed a bit earlier, the relation between

:13:39. > :13:41.government and the private sector. There is a tendency to think that

:13:42. > :13:46.government should be responsible for virtually everything and I'm sure

:13:47. > :13:49.you gentlemen don't believe in that since you make a living advising the

:13:50. > :13:52.private sector, so you don't necessarily believe government

:13:53. > :13:58.should be trampling about in it too much. There are areas where

:13:59. > :14:02.government is essential: national infrastructure is one, intelligence is

:14:03. > :14:05.another. But surely across the private sector generally, government

:14:06. > :14:08.should encourage companies to take steps to protect themselves and make

:14:09. > :14:13.information available, but is there really a need for much more

:14:14. > :14:17.intervention than that? Do we need legislation and regulation? I would

:14:18. > :14:20.suggest it is probably not necessary. I remember President

:14:21. > :14:23.Reagan saying the nine most dangerous words in English are, I'm

:14:24. > :14:30.from the government and I'm here to help you. I wonder whether we are

:14:31. > :14:35.getting too hung up on government intervention? I would agree from the

:14:36. > :14:40.perspective that almost the worst outcome would be the

:14:41. > :14:42.government setting a whole load of requirements so that it becomes a

:14:43. > :14:47.compliance exercise, because that would drive the wrong behaviour in

:14:48. > :14:50.the private sector. It would drive lawyers and compliance activity

:14:51. > :14:53.rather than risk management. I think the government can be demanding in

:14:54. > :14:59.terms of what it expects of the private sector in terms of principles. There

:15:00. > :15:04.are principles you can expect them to adopt and across the public

:15:05. > :15:05.sector as well. And you expect a degree of responsibility in stepping

:15:06. > :15:16.up to those principles. In some areas that might be embedded

:15:17. > :15:22.in legislation or a code of conduct. There are many policy tools

:15:23. > :15:25.available. It will be unlikely to require legislation as there are

:15:26. > :15:33.already many policy tools the Government can use. If you look at

:15:34. > :15:38.what the Bank of England has managed to achieve with the critical

:15:39. > :15:43.economic functions of our country, they do not tell them what to do,

:15:44. > :15:51.they just say they want independent evidence that they are able to be

:15:52. > :15:55.resilient against an attack. That is simulated every 3-4 years, but it

:15:56. > :16:01.gives them the assurance that the institutions are maintaining the

:16:02. > :16:07.appropriate capability. They want independent assurance that they are

:16:08. > :16:10.doing the right thing. If you put Government into private transactions

:16:11. > :16:15.it becomes responsible for those institutions' actions and that is the problem. We

:16:16. > :16:23.need clear lines of oversight and control. If this happens, this is

:16:24. > :16:28.who you should consult and go to. If we look at three different

:16:29. > :16:32.frameworks, some of which are regulations, we see a lot of

:16:33. > :16:38.organisations looking at GDPR, because it will come into effect. It

:16:39. > :16:42.is driving behaviour in our organisations. If it did not exist

:16:43. > :16:47.some of that focus on disclosure probably would not be happening

:16:48. > :16:56.today. If we look at Cyber Essentials, we have a great idea and

:16:57. > :17:00.it is aimed at a large part of the population that has significant

:17:01. > :17:06.vulnerability. But the adoption levels are not where anyone would

:17:07. > :17:11.want them to be. How do you bridge that gap? A lot of the organisations

:17:12. > :17:16.that have gone through Cyber Essentials look at it like a tick-

:17:17. > :17:21.box exercise: I need to do this so I can report I have done it. The ones

:17:22. > :17:28.that have not done it, they do not understand the threat. They never

:17:29. > :17:33.think they will be targeted. We need to have more of a debate about

:17:34. > :17:37.whether there is a way of building things

:17:38. > :17:47.into supply chains or through procurement frameworks. We cannot

:17:48. > :17:53.just have a framework that people look at and do not bother. It is in

:17:54. > :18:01.the self-interest of the company to do this. In the US, the companies

:18:02. > :18:09.I'm involved with take cyber security very seriously. They are

:18:10. > :18:14.not told to do it by the Government, they are accountable to the

:18:15. > :18:21.Government. Is that the right way? It is the environment they work in.

:18:22. > :18:23.We do not have a cultural mirror of that. If we did, I think organisations would take

:18:24. > :18:42.a long and hard look at themselves. Many boards around the country do

:18:43. > :18:47.not understand the risk. They think it is technology centric and it is a

:18:48. > :18:53.function of IT departments, when in reality it is wider than that. If

:18:54. > :19:06.you understand what the risk is, it is not a surprise that they do not work

:19:07. > :19:10.as well as we planned. I think one of the challenges is that as our

:19:11. > :19:19.digital world evolves, investing money in building controls will no

:19:20. > :19:29.longer be adequate and that will involve some hard decisions. That is

:19:30. > :19:33.where existing mindsets, where we can tell them to build a set of controls

:19:34. > :19:40.and report back to us, will not be sufficient. Just another

:19:41. > :19:50.question, is the Government right to be bullying big IT companies? I

:19:51. > :19:58.think the question of the rule of law in the digital world changes the criteria

:19:59. > :20:04.and the existing balance. What we need now is for those who protect us and enforce the

:20:05. > :20:16.rule of law to have some terrifying capability available to

:20:17. > :20:22.them, but the rule of law says that is constrained and it demands that its

:20:23. > :20:26.misuse is constrained. In the digital world you cannot necessarily

:20:27. > :20:30.constrain the misuse of some of the capabilities that might be required

:20:31. > :20:36.by law enforcement. Policemen have guns and we are grateful for that,

:20:37. > :20:46.and the potential for misuse is small.

:20:47. > :20:53.The potential for misuse by criminals of the vulnerabilities discovered is

:20:54. > :21:00.much greater. I do not think you can look at one of those criteria in

:21:01. > :21:06.isolation. I would say that goes beyond cyber. It is an ethical and

:21:07. > :21:08.moral question. I have been at conferences where questions have

:21:09. > :21:15.been posed to the audience about what is more important, privacy or

:21:16. > :21:22.protection. I do not think there is a clear answer. People want both.

:21:23. > :21:27.Sometimes you want one more than another, but they both have their

:21:28. > :21:35.place. It is called having your cake and eating it. In Israel they are

:21:36. > :21:43.willing to sacrifice more privacy for security, but are we willing to

:21:44. > :21:46.do that? If you make sure that messaging applications cannot use

:21:47. > :21:53.encryption, consumers are going to move to another platform, just as

:21:54. > :21:59.they did with the digital music platforms; they moved to other forms

:22:00. > :22:06.of communication. If end-to-end encryption were weakened, then cyber

:22:07. > :22:16.security would suffer and the options available to cyber criminals and

:22:17. > :22:23.so on would be increased. That is correct.
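To illustrate the point about end-to-end encryption in the exchange above, a minimal sketch using the PyNaCl library is shown below; the key names and message are illustrative and not drawn from the evidence. Only the holder of the recipient's private key can read the message, which is why any mandated second copy of that key would itself become a target for criminals.

    # Sketch only: end-to-end encryption with PyNaCl's SealedBox (pip install pynacl).
    from nacl.public import PrivateKey, SealedBox

    recipient_key = PrivateKey.generate()            # stays on the recipient's device
    sender_box = SealedBox(recipient_key.public_key)
    ciphertext = sender_box.encrypt(b"meet at six")  # anyone may encrypt to the public key

    # Only the matching private key can decrypt; a service relaying the
    # ciphertext cannot read it.
    plaintext = SealedBox(recipient_key).decrypt(ciphertext)
    assert plaintext == b"meet at six"

    # A "weakened" scheme keeps a second copy of recipient_key somewhere else;
    # whoever obtains that copy can decrypt every message in exactly the same way.

The design point is that the relaying service never holds a decryption key; weakening that property creates a store of keys that criminals, as well as law enforcement, can pursue.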

:22:24. > :22:31.We have one example of where a regulator has taken a very assertive

:22:32. > :22:38.approach to the systems at the centre of this: financial services. There

:22:39. > :22:41.are reasons, given the failures in that sphere, why the regulator has taken this approach. There is a

:22:42. > :22:45.broad spectrum across other regulators as to the extent to which

:22:46. > :22:50.they feel this is an issue they need to get involved with. Are there any

:22:51. > :22:54.particular gaps, or any regulator you can think of that should be looking

:22:55. > :23:04.at this more seriously than they are at the moment? We see today that all

:23:05. > :23:11.major regulators understand the challenge, but they are forming

:23:12. > :23:16.their own views. They are not doing it in isolation. Everyone recognises

:23:17. > :23:23.the threat; the challenge we have in certain sectors, if we take energy

:23:24. > :23:30.for example, is that the National Grid's costs come through to consumers, so there

:23:31. > :23:37.will be hard decisions. Do you want more cyber security? You will have

:23:38. > :23:43.to pass the cost on somewhere. Those will be some of the frictions

:23:44. > :23:49.we ran into as opposed to legislators being asleep at the

:23:50. > :23:54.wheel. Regulators focus on the companies that they regulate. Those

:23:55. > :23:57.companies have dependencies on other companies and providers and

:23:58. > :24:05.infrastructure. The attackers do not just attack the regulated company,

:24:06. > :24:09.they attack everything around it. Just focusing on the regulated

:24:10. > :24:16.companies is not going to drive results; it needs to be all businesses

:24:17. > :24:21.understanding their role in society. I wonder if I could move on to the

:24:22. > :24:28.question about the Cyber Security Centre, because we have covered the

:24:29. > :24:37.earlier question. The National Cyber Security Centre has been announced

:24:38. > :24:40.as the linchpin in the Government's engagement with the private sector.

:24:41. > :24:47.Is the Government being realistic about what can be achieved by what

:24:48. > :24:51.is at the present moment a small organisation, or does it need to be

:24:52. > :24:58.expanded to take into account all the dialogue that you think should

:24:59. > :25:02.be taking place? I think they would say the same things themselves, the

:25:03. > :25:06.Government has taken an innovative step in looking to allow

:25:07. > :25:12.innovation to drive policy and to work in partnership with the private

:25:13. > :25:24.sector and industry and the public sector organisations. This is

:25:25. > :25:27.absolutely the right step. Too much too early has always been a distinct

:25:28. > :25:32.risk, where they could be given too much with no return on

:25:33. > :25:35.investment. With the initiatives that they have, they are building the

:25:36. > :25:45.foundations of what will be a critical function

:25:46. > :25:53.within our country. It is a big aspiration but a very welcome one.

:25:54. > :25:58.We have relationships with the Cabinet Office and have an

:25:59. > :26:01.opportunity to engage with a part of Government that consolidates a lot

:26:02. > :26:05.of ideas. That makes for much more joined up thinking. That is well

:26:06. > :26:27.received. That is encouraging. Perhaps you could pick up the last

:26:28. > :26:36.question? Where have we got to? I am keen

:26:37. > :26:42.to ask number 14. When I introduced the first cyber security strategy,

:26:43. > :26:49.one area I hoped to make better was using the insurance industry and

:26:50. > :26:53.tying that in so there was a cost aspect to it. That does not seem to

:26:54. > :26:59.have happened as much as I hoped it would. Do you think it is a useful

:27:00. > :27:04.and important thing to do? When I was working with the banks and the

:27:05. > :27:12.stock exchanges, after the big hit on the New York Stock Exchange,

:27:13. > :27:25.we went for a double biometric for logon. Do you think that kind of

:27:26. > :27:31.login is a useful thing? On the cyber insurance, I would say the

:27:32. > :27:35.market needs to mature. We have done quite a lot of work with insurance

:27:36. > :27:43.companies and I have worked a lot with actuaries and the market is

:27:44. > :27:50.quite immature. The insurance companies are still working out how

:27:51. > :27:57.to set premiums. In the US, there is a certain area of risk that

:27:58. > :28:08.you can insure, where there are more regimented responses to cyber

:28:09. > :28:16.breaches. The challenge for insurance is whether the risk is insurable. On double

:28:17. > :28:24.biometrics, so the average citizen can log in... There is a huge reliance on

:28:25. > :28:31.passwords in services today and we see so many attacks exploiting weak

:28:32. > :28:36.people and processes around passwords. The world needs to move on beyond

:28:37. > :28:42.the use of passwords. Whether that is biometrics or one-use passwords,

:28:43. > :28:49.however it works out, stronger methods of authentication are required.
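As an illustration of the one-use passwords mentioned above, a minimal sketch of a time-based one-time password (TOTP), as standardised in RFC 6238, might look like the following in Python; it uses only the standard library, and the shared secret shown is purely illustrative, not a real credential.

    import base64
    import hashlib
    import hmac
    import struct
    import time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        """Derive the current time-based one-time password (RFC 6238)."""
        key = base64.b32decode(secret_b32, casefold=True)
        # The moving factor is the number of 30-second intervals since the Unix epoch.
        counter = struct.pack(">Q", int(time.time()) // interval)
        digest = hmac.new(key, counter, hashlib.sha1).digest()
        # Dynamic truncation: pick four bytes from the HMAC and reduce to N digits.
        offset = digest[-1] & 0x0F
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % (10 ** digits)).zfill(digits)

    # Illustrative shared secret held by both the user's device and the service.
    print(totp("JBSWY3DPEHPK3PXP"))  # a six-digit code valid for roughly 30 seconds

A service would verify a login by recomputing the same code from its own copy of the secret, typically allowing one interval either side for clock drift, so an intercepted code stops working within a minute or so.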

:28:50. > :28:53.I think there is synergy with the cyber insurance sector, however it

:28:54. > :28:58.is disappointing that we have not seen as much of the take-up as we might have

:28:59. > :29:02.imagined. You would imagine an insurer would care about cyber risk and

:29:03. > :29:08.want to mitigate against it. They would want to understand how to have

:29:09. > :29:14.a robust process to measure this before issuing a policy. That is not

:29:15. > :29:17.where the focus appears to be. The insurers are more interested in what

:29:18. > :29:22.happens after the breach and making sure that if an organisation has

:29:23. > :29:26.been compromised from a cyber security perspective, they have the

:29:27. > :29:32.forensic ability to go in and discover what has happened. If

:29:33. > :29:36.there was one thing to take away, if the insurance sector could work more

:29:37. > :29:42.closely with the cyber industry and encourage more organisations to look

:29:43. > :29:46.at vulnerabilities as part of their policy renewal or creation process,

:29:47. > :29:51.there would be a better chance of that happening.

:29:52. > :29:58.On cyber insurance we are also seeing things like the blast furnace attack in

:29:59. > :30:02.Germany, and whether or not events like these could stem from cyber attacks in the

:30:03. > :30:07.future; for those large policies, they are taking a serious

:30:08. > :30:12.look, and I agree the market is not massive at this time. In terms of

:30:13. > :30:15.your point about double biometrics, do remember that biometrics are not

:30:16. > :30:21.secret and they can be copied. We have done work around copying

:30:22. > :30:29.voices, fingerprints and faces, and there will always be ways around the solutions we

:30:30. > :30:32.employ. But it is difficult and that is why the stock exchanges and banks

:30:33. > :30:37.did that and then they sent them passwords. To give you a flavour of

:30:38. > :30:42.what is possible today, taking three high-resolution photos of somebody's

:30:43. > :30:46.face, we got a 3-D facemask that would satisfy the current capability

:30:47. > :30:52.in terms of facial biometrics. It only takes three photos, and for a

:30:53. > :30:58.pop star that has social media activity, we have to consider this.

:30:59. > :31:03.It solves the responsibility problem quite well because you know

:31:04. > :31:09.what the vulnerability is, but in many ways it causes more problems

:31:10. > :31:12.than it is worth because, with the government biometric system in the

:31:13. > :31:18.US, the worst thing that could happen is if you lose your card, and

:31:19. > :31:27.how long it takes to replace it. I was hoping for an unequivocal bid

:31:28. > :31:33.for an identity card. I am reminded forcefully of the old adage that

:31:34. > :31:39.whatever human ingenuity can devise, human ingenuity can find a way

:31:40. > :31:44.around. Lady Faulkner? It was question 17 and I will comment after

:31:45. > :31:48.Miss de Villiers. We are not doing this the way we normally do, but if

:31:49. > :31:52.there is something you particularly want to pick up on the way down

:31:53. > :31:57.to question 17, I will ask Miss de Villiers to ask that, and those will

:31:58. > :32:01.be the final questions. Is it a realistic goal to

:32:02. > :32:08.try and get international agreement on a kind of Geneva convention for

:32:09. > :32:11.cyberspace, to set norms for activities by states in cyberspace

:32:12. > :32:19.and what do you think should be in it? It's a realistic goal. But does

:32:20. > :32:22.it matter? That is the other question. Countries agree to things and they

:32:23. > :32:28.don't intend to follow along with them. Having some sort of shared

:32:29. > :32:33.standard of behaviour and a normative baseline is something we

:32:34. > :32:35.need to try for but that is tough given the differences and

:32:36. > :32:40.difficulties we have with Russia and China and how they operate in

:32:41. > :32:44.cyberspace, which is diametrically opposed to how we do it, so it will

:32:45. > :32:50.be a tough prospect and we see that from the UN GGE which has been

:32:51. > :32:54.fraught with complications and a lack of involvement with critical

:32:55. > :32:58.countries. There are countries today that have a private sector

:32:59. > :33:04.which, in the name of defence, have some quite aggressive tactics and

:33:05. > :33:08.they earn a lot of money for those countries. It's not intellectual

:33:09. > :33:10.property theft, it's intelligence collection happening in the private

:33:11. > :33:15.sector not at the behest of the government. Again, what is the

:33:16. > :33:20.commercial incentive for them to sign up to these norms in what is an

:33:21. > :33:25.immature market? What have we learned about warfare over the last

:33:26. > :33:30.500 or 600 years? We now have a sector that is barely 30 years old.

:33:31. > :33:33.You are sounding like a global politician and I would say there is

:33:34. > :33:36.no harm in striving for something, because often a bilateral

:33:37. > :33:41.agreement you form along the way can be valuable in itself and

:33:42. > :33:45.you might not achieve the end goal but you will certainly make progress

:33:46. > :33:48.through dialogue, and we've seen examples with the UK, the US and

:33:49. > :33:52.China where dialogue has made a difference. One caution I would have

:33:53. > :33:57.is when we sign up to these things. We have to think of the

:33:58. > :34:01.implications it has on the economic growth prospects of the UK. As a

:34:02. > :34:05.private sector firm trying to do what we are trying to do, which is

:34:06. > :34:11.not offensive by any stretch, we have been caught up somehow and had

:34:12. > :34:17.to seek export licences for our technology because of its potential

:34:18. > :34:24.dual use application, for example. Lady Faulkner? You have touched on

:34:25. > :34:28.China once or twice. Why do you think China is keen to sign up to

:34:29. > :34:34.some sort of international norms? Is it because it will help it control

:34:35. > :34:40.its own domestic agenda? Is it because it is concerned that its

:34:41. > :34:45.capabilities won't be able to keep up with the developments once the US

:34:46. > :34:51.and others start putting real money into it? What is China's motivation

:34:52. > :34:55.in that case? It is decidedly different to the

:34:56. > :34:58.Russian approach. If I was a betting person I would say that they are

:34:59. > :35:02.sensing a change in the international attitudes and that

:35:03. > :35:06.people are growing tired of their behaviour and they think there may

:35:07. > :35:09.be a ratcheting up, so getting that taken out at source by agreeing

:35:10. > :35:14.to something is probably a prudent step before it comes back to their

:35:15. > :35:18.parliament. China is as vulnerable as we are in cyberspace, as is

:35:19. > :35:23.Russia, so there is an interest in the idea of an agreement as to how we should

:35:24. > :35:27.behave. I don't really believe cyber conflict is the most important

:35:28. > :35:31.thing, despite it being the basis of my academic work. I think cyber

:35:32. > :35:36.aggression is more important. I'm worried about what states will do to

:35:39. > :35:42.individuals. China will be willing to sacrifice international

:35:40. > :35:42.conflict for the ability to repress their population and that is what we

:35:43. > :35:46.should really be worried about when that happens. And you think that is

:35:47. > :35:52.a motivation in China? I think it would be a trade-off where they can

:35:53. > :35:54.behave how they want to behave internally and restrict external

:35:55. > :36:00.behaviour, which they have been happy to do until now. I think the

:36:01. > :36:04.ability to verify adherence to an international agreement is very

:36:05. > :36:08.difficult in this space. In the nuclear world you can see nuclear

:36:09. > :36:14.tests, but you can't see cyber tests. We don't have buy-in from

:36:15. > :36:20.the east on the Tallinn Manual, and we need that if we want to

:36:21. > :36:24.have norms and expectations of war and constrained behaviour shown in

:36:25. > :36:29.that way, but that would only work if the other countries agree. Just

:36:30. > :36:34.in closing, we have to also think about spaces like space and the

:36:35. > :36:41.international seas; these have connected devices in them today, and what

:36:42. > :36:44.applies there? Thoughts of verification had crossed my mind.

:36:45. > :36:48.Thank you very much indeed and thank you for your patience. At least we

:36:49. > :36:52.were not disrupted by a second division in the Lords. But it is

:36:53. > :36:56.always a problem where we have hearings. As I say, thank you very

:36:57. > :37:03.much and I hope you have all found it very helpful and illuminating. I

:37:04. > :37:07.just wish to declare my interests. I might not be the only one. I'm a

:37:08. > :37:11.non-executive director of the Cyber Security Challenge and I chair

:37:12. > :37:14.National Trading Standards, which supports the National Trading

:37:15. > :37:19.Standards eCrime Unit, and I'm UK coordinator for the Electric

:37:20. > :37:25.Infrastructure Security Council. I have connections with three companies

:37:26. > :37:28.which work in the cyber field in one way or another and two institutions

:37:29. > :37:34.which are engaged in research on it, as I declared in my declaration of

:37:35. > :37:41.interests. Thank you very much. I edited a cyber newspaper as well. On

:37:42. > :37:43.that happy note, order, order.