Line | From | To | |
---|---|---|---|
Thank you. The Secretary of State for
Digital, Culture, Media and Sport. The | 0:00:23 | 0:00:35 | |
revelations involving a serious
privacy breach involving Facebook are | 0:00:35 | 0:00:40 | |
very worrying. It was reported that
a whistle-blower exposed the data | 0:00:40 | 0:00:50 | |
of 50 million people globally. In an
increasingly digital world, it's | 0:00:50 | 0:00:54 | |
important that people can have
confidence that their data is | 0:00:54 | 0:00:56 | |
protected. The regulator is already
investigating. The investigation is | 0:00:56 | 0:01:07 | |
considering how political parties
and campaigns, data analytics | 0:01:07 | 0:01:12 | |
companies, and social media
platforms in the UK have used | 0:01:12 | 0:01:19 | |
people's personal information to
target voters. She is looking at | 0:01:19 | 0:01:24 | |
whether Facebook data was acquired and
used illegally. | 0:01:24 | 0:01:29 | |
Notices were issued; it is
imperative that when an organisation | 0:01:29 | 0:01:39 | |
receives an information notice, it
must comply in full. We expect all | 0:01:39 | 0:01:45 | |
organisations involved to co-operate
with this investigation in whatever | 0:01:45 | 0:01:49 | |
way the information Commissioner
sees fit. I'm sure the House will | 0:01:49 | 0:01:52 | |
understand that there is only so far we
can go in discussing specific | 0:01:52 | 0:01:57 | |
details of specific cases. But Mr
Speaker, the appropriate use of data | 0:01:57 | 0:02:03 | |
is important for good
campaigning. It's as old as | 0:02:03 | 0:02:07 | |
democracy itself. Indeed we do it in
the House every day. But it's | 0:02:07 | 0:02:11 | |
important that the public is
comfortable with how information is | 0:02:11 | 0:02:15 | |
gathered, used and shared in
modern political campaigns. It's | 0:02:15 | 0:02:19 | |
important that the information
Commissioner has the enforcement | 0:02:19 | 0:02:22 | |
powers that she needs. The Bill
currently in committee will | 0:02:22 | 0:02:27 | |
strengthen legislation, and give her
tougher powers to make sure | 0:02:27 | 0:02:32 | |
organisations comply. The Bill gives
her powers to levy significant fines | 0:02:32 | 0:02:37 | |
for malpractice against
organisations that block the ICO's | 0:02:37 | 0:02:43 | |
investigation. This
will help strengthen the transparency and | 0:02:43 | 0:02:48 | |
security of data for people and
businesses across the country. | 0:02:48 | 0:02:53 | |
Because of the lessons learned
during the investigation, and the | 0:02:53 | 0:02:57 | |
difficulties the Commissioner has
found in getting appropriate | 0:02:57 | 0:03:05 | |
cooperation, she has requested
stronger enforcement powers. Some | 0:03:05 | 0:03:10 | |
are already in the Bill and she is
requesting further criminal | 0:03:10 | 0:03:14 | |
sanctions. She has made it clear
that in order to deal with complex | 0:03:14 | 0:03:20 | |
investigations like this, the power
to compel testimony from | 0:03:20 | 0:03:23 | |
individuals is now needed. We are
considering these new proposals and | 0:03:23 | 0:03:27 | |
I have no doubt the House will
consider them as the Bill passes | 0:03:27 | 0:03:30 | |
through. Data, properly used, has
massive value and social media is a | 0:03:30 | 0:03:35 | |
good thing. So we must not jump to
the wrong conclusion and block all | 0:03:35 | 0:03:45 | |
access. We need rules to ensure
clarity and fairness. This is what | 0:03:45 | 0:03:49 | |
the Bill will provide. After all,
strong data protection gives | 0:03:49 | 0:03:54 | |
citizens confidence and that is good
for everyone. Damian Collins. I thank | 0:03:54 | 0:04:01 | |
the Secretary of State for his
answer. Surveys were conducted with | 0:04:01 | 0:04:10 | |
Facebook users where he identified
200,000 people who answered his | 0:04:10 | 0:04:16 | |
surveys and from that he was able to
access data on other user profiles. | 0:04:16 | 0:04:26 | |
The information was then
sold. He gave evidence to the | 0:04:26 | 0:04:34 | |
committee that that data was used in
their campaigns. Facebook knew of this | 0:04:34 | 0:04:45 | |
data breach for two years but did
nothing against Cambridge | 0:04:45 | 0:04:48 | |
Analytica. They acted only when it was
clear that it was going to be | 0:04:48 | 0:04:57 | |
exposed. First, I ask: will someone
be contacting Cambridge University to | 0:04:57 | 0:05:07 | |
ask about what he was doing with
his team at Cambridge | 0:05:07 | 0:05:10 | |
University? There is a larger issue
here. That data is used for | 0:05:10 | 0:05:20 | |
political campaigns. No one ever
gave consent that it could be used | 0:05:20 | 0:05:24 | |
in political campaigns in this way.
A lot of people would be shocked | 0:05:24 | 0:05:27 | |
that their data is used in this way.
Not just by a political party but by | 0:05:27 | 0:05:35 | |
a data analytics consultancy.
Simply ticking a box on | 0:05:35 | 0:05:43 | |
Facebook doesn't sign away your
rights. Nowhere does it ask you to | 0:05:43 | 0:05:49 | |
sign away your rights, and it would not
be enforceable if the company tried | 0:05:49 | 0:05:53 | |
to do that. People's rights are still
protected. Many people are | 0:05:53 | 0:06:03 | |
concerned that shadow
companies are being used around | 0:06:03 | 0:06:06 | |
the world. People are raising concerns
not just about the way they are | 0:06:06 | 0:06:09 | |
using data, but the ethics and
leadership of the company in all | 0:06:09 | 0:06:12 | |
aspects of its life.
I think this incident | 0:06:12 | 0:06:21 | |
shows that someone in this country
needs to have the legal authority to | 0:06:21 | 0:06:27 | |
go behind the curtain of these
platforms, see how data are used and | 0:06:27 | 0:06:32 | |
to make sure that they comply with
data protection laws. We need to | 0:06:32 | 0:06:37 | |
be confident that they are being
enforced and conditions are met. | 0:06:37 | 0:06:45 | |
I am pleased that he has raised
that. I know that the committee will | 0:06:45 | 0:06:50 | |
take note of that as we go to the
report stage of the Bill. Thank you very | 0:06:50 | 0:06:56 | |
much, Mr Speaker. May I also pay
tribute to the work of the select | 0:06:56 | 0:07:04 | |
committee. They are doing an
incredibly important piece of work | 0:07:04 | 0:07:09 | |
because of the sensitivities of the
impact of this, in terms of its | 0:07:09 | 0:07:17 | |
political nature, the impact of
political campaigning, I think it is | 0:07:17 | 0:07:21 | |
excellent that a cross-party group
of MPs is leading this work. I pay | 0:07:21 | 0:07:28 | |
tribute to members on both sides. I
do remind them that they do have the | 0:07:28 | 0:07:32 | |
power of summons if people are not
giving them good evidence. I | 0:07:32 | 0:07:36 | |
will ensure that all the
considerations that he mentioned are | 0:07:36 | 0:07:40 | |
looked into. He raised the point
about consent not being just a tick | 0:07:40 | 0:07:44 | |
box. This is directly addressed in
the Data Protection Bill. But | 0:07:44 | 0:07:49 | |
currently, because of the nature of
the very old legislation, 1998 being old in digital | 0:07:49 | 0:07:54 | |
terms, companies can get away with
asking for a box to be ticked, and many | 0:07:54 | 0:08:01 | |
people do not read the small print.
The Data Protection Bill replaces | 0:08:01 | 0:08:06 | |
this tick-box approach with a
principles-based approach that I | 0:08:06 | 0:08:11 | |
think the whole House should
support. Finally, he asked about the | 0:08:11 | 0:08:16 | |
point about the powers of the
information Commissioner. He is | 0:08:16 | 0:08:20 | |
absolutely right that we've got to
ensure that with this piece of | 0:08:20 | 0:08:23 | |
legislation, and for the House right
now, we've got the power and the | 0:08:23 | 0:08:28 | |
right so that the Information
Commissioner can audit. That is in | 0:08:28 | 0:08:30 | |
the Bill already. The question is
whether it is strong enough. In | 0:08:30 | 0:08:37 | |
terms of what happens if people
choose not to comply with the audit. | 0:08:37 | 0:08:40 | |
At the moment there is a very
serious fine; the question is | 0:08:40 | 0:08:47 | |
whether the criminal penalties
should be strengthened, and it is | 0:08:47 | 0:08:51 | |
something that is being discussed.
May I join in paying tribute to the | 0:08:51 | 0:08:57 | |
committee, and also the Guardian
newspaper for pursuing this with | 0:08:57 | 0:09:03 | |
such utter relentlessness despite the
harassment it has received. If | 0:09:03 | 0:09:07 | |
these allegations are true, it would
be a direct indictment: the | 0:09:07 | 0:09:16 | |
Government has allowed the data giants
in this country to be both | 0:09:16 | 0:09:20 | |
careless and carefree in their
misuse of data. If true, 50 million | 0:09:20 | 0:09:25 | |
data records would have been misused,
not only in the way their rights were | 0:09:25 | 0:09:32 | |
breached, but also in the way that
referendum and election | 0:09:32 | 0:09:37 | |
outcomes could have been affected.
Can he confirm that he will bring | 0:09:37 | 0:09:41 | |
forward amendments for stronger
powers in the Data Protection Bill; if he | 0:09:41 | 0:09:48 | |
does, we will back him on this. But
could he also agree to our | 0:09:48 | 0:09:54 | |
amendments to set a deadline to
modernise the e-commerce rules that treat | 0:09:54 | 0:09:59 | |
these companies under laws that
date back to before they were even | 0:09:59 | 0:10:05 | |
born? Second, to make it possible to bring
class actions when data are breached | 0:10:05 | 0:10:12 | |
so that these class actions are
accessible to people and third, will | 0:10:12 | 0:10:16 | |
he ensure that he supports our
amendment to require disclosure of | 0:10:16 | 0:10:22 | |
funding for dark social ads which,
we know, can influence elections and, | 0:10:22 | 0:10:28 | |
indeed, referendums. The point he has
to consider is whether Cambridge | 0:10:28 | 0:10:35 | |
Analytica's directors remain proper
people to hold directorships. Will | 0:10:35 | 0:10:43 | |
this be investigated, with the full
weight of company... so that if these | 0:10:43 | 0:10:50 | |
people need to be struck off, they
are struck off for good? I will add | 0:10:50 | 0:10:58 | |
my praise to the Guardian
journalists who have done the work | 0:10:58 | 0:11:02 | |
that we saw published this weekend.
I agree with the right honourable | 0:11:02 | 0:11:07 | |
member on many of the questions he
raises. I think this is an area where | 0:11:07 | 0:11:12 | |
it is best to proceed with cross-party
consensus where we can. | 0:11:12 | 0:11:16 | |
I am not sure about the
dragging-our-feet argument, given | 0:11:16 | 0:11:20 | |
that the Data Protection Bill
is being brought forward by this | 0:11:20 | 0:11:25 | |
Government, and the GDPR
from the European Union was also | 0:11:25 | 0:11:31 | |
backed very strongly by this
Government. We are already taking | 0:11:31 | 0:11:34 | |
action in order to put right things
that needed to be | 0:11:34 | 0:11:42 | |
strengthened because of the
development of technology. He | 0:11:42 | 0:11:48 | |
mentions the
e-commerce directive. It isn't | 0:11:48 | 0:11:53 | |
really a question of updating the
e-commerce directive. It's a | 0:11:53 | 0:11:56 | |
question of what we would put in
its place. We will be leaving the | 0:11:56 | 0:12:00 | |
single digital market and we will
have an opportunity | 0:12:00 | 0:12:03 | |
to get that piece of
legislation right in the modern age, | 0:12:03 | 0:12:06 | |
to make sure we can support growth
and the use of modern technology. But | 0:12:06 | 0:12:12 | |
also do so in a way that... | 0:12:12 | 0:12:17 | |
He asked about the directorships of
the directors of Cambridge Analytica. | 0:12:17 | 0:12:23 | |
Of course, we will ensure that
people are operating within the law. | 0:12:23 | 0:12:29 | |
If there is a question there, that is
for a different department, but I am | 0:12:29 | 0:12:36 | |
happy to talk to my Ministers about it.
I'm sure my right | 0:12:36 | 0:12:46 | |
honourable friend will agree that
the news that has come through is | 0:12:46 | 0:12:51 | |
one that should cause us all very
great concern. The difficulty is | 0:12:51 | 0:12:56 | |
that it has been apparent for a long
time that the obtaining of data, the | 0:12:56 | 0:13:00 | |
use that can be made of it whether
for commercial or political | 0:13:00 | 0:13:04 | |
purposes, is a gold mine for those
who wish to breach the law and the | 0:13:04 | 0:13:12 | |
sanctions that can be visited on
those who do this are entirely | 0:13:12 | 0:13:17 | |
inadequate. I'm perfectly aware that
the Government is amending the | 0:13:17 | 0:13:21 | |
legislation but I have to say to my
right honourable friend, I don't | 0:13:21 | 0:13:24 | |
think the laws that we have at the moment
in terms of the penalties on those | 0:13:24 | 0:13:27 | |
who behave in this fashion are
anything like adequate. The | 0:13:27 | 0:13:32 | |
financial incentives are far too
great to breach the law. The penalties | 0:13:32 | 0:13:36 | |
are proportionately insufficient and
ultimately we will have to be much | 0:13:36 | 0:13:40 | |
tougher in enforcement if we are going to stop this
sort of behaviour. I have some | 0:13:40 | 0:13:45 | |
sympathy with the argument that he
makes. 4% of global turnover as a | 0:13:45 | 0:13:50 | |
fine is a very significant fine for
an organisation for whom data | 0:13:50 | 0:13:56 | |
processing is only part of a
broader business; where data | 0:13:56 | 0:14:01 | |
processing is the whole
business, it could be regarded as | 0:14:01 | 0:14:08 | |
less proportionate. Therefore, we
are considering the information | 0:14:08 | 0:14:13 | |
Commissioner's request. It isn't just
about the 4% of global turnover. There | 0:14:13 | 0:14:23 | |
is already a criminal offence in clause
145 of the Bill, which carries the | 0:14:23 | 0:14:29 | |
highest possible fines and criminal
records for providing false | 0:14:29 | 0:14:33 | |
information. So there are already
stronger sanctions for specific | 0:14:33 | 0:14:37 | |
actions, but the point that he makes
is one the information Commissioner | 0:14:37 | 0:14:44 | |
herself has made and therefore one
that I think is worth listening to. | 0:14:44 | 0:14:50 | |
I think, like most people across the
House, I was shocked to read the | 0:14:50 | 0:14:55 | |
revelations in the Observer. This
story is more evidence that the | 0:14:55 | 0:14:58 | |
advertising market is growing
exponentially and becoming more and | 0:14:58 | 0:15:01 | |
more difficult to police. Russian
authorities putting ads... If it is | 0:15:01 | 0:15:12 | |
left unregulated, the
market will keep going in this | 0:15:12 | 0:15:15 | |
direction. What plans does the Government
have to take the required action? And | 0:15:15 | 0:15:21 | |
of course, Facebook should be
brought back to the select committee | 0:15:21 | 0:15:27 | |
to explain previous evidence which
is alleged to be possibly false. Lastly, | 0:15:27 | 0:15:39 | |
there have been reports that the
Conservative Party has been in talks | 0:15:39 | 0:15:42 | |
with them for some time. How long has it
been in talks, and did the party | 0:15:42 | 0:15:47 | |
know about its deals with Facebook?
Does the Government plan to hold an | 0:15:47 | 0:15:52 | |
inquiry? If they do not plan to do so, is
he worried about a conflict of | 0:15:52 | 0:15:56 | |
interest, given the Conservative
Party plans to use them for their | 0:15:56 | 0:15:59 | |
own benefit? I will answer the first
part of his question: I broadly | 0:15:59 | 0:16:08 | |
agree with him that this is a
serious and worrying incident. We need | 0:16:08 | 0:16:13 | |
to make sure the Bill that is
currently before the House puts in | 0:16:13 | 0:16:17 | |
place the powers to make sure we've got
the enforcement powers behind the | 0:16:17 | 0:16:23 | |
required ability to audit, which the
Commissioner will get from this | 0:16:23 | 0:16:31 | |
Bill. With respect to the
question about the Conservative | 0:16:31 | 0:16:33 | |
Party, as far as I understand, they
have had no such dealings with Cambridge | 0:16:33 | 0:16:37 | |
Analytica and therefore no
conflict arises. | 0:16:37 | 0:16:44 | |
I have been the victim of
false news stories, being micro-targeted | 0:16:44 | 0:16:50 | |
through Facebook accounts in
my constituency to deliberately | 0:16:50 | 0:16:54 | |
undermine me and cause hate. Can I
thank the Minister for prioritising | 0:16:54 | 0:17:00 | |
the Data Protection Bill, delivering
the data protection regulations, and | 0:17:00 | 0:17:01 | |
making sure the law is clear and
enforceable? How does he intend to | 0:17:01 | 0:17:06 | |
work with governments and other
countries to make sure there is no | 0:17:06 | 0:17:15 | |
wild West for abuse when it comes
to personal data? I have said that | 0:17:15 | 0:17:25 | |
the wild West in terms of the
digital companies who flout rules | 0:17:25 | 0:17:33 | |
and think the best thing to do is to
move fast and break things without | 0:17:33 | 0:17:37 | |
respect for the impact it has on
democracy and society, that wild | 0:17:37 | 0:17:40 | |
West is over. The data protection
bill as part of a suite of actions | 0:17:40 | 0:17:46 | |
that we are taking in order to make
sure that we have the freedoms that | 0:17:46 | 0:17:50 | |
we cherish online, but not the
freedom to harm others. There are | 0:17:50 | 0:17:55 | |
many different areas in which this
impacts, brought together under our | 0:17:55 | 0:18:00 | |
digital charter and getting the
rules right in this space is a very | 0:18:00 | 0:18:04 | |
important part of that response.
Also I want to pay tribute to the | 0:18:04 | 0:18:11 | |
work of the select committee and to
the Guardian newspaper as well. | 0:18:11 | 0:18:16 | |
Doctor Kogan was able to present
information to Cambridge | 0:18:16 | 0:18:20 | |
Analytica. The Minister may also be
aware that Doctor Kogan also had a | 0:18:20 | 0:18:25 | |
teaching post and grants for social
media research from a Russian | 0:18:25 | 0:18:29 | |
university. And that | 0:18:29 | 0:18:32 | |
Cambridge Analytica did research for
a firm that is currently on the US | 0:18:32 | 0:18:38 | |
sanctioned list. Has the Secretary
of State himself investigated the | 0:18:38 | 0:18:41 | |
veracity of these reports? And has
either he or a Home Office | 0:18:41 | 0:18:45 | |
Minister been in touch directly with
Facebook to ask what | 0:18:45 | 0:18:49 | |
data breach might have taken place
and to ask it to investigate? And if | 0:18:49 | 0:18:54 | |
they will not provide that
information, does he agree with the | 0:18:54 | 0:18:59 | |
call that powers should be
taken to ensure that we can get that | 0:18:59 | 0:19:02 | |
information?
We have had contact with | 0:19:02 | 0:19:08 | |
Facebook, and this is at a very early
stage in terms of the specific | 0:19:08 | 0:19:13 | |
allegations made over the weekend,
but this is part of a longer | 0:19:13 | 0:19:17 | |
dialogue about making sure Facebook
treats these problems with the | 0:19:17 | 0:19:24 | |
seriousness that they deserve. The
focus today is on Facebook, but in | 0:19:24 | 0:19:30 | |
the autumn I came to the House
to talk about Uber's attitude to | 0:19:30 | 0:19:33 | |
data breaches and I don't want to
have to come to the house to talk | 0:19:33 | 0:19:42 | |
about data breaches again and again
from other big data companies. That | 0:19:42 | 0:19:45 | |
is why we need the law updated and
to get in place as soon as possible. | 0:19:45 | 0:19:52 | |
Mr Speaker, if evidence emerges that
any organisation misused people's data | 0:19:52 | 0:19:57 | |
to interfere in a UK election or
referendum, whether through the Information | 0:19:57 | 0:20:00 | |
Commissioner's work, the select committee's
work, the Guardian's work, or | 0:20:00 | 0:20:04 | |
whatever, will the Minister
guarantee that a full public inquiry | 0:20:04 | 0:20:09 | |
will be established to find out what
happened and what the implications | 0:20:09 | 0:20:11 | |
were? There is no evidence yet of a
successful interference in a UK | 0:20:11 | 0:20:19 | |
election or referendum. We remain
vigilant. Given the important role | 0:20:19 | 0:20:34 | |
that Cambridge Analytica did play in
the EU referendum, given the links | 0:20:34 | 0:20:38 | |
made both by the Observer's
fantastic journalists and others, | 0:20:38 | 0:20:43 | |
with the Kremlin's wider campaign of
undermining and interfering in our | 0:20:43 | 0:20:48 | |
democracy and our politics, will he
assure the House that all the | 0:20:48 | 0:20:52 | |
inquiries and investigations that we
have been discussing here today are | 0:20:52 | 0:20:55 | |
getting the full cooperation and
support of the British intelligence | 0:20:55 | 0:20:58 | |
and security services in their work?
Yes. Thank you, Mr Speaker. The | 0:20:58 | 0:21:05 | |
Secretary of State will know I have
long called for a comprehensive | 0:21:05 | 0:21:09 | |
forward-looking review of data,
so that our citizens can | 0:21:09 | 0:21:14 | |
have the data rights they need and deserve. The Data
Protection Bill does not do that; it | 0:21:14 | 0:21:18 | |
doesn't define property rights and
data, it doesn't consider market | 0:21:18 | 0:21:22 | |
power in data or algorithm abuse.
Facebook is on the wrong side of | 0:21:22 | 0:21:26 | |
history when it comes to this and
its share price is crashing as a | 0:21:26 | 0:21:28 | |
result. Will he take action or will
he go down as the last dinosaur | 0:21:28 | 0:21:37 | |
in an age of data ethics? | 0:21:37 | 0:21:43 | |
There are few governments doing
more to get the rules right in this | 0:21:43 | 0:21:47 | |
space. The data protection bill is a
full suite of data protection | 0:21:47 | 0:21:54 | |
legislation, yes, including the GDPR
from European law, but more broadly | 0:21:54 | 0:21:58 | |
than that, in order to give people
the power over their own data and | 0:21:58 | 0:22:04 | |
consent over how it is used. I
would recommend that she read the | 0:22:04 | 0:22:10 | |
Bill and get on board, and if she
has specific improvements, as we | 0:22:10 | 0:22:16 | |
have throughout the passage of the
Bill, we are willing to consider | 0:22:16 | 0:22:18 | |
and listen to them, like the
proposals made by the information | 0:22:18 | 0:22:23 | |
Commissioner. We want to make sure
we get this legislation right. In | 0:22:23 | 0:22:31 | |
the years before the 2008 crash, we
were told that people who were | 0:22:31 | 0:22:36 | |
running the City of London were the
masters of the universe, and we are | 0:22:36 | 0:22:40 | |
now seeing the same sort of arrogance
from companies like Facebook. | 0:22:40 | 0:22:45 | |
The
way they are using data, and all the | 0:22:45 | 0:22:48 | |
research into how to use it, is
completely unregulated. Other areas | 0:22:48 | 0:22:52 | |
of research that impact people's
lives are highly regulated. I don't | 0:22:52 | 0:22:56 | |
think the data protection bill goes
far enough in protecting people's | 0:22:56 | 0:23:01 | |
data and the research that goes to
manipulating it. I ask the honourable | 0:23:01 | 0:23:07 | |
gentleman to imagine that at the end of his
speech there was a | 0:23:07 | 0:23:11 | |
question mark. LAUGHTER I agree with
the premise of the question, weak as | 0:23:11 | 0:23:20 | |
it is. I do agree with him that
the attitude of the social media | 0:23:20 | 0:23:26 | |
giants has been that a government
should get out of the way because | 0:23:26 | 0:23:31 | |
they were doing things differently and
better. It may be a good thing for | 0:23:31 | 0:23:36 | |
95% of us that we are connected better
and can use social media in all the | 0:23:36 | 0:23:40 | |
ways that many people in this House
do, but there are serious risks and | 0:23:40 | 0:23:45 | |
downsides as well that need to be
addressed properly and appropriately, | 0:23:45 | 0:23:48 | |
and they are best addressed through
legislation where necessary. The | 0:23:48 | 0:23:57 | |
parallel is something that I think
is telling. This is not simply a matter of | 0:23:57 | 0:24:05 | |
Cambridge Analytica using data
handed over by a social media | 0:24:05 | 0:24:07 | |
provider; it is a matter of Facebook
behaving as though its users were | 0:24:07 | 0:24:10 | |
raw material to be exploited. | 0:24:10 | 0:24:16 | |
Is it not now time to get providers to | 0:24:16 | 0:24:22 | |
adhere to a code of conduct? Indeed.
A compulsory code of conduct in some | 0:24:22 | 0:24:29 | |
areas is in the Bill, especially
with respect to the treatment | 0:24:29 | 0:24:33 | |
of children. We have a statutory
code of conduct and the Digital | 0:24:33 | 0:24:40 | |
Economy Act that was passed last
year. But this whole area is one | 0:24:40 | 0:24:44 | |
where we have to make sure that the
liberal values we support, the | 0:24:44 | 0:24:50 | |
freedom but not the freedom to harm
others, the liberal values that we | 0:24:50 | 0:24:54 | |
apply through legislation to many other
parts of our lives, are brought to | 0:24:54 | 0:25:02 | |
bear on the online world as well.
That is what I mean when I say the | 0:25:02 | 0:25:04 | |
wild West is over. On the Bill
committee for the Bill last week, the | 0:25:04 | 0:25:12 | |
Government rejected opposition
amendments to give effect to the | 0:25:12 | 0:25:14 | |
European provision that consumer groups can
bring class actions on behalf of | 0:25:14 | 0:25:20 | |
consumers who have been subject to
a data breach. The Government | 0:25:20 | 0:25:23 | |
ignored this and put it in on
a consent basis. | 0:25:23 | 0:25:27 | |
Given a revelation from Cambridge
Analytica and the fact that none of | 0:25:27 | 0:25:30 | |
us knows what data were included in
the 50 million Facebook profiles, | 0:25:30 | 0:25:33 | |
with the Government reconsider its
position and moved to an opt out | 0:25:33 | 0:25:36 | |
basis in line with European law? The
European Union allows opt-in or | 0:25:36 | 0:25:42 | |
opt-out. This whole Bill is about
strengthening people's consent, and | 0:25:42 | 0:25:48 | |
saying that a legal action can be taken
forward on your behalf without | 0:25:48 | 0:25:54 | |
your consent unless you opt out. That is | 0:25:54 | 0:25:59 | |
against the spirit of the rest of
the Bill. Nevertheless, we have | 0:25:59 | 0:26:02 | |
listened to the debate in the Other
Place and in here, and we have said | 0:26:02 | 0:26:07 | |
that within 20 months of the Bill
entering into force, we will review how | 0:26:07 | 0:26:11 | |
the opt in system is working because
we want this to be based on the | 0:26:11 | 0:26:14 | |
evidence. | 0:26:14 | 0:26:24 | |
citizens a perfect storm is putting
our democracy in peril, and he urged | 0:26:24 | 0:26:28 | |
urgent political steps. Will
the Secretary of State now make this | 0:26:28 | 0:26:35 | |
a priority? The question he
asks is a priority that we are | 0:26:35 | 0:26:39 | |
considering and will answer.
Andy Wigmore, who was director of | 0:26:39 | 0:26:47 | |
communications for Leave.EU,
has described Cambridge | 0:26:47 | 0:26:59 | |
Analytica as our biggest
weapon in the campaign. Not a penny | 0:26:59 | 0:27:09 | |
of the donations was reported to
the commission. An investigation was | 0:27:09 | 0:27:14 | |
launched, and I'm pleased. But
if it turns out that there was a | 0:27:14 | 0:27:20 | |
flagrant breach of our electoral
rules, would it not put a major question | 0:27:20 | 0:27:25 | |
mark over the referendum result?
We haven't seen any evidence | 0:27:25 | 0:27:33 | |
of anything that calls into
question the impact on the electoral | 0:27:33 | 0:27:39 | |
process, whether it is an election or
referendum. We will let the | 0:27:39 | 0:27:42 | |
investigation take its course. The
difficulty, Mr Speaker, is that | 0:27:42 | 0:27:49 | |
Facebook holds all the evidence and
we can't have access to it. We know | 0:27:49 | 0:27:53 | |
that Facebook approached probably
everyone in this House before the | 0:27:53 | 0:28:00 | |
last general election saying that it
wanted to help us win our seats. | 0:28:00 | 0:28:06 | |
Will he join with those of us who
are very concerned about this issue | 0:28:06 | 0:28:10 | |
to ask Facebook to come clean about
all the information that they have, | 0:28:10 | 0:28:16 | |
where they got it from and how they
used it? There are increased powers | 0:28:16 | 0:28:23 | |
of transparency in the Bill. Most
importantly, the Bill has in it the | 0:28:23 | 0:28:29 | |
power for the information
Commissioner to audit. And | 0:28:29 | 0:28:32 | |
therefore, to be able to demand
information in order to be able to | 0:28:32 | 0:28:37 | |
undertake an investigation like
that. And making sure that this bill | 0:28:37 | 0:28:40 | |
gets onto the statute book is the
single best way that we can make | 0:28:40 | 0:28:47 | |
progress in stopping flagrant
breaches in the future. People | 0:28:47 | 0:28:53 | |
across the nations using Facebook
will be feeling betrayed by these | 0:28:53 | 0:28:57 | |
revelations. They will feel that an
investigation must be held and | 0:28:57 | 0:29:00 | |
lawbreakers brought to account. But
given the Minister's assurance on | 0:29:00 | 0:29:05 | |
Tory party involvement, will he
guarantee that everyone involved | 0:29:05 | 0:29:11 | |
with Cambridge Analytica will be
investigated transparently? My | 0:29:11 | 0:29:18 | |
instinct is absolutely yes but of
course that is a matter for the | 0:29:18 | 0:29:21 | |
Information Commissioner, and
rightly so, because she's independent | 0:29:21 | 0:29:26 | |
of political parties. Of course, the
final answer is hers but you can see | 0:29:26 | 0:29:33 | |
where my instincts lie. It's becoming
clear we have had a lot of | 0:29:33 | 0:29:42 | |
half-truths and mistruths, the
impact on elections and referenda in | 0:29:42 | 0:29:53 | |
this country is becoming clear, we
can't prove it yet but it is | 0:29:53 | 0:29:56 | |
becoming clear, the data companies
are not giving us evidence on what | 0:29:56 | 0:30:00 | |
they do with the information and are
not coming clean on how they use it. | 0:30:00 | 0:30:06 | |
What is the Secretary of State doing to
ensure that British people have | 0:30:06 | 0:30:10 | |
confidence that their information is
being used within the law, and that | 0:30:10 | 0:30:14 | |
our elections are absolutely fair
and transparent and well reported? I | 0:30:14 | 0:30:21 | |
agree with her very strongly on the
premise of her question. The first | 0:30:21 | 0:30:24 | |
thing we will do is listen very
carefully to the report of the | 0:30:24 | 0:30:29 | |
select committee, which is doing
excellent work in this area, and we | 0:30:29 | 0:30:35 | |
insist that all companies comply
properly with the select committee, | 0:30:35 | 0:30:39 | |
and I think the select committee has
got plenty more work to do as we | 0:30:39 | 0:30:44 | |
have just discovered. And we will
not rest until we put this right | 0:30:44 | 0:30:49 | |
because frankly, the quality of the
liberal democracy that we live in | 0:30:49 | 0:30:58 | |
depends on having a high-quality
political discourse, and that means | 0:30:58 | 0:30:59 | |
making sure that online, as
well as offline, we have robust | 0:30:59 | 0:31:06 | |
exchanges but exchanges that are
based on objective truth and | 0:31:06 | 0:31:09 | |
reasonableness. Given these allegations
involving Facebook, uncovered by | 0:31:09 | 0:31:16 | |
brave journalists, can the Minister
assure the House that he will be | 0:31:16 | 0:31:25 | |
calling all major companies to his
office asking if they have been | 0:31:25 | 0:31:28 | |
involved in any of those breaches
and if they will come clean, so that | 0:31:28 | 0:31:31 | |
we can be confident in how data is
being protected. I think that would | 0:31:31 | 0:31:36 | |
be very sensible indeed. The
Minister spoke of our important | 0:31:36 | 0:31:44 | |
democratic, liberal values. Could he
say whether he is aware of Facebook's | 0:31:44 | 0:31:50 | |
attempts to block broadcast of the
exposé... Of all the different | 0:31:50 | 0:32:03 | |
things that have surprised me and indeed
shocked me in these revelations, the | 0:32:03 | 0:32:08 | |
decision by Facebook to take down
the Facebook account and also the | 0:32:08 | 0:32:14 | |
removal of the WhatsApp account and
Instagram account of the | 0:32:14 | 0:32:18 | |
whistle-blower, was the most
surprising. I | 0:32:18 | 0:32:27 | |
thought it was outrageous. I thought
it was outrageous and I will tell | 0:32:27 | 0:32:33 | |
you why. Because Facebook has got
some serious questions to answer | 0:32:33 | 0:32:36 | |
here. They can tell their side of
the story but have some serious | 0:32:36 | 0:32:40 | |
questions to answer. And they answer by
blocking an account, when at the | 0:32:40 | 0:32:44 | |
same time, we know in this House, they
don't act fast enough to | 0:32:44 | 0:32:52 | |
block some obviously
outrageous accounts. Well, this | 0:32:52 | 0:32:56 | |
showed us that when they want to,
they can block an account | 0:32:56 | 0:32:59 |