28/11/2017 Newsnight



In-depth investigation and analysis of the stories behind the day's headlines with Evan Davis.



Transcript



16 years after this 17-year-old private died from gunshot wounds at Deepcut barracks, there's to be a fresh inquest into what really happened. It sometimes seems as though the authorities want everyone to move along and think about something else - his parents tell us why they never gave up looking for answers.

I saw my son on a slab and he's 17 years old. I promised him then that I would find out the truth. I don't think we have found the truth yet. And I still owe him that promise.

We'll ask a leading human rights lawyer whether parents should have to fight so hard to get the investigations they need.

Also tonight, you might have known your phone was addictive, but did you know it is properly addictive... and deliberately so.

It's part of my body now. It is always with me.

I don't know. I don't want to say it is an addiction, but I just need my phone!

It is not sensational to say our brains are being hacked, because that is pretty much what is happening.

And Stella McCartney tells us why the fashion industry needs to be more sustainable.

This can be fashionable, this can be sexy, this can be young.

Hello. We start tonight with the speculation this evening of a settlement in the fraught talks over Britain's Brexit divorce bill. We might have an answer to the fifty billion euro question - how much do we pay to leave. And the answer seems to be a number near to fifty billion euros. The papers are full of somewhat varying figures as to what Britain is agreeing to - but that's the order of magnitude. Certainly, the general pattern of these talks - that we say no and then we say yes, while they say very little - appears to have repeated itself again. But our political editor Nick Watt has been around Westminster finding out how this is all going down, and he's with me now.

Can we put a figure on it?

We can confirm that the UK has agreed to the EU framework for settling that financial divorce bill with the EU when we leave. Stage one came in the Prime Minister's speech in Florence, when she said that the UK would agree to cover its share of the EU budget up to the end of 2020. The final stage has come just now, which is that the UK has agreed that it will meet its liabilities racked up as a member of the European Union. Where you go from there is a matter of dispute. The sources I have been talking to say that they are not putting a number on the table. Of the Financial Times and the Daily Telegraph, which broke this story, the Daily Telegraph is talking about a figure of up to 55 billion euros, and the Financial Times is talking about 100 billion euros but thinks it could be massaged down to half of that. The UK Government do not want the figure at the table at the next summit. They never want a figure on the table. What they want to be able to do is offset that against our net payment and pay these liabilities as they fall due, in perhaps as long as decades to come. They have an ally in Michel Barnier. His view is that methodology is more important than a number.

The papers are going on it. The Financial Times here. The Times. There is the Daily Telegraph. The Guardian. What does this tell us about Theresa May and her whole approach to these negotiations?

Theresa May wants a deal at the end of this process. The idea is to have it by October next year and then you can finalise it in Parliament. She feels very strongly that you have got to move on to the second stage, the final trade talks and the transition talks. You have to do that at the summit next month, and to do that you have got to come to some sort of agreement in the next week. There are some senior Remain Tories who say that the Prime Minister is talking a tough game, and then she is quietly sort of doing this deal in Brussels. I think what ministers would say is: we are making progress, we made this big movement, but on the money there will still be a line-by-line analysis and, crucially, they are not there yet on two big issues. One is the role of the European Court of Justice in the future relationship with EU citizens of this country, and also the Irish border.

Theresa May has gone to the Middle East. She will not be answering questions about it tomorrow.

It will be Damian Green standing in for the Prime Minister at Prime Minister's Questions. There is a report going on into whether he behaved appropriately with a young woman journalist who he had a drink with, and questions about alleged pornography on his computer. The interesting thing is, he is a passionate Remainer who is delivering Brexit, but when he was asked about it and how he would vote in another referendum, he said he would vote to Remain.

It is only a small Surrey village near Camberley, but the name Deepcut has sadly become synonymous with a series of deaths at the barracks there. Four young trainees died of gunshot wounds over a number of years in the late '90s and early 2000s. The strange but similar circumstances made the families deeply sceptical of initial suggestions of death by suicide. After perseverance by those families, subsequent reviews exposed a culture of bullying and harassment at the barracks, and found fault in the army's treatment of trainees. Questions were raised about the investigations into the deaths. And today one family won a High Court action to obtain a fresh inquest into the death of their son. Private Geoff Gray was only 17 when he died, 16 years ago. Should it really have taken so long for the relatives to get... His parents spoke to us today.

I saw my son on a slab and he was 17 years old. I promised him then I'd find out the truth. I don't think we've found the truth yet. And I still owe him that promise.

Private Geoff Gray died in 2001 at the Deepcut army barracks in Surrey. He was found shot twice in the head with a rifle. The Army ruled that his death was a suicide.

The week before he died he had phoned us and he told us that somebody had taken their life in the barracks. I think he had taken some tablets and he had died. And he said, that's a coward's way out. And I thought, you know, he's talking about suicide being the coward's way out. So I don't think he did. It's not in his nature.

In 2002 an inquest returned an open verdict, and today the High Court ordered that it was necessary or desirable in the interests of justice for a fresh inquest to be held. Private Gray's family believe the truth of his death at the Deepcut barracks has not yet been uncovered.

Looking at the fact that he was shot twice in the head, you always have to look at the fact that he may have been murdered. Once you've put one bullet in, your body will drop. You know, to be really graphic about this, the back of your head disappears. So your body will drop. The rifle will rise. You can't do it twice.

Private Gray's death at Deepcut sadly was not unique. Private Sean Benton died of gunshot wounds in 1995. A fresh inquest into his case will be heard next year. Private Cheryl James was found to have shot herself in the same year. There was a fresh inquest into her case last June. And Private James Collinson was found dead with a single gunshot wound in 2002.

The Army came to my door and said your son has killed himself.

Families have been fighting for years to learn the truth about the Deepcut deaths. A heavy burden to add to their bereavement.

We don't have a choice, there is no choice in that. You have to carry on. I have another son at home. I have to carry on for his sake.

Life goes on. It's sort of, we do this and then we have to carry on with family life as well. You know. And keep on going. And just try to get the truth of what actually happened to Geoff.

It is also difficult to get the expertise required to question the police and Ministry of Defence.

Bereaved families in inquests should have legal aid whenever the state is represented. So for instance if you go into an inquest now and the police are there, the Ministry of Defence are there, the Fire Services are there, the NHS are there, they will be quite properly represented by taxpayers' money. Now if that is the case, why should bereaved families sit in court and simply face that bank of lawyers against them, with not a single lawyer being funded for them? It is outrageous.

The families though will press on in the hope that one day, hopefully soon, they get a more convincing set of answers.

Geoff signed up to serve his country. He died when he was 17 years old. His country should serve him now and we should find the truth.

Let us look at some of the general issues raised by the difficulties parents have in getting inquests. Joining me now from Salford is Pete Weatherby QC - a specialist in public inquiries and inquests, including Hillsborough, Grenfell and a whole raft of others. Good evening. We have had a string of cases going back decades - blood contamination, Hillsborough, Orgreave - cases where families of those involved in some ghastly tragedy feel that justice has not been done or questions have been left unanswered. Do you see commonalities?

Absolutely. These families have got to go through the whole process yet again to try and get at the truth - not just one family, it is not a coincidence, three of them as you just reported. You mentioned blood contamination; I could add to that the Birmingham pub bombings, the new inquests 45 years on. In my view there are two common problems. The first is a duty of candour - there needs to be a legal duty of candour. What is a common part of all of these cases, and apparently the Deepcut ones as well, is that there is new evidence coming forward. Why is that coming forward so long afterwards? What went wrong with the original investigations? The original enquiries? It appears, given all of these historic cases - take the blood contamination case, go back to the early 1980s, when the evidence seems to show that the Department of Health knew that the blood was contaminated yet was still supplying people unknowingly, who then went on to die from either hepatitis or HIV. In these cases there is a culture of denial, as there would be in any authority when they are investigated.

There shouldn't be.

There is a culture of denial, institutional defensiveness, where they reach for the denial first. If you look at Grenfell, bringing us right up to date, immediately after the fire you had the council saying that they had done nothing wrong, the emergency response, some of the contractors issuing condolences, but at the end of them they were saying, by the way, we did nothing wrong either - people making the denials before they have even looked at their own behaviour.

The other feature of course is that - we heard it in the piece there - everybody is very well legally represented apart from the families of the victims.

Exactly. In any of these disasters or tragedies, that is exactly right. The Army, the police, the local authorities, the NHS, whatever it is - all entirely properly fully represented, but the victims are not. If you look at the Birmingham pub bombings, the families after 45 years were spending most of their time trying to get funding so that they could go to the inquests of their loved ones.

Is there a point at which sometimes it is right to say it is time to close this issue, we are not going to have an enquiry? Maybe families are clinging onto unrealistic hopes of what might emerge, or maybe the authorities genuinely know that there is nothing to be said. Is there ever a defence of saying, it is time up?

I would look at it from the other end of the telescope. There should be an enquiry very quickly and it should be done properly, and you can do that if there is a legal duty of candour, which makes a public authority - or the public-facing private entity, in these days of privatisation - if there is a legal duty on them to be proactive in coming out with the truth and owning up to their own shortcomings.

Owning up to their own shortcomings. This is the proposed law, the Hillsborough law. And it has quite a bit of cross-party support.

It has had its first reading, it has strong support and will go back soon, I hope, to the House of Commons, and it requires candour. So hopefully we will not have these repeat inquests 15 years on. But people proactively will have to tell the truth, and it will put victims on a level playing field in terms of representation - linking public funding to victims in disasters and unnatural-death situations.

Thank you very much.

Something is going badly wrong with the way we use clothes, according to a major study being launched this evening. In the last 15 years, across the world, the average number of times a garment is worn before it ceases to be used has gone down by more than a third. In China, clothes are worn 70% fewer times than they were in 2002. At one level, this just tells us that as we get richer, we like to have more new things. And fashionable ones at that. But the report suggests this could potentially take us to environmentally catastrophic outcomes.

Here's the dilemma. Fashion is an industry that's grown in appeal over the decades. But by its nature, it promotes obsolescence. Fashion today implies out of fashion tomorrow. Who wants to wear that? The result is what today's report calls the "take, make, dispose" model. The clothing industry takes non-renewable resources, makes clothes and textiles with them, that are then disposed of. That disposal reflects massive under-recycling. And disposal itself costs tens of millions in Britain alone. The fact is, we've become rather good at this model. Going into clothing production is a way of taking a poor country towards middle-income status. And clothes for consumers are extraordinarily cheap as a result. This is the graphic in the report that says it all. As the world becomes richer, the number of garments being produced has doubled. But the number of times each garment is worn has plummeted. The solutions the report suggests include radically improving recycling, making durability more attractive, and even promoting more clothing rentals. What the report does not suggest is simply taxing clothes to make them more expensive. That would hurt the poor. And being able to choose clothes and to use them as a form of self-expression almost defines what it is to be a modern consumer in an affluent society.

Now, the report was produced by the Ellen MacArthur Foundation - she was the round-the-world yachtswoman who now promotes the idea that the economy should be circular, with goods recycled round and round, not flowing straight into landfill. Contributors to the report included McKinsey the consultants, and many clothing businesses, including Stella McCartney's. Well, just before the official launch earlier this evening, I went to the Fashion Gallery at the Victoria and Albert Museum to talk to Ellen MacArthur and Stella McCartney herself. First I asked Ellen if we should blame the waste of textile materials on the very nature of fashion.

I think the disposable nature of fashion is one of the challenges; the other challenge is to try to make that fashion that changes by definition fit within a system, and that is what the report is about - building a broader system within which the design, the materials used, when they come out of the far end as fast fashion, that material can be something that is technical or is biodegradable.

If we did this right and we were more eco-friendly in the way we dress, you might end up out of a job, because you design one piece of clothing and instead of us changing it we would wear it for 20 years and not need as many new clothes.

I do not worry about that. My business model is based on sustainability and I have a successful business. This report looks at working together at all levels of the industry and creating new business from it, and looking at essentially the waste and finding a way of reusing it and making it exciting - not looking at it as a problem all the time but an opportunity.

It is a $500 billion opportunity if we can get this right and recover that material. $100 billion is not recycled every year, and that is value to the industry.

Can you persuade people that durability is attractive?

If we are burning one truckload of clothing every second, or using it as landfill, there is nothing attractive about that. We all live on the planet together and we have to survive on it. It is not a quick fix, but I think today we are bringing awareness to it and a different approach.

It is rarely seen here, but some companies have 5 million subscribers who have access to whatever clothing they want. When they get a new piece of clothing, the old one goes back. It is effectively rental, but the clothing goes back into the system. Now, if it has durability, that will go out to someone else. When you build a system where clothing goes back, they know what it is made from.

It is also a new way to look at the fashion industry; all industries must review their impact on the planet now. It applies to every industry.

It should apply to everything.

It applies to everything, and there are exciting alternatives. It has a second life. My clothes are on there and you can swap and barter clothing.

I'm a bit sceptical about the rental model with clothes; I like my own clothes.

It is a modern approach, because to have a future we need to have this conversation for our children. We will have to have these conversations.

What would be your advice to an average consumer who likes to spend money on clothes and dress well, and who sees dressing as a form of self-expression, as part of their identity and the way they behave? What could they change right now?

Right now the consumer in this country cannot be circular with fashion decisions; that is hard to do because the industry is not circular. With this report we are trying to get the industry to look at this vision and have a high level of ambition. And collaborate as never before.

And incentivise people. It does not have to be punishment, it can be sexy and young. I get excited about the opportunities. As a fashion designer and businesswoman, that is why I'm here today; I'm interested in the new.

What you did not put in the report is the possibility of taxes, because people will value what they've got because they pay more.

It is a valid point, but just taxing clothing will not solve the problem. Clothing now is designed in a linear way; we burn or landfill a significant amount.

People need to look at the opportunities financially, and there is massive opportunity to make money on every level. There's so much waste - get recycling incentives in place, where people would get money for recycling their clothes properly.

We're about to enter, I suspect, frenzied speculation about what Meghan Markle is going to wear at her wedding when she marries Prince Harry. Is it healthy that we are so obsessed with the dress?

As animals on this planet we are obsessed with strange things. It is OK - some people are obsessed, some people do not even know who you're talking about. It is all relative. I think the main thing is just to bring a new awareness into the conversation.

It would make a big statement, the dress.

I'm happy to provide some eco-options!

And now Viewsnight. And this week we are hearing two very different opinions of the Trump presidency. On Thursday, we'll be hearing from a critic of the President, but tonight it's a chance for Drew Liquerman, from Republicans Overseas UK, to make the case for the defence.

There is a difference between liking something and being addicted to it - I like water and drink it several times a day, but I'm not a water junkie. However, in some cases there is a fine line between merely wanting something and yearning for it to fill an acquired chemical need. And here's the thing: our attachment to smartphone technology appears to be more in the latter category - it is actually addictive. You may recognise the problem, but in fact it is not accidentally addictive; it is designed to be so. Because the online world is funded mainly through advertising, those working in it need to both grab and keep our attention to survive and thrive. Now, understanding how technology addiction works may make you more resilient in resisting it. Our technology editor David Grossman has been finding out more - and meeting one former Google executive who believes what is known as "the attention economy" poses a threat to democracy itself.

For many of us, reaching for our phones has become automatic. As unthinking as blinking.

Sometimes I'll just unlock my phone and I'll lock it again and I won't even know what I've looked at.

All of a sudden I might just go on my phone and I will think, I don't even need to go on my phone right now.

If I'm crossing the road I can get distracted by my phone and realise, oh wait, there's a car there!

It's as if we're driven by a power beyond our conscious actions. It's not sensational to say our brains are being hacked, because that's pretty much what is happening. Balliol College, Oxford, was built to withstand the distractions of the pre-smartphone age. The heavy wooden doors and castellated quad are fortifications against attention hijack. James Williams is a former Google executive who became concerned that Silicon Valley's central mission is to interrupt our every waking thought. He resigned and now studies at Balliol.

The way we are monetising most of the information in the world is by distracting people - keeping them from doing what they want to do, rather than helping them do what they want to do. Now, I don't know anybody - I've never met anybody at least - who wants to spend all day on Facebook or wants to keep clicking articles all day. If there are people like that, I'd love to meet them, because I'd love to understand their mind and their priorities. But you know, when you think about the goals that people have for themselves, they tend to be things like - you know, the things that when we are on our deathbed we will regret not having done. Like, you know, I want to take that trip with my family, or I want to learn how to play piano, or, you know, spend more time with friends. These are the real human goals that people have, and these are the goals in my mind that technology ought to be helping us pursue. If they don't do that, then I don't know what technology is for.

Most technology companies have another goal. Welcome to the attention economy. Because the internet is funded largely by advertising, companies need us glued to their apps, or they don't make money.

Today we're going to set a new mission...

Although Facebook founder Mark Zuckerberg maintains that his company's mission is, quote...

To bring the world closer together.

..a couple of weeks ago Facebook's first president expressed a very different, even sinister, objective.

How do we consume as much of your time and conscious attention as possible? And that means that we need to sort of give you a little dopamine hit every once in a while. Because someone liked or commented on the photo or post or whatever. And that's going to get you to contribute more content. And that's going to get you, you know, more likes and comments. It's a social-validation feedback loop that, I mean, is exactly the kind of thing that a hacker like myself would come up with, because you are exploiting a vulnerability in human psychology.

You can see these results in places like this. These students at Bournemouth University have grown up with smartphones. They can't imagine being without them.

The relationship I have with my phone is quite intense because I use it, like, all the time. I think it is part of my body now. It is always with me.

I feel like if I don't have my phone with me, and everybody's, like, talking about memes, for example, I wouldn't understand the joke or what everybody's laughing at, because I wasn't on my phone.

So how does technology

hack its way into our brains?

0:30:470:30:50

According to psychologists, it taps

into our neural reward system.

0:30:500:30:54

We are driven by natural rewards

and these kinds of natural rewards

0:30:540:30:59

are very basic rewards.

0:30:590:31:00

Food, water, sex.

0:31:000:31:01

These are the sorts of things

that are making us happy

0:31:010:31:04

on an everyday basis.

0:31:040:31:05

But with technology,

some of these needs are almost

0:31:050:31:07

being replaced by the kinds

of social notifications

0:31:070:31:09

we may have received,

by the smartphone technology

0:31:090:31:11

that we are using.

0:31:110:31:15

So, you know, these

are technological rewards that can

0:31:150:31:18

be given to us that can

trick our brain into having those

0:31:180:31:21

rewarding moments, and receiving

those technological rewards can

0:31:210:31:23

make us happy eventually.

0:31:230:31:28

This is the fundament

of our being, though,

0:31:280:31:30

those motivations you describe,

that's the fundamental operating

0:31:300:31:32

system of our minds, isn't it?

0:31:320:31:35

Yes.

0:31:350:31:39

Tapping into our reward system

is just the start of the way

0:31:390:31:42

technology is engineered

to hold our attention.

0:31:420:31:45

In the '50s the psychologist

B F Skinner discovered that pigeons

0:31:450:31:48

could be made more obsessed

with earning rewards if you made

0:31:480:31:51

those rewards unpredictable.

0:31:510:31:58

Now that produces in a rat

or a pigeon or a monkey and in a man

0:31:580:32:02

a very high rate of activity.

0:32:020:32:03

And if you build up,

you can get enormous amounts

0:32:030:32:05

of behaviour out of these organisms

for very little pay.

0:32:050:32:09

You don't need to give them very

much to induce a lot of that.

0:32:090:32:13

Now that is the heart

of all gambling devices.

0:32:130:32:15

Bingo!

0:32:150:32:16

Number three...

0:32:160:32:18

The way apps get you to pull down

to refresh the screen

0:32:180:32:20

is based on Skinner's work.

0:32:200:32:22

It's just like a fruit machine.

0:32:220:32:24

You pull, it whirrs,

and you get a variable reward.

0:32:240:32:27

Sometimes nothing, sometimes

you hit the jackpot.

0:32:270:32:32
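The pull-to-refresh mechanic just described is a variable-ratio reward schedule, straight out of Skinner's pigeon experiments. A minimal sketch (hypothetical names and probabilities, not any app's real code):

```python
import random

def pull_to_refresh(pulls, reward_prob=0.3, seed=0):
    """Variable-ratio schedule: each pull pays out unpredictably,
    like a fruit machine. Sometimes nothing, sometimes a jackpot -
    and the unpredictability itself is what drives repeated pulling."""
    rng = random.Random(seed)
    return ["jackpot" if rng.random() < reward_prob else "nothing"
            for _ in range(pulls)]
```

Skinner's finding was that this unpredictable schedule produces far more behaviour per reward than a fixed one - which is why the refresh gesture whirrs before revealing what, if anything, is new.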

There is a whole industry

of consultants, of writers,

0:32:320:32:34

who are basically helping people

who are designers draw

0:32:340:32:37

on this big catalogue

of cognitive vulnerabilities.

0:32:370:32:41

And exploit them for

the purposes of hooking us.

0:32:410:32:49

Keeping us using these products.

0:32:490:32:51

Another of these vulnerabilities

is our brain's in-built

0:32:510:32:53

aversion to loss.

0:32:530:32:54

For example, Snapchat shows

what it calls streaks.

0:32:540:32:56

How many days a message

chain has gone unbroken.

0:32:560:33:00

Facebook is now testing

a similar feature.

0:33:000:33:02

It's all designed to

compel you to message.

0:33:020:33:04

And it works.

0:33:040:33:07

It's like a fire emoji

and then it will be like oh,

0:33:070:33:10

you have been on a streak for three

days and then you want

0:33:100:33:13

to sort of like compete,

like with other people.

0:33:130:33:15

Oh, how many streaks do you have?

0:33:150:33:17

And you feel like you have to reply.

0:33:170:33:19

When it goes low, like you're about

to lose the streak, it tells you.

0:33:190:33:22

So then you feel the need,

even if you weren't going to message

0:33:220:33:25

them anyway, or send any pictures,

you feel the need to.

0:33:250:33:28
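The streak mechanic the students describe leans on loss aversion: a counter that resets to zero unless both sides keep messaging, with a warning just before it expires. A sketch of how such bookkeeping might work (loosely modelled on the behaviour described above, not Snapchat's actual implementation; the 24-hour deadline and 4-hour warning window are assumptions):

```python
from datetime import datetime, timedelta

def check_streak(streak_days, last_message, now, warn_window_hours=4):
    """Hypothetical streak logic: the chain survives if a message arrives
    within 24 hours of the last one, and the app warns shortly before the
    deadline - exploiting our aversion to losing what we've built up."""
    deadline = last_message + timedelta(hours=24)
    if now >= deadline:
        return 0, "streak lost"
    if deadline - now <= timedelta(hours=warn_window_hours):
        return streak_days, "warning: streak about to expire"
    return streak_days, "ok"
```

The design choice to warn only near the deadline is the point: it manufactures urgency exactly when the user was about to disengage.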

Smartphones also exploit our brain's

in-built drive to finish things.

0:33:280:33:31

If you remove the cue that

we've reached the end,

0:33:310:33:33

well, we just keep going.

0:33:330:33:35

A food psychologist discovered that

when a soup bowl was fitted

0:33:350:33:38

with a hidden tube that kept it

topped up, people would drink

0:33:380:33:41

pints and pints of soup

in an effort to finish the bowl.

0:33:410:33:45

That's why the Twitter

and Facebook feeds never end.

0:33:450:33:54

We never get a cue to stop.

0:33:540:33:56

And it's why video sites

like YouTube and Netflix will start

0:33:560:33:59

the next video even before the one

you're watching has finished.

0:33:590:34:01
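The bottomless-soup-bowl idea - remove the end cue and people keep consuming - maps directly onto how infinite feeds are built. A minimal sketch of the pattern (illustrative only, with invented item names):

```python
def endless_feed(page_size=10):
    """Generator with no termination condition, mirroring the bottomless
    soup bowl: the client can always request one more page, so the reader
    never receives a cue that they have reached the end."""
    item = 0
    while True:  # deliberately never ends
        yield [f"item-{i}" for i in range(item, item + page_size)]
        item += page_size
```

Every scroll to the bottom simply triggers the next page; the "you're done" signal that a finite newspaper or bowl of soup provides is engineered away.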

Before, I've been on my phone

watching YouTube videos back to back

0:34:010:34:04

for like, two hours.

0:34:040:34:05

When you do actually sit down

and try and calculate the hours,

0:34:050:34:08

you realise how much time has been

wasted on things that you could have

0:34:080:34:11

been doing that were productive.

0:34:110:34:14

Really?

0:34:140:34:15

You feel like you're...

0:34:150:34:16

So why don't you stop?

0:34:160:34:17

I don't know.

0:34:170:34:19

I don't want to say

it's an addiction,

0:34:190:34:20

but I just need my phone!

0:34:200:34:23

Another very powerful way

that we are manipulated

0:34:230:34:25

is in what we watch.

0:34:250:34:28

The scientists of the attention

economy know that our brains

0:34:280:34:30

are drawn to stories that prompt

strong emotions, like outrage.

0:34:300:34:35

Balanced discussions may appeal

to our conscious intellects,

0:34:350:34:39

but not the subconscious urges that

will keep us clicking and scrolling.

0:34:390:34:42

So that is what we are served,

a diet of outrage.

0:34:420:34:46

And it doesn't matter if the stories

are fake or real, they all serve

0:34:460:34:49

to grab our attention.

0:34:490:34:51

Even reputable news

organisations are having

0:34:510:34:52

to adapt their coverage to compete.

0:34:520:34:56

In the '30s a former student

of Balliol College, Aldous Huxley,

0:34:560:34:59

predicted a world where manipulation

and distraction combined to create

0:34:590:35:01

a happy, docile populace

incapable of self-government.

0:35:010:35:09

One way of looking at this is that

you know, the attention economy

0:35:090:35:12

is a kind of denial of service

attack against the human will.

0:35:120:35:15

And that has big implications

in our own lives because there

0:35:150:35:18

are things we want to do today,

this week, this year.

0:35:180:35:21

It has big political

implications because, you know,

0:35:210:35:23

the will of the people is the basis

of the authority of democracy.

0:35:230:35:27

And if that's being undermined,

our political systems,

0:35:270:35:29

the possibility of democracy is very

straightforwardly being undermined.

0:35:290:35:39

The distraction and manipulation

of the attention economy is only

0:35:400:35:42

going to get more refined and more

compelling, and less noticeable.

0:35:420:35:48

For example, Facebook and other big

tech firms are investing

0:35:480:35:50

heavily in virtual reality.

0:35:500:35:53

So unless we are prepared to change

the way we pay for the online world,

0:35:530:35:56

we could literally lose

ourselves in technology.

0:35:560:36:06

David Grossman there. It could be

the biggest problem of our time.

0:36:070:36:11

I'm joined by Tristan Harris.

0:36:110:36:13

He co-founded the movement "Time

Well Spent" to spark an important

0:36:130:36:16

conversation about the kind

of future we want from

0:36:160:36:18

the technology industry

and was a design ethicist

0:36:180:36:20

and product philosopher

at Google until 2016,

0:36:200:36:21

where he studied how technology

influences a billion users'

0:36:210:36:23

attention, well-being and behaviour.

0:36:230:36:24

He was described by The Atlantic

magazine as "the closest thing

0:36:240:36:27

Silicon Valley has to a conscience".

0:36:270:36:34

Good evening. I am interested in how

much of a problem we should really

0:36:340:36:41

think this is. You likened it to the

slot machines but this isn't going

0:36:410:36:46

to bankrupt you or kill you in the

way that some other drugs do. I

0:36:460:36:51

wonder whether addiction is quite

the right way to look at it.

It is

0:36:510:36:57

much bigger than addiction, I would

call it an existential threat to the

0:36:570:37:02

human race and the reason is because

there are 2 billion people who use a

0:37:020:37:07

smartphone every day, 2 billion

people use Facebook, that is more

0:37:070:37:11

than the number of followers of

Christianity, these tech companies

0:37:110:37:14

have more influence over our daily

thoughts than some religions, given

0:37:140:37:19

that we check our phones 150 times a

day. The total surface area of how

0:37:190:37:29

much technology is steering 2

billion people and their thoughts is

0:37:290:37:33

enormous, even when you are not

looking at your phone, it is

0:37:330:37:37

influencing or creating the kind of

thought you're thinking about now.

0:37:370:37:41

The challenge is as James said and

we were allies at Google in trying

0:37:410:37:45

to raise this conversation, is that

these companies' goals are

0:37:450:37:53

fundamentally misaligned with our

goals and the goals of democracy.

0:37:530:37:56

That is why it is an existential

threat.

It is not an existential

0:37:560:38:02

threat, you need to find the harm it

is doing. Yes, we are wasting quite

0:38:020:38:08

a lot of time, yes we are sometimes

misdirected to rubbish when we would

0:38:080:38:12

have better things to do with our

lives, but talking about existential

0:38:120:38:16

threats, you need to say what actual

harm it is doing to all those people

0:38:160:38:20

who choose to use their phones in

this way.

I would ask in the 150

0:38:200:38:27

times a day we check, what is

going on in that moment right before

0:38:270:38:30

we check. Is it because we are

sitting there and we make a conscious

0:38:300:38:34

choice? That is not what is

happening. What is happening is that

0:38:340:38:38

we are building up anxieties and as

it builds, it causes us to self

0:38:380:38:45

interrupt. We actually interrupt

ourselves about every 40 seconds.

We

0:38:450:38:50

are complicit in this process; we

can stop. Smoking addicts will tell you

0:38:500:38:55

it is very difficult to stop smoking

and lock the cigarettes in a

0:38:550:38:58

cupboard but if you're fed up with

your phone and you want some

0:38:580:39:01

uninterrupted time, you put the

phone away or you turn it off. It is

0:39:010:39:06

not that difficult. The reason we

don't is because we like getting

0:39:060:39:11

stuff on the phone and it connects

us and we get a reward from it,

0:39:110:39:14

don't we?

We get enormous benefits

from these technology companies, I

0:39:140:39:19

think the challenge is that their

goals are not aligned with ours. The

0:39:190:39:24

one you mentioned, you have 100

million teenagers, a vulnerable

0:39:240:39:28

population and you are basically

saying, for each one of your

0:39:280:39:32

friends, it shows the number of days

in a row you have sent messages back

0:39:320:39:37

and forth, it is like putting them

on treadmills and tying their legs

0:39:370:39:51

together, they both have to keep

running otherwise they lose their

0:39:510:39:53

streak. It is like we have hijacked

what 100 million teenagers view as

0:39:530:39:55

the currency of friendship, the way

kids know if they are friends is if

0:39:550:39:58

they keep that streak up. That is

where we are developmentally harming

0:39:580:40:02

an entire generation of children.

That is one of the clearest examples

0:40:020:40:05

where it is not just addiction...

A

lot of parents would say, I would

0:40:050:40:14

stop my child doing that. What do

you do? Are you an addict? Do you

0:40:140:40:20

feel you have controlled the

distraction of your phone?

No. I

0:40:200:40:25

haven't and I think one of the

things that we said when we talk to

0:40:250:40:30

all these experts in persuasive

technology is that even if you know

0:40:300:40:35

how these techniques work, it still

works on you. You're sitting inside

0:40:350:40:44

of this suit, all of these instincts

that are getting pulled on; if you wake

0:40:440:40:48

up in the morning and you see photos

of your friends missing out, you're

0:40:480:40:54

missing out on what they were doing

last night, that will pull on any

0:40:540:40:57

human being, you can be the director

of the CIA and that will affect you.

0:40:570:41:02

We are all human. This is about

whether or not the goals of

0:41:020:41:06

technologies align with that.

Thank

you.

0:41:060:41:11

That's all we've got

time for this evening.

0:41:110:41:13

But before we go, we bring you news

of an exciting new film

0:41:130:41:16

with a soundtrack by the composer

Michael Nyman.

0:41:160:41:18

It's called Washing Machine -

The Movie, and it consists

0:41:180:41:20

of sixty-six minutes of a particular

brand of washing machine

0:41:200:41:22

going through its forty

degree wash cycle -

0:41:220:41:24

accompanied by a specially composed

minimalist soundtrack by Mr Nyman.

0:41:240:41:27

The movie will premiere

in Leicester Square next month -

0:41:270:41:29

but we've managed to get

you a sneak peek.

0:41:290:41:31

Good night.

0:41:310:41:39

PIANO PLAYS.

0:41:390:41:45
