Work: Billion Dollar Deals and How They Changed Your World


Work


Transcript


Line | From | To

What if the way we understand the world is wrong?

0:00:030:00:07

What if it's not politicians and world events that shape our lives,

0:00:080:00:12

but business deals?

0:00:120:00:14

It's natural for people to fight against change,

0:00:140:00:17

but change is coming, and we need to adapt to it.

0:00:170:00:21

Deals made in secret, high up in a boardroom, over a drink,

0:00:220:00:27

or in a car in the dead of night.

0:00:270:00:29

We were trying to protect people from excess medicine.

0:00:290:00:32

-You couldn't.

-Yeah, we failed.

0:00:320:00:34

Our every waking hour has been transformed by these deals,

0:00:350:00:39

reprogramming us to think and behave in a different way.

0:00:390:00:43

While we're competing with these large monolithic banks that are very

0:00:430:00:46

bureaucratic, we can run circles around them all day.

0:00:460:00:48

This is the story of these deals.

0:00:480:00:51

These ideas were ahead of their time.

0:00:510:00:53

Online payments can change the world.

0:00:530:00:56

Who made them, why, and what do they mean for our future?

0:00:560:01:01

In this series, I'm going to look at three key aspects of our lives,

0:01:040:01:08

and how they've been revolutionised by these deals.

0:01:080:01:13

Who wants to kill cash, and why?

0:01:130:01:16

Our vision is to have a cashless society.

0:01:160:01:19

Card payments are the best way to pay and the best way to be paid.

0:01:190:01:23

Why do we medicate every aspect of our lives?

0:01:230:01:26

This notion of the pursuit of happiness, it's not realistic!

0:01:260:01:30

It gets all of us to be on drugs!

0:01:300:01:33

But in this film, I'll be looking at why we're trapped in a culture

0:01:330:01:36

of endless work.

0:01:360:01:39

It is over, dude. It is gone.

0:01:390:01:41

If you can't do it, artificial intelligence can.

0:01:420:01:45

How did work go from something we do to who we are?

0:01:460:01:50

It's not a person, it's a performer, right?

0:01:500:01:53

I'm sorry, it's not a person?! It is a person!

0:01:530:01:56

I'm a person sitting in front of you.

0:01:560:01:58

These are the deals that made you work harder, made your pay lower...

0:01:580:02:03

These are very fine decisions, and sometimes you get them wrong.

0:02:030:02:07

..and might one day replace you with a robot.

0:02:070:02:10

PHONE CHIMES

0:02:160:02:18

This is what 12% of us are going to do tonight, and every night -

0:02:230:02:29

wake up at 3:00 am, and check our work e-mail.

0:02:290:02:33

A few hours later, 51% of us are going to spend more time

0:02:340:02:38

checking our work e-mail than eating breakfast.

0:02:380:02:41

30% of jobs in the UK could be automated within the next 15 years.

0:02:470:02:52

As intelligent machines begin their march on the labour market,

0:02:540:02:57

how long will it be before they overhaul our economy completely?

0:02:570:03:01

This is the bigger picture -

0:03:050:03:07

how WE became robots, automated our lives

0:03:070:03:11

at the very moment that robots learnt to become human.

0:03:110:03:16

To understand how we got here,

0:03:280:03:31

we need to go back to the '70s, to a deal that would make us loyal

0:03:310:03:35

to our workplace, that would make us become our jobs,

0:03:350:03:38

and our jobs eventually become our lives.

0:03:380:03:42

Summer 1975.

0:03:530:03:55

America is in recession, losing jobs at an unprecedented rate.

0:03:550:03:59

Four men set themselves the task of rebuilding American enterprise.

0:04:040:04:08

They meet in downtown San Francisco in an office on the 48th floor.

0:04:100:04:15

Their meeting would change the language of American business.

0:04:160:04:20

One of them was Tom Peters,

0:04:240:04:26

a larger-than-life management consultant,

0:04:260:04:29

who would go on to become a celebrity life coach.

0:04:290:04:32

Huge news!

0:04:320:04:34

Another was Richard Pascale,

0:04:340:04:36

a quietly spoken academic in the field of management science.

0:04:360:04:40

This unlikely double act, who were both working as consultants

0:04:410:04:45

for McKinsey, a management consultancy firm,

0:04:450:04:47

made a deal to work together to create a new management philosophy,

0:04:470:04:52

and one country in particular seemed to show the way.

0:04:520:04:55

It is obvious that the United States economy is not in good shape.

0:04:580:05:01

But while our industry is floundering,

0:05:010:05:04

the Japanese are reporting record profits.

0:05:040:05:07

The Americans were getting hammered by the Japanese in a lot of things,

0:05:070:05:13

but the one that hurt our ego was the automobile market.

0:05:130:05:16

To grossly oversimplify, to the point of not being obviously the

0:05:160:05:21

whole story, the Japanese were beating us on fundamentally what

0:05:210:05:24

we had, very sexy cars, except theirs, actually,

0:05:240:05:27

when you turned the ignition key, they started.

0:05:270:05:30

-Right.

-And our quality was highly questionable.

0:05:300:05:34

# There's something inside you... #

0:05:340:05:37

Japan has gone from being a producer of trash to a nation synonymous

0:05:370:05:41

with quality and reliability.

0:05:410:05:42

So this was a moment where the Japanese motor industry was making

0:05:440:05:48

cars that were of quality, they were reliable.

0:05:480:05:51

-Absolutely.

-And the American car industry was going...

0:05:510:05:53

Absolutely, and, you know, to be realistic about it, first of all,

0:05:530:05:57

you're attacking us in the car industry, and that doesn't feel good

0:05:570:06:00

to you and me, and secondly, for God's sakes, it was 1977.

0:06:000:06:07

We're only 30 years beyond... Well, what was it?

0:06:070:06:10

From the war.

0:06:100:06:11

Yeah, and everybody's dad or grandad fought in World War II.

0:06:110:06:17

So, you know, we had just beaten these buggers,

0:06:170:06:19

and, "What the hell are they doing?" you know?

0:06:190:06:21

So the whole thing about, "We won the war, but they've won the peace"?

0:06:210:06:24

Absolutely. Precisely so.

0:06:240:06:27

But it was about far more than how to build a better car.

0:06:290:06:32

What else could America learn from the Japanese?

0:06:340:06:36

For Richard Pascale,

0:06:370:06:39

the answer lay in the unique Japanese business culture.

0:06:390:06:42

Everyone feels this incredible sense of commitment.

0:06:430:06:46

You don't come in to hang out and lay back.

0:06:460:06:49

You're in there like you would on a team.

0:06:490:06:52

You're expected to play and hold up your end of the deal.

0:06:520:06:54

So that quality of commitment and energy just resonated with me

0:06:540:06:59

in a very deep and profound way.

0:06:590:07:01

The companies care not only about their workers' job performance,

0:07:010:07:05

but about their health, their family life,

0:07:050:07:07

and their ideas for doing things better.

0:07:070:07:10

Good ideas can win a Canon camera worker a paid vacation

0:07:100:07:13

and a cash bonus.

0:07:130:07:15

This attitude seemed to derive from the values set

0:07:150:07:18

by the companies' bosses.

0:07:180:07:21

Management philosophy was just astonishing.

0:07:210:07:23

You had aspirations that allowed people in their work

0:07:230:07:27

to have a meaningful sense of purpose.

0:07:270:07:30

We felt strongly that this overarching sense of what a company

0:07:320:07:36

is about and the meaning it makes seems to make a difference.

0:07:360:07:40

Together, they arrived at a system of management with what they called

0:07:400:07:43

shared values at its heart.

0:07:430:07:46

What we came up with was a way of taking the best of the

0:07:460:07:51

Japanese ideas and keeping them coupled to...

0:07:510:07:56

God, I hate terms like this because they're such gross

0:07:560:07:59

over-generalisation, but America's innovative spirit.

0:07:590:08:02

Peters and Pascale's meeting of minds appealed to business leaders,

0:08:040:08:09

humbled by Japanese manufacturing prowess.

0:08:090:08:12

From now on, the company wouldn't just give employees jobs to do,

0:08:120:08:16

it would set all-encompassing values for them to follow.

0:08:160:08:20

This was the key to reshaping their businesses.

0:08:200:08:23

Both went on to publish books based on their ideas

0:08:250:08:28

within a year of each other.

0:08:280:08:30

Pascale's The Art Of Japanese Management was published in 1981.

0:08:300:08:33

That shared value insight turned out to be both very useful,

0:08:350:08:41

but as well, at the time, extremely provocative.

0:08:410:08:46

Hard to imagine today, with the whole industry of vision

0:08:460:08:49

and values being created in the last 10 or 20 years,

0:08:490:08:53

in which everyone has their little plaque on the wall

0:08:530:08:56

that states what they aspire to be.

0:08:560:08:59

But at the time when we started talking about this,

0:08:590:09:02

I remember truly violent reactions from audiences.

0:09:020:09:07

What did they say you were trying to do?

0:09:070:09:08

Well, they saw it as brainwashing.

0:09:080:09:11

They saw it as paternalistic, they found it extremely offensive,

0:09:110:09:15

intrusive. Placing companies in a role that was out of their scope

0:09:150:09:20

and inappropriate for a commercial enterprise.

0:09:200:09:22

But while Pascale's book didn't capture the public's imagination,

0:09:230:09:27

Tom Peters took the same ideas and brought them to millions

0:09:270:09:31

with his book called In Search Of Excellence.

0:09:310:09:35

I don't really want to completely cheapen it

0:09:350:09:37

and say I was the populariser,

0:09:370:09:39

but I was the populariser.

0:09:390:09:41

But you were, actually, redefining how a workplace should be.

0:09:410:09:45

Yeah. There is a real intellectual hard...

0:09:450:09:49

-Of course.

-..no baloney, intellectual history for this stuff.

0:09:490:09:52

-Right.

-But nobody paid any attention to it.

0:09:520:09:54

-Right.

-And first of all we got lucky with labels like Excellence and we

0:09:540:09:59

got lucky with 10% unemployment and we got lucky by being

0:09:590:10:03

caught with our pants down and cars that didn't work.

0:10:030:10:06

And so I will completely acknowledge that this new look at organisations

0:10:060:10:12

came together with insane timing.

0:10:120:10:15

And there was nothing wrong with what we did,

0:10:150:10:18

it was a pretty good piece of work,

0:10:180:10:19

but the timing was a gift from the gods.

0:10:190:10:23

Tom Peters had put a management book on the bestseller list,

0:10:250:10:29

selling in excess of five million copies.

0:10:290:10:31

It was a modern business blockbuster,

0:10:310:10:34

but most importantly, it made management ideas

0:10:340:10:37

a serious topic of conversation in organisations.

0:10:370:10:40

Its thinking was adopted by companies around the world.

0:10:400:10:45

What Peters did was he made culture popular.

0:10:450:10:47

He had initially created this very clever thing for McKinsey called the

0:10:470:10:52

7S Model, where he identified the soft and hard drivers of

0:10:520:10:58

organisation culture and how they pulled themselves together to create

0:10:580:11:03

a set of shared values that enabled the delivery of strategy.

0:11:030:11:06

And he... It was a very smart piece of thinking.

0:11:060:11:09

And it enabled leaders, for the first time,

0:11:090:11:11

to get their hands on something and say,

0:11:110:11:14

"so if we change this process over here,

0:11:140:11:16

"or if we change our reward mechanism over here,

0:11:160:11:19

"what we should see is a different outcome, or a different...

0:11:190:11:21

"Driven by a different set of behaviours".

0:11:210:11:24

Before Peters met Pascale, the idea of your work

0:11:240:11:28

having a higher meaning would have seemed ludicrous.

0:11:280:11:31

But as their thinking caught on, our work culture changed.

0:11:310:11:36

We were asked to see work as having a higher purpose, not just a job.

0:11:360:11:42

Work could and would offer us fulfilment.

0:11:420:11:46

As the idea of shared values spread,

0:11:490:11:52

workplaces around the world became more than just workplaces.

0:11:520:11:55

They had mission statements.

0:12:000:12:02

They had values that the workers signed up to.

0:12:020:12:05

Good morning, everybody. Welcome to Services To Fruit.

0:12:060:12:09

For those of you who are new,

0:12:090:12:11

Services To Fruit is our opportunity to recognise unsung heroes.

0:12:110:12:15

People who have gone the extra mile,

0:12:150:12:17

people who have truly lived the values.

0:12:170:12:20

This month's Lady Of The Sash is Katrina.

0:12:200:12:23

At Innocent, makers of fruit smoothies,

0:12:300:12:32

they go further than most to create a unique company culture.

0:12:320:12:36

Using team bonding techniques, inspired by cult films,

0:12:370:12:40

like Bill And Ted's Excellent Adventure,

0:12:400:12:43

they see their workforce as a family.

0:12:430:12:45

One of my favourite things is this.

0:12:490:12:50

So this is... This is our baby photo wall.

0:12:500:12:53

When you join Innocent, we will put your baby photo up.

0:12:530:12:56

The idea being that you can see...

0:12:560:12:58

"This is who I work with, everybody is human, everyone is a baby,

0:12:580:13:01

-"everyone has a job to do".

-Tim, is your Chief Exec here?

0:13:010:13:04

Yeah, Douglas, our chief squeezer, he's actually just down the bottom.

0:13:040:13:07

Not at the top, trying to look down on everyone else.

0:13:070:13:09

We're one big team, and you're not just here to do that job.

0:13:090:13:13

-Yeah.

-You're here to make the world a better place.

0:13:130:13:16

Do you think someone diffident and cynical like myself

0:13:160:13:19

would be able to fit in?

0:13:190:13:21

Because I'm the sort of person that would come and if I was interviewed,

0:13:210:13:24

I'd be like, "Oh, I'm not going to buy into all this stuff."

0:13:240:13:26

Do you see...? I'm just interested in the kind of personality that...

0:13:260:13:29

You know, would someone like myself be accepted?

0:13:290:13:32

We have people that have been here a long time, 10, 15, 16, 17 years.

0:13:320:13:36

And they don't come for beers every Friday and they don't get involved

0:13:360:13:39

in all the parties, but they are still about what Innocent are about.

0:13:390:13:43

But what if you become too dedicated to your job?

0:13:430:13:46

If you have truly drunk in the company values,

0:13:460:13:49

then sooner or later, is there a risk that you become your job?

0:13:490:13:52

And if you are your job,

0:13:530:13:55

what happens when the working day gets longer and longer?

0:13:550:13:59

In Japan, shared values have always gone hand in hand

0:14:010:14:05

with a long hours culture, and what they call karoshi -

0:14:050:14:09

death from overwork.

0:14:090:14:12

In terms of this new workplace,

0:14:120:14:14

this new really highly competitive poster book, '80s workplace,

0:14:140:14:19

it's totally redefined.

0:14:190:14:21

And what it is, is that you have to give your all to the company

0:14:210:14:24

if you're going to survive.

0:14:240:14:26

If you're going to keep your job, it means being...

0:14:260:14:29

You know, policing your own productivity.

0:14:290:14:31

Yeah, I mean, if I work my one ass off to help the ten people who work

0:14:310:14:36

for me be better prepared for the workplace of tomorrow,

0:14:360:14:40

they will work their ten asses off to make me look like a superstar.

0:14:400:14:45

Innocent have a novel way of curbing the long-hours working culture.

0:14:480:14:51

So this here is our "last leaver pulls lever".

0:14:530:14:55

So at the end of the working day,

0:14:550:14:58

the last person to leave the floor will make an announcement at around

0:14:580:15:01

8:00, 8:30, and they will shut this off,

0:15:010:15:03

and it basically stops all non-essential electrical equipment.

0:15:030:15:07

The idea being that that last person is telling everyone, "Come on, guys,

0:15:070:15:11

"we've had enough, let's go."

0:15:110:15:13

But not keeping a lid on your working hours can take its toll.

0:15:130:15:17

There are more than 100,000 strokes in the UK each year.

0:15:230:15:27

OK, and again.

0:15:270:15:29

It's the fourth single leading cause of death in the country.

0:15:290:15:33

-Fingers are opening, lovely.

-Turn your head.

0:15:350:15:37

Most of it is age-related, but a study by University College London,

0:15:370:15:40

published in The Lancet,

0:15:400:15:42

showed a link between long working hours and a higher risk of stroke.

0:15:420:15:45

This patient thinks the stress of running his business

0:15:450:15:48

played a part in his.

0:15:480:15:50

Reflecting on it, I suppose it was the stress of work.

0:15:500:15:55

Yeah.

0:15:550:15:56

I have an international business, and I'm flying around the world.

0:15:560:16:01

It takes a toll.

0:16:010:16:02

And when you look back, is there one thing that you think,

0:16:020:16:05

"Oh, yeah, you know, I was really worried about that"?

0:16:050:16:07

Or do you think it was just an accumulation of stress?

0:16:070:16:09

I suppose it must have been an accumulation.

0:16:090:16:12

There's no question, I think, in my mind that the kind of job that

0:16:120:16:16

Mr Mehta's doing, often working 12, 18 hours a day

0:16:160:16:19

-for nearly 20, 30 years...

-Yeah.

-..that was a contributing factor.

0:16:190:16:23

-And a seven-day week.

-Seven days a week, yeah. You never turn off.

0:16:230:16:26

-No.

-That's the thing, yeah.

0:16:260:16:29

Ten years ago, I had a stroke.

0:16:290:16:31

A bit like you, it came out of the blue,

0:16:310:16:34

and in the ward I was in, everyone who'd had a stroke

0:16:340:16:39

was either in their late 30s, or in their 40s, in that ward.

0:16:390:16:44

And the common factor amongst all of us was stress.

0:16:440:16:49

Everyone had a highly stressed job, whether it be rubbish collection,

0:16:490:16:53

or a city broker or a lawyer, you know,

0:16:530:16:55

everyone was in there, and they all had high stress jobs, including me.

0:16:550:16:59

I can tell you when I come into the room, when he's not looking,

0:16:590:17:03

he's on his iPad answering e-mails before I came into the room.

0:17:030:17:08

And that's whilst he's had a stroke within the last six weeks,

0:17:080:17:12

and yet, he's just promised us -

0:17:120:17:14

and this is not the first time he's promised, by the way -

0:17:140:17:16

that he will reduce his working hours.

0:17:160:17:19

Quite frankly, the e-mails and mobile phones don't help.

0:17:190:17:24

-Because you never switch off.

-No.

0:17:240:17:26

-You can't escape that.

-It's impossible.

0:17:260:17:28

Peters, Pascale and their co-writers would foster the idea of

0:17:300:17:34

employee devotion to their employers.

0:17:340:17:37

But then, over time, our bosses began to persuade themselves

0:17:370:17:41

that they didn't need to treat us with the same devotion.

0:17:410:17:45

Two decades after Pascale and Peters met, another deal would encourage

0:17:450:17:50

companies to hire and fire us at will, no matter how devoted we were.

0:17:500:17:55

1997 - the world of business is changing fast.

0:18:070:18:12

In America, traditional blue chip companies were losing talented staff

0:18:140:18:19

to tech upstarts from Silicon Valley.

0:18:190:18:21

They needed a solution, and they paid management consultants McKinsey

0:18:260:18:31

to provide it.

0:18:310:18:33

Helen Handfield-Jones and her colleagues were convinced that

0:18:330:18:36

companies were failing because they were employing the wrong people.

0:18:360:18:41

It was in the '90s with the dot-com boom, big companies all over

0:18:410:18:44

the place that used to get the cream of the crop

0:18:440:18:47

from their recruiting, you know, the IBMs, the GEs,

0:18:470:18:51

were losing talent to dot-com start-ups,

0:18:510:18:53

which they'd never faced before.

0:18:530:18:55

To solve this problem, McKinsey sold a strategy to their clients.

0:18:570:19:01

They called it the war for talent.

0:19:010:19:04

Using practical examples from companies such as General Electric

0:19:040:19:08

and Enron, the authors outlined five imperatives that every leader,

0:19:080:19:13

from CEO to unit manager, must act on to build a stronger talent pool.

0:19:130:19:18

It completely redefined the relationship

0:19:180:19:21

between a company and its workers.

0:19:210:19:24

Talent is the sum total of your skills and ability, values,

0:19:240:19:29

character that you bring to your work role.

0:19:290:19:33

Having great talent in the organisation is critical

0:19:330:19:35

to the business success.

0:19:350:19:37

Now, it sounds like motherhood and apple pie, right?

0:19:370:19:39

Everybody would say that.

0:19:390:19:41

But there's some roles that require just a very unique set of skills

0:19:410:19:44

that doesn't come along easily, and isn't trainable.

0:19:440:19:48

For McKinsey, the most important thing in any company was

0:19:480:19:51

having the right employees. The good people needed to be kept,

0:19:510:19:55

and underperforming staff let go.

0:19:550:19:58

You categorised employees into A, B and C.

0:19:580:20:03

And A were people who were super-talented and successful,

0:20:040:20:08

-and you just allow them to get on with it.

-Mm-hm.

0:20:080:20:11

And B were people who were kind of coasting along...

0:20:110:20:14

Doing OK, doing a good job.

0:20:140:20:16

Doing OK, but we're not sure whether to keep them.

0:20:160:20:18

-Yeah.

-And then C were people that you sack.

0:20:180:20:22

OK. All of that is mischaracterised.

0:20:220:20:23

Let me start that again.

0:20:230:20:25

That's the hardest part of talent management is dealing with

0:20:250:20:27

low performers. The concept is that...

0:20:270:20:29

Look hard at your low performers, have a clear eyed view,

0:20:290:20:33

assess their performance rigorously like you do everybody else,

0:20:330:20:37

and make a decision. That's what it means.

0:20:370:20:38

So you didn't say that the C person should just be sacked?

0:20:380:20:42

It's not a person, it's a performer.

0:20:420:20:44

-Right?

-Oh, sorry. So it's not a human being?

0:20:440:20:47

No. No, no.

0:20:470:20:48

-No, what I'm saying is...

-It's not a person?! It IS a person.

0:20:480:20:51

I'm a person sitting in front of you,

0:20:510:20:54

being interviewed by my boss.

0:20:540:20:55

-Yeah, yeah.

-And they're deciding whether I should be sacked or kept.

0:20:550:20:58

-Yes, but it...

-I'm a person.

-Of course you're a person.

0:20:580:21:01

But you said I wasn't a person. You said I was a performer.

0:21:010:21:03

But you also used the word a moment earlier that said you assess

0:21:030:21:06

whether people were TALENTED or not by A, B or C.

0:21:060:21:08

I said, "No, that's not right." We assess people on their performance.

0:21:080:21:12

-Right.

-Are they performing at a high level, an OK level,

0:21:120:21:16

-or a poor level?

-Right.

-Below acceptable level of performance.

0:21:160:21:20

They're toxic to the organisation, they're not growing

0:21:200:21:22

their businesses, if that's what you need them to do,

0:21:220:21:25

they're poor people managers, whatever it is,

0:21:250:21:27

whatever dimension of performance -

0:21:270:21:29

this is not acceptable performance for a leader of our company.

0:21:290:21:32

In the late 1990s and early 2000s,

0:21:360:21:38

the war for talent would become hugely influential.

0:21:380:21:41

As companies did everything possible to keep their supposedly A-grade

0:21:430:21:47

employees, there was a massive increase in executive pay.

0:21:470:21:52

So if you go back 20, 25 years, the ratio, say in the UK,

0:21:520:21:57

across most of Europe, between a CEO's pay

0:21:570:22:01

and the average earner in an organisation, was about 20-1.

0:22:010:22:05

And something happened in the late '80s through the '90s to the

0:22:060:22:11

early 2000s which saw Chief Executives getting increases

0:22:110:22:16

of 100-300% over that period of time.

0:22:160:22:18

The flames were fanned by the war-for-talent thinking.

0:22:180:22:22
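
To see what those quoted figures imply, here is a rough worked example; the £25,000 average salary is an assumption for illustration, not a figure from the programme:

```python
# Rough arithmetic behind the ratios quoted above (illustrative figures).
average_pay = 25_000                 # assumed average earner, in GBP
ceo_before = 20 * average_pay        # a 20:1 ratio -> 500,000
ceo_after = ceo_before * (1 + 3.0)   # a 300% rise -> 2,000,000
print(ceo_after / average_pay)       # new ratio -> 80.0
```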

Do you think there's a danger you create a kind of inherent inequality

0:22:220:22:27

within an organisation because just by dividing people into A, B and C,

0:22:270:22:31

you create resentment, you breed anxiety?

0:22:310:22:33

Well, no, again, you make an assumption that's not right.

0:22:330:22:36

We do not tell people, "Oh, by the way, you're a B, or you're a B plus,

0:22:360:22:40

"or you're an A, whatever. "You have potential..."

0:22:400:22:42

We don't tell people that. It's not about stamping letter grades

0:22:420:22:45

on people's foreheads or having Crown Princes walk around

0:22:450:22:49

with crowns on their heads.

0:22:490:22:50

It's about giving each person the feedback that's most helpful and

0:22:500:22:53

appropriate to them, about what they can do to perform better,

0:22:530:22:57

and how their careers might unfold in this organisation.

0:22:570:23:00

So, Helen, has one of the legacies of the war for talent been,

0:23:000:23:03

in a way, the idea that we all need to be talented now in the workplace?

0:23:030:23:07

Absolutely, yeah.

0:23:070:23:09

These companies have lessened their commitment, long-term, to people.

0:23:090:23:14

In the old days, you hired somebody full-time,

0:23:140:23:16

and you hired them forever. And people thought,

0:23:160:23:19

"I have a job with one company, I stay there forever."

0:23:190:23:22

Right? That's long gone.

0:23:220:23:23

There's no free lunches any more, right? For anybody.

0:23:230:23:27

The world of work had changed significantly since

0:23:290:23:32

Peters and Pascale's vision of shared values -

0:23:320:23:35

being loyal to the workplace and your boss being loyal to you.

0:23:350:23:39

The importance of shared values

0:23:400:23:43

shouldn't be underestimated as a concept.

0:23:430:23:46

It's probably the lasting huge thought that is still with us and

0:23:460:23:52

should keep going with us because, of course, it was shared values that

0:23:520:23:56

we got wrong in the '90s and the early 2000s with the war for talent

0:23:560:23:59

and the L'Oreal generation, the shared value became, "Me".

0:23:590:24:02

It didn't become a greater creation of wealth.

0:24:040:24:06

It didn't become doing the right thing, and we lost the sort of

0:24:060:24:11

Aristotle version of a virtuous or a good leader.

0:24:110:24:15

The ideas in the war for talent would affect us all.

0:24:190:24:22

It told businesses worldwide that it was OK to hire and fire us.

0:24:230:24:29

Today, the average worker has six jobs

0:24:290:24:32

over the course of his or her life.

0:24:320:24:35

New graduates expect to have 20.

0:24:350:24:39

The job for life is over.

0:24:390:24:41

And it was the tech firms, whose rise prompted the war for talent

0:24:430:24:47

in the first place, who took the idea to its logical extreme.

0:24:470:24:51

At the beginning of the shift, I get my phone out,

0:24:520:24:55

log in, and set myself as available, and off I go.

0:24:550:24:57

A world without long-term employees,

0:24:590:25:02

where even the very idea of staff is redundant.

0:25:020:25:06

I normally do Monday to Friday lunch shifts,

0:25:070:25:10

10-2, and a couple evenings on top of that.

0:25:100:25:14

Meg Brown is a casual worker, part of the so-called gig economy.

0:25:170:25:21

Instead of a regular wage, she gets paid for the gigs she does,

0:25:210:25:25

delivering restaurant food to customers' front doors.

0:25:250:25:29

It's estimated that five million people globally are employed

0:25:290:25:32

this way, managed by a computer algorithm, and not by a person.

0:25:320:25:37

What kind of guarantees of work do you have?

0:25:370:25:40

You'll wait for the orders to come in.

0:25:400:25:42

-Right.

-There'll usually be a peak-time of the meal times.

0:25:420:25:47

-Right, yeah.

-So it really varies.

0:25:470:25:48

Anything could change at any moment.

0:25:530:25:56

I don't know what's happening on the other end.

0:25:560:26:00

I don't know if business is slow,

0:26:000:26:03

or it's just because I'm not being assigned a job.

0:26:030:26:07

I'm vulnerable, and at the mercy of the algorithm,

0:26:070:26:11

and whoever's in control on the other end.

0:26:110:26:15

Half the time, you don't get to see a human being and you don't know

0:26:150:26:18

who... What kind of personalities you're interacting with,

0:26:180:26:21

or if you are interacting with any personality.

0:26:210:26:23

It does feel like I'm missing something in that human-contact way

0:26:230:26:29

of just interacting with a screen.

0:26:290:26:31

-There's no human being doing it, it's the algorithms.

-Yeah.

0:26:310:26:33

We keep talking... I mean, this film is all about the future of work,

0:26:330:26:36

but actually the algorithms are deciding how you work now,

0:26:360:26:39

and whether you are even employed or not.

0:26:390:26:41

Yeah. It makes me feel a bit dispensable.

0:26:410:26:44

Algorithms are the new boss.

0:26:460:26:47

Algorithms telling them where to go, to do the service.

0:26:490:26:53

Algorithms determining the price of the service.

0:26:530:26:57

The algorithm can also sack you these days.

0:26:580:27:01

So there's quite a lot of platforms. There's a rating system of points

0:27:010:27:04

or stars, and if you fall below a certain threshold,

0:27:040:27:08

you get terminated automatically.

0:27:080:27:10

You might just try and log on one day to your app,

0:27:100:27:13

and it simply no longer says that you've got an active account.

0:27:130:27:16
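
The rule being described can be sketched in a few lines of Python; the threshold, window size and data here are invented for illustration, not any platform's real policy:

```python
# Hypothetical sketch of automatic deactivation below a rating threshold.
RATING_THRESHOLD = 4.2   # assumed cut-off; platforms do not publish theirs

def account_status(ratings: list[float]) -> str:
    """Average the most recent ratings; deactivate below the threshold."""
    recent = ratings[-50:]                  # rolling window of recent jobs
    average = sum(recent) / len(recent)
    return "active" if average >= RATING_THRESHOLD else "deactivated"

print(account_status([5.0, 4.0, 3.5, 4.0]))   # -> "deactivated" (avg 4.125)
```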

Algorithm management has become algorithm rebellion.

0:27:170:27:21

Gig economy workers, torn between the idea of being their own boss

0:27:210:27:25

while being controlled by the smartphones in their pockets, are

0:27:250:27:28

arguing for the same employment rights as everyone else.

0:27:280:27:31

An independent review is currently looking into this.

0:27:310:27:34

The way that this type of work is sold,

0:27:350:27:38

whether it be Deliveroo or any company that does this,

0:27:380:27:41

is that it's flexibility, that it gives...

0:27:410:27:44

And that, you know, this is a great way to work, this is the future.

0:27:440:27:47

The flexibility is overplayed massively.

0:27:470:27:50

No guarantee of minimum wage, let alone the living wage.

0:27:500:27:54

It's a really insecure way to live.

0:27:540:27:56

It's hard to plan your life on this kind of wages,

0:27:560:28:00

and this kind of nature of work.

0:28:000:28:03

The war for talent was a critical driver of corporate performance,

0:28:060:28:11

but employers have since realised that technology

0:28:110:28:14

can make that process ruthlessly efficient.

0:28:140:28:16

And today, that insight has spread far beyond companies

0:28:190:28:22

like Deliveroo and Uber. Soon, everyone could be monitored,

0:28:220:28:27

including those working in professional office jobs.

0:28:270:28:30

In Boston, Massachusetts, there's one company ensuring

0:28:320:28:35

none of us can escape the computer's all-seeing eye.

0:28:350:28:38

Their technology allows companies to monitor the productivity of their

0:28:400:28:44

staff, and employees to monitor their own workplace performance

0:28:440:28:48

through technology embedded into a card you wear around your neck.

0:28:480:28:52

They are already selling their services to companies like

0:28:530:28:56

Deloitte and Bank of America.

0:28:560:28:58

This measures with Bluetooth and accelerometer,

0:28:580:29:02

a microphone and a GPS,

0:29:020:29:04

it understands my activity level, it understands where I am.

0:29:040:29:08

Am I at a desk? Am I on a different floor? Am I in a conference room?

0:29:080:29:12

It understands only two things about speech -

0:29:120:29:16

am I talking or not talking?

0:29:160:29:19

Combining all of this across my day in the job,

0:29:190:29:24

it understands how I actually get work done.

0:29:240:29:26

And when you combine that with all the digital signatures that we are

0:29:260:29:29

also patterning, it directly translates into, "Am I happy?

0:29:290:29:32

"Am I loyal to the organisation?

0:29:320:29:34

"And as a company, are we productive and are we profitable?"

0:29:340:29:37

You mentioned the word "happiness" there, which intrigues me.

0:29:370:29:40

So how can you measure someone's happiness through the data?

0:29:400:29:43

With all of these different signatures that we are picking up

0:29:430:29:46

on, we can interpret stress levels.

0:29:460:29:49

As a manager, I can understand if any particular of my employees are

0:29:500:29:53

being overworked, overstressed,

0:29:530:29:57

and are translating to lack of happiness and lack of productivity.

0:29:570:30:01
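
As an illustration of how signals like these might be combined, here is a toy scoring function; the inputs, weights and cut-off are all assumptions, not the company's actual model:

```python
# Toy sketch: blend badge-style signals into a crude overwork indicator.
def overwork_score(hours_at_desk: float, talk_fraction: float,
                   after_hours_fraction: float) -> float:
    """Return a 0-1 score; higher suggests more strain (weights invented)."""
    sitting = min(hours_at_desk / 10.0, 1.0)   # long desk-bound stretches
    silence = 1.0 - talk_fraction              # little face-to-face talk
    late = after_hours_fraction                # share of work after hours
    return 0.4 * sitting + 0.3 * silence + 0.3 * late

if overwork_score(9.5, 0.1, 0.6) > 0.7:        # evaluates to 0.83
    print("flag: possible overwork")
```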

The data collected on staff can also reveal communication patterns.

0:30:030:30:08

The thicker lines suggest more communication,

0:30:080:30:11

the thinner lines, less.

0:30:110:30:13

It monitors whether certain staff are dominating situations

0:30:130:30:16

or working collaboratively.

0:30:160:30:19

So who's the central person...?

0:30:190:30:20

-Yeah.

-Being the size of organisation we are, we might have insight into

0:30:220:30:25

who that is, but what this is a perfect example of

0:30:250:30:28

is what we all describe as bottleneck.

0:30:280:30:31

A single person in our organisation

0:30:310:30:33

has unintentionally been a bottleneck,

0:30:330:30:35

so what's been happening the last month that has drawn this behaviour?

0:30:350:30:40

And what can we do going forward

0:30:400:30:42

to mitigate or eliminate that bottleneck behaviour?

0:30:420:30:45
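
The bottleneck pattern he describes is what network analysis calls high betweenness centrality, and it can be found with standard tools; the graph below is made up, not the firm's data:

```python
# Finding the person most communication flows through, with networkx.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("ana", "raj"), ("ana", "mei"),
    ("ana", "tom"), ("ana", "sam"),   # every pair is linked only via "ana"
])

centrality = nx.betweenness_centrality(G)
print(max(centrality, key=centrality.get))   # -> "ana", the bottleneck
```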

But you see, you know...

0:30:450:30:47

You said you've got quite a fair idea of who that person is.

0:30:470:30:50

-Oh, absolutely.

-So you don't need... In a way, the anonymity

0:30:500:30:53

doesn't matter because, just through the data,

0:30:530:30:55

you can work out who that person was because you know your employees

0:30:550:30:58

and, therefore, you put two and two together.

0:30:580:31:00

-In this case.

-Yeah.

0:31:000:31:02

-In this case, but it's usually only for those anomalies.

-Yeah.

0:31:020:31:06

I suppose the anonymity thing seems a really key element of it

0:31:060:31:09

because I think that's the bit that would frighten people -

0:31:090:31:11

the idea that, "Oh, they're going to drill down into me personally,

0:31:110:31:15

"and my data, and that's going to expose me within the company."

0:31:150:31:18

So...

0:31:180:31:20

there must be a temptation, if a company comes along and says,

0:31:200:31:23

"Can you provide this service?"

0:31:230:31:25

Would you say "yes" or would you say "No, we can't do that"?

0:31:250:31:28

It violates our privacy policy, which starts and is rooted in

0:31:280:31:34

personal privacy and personal ownership of your behavioural data.

0:31:340:31:39

And you can do good for the organisation and maintain

0:31:390:31:43

that simple truth by exposing the patterns of the individual and from

0:31:430:31:48

a top-down perspective exposing those patterns anonymously

0:31:480:31:50

about the individual,

0:31:500:31:52

-but exposing those patterns from a management perspective down.

-Right.

0:31:520:31:56

So there is a line to be drawn, really, and that's the line, really,

0:31:560:31:58

-the line of privacy.

-And you draw it in the beginning.

0:31:580:32:01

Amazing. It's a staggering level, isn't it?

0:32:010:32:03

I'm just kind of... It does really blow your mind,

0:32:030:32:06

the level to which you are able to analyse every single aspect of

0:32:060:32:11

what you're doing at work. It's mind-blowing.

0:32:110:32:13

It's already happening. Your manager is coming in and saying,

0:32:130:32:16

"I think you're blocking everybody else's communication

0:32:160:32:20

"and you're forcing, you know,

0:32:200:32:22

"one office to communicate with the other office through you".

0:32:220:32:25

The ability to capture that, analyse it,

0:32:250:32:28

that computer science really hasn't existed at scale

0:32:280:32:31

except for the last handful of years.

0:32:310:32:33

I have never seen an office where people are wearing a badge

0:32:360:32:40

that collects all the information on every single thing you do,

0:32:400:32:44

from the moment you walk in to the moment you leave.

0:32:440:32:47

So, basically, this little box tells them who's come into the room.

0:32:470:32:54

It's just... It's just unbelievable.

0:32:560:32:59

And what it is, what's really extraordinary about it is it's

0:32:590:33:02

given to you, so you can then police your own productivity

0:33:020:33:06

because we ourselves are our harshest critics,

0:33:060:33:10

and they know that. That's the genius of this stuff.

0:33:100:33:13

But how soon before technology doesn't just enable staff to work

0:33:180:33:21

more effectively, but takes over completely?

0:33:210:33:25

Doctors, accountants, even lawyers,

0:33:280:33:30

are starting to see their professions change.

0:33:300:33:32

And it's all thanks to a deal made

0:33:340:33:36

in a sleepy town in upstate New York.

0:33:360:33:39

# Silver coins that jingle jangle

0:33:410:33:47

# Fancy shoes...#

0:33:470:33:48

In 2004, a man called Charles Lickel came to this steakhouse

0:33:480:33:52

with his colleagues after a hard day's work.

0:33:520:33:55

Lickel was head of computing system software at tech giant IBM.

0:33:560:34:01

As they ate, at 7:00 on the dot, something strange happened.

0:34:050:34:09

APPLAUSE

0:34:090:34:11

All of the diners leapt up from their seats

0:34:150:34:18

and rushed out of the room.

0:34:180:34:20

The TV above the bar had just started showing that night's episode

0:34:200:34:24

of the long-running TV quiz show Jeopardy!

0:34:240:34:27

This is Jeopardy!

0:34:270:34:30

Here are today's contestants.

0:34:310:34:33

A risk analytics manager from Irvine, California,

0:34:330:34:37

Patrick Fernandez.

0:34:370:34:38

Reigning champion, Ken Jennings, was on a record-breaking winning streak.

0:34:380:34:44

Everyone in the restaurant, everyone in America,

0:34:440:34:47

was desperate to find out if he could keep it going.

0:34:470:34:50

Lickel, his colleagues, and the nation, were gripped.

0:34:500:34:53

Let's go to Ken Jennings now. He selected Matthew and Andrew.

0:34:530:34:56

And he risked 5,800, that's right.

0:34:560:34:59

APPLAUSE

0:34:590:35:00

29,000 is what you get today and it brings your total to...

0:35:000:35:04

The unstoppable Ken Jennings did it again that night.

0:35:040:35:07

He seemed more machine than human.

0:35:070:35:10

At that moment, standing at that bar, Lickel had an epiphany.

0:35:100:35:14

He suddenly saw the perfect challenge -

0:35:140:35:16

to invent a computer program that could beat a human

0:35:160:35:19

at the world's most popular quiz show.

0:35:190:35:21

IBM would do a deal with the makers of Jeopardy!

0:35:230:35:27

They would create artificial intelligence that could

0:35:270:35:29

beat the game's greatest players,

0:35:290:35:32

a machine that could understand human language,

0:35:320:35:36

access hoards of information, and answer questions.

0:35:360:35:40

The man tasked to create it was David Ferrucci.

0:35:420:35:45

When I heard about Jeopardy!, I said, "We've got to do this".

0:35:470:35:50

This is exactly the kind of thing we've been working on for years.

0:35:500:35:52

Of course, the problem was, most people were saying,

0:35:520:35:55

"No, this is impossible. You're going to embarrass IBM,

0:35:550:35:57

"essentially. You're going to get on television, this is pure folly,

0:35:570:36:00

"you're going to fall on your face,

0:36:000:36:02

"and you're going to embarrass the whole company."

0:36:020:36:04

We had these meetings with all these executives around big

0:36:040:36:07

conference tables, and they would say,

0:36:070:36:08

"Dave, you're mucking with the IBM brand. You need to win.

0:36:080:36:11

"Whatever you want, just tell us what you need.

0:36:130:36:16

"Tell us what you need to win."

0:36:160:36:18

IBM already had Deep Blue, which had beaten Garry Kasparov at chess.

0:36:180:36:21

So how was this challenge going to be different?

0:36:210:36:24

Jeopardy! can ask about anything,

0:36:240:36:25

-whereas chess, you always knew what you were dealing with.

-Yeah.

0:36:250:36:28

I mean, there's a bunch of pieces, they move in certain ways.

0:36:280:36:30

There's a finite set of moves. You knew what you were dealing with.

0:36:300:36:33

This space was wholly unconstrained. The only thing we had was the prior

0:36:330:36:36

data, in other words, answers had to come from places like

0:36:360:36:38

Wikipedia, dictionaries, novels, plays, the Bible,

0:36:380:36:41

wherever this stuff could come from. So you had that content base,

0:36:410:36:44

but to take that question and figure out what that question was asking,

0:36:440:36:48

and you don't want to buzz in and be wrong.

0:36:480:36:51

So you had to be able to compute your confidence that you were going

0:36:510:36:53

-to be right.

-You must have needed to do some

0:36:530:36:55

trial runs, so what were the trial runs like?

0:36:550:36:57

Were there any glitches or anything?

0:36:570:37:00

In the beginning, it was terrible.

0:37:000:37:01

I mean, it would make ridiculous, ridiculous mistakes.

0:37:010:37:05

And at that time, Watson took two hours to answer a single question,

0:37:050:37:10

so we actually couldn't play it live.

0:37:100:37:12

It would be a very boring, long game.

0:37:120:37:14

This went on for a couple of years,

0:37:140:37:16

until we got to a point where Watson could answer a question in

0:37:160:37:20

three seconds, between two and three seconds.

0:37:200:37:22

-Whoa!

-And that was fast enough to play.

0:37:220:37:26

The day you go into the studio, Dave, what are you thinking?

0:37:260:37:29

As a scientist, I was going in saying, "I wish I could just

0:37:290:37:31

"publish the practice games," because if you just set up

0:37:310:37:34

the AI challenge, you know, we did it.

0:37:340:37:37

We had a very competitive Jeopardy! player that can beat the best humans

0:37:370:37:40

at something that no-one thought was even possible.

0:37:400:37:43

Developed and programmed especially for this moment,

0:37:430:37:46

ladies and gentlemen, this is Watson.

0:37:460:37:48

APPLAUSE

0:37:480:37:50

This was going to come down to one game.

0:37:500:37:52

And if we go out there and we lose that one game, that was going to

0:37:520:37:55

be on television, it would look like the team had lost.

0:37:550:37:57

And that was just a painful thing to think about because, I mean,

0:37:570:38:01

I was in there day-to-day, trying to make this thing happen.

0:38:010:38:04

-Watson?

-Who is CS Lewis?

0:38:040:38:06

-Yes.

-Who is Sir Christopher Wren?

0:38:060:38:09

-You are right.

-What is Picasso?

0:38:090:38:11

-No, sorry.

-What is narcolepsy?

0:38:110:38:13

You are right and, with that, you move to 36,681.

0:38:130:38:19

-APPLAUSE

-A big, big lead.

0:38:190:38:21

When Ken saw Watson get the daily double -

0:38:210:38:23

Ken knows Jeopardy! really well, and he could do math -

0:38:230:38:26

and he was like, "That's it, I lost".

0:38:260:38:28

Even though you were only 32% sure of your response, you are correct.

0:38:280:38:33
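
Ferrucci's earlier point about computing confidence before buzzing in can be reduced to a toy decision rule like this; the candidate scores and the 0.5 threshold are invented, and IBM's actual system was far more elaborate:

```python
# Toy buzz-in rule: answer only when the top candidate is confident enough.
BUZZ_THRESHOLD = 0.5   # illustrative; the real cut-off varied with game state

def buzz(candidates: dict[str, float]) -> str | None:
    """Return the best answer if its normalised score clears the threshold."""
    total = sum(candidates.values())
    best, score = max(candidates.items(), key=lambda kv: kv[1])
    return best if score / total >= BUZZ_THRESHOLD else None

print(buzz({"Who is CS Lewis?": 8.0, "Who is Tolkien?": 2.0}))  # buzzes in
```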

Final Jeopardy!, Watson still would have won by a dollar...

0:38:370:38:41

That's...!

0:38:410:38:42

Which probably would have had a very different effect.

0:38:420:38:46

IBM's future is resting on $1!

0:38:460:38:49

$1! Isn't that fantastic?

0:38:490:38:53

We weren't just building a game machine.

0:38:530:38:55

This was a machine that was taught to process language more effectively

0:38:550:38:59

than any other machine before, and that was a great result for us.

0:38:590:39:05

And I think that got people saying, "Hey, what else can this do?

0:39:050:39:07

"Can this kind of intelligence help us do things?"

0:39:070:39:10

Watson's ability to understand natural language is unprecedented

0:39:130:39:16

and IBM are now selling the software worldwide.

0:39:160:39:20

As a result, it's poised to take a whole host of jobs

0:39:200:39:23

previously thought off-limits to machines.

0:39:230:39:26

It's being used to do the work of lawyers,

0:39:260:39:28

to calculate insurance policies,

0:39:280:39:31

and even to provide technology-enhanced

0:39:310:39:33

patient care in hospitals.

0:39:330:39:35

We're working with IBM Watson to actually create a platform that we

0:39:370:39:40

can answer children's questions when they want to have them answered.

0:39:400:39:43

People talk about bedside manner, but can you create a digital

0:39:430:39:46

bedside manner? So that's the vision, is to have an actual

0:39:460:39:49

artificial intelligence built into how we engage with our children.

0:39:490:39:53

Right, now, what I was wanting to have a go at was to see if I can try

0:39:530:39:56

and train this computer to answer questions that you have.

0:39:560:40:00

But to do that, I need to know what questions you want to know about

0:40:000:40:04

and also if you think the answers are any good.

0:40:040:40:06

Because I'm training it at the moment, because I can't be here

0:40:060:40:08

all the time, because I have to go from patient to patient.

0:40:080:40:11

But especially for the easier questions, if you just ask this,

0:40:110:40:14

and then we can spend more time talking about important things

0:40:140:40:17

that you want to know about.

0:40:170:40:19

Is it going to hurt?

0:40:190:40:22

I understand that you might be worried about having an operation.

0:40:220:40:25

That's OK. During the operation, you won't feel anything,

0:40:250:40:28

as we will give you an anaesthetic.

0:40:280:40:30

Watson is being used at this Children's Hospital in Liverpool

0:40:300:40:34

to help answer patients' questions on everything from the hospital menu

0:40:340:40:38

to details of their treatment.

0:40:380:40:41

How long does it take to fall asleep?

0:40:410:40:44

The project is part of a wider big data research collaboration between

0:40:440:40:49

IBM and the Government's Science and Technology Facilities Council,

0:40:490:40:52

worth £315 million.

0:40:520:40:56

If successful, and rolled out nationally,

0:40:560:40:58

it could generate savings for the NHS.

0:40:580:41:01

What have you attended for? Well, you were having your appendix

0:41:010:41:04

taken out, weren't you? Nope, it's not got that right.

0:41:040:41:07

OK. So I'm going to train it by saying that was the wrong answer.

0:41:070:41:11

OK, so we'll give that one an unhappy face.

0:41:110:41:14

Every so often, it gives you the wrong answer and you kind of go,

0:41:140:41:17

"No, that's the wrong thing to say". And then it learns from that,

0:41:170:41:19

and it starts to progress, so that's what it's like at the start.

0:41:190:41:22

So who knows what it'll be like in five or ten years' time?

0:41:220:41:25

Jobs that require empathy were for years considered the last frontier

0:41:270:41:31

for automation, but no longer.

0:41:310:41:34

Now we have algorithms that are able to substitute for cognitive work

0:41:340:41:39

whereas, in the previous realm, we had seen machines only able to

0:41:390:41:43

replace manual work. Now it is no longer clear what it is that humans

0:41:430:41:49

are fundamentally better at than machines.

0:41:490:41:52

And among all the jobs that we are slowly losing to machines,

0:41:530:41:57

one stands out - teaching.

0:41:570:42:00

Children can now be educated by computers

0:42:020:42:05

in schools that look like workplaces.

0:42:050:42:07

1990 - two political scientists secure a grant from a

0:42:140:42:19

conservative American think tank to carry out a major survey

0:42:190:42:23

into the US education system.

0:42:230:42:25

They wanted to understand why, after more than a quarter century of

0:42:270:42:31

costly education reform, the nation's schools had proven

0:42:310:42:35

so resistant to change.

0:42:350:42:37

The key to the problem, they concluded,

0:42:370:42:39

were the powerful teaching unions.

0:42:390:42:41

Terry Moe was one of those political scientists

0:42:440:42:46

and has since written extensively on the politics of American education.

0:42:460:42:50

What you want are schools and school systems that have incentives to

0:42:520:42:56

create the best possible schools and organisations and

0:42:560:43:02

education environments for children.

0:43:020:43:04

And if it turns out that certain teaching methods,

0:43:040:43:09

certain approaches to learning, are best,

0:43:090:43:11

those are the ones that should be adopted.

0:43:110:43:13

But if powerful groups have

0:43:130:43:17

embraced methods and approaches that don't do that,

0:43:170:43:23

-they're still going to do it.

-Right.

0:43:230:43:25

They are still going to impose these things on children because

0:43:250:43:28

they believe them and they're powerful enough to entrench them

0:43:280:43:31

in the schools. Teachers, I think, on average, love kids, you know,

0:43:310:43:36

they want schools that are really productive.

0:43:360:43:39

However, they join unions to protect their jobs.

0:43:390:43:43

But you know, bigger picture, what's the point of the school system?

0:43:430:43:47

It's to educate children. That's the ONLY point of the school system.

0:43:470:43:51

It was never set up in order to provide jobs to adults.

0:43:510:43:54

Moe decided the only alternative to internal politics in teaching was

0:43:550:44:00

to open up the education system to market forces of supply and demand,

0:44:000:44:05

an argument he set out in a book which would go on to influence

0:44:050:44:08

the British education system, too.

0:44:080:44:11

The book was published, we had the Blair government come in,

0:44:110:44:14

New Labour in the '90s, you're hailed as prophets.

0:44:140:44:17

Revolutionising... And we get the academy system,

0:44:170:44:20

which is...

0:44:200:44:21

Which is basically an attempt to put your ideas into practice.

0:44:210:44:26

So, how did you feel that went?

0:44:260:44:28

Did you see it as an endorsement of your ideas?

0:44:280:44:31

Yes. I mean, I think that, first of all,

0:44:320:44:36

I think we're right in the sense that politics does

0:44:360:44:40

lead to schools that are perversely organised

0:44:400:44:45

and you can expect that to happen.

0:44:450:44:46

# Funky life, I've been told

0:44:460:44:51

# All that glitters is not gold

0:44:510:44:54

# And gold is not reality... #

0:44:540:44:55

What Moe and his associate John Chubb concluded was that technology

0:44:550:44:59

is the key to breaking down the politics of education.

0:44:590:45:03

Technology threatens the idea of school as a building,

0:45:030:45:07

with kids and teachers in the same physical place.

0:45:070:45:10

Technology could make traditional teaching roles obsolete.

0:45:100:45:14

In that book, you argue that technology

0:45:140:45:17

is the way to transform the class,

0:45:170:45:20

in every way, to break the teacher unions, and to do it,

0:45:200:45:23

you bring technology, incrementally, into the classroom,

0:45:230:45:26

and that way teachers slowly become marginalised.

0:45:260:45:28

Yeah, in effect.

0:45:280:45:29

So let's go back to the beginning.

0:45:310:45:33

People tend to think of this as, like, greater reliance on computers

0:45:330:45:37

in the classroom, you know.

0:45:370:45:39

But big picture -

0:45:390:45:41

what we're talking about here is the revolution in

0:45:410:45:44

information technology, which is one of the most powerful

0:45:440:45:50

transformative events in the history of the world, you know.

0:45:500:45:54

It sounds like hyperbole, you know, but it is.

0:45:540:45:58

They talk very directly about how you can use technology to kind of

0:45:580:46:03

usurp the power of the teaching unions

0:46:030:46:06

and you can kind of fragment the workforce.

0:46:060:46:10

They also talk about the opening up of data, so if you take the data

0:46:100:46:13

out of the black box that is the classroom,

0:46:130:46:16

and basically out of the teacher's head,

0:46:160:46:19

then you can start to take the power away from teachers.

0:46:190:46:21

You know, data in the hands of a good teacher is a very...

0:46:210:46:25

It can be a powerful tool.

0:46:250:46:27

But if your reason for doing it is to introduce market forces

0:46:270:46:31

and ultimately a consumer market in education products,

0:46:310:46:34

which is where they're heading with this,

0:46:340:46:37

then at the very least we need to have a conversation about this.

0:46:370:46:41

Is this about improving education for children and their prospects for

0:46:410:46:45

the future, or is it about the privatisation of education?

0:46:450:46:49

Listen up real quick, guys.

0:46:500:46:52

Just to let you know, we have about 20 minutes left of class.

0:46:520:46:55

Let me know if you guys have any questions.

0:46:550:46:57

Good job.

0:46:570:46:59

So I want you to try again and call me over before you submit,

0:46:590:47:02

and we'll go through it, OK?

0:47:020:47:03

At this school in San Diego, 75 pupils study at computer terminals

0:47:030:47:08

while being monitored by one teacher.

0:47:080:47:11

Students sitting next to each other

0:47:130:47:14

are rarely learning the same subjects.

0:47:140:47:17

This is Jordan. He's taking the astronomy 101,

0:47:180:47:22

so this is the instructor for astronomy.

0:47:220:47:24

And then next to him, Brendan, he's doing coding.

0:47:240:47:26

-OK.

-So he's doing a coding course for the coding academy.

0:47:260:47:29

And then we have principles of astronomy here.

0:47:290:47:32

And how does it differ from old-fashioned teaching?

0:47:320:47:34

So, back in the day, a lot of people just thought about,

0:47:340:47:37

"Oh, yeah, pencil and paper works the best,"

0:47:370:47:39

but now that technology has improved and stuff like that,

0:47:390:47:42

it's really cool to get different education.

0:47:420:47:44

So, Jordan, you want to be an astronomer.

0:47:440:47:47

-Well...

-Or maybe not? After watching that.

0:47:470:47:49

I've thought about it, but the class is very interesting,

0:47:490:47:51

so I decided to take it.

0:47:510:47:53

Computers, ironically, can give them a personal experience.

0:47:530:47:58

One-on-one. You compare that to the standard model of a teacher and

0:47:580:48:04

30 students. What is that? Right, a teacher has to provide

0:48:040:48:07

a standardised curriculum to everybody in the class.

0:48:070:48:11

It's completely inefficient as a way of teaching kids

0:48:110:48:15

and I think most educators think, "Well, there isn't any alternative.

0:48:150:48:18

"That's just what teaching is." But it's not. Not any more.

0:48:180:48:22

This is like a dream come true.

0:48:220:48:24

In this school, learning by computer means you're also assessed

0:48:280:48:32

by computer. Huge amounts of data on students give teachers a record,

0:48:320:48:37

not just of exam results, but of how hard each pupil has been working.

0:48:370:48:41
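
A minimal sketch of the kind of check such a dashboard runs is below; the pacing target and pupil data are invented, though the "negative three" flag echoes the exchange that follows:

```python
# Toy progress check: flag pupils behind the weekly pacing target.
WEEKLY_TARGET = 5   # assumed units of coursework per week

progress = {"jordan": 6, "brendan": 5, "casey": 2}

for pupil, done in progress.items():
    delta = done - WEEKLY_TARGET      # negative means behind pace
    if delta < 0:
        print(f"{pupil}: {delta} units; e-mail parent")   # casey: -3 units
```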

As we look at our world history classes and our students,

0:48:430:48:46

how are we doing for the progress for this last week?

0:48:460:48:49

Yeah, we're doing pretty good.

0:48:490:48:51

I mean, looking at the data, we had one struggling student.

0:48:510:48:53

You can see it kind of marked right here in negative three.

0:48:530:48:56

And do we have a plan of action for our student?

0:48:560:48:59

We need to e-mail the parent and say "Hey, they're behind in this course.

0:48:590:49:02

"Here's what we're working on."

0:49:020:49:04

Well, if she's behind on other things,

0:49:040:49:05

we'll have her come in on Monday mornings...

0:49:050:49:07

-Yeah, I think that would be helpful.

-Days when we don't have class, OK.

0:49:070:49:10
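
The progress review the teachers walk through here can be pictured with a short sketch. It is purely illustrative - the film does not show the school's actual software, and every name, number and threshold below is invented - but a minimal version of flagging students who fall behind pace, with a negative score like the "negative three" mentioned above, might look like this:

```python
# Illustrative sketch only - the school's real software is not shown in
# the film, and all names, scores and thresholds here are invented.
# A negative score means a student is behind pace for the week.

from dataclasses import dataclass


@dataclass
class StudentProgress:
    name: str
    course: str
    lessons_completed: int  # lessons finished this week
    lessons_expected: int   # lessons scheduled for this week

    @property
    def score(self) -> int:
        """Positive means ahead of schedule, negative means behind."""
        return self.lessons_completed - self.lessons_expected


def flag_struggling(students, threshold=-3):
    """Return students at or below the threshold, worst first."""
    behind = [s for s in students if s.score <= threshold]
    return sorted(behind, key=lambda s: s.score)


week = [
    StudentProgress("Jordan", "Astronomy 101", 5, 5),
    StudentProgress("Brendan", "Coding", 4, 3),
    StudentProgress("Casey", "World History", 2, 5),  # score -3: flagged
]

for s in flag_struggling(week):
    print(f"{s.name} is behind in {s.course} (score {s.score}) - "
          f"e-mail parent, offer a Monday-morning catch-up session.")
```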

All right, listen up.

0:49:100:49:12

This is exciting. You guys can go.

0:49:120:49:14

See you later.

0:49:140:49:16

Supporters of online classes say they are a cost-effective

0:49:160:49:19

alternative to traditional teaching,

0:49:190:49:22

providing students with more flexibility in their learning,

0:49:220:49:24

as well as access to a greater variety of options.

0:49:240:49:28

But a report by the Organisation for Economic Co-operation

0:49:280:49:31

and Development suggested computers do not improve results.

0:49:310:49:35

When you look at the evidence - and there is very little evidence -

0:49:350:49:38

what evidence there is shows that this stuff doesn't work.

0:49:380:49:42

It doesn't help children to learn. It lowers standards.

0:49:420:49:46

Actually, with heavy use of technology in education, the results are dreadful.

0:49:460:49:51

Kids are turning off, they're not engaged in the content,

0:49:510:49:54

they get very used to it, and it's not helping them to learn.

0:49:540:49:57

There is something about the human interaction that helps kids learn.

0:49:570:50:01

So not only is the problem false, I would argue,

0:50:010:50:04

the solution is a false narrative as well.

0:50:040:50:08

But this second Industrial Revolution, the robot revolution,

0:50:100:50:14

won't take all jobs away.

0:50:140:50:16

In fact, some economists believe it could create billions more,

0:50:160:50:20

a society of full employment, but what will those jobs be?

0:50:200:50:24

To get a glimpse into the future,

0:50:240:50:26

you need to look no further than the humble car wash.

0:50:260:50:29

15 years ago,

0:50:330:50:35

automated car-wash machines on petrol station forecourts started closing down.

0:50:350:50:40

The Petrol Retailers Association say that, since the mid-noughties,

0:50:400:50:44

the numbers have more than halved, from 9,000 to just over 4,000.

0:50:440:50:48

Instead of using machines at garages,

0:50:500:50:53

we use humans to wash cars again.

0:50:530:50:56

It was an odd regression -

0:50:560:50:58

a shift towards a low-pay, low-productivity trap.

0:50:580:51:02

This is a story of a political deal and how it ensured that

0:51:030:51:07

we did the hard work, not robots.

0:51:070:51:10

It's 2003, and ten new countries are about to join the EU,

0:51:140:51:19

eight of them from the old Eastern Bloc -

0:51:190:51:21

poor countries, by Western standards.

0:51:210:51:24

The Home Secretary at the time, Lord Blunkett,

0:51:240:51:27

is about to sign a deal with the EU.

0:51:270:51:30

The question on the table is,

0:51:300:51:31

how many workers from those countries might come to Britain?

0:51:310:51:35

We were uncertain.

0:51:350:51:37

We were sceptical about whether we could make predictions.

0:51:370:51:40

People were used to the idea of free movement and what we were trying

0:51:400:51:44

to address was: given free movement,

0:51:440:51:47

should we allow these people to work legally, pay National Insurance,

0:51:470:51:51

pay tax, or should they work in the sub-economy?

0:51:510:51:54

Britain had a choice - open up its labour market to everyone,

0:51:570:52:02

or temporarily restrict the numbers.

0:52:020:52:05

The job of estimating how many might come to live and work in the UK

0:52:050:52:09

fell to migration expert Professor Christian Dustmann.

0:52:090:52:14

We started that report early in 2003, where we tried

0:52:140:52:18

to predict the number of people who would migrate after 2004.

0:52:180:52:23

We're talking about all the Eastern European countries joining the EU.

0:52:230:52:27

No-one has any idea what this is going to be.

0:52:270:52:29

So we tried to do the best on the evidence that we have.

0:52:290:52:34

Our number was about 13,000 would come to the UK every year.

0:52:340:52:39

The report, however, was based on a very specific scenario,

0:52:390:52:44

that Germany opens up its labour market

0:52:440:52:47

to Eastern European migrants as well.

0:52:470:52:50

Based on Dustmann's estimates,

0:52:500:52:53

the British government did a deal with the EU -

0:52:530:52:56

Britain decided to allow everyone in,

0:52:560:52:59

but then Germany changed its mind.

0:52:590:53:01

Finally, on the 1st of May, it was the UK,

0:53:020:53:06

Sweden and Ireland among the European -

0:53:060:53:10

existing European countries - which actually opened

0:53:100:53:13

the labour market to those new accession countries,

0:53:130:53:18

and not Germany. So, at that point,

0:53:180:53:21

the scenario we had modelled was, of course, not the scenario any more.

0:53:210:53:25
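
A toy sketch can make Dustmann's point literal: a forecast defined only under one scenario simply has no value in any other. Everything below - the function, its name and its behaviour - is invented for illustration; this is not his actual model:

```python
# Deliberately trivial sketch (not Dustmann's model) of why a conditional
# forecast collapses when its condition fails: the published estimate only
# existed for the scenario in which Germany also opened its labour market.

def forecast_annual_uk_migrants(germany_opens_labour_market: bool) -> int:
    if germany_opens_labour_market:
        return 13_000  # the report's central estimate under this scenario
    # No estimate was produced for the scenario that actually happened.
    raise ValueError("forecast is undefined outside the modelled scenario")


print(forecast_annual_uk_migrants(True))  # 13000 - the modelled scenario

try:
    forecast_annual_uk_migrants(False)
except ValueError as err:
    print(f"2004 reality: {err}")
```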

Germany had opted to temporarily restrict its borders to migrants,

0:53:250:53:30

so migrants looked to Britain instead.

0:53:300:53:32

13,000 a year didn't come here -

0:53:320:53:35

it was closer to 40,000. Critics asked how the government

0:53:350:53:40

could so badly miscalculate the numbers.

0:53:400:53:43

The 2003 Dustmann report is predicated on one thing -

0:53:430:53:47

Germany opening up in the same way as we're going to open up

0:53:470:53:51

and Germany doesn't. It has...

0:53:510:53:53

It closes down and it says it's going to close down for seven years.

0:53:530:53:57

Look, let me be very straight.

0:53:570:53:59

I've defended the decision, but I was culpable on two fronts.

0:53:590:54:03

One, I didn't really clock the significance of the decision

0:54:030:54:08

that Germany was taking, even though I knew about it.

0:54:080:54:12

And secondly, I was quite prepared, as politicians so often are,

0:54:120:54:19

to use the cover of the statisticians' analysis -

0:54:190:54:25

guess, if you want - to provide us with a comfort zone

0:54:250:54:28

while we were implementing the early part of the registration.

0:54:280:54:32

We were quite happy to ride with the public understanding that this

0:54:320:54:37

wasn't going to be a major influx and, of course, it was.

0:54:370:54:41

At least 850,000 people - around 3% of the UK working-age population -

0:54:440:54:51

migrated from Eastern Europe to the UK between 2004 and 2011.

0:54:510:54:57

You were hung out to dry, basically, by politicians.

0:54:570:55:00

They came and said, "You've got this completely wrong.

0:55:000:55:04

"You've wildly underestimated the figures.

0:55:040:55:06

"A massive, catastrophic miscalculation by you".

0:55:060:55:11

Nobody read the report, I think. Not many people read the report.

0:55:110:55:15

It's a Sliding Doors moment, David, isn't it? One way or another.

0:55:150:55:19

-Well...

-You said, in a candid way, which is very rare for a politician,

0:55:190:55:23

-"We got it wrong".

-Well, we got that part of it wrong.

0:55:230:55:27

But there's no question looking back that it was a very fine line

0:55:270:55:33

as to whether we were reducing future productivity levels,

0:55:330:55:38

whilst actually creating and holding on to very high employment.

0:55:380:55:42

These are very fine decisions and sometimes you get them wrong.

0:55:420:55:46

The effect of EU migration on the UK's labour market is the subject of

0:55:500:55:55

a highly political and frequently polarised debate.

0:55:550:55:58

More than one in five low-paid jobs in the UK is now occupied by a

0:55:580:56:03

low-skilled migrant worker.

0:56:030:56:04

What appeared at the time to be a decision benefiting the economy

0:56:060:56:09

became a political negative with the economic downturn after 2007.

0:56:090:56:15

The future of work, in an age of austerity and automation,

0:56:170:56:21

will be for us all to find our own opportunities.

0:56:210:56:25

I'm interested in the future of the United Kingdom and United States,

0:56:260:56:29

and, in the face of robotics and AI, I'm interested in

0:56:290:56:31

how we're going to create jobs. What I'm saying to you is,

0:56:310:56:34

"If you can't do it, artificial intelligence can".

0:56:340:56:37

I look at it as technology providing opportunity.

0:56:380:56:42

Technology giving us...

0:56:420:56:43

..liberating us to do more of what we want, the way we want to do it.

0:56:450:56:48

Of course, there's also the fear side of that, right?

0:56:480:56:51

Which is, "Am I going to lose my job?" We have to think about it.

0:56:510:56:54

We have to participate. You just don't let the machines take over.

0:56:540:56:58

Be responsible in thinking about how you apply this technology,

0:56:580:57:01

how do you phase it in incrementally, and so forth.

0:57:010:57:06

This great push towards robots, this is the future,

0:57:060:57:09

and it somehow takes away any agency and any political democratic process

0:57:090:57:15

because these are the imaginings of Silicon Valley.

0:57:150:57:18

Small government, gig economy, the Uberification of jobs -

0:57:180:57:23

that's kind of what they want.

0:57:230:57:25

But I would counter that with, "Well, is it what WE want?"

0:57:250:57:28

And so, the fate of the automated car wash becomes one model of

0:57:300:57:35

how work could go - machines replaced by low-paid workers

0:57:350:57:39

motivated only by the fear of unemployment.

0:57:390:57:42

Could this be the future of work?

0:57:420:57:46

Watson, in a control centre, an unblinking eye,

0:57:460:57:50

monitoring how many cars you and I wash in an hour.

0:57:500:57:54
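
To make that scenario concrete, here is a purely hypothetical sketch - nothing below reflects a real Watson product or API, and the log and names are invented - of how a control centre might count washes per worker per hour from a timestamped log:

```python
# Hypothetical sketch of the "unblinking eye" - no real Watson API or
# product is shown in the film; the log and worker names are invented.
# Count car washes per worker in each hour from timestamped events.

from collections import Counter
from datetime import datetime

wash_log = [
    ("worker_a", "2017-06-01 09:05"),
    ("worker_a", "2017-06-01 09:25"),
    ("worker_a", "2017-06-01 09:50"),
    ("worker_b", "2017-06-01 09:40"),
]

counts = Counter()
for worker, stamp in wash_log:
    # Bucket each event into the hour it falls in.
    hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").replace(minute=0)
    counts[(worker, hour)] += 1

for (worker, hour), n in sorted(counts.items()):
    print(f"{worker}: {n} cars washed in the hour from {hour:%H:%M}")
```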

To explore further how digital technologies

0:57:570:57:59

are transforming societies, go to the BBC web page on-screen

0:57:590:58:04

and follow the links to the Open University.

0:58:040:58:07
