Work

Transcript

0:00:03 > 0:00:07What if the way we understand the world is wrong?

0:00:08 > 0:00:12What if it's not politicians and world events that shape our lives,

0:00:12 > 0:00:14but business deals?

0:00:14 > 0:00:17It's natural for people to fight against change,

0:00:17 > 0:00:21but change is coming, and we need to adapt to it.

0:00:22 > 0:00:27Deals made in secret, high up in a boardroom, over a drink,

0:00:27 > 0:00:29or in a car in the dead of night.

0:00:29 > 0:00:32We were trying to protect people from excess medicine.

0:00:32 > 0:00:34- You couldn't. - Yeah, we failed.

0:00:35 > 0:00:39Our every waking hour has been transformed by these deals,

0:00:39 > 0:00:43reprogramming us to think and behave in a different way.

0:00:43 > 0:00:46While we're competing with these large monolithic banks that are very

0:00:46 > 0:00:48bureaucratic, we can run circles around them all day.

0:00:48 > 0:00:51This is the story of these deals.

0:00:51 > 0:00:53These ideas were ahead of their time.

0:00:53 > 0:00:56Online payments can change the world.

0:00:56 > 0:01:01Who made them, why, and what do they mean for our future?

0:01:04 > 0:01:08In this series, I'm going to look at three key aspects of our lives,

0:01:08 > 0:01:13and how they've been revolutionised by these deals.

0:01:13 > 0:01:16Who wants to kill cash, and why?

0:01:16 > 0:01:19Our vision is to have a cashless society.

0:01:19 > 0:01:23Card payments are the best way to pay and the best way to be paid.

0:01:23 > 0:01:26Why do we medicate every aspect of our lives?

0:01:26 > 0:01:30This notion of the pursuit of happiness, it's not realistic!

0:01:30 > 0:01:33It gets all of us to be on drugs!

0:01:33 > 0:01:36But in this film, I'll be looking at why we're trapped in a culture

0:01:36 > 0:01:39of endless work.

0:01:39 > 0:01:41It is over, dude. It is gone.

0:01:42 > 0:01:45If you can't do it, artificial intelligence can.

0:01:46 > 0:01:50How did work go from something we do to who we are?

0:01:50 > 0:01:53It's not a person, it's a performer, right?

0:01:53 > 0:01:56I'm sorry, it's not a person?! It is a person!

0:01:56 > 0:01:58I'm a person sitting in front of you.

0:01:58 > 0:02:03These are the deals that made you work harder, made your pay lower...

0:02:03 > 0:02:07These are very fine decisions, and sometimes you get them wrong.

0:02:07 > 0:02:10..and might one day replace you with a robot.

0:02:16 > 0:02:18PHONE CHIMES

0:02:23 > 0:02:29This is what 12% of us are going to do tonight, and every night -

0:02:29 > 0:02:33wake up at 3:00 am, and check our work e-mail.

0:02:34 > 0:02:38A few hours later, 51% of us are going to spend more time

0:02:38 > 0:02:41checking our work e-mail than eating breakfast.

0:02:47 > 0:02:5230% of jobs in the UK could be automated within the next 15 years.

0:02:54 > 0:02:57As intelligent machines begin their march on the labour market,

0:02:57 > 0:03:01how long will it be before they overhaul our economy completely?

0:03:05 > 0:03:07This is the bigger picture -

0:03:07 > 0:03:11how WE became robots, automated our lives

0:03:11 > 0:03:16at the very moment that robots learnt to become human.

0:03:28 > 0:03:31To understand how we got here,

0:03:31 > 0:03:35we need to go back to the '70s, to a deal that would make us loyal

0:03:35 > 0:03:38to our workplace, that would make us become our jobs,

0:03:38 > 0:03:42and our jobs eventually become our lives.

0:03:53 > 0:03:55Summer 1975.

0:03:55 > 0:03:59America is in recession, losing jobs at an unprecedented rate.

0:04:04 > 0:04:08Four men set themselves the task of rebuilding American enterprise.

0:04:10 > 0:04:15They meet in downtown San Francisco in an office on the 48th floor.

0:04:16 > 0:04:20Their meeting would change the language of American business.

0:04:24 > 0:04:26One of them was Tom Peters,

0:04:26 > 0:04:29a larger-than-life management consultant,

0:04:29 > 0:04:32who would go on to become a celebrity life coach.

0:04:32 > 0:04:34Huge news!

0:04:34 > 0:04:36Another was Richard Pascale,

0:04:36 > 0:04:40a quietly spoken academic in the field of management science.

0:04:41 > 0:04:45This unlikely double act, who were both working as consultants

0:04:45 > 0:04:47for McKinsey, a management consultancy firm,

0:04:47 > 0:04:52made a deal to work together to create a new management philosophy,

0:04:52 > 0:04:55and one country in particular seemed to show the way.

0:04:58 > 0:05:01It is obvious that the United States economy is not in good shape.

0:05:01 > 0:05:04But while our industry is floundering,

0:05:04 > 0:05:07the Japanese are reporting record profits.

0:05:07 > 0:05:13The Americans were getting hammered by the Japanese in a lot of things,

0:05:13 > 0:05:16but the one that hurt our ego was the automobile market.

0:05:16 > 0:05:21To grossly oversimplify, to the point of not being obviously the

0:05:21 > 0:05:24whole story, the Japanese were beating us on fundamentally what

0:05:24 > 0:05:27we had, very sexy cars, except theirs, actually,

0:05:27 > 0:05:30when you turned the ignition key, they started.

0:05:30 > 0:05:34- Right.- And our quality was highly questionable.

0:05:34 > 0:05:37# There's something inside you... #

0:05:37 > 0:05:41Japan has gone from being a producer of trash to a nation synonymous

0:05:41 > 0:05:42with quality and reliability.

0:05:44 > 0:05:48So this was a moment where the Japanese motor industry was making

0:05:48 > 0:05:51cars that were of quality, they were reliable.

0:05:51 > 0:05:53- Absolutely.- And the American car industry was going...

0:05:53 > 0:05:57Absolutely, and, you know, to be realistic about it, first of all,

0:05:57 > 0:06:00you're attacking us in the car industry, and that doesn't feel good

0:06:00 > 0:06:07to you and me, and secondly, for God's sakes, it was 1977.

0:06:07 > 0:06:10We're only 30 years beyond... Well, what was it?

0:06:10 > 0:06:11From the war.

0:06:11 > 0:06:17Yeah, and everybody's dad or grandad fought in World War II.

0:06:17 > 0:06:19So, you know, we had just beaten these buggers,

0:06:19 > 0:06:21and, "What the hell are they doing?" you know?

0:06:21 > 0:06:24So the whole thing about, "We won the war, but they've won the peace"?

0:06:24 > 0:06:27Absolutely. Precisely so.

0:06:29 > 0:06:32But it was about far more than how to build a better car.

0:06:34 > 0:06:36What else could America learn from the Japanese?

0:06:37 > 0:06:39For Richard Pascale,

0:06:39 > 0:06:42the answer lay in the unique Japanese business culture.

0:06:43 > 0:06:46Everyone feels this incredible sense of commitment.

0:06:46 > 0:06:49You don't come in to hang out and lay back.

0:06:49 > 0:06:52You're in there like you would on a team.

0:06:52 > 0:06:54You're expected to play and hold up your end of the deal.

0:06:54 > 0:06:59So that quality of commitment and energy just resonated with me

0:06:59 > 0:07:01in a very deep and profound way.

0:07:01 > 0:07:05The companies care not only about their workers' job performance,

0:07:05 > 0:07:07but about their health, their family life,

0:07:07 > 0:07:10and their ideas for doing things better.

0:07:10 > 0:07:13Good ideas can win a Canon camera worker a paid vacation

0:07:13 > 0:07:15and a cash bonus.

0:07:15 > 0:07:18This attitude seemed to derive from the values set

0:07:18 > 0:07:21by the companies' bosses.

0:07:21 > 0:07:23Management philosophy was just astonishing.

0:07:23 > 0:07:27You had aspirations that allowed people in their work

0:07:27 > 0:07:30to have a meaningful sense of purpose.

0:07:32 > 0:07:36We felt strongly that this overarching sense of what a company

0:07:36 > 0:07:40is about and the meaning it makes seems to make a difference.

0:07:40 > 0:07:43Together, they arrived at a system of management with what they called

0:07:43 > 0:07:46shared values at its heart.

0:07:46 > 0:07:51What we came up with was a way of taking the best of the

0:07:51 > 0:07:56Japanese ideas and keeping them coupled to...

0:07:56 > 0:07:59God, I hate terms like this because they're such gross

0:07:59 > 0:08:02over-generalisation, but America's innovative spirit.

0:08:04 > 0:08:09Peters and Pascale's meeting of minds appealed to business leaders,

0:08:09 > 0:08:12humbled by Japanese manufacturing prowess.

0:08:12 > 0:08:16From now on, the company wouldn't just give employees jobs to do,

0:08:16 > 0:08:20it would set all-encompassing values for them to follow.

0:08:20 > 0:08:23This was the key to reshaping their businesses.

0:08:25 > 0:08:28Both went on to publish books based on their ideas

0:08:28 > 0:08:30within a year of each other.

0:08:30 > 0:08:33Pascale's The Art Of Japanese Management was published in 1981.

0:08:35 > 0:08:41That shared value insight turned out to be both very useful,

0:08:41 > 0:08:46but as well, at the time, extremely provocative.

0:08:46 > 0:08:49Hard to imagine today, with the whole industry of vision

0:08:49 > 0:08:53and values being created in the last 10 or 20 years,

0:08:53 > 0:08:56in which everyone has their little plaque on the wall

0:08:56 > 0:08:59that states what they aspire to be.

0:08:59 > 0:09:02But at the time when we started talking about this,

0:09:02 > 0:09:07I remember truly violent reactions from audiences.

0:09:07 > 0:09:08What did they say you were trying to do?

0:09:08 > 0:09:11Well, they saw it as brainwashing.

0:09:11 > 0:09:15They saw it as paternalistic, they found it extremely offensive,

0:09:15 > 0:09:20intrusive. Placing companies in a role that was out of their scope

0:09:20 > 0:09:22and inappropriate for a commercial enterprise.

0:09:23 > 0:09:27But while Pascale's book didn't capture the public's imagination,

0:09:27 > 0:09:31Tom Peters took the same ideas and brought them to millions

0:09:31 > 0:09:35with his book called In Search Of Excellence.

0:09:35 > 0:09:37I don't really want to completely cheapen it

0:09:37 > 0:09:39and say I was the populariser,

0:09:39 > 0:09:41but I was the populariser.

0:09:41 > 0:09:45But you were, actually, redefining how a workplace should be.

0:09:45 > 0:09:49Yeah. There is a real intellectual hard...

0:09:49 > 0:09:52- Of course.- ..no baloney, intellectual history for this stuff.

0:09:52 > 0:09:54- Right. - But nobody paid any attention to it.

0:09:54 > 0:09:59- Right.- And first of all we got lucky with labels like Excellence and we

0:09:59 > 0:10:03got lucky with 10% unemployment and we got lucky by being

0:10:03 > 0:10:06caught with our pants down and cars that didn't work.

0:10:06 > 0:10:12And so I will completely acknowledge that this new look at organisations

0:10:12 > 0:10:15came together with insane timing.

0:10:15 > 0:10:18And there was nothing wrong with what we did,

0:10:18 > 0:10:19it was a pretty good piece of work,

0:10:19 > 0:10:23but the timing was a gift from the gods.

0:10:25 > 0:10:29Tom Peters had put a management book on the bestseller list,

0:10:29 > 0:10:31selling in excess of five million copies.

0:10:31 > 0:10:34It was a modern business blockbuster,

0:10:34 > 0:10:37but most importantly, it made management ideas

0:10:37 > 0:10:40a serious topic of conversation in organisations.

0:10:40 > 0:10:45Its thinking was adopted by companies around the world.

0:10:45 > 0:10:47What Peters did was he made culture popular.

0:10:47 > 0:10:52He had initially created this very clever thing for McKinsey called the

0:10:52 > 0:10:587S Model, where he identified the soft and hard drivers of

0:10:58 > 0:11:03organisation culture and how they pulled themselves together to create

0:11:03 > 0:11:06a set of shared values that enabled the delivery of strategy.

0:11:06 > 0:11:09And he... It was a very smart piece of thinking.

0:11:09 > 0:11:11And it enabled leaders, for the first time,

0:11:11 > 0:11:14to get their hands on something and say,

0:11:14 > 0:11:16"so if we change this process over here,

0:11:16 > 0:11:19"or if we change our reward mechanism over here,

0:11:19 > 0:11:21"what we should see is a different outcome, or a different...

0:11:21 > 0:11:24"Driven by a different set of behaviours".

0:11:24 > 0:11:28Before Peters met Pascale, the idea of your work

0:11:28 > 0:11:31having a higher meaning would have seemed ludicrous.

0:11:31 > 0:11:36But as their thinking caught on, our work culture changed.

0:11:36 > 0:11:42We were asked to see work as having a higher purpose, not just as a job.

0:11:42 > 0:11:46Work could and would offer us fulfilment.

0:11:49 > 0:11:52As the idea of shared values spread,

0:11:52 > 0:11:55workplaces around the world became more than just workplaces.

0:12:00 > 0:12:02They had mission statements.

0:12:02 > 0:12:05They had values that the workers signed up to.

0:12:06 > 0:12:09Good morning, everybody. Welcome to Services To Fruit.

0:12:09 > 0:12:11For those of you who are new,

0:12:11 > 0:12:15Services To Fruit is our opportunity to recognise unsung heroes.

0:12:15 > 0:12:17People who have gone the extra mile,

0:12:17 > 0:12:20people who have truly lived the values.

0:12:20 > 0:12:23This month's Lady Of The Sash is Katrina.

0:12:30 > 0:12:32At Innocent, makers of fruit smoothies,

0:12:32 > 0:12:36they go further than most to create a unique company culture.

0:12:37 > 0:12:40Using team bonding techniques, inspired by cult films,

0:12:40 > 0:12:43like Bill And Ted's Excellent Adventure,

0:12:43 > 0:12:45they see their workforce as a family.

0:12:49 > 0:12:50One of my favourite things is this.

0:12:50 > 0:12:53So this is... This is our baby photo wall.

0:12:53 > 0:12:56When you join Innocent, we will put your baby photo up.

0:12:56 > 0:12:58The idea being that you can see...

0:12:58 > 0:13:01"This is who I work with, everybody is human, everyone is a baby,

0:13:01 > 0:13:04- "everyone has a job to do". - Tim, is your Chief Exec here?

0:13:04 > 0:13:07Yeah, Douglas, our chief squeezer, he's actually just down the bottom.

0:13:07 > 0:13:09Not at the top, trying to look down on everyone else.

0:13:09 > 0:13:13We're one big team, and you're not just here to do that job.

0:13:13 > 0:13:16- Yeah.- You're here to make the world a better place.

0:13:16 > 0:13:19Do you think someone diffident and cynical like myself

0:13:19 > 0:13:21would be able to fit in?

0:13:21 > 0:13:24Because I'm the sort of person that would come and if I was interviewed,

0:13:24 > 0:13:26I'd be like, "Oh, I'm not going to buy into all this stuff."

0:13:26 > 0:13:29Do you see...? I'm just interested in the kind of personality that...

0:13:29 > 0:13:32You know, would someone like myself be accepted?

0:13:32 > 0:13:36We have people that have been here a long time, 10, 15, 16, 17 years.

0:13:36 > 0:13:39And they don't come for beers every Friday and they don't get involved

0:13:39 > 0:13:43in all the parties, but they are still about what Innocent are about.

0:13:43 > 0:13:46But what if you become too dedicated to your job?

0:13:46 > 0:13:49If you have truly drunk in the company values,

0:13:49 > 0:13:52then sooner or later, is there a risk that you become your job?

0:13:53 > 0:13:55And if you are your job,

0:13:55 > 0:13:59what happens when the working day gets longer and longer?

0:14:01 > 0:14:05In Japan, shared values have always gone hand in hand

0:14:05 > 0:14:09with a long hours culture, and what they call karoshi -

0:14:09 > 0:14:12death from overwork.

0:14:12 > 0:14:14In terms of this new workplace,

0:14:14 > 0:14:19this new really highly competitive poster book, '80s workplace,

0:14:19 > 0:14:21it's totally redefined.

0:14:21 > 0:14:24And what it is, is that you have to give your all to the company

0:14:24 > 0:14:26if you're going to survive.

0:14:26 > 0:14:29If you're going to keep your job, it means being...

0:14:29 > 0:14:31You know, policing your own productivity.

0:14:31 > 0:14:36Yeah, I mean, if I work my one ass off to help the ten people who work

0:14:36 > 0:14:40for me be better prepared for the workplace of tomorrow,

0:14:40 > 0:14:45they will work their ten asses off to make me look like a superstar.

0:14:48 > 0:14:51Innocent have a novel way of curbing a long-hours working culture.

0:14:53 > 0:14:55So this here is our last leaver pulls lever.

0:14:55 > 0:14:58So at the end of the working day,

0:14:58 > 0:15:01the last person to leave the floor will make an announcement at around

0:15:01 > 0:15:038:00, 8:30, and they will shut this off,

0:15:03 > 0:15:07and it basically stops all non-essential electrical equipment.

0:15:07 > 0:15:11The idea being that that last person is telling everyone, "Come on, guys,

0:15:11 > 0:15:13"we've had enough, let's go."

0:15:13 > 0:15:17But not keeping a lid on your working hours can take its toll.

0:15:23 > 0:15:27There are more than 100,000 strokes in the UK each year.

0:15:27 > 0:15:29OK, and again.

0:15:29 > 0:15:33It's the fourth biggest single cause of death in the country.

0:15:35 > 0:15:37- Fingers are opening, lovely. - Turn your head.

0:15:37 > 0:15:40Most of it is age-related, but a study by University College London,

0:15:40 > 0:15:42published in The Lancet,

0:15:42 > 0:15:45showed a link between long working hours and a higher risk of stroke.

0:15:45 > 0:15:48This patient thinks the stress of running his business

0:15:48 > 0:15:50played a part in his.

0:15:50 > 0:15:55Reflecting on it, I suppose it was the stress of work.

0:15:55 > 0:15:56Yeah.

0:15:56 > 0:16:01I have an international business, and I'm flying around the world.

0:16:01 > 0:16:02It takes a toll.

0:16:02 > 0:16:05And when you look back, is there one thing that you think,

0:16:05 > 0:16:07"Oh, yeah, you know, I was really worried about that"?

0:16:07 > 0:16:09Or do you think it was just an accumulation of stress?

0:16:09 > 0:16:12I suppose it must have been an accumulation.

0:16:12 > 0:16:16There's no question, I think, in my mind that the kind of job that

0:16:16 > 0:16:19Mr Mehta's doing, often working 12, 18 hours a day

0:16:19 > 0:16:23- for nearly 20, 30 years...- Yeah. - ..that was a contributing factor.

0:16:23 > 0:16:26- And a seven-day week.- Seven days a week, yeah. You never turn off.

0:16:26 > 0:16:29- No. - That's the thing, yeah.

0:16:29 > 0:16:31Ten years ago, I had a stroke.

0:16:31 > 0:16:34A bit like you, it came out of the blue,

0:16:34 > 0:16:39and in the ward I was in, everyone who'd had a stroke

0:16:39 > 0:16:44was either in their late 30s, or in their 40s, in that ward.

0:16:44 > 0:16:49And the common factor amongst all of us was stress.

0:16:49 > 0:16:53Everyone had a highly stressed job, whether it be rubbish collection,

0:16:53 > 0:16:55or a city broker or a lawyer, you know,

0:16:55 > 0:16:59everyone was in there, and they all had high stress jobs, including me.

0:16:59 > 0:17:03I can tell you when I come into the room, when he's not looking,

0:17:03 > 0:17:08he's on his iPad answering e-mails before I came into the room.

0:17:08 > 0:17:12And that's whilst he's had a stroke within the last six weeks,

0:17:12 > 0:17:14and yet, he's just promised us -

0:17:14 > 0:17:16and this is not the first time he's promised, by the way -

0:17:16 > 0:17:19that he will reduce his working hours.

0:17:19 > 0:17:24Quite frankly, the e-mails and mobile phones don't help.

0:17:24 > 0:17:26- Because you never switch off. - No.

0:17:26 > 0:17:28- You can't escape that. - It's impossible.

0:17:30 > 0:17:34Peters, Pascale and their co-writers would foster the idea of

0:27:34 > 0:27:37employees' devotion to their employers.

0:17:37 > 0:17:41But then, over time, our bosses began to persuade themselves

0:17:41 > 0:17:45that they didn't need to treat us with the same devotion.

0:17:45 > 0:17:50Two decades after Pascale and Peters met, another deal would encourage

0:17:50 > 0:17:55companies to hire and fire us at will, no matter how devoted we were.

0:18:07 > 0:18:121997 - the world of business is changing fast.

0:18:14 > 0:18:19In America, traditional blue chip companies were losing talented staff

0:18:19 > 0:18:21to tech upstarts from Silicon Valley.

0:18:26 > 0:18:31They needed a solution, and they paid management consultants McKinsey

0:18:31 > 0:18:33to provide it.

0:18:33 > 0:18:36Helen Handfield-Jones and her colleagues were convinced that

0:18:36 > 0:18:41companies were failing because they were employing the wrong people.

0:18:41 > 0:18:44It was in the '90s with the dot-com boom, big companies all over

0:18:44 > 0:18:47the place that used to get the cream of the crop

0:18:47 > 0:18:51from their recruiting, you know, the IBMs, the GEs,

0:18:51 > 0:18:53were losing talent to dot-com start-ups,

0:18:53 > 0:18:55which they'd never faced before.

0:18:57 > 0:19:01To solve this problem, McKinsey sold a strategy to their clients.

0:19:01 > 0:19:04They called it the war for talent.

0:19:04 > 0:19:08Using practical examples from companies such as General Electric

0:19:08 > 0:19:13and Enron, the authors outlined five imperatives that every leader,

0:19:13 > 0:19:18from CEO to unit manager, must act on to build a stronger talent pool.

0:19:18 > 0:19:21It completely redefined the relationship

0:19:21 > 0:19:24between a company and its workers.

0:19:24 > 0:19:29Talent is the sum total of your skills and ability, values,

0:19:29 > 0:19:33character that you bring to your work role.

0:19:33 > 0:19:35Having great talent in the organisation is critical

0:19:35 > 0:19:37to the business success.

0:19:37 > 0:19:39Now, it sounds like motherhood and apple pie, right?

0:19:39 > 0:19:41Everybody would say that.

0:19:41 > 0:19:44But there's some roles that require just a very unique set of skills

0:19:44 > 0:19:48that doesn't come along easily, and isn't trainable.

0:19:48 > 0:19:51For McKinsey, the most important thing in any company was

0:19:51 > 0:19:55having the right employees. The good people needed to be kept,

0:19:55 > 0:19:58and underperforming staff let go.

0:19:58 > 0:20:03You categorised employees into A, B and C.

0:20:04 > 0:20:08And A were people who were super-talented and successful,

0:20:08 > 0:20:11- and you just allow them to get on with it.- Mm-hm.

0:20:11 > 0:20:14And B were people who were kind of coasting along...

0:20:14 > 0:20:16Doing OK, doing a good job.

0:20:16 > 0:20:18Doing OK, but we're not sure whether to keep them.

0:20:18 > 0:20:22- Yeah.- And then C were people that you sack.

0:20:22 > 0:20:23OK. All of that is mischaracterised.

0:20:23 > 0:20:25Let me start that again.

0:20:25 > 0:20:27That's the hardest part of talent management is dealing with

0:20:27 > 0:20:29low performers. The concept is that...

0:20:29 > 0:20:33Look hard at your low performers, have a clear eyed view,

0:20:33 > 0:20:37assess their performance rigorously like you do everybody else,

0:20:37 > 0:20:38and make a decision. That's what it means.

0:20:38 > 0:20:42So you didn't say that the C person should just be sacked?

0:20:42 > 0:20:44It's not a person, it's a performer.

0:20:44 > 0:20:47- Right?- Oh, sorry. So it's not a human being?

0:20:47 > 0:20:48No. No, no.

0:20:48 > 0:20:51- No, what I'm saying is... - It's not a person?! It IS a person.

0:20:51 > 0:20:54I'm a person sitting in front of you,

0:20:54 > 0:20:55being interviewed by my boss.

0:20:55 > 0:20:58- Yeah, yeah.- And they're deciding whether I should be sacked or kept.

0:20:58 > 0:21:01- Yes, but it...- I'm a person. - Of course you're a person.

0:21:01 > 0:21:03But you said I wasn't a person. You said I was a performer.

0:21:03 > 0:21:06But you also used the word a moment earlier that said you assess

0:21:06 > 0:21:08whether people were TALENTED or not by A, B or C.

0:21:08 > 0:21:12I said, "No, that's not right." We assess people on their performance.

0:21:12 > 0:21:16- Right.- Are they performing at a high level, an OK level,

0:21:16 > 0:21:20- or a poor level?- Right.- Below acceptable level of performance.

0:21:20 > 0:21:22They're toxic to the organisation, they're not growing

0:21:22 > 0:21:25their businesses, if that's what you need them to do,

0:21:25 > 0:21:27they're poor people managers, whatever it is,

0:21:27 > 0:21:29whatever dimension of performance -

0:21:29 > 0:21:32this is not acceptable performance for a leader of our company.

0:21:36 > 0:21:38In the late 1990s and early 2000s,

0:21:38 > 0:21:41the war for talent would become hugely influential.

0:21:43 > 0:21:47As companies did everything possible to keep their supposedly A-grade

0:21:47 > 0:21:52employees, there was a massive increase in executive pay.

0:21:52 > 0:21:57So if you go back 20, 25 years, the ratio, say in the UK,

0:21:57 > 0:22:01across most of Europe, between a CEO's pay

0:22:01 > 0:22:05and the average earner in an organisation, was about 20-1.

0:22:06 > 0:22:11And something happened in the late '80s through the '90s to the

0:22:11 > 0:22:16early 2000s which saw Chief Executives getting increases

0:22:16 > 0:22:18of 100-300% over that period of time.

0:22:18 > 0:22:22The flames were fanned by the war-for-talent thinking.

0:22:22 > 0:22:27Do you think there's a danger you create a kind of inherent inequality

0:22:27 > 0:22:31within an organisation because just by dividing people into A, B and C,

0:22:31 > 0:22:33you create resentment, you breed anxiety?

0:22:33 > 0:22:36Well, no, again, you make an assumption that's not right.

0:22:36 > 0:22:40We do not tell people, "Oh, by the way, you're a B, or you're a B plus,

0:22:40 > 0:22:42"or you're an A, whatever. "You have potential..."

0:22:42 > 0:22:45We don't tell people that. It's not about stamping letter grades

0:22:45 > 0:22:49on people's foreheads or having Crown Princes walk around

0:22:49 > 0:22:50with crowns on their heads.

0:22:50 > 0:22:53It's about giving each person the feedback that's most helpful and

0:22:53 > 0:22:57appropriate to them, about what they can do to perform better,

0:22:57 > 0:23:00and how their careers might unfold in this organisation.

0:23:00 > 0:23:03So, Helen, has one of the legacies of the war for talent been,

0:23:03 > 0:23:07in a way, the idea that we all need to be talented now in the workplace?

0:23:07 > 0:23:09Absolutely, yeah.

0:23:09 > 0:23:14These companies have lessened their commitment, long-term, to people.

0:23:14 > 0:23:16In the old days, you hired somebody full-time,

0:23:16 > 0:23:19and you hired them forever. And people thought,

0:23:19 > 0:23:22"I have a job with one company, I stay there forever."

0:23:22 > 0:23:23Right? That's long gone.

0:23:23 > 0:23:27There's no free lunches any more, right? For anybody.

0:23:29 > 0:23:32The world of work had changed significantly since

0:23:32 > 0:23:35Peters and Pascale's vision of shared values -

0:23:35 > 0:23:39being loyal to the workplace and your boss being loyal to you.

0:23:40 > 0:23:43The importance of shared values

0:23:43 > 0:23:46shouldn't be underestimated as a concept.

0:23:46 > 0:23:52It's probably the lasting huge thought that is still with us and

0:23:52 > 0:23:56should keep going with us because, of course, it was shared values that

0:23:56 > 0:23:59we got wrong in the '90s and the early 2000s with the war for talent

0:23:59 > 0:24:02and the L'Oreal generation, the shared value became, "Me".

0:24:04 > 0:24:06It didn't become a greater creation of wealth.

0:24:06 > 0:24:11It didn't become doing the right thing, and we lost the sort of

0:24:11 > 0:24:15Aristotle version of a virtuous or a good leader.

0:24:19 > 0:24:22The ideas in the war for talent would affect us all.

0:24:23 > 0:24:29They told businesses worldwide that it was OK to hire and fire us.

0:24:29 > 0:24:32Today, the average worker has six jobs

0:24:32 > 0:24:35over the course of his or her life.

0:24:35 > 0:24:39New graduates expect to have 20.

0:24:39 > 0:24:41The job for life is over.

0:24:43 > 0:24:47And it was the tech firms, whose rise prompted the war for talent

0:24:47 > 0:24:51in the first place, who took the idea to its logical extreme.

0:24:52 > 0:24:55At the beginning of the shift, I get my phone out,

0:24:55 > 0:24:57log in, and set myself as available, and off I go.

0:24:59 > 0:25:02A world without long-term employees,

0:25:02 > 0:25:06where even the very idea of staff is redundant.

0:25:07 > 0:25:10I normally do Monday to Friday lunch shifts,

0:25:10 > 0:25:1410-2, and a couple evenings on top of that.

0:25:17 > 0:25:21Meg Brown is a casual worker, part of the so-called gig economy.

0:25:21 > 0:25:25Instead of a regular wage, she gets paid for the gigs she does,

0:25:25 > 0:25:29delivering restaurant food to customers' front doors.

0:25:29 > 0:25:32It's estimated that five million people globally are employed

0:25:32 > 0:25:37this way, managed by a computer algorithm, and not by a person.

0:25:37 > 0:25:40What kind of guarantees of work do you have?

0:25:40 > 0:25:42You'll wait for the orders to come in.

0:25:42 > 0:25:47- Right.- There'll usually be a peak-time of the meal times.

0:25:47 > 0:25:48- Right, yeah. - So it really varies.

0:25:53 > 0:25:56Anything could change at any moment.

0:25:56 > 0:26:00I don't know what's happening on the other end.

0:26:00 > 0:26:03I don't know if business is slow,

0:26:03 > 0:26:07or it's just because I'm not being assigned a job.

0:26:07 > 0:26:11I'm vulnerable, and at the mercy of the algorithm,

0:26:11 > 0:26:15and whoever's in control on the other end.

0:26:15 > 0:26:18Half the time, you don't get to see a human being and you don't know

0:26:18 > 0:26:21who... What kind of personalities you're interacting with,

0:26:21 > 0:26:23or if you are interacting with any personality.

0:26:23 > 0:26:29It does feel like I'm missing something in that human-contact way

0:26:29 > 0:26:31of just interacting with a screen.

0:26:31 > 0:26:33- There's no human being doing it, it's the algorithms.- Yeah.

0:26:33 > 0:26:36We keep talking... I mean, this film is all about the future of work,

0:26:36 > 0:26:39but actually the algorithms are deciding how you work now,

0:26:39 > 0:26:41and whether you are even employed or not.

0:26:41 > 0:26:44Yeah. It makes me feel a bit dispensable.

0:26:46 > 0:26:47Algorithms are the new boss.

0:26:49 > 0:26:53Algorithms telling them where to go, to do the service.

0:26:53 > 0:26:57Algorithms determining the price of the service.

0:26:58 > 0:27:01The algorithm can also sack you these days.

0:27:01 > 0:27:04So there's quite a lot of platforms. There's a rating system of points

0:27:04 > 0:27:08or stars, and if you fall below a certain threshold,

0:27:08 > 0:27:10you get terminated automatically.

0:27:10 > 0:27:13You might just try and log on one day to your app,

0:27:13 > 0:27:16and it simply no longer says that you've got an active account.

0:27:17 > 0:27:21Algorithm management has become algorithm rebellion.

0:27:21 > 0:27:25Gig economy workers, torn between the idea of being their own boss

0:27:25 > 0:27:28while being controlled by the smartphones in their pockets are

0:27:28 > 0:27:31arguing for the same employment rights as everyone else.

0:27:31 > 0:27:34An independent review is currently looking into this.

0:27:35 > 0:27:38The way that this type of work is sold,

0:27:38 > 0:27:41whether it be Deliveroo or any company that does this,

0:27:41 > 0:27:44is that it's flexibility, that it gives...

0:27:44 > 0:27:47And that, you know, this is a great way to work, this is the future.

0:27:47 > 0:27:50The flexibility is overplayed massively.

0:27:50 > 0:27:54No guarantee of minimum wage, let alone the living wage.

0:27:54 > 0:27:56It's a really insecure way to live.

0:27:56 > 0:28:00It's hard to plan your life on this kind of wages,

0:28:00 > 0:28:03and this kind of nature of work.

0:28:06 > 0:28:11The war for talent was a critical driver of corporate performance,

0:28:11 > 0:28:14but employers have since realised that technology

0:28:14 > 0:28:16can make that process ruthlessly efficient.

0:28:19 > 0:28:22And today, that insight has spread far beyond companies

0:28:22 > 0:28:27like Deliveroo and Uber. Soon, everyone could be monitored,

0:28:27 > 0:28:30including those working in professional office jobs.

0:28:32 > 0:28:35In Boston, Massachusetts, there's one company ensuring

0:28:35 > 0:28:38none of us can escape the computer's all-seeing eye.

0:28:40 > 0:28:44Their technology allows companies to monitor the productivity of their

0:28:44 > 0:28:48staff and employees to monitor their own workplace performance

0:28:48 > 0:28:52through technology embedded into a card you wear around your neck.

0:28:53 > 0:28:56They are already selling their services to companies like

0:28:56 > 0:28:58Deloitte and Bank of America.

0:28:58 > 0:29:02This measures with Bluetooth and accelerometer,

0:29:02 > 0:29:04a microphone and a GPS,

0:29:04 > 0:29:08it understands my activity level, it understands where I am.

0:29:08 > 0:29:12Am I at a desk? Am I on a different floor? Am I in a conference room?

0:29:12 > 0:29:16It understands only two things about speech -

0:29:16 > 0:29:19am I talking or not talking?

0:29:19 > 0:29:24Combining all of this across my day in the job,

0:29:24 > 0:29:26it understands how I actually get work done.

0:29:26 > 0:29:29And when you combine that with all the digital signatures that we are

0:29:29 > 0:29:32also patterning, it directly translates into, "Am I happy?

0:29:32 > 0:29:34"Am I loyal to the organisation?

0:29:34 > 0:29:37"And as a company, are we productive and are we profitable?"

0:29:37 > 0:29:40You mentioned the word "happiness" there, which intrigues me.

0:29:40 > 0:29:43So how can you measure someone's happiness through the data?

0:29:43 > 0:29:46With all of these different signatures that we are picking up

0:29:46 > 0:29:49on, we can interpret stress levels.

0:29:50 > 0:29:53As a manager, I can understand if any particular of my employees are

0:29:53 > 0:29:57being overworked, overstressed,

0:29:57 > 0:30:01and are translating to lack of happiness and lack of productivity.

0:30:03 > 0:30:08The data collected on staff can also reveal communication patterns.

0:30:08 > 0:30:11The thicker lines suggest more communication,

0:30:11 > 0:30:13the thinner lines, less.

0:30:13 > 0:30:16It monitors whether certain staff are dominating situations

0:30:16 > 0:30:19or working collaboratively.

0:30:19 > 0:30:20So who's the central person...?

0:30:22 > 0:30:25- Yeah.- Being the size of organisation we are, we might have insight into

0:30:25 > 0:30:28who that is, but what this is a perfect example of

0:30:28 > 0:30:31is what we all describe as bottleneck.

0:30:31 > 0:30:33A single person in our organisation

0:30:33 > 0:30:35has unintentionally been a bottleneck,

0:30:35 > 0:30:40so what's been happening the last month that has drawn this behaviour?

0:30:40 > 0:30:42And what can we do going forward

0:30:42 > 0:30:45to mitigate or eliminate that bottleneck behaviour?

0:30:45 > 0:30:47But you see, you know...

0:30:47 > 0:30:50You said you've got quite a fair idea of who that person is.

0:30:50 > 0:30:53- Oh, absolutely.- So you don't need... In a way, the anonymity

0:30:53 > 0:30:55doesn't matter because, just through the data,

0:30:55 > 0:30:58you can work out who that person was because you know your employees

0:30:58 > 0:31:00and, therefore, you put two and two together.

0:31:00 > 0:31:02- In this case.- Yeah.

0:31:02 > 0:31:06- In this case, but it's usually only for those anomalies.- Yeah.

0:31:06 > 0:31:09I suppose the anonymity thing seems a really key element of it

0:31:09 > 0:31:11because I think that's the bit that would frighten people -

0:31:11 > 0:31:15the idea that, "Oh, they're going to drill down into me personally,

0:31:15 > 0:31:18"and my data, and that's going to expose me within the company."

0:31:18 > 0:31:20So...

0:31:20 > 0:31:23there must be a temptation, if a company comes along and says,

0:31:23 > 0:31:25"Can you provide this service?"

0:31:25 > 0:31:28Would you say "yes" or would you say "No, we can't do that"?

0:31:28 > 0:31:34It violates our privacy policy, which starts and is rooted in

0:31:34 > 0:31:39personal privacy and personal ownership of your behavioural data.

0:31:39 > 0:31:43And you can do good for the organisation and maintain

0:31:43 > 0:31:48that simple truth by exposing the patterns of the individual and from

0:31:48 > 0:31:50a top-down perspective exposing those patterns anonymously

0:31:50 > 0:31:52about the individual,

0:31:52 > 0:31:56- but exposing those patterns from a management perspective down.- Right.

0:31:56 > 0:31:58So there is a line to be drawn, really, and that's the line, really,

0:31:58 > 0:32:01- the line of privacy. - And you draw it in the beginning.

0:32:01 > 0:32:03Amazing. It's a staggering level, isn't it?

0:32:03 > 0:32:06I'm just kind of... It does really blow your mind,

0:32:06 > 0:32:11the level to which you are able to analyse every single aspect of

0:32:11 > 0:32:13what you're doing at work. It's mind-blowing.

0:32:13 > 0:32:16It's already happening. Your manager is coming in and saying,

0:32:16 > 0:32:20"I think you're blocking everybody else's communication

0:32:20 > 0:32:22"and you're forcing, you know,

0:32:22 > 0:32:25"one office to communicate with the other office through you".

0:32:25 > 0:32:28The ability to capture that, analyse it,

0:32:28 > 0:32:31that computer science really hasn't existed at scale

0:32:31 > 0:32:33except for the last handful of years.

0:32:36 > 0:32:40I have never seen an office where people are wearing a badge

0:32:40 > 0:32:44that collects all the information on every single thing you do,

0:32:44 > 0:32:47from the moment you walk in to the moment you leave.

0:32:47 > 0:32:54So, basically, this little box tells them who's come into the room.

0:32:56 > 0:32:59It's just... It's just unbelievable.

0:32:59 > 0:33:02And what it is, what's really extraordinary about it is it's

0:33:02 > 0:33:06given to you, so you can then police your own productivity

0:33:06 > 0:33:10because we ourselves are our harshest critics,

0:33:10 > 0:33:13and they know that. That's the genius of this stuff.

0:33:18 > 0:33:21But how soon before technology doesn't just enable staff to work

0:33:21 > 0:33:25more effectively, but takes over completely?

0:33:28 > 0:33:30Doctors, accountants, even lawyers,

0:33:30 > 0:33:32are starting to see their professions change.

0:33:34 > 0:33:36And it's all thanks to a deal made

0:33:36 > 0:33:39in a sleepy town in upstate New York.

0:33:41 > 0:33:47# Silver coins that jingle jangle

0:33:47 > 0:33:48# Fancy shoes...#

0:33:48 > 0:33:52In 2004, a man called Charles Lickel came to this steakhouse

0:33:52 > 0:33:55with his colleagues after a hard day's work.

0:33:56 > 0:34:01Lickel was head of computing system software at tech giant IBM.

0:34:05 > 0:34:09As they ate, at 7:00 on the dot, something strange happened.

0:34:09 > 0:34:11APPLAUSE

0:34:15 > 0:34:18All of the diners leapt up from their seats

0:34:18 > 0:34:20and rushed out of the room.

0:34:20 > 0:34:24The TV above the bar had just started showing that night's episode

0:34:24 > 0:34:27of the long-running TV quiz show Jeopardy!

0:34:27 > 0:34:30This is Jeopardy!

0:34:31 > 0:34:33Here are today's contestants.

0:34:33 > 0:34:37A risk analytics manager from Irvine, California,

0:34:37 > 0:34:38Patrick Fernandez.

0:34:38 > 0:34:44Reigning champion, Ken Jennings, was on a record-breaking winning streak.

0:34:44 > 0:34:47Everyone in the restaurant, everyone in America,

0:34:47 > 0:34:50was desperate to find out if he could keep it going.

0:34:50 > 0:34:53Lickel, his colleagues, and the nation, were gripped.

0:34:53 > 0:34:56Let's go to Ken Jennings now. He selected Matthew and Andrew.

0:34:56 > 0:34:59And he risked 5,800, that's right.

0:34:59 > 0:35:00APPLAUSE

0:35:00 > 0:35:0429,000 is what you get today and it brings your total to...

0:35:04 > 0:35:07The unstoppable Ken Jennings did it again that night.

0:35:07 > 0:35:10He seemed more machine than human.

0:35:10 > 0:35:14At that moment, standing at that bar, Lickel had an epiphany.

0:35:14 > 0:35:16He suddenly saw the perfect challenge -

0:35:16 > 0:35:19to invent a computer program that could beat a human

0:35:19 > 0:35:21at the world's most popular quiz show.

0:35:23 > 0:35:27IBM would do a deal with the makers of Jeopardy!

0:35:27 > 0:35:29They would create artificial intelligence that could

0:35:29 > 0:35:32beat the game's greatest players,

0:35:32 > 0:35:36a machine that could understand human language,

0:35:36 > 0:35:40access hoards of information, and answer questions.

0:35:42 > 0:35:45The man tasked to create it was David Ferrucci.

0:35:47 > 0:35:50When I heard about Jeopardy!, I said, "We've got to do this".

0:35:50 > 0:35:52This is exactly the kind of thing we've been working on for years.

0:35:52 > 0:35:55Of course, the problem was, most people were saying,

0:35:55 > 0:35:57"No, this is impossible. You're going to embarrass IBM,

0:35:57 > 0:36:00"essentially. You're going to get on television, this is pure folly,

0:36:00 > 0:36:02"you're going to fall on your face,

0:36:02 > 0:36:04"and you're going to embarrass the whole company."

0:36:04 > 0:36:07We had these meetings with all these executives around big

0:36:07 > 0:36:08conference tables, and they would say,

0:36:08 > 0:36:11"Dave, you're mucking with the IBM brand. You need to win.

0:36:13 > 0:36:16"Whatever you want, just tell us what you need.

0:36:16 > 0:36:18"Tell us what you need to win."

0:36:18 > 0:36:21IBM already had Deep Blue, which had beaten Garry Kasparov at chess.

0:36:21 > 0:36:24So how was this challenge going to be different?

0:36:24 > 0:36:25Jeopardy! can ask about anything,

0:36:25 > 0:36:28- whereas chess, you always knew what you were dealing with.- Yeah.

0:36:28 > 0:36:30I mean, there's a bunch of pieces, they move in certain ways.

0:36:30 > 0:36:33There's a finite set of moves. You knew what you were dealing with.

0:36:33 > 0:36:36This space was wholly unconstrained. The only thing we had was the prior

0:36:36 > 0:36:38data, in other words, answers had to come from places like

0:36:38 > 0:36:41Wikipedia, dictionaries, novels, plays, the Bible,

0:36:41 > 0:36:44wherever this stuff could come from. So you had that content base,

0:36:44 > 0:36:48but to take that question and figure out what that question was asking,

0:36:48 > 0:36:51and you don't want to buzz in and be wrong.

0:36:51 > 0:36:53So you had to be able to compute your confidence that you were going

0:36:53 > 0:36:55- to be right. - You must have needed to do some

0:36:55 > 0:36:57trial runs, so what were the trial runs like?

0:36:57 > 0:37:00Were there any glitches or anything?

0:37:00 > 0:37:01In the beginning, it was terrible.

0:37:01 > 0:37:05I mean, it would make ridiculous, ridiculous mistakes.

0:37:05 > 0:37:10And at that time, Watson took two hours to answer a single question,

0:37:10 > 0:37:12so we actually couldn't play it live.

0:37:12 > 0:37:14It would be a very boring, long game.

0:37:14 > 0:37:16This went on for a couple of years,

0:37:16 > 0:37:20until we got to a point where Watson could answer a question in

0:37:20 > 0:37:22three seconds, between two and three seconds.

0:37:22 > 0:37:26- Whoa! - And that was fast enough to play.

0:37:26 > 0:37:29The day you go into the studio, Dave, what are you thinking?

0:37:29 > 0:37:31As a scientist, I was going in saying, "I wish I could just

0:37:31 > 0:37:34"publish the practice games," because if you just set up

0:37:34 > 0:37:37the AI challenge, you know, we did it.

0:37:37 > 0:37:40We had a very competitive Jeopardy! player that can beat the best humans

0:37:40 > 0:37:43at something that no-one thought was even possible.

0:37:43 > 0:37:46Developed and programmed especially for this moment,

0:37:46 > 0:37:48ladies and gentlemen, this is Watson.

0:37:48 > 0:37:50APPLAUSE

0:37:50 > 0:37:52This was going to come down to one game.

0:37:52 > 0:37:55And if we go out there and we lose that one game, that was going to

0:37:55 > 0:37:57be on television, it would look like the team had lost.

0:37:57 > 0:38:01And that was just a painful thing to think about because, I mean,

0:38:01 > 0:38:04I was in there day-to-day, trying to make this thing happen.

0:38:04 > 0:38:06- Watson?- Who is CS Lewis?

0:38:06 > 0:38:09- Yes.- Who is Sir Christopher Wren?

0:38:09 > 0:38:11- You are right.- What is Picasso?

0:38:11 > 0:38:13- No, sorry.- What is narcolepsy?

0:38:13 > 0:38:19You are right and, with that, you move to 36,681.

0:38:19 > 0:38:21- APPLAUSE - A big, big lead.

0:38:21 > 0:38:23When Ken saw Watson get the daily double -

0:38:23 > 0:38:26Ken knows Jeopardy! really well, and he could do math -

0:38:26 > 0:38:28and he was like, "That's it, I lost".

0:38:28 > 0:38:33Even though you were only 32% sure of your response, you are correct.

0:38:37 > 0:38:41Final Jeopardy!, Watson still would have won by a dollar...

0:38:41 > 0:38:42That's...!

0:38:42 > 0:38:46Which probably would have had a very different effect.

0:38:46 > 0:38:49IBM's future is resting on $1!

0:38:49 > 0:38:53$1! Isn't that fantastic?

0:38:53 > 0:38:55We weren't just building a game machine.

0:38:55 > 0:38:59This was a machine that was taught to process language more effectively

0:38:59 > 0:39:05than any other machine before, and that was a great result for us.

0:39:05 > 0:39:07And I think that got people saying, "Hey, what else can this do?

0:39:07 > 0:39:10"Can this kind of intelligence help us do things?"

0:39:13 > 0:39:16Watson's ability to understand natural language is unprecedented

0:39:16 > 0:39:20and IBM are now selling the software worldwide.

0:39:20 > 0:39:23As a result, it's poised to take a whole host of jobs

0:39:23 > 0:39:26previously thought off-limits to machines.

0:39:26 > 0:39:28It's being used to do the work of lawyers,

0:39:28 > 0:39:31to calculate insurance policies,

0:39:31 > 0:39:33and even to provide technology-enhanced

0:39:33 > 0:39:35patient care in hospitals.

0:39:37 > 0:39:40We're working with IBM Watson to actually create a platform that we

0:39:40 > 0:39:43can answer children's questions when they want to have them answered.

0:39:43 > 0:39:46People talk about bedside manner, but can you create a digital

0:39:46 > 0:39:49bedside manner? So that's the vision, is to have an actual

0:39:49 > 0:39:53artificial intelligence built into how we engage with our children.

0:39:53 > 0:39:56Right, now, what I was wanting to have a go at was to see if I can try

0:39:56 > 0:40:00and train this computer to answer questions that you have.

0:40:00 > 0:40:04But to do that, I need to know what questions you want to know about

0:40:04 > 0:40:06and also if you think the answers are any good.

0:40:06 > 0:40:08Because I'm training it at the moment, because I can't be here

0:40:08 > 0:40:11all the time, because I have to go from patient to patient.

0:40:11 > 0:40:14But especially for the easier questions, if you just ask this,

0:40:14 > 0:40:17and then we can spend more time talking about important things

0:40:17 > 0:40:19that you want to know about.

0:40:19 > 0:40:22Is it going to hurt?

0:40:22 > 0:40:25I understand that you might be worried about having an operation.

0:40:25 > 0:40:28That's OK. During the operation, you won't feel anything,

0:40:28 > 0:40:30as we will give you an anaesthetic.

0:40:30 > 0:40:34Watson is being used at this Children's Hospital in Liverpool

0:40:34 > 0:40:38to help answer patients' questions on everything from the hospital menu

0:40:38 > 0:40:41to details of their treatment.

0:40:41 > 0:40:44How long does it take to fall asleep?

0:40:44 > 0:40:49The project is part of a wider big data research collaboration between

0:40:49 > 0:40:52IBM and the Government's Science and Technology Facilities Council,

0:40:52 > 0:40:56worth £315 million.

0:40:56 > 0:40:58If successful, and rolled out nationally,

0:40:58 > 0:41:01it could generate savings for the NHS.

0:41:01 > 0:41:04What have you attended for? Well, you were having your appendix

0:41:04 > 0:41:07taken out, weren't you? Nope, it's not got that right.

0:41:07 > 0:41:11OK. So I'm going to train it by saying that was the wrong answer.

0:41:11 > 0:41:14OK, so we'll give that one an unhappy face.

0:41:14 > 0:41:17Every so often, it gives you the wrong answer and you kind of go,

0:41:17 > 0:41:19"No, that's the wrong thing to say". And then it learns from that,

0:41:19 > 0:41:22and it starts to progress, so that's what it's like at the start.

0:41:22 > 0:41:25So who knows what it'll be like in five or ten years' time?

0:41:27 > 0:41:31Jobs that require empathy were for years considered the last frontier

0:41:31 > 0:41:34for automation, but no longer.

0:41:34 > 0:41:39Now we have algorithms that are able to substitute for cognitive work

0:41:39 > 0:41:43in that, in the previous realm, we had seen machines only able to

0:41:43 > 0:41:49replace manual work. Now it is no longer clear what it is that humans

0:41:49 > 0:41:52are fundamentally better at than machines.

0:41:53 > 0:41:57And among all the jobs that we are slowly losing to machines,

0:41:57 > 0:42:00one stands out - teaching.

0:42:02 > 0:42:05Children can now be educated by computers

0:42:05 > 0:42:07in schools that look like workplaces.

0:42:14 > 0:42:191990 - two political scientists secure a grant from a

0:42:19 > 0:42:23conservative American think tank to carry out a major survey

0:42:23 > 0:42:25into the US education system.

0:42:27 > 0:42:31They wanted to understand why, after more than a quarter century of

0:44:31 > 0:44:35costly education reform, the nation's schools had proven

0:42:35 > 0:42:37so resistant to change.

0:42:37 > 0:42:39The key to the problem, they concluded,

0:44:39 > 0:44:41was the powerful teaching unions.

0:42:44 > 0:42:46Terry Moe was one of those political scientists

0:42:46 > 0:42:50and has since written extensively on the politics of American education.

0:42:52 > 0:42:56What you want are schools and school systems that have incentives to

0:42:56 > 0:43:02create the best possible schools and organisations and

0:43:02 > 0:43:04education environments for children.

0:43:04 > 0:43:09And if it turns out that certain teaching methods,

0:43:09 > 0:43:11certain approaches to learning, are best,

0:43:11 > 0:43:13those are the ones that should be adopted.

0:43:13 > 0:43:17But if powerful groups have

0:43:17 > 0:43:23embraced methods and approaches that don't do that,

0:43:23 > 0:43:25- they're still going to do it.- Right.

0:43:25 > 0:43:28They are still going to impose these things on children because

0:43:28 > 0:43:31they believe them and they're powerful enough to entrench them

0:43:31 > 0:43:36in the schools. Teachers, I think, on average, love kids, you know,

0:43:36 > 0:43:39they want schools that are really productive.

0:43:39 > 0:43:43However, they join unions to protect their jobs.

0:43:43 > 0:43:47But you know, bigger picture, what's the point of the school system?

0:43:47 > 0:43:51It's to educate children. That's the ONLY point of the school system.

0:43:51 > 0:43:54It was never set up in order to provide jobs to adults.

0:43:55 > 0:44:00Moe decided the only alternative to internal politics in teaching was

0:44:00 > 0:44:05to open up the education system to market forces of supply and demand,

0:44:05 > 0:44:08an argument he set out in a book which would go on to influence

0:44:08 > 0:44:11the British education system, too.

0:44:11 > 0:44:14The book was published, we had the Blair government come in,

0:44:14 > 0:44:17New Labour in the '90s, you're hailed as prophets.

0:44:17 > 0:44:20Revolutionising... And we get the academy system,

0:44:20 > 0:44:21which is...

0:44:21 > 0:44:26Which is basically an attempt to put your ideas into practice.

0:44:26 > 0:44:28So, how did you feel that went?

0:44:28 > 0:44:31Did you see it as an endorsement of your ideas?

0:44:32 > 0:44:36Yes. I mean, I think that, first of all,

0:44:36 > 0:44:40I think we're right in the sense that politics does

0:44:40 > 0:44:45lead to schools that are perversely organised

0:44:45 > 0:44:46and you can expect that to happen.

0:44:46 > 0:44:51# Funky life, I've been told

0:44:51 > 0:44:54# All that glitters is not gold

0:44:54 > 0:44:55# And gold is not reality... #

0:44:55 > 0:44:59What Moe and associate John Chubb concluded was that technology

0:44:59 > 0:45:03is the key to breaking down the politics of education.

0:45:03 > 0:45:07Technology threatens the idea of school as a building,

0:45:07 > 0:45:10with kids and teachers in the same physical place.

0:45:10 > 0:45:14Technology could make traditional teaching roles obsolete.

0:45:14 > 0:45:17In that book, you argue that technology

0:45:17 > 0:45:20is the way to transform the class,

0:45:20 > 0:45:23in every way, to break the teacher unions, and to do it,

0:45:23 > 0:45:26you bring technology, incrementally, into the classroom,

0:45:26 > 0:45:28and that way teachers slowly become marginalised.

0:45:28 > 0:45:29Yeah, in effect.

0:45:31 > 0:45:33So let's go back to the beginning.

0:45:33 > 0:45:37People tend to think of this as, like, greater reliance on computers

0:45:37 > 0:45:39in the classroom, you know.

0:45:39 > 0:45:41But big picture -

0:45:41 > 0:45:44what we're talking about here is the revolution in

0:45:44 > 0:45:50information technology, which is one of the most powerful

0:45:50 > 0:45:54transformative events in the history of the world, you know.

0:45:54 > 0:45:58It sounds like hyperbole, you know, but it is.

0:45:58 > 0:46:03They talk very directly about how you can use technology to kind of

0:46:03 > 0:46:06usurp the power of the teaching unions

0:46:06 > 0:46:10and you can kind of fragment the workforce.

0:46:10 > 0:46:13They also talk about the opening up of data, so if you take the data

0:46:13 > 0:46:16out of the black box that is the classroom,

0:46:16 > 0:46:19and basically out of the teacher's head,

0:46:19 > 0:46:21then you can start to take the power away from teachers.

0:46:21 > 0:46:25You know, data in the hands of a good teacher is a very...

0:46:25 > 0:46:27It can be a powerful tool.

0:46:27 > 0:46:31But if your reason for doing it is to introduce market forces

0:46:31 > 0:46:34and ultimately a consumer market in education products,

0:46:34 > 0:46:37which is where they're heading with this,

0:46:37 > 0:46:41then at the very least we need to have a conversation about this.

0:46:41 > 0:46:45Is this about improving education for children and their prospects for

0:46:45 > 0:46:49the future, or is it about the privatisation of education?

0:46:50 > 0:46:52Listen up real quick, guys.

0:46:52 > 0:46:55Just to let you know, we have about 20 minutes left of class.

0:46:55 > 0:46:57Let me know if you guys have any questions.

0:46:57 > 0:46:59Good job.

0:46:59 > 0:47:02So I want you to try again and call me over before you submit,

0:47:02 > 0:47:03and we'll go through it, OK?

0:47:03 > 0:47:08At this school in San Diego, 75 pupils study at computer terminals

0:47:08 > 0:47:11while being monitored by one teacher.

0:47:13 > 0:47:14Students sitting next to each other

0:47:14 > 0:47:17are rarely learning the same subjects.

0:47:18 > 0:47:22This is Jordan. He's taking the astronomy 101,

0:47:22 > 0:47:24so this is the instructor for astronomy.

0:47:24 > 0:47:26And then next to him, Brendan, he's doing coding.

0:47:26 > 0:47:29- OK.- So he's doing a coding course for the coding academy.

0:47:29 > 0:47:32And then we have principles of astronomy here.

0:47:32 > 0:47:34And how does it differ from old-fashioned teaching?

0:47:34 > 0:47:37So, back in the day, a lot of people just thought about,

0:47:37 > 0:47:39"Oh, yeah, pencil and paper works the best,"

0:47:39 > 0:47:42but now that technology has improved and stuff like that,

0:47:42 > 0:47:44it's really cool to get different education.

0:47:44 > 0:47:47So, Jordan, you want to be an astronomer.

0:47:47 > 0:47:49- Well...- Or maybe not? After watching that.

0:47:49 > 0:47:51I've thought about it, but the class is very interesting,

0:47:51 > 0:47:53so I decided to take it.

0:47:53 > 0:47:58Computers, ironically, can give them a personal experience.

0:47:58 > 0:48:04One-on-one. You compare that to the standard model of a teacher and

0:48:04 > 0:48:0730 students. What is that? Right, a teacher has to provide

0:48:07 > 0:48:11a standardised curriculum to everybody in the class.

0:48:11 > 0:48:15It's completely inefficient as a way of teaching kids

0:48:15 > 0:48:18and I think most educators think, "Well, there isn't any alternative.

0:48:18 > 0:48:22"That's just what teaching is." But it's not. Not any more.

0:48:22 > 0:48:24This is like a dream come true.

0:48:28 > 0:48:32In this school, learning by computer means you're also assessed

0:48:32 > 0:48:37by computer. Huge amounts of data on students give teachers a record,

0:48:37 > 0:48:41not just of exam results, but of how hard each pupil has been working.

0:48:43 > 0:48:46As we look at our world history classes and our students,

0:48:46 > 0:48:49how are we doing for the progress for this last week?

0:48:49 > 0:48:51Yeah, we're doing pretty good.

0:48:51 > 0:48:53I mean, looking at the data, we had one struggling student.

0:48:53 > 0:48:56You can see it kind of marked right here in negative three.

0:48:56 > 0:48:59And do we have a plan of action for our student?

0:48:59 > 0:49:02We need to e-mail the parent and say "Hey, they're behind in this course.

0:49:02 > 0:49:04"Here's what we're working on."

0:49:04 > 0:49:05Well, if she's behind on other things,

0:49:05 > 0:49:07we'll have her come in on Monday mornings...

0:49:07 > 0:49:10- Yeah, I think that would be helpful. - Days when we don't have class, OK.

0:49:10 > 0:49:12All right, listen up.

0:49:12 > 0:49:14This is exciting. You guys can go.

0:49:14 > 0:49:16See you later.

0:49:16 > 0:49:19Supporters of online classes say they are a cost-effective

0:49:19 > 0:49:22alternative to traditional teaching,

0:49:22 > 0:49:24providing students more flexibility in their learning,

0:49:24 > 0:49:28as well as access to a greater variety of options.

0:49:28 > 0:49:31But a report by the Organisation for Economic Co-operation

0:49:31 > 0:49:35and Development suggested computers do not improve results.

0:49:35 > 0:49:38When you look at the evidence, and there is very little evidence,

0:49:38 > 0:49:42but what evidence there is shows that this stuff doesn't work.

0:49:42 > 0:49:46It doesn't help children to learn. It lowers standards.

0:49:46 > 0:49:51Actually, heavy use of technology in education, the results are dreadful.

0:49:51 > 0:49:54Kids are turning off, they're not engaged in the content,

0:49:54 > 0:49:57they get very used to it, and it's not helping them to learn.

0:49:57 > 0:50:01There is something about the human interaction that helps kids learn.

0:50:01 > 0:50:04So not only is the problem false, I would argue,

0:50:04 > 0:50:08the solution is a false narrative as well.

0:50:10 > 0:50:14But this second Industrial Revolution, the robot revolution,

0:50:14 > 0:50:16won't take all jobs away.

0:50:16 > 0:50:20In fact, some economists believe it could create billions more,

0:50:20 > 0:50:24a society of full employment, but what will those jobs be?

0:50:24 > 0:50:26To get a glimpse into the future,

0:50:26 > 0:50:29you need to look no further than the humble car wash.

0:50:33 > 0:50:3515 years ago,

0:50:35 > 0:50:40automated machines on petrol station forecourts started closing down.

0:50:40 > 0:50:44The Petrol Retailers Association say that, since the mid-noughties,

0:50:44 > 0:50:48the numbers have halved from 9,000 to just over 4,000.

0:50:50 > 0:50:53Instead of using machines at garages,

0:50:53 > 0:50:56we use humans to wash cars again.

0:50:56 > 0:50:58It was an odd regression -

0:50:58 > 0:51:02a shift towards a low-pay, low-productivity trap.

0:51:03 > 0:51:07This is a story of a political deal and how it ensured that

0:51:07 > 0:51:10we did the hard work, not robots.

0:51:14 > 0:51:19It's 2003, ten new countries are about to join the EU,

0:51:19 > 0:51:21eight of them from the old Eastern Bloc -

0:51:21 > 0:51:24poor countries, by Western standards.

0:51:24 > 0:51:27The Home Secretary at the time, Lord Blunkett,

0:51:27 > 0:51:30is about to sign a deal with the EU.

0:51:30 > 0:51:31The question on the table is,

0:51:31 > 0:51:35how many workers from those countries might come to Britain?

0:51:35 > 0:51:37We were uncertain.

0:51:37 > 0:51:40We were sceptical about whether we could make predictions.

0:51:40 > 0:51:44People were used to the idea of free movement and what we were trying

0:51:44 > 0:51:47to address was, given free movement,

0:51:47 > 0:51:51should we allow these people to work legally, pay National Insurance,

0:51:51 > 0:51:54pay tax, or should they work in the sub-economy?

0:51:57 > 0:52:02Britain had a choice - open up its labour market to everyone,

0:52:02 > 0:52:05or temporarily restrict the numbers.

0:52:05 > 0:52:09The job of estimating how many might come to live and work in the UK

0:52:09 > 0:52:14fell to migration expert Professor Christian Dustmann.

0:52:14 > 0:52:18We started that report early in 2003, where we tried

0:52:18 > 0:52:23to predict the number of people who would migrate after 2004.

0:52:23 > 0:52:27We're talking about all the Eastern European countries joining the EU.

0:52:27 > 0:52:29No-one has any idea what this is going to be.

0:52:29 > 0:52:34So we tried to do the best on the evidence that we have.

0:52:34 > 0:52:39Our number was about 13,000 would come to the UK every year.

0:52:39 > 0:52:44The report, however, was based on a very specific scenario,

0:52:44 > 0:52:47that Germany opens up its labour market

0:52:47 > 0:52:50to Eastern European migrants as well.

0:52:50 > 0:52:53Based on Dustmann's estimates,

0:52:53 > 0:52:56the British government did a deal with the EU -

0:52:56 > 0:52:59Britain decided to allow everyone in,

0:52:59 > 0:53:01but then Germany changed its mind.

0:53:02 > 0:53:06Finally, on the 1st of May, it was the UK,

0:53:06 > 0:53:10Sweden and Ireland among the European -

0:53:10 > 0:53:13existing European countries - which actually opened

0:53:13 > 0:53:18the labour market to those new accession countries,

0:53:18 > 0:53:21and not Germany. So, at that point,

0:53:21 > 0:53:25the scenario we had modelled was, of course, not the scenario any more.

0:53:25 > 0:53:30Germany had opted to temporarily restrict its borders to migrants,

0:53:30 > 0:53:32so they looked to Britain instead.

0:53:32 > 0:53:3513,000 a year didn't come here -

0:53:35 > 0:53:40it was closer to 40,000. Critics asked how the government

0:53:40 > 0:53:43could so badly miscalculate the numbers.

0:53:43 > 0:53:47The 2003 Dustmann report is predicated on one thing -

0:53:47 > 0:53:51Germany opening up in the same way as we're going to open up

0:53:51 > 0:53:53and Germany doesn't. It has...

0:53:53 > 0:53:57It closes down and it says it's going to close down for seven years.

0:53:57 > 0:53:59Look, let me be very straight.

0:53:59 > 0:54:03I've defended the decision, but I was culpable on two fronts.

0:54:03 > 0:54:08One, I didn't really clock the significance of the decision

0:54:08 > 0:54:12that Germany was taking, even though I knew about it.

0:54:12 > 0:54:19And secondly, I was quite prepared, as politicians so often are,

0:54:19 > 0:54:25to use the cover of the statisticians' analysis -

0:54:25 > 0:54:28guess, if you want - to provide us with a comfort zone

0:54:28 > 0:54:32while we were implementing the early part of the registration.

0:54:32 > 0:54:37We were quite happy to ride with the public understanding that this

0:54:37 > 0:54:41wasn't going to be a major influx and, of course, it was.

0:54:44 > 0:54:51At least 850,000 people - around 3% of the UK working age population -

0:54:51 > 0:54:57migrated from Eastern Europe to the UK between 2004 and 2011.

0:54:57 > 0:55:00You were hung out to dry, basically, by politicians.

0:55:00 > 0:55:04They came and said, "You've got this completely wrong.

0:55:04 > 0:55:06"You've wildly underestimated the figures.

0:55:06 > 0:55:11"A massive, catastrophic miscalculation by you".

0:55:11 > 0:55:15Nobody read the report, I think. Not many people read the report.

0:55:15 > 0:55:19It's a Sliding Doors moment, David, isn't it? One way or another.

0:55:19 > 0:55:23- Well...- You said, in a candid way, which is very rare for a politician,

0:55:23 > 0:55:27- "We got it wrong". - Well, we got that part of it wrong.

0:55:27 > 0:55:33But there's no question looking back that it was a very fine line

0:55:33 > 0:55:38as to whether we were reducing future productivity levels,

0:55:38 > 0:55:42whilst actually creating and holding on to very high employment.

0:55:42 > 0:55:46These are very fine decisions and sometimes you get them wrong.

0:55:50 > 0:55:55The effect of EU migration on the UK's labour market is the subject of

0:55:55 > 0:55:58a highly political and frequently polarised debate.

0:55:58 > 0:56:03More than one in five low-paid jobs in the UK is now occupied by a

0:56:03 > 0:56:04low-skilled migrant worker.

0:56:06 > 0:56:09What appeared at the time to be a decision benefiting the economy

0:56:09 > 0:56:15became a political negative with the downturn of the economy post-2007.

0:56:17 > 0:56:21The future for work now in an age of austerity and automation

0:56:21 > 0:56:25will be for us all to find our own opportunities.

0:56:26 > 0:56:29I'm interested in the future of the United Kingdom and United States,

0:56:29 > 0:56:31and I'm interested, in the face of robotics and AI, in

0:56:31 > 0:56:34how we're going to create jobs. What I'm saying to you is,

0:56:34 > 0:56:37"If you can't do it, artificial intelligence can".

0:56:38 > 0:56:42I look at it as technology providing opportunity.

0:56:42 > 0:56:43Technology giving us...

0:56:45 > 0:56:48..liberating us to do more of what we want, the way we want to do it.

0:56:48 > 0:56:51Of course, there's also the fear side of that, right?

0:56:51 > 0:56:54Which is, "Am I going to lose my job?" We have to think about it.

0:56:54 > 0:56:58We have to participate. You just don't let the machines take over.

0:56:58 > 0:57:01Be responsible in thinking about how you apply this technology,

0:57:01 > 0:57:06how do you phase it incrementally, and so forth.

0:57:06 > 0:57:09This great push towards robots, this is the future,

0:57:09 > 0:57:15and it somehow takes away any agency and any political democratic process

0:57:15 > 0:57:18because this is the imaginings of Silicon Valley.

0:57:18 > 0:57:23Small government, gig economy, the Uberification of jobs -

0:57:23 > 0:57:25that's kind of what they want.

0:57:25 > 0:57:28But I would counter that with, "Well, is it what WE want?"

0:57:30 > 0:57:35And so, the fate of the automated car wash becomes one model of

0:57:35 > 0:57:39how work could go - machines replaced by low paid workers

0:57:39 > 0:57:42motivated only by the fear of unemployment.

0:57:42 > 0:57:46Could this be the future of work?

0:57:46 > 0:57:50Watson, in a control centre, an unblinking eye,

0:57:50 > 0:57:54monitoring how many cars you and I wash in an hour.

0:57:57 > 0:57:59To explore further how digital technologies

0:57:59 > 0:58:04are transforming societies, go to the BBC web page on-screen

0:58:04 > 0:58:07and follow the links to the Open University.