Line | From | To | |
---|---|---|---|
What if the way we understand the world is wrong? | 0:00:03 | 0:00:07 | |
What if it's not politicians and world events that shape our lives, | 0:00:08 | 0:00:12 | |
but business deals? | 0:00:12 | 0:00:14 | |
It's natural for people to fight against change, | 0:00:14 | 0:00:17 | |
but change is coming, and we need to adapt to it. | 0:00:17 | 0:00:21 | |
Deals made in secret, high up in a boardroom, over a drink, | 0:00:22 | 0:00:27 | |
or in a car in the dead of night. | 0:00:27 | 0:00:29 | |
We were trying to protect people from excess medicine. | 0:00:29 | 0:00:32 | |
-You couldn't. -Yeah, we failed. | 0:00:32 | 0:00:34 | |
Our every waking hour has been transformed by these deals, | 0:00:35 | 0:00:39 | |
reprogramming us to think and behave in a different way. | 0:00:39 | 0:00:43 | |
While we're competing with these large monolithic banks that are very | 0:00:43 | 0:00:46 | |
bureaucratic, we can run circles around them all day. | 0:00:46 | 0:00:48 | |
This is the story of these deals. | 0:00:48 | 0:00:51 | |
These ideas were ahead of their time. | 0:00:51 | 0:00:53 | |
Online payments can change the world. | 0:00:53 | 0:00:56 | |
Who made them, why, and what do they mean for our future? | 0:00:56 | 0:01:01 | |
In this series, I'm going to look at three key aspects of our lives, | 0:01:04 | 0:01:08 | |
and how they've been revolutionised by these deals. | 0:01:08 | 0:01:13 | |
Who wants to kill cash, and why? | 0:01:13 | 0:01:16 | |
Our vision is to have a cashless society. | 0:01:16 | 0:01:19 | |
Card payments are the best way to pay and the best way to be paid. | 0:01:19 | 0:01:23 | |
Why do we medicate every aspect of our lives? | 0:01:23 | 0:01:26 | |
This notion of the pursuit of happiness, it's not realistic! | 0:01:26 | 0:01:30 | |
It gets all of us to be on drugs! | 0:01:30 | 0:01:33 | |
But in this film, I'll be looking at why we're trapped in a culture | 0:01:33 | 0:01:36 | |
of endless work. | 0:01:36 | 0:01:39 | |
It is over, dude. It is gone. | 0:01:39 | 0:01:41 | |
If you can't do it, artificial intelligence can. | 0:01:42 | 0:01:45 | |
How did work go from something we do to who we are? | 0:01:46 | 0:01:50 | |
It's not a person, it's a performer, right? | 0:01:50 | 0:01:53 | |
I'm sorry, it's not a person?! It is a person! | 0:01:53 | 0:01:56 | |
I'm a person sitting in front of you. | 0:01:56 | 0:01:58 | |
These are the deals that made you work harder, made your pay lower... | 0:01:58 | 0:02:03 | |
These are very fine decisions, and sometimes you get them wrong. | 0:02:03 | 0:02:07 | |
..and might one day replace you with a robot. | 0:02:07 | 0:02:10 | |
PHONE CHIMES | 0:02:16 | 0:02:18 | |
This is what 12% of us are going to do tonight, and every night - | 0:02:23 | 0:02:29 | |
wake up at 3:00 am, and check our work e-mail. | 0:02:29 | 0:02:33 | |
A few hours later, 51% of us are going to spend more time | 0:02:34 | 0:02:38 | |
checking our work e-mail than eating breakfast. | 0:02:38 | 0:02:41 | |
30% of jobs in the UK could be automated within the next 15 years. | 0:02:47 | 0:02:52 | |
As intelligent machines begin their march on the labour market, | 0:02:54 | 0:02:57 | |
how long will it be before they overhaul our economy completely? | 0:02:57 | 0:03:01 | |
This is the bigger picture - | 0:03:05 | 0:03:07 | |
how WE became robots, automated our lives | 0:03:07 | 0:03:11 | |
at the very moment that robots learnt to become human. | 0:03:11 | 0:03:16 | |
To understand how we got here, | 0:03:28 | 0:03:31 | |
we need to go back to the '70s, to a deal that would make us loyal | 0:03:31 | 0:03:35 | |
to our workplace, that would make us become our jobs, | 0:03:35 | 0:03:38 | |
and our jobs eventually become our lives. | 0:03:38 | 0:03:42 | |
Summer 1975. | 0:03:53 | 0:03:55 | |
America is in recession, losing jobs at an unprecedented rate. | 0:03:55 | 0:03:59 | |
Four men set themselves the task of rebuilding American enterprise. | 0:04:04 | 0:04:08 | |
They meet in downtown San Francisco in an office on the 48th floor. | 0:04:10 | 0:04:15 | |
Their meeting would change the language of American business. | 0:04:16 | 0:04:20 | |
One of them was Tom Peters, | 0:04:24 | 0:04:26 | |
a larger-than-life management consultant, | 0:04:26 | 0:04:29 | |
who would go on to become a celebrity life coach. | 0:04:29 | 0:04:32 | |
Huge news! | 0:04:32 | 0:04:34 | |
Another was Richard Pascale, | 0:04:34 | 0:04:36 | |
a quietly spoken academic in the field of management science. | 0:04:36 | 0:04:40 | |
This unlikely double act, who were both working as consultants | 0:04:41 | 0:04:45 | |
for McKinsey, a management consultancy firm, | 0:04:45 | 0:04:47 | |
made a deal to work together to create a new management philosophy, | 0:04:47 | 0:04:52 | |
and one country in particular seemed to show the way. | 0:04:52 | 0:04:55 | |
It is obvious that the United States economy is not in good shape. | 0:04:58 | 0:05:01 | |
But while our industry is floundering, | 0:05:01 | 0:05:04 | |
the Japanese are reporting record profits. | 0:05:04 | 0:05:07 | |
The Americans were getting hammered by the Japanese in a lot of things, | 0:05:07 | 0:05:13 | |
but the one that hurt our ego was the automobile market. | 0:05:13 | 0:05:16 | |
To grossly oversimplify, to the point of not being obviously the | 0:05:16 | 0:05:21 | |
whole story, the Japanese were beating us on fundamentally what | 0:05:21 | 0:05:24 | |
we had, very sexy cars, except theirs, actually, | 0:05:24 | 0:05:27 | |
when you turned the ignition key, they started. | 0:05:27 | 0:05:30 | |
-Right. -And our quality was highly questionable. | 0:05:30 | 0:05:34 | |
# There's something inside you... # | 0:05:34 | 0:05:37 | |
Japan has gone from being a producer of trash to a nation synonymous | 0:05:37 | 0:05:41 | |
with quality and reliability. | 0:05:41 | 0:05:42 | |
So this was a moment where the Japanese motor industry was making | 0:05:44 | 0:05:48 | |
cars that were of quality, they were reliable. | 0:05:48 | 0:05:51 | |
-Absolutely. -And the American car industry was going... | 0:05:51 | 0:05:53 | |
Absolutely, and, you know, to be realistic about it, first of all, | 0:05:53 | 0:05:57 | |
you're attacking us in the car industry, and that doesn't feel good | 0:05:57 | 0:06:00 | |
to you and me, and secondly, for God's sakes, it was 1977. | 0:06:00 | 0:06:07 | |
We're only 30 years beyond... Well, what was it? | 0:06:07 | 0:06:10 | |
From the war. | 0:06:10 | 0:06:11 | |
Yeah, and everybody's dad or grandad fought in World War II. | 0:06:11 | 0:06:17 | |
So, you know, we had just beaten these buggers, | 0:06:17 | 0:06:19 | |
and, "What the hell are they doing?" you know? | 0:06:19 | 0:06:21 | |
So the whole thing about, "We won the war, but they've won the peace"? | 0:06:21 | 0:06:24 | |
Absolutely. Precisely so. | 0:06:24 | 0:06:27 | |
But it was about far more than how to build a better car. | 0:06:29 | 0:06:32 | |
What else could America learn from the Japanese? | 0:06:34 | 0:06:36 | |
For Richard Pascale, | 0:06:37 | 0:06:39 | |
the answer lay in the unique Japanese business culture. | 0:06:39 | 0:06:42 | |
Everyone feels this incredible sense of commitment. | 0:06:43 | 0:06:46 | |
You don't come in to hang out and lay back. | 0:06:46 | 0:06:49 | |
You're in there like you would on a team. | 0:06:49 | 0:06:52 | |
You're expected to play and hold up your end of the deal. | 0:06:52 | 0:06:54 | |
So that quality of commitment and energy just resonated with me | 0:06:54 | 0:06:59 | |
in a very deep and profound way. | 0:06:59 | 0:07:01 | |
The companies care not only about their workers' job performance, | 0:07:01 | 0:07:05 | |
but about their health, their family life, | 0:07:05 | 0:07:07 | |
and their ideas for doing things better. | 0:07:07 | 0:07:10 | |
Good ideas can win a Canon camera worker a paid vacation | 0:07:10 | 0:07:13 | |
and a cash bonus. | 0:07:13 | 0:07:15 | |
This attitude seemed to derive from the values set | 0:07:15 | 0:07:18 | |
by the companies' bosses. | 0:07:18 | 0:07:21 | |
Management philosophy was just astonishing. | 0:07:21 | 0:07:23 | |
You had aspirations that allowed people in their work | 0:07:23 | 0:07:27 | |
to have a meaningful sense of purpose. | 0:07:27 | 0:07:30 | |
We felt strongly that this overarching sense of what a company | 0:07:32 | 0:07:36 | |
is about and the meaning it makes seems to make a difference. | 0:07:36 | 0:07:40 | |
Together, they arrived at a system of management with what they called | 0:07:40 | 0:07:43 | |
shared values at its heart. | 0:07:43 | 0:07:46 | |
What we came up with was a way of taking the best of the | 0:07:46 | 0:07:51 | |
Japanese ideas and keeping them coupled to... | 0:07:51 | 0:07:56 | |
God, I hate terms like this because they're such gross | 0:07:56 | 0:07:59 | |
over-generalisation, but America's innovative spirit. | 0:07:59 | 0:08:02 | |
Peters and Pascale's meeting of minds appealed to business leaders, | 0:08:04 | 0:08:09 | |
humbled by Japanese manufacturing prowess. | 0:08:09 | 0:08:12 | |
From now on, the company wouldn't just give employees jobs to do, | 0:08:12 | 0:08:16 | |
it would set all-encompassing values for them to follow. | 0:08:16 | 0:08:20 | |
This was the key to reshaping their businesses. | 0:08:20 | 0:08:23 | |
Both went on to publish books based on their ideas | 0:08:25 | 0:08:28 | |
within a year of each other. | 0:08:28 | 0:08:30 | |
Pascale's The Art Of Japanese Management was published in 1981. | 0:08:30 | 0:08:33 | |
That shared value insight turned out to be both very useful | 0:08:35 | 0:08:41 | |
and, at the time, extremely provocative. | 0:08:41 | 0:08:46 | |
Hard to imagine today, with the whole industry of vision | 0:08:46 | 0:08:49 | |
and values being created in the last 10 or 20 years, | 0:08:49 | 0:08:53 | |
in which everyone has their little plaque on the wall | 0:08:53 | 0:08:56 | |
that states what they aspire to be. | 0:08:56 | 0:08:59 | |
But at the time when we started talking about this, | 0:08:59 | 0:09:02 | |
I remember truly violent reactions from audiences. | 0:09:02 | 0:09:07 | |
What did they say you were trying to do? | 0:09:07 | 0:09:08 | |
Well, they saw it as brainwashing. | 0:09:08 | 0:09:11 | |
They saw it as paternalistic, they found it extremely offensive, | 0:09:11 | 0:09:15 | |
intrusive. Placing companies in a role that was out of their scope | 0:09:15 | 0:09:20 | |
and inappropriate for a commercial enterprise. | 0:09:20 | 0:09:22 | |
But while Pascale's book didn't capture the public's imagination, | 0:09:23 | 0:09:27 | |
Tom Peters took the same ideas and brought them to millions | 0:09:27 | 0:09:31 | |
with his book called In Search Of Excellence. | 0:09:31 | 0:09:35 | |
I don't really want to completely cheapen it | 0:09:35 | 0:09:37 | |
and say I was the populariser, | 0:09:37 | 0:09:39 | |
but I was the populariser. | 0:09:39 | 0:09:41 | |
But you were, actually, redefining how a workplace should be. | 0:09:41 | 0:09:45 | |
Yeah. There is a real intellectual hard... | 0:09:45 | 0:09:49 | |
-Of course. -..no baloney, intellectual history for this stuff. | 0:09:49 | 0:09:52 | |
-Right. -But nobody paid any attention to it. | 0:09:52 | 0:09:54 | |
-Right. -And first of all we got lucky with labels like Excellence and we | 0:09:54 | 0:09:59 | |
got lucky with 10% unemployment and we got lucky by being | 0:09:59 | 0:10:03 | |
caught with our pants down and cars that didn't work. | 0:10:03 | 0:10:06 | |
And so I will completely acknowledge that this new look at organisations | 0:10:06 | 0:10:12 | |
came together with insane timing. | 0:10:12 | 0:10:15 | |
And there was nothing wrong with what we did, | 0:10:15 | 0:10:18 | |
it was a pretty good piece of work, | 0:10:18 | 0:10:19 | |
but the timing was a gift from the gods. | 0:10:19 | 0:10:23 | |
Tom Peters had put a management book on the bestseller list, | 0:10:25 | 0:10:29 | |
selling in excess of five million copies. | 0:10:29 | 0:10:31 | |
It was a modern business blockbuster, | 0:10:31 | 0:10:34 | |
but most importantly, it made management ideas | 0:10:34 | 0:10:37 | |
a serious topic of conversation in organisations. | 0:10:37 | 0:10:40 | |
Its thinking was adopted by companies around the world. | 0:10:40 | 0:10:45 | |
What Peters did was he made culture popular. | 0:10:45 | 0:10:47 | |
He had initially created this very clever thing for McKinsey called the | 0:10:47 | 0:10:52 | |
7S Model, where he identified the soft and hard drivers of | 0:10:52 | 0:10:58 | |
organisation culture and how they pulled themselves together to create | 0:10:58 | 0:11:03 | |
a set of shared values that enabled the delivery of strategy. | 0:11:03 | 0:11:06 | |
And he... It was a very smart piece of thinking. | 0:11:06 | 0:11:09 | |
And it enabled leaders, for the first time, | 0:11:09 | 0:11:11 | |
to get their hands on something and say, | 0:11:11 | 0:11:14 | |
"so if we change this process over here, | 0:11:14 | 0:11:16 | |
"or if we change our reward mechanism over here, | 0:11:16 | 0:11:19 | |
"what we should see is a different outcome, or a different... | 0:11:19 | 0:11:21 | |
"Driven by a different set of behaviours". | 0:11:21 | 0:11:24 | |
Before Peters met Pascale, the idea of your work | 0:11:24 | 0:11:28 | |
having a higher meaning would have seemed ludicrous. | 0:11:28 | 0:11:31 | |
But as their thinking caught on, our work culture changed. | 0:11:31 | 0:11:36 | |
We were asked to see work as having a higher purpose, not just a job. | 0:11:36 | 0:11:42 | |
Work could and would offer us fulfilment. | 0:11:42 | 0:11:46 | |
As the idea of shared values spread, | 0:11:49 | 0:11:52 | |
workplaces around the world became more than just workplaces. | 0:11:52 | 0:11:55 | |
They had mission statements. | 0:12:00 | 0:12:02 | |
They had values that the workers signed up to. | 0:12:02 | 0:12:05 | |
Good morning, everybody. Welcome to Services To Fruit. | 0:12:06 | 0:12:09 | |
For those of you who are new, | 0:12:09 | 0:12:11 | |
Services To Fruit is our opportunity to recognise unsung heroes. | 0:12:11 | 0:12:15 | |
People who have gone the extra mile, | 0:12:15 | 0:12:17 | |
people who have truly lived the values. | 0:12:17 | 0:12:20 | |
This month's Lady Of The Sash is Katrina. | 0:12:20 | 0:12:23 | |
At Innocent, makers of fruit smoothies, | 0:12:30 | 0:12:32 | |
they go further than most to create a unique company culture. | 0:12:32 | 0:12:36 | |
Using team bonding techniques inspired by cult films, | 0:12:37 | 0:12:40 | |
like Bill And Ted's Excellent Adventure, | 0:12:40 | 0:12:43 | |
they see their workforce as a family. | 0:12:43 | 0:12:45 | |
One of my favourite things is this. | 0:12:49 | 0:12:50 | |
So this is... This is our baby photo wall. | 0:12:50 | 0:12:53 | |
When you join Innocent, we will put your baby photo up. | 0:12:53 | 0:12:56 | |
The idea being that you can see... | 0:12:56 | 0:12:58 | |
"This is who I work with, everybody is human, everyone is a baby, | 0:12:58 | 0:13:01 | |
-"everyone has a job to do". -Tim, is your Chief Exec here? | 0:13:01 | 0:13:04 | |
Yeah, Douglas, our chief squeezer, he's actually just down the bottom. | 0:13:04 | 0:13:07 | |
Not at the top, trying to look down on everyone else. | 0:13:07 | 0:13:09 | |
We're one big team, and you're not just here to do that job. | 0:13:09 | 0:13:13 | |
-Yeah. -You're here to make the world a better place. | 0:13:13 | 0:13:16 | |
Do you think someone cynical and diffident like myself | 0:13:16 | 0:13:19 | |
would be able to fit in? | 0:13:19 | 0:13:21 | |
Because I'm the sort of person that would come and if I was interviewed, | 0:13:21 | 0:13:24 | |
I'd be like, "Oh, I'm not going to buy into all this stuff." | 0:13:24 | 0:13:26 | |
Do you see...? I'm just interested in the kind of personality that... | 0:13:26 | 0:13:29 | |
You know, would someone like myself be accepted? | 0:13:29 | 0:13:32 | |
We have people that have been here a long time, 10, 15, 16, 17 years. | 0:13:32 | 0:13:36 | |
And they don't come for beers every Friday and they don't get involved | 0:13:36 | 0:13:39 | |
in all the parties, but they are still about what Innocent are about. | 0:13:39 | 0:13:43 | |
But what if you become too dedicated to your job? | 0:13:43 | 0:13:46 | |
If you have truly drunk in the company values, | 0:13:46 | 0:13:49 | |
then sooner or later, is there a risk that you become your job? | 0:13:49 | 0:13:52 | |
And if you are your job, | 0:13:53 | 0:13:55 | |
what happens when the working day gets longer and longer? | 0:13:55 | 0:13:59 | |
In Japan, shared values have always gone hand in hand | 0:14:01 | 0:14:05 | |
with a long hours culture, and what they call karoshi - | 0:14:05 | 0:14:09 | |
death from overwork. | 0:14:09 | 0:14:12 | |
In terms of this new workplace, | 0:14:12 | 0:14:14 | |
this new really highly competitive poster book, '80s workplace, | 0:14:14 | 0:14:19 | |
it's totally redefined. | 0:14:19 | 0:14:21 | |
And what it is, is that you have to give your all to the company | 0:14:21 | 0:14:24 | |
if you're going to survive. | 0:14:24 | 0:14:26 | |
If you're going to keep your job, it means being... | 0:14:26 | 0:14:29 | |
You know, policing your own productivity. | 0:14:29 | 0:14:31 | |
Yeah, I mean, if I work my one ass off to help the ten people who work | 0:14:31 | 0:14:36 | |
for me be better prepared for the workplace of tomorrow, | 0:14:36 | 0:14:40 | |
they will work their ten asses off to make me look like a superstar. | 0:14:40 | 0:14:45 | |
Innocent have a novel way of curbing long hours working culture. | 0:14:48 | 0:14:51 | |
So this here is our "last leaver pulls lever". | 0:14:53 | 0:14:55 | |
So at the end of the working day, | 0:14:55 | 0:14:58 | |
the last person to leave the floor will make an announcement at around | 0:14:58 | 0:15:01 | |
8:00, 8:30, and they will shut this off, | 0:15:01 | 0:15:03 | |
and it basically stops all non-essential electrical equipment. | 0:15:03 | 0:15:07 | |
The idea being that that last person is telling everyone, "Come on, guys, | 0:15:07 | 0:15:11 | |
"we've had enough, let's go." | 0:15:11 | 0:15:13 | |
But not keeping a lid on your working hours can take its toll. | 0:15:13 | 0:15:17 | |
There are more than 100,000 strokes in the UK each year. | 0:15:23 | 0:15:27 | |
OK, and again. | 0:15:27 | 0:15:29 | |
It's the fourth single leading cause of death in the country. | 0:15:29 | 0:15:33 | |
-Fingers are opening, lovely. -Turn your head. | 0:15:35 | 0:15:37 | |
Most of it is age-related, but a study by University College London, | 0:15:37 | 0:15:40 | |
published in The Lancet, | 0:15:40 | 0:15:42 | |
showed a link between long working hours and a higher risk of stroke. | 0:15:42 | 0:15:45 | |
This patient thinks the stress of running his business | 0:15:45 | 0:15:48 | |
played a part in his. | 0:15:48 | 0:15:50 | |
Reflecting on it, I suppose it was the stress of work. | 0:15:50 | 0:15:55 | |
Yeah. | 0:15:55 | 0:15:56 | |
I have an international business, and I'm flying around the world. | 0:15:56 | 0:16:01 | |
It takes a toll. | 0:16:01 | 0:16:02 | |
And when you look back, is there one thing that you think, | 0:16:02 | 0:16:05 | |
"Oh, yeah, you know, I was really worried about that"? | 0:16:05 | 0:16:07 | |
Or do you think it was just an accumulation of stress? | 0:16:07 | 0:16:09 | |
I suppose it must have been an accumulation. | 0:16:09 | 0:16:12 | |
There's no question, I think, in my mind that the kind of job that | 0:16:12 | 0:16:16 | |
Mr Mehta's doing, often working 12, 18 hours a day | 0:16:16 | 0:16:19 | |
-for nearly 20, 30 years... -Yeah. -..that was a contributing factor. | 0:16:19 | 0:16:23 | |
-And a seven-day week. -Seven days a week, yeah. You never turn off. | 0:16:23 | 0:16:26 | |
-No. -That's the thing, yeah. | 0:16:26 | 0:16:29 | |
Ten years ago, I had a stroke. | 0:16:29 | 0:16:31 | |
A bit like you, it came out of the blue, | 0:16:31 | 0:16:34 | |
and in the ward I was in, everyone who'd had a stroke | 0:16:34 | 0:16:39 | |
was either in their late 30s, or in their 40s, in that ward. | 0:16:39 | 0:16:44 | |
And the common factor amongst all of us was stress. | 0:16:44 | 0:16:49 | |
Everyone had a highly stressed job, whether it be rubbish collection, | 0:16:49 | 0:16:53 | |
or a city broker or a lawyer, you know, | 0:16:53 | 0:16:55 | |
everyone was in there, and they all had high stress jobs, including me. | 0:16:55 | 0:16:59 | |
I can tell you, when I come into the room, when he's not looking, | 0:16:59 | 0:17:03 | |
he's on his iPad answering e-mails before I come into the room. | 0:17:03 | 0:17:08 | |
And that's whilst he's had a stroke within the last six weeks, | 0:17:08 | 0:17:12 | |
and yet, he's just promised us - | 0:17:12 | 0:17:14 | |
and this is not the first time he's promised, by the way - | 0:17:14 | 0:17:16 | |
that he will reduce his working hours. | 0:17:16 | 0:17:19 | |
Quite frankly, the e-mails and mobile phones don't help. | 0:17:19 | 0:17:24 | |
-Because you never switch off. -No. | 0:17:24 | 0:17:26 | |
-You can't escape that. -It's impossible. | 0:17:26 | 0:17:28 | |
Peters, Pascale and their co-writers would foster the idea of | 0:17:30 | 0:17:34 | |
employee devotion to their employers. | 0:17:34 | 0:17:37 | |
But then, over time, our bosses began to persuade themselves | 0:17:37 | 0:17:41 | |
that they didn't need to treat us with the same devotion. | 0:17:41 | 0:17:45 | |
Two decades after Pascale and Peters met, another deal would encourage | 0:17:45 | 0:17:50 | |
companies to hire and fire us at will, no matter how devoted we were. | 0:17:50 | 0:17:55 | |
1997 - the world of business is changing fast. | 0:18:07 | 0:18:12 | |
In America, traditional blue chip companies were losing talented staff | 0:18:14 | 0:18:19 | |
to tech upstarts from Silicon Valley. | 0:18:19 | 0:18:21 | |
They needed a solution, and they paid management consultants McKinsey | 0:18:26 | 0:18:31 | |
to provide it. | 0:18:31 | 0:18:33 | |
Helen Handfield-Jones and her colleagues were convinced that | 0:18:33 | 0:18:36 | |
companies were failing because they were employing the wrong people. | 0:18:36 | 0:18:41 | |
It was in the '90s with the dot-com boom, big companies all over | 0:18:41 | 0:18:44 | |
the place that used to get the cream of the crop | 0:18:44 | 0:18:47 | |
from their recruiting, you know, the IBMs, the GEs, | 0:18:47 | 0:18:51 | |
were losing talent to dot-com start-ups, | 0:18:51 | 0:18:53 | |
which they'd never faced before. | 0:18:53 | 0:18:55 | |
To solve this problem, McKinsey sold a strategy to their clients. | 0:18:57 | 0:19:01 | |
They called it the war for talent. | 0:19:01 | 0:19:04 | |
Using practical examples from companies such as General Electric | 0:19:04 | 0:19:08 | |
and Enron, the authors outlined five imperatives that every leader, | 0:19:08 | 0:19:13 | |
from CEO to unit manager, must act on to build a stronger talent pool. | 0:19:13 | 0:19:18 | |
It completely redefined the relationship | 0:19:18 | 0:19:21 | |
between a company and its workers. | 0:19:21 | 0:19:24 | |
Talent is the sum total of your skills and ability, values, | 0:19:24 | 0:19:29 | |
character that you bring to your work role. | 0:19:29 | 0:19:33 | |
Having great talent in the organisation is critical | 0:19:33 | 0:19:35 | |
to the business success. | 0:19:35 | 0:19:37 | |
Now, it sounds like motherhood and apple pie, right? | 0:19:37 | 0:19:39 | |
Everybody would say that. | 0:19:39 | 0:19:41 | |
But there's some roles that require just a very unique set of skills | 0:19:41 | 0:19:44 | |
that doesn't come along easily, and isn't trainable. | 0:19:44 | 0:19:48 | |
For McKinsey, the most important thing in any company was | 0:19:48 | 0:19:51 | |
having the right employees. The good people needed to be kept, | 0:19:51 | 0:19:55 | |
and underperforming staff let go. | 0:19:55 | 0:19:58 | |
You categorised employees into A, B and C. | 0:19:58 | 0:20:03 | |
And A were people who were super-talented and successful, | 0:20:04 | 0:20:08 | |
-and you just allow them to get on with it. -Mm-hm. | 0:20:08 | 0:20:11 | |
And B were people who were kind of coasting along... | 0:20:11 | 0:20:14 | |
Doing OK, doing a good job. | 0:20:14 | 0:20:16 | |
Doing OK, but we're not sure whether to keep them. | 0:20:16 | 0:20:18 | |
-Yeah. -And then C were people that you sack. | 0:20:18 | 0:20:22 | |
OK. All of that is mischaracterised. | 0:20:22 | 0:20:23 | |
Let me start that again. | 0:20:23 | 0:20:25 | |
The hardest part of talent management is dealing with | 0:21:25 | 0:21:27 | |
low performers. The concept is that... | 0:21:27 | 0:21:29 | |
Look hard at your low performers, have a clear eyed view, | 0:20:29 | 0:20:33 | |
assess their performance rigorously like you do everybody else, | 0:20:33 | 0:20:37 | |
and make a decision. That's what it means. | 0:20:37 | 0:20:38 | |
So you didn't say that the C person should just be sacked? | 0:20:38 | 0:20:42 | |
It's not a person, it's a performer. | 0:20:42 | 0:20:44 | |
-Right? -Oh, sorry. So it's not a human being? | 0:20:44 | 0:20:47 | |
No. No, no. | 0:20:47 | 0:20:48 | |
-No, what I'm saying is... -It's not a person?! It IS a person. | 0:20:48 | 0:20:51 | |
I'm a person sitting in front of you, | 0:20:51 | 0:20:54 | |
being interviewed by my boss. | 0:20:54 | 0:20:55 | |
-Yeah, yeah. -And they're deciding whether I should be sacked or kept. | 0:20:55 | 0:20:58 | |
-Yes, but it... -I'm a person. -Of course you're a person. | 0:20:58 | 0:21:01 | |
But you said I wasn't a person. You said I was a performer. | 0:21:01 | 0:21:03 | |
But you also used the word a moment earlier that said you assess | 0:21:03 | 0:21:06 | |
whether people were TALENTED or not by A, B or C. | 0:21:06 | 0:21:08 | |
I said, "No, that's not right." We assess people on their performance. | 0:21:08 | 0:21:12 | |
-Right. -Are they performing at a high level, an OK level, | 0:21:12 | 0:21:16 | |
-or a poor level? -Right. -Below acceptable level of performance. | 0:21:16 | 0:21:20 | |
They're toxic to the organisation, they're not growing | 0:21:20 | 0:21:22 | |
their businesses, if that's what you need them to do, | 0:21:22 | 0:21:25 | |
they're poor people managers, whatever it is, | 0:21:25 | 0:21:27 | |
whatever dimension of performance - | 0:21:27 | 0:21:29 | |
this is not acceptable performance for a leader of our company. | 0:21:29 | 0:21:32 | |
In the late 1990s and early 2000s, | 0:21:36 | 0:21:38 | |
the war for talent would become hugely influential. | 0:21:38 | 0:21:41 | |
As companies did everything possible to keep their supposedly A-grade | 0:21:43 | 0:21:47 | |
employees, there was a massive increase in executive pay. | 0:21:47 | 0:21:52 | |
So if you go back 20, 25 years, the ratio, say in the UK, | 0:21:52 | 0:21:57 | |
across most of Europe, between a CEO's pay | 0:21:57 | 0:22:01 | |
and the average earner in an organisation, was about 20-1. | 0:22:01 | 0:22:05 | |
And something happened in the late '80s through the '90s to the | 0:22:06 | 0:22:11 | |
early 2000s which saw Chief Executives getting increases | 0:22:11 | 0:22:16 | |
of 100-300% over that period of time. | 0:22:16 | 0:22:18 | |
The flames were fanned by the war-for-talent thinking. | 0:22:18 | 0:22:22 | |
Do you think there's a danger you create a kind of inherent inequality | 0:22:22 | 0:22:27 | |
within an organisation because just by dividing people into A, B and C, | 0:22:27 | 0:22:31 | |
you create resentment, you breed anxiety? | 0:22:31 | 0:22:33 | |
Well, no, again, you make an assumption that's not right. | 0:22:33 | 0:22:36 | |
We do not tell people, "Oh, by the way, you're a B, or you're a B plus, | 0:22:36 | 0:22:40 | |
"or you're an A, whatever. "You have potential..." | 0:22:40 | 0:22:42 | |
We don't tell people that. It's not about stamping letter grades | 0:22:42 | 0:22:45 | |
on people's foreheads or having Crown Princes walk around | 0:22:45 | 0:22:49 | |
with crowns on their heads. | 0:22:49 | 0:22:50 | |
It's about giving each person the feedback that's most helpful and | 0:22:50 | 0:22:53 | |
appropriate to them, about what they can do to perform better, | 0:22:53 | 0:22:57 | |
and how their careers might unfold in this organisation. | 0:22:57 | 0:23:00 | |
So, Helen, has one of the legacies of the war for talent been, | 0:23:00 | 0:23:03 | |
in a way, the idea that we all need to be talented now in the workplace? | 0:23:03 | 0:23:07 | |
Absolutely, yeah. | 0:23:07 | 0:23:09 | |
These companies have lessened their commitment, long-term, to people. | 0:23:09 | 0:23:14 | |
In the old days, you hired somebody full-time, | 0:23:14 | 0:23:16 | |
and you hired them forever. And people thought, | 0:23:16 | 0:23:19 | |
"I have a job with one company, I stay there forever." | 0:23:19 | 0:23:22 | |
Right? That's long gone. | 0:23:22 | 0:23:23 | |
There's no free lunches any more, right? For anybody. | 0:23:23 | 0:23:27 | |
The world of work had changed significantly since | 0:23:29 | 0:23:32 | |
Peters and Pascale's vision of shared values - | 0:23:32 | 0:23:35 | |
being loyal to the workplace and your boss being loyal to you. | 0:23:35 | 0:23:39 | |
The importance of shared values | 0:23:40 | 0:23:43 | |
shouldn't be underestimated as a concept. | 0:23:43 | 0:23:46 | |
It's probably the lasting huge thought that is still with us and | 0:23:46 | 0:23:52 | |
should keep going with us because, of course, it was shared values that | 0:23:52 | 0:23:56 | |
we got wrong in the '90s and the early 2000s with the war for talent | 0:23:56 | 0:23:59 | |
and the L'Oreal generation - the shared value became "Me". | 0:23:59 | 0:24:02 | |
It didn't become a greater creation of wealth. | 0:24:04 | 0:24:06 | |
It didn't become doing the right thing, and we lost the sort of | 0:24:06 | 0:24:11 | |
Aristotle version of a virtuous or a good leader. | 0:24:11 | 0:24:15 | |
The ideas in the war for talent would affect us all. | 0:24:19 | 0:24:22 | |
It told businesses worldwide that it was OK to hire and fire us. | 0:24:23 | 0:24:29 | |
Today, the average worker has six jobs | 0:24:29 | 0:24:32 | |
over the course of his or her life. | 0:24:32 | 0:24:35 | |
New graduates expect to have 20. | 0:24:35 | 0:24:39 | |
The job for life is over. | 0:24:39 | 0:24:41 | |
And it was the tech firms, whose rise prompted the war for talent | 0:24:43 | 0:24:47 | |
in the first place, who took the idea to its logical extreme. | 0:24:47 | 0:24:51 | |
At the beginning of the shift, I get my phone out, | 0:24:52 | 0:24:55 | |
log in, and set myself as available, and off I go. | 0:24:55 | 0:24:57 | |
A world without long-term employees, | 0:24:59 | 0:25:02 | |
where even the very idea of staff is redundant. | 0:25:02 | 0:25:06 | |
I normally do Monday to Friday lunch shifts, | 0:25:07 | 0:25:10 | |
10-2, and a couple evenings on top of that. | 0:25:10 | 0:25:14 | |
Meg Brown is a casual worker, part of the so-called gig economy. | 0:25:17 | 0:25:21 | |
Instead of a regular wage, she gets paid for the gigs she does, | 0:25:21 | 0:25:25 | |
delivering restaurant food to customers' front doors. | 0:25:25 | 0:25:29 | |
It's estimated that five million people globally are employed | 0:25:29 | 0:25:32 | |
this way, managed by a computer algorithm, and not by a person. | 0:25:32 | 0:25:37 | |
What kind of guarantees of work do you have? | 0:25:37 | 0:25:40 | |
You'll wait for the orders to come in. | 0:25:40 | 0:25:42 | |
-Right. -There'll usually be a peak time around the meal times. | 0:25:42 | 0:25:47 | |
-Right, yeah. -So it really varies. | 0:25:47 | 0:25:48 | |
Anything could change at any moment. | 0:25:53 | 0:25:56 | |
I don't know what's happening on the other end. | 0:25:56 | 0:26:00 | |
I don't know if business is slow, | 0:26:00 | 0:26:03 | |
or it's just because I'm not being assigned a job. | 0:26:03 | 0:26:07 | |
I'm vulnerable, and at the mercy of the algorithm, | 0:26:07 | 0:26:11 | |
and whoever's at control on the other end. | 0:26:11 | 0:26:15 | |
Half the time, you don't get to see a human being and you don't know | 0:26:15 | 0:26:18 | |
who... What kind of personalities you're interacting with, | 0:26:18 | 0:26:21 | |
or if you are interacting with any personality. | 0:26:21 | 0:26:23 | |
It does feel like I'm missing something in that human-contact way | 0:26:23 | 0:26:29 | |
of just interacting with a screen. | 0:26:29 | 0:26:31 | |
-There's no human being doing it, it's the algorithms. -Yeah. | 0:26:31 | 0:26:33 | |
We keep talking... I mean, this film is all about the future of work, | 0:26:33 | 0:26:36 | |
but actually the algorithms are deciding how you work now, | 0:26:36 | 0:26:39 | |
and whether you are even employed or not. | 0:26:39 | 0:26:41 | |
Yeah. It makes me feel a bit dispensable. | 0:26:41 | 0:26:44 | |
Algorithms are the new boss. | 0:26:46 | 0:26:47 | |
Algorithms telling them where to go, to do the service. | 0:26:49 | 0:26:53 | |
Algorithms determining the price of the service. | 0:26:53 | 0:26:57 | |
The algorithm can also sack you these days. | 0:26:58 | 0:27:01 | |
So there's quite a lot of platforms. There's a rating system of points | 0:27:01 | 0:27:04 | |
or stars, and if you fall below a certain threshold, | 0:27:04 | 0:27:08 | |
you get terminated automatically. | 0:27:08 | 0:27:10 | |
You might just try and log on one day to your app, | 0:27:10 | 0:27:13 | |
and it simply no longer says that you've got an active account. | 0:27:13 | 0:27:16 | |
Algorithm management has become algorithm rebellion. | 0:27:17 | 0:27:21 | |
Gig economy workers torn between the idea of being your own boss | 0:27:21 | 0:27:25 | |
while being controlled by the smartphones in their pockets are | 0:27:25 | 0:27:28 | |
arguing for the same employment rights as everyone else. | 0:27:28 | 0:27:31 | |
An independent review is currently looking into this. | 0:27:31 | 0:27:34 | |
The way that this type of work is sold, | 0:27:35 | 0:27:38 | |
whether it be Deliveroo or any company that does this, | 0:27:38 | 0:27:41 | |
is that it's flexibility, that it gives... | 0:27:41 | 0:27:44 | |
And that, you know, this is a great way to work, this is the future. | 0:27:44 | 0:27:47 | |
The flexibility is overplayed massively. | 0:27:47 | 0:27:50 | |
No guarantee of minimum wage, let alone the living wage. | 0:27:50 | 0:27:54 | |
It's a really insecure way to live. | 0:27:54 | 0:27:56 | |
It's hard to plan your life on this kind of wages, | 0:27:56 | 0:28:00 | |
and this kind of nature of work. | 0:28:00 | 0:28:03 | |
The war for talent was a critical driver of corporate performance, | 0:28:06 | 0:28:11 | |
but employers have since realised that technology | 0:28:11 | 0:28:14 | |
can make that process ruthlessly efficient. | 0:28:14 | 0:28:16 | |
And today, that insight has spread far beyond companies | 0:28:19 | 0:28:22 | |
like Deliveroo and Uber. Soon, everyone could be monitored, | 0:28:22 | 0:28:27 | |
including those working in professional office jobs. | 0:28:27 | 0:28:30 | |
In Boston, Massachusetts, there's one company ensuring | 0:28:32 | 0:28:35 | |
none of us can escape the computer's all-seeing eye. | 0:28:35 | 0:28:38 | |
Their technology allows companies to monitor the productivity of their | 0:28:40 | 0:28:44 | |
staff and employees to monitor their own workplace performance | 0:28:44 | 0:28:48 | |
through technology embedded into a card you wear around your neck. | 0:28:48 | 0:28:52 | |
They are already selling their services to companies like | 0:28:53 | 0:28:56 | |
Deloitte and Bank of America. | 0:28:56 | 0:28:58 | |
This measures with Bluetooth and accelerometer, | 0:28:58 | 0:29:02 | |
a microphone and a GPS, | 0:29:02 | 0:29:04 | |
it understands my activity level, it understands where I am. | 0:29:04 | 0:29:08 | |
Am I at a desk? Am I on a different floor? Am I in a conference room? | 0:29:08 | 0:29:12 | |
It understands only two things about speech - | 0:29:12 | 0:29:16 | |
am I talking or not talking? | 0:29:16 | 0:29:19 | |
Combining all of this across my day in the job, | 0:29:19 | 0:29:24 | |
it understands how I actually get work done. | 0:29:24 | 0:29:26 | |
And when you combine that with all the digital signatures that we are | 0:29:26 | 0:29:29 | |
also patterning, it directly translates into, "Am I happy? | 0:29:29 | 0:29:32 | |
"Am I loyal to the organisation? | 0:29:32 | 0:29:34 | |
"And as a company, are we productive and are we profitable?" | 0:29:34 | 0:29:37 | |
You mentioned the word "happiness" there, which intrigues me. | 0:29:37 | 0:29:40 | |
So how can you measure someone's happiness through the data? | 0:29:40 | 0:29:43 | |
With all of these different signatures that we are picking up | 0:29:43 | 0:29:46 | |
on, we can interpret stress levels. | 0:29:46 | 0:29:49 | |
As a manager, I can understand if any particular of my employees are | 0:29:50 | 0:29:53 | |
being overworked, overstressed, | 0:29:53 | 0:29:57 | |
and are translating to lack of happiness and lack of productivity. | 0:29:57 | 0:30:01 | |
The data collected on staff can also monitor communication patterns. | 0:30:03 | 0:30:08 | |
The thicker lines suggest more communication, | 0:30:08 | 0:30:11 | |
the thinner lines, less. | 0:30:11 | 0:30:13 | |
It monitors whether certain staff are dominating situations | 0:30:13 | 0:30:16 | |
or working collaboratively. | 0:30:16 | 0:30:19 | |
So who's the central person...? | 0:30:19 | 0:30:20 | |
-Yeah. -Being the size of organisation we are, we might have insight into | 0:30:22 | 0:30:25 | |
who that is, but what this is a perfect example of | 0:30:25 | 0:30:28 | |
is what we all describe as bottleneck. | 0:30:28 | 0:30:31 | |
A single person in our organisation | 0:30:31 | 0:30:33 | |
has unintentionally been a bottleneck, | 0:30:33 | 0:30:35 | |
so what's been happening the last month that has drawn this behaviour? | 0:30:35 | 0:30:40 | |
And what can we do going forward | 0:30:40 | 0:30:42 | |
to mitigate or eliminate that bottleneck behaviour? | 0:30:42 | 0:30:45 | |
But you see, you know... | 0:30:45 | 0:30:47 | |
You said you've got quite a fair idea of who that person is. | 0:30:47 | 0:30:50 | |
-Oh, absolutely. -So you don't need... In a way, the anonymity | 0:30:50 | 0:30:53 | |
doesn't matter because, just through the data, | 0:30:53 | 0:30:55 | |
you can work out who that person was because you know your employees | 0:30:55 | 0:30:58 | |
and, therefore, you put two and two together. | 0:30:58 | 0:31:00 | |
-In this case. -Yeah. | 0:31:00 | 0:31:02 | |
-In this case, but it's usually only for those anomalies. -Yeah. | 0:31:02 | 0:31:06 | |
I suppose the anonymity thing seems a really key element of it | 0:31:06 | 0:31:09 | |
because I think that's the bit that would frighten people - | 0:31:09 | 0:31:11 | |
the idea that, "Oh, they're going to drill down into me personally, | 0:31:11 | 0:31:15 | |
"and my data, and that's going to expose me within the company." | 0:31:15 | 0:31:18 | |
So... | 0:31:18 | 0:31:20 | |
there must be a temptation, if a company comes along and says, | 0:31:20 | 0:31:23 | |
"Can you provide this service?" | 0:31:23 | 0:31:25 | |
Would you say "yes" or would you say "No, we can't do that"? | 0:31:25 | 0:31:28 | |
It violates our privacy policy, which starts and is rooted in | 0:31:28 | 0:31:34 | |
personal privacy and personal ownership of your behavioural data. | 0:31:34 | 0:31:39 | |
And you can do good for the organisation and maintain | 0:31:39 | 0:31:43 | |
that simple truth by exposing the patterns of the individual and from | 0:31:43 | 0:31:48 | |
a top-down perspective exposing those patterns anonymously | 0:31:48 | 0:31:50 | |
about the individual, | 0:31:50 | 0:31:52 | |
-but exposing those patterns from a management perspective down. -Right. | 0:31:52 | 0:31:56 | |
So there is a line to be drawn, really, and that's the line, really, | 0:31:56 | 0:31:58 | |
-the line of privacy. -And you draw it in the beginning. | 0:31:58 | 0:32:01 | |
Amazing. It's a staggering level, isn't it? | 0:32:01 | 0:32:03 | |
I'm just kind of... It does really blow your mind, | 0:32:03 | 0:32:06 | |
the level to which you are able to analyse every single aspect of | 0:32:06 | 0:32:11 | |
what you're doing at work. It's mind-blowing. | 0:32:11 | 0:32:13 | |
It's already happening. Your manager is coming in and saying, | 0:32:13 | 0:32:16 | |
"I think you're blocking everybody else's communication | 0:32:16 | 0:32:20 | |
"and you're forcing, you know, | 0:32:20 | 0:32:22 | |
"one office to communicate with the other office through you". | 0:32:22 | 0:32:25 | |
The ability to capture that, analyse it, | 0:32:25 | 0:32:28 | |
that computer science really hasn't existed at scale | 0:32:28 | 0:32:31 | |
except for the last handful of years. | 0:32:31 | 0:32:33 | |
I have never seen an office where people are wearing a badge | 0:32:36 | 0:32:40 | |
that collects all the information on every single thing you do, | 0:32:40 | 0:32:44 | |
from the moment you walk in to the moment you leave. | 0:32:44 | 0:32:47 | |
So, basically, this little box tells them who's come into the room. | 0:32:47 | 0:32:54 | |
It's just... It's just unbelievable. | 0:32:56 | 0:32:59 | |
And what it is, what's really extraordinary about it is it's | 0:32:59 | 0:33:02 | |
given to you, so you can then police your own productivity | 0:33:02 | 0:33:06 | |
because we ourselves are our harshest critics, | 0:33:06 | 0:33:10 | |
and they know that. That's the genius of this stuff. | 0:33:10 | 0:33:13 | |
But how soon before technology doesn't just enable staff to work | 0:33:18 | 0:33:21 | |
more effectively, but takes over completely? | 0:33:21 | 0:33:25 | |
Doctors, accountants, even lawyers, | 0:33:28 | 0:33:30 | |
are starting to see their professions change. | 0:33:30 | 0:33:32 | |
And it's all thanks to a deal made | 0:33:34 | 0:33:36 | |
in a sleepy town in upstate New York. | 0:33:36 | 0:33:39 | |
# Silver coins that jingle jangle | 0:33:41 | 0:33:47 | |
# Fancy shoes...# | 0:33:47 | 0:33:48 | |
In 2004, a man called Charles Lickel came to this steakhouse | 0:33:48 | 0:33:52 | |
with his colleagues after a hard day's work. | 0:33:52 | 0:33:55 | |
Lickel was head of computing system software at tech giant IBM. | 0:33:56 | 0:34:01 | |
As they ate, at 7:00 on the dot, something strange happened. | 0:34:05 | 0:34:09 | |
APPLAUSE | 0:34:09 | 0:34:11 | |
All of the diners leapt up from their seats | 0:34:15 | 0:34:18 | |
and rushed out of the room. | 0:34:18 | 0:34:20 | |
The TV above the bar had just started showing that night's episode | 0:34:20 | 0:34:24 | |
of the long-running TV quiz show Jeopardy! | 0:34:24 | 0:34:27 | |
This is Jeopardy! | 0:34:27 | 0:34:30 | |
Here are today's contestants. | 0:34:31 | 0:34:33 | |
A risk analytics manager from Irvine, California, | 0:34:33 | 0:34:37 | |
Patrick Fernandez. | 0:34:37 | 0:34:38 | |
Reigning champion, Ken Jennings, was on a record-breaking winning streak. | 0:34:38 | 0:34:44 | |
Everyone in the restaurant, everyone in America, | 0:34:44 | 0:34:47 | |
was desperate to find out if he could keep it going. | 0:34:47 | 0:34:50 | |
Lickel, his colleagues, and the nation, were gripped. | 0:34:50 | 0:34:53 | |
Let's go to Ken Jennings now. He selected Matthew and Andrew. | 0:34:53 | 0:34:56 | |
And he risked 5,800, that's right. | 0:34:56 | 0:34:59 | |
APPLAUSE | 0:34:59 | 0:35:00 | |
29,000 is what you get today and it brings your total to... | 0:35:00 | 0:35:04 | |
The unstoppable Ken Jennings did it again that night. | 0:35:04 | 0:35:07 | |
He seemed more machine than human. | 0:35:07 | 0:35:10 | |
At that moment, standing at that bar, Lickel had an epiphany. | 0:35:10 | 0:35:14 | |
He suddenly saw the perfect challenge - | 0:35:14 | 0:35:16 | |
to invent a computer program that could beat a human | 0:36:16 | 0:36:19 | |
at the world's most popular quiz show. | 0:35:19 | 0:35:21 | |
IBM would do a deal with the makers of Jeopardy! | 0:35:23 | 0:35:27 | |
They would create artificial intelligence that could | 0:35:27 | 0:35:29 | |
beat the game's greatest players, | 0:35:29 | 0:35:32 | |
a machine that could understand human language, | 0:35:32 | 0:35:36 | |
access hoards of information, and answer questions. | 0:35:36 | 0:35:40 | |
The man tasked to create it was David Ferrucci. | 0:35:42 | 0:35:45 | |
When I heard about Jeopardy!, I said, "We've got to do this". | 0:35:47 | 0:35:50 | |
This is exactly the kind of thing we've been working on for years. | 0:35:50 | 0:35:52 | |
Of course, the problem was, most people were saying, | 0:35:52 | 0:35:55 | |
"No, this is impossible. You're going to embarrass IBM, | 0:35:55 | 0:35:57 | |
"essentially. You're going to get on television, this is pure folly, | 0:35:57 | 0:36:00 | |
"you're going to fall on your face, | 0:36:00 | 0:36:02 | |
"and you're going to embarrass the whole company." | 0:36:02 | 0:36:04 | |
We had these meetings with all these executives around big | 0:36:04 | 0:36:07 | |
conference tables, and they would say, | 0:36:07 | 0:36:08 | |
"Dave, you're mucking with the IBM brand. You need to win. | 0:36:08 | 0:36:11 | |
"Whatever you want, just tell us what you need. | 0:36:13 | 0:36:16 | |
"Tell us what you need to win." | 0:36:16 | 0:36:18 | |
IBM already had Deep Blue, which had beaten Garry Kasparov at chess. | 0:36:18 | 0:36:21 | |
So how was this challenge going to be different? | 0:36:21 | 0:36:24 | |
Jeopardy! can ask about anything, | 0:36:24 | 0:36:25 | |
-whereas chess, you always knew what you were dealing with. -Yeah. | 0:36:25 | 0:36:28 | |
I mean, there's a bunch of pieces, they move in certain ways. | 0:36:28 | 0:36:30 | |
There's a finite set of moves. You knew what you were dealing with. | 0:36:30 | 0:36:33 | |
This space was wholly unconstrained. The only thing we had was the prior | 0:36:33 | 0:36:36 | |
data, in other words, answers had to come from places like | 0:36:36 | 0:36:38 | |
Wikipedia, dictionaries, novels, plays, the Bible, | 0:36:38 | 0:36:41 | |
wherever this stuff could come from. So you had that content base, | 0:36:41 | 0:36:44 | |
but to take that question and figure out what that question was asking, | 0:36:44 | 0:36:48 | |
and you don't want to buzz in and be wrong. | 0:36:48 | 0:36:51 | |
So you had to be able to compute your confidence that you were going | 0:36:51 | 0:36:53 | |
-to be right. -You must have needed to do some | 0:36:53 | 0:36:55 | |
trial runs, so what were the trial runs like? | 0:36:55 | 0:36:57 | |
Were there any glitches or anything? | 0:36:57 | 0:37:00 | |
In the beginning, it was terrible. | 0:37:00 | 0:37:01 | |
I mean, it would make ridiculous, ridiculous mistakes. | 0:37:01 | 0:37:05 | |
And at that time, Watson took two hours to answer a single question, | 0:37:05 | 0:37:10 | |
so we actually couldn't play it live. | 0:37:10 | 0:37:12 | |
It would be a very boring, long game. | 0:37:12 | 0:37:14 | |
This went on for a couple of years, | 0:37:14 | 0:37:16 | |
until we got to a point where Watson could answer a question in | 0:37:16 | 0:37:20 | |
three seconds, between two and three seconds. | 0:37:20 | 0:37:22 | |
-Whoa! -And that was fast enough to play. | 0:37:22 | 0:37:26 | |
The day you go into the studio, Dave, what are you thinking? | 0:37:26 | 0:37:29 | |
As a scientist, I was going in saying, "I wish I could just | 0:37:29 | 0:37:31 | |
"publish the practice games," because if you just set up | 0:37:31 | 0:37:34 | |
the AI challenge, you know, we did it. | 0:37:34 | 0:37:37 | |
We had a very competitive Jeopardy! player that can beat the best humans | 0:37:37 | 0:37:40 | |
at something that no-one thought was even possible. | 0:37:40 | 0:37:43 | |
Developed and programmed especially for this moment, | 0:37:43 | 0:37:46 | |
ladies and gentlemen, this is Watson. | 0:37:46 | 0:37:48 | |
APPLAUSE | 0:37:48 | 0:37:50 | |
This was going to come down to one game. | 0:37:50 | 0:37:52 | |
And if we go out there and we lose that one game, that was going to | 0:37:52 | 0:37:55 | |
be on television, it would look like the team had lost. | 0:37:55 | 0:37:57 | |
And that was just a painful thing to think about because, I mean, | 0:37:57 | 0:38:01 | |
I was in there day-to-day, trying to make this thing happen. | 0:38:01 | 0:38:04 | |
-Watson? -Who is CS Lewis? | 0:38:04 | 0:38:06 | |
-Yes. -Who is Sir Christopher Wren? | 0:38:06 | 0:38:09 | |
-You are right. -What is Picasso? | 0:38:09 | 0:38:11 | |
-No, sorry. -What is narcolepsy? | 0:38:11 | 0:38:13 | |
You are right and, with that, you move to 36,681. | 0:38:13 | 0:38:19 | |
-APPLAUSE -A big, big lead. | 0:38:19 | 0:38:21 | |
When Ken saw Watson get the daily double - | 0:38:21 | 0:38:23 | |
Ken knows Jeopardy! really well, and he could do math - | 0:38:23 | 0:38:26 | |
and he was like, "That's it, I lost". | 0:38:26 | 0:38:28 | |
Even though you were only 32% sure of your response, you are correct. | 0:38:28 | 0:38:33 | |
Final Jeopardy!, Watson still would have won by a dollar... | 0:38:37 | 0:38:41 | |
That's...! | 0:38:41 | 0:38:42 | |
Which probably would have had a very different effect. | 0:38:42 | 0:38:46 | |
IBM's future is resting on 1! | 0:38:46 | 0:38:49 | |
1! Isn't that fantastic? | 0:38:49 | 0:38:53 | |
We weren't just building a game machine. | 0:38:53 | 0:38:55 | |
This was a machine that was taught to process language more effectively | 0:38:55 | 0:38:59 | |
than any other machine before, and that was a great result for us. | 0:38:59 | 0:39:05 | |
And I think that got people saying, "Hey, what else can this do? | 0:39:05 | 0:39:07 | |
"Can this kind of intelligence help us do things?" | 0:39:07 | 0:39:10 | |
Watson's ability to understand natural language is unprecedented | 0:39:13 | 0:39:16 | |
and IBM are now selling the software worldwide. | 0:39:16 | 0:39:20 | |
As a result, it's poised to take a whole host of jobs | 0:39:20 | 0:39:23 | |
previously thought off-limits to machines. | 0:39:23 | 0:39:26 | |
It's being used to do the work of lawyers, | 0:39:26 | 0:39:28 | |
to calculate insurance policies, | 0:39:28 | 0:39:31 | |
and even to provide technology-enhanced | 0:39:31 | 0:39:33 | |
patient care in hospitals. | 0:39:33 | 0:39:35 | |
We're working with IBM Watson to actually create a platform that we | 0:39:37 | 0:39:40 | |
can answer children's questions when they want to have them answered. | 0:39:40 | 0:39:43 | |
People talk about bedside manner, but can you create a digital | 0:39:43 | 0:39:46 | |
bedside manner? So that's the vision, is to have an actual | 0:39:46 | 0:39:49 | |
artificial intelligence built into how we engage with our children. | 0:39:49 | 0:39:53 | |
Right, now, what I was wanting to have a go at was to see if I can try | 0:39:53 | 0:39:56 | |
and train this computer to ask questions that you have. | 0:39:56 | 0:40:00 | |
But to do that, I need to know what questions you want to know about | 0:40:00 | 0:40:04 | |
and also if you think the answers are any good. | 0:40:04 | 0:40:06 | |
Because I'm training it at the moment, because I can't be here | 0:40:06 | 0:40:08 | |
all the time, because I have to go from patient to patient. | 0:40:08 | 0:40:11 | |
But especially for the easier questions, if you just ask this, | 0:40:11 | 0:40:14 | |
and then we can spend more time talking about important things | 0:40:14 | 0:40:17 | |
that you want to know about. | 0:40:17 | 0:40:19 | |
Is it going to hurt? | 0:40:19 | 0:40:22 | |
I understand that you might be worried about having an operation. | 0:40:22 | 0:40:25 | |
That's OK. During the operation, you won't feel anything, | 0:40:25 | 0:40:28 | |
as we will give you an anaesthetic. | 0:40:28 | 0:40:30 | |
Watson is being used at this Children's Hospital in Liverpool | 0:40:30 | 0:40:34 | |
to help answer patients' questions on everything from the hospital menu | 0:40:34 | 0:40:38 | |
to details of their treatment. | 0:40:38 | 0:40:41 | |
How long does it take to fall asleep? | 0:40:41 | 0:40:44 | |
The project is part of a wider big data research collaboration between | 0:40:44 | 0:40:49 | |
IBM and the Government's Science and Technology Facilities Council, | 0:40:49 | 0:40:52 | |
worth £315 million. | 0:40:52 | 0:40:56 | |
If successful, and rolled out nationally, | 0:40:56 | 0:40:58 | |
it could generate savings for the NHS. | 0:40:58 | 0:41:01 | |
What have you attended for? Well, you were having your appendix | 0:41:01 | 0:41:04 | |
taken out, weren't you? Nope, it's not got that right. | 0:41:04 | 0:41:07 | |
OK. So I'm going to train it by saying that was the wrong answer. | 0:41:07 | 0:41:11 | |
OK, so we'll give that one an unhappy face. | 0:41:11 | 0:41:14 | |
Every so often, it gives you the wrong answer and you kind of go, | 0:41:14 | 0:41:17 | |
"No, that's the wrong thing to say". And then it learns from that, | 0:41:17 | 0:41:19 | |
and it starts to progress, so that's what it's like at the start. | 0:41:19 | 0:41:22 | |
So who knows what it'll be like in five or ten years' time? | 0:41:22 | 0:41:25 | |
Jobs that require empathy were for years considered the last frontier | 0:41:27 | 0:41:31 | |
for automation, but no longer. | 0:41:31 | 0:41:34 | |
Now we have algorithms that are able to substitute for cognitive work | 0:41:34 | 0:41:39 | |
in that, in the previous realm, we had seen machines only able to | 0:41:39 | 0:41:43 | |
replace manual work. Now it is no longer clear what it is that humans | 0:41:43 | 0:41:49 | |
are fundamentally better at than machines. | 0:41:49 | 0:41:52 | |
And among all the jobs that we are slowly losing to machines, | 0:41:53 | 0:41:57 | |
one stands out - teaching. | 0:41:57 | 0:42:00 | |
Children can now be educated by computers | 0:42:02 | 0:42:05 | |
in schools that look like workplaces. | 0:42:05 | 0:42:07 | |
1990 - two political scientists secure a grant from a | 0:42:14 | 0:42:19 | |
conservative American think tank to carry out a major survey | 0:42:19 | 0:42:23 | |
into the US education system. | 0:42:23 | 0:42:25 | |
They wanted to understand why, after more than a quarter century of | 0:42:27 | 0:42:31 | |
costly education reform, the nation's schools have proven | 0:42:31 | 0:42:35 | |
so resistant to change. | 0:42:35 | 0:42:37 | |
The key to the problem, they concluded, | 0:42:37 | 0:42:39 | |
was the powerful teaching unions. | 0:42:39 | 0:42:41 | |
Terry Moe was one of those political scientists | 0:42:44 | 0:42:46 | |
and has since written extensively on the politics of American education. | 0:42:46 | 0:42:50 | |
What you want are schools and school systems that have incentives to | 0:42:52 | 0:42:56 | |
create the best possible schools and organisations and | 0:42:56 | 0:43:02 | |
education environments for children. | 0:43:02 | 0:43:04 | |
And if it turns out that certain teaching methods, | 0:43:04 | 0:43:09 | |
certain approaches to learning, are best, | 0:43:09 | 0:43:11 | |
those are the ones that should be adopted. | 0:43:11 | 0:43:13 | |
But if powerful groups have | 0:43:13 | 0:43:17 | |
embraced methods and approaches that don't do that, | 0:43:17 | 0:43:23 | |
-they're still going to do it. -Right. | 0:43:23 | 0:43:25 | |
They are still going to impose these things on children because | 0:43:25 | 0:43:28 | |
they believe them and they're powerful enough to entrench them | 0:43:28 | 0:43:31 | |
in the schools. Teachers, I think, on average, love kids, you know, | 0:43:31 | 0:43:36 | |
they want schools that are really productive. | 0:43:36 | 0:43:39 | |
However, they join unions to protect their jobs. | 0:43:39 | 0:43:43 | |
But you know, bigger picture, what's the point of the school system? | 0:43:43 | 0:43:47 | |
It's to educate children. That's the ONLY point of the school system. | 0:43:47 | 0:43:51 | |
It was never set up in order to provide jobs to adults. | 0:43:51 | 0:43:54 | |
Moe decided the only alternative to internal politics in teaching was | 0:43:55 | 0:44:00 | |
to open up the education system to market forces of supply and demand, | 0:44:00 | 0:44:05 | |
an argument he set out in a book which would go on to influence | 0:44:05 | 0:44:08 | |
the British education system, too. | 0:44:08 | 0:44:11 | |
The book was published, we had the Blair government come in, | 0:44:11 | 0:44:14 | |
New Labour in the '90s, you're hailed as prophets. | 0:44:14 | 0:44:17 | |
Revolutionising... And we get the academy system, | 0:44:17 | 0:44:20 | |
which is... | 0:44:20 | 0:44:21 | |
Which is basically an attempt to put your ideas into practice. | 0:44:21 | 0:44:26 | |
So, how did you feel that went? | 0:44:26 | 0:44:28 | |
Did you see it as an endorsement of your ideas? | 0:44:28 | 0:44:31 | |
Yes. I mean, I think that, first of all, | 0:44:32 | 0:44:36 | |
I think we're right in the sense that politics does | 0:44:36 | 0:44:40 | |
lead to schools that are perversely organised | 0:44:40 | 0:44:45 | |
and you can expect that to happen. | 0:44:45 | 0:44:46 | |
# Funky life, I've been told | 0:44:46 | 0:44:51 | |
# All that glitters is not gold | 0:44:51 | 0:44:54 | |
# And gold is not reality... # | 0:44:54 | 0:44:55 | |
What Moe and associate John Chubb concluded was that technology | 0:44:55 | 0:44:59 | |
is the key to breaking down the politics of education. | 0:44:59 | 0:45:03 | |
Technology threatens the idea of school as a building, | 0:45:03 | 0:45:07 | |
with kids and teachers in the same physical place. | 0:45:07 | 0:45:10 | |
Technology could make traditional teaching roles obsolete. | 0:45:10 | 0:45:14 | |
In that book, you argue that technology | 0:45:14 | 0:45:17 | |
is the way to transform the class, | 0:45:17 | 0:45:20 | |
in every way, to break the teacher unions, and to do it, | 0:45:20 | 0:45:23 | |
you bring technology, incrementally, into the classroom, | 0:45:23 | 0:45:26 | |
and that way teachers slowly become marginalised. | 0:45:26 | 0:45:28 | |
Yeah, in effect. | 0:45:28 | 0:45:29 | |
So let's go back to the beginning. | 0:45:31 | 0:45:33 | |
People tend to think of this as, like, greater reliance on computers | 0:45:33 | 0:45:37 | |
in the classroom, you know. | 0:45:37 | 0:45:39 | |
But big picture - | 0:45:39 | 0:45:41 | |
what we're talking about here is the revolution in | 0:45:41 | 0:45:44 | |
information technology, which is one of the most powerful | 0:45:44 | 0:45:50 | |
transformative events in the history of the world, you know. | 0:45:50 | 0:45:54 | |
It sounds like hyperbole, you know, but it is. | 0:45:54 | 0:45:58 | |
They talk very directly about how you can use technology to kind of | 0:45:58 | 0:46:03 | |
usurp the power of the teaching unions | 0:46:03 | 0:46:06 | |
and you can kind of fragment the workforce. | 0:46:06 | 0:46:10 | |
They also talk about the opening up of data, so if you take the data | 0:46:10 | 0:46:13 | |
out of the black box that is the classroom, | 0:46:13 | 0:46:16 | |
and basically out of the teacher's head, | 0:46:16 | 0:46:19 | |
then you can start to take the power away from teachers. | 0:46:19 | 0:46:21 | |
You know, data in the hands of a good teacher is a very... | 0:46:21 | 0:46:25 | |
It can be a powerful tool. | 0:46:25 | 0:46:27 | |
But if your reason for doing it is to introduce market forces | 0:46:27 | 0:46:31 | |
and ultimately a consumer market in education products, | 0:46:31 | 0:46:34 | |
which is where they're heading with this, | 0:46:34 | 0:46:37 | |
then at the very least we need to have a conversation about this. | 0:46:37 | 0:46:41 | |
Is this about improving education for children and their prospects for | 0:46:41 | 0:46:45 | |
the future, or is it about the privatisation of education? | 0:46:45 | 0:46:49 | |
Listen up real quick, guys. | 0:46:50 | 0:46:52 | |
Just to let you know, we have about 20 minutes left of class. | 0:46:52 | 0:46:55 | |
Let me know if you guys have any questions. | 0:46:55 | 0:46:57 | |
Good job. | 0:46:57 | 0:46:59 | |
So I want you to try again and call me over before you submit, | 0:46:59 | 0:47:02 | |
and we'll go through it, OK? | 0:47:02 | 0:47:03 | |
At this school in San Diego, 75 pupils study at computer terminals | 0:47:03 | 0:47:08 | |
while being monitored by one teacher. | 0:47:08 | 0:47:11 | |
Students sitting next to each other | 0:47:13 | 0:47:14 | |
are rarely learning the same subjects. | 0:47:14 | 0:47:17 | |
This is Jordan. He's taking the astronomy 101, | 0:47:18 | 0:47:22 | |
so this is the instructor for astronomy. | 0:47:22 | 0:47:24 | |
And then next to him, Brendan, he's doing coding. | 0:47:24 | 0:47:26 | |
-OK. -So he's doing a coding course for the coding academy. | 0:47:26 | 0:47:29 | |
And then we have principles of astronomy here. | 0:47:29 | 0:47:32 | |
And how does it differ from old-fashioned teaching? | 0:47:32 | 0:47:34 | |
So, back in the day, a lot of people just thought about, | 0:47:34 | 0:47:37 | |
"Oh, yeah, pencil and paper works the best," | 0:47:37 | 0:47:39 | |
but now that technology has improved and stuff like that, | 0:47:39 | 0:47:42 | |
it's really cool to get different education. | 0:47:42 | 0:47:44 | |
So, Jordan, you want to be an astronomer. | 0:47:44 | 0:47:47 | |
-Well... -Or maybe not? After watching that. | 0:47:47 | 0:47:49 | |
I've thought about it, but the class is very interesting, | 0:47:49 | 0:47:51 | |
so I decided to take it. | 0:47:51 | 0:47:53 | |
Computers, ironically, can give them a personal experience. | 0:47:53 | 0:47:58 | |
One-on-one. You compare that to the standard model of a teacher and | 0:47:58 | 0:48:04 | |
30 students. What is that? Right, a teacher has to provide | 0:48:04 | 0:48:07 | |
a standardised curriculum to everybody in the class. | 0:48:07 | 0:48:11 | |
It's completely inefficient as a way of teaching kids | 0:48:11 | 0:48:15 | |
and I think most educators think, "Well, there isn't any alternative. | 0:48:15 | 0:48:18 | |
"That's just what teaching is." But it's not. Not any more. | 0:48:18 | 0:48:22 | |
This is like a dream come true. | 0:48:22 | 0:48:24 | |
In this school, learning by computer means you're also assessed | 0:48:28 | 0:48:32 | |
by computer. Huge amounts of data on students give teachers a record, | 0:48:32 | 0:48:37 | |
not just of exam results, but of how hard each pupil has been working. | 0:48:37 | 0:48:41 | |
As we look at our world history classes and our students, | 0:48:43 | 0:48:46 | |
how are we doing for the progress for this last week? | 0:48:46 | 0:48:49 | |
Yeah, we're doing pretty good. | 0:48:49 | 0:48:51 | |
I mean, looking at the data, we had one struggling student. | 0:48:51 | 0:48:53 | |
You can see it kind of marked right here in negative three. | 0:48:53 | 0:48:56 | |
And do we have a plan of action for our student? | 0:48:56 | 0:48:59 | |
We need to e-mail the parent and say "Hey, they're behind in this course. | 0:48:59 | 0:49:02 | |
"Here's what we're working on." | 0:49:02 | 0:49:04 | |
Well, if she's behind on other things, | 0:49:04 | 0:49:05 | |
we'll have her come in on Monday mornings... | 0:49:05 | 0:49:07 | |
-Yeah, I think that would be helpful. -Days when we don't have class, OK. | 0:49:07 | 0:49:10 | |
All right, listen up. | 0:49:10 | 0:49:12 | |
This is exciting. You guys can go. | 0:49:12 | 0:49:14 | |
See you later. | 0:49:14 | 0:49:16 | |
Supporters of online classes say they are a cost-effective | 0:49:16 | 0:49:19 | |
alternative to traditional teaching, | 0:49:19 | 0:49:22 | |
providing students more flexibility in their learning, | 0:49:22 | 0:49:24 | |
as well as access to a greater variety of options. | 0:49:24 | 0:49:28 | |
But a report by the Organisation For Economic Co-operation | 0:49:28 | 0:49:31 | |
And Development suggested computers do not improve results. | 0:49:31 | 0:49:35 | |
When you look at the evidence, and there is very little evidence, | 0:49:35 | 0:49:38 | |
but what evidence there is shows that this stuff doesn't work. | 0:49:38 | 0:49:42 | |
It doesn't help children to learn. It lowers standards. | 0:49:42 | 0:49:46 | |
Actually, heavy use of technology in education, the results are dreadful. | 0:49:46 | 0:49:51 | |
Kids are turning off, they're not engaged in the content, | 0:49:51 | 0:49:54 | |
they get very used to it, and it's not helping them to learn. | 0:49:54 | 0:49:57 | |
There is something about the human interaction that helps kids learn. | 0:49:57 | 0:50:01 | |
So not only is the problem false, I would argue, | 0:50:01 | 0:50:04 | |
the solution is a false narrative as well. | 0:50:04 | 0:50:08 | |
But this second Industrial Revolution, the robot revolution, | 0:50:10 | 0:50:14 | |
won't take all jobs away. | 0:50:14 | 0:50:16 | |
In fact, some economists believe it could create billions more, | 0:50:16 | 0:50:20 | |
a society of full employment, but what will those jobs be? | 0:50:20 | 0:50:24 | |
To get a glimpse into the future, | 0:50:24 | 0:50:26 | |
you need to look no further than the humble car wash. | 0:50:26 | 0:50:29 | |
15 years ago, | 0:50:33 | 0:50:35 | |
automated machines on petrol station forecourts started closing down. | 0:50:35 | 0:50:40 | |
The Petrol Retailers Association say that, since the mid-noughties, | 0:50:40 | 0:50:44 | |
the numbers have halved from 9,000 to just over 4,000. | 0:50:44 | 0:50:48 | |
Instead of using machines at garages, | 0:50:50 | 0:50:53 | |
we use humans to wash cars again. | 0:50:53 | 0:50:56 | |
It was an odd regression - | 0:50:56 | 0:50:58 | |
a shift towards a low-pay, low-productivity trap. | 0:50:58 | 0:51:02 | |
This is a story of a political deal and how it ensured that | 0:51:03 | 0:51:07 | |
we did the hard work, not robots. | 0:51:07 | 0:51:10 | |
It's 2003, ten new countries are about to join the EU, | 0:51:14 | 0:51:19 | |
eight of them from the old Eastern Bloc - | 0:51:19 | 0:51:21 | |
poor countries, by Western standards. | 0:51:21 | 0:51:24 | |
The Home Secretary at the time, Lord Blunkett, | 0:51:24 | 0:51:27 | |
is about to sign a deal with the EU. | 0:51:27 | 0:51:30 | |
The question on the table is, | 0:51:30 | 0:51:31 | |
how many workers from those countries might come to Britain? | 0:51:31 | 0:51:35 | |
We were uncertain. | 0:51:35 | 0:51:37 | |
We were sceptical about whether we could make predictions. | 0:51:37 | 0:51:40 | |
People were used to the idea of free movement and what we were trying | 0:51:40 | 0:51:44 | |
to address was, given free movement, | 0:51:44 | 0:51:47 | |
should we allow these people to work legally, pay National Insurance, | 0:51:47 | 0:51:51 | |
pay tax, or should they work in the sub-economy? | 0:51:51 | 0:51:54 | |
Britain had a choice - open up its labour market to everyone, | 0:51:57 | 0:52:02 | |
or temporarily restrict the numbers. | 0:52:02 | 0:52:05 | |
The job of estimating how many might come to live and work in the UK | 0:52:05 | 0:52:09 | |
fell to migration expert Professor Christian Dustmann. | 0:52:09 | 0:52:14 | |
We started that report early in 2003, where we tried | 0:52:14 | 0:52:18 | |
to predict the number of people who would migrate after 2004. | 0:52:18 | 0:52:23 | |
We're talking about all the Eastern European countries joining the EU. | 0:52:23 | 0:52:27 | |
No-one has any idea what this is going to be. | 0:52:27 | 0:52:29 | |
So we tried to do the best on the evidence that we have. | 0:52:29 | 0:52:34 | |
Our number was about 13,000 would come to the UK every year. | 0:52:34 | 0:52:39 | |
The report, however, was based on a very specific scenario, | 0:52:39 | 0:52:44 | |
that Germany opens up its labour market | 0:52:44 | 0:52:47 | |
to Eastern European migrants as well. | 0:52:47 | 0:52:50 | |
Based on Dustmann's estimates, | 0:52:50 | 0:52:53 | |
the British government did a deal with the EU - | 0:52:53 | 0:52:56 | |
Britain decided to allow everyone in, | 0:52:56 | 0:52:59 | |
but then Germany changed its mind. | 0:52:59 | 0:53:01 | |
Finally, on the 1st of May, it was the UK, | 0:53:02 | 0:53:06 | |
Sweden and Ireland among the European - | 0:53:06 | 0:53:10 | |
existing European countries - which actually opened | 0:53:10 | 0:53:13 | |
the labour market to those new accession countries, | 0:53:13 | 0:53:18 | |
and not Germany. So, at that point, | 0:53:18 | 0:53:21 | |
the scenario we had modelled was, of course, not the scenario any more. | 0:53:21 | 0:53:25 | |
Germany had opted to temporarily restrict its borders to migrants, | 0:53:25 | 0:53:30 | |
so they looked to Britain instead. | 0:53:30 | 0:53:32 | |
13,000 a year didn't come here - | 0:53:32 | 0:53:35 | |
it was closer to 40,000. Critics asked how the government | 0:53:35 | 0:53:40 | |
could so badly miscalculate the numbers. | 0:53:40 | 0:53:43 | |
The 2003 Dustmann report is predicated on one thing - | 0:53:43 | 0:53:47 | |
Germany opening up in the same way as we're going to open up | 0:53:47 | 0:53:51 | |
and Germany doesn't. It has... | 0:53:51 | 0:53:53 | |
It closes down and it says it's going to close down for seven years. | 0:53:53 | 0:53:57 | |
Look, let me be very straight. | 0:53:57 | 0:53:59 | |
I've defended the decision, but I was culpable on two fronts. | 0:53:59 | 0:54:03 | |
One, I didn't really clock the significance of the decision | 0:54:03 | 0:54:08 | |
that Germany was taking, even though I knew about it. | 0:54:08 | 0:54:12 | |
And secondly, I was quite prepared, as politicians so often are, | 0:54:12 | 0:54:19 | |
to use the cover of the statisticians' analysis - | 0:54:19 | 0:54:25 | |
guess, if you want - to provide us with a comfort zone | 0:54:25 | 0:54:28 | |
while we were implementing the early part of the registration. | 0:54:28 | 0:54:32 | |
We were quite happy to ride with the public understanding that this | 0:54:32 | 0:54:37 | |
wasn't going to be a major influx and, of course, it was. | 0:54:37 | 0:54:41 | |
At least 850,000 people - around 3% of the UK working-age population - | 0:54:44 | 0:54:51 | |
migrated from Eastern Europe to the UK between 2004 and 2011. | 0:54:51 | 0:54:57 | |
You were hung out to dry, basically, by politicians. | 0:54:57 | 0:55:00 | |
They came and said, "You've got this completely wrong. | 0:55:00 | 0:55:04 | |
"You've wildly underestimated the figures. | 0:55:04 | 0:55:06 | |
"A massive, catastrophic miscalculation by you". | 0:55:06 | 0:55:11 | |
Nobody read the report, I think. Not many people read the report. | 0:55:11 | 0:55:15 | |
It's a Sliding Doors moment, David, isn't it? One way or another. | 0:55:15 | 0:55:19 | |
-Well... -You said, in a candid way, which is very rare for a politician, | 0:55:19 | 0:55:23 | |
-"We got it wrong". -Well, we got that part of it wrong. | 0:55:23 | 0:55:27 | |
But there's no question looking back that it was a very fine line | 0:55:27 | 0:55:33 | |
as to whether we were reducing future productivity levels, | 0:55:33 | 0:55:38 | |
whilst actually creating and holding on to very high employment. | 0:55:38 | 0:55:42 | |
These are very fine decisions and sometimes you get them wrong. | 0:55:42 | 0:55:46 | |
The effect of EU migration on the UK's labour market is the subject of | 0:55:50 | 0:55:55 | |
a highly political and frequently polarised debate. | 0:55:55 | 0:55:58 | |
More than one in five low-paid jobs in the UK is now occupied by a | 0:55:58 | 0:56:03 | |
low-skilled migrant worker. | 0:56:03 | 0:56:04 | |
What appeared at the time to be a decision benefiting the economy | 0:56:06 | 0:56:09 | |
became a political negative with the downturn of the economy post-2007. | 0:56:09 | 0:56:15 | |
The future for work now in an age of austerity and automation | 0:56:17 | 0:56:21 | |
will be for us all to find our own opportunities. | 0:56:21 | 0:56:25 | |
I'm interested in the future of the United Kingdom and United States, | 0:56:26 | 0:56:29 | |
and I'm interested, in the face of robotics and AI, in | 0:56:29 | 0:56:31 | |
how we're going to create jobs. What I'm saying to you is, | 0:56:31 | 0:56:34 | |
"If you can't do it, artificial intelligence can". | 0:56:34 | 0:56:37 | |
I look at it as technology providing opportunity. | 0:56:38 | 0:56:42 | |
Technology giving us... | 0:56:42 | 0:56:43 | |
..liberating us to do more of what we want, the way we want to do it. | 0:56:45 | 0:56:48 | |
Of course, there's also the fear side of that, right? | 0:56:48 | 0:56:51 | |
Which is, "Am I going to lose my job?" We have to think about it. | 0:56:51 | 0:56:54 | |
We have to participate. You just don't let the machines take over. | 0:56:54 | 0:56:58 | |
Be responsible in thinking about how you apply this technology, | 0:56:58 | 0:57:01 | |
how do you phase it incrementally, and so forth. | 0:57:01 | 0:57:06 | |
This great push towards robots, this is the future, | 0:57:06 | 0:57:09 | |
and it somehow takes away any agency and any political democratic process | 0:57:09 | 0:57:15 | |
because this is the imaginings of Silicon Valley. | 0:57:15 | 0:57:18 | |
Small government, gig economy, the Uberification of jobs - | 0:57:18 | 0:57:23 | |
that's kind of what they want. | 0:57:23 | 0:57:25 | |
But I would counter that with, "Well, is it what WE want?" | 0:57:25 | 0:57:28 | |
And so, the fate of the automated car wash becomes one model of | 0:57:30 | 0:57:35 | |
how work could go - machines replaced by low-paid workers | 0:57:35 | 0:57:39 | |
motivated only by the fear of unemployment. | 0:57:39 | 0:57:42 | |
Could this be the future of work? | 0:57:42 | 0:57:46 | |
Watson, in a control centre, an unblinking eye, | 0:57:46 | 0:57:50 | |
monitoring how many cars you and I wash in an hour. | 0:57:50 | 0:57:54 | |
To explore further how digital technologies | 0:57:57 | 0:57:59 | |
are transforming societies, go to the BBC web page on-screen | 0:57:59 | 0:58:04 | |
and follow the links to the Open University. | 0:58:04 | 0:58:07 |