
| Line | From | To | |
|---|---|---|---|
It sometimes seems we're being deluged with data. | 0:00:07 | 0:00:11 | |
Wave upon wave of news and messages. | 0:00:13 | 0:00:15 | |
Submerged by step counts. | 0:00:17 | 0:00:19 | |
Constantly bailing out to make room for more. | 0:00:21 | 0:00:26 | |
We buy it, surf it, occasionally drown in it | 0:00:26 | 0:00:31 | |
and with modern technology quantify ourselves and everything else with it. | 0:00:31 | 0:00:37 | |
Data is the new currency of our time. | 0:00:37 | 0:00:41 | |
Data has become almost a magic word for...anything. | 0:00:42 | 0:00:46 | |
Crime and lunacy and literacy and religion and...drunkenness. | 0:00:46 | 0:00:53 | |
You name it, somebody was gathering information about it. | 0:00:55 | 0:00:58 | |
It offers the ability to be transformationally positive. | 0:00:58 | 0:01:01 | |
It's, in one sense, just the reduction in uncertainty. | 0:01:03 | 0:01:07 | |
So what exactly is data? | 0:01:07 | 0:01:10 | |
How is it captured, stored, shared and made sense of? | 0:01:10 | 0:01:15 | |
The engineers of the data age are people that most of us have never heard of, | 0:01:16 | 0:01:21 | |
despite the fact that they brought about | 0:01:21 | 0:01:25 | |
a technological and philosophical revolution, | 0:01:25 | 0:01:29 | |
and created a digital world that the mind boggles to comprehend. | 0:01:29 | 0:01:33 | |
This is the story of THE word of our times... | 0:01:35 | 0:01:39 | |
..how the constant flow of more and better data has transformed society... | 0:01:41 | 0:01:45 | |
..and is even changing our sense of ourselves. | 0:01:48 | 0:01:53 | |
I can't believe this is my life now. | 0:01:53 | 0:01:55 | |
So come on in, because the water's lovely. | 0:01:59 | 0:02:01 | |
My name is Hannah Fry, | 0:02:13 | 0:02:14 | |
I'm a mathematician, and I'd like to begin with a confession. | 0:02:14 | 0:02:19 | |
I haven't always loved data. | 0:02:20 | 0:02:23 | |
The truth is mathematicians just don't really like data that much. | 0:02:23 | 0:02:27 | |
And for most of my professional life I was quite happy sitting in | 0:02:27 | 0:02:30 | |
a windowless room with my equations, describing the world around me. | 0:02:30 | 0:02:35 | |
You can capture the arc of a perfect free kick or the beautiful | 0:02:35 | 0:02:40 | |
aerodynamics of a race car. | 0:02:40 | 0:02:42 | |
The mathematics of the real world is clean and ordered and elegant, | 0:02:42 | 0:02:47 | |
everything that data absolutely isn't. | 0:02:47 | 0:02:51 | |
There was one moment that helped to change my mind. | 0:02:51 | 0:02:54 | |
It was in 2011 when I came across a little game that a teenage Wikipedia | 0:02:56 | 0:03:01 | |
user called Mark J had invented. | 0:03:01 | 0:03:04 | |
Now, Mark noticed that if you hit the first link in the main text of any | 0:03:04 | 0:03:09 | |
Wikipedia page and then do the same for the next page, a pattern emerges. | 0:03:09 | 0:03:14 | |
So the page for data, for example, | 0:03:14 | 0:03:17 | |
links from "set" to "maths" | 0:03:17 | 0:03:19 | |
to "quantity" to "property" and then "philosophy", | 0:03:19 | 0:03:25 | |
which after a few more links will loop back onto itself. | 0:03:25 | 0:03:28 | |
Now, the page "egg" ends up in the same place, | 0:03:28 | 0:03:32 | |
and even that famously philosophical boyband One Direction will take you | 0:03:32 | 0:03:37 | |
all the way through to "philosophy", | 0:03:37 | 0:03:39 | |
although you have to go through "science" to get there. | 0:03:39 | 0:03:42 | |
The same goes for "fungi", or "hairspray", | 0:03:44 | 0:03:49 | |
"marmalade", even "mice", "dust" and "socks". | 0:03:49 | 0:03:54 | |
It was a very strange finding and it called for some statistics. | 0:03:54 | 0:03:59 | |
Another Wikipedia user, Il Mare, | 0:04:01 | 0:04:04 | |
wrote a computer program to try and investigate this phenomenon. | 0:04:04 | 0:04:09 | |
Now, he discovered, amazingly, that for almost 95% of Wikipedia pages, | 0:04:09 | 0:04:16 | |
you will end up getting to "philosophy" eventually. | 0:04:16 | 0:04:20 | |
Now, that's pretty cool, but how did it change my mind about data? | 0:04:20 | 0:04:25 | |
Well, the pattern that Mark J discovered and the data that was captured and analysed, | 0:04:25 | 0:04:31 | |
it revealed a hidden mathematical structure, | 0:04:31 | 0:04:36 | |
because Wikipedia is just a network with loops and chains hidden all over the place | 0:04:36 | 0:04:42 | |
and it's something that can be described beautifully using mathematics. | 0:04:42 | 0:04:46 | |
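The first-link game is easy to sketch in code. Here is a toy version, using a hand-made map of first links in place of real Wikipedia pages; the entries below are illustrative stand-ins, not scraped data.

```python
# Follow each page's first link until we reach "Philosophy",
# loop back on ourselves, or run out of links.
# FIRST_LINK is a made-up stand-in for real Wikipedia first links.
FIRST_LINK = {
    "Data": "Set", "Set": "Mathematics", "Mathematics": "Quantity",
    "Quantity": "Property", "Property": "Philosophy",
    "Egg": "Organism", "Organism": "Biology", "Biology": "Science",
    "Science": "Knowledge", "Knowledge": "Fact", "Fact": "Philosophy",
}

def chain_to_philosophy(page, limit=50):
    """Return the chain of first links starting at `page`."""
    chain = [page]
    seen = {page}
    while chain[-1] != "Philosophy" and len(chain) < limit:
        nxt = FIRST_LINK.get(chain[-1])
        if nxt is None or nxt in seen:  # dead end, or a loop
            break
        chain.append(nxt)
        seen.add(nxt)
    return chain

print(" -> ".join(chain_to_philosophy("Egg")))
```

Run against the live encyclopedia, which is essentially what the Wikipedia user's program did, the same follow-and-detect-loops logic is what revealed that most chains end at "philosophy".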
For me this was the perfect example of how there are two parallel universes. | 0:04:48 | 0:04:55 | |
There's the tangible, noisy, messy one, | 0:04:55 | 0:04:58 | |
the one that you can see and touch and experience. | 0:04:58 | 0:05:02 | |
But there's also the mathematical one, where I think the key to our | 0:05:02 | 0:05:07 | |
understanding lies. | 0:05:07 | 0:05:09 | |
And data is the bridge between those two universes. | 0:05:09 | 0:05:13 | |
Our understanding of everything from cities to crime, | 0:05:15 | 0:05:19 | |
global trade, | 0:05:19 | 0:05:21 | |
migration and even disease... | 0:05:21 | 0:05:24 | |
..it's all underpinned by data. | 0:05:25 | 0:05:27 | |
Take this for example. | 0:05:30 | 0:05:32 | |
Rural Wiltshire and a dairy farm | 0:05:33 | 0:05:37 | |
gathering data from its cows wearing | 0:05:37 | 0:05:41 | |
pedometers. | 0:05:41 | 0:05:42 | |
We can't be out here 24-7. | 0:05:44 | 0:05:46 | |
The pedometers help us to have our eyes and ears everywhere. | 0:05:46 | 0:05:50 | |
It turns out when cows go into heat | 0:05:50 | 0:05:53 | |
they move around a lot more than normal. | 0:05:53 | 0:05:56 | |
Constant monitoring of their steps and some background mathematics | 0:05:56 | 0:06:00 | |
reveal the prime time for insemination. | 0:06:00 | 0:06:04 | |
We'll be able to look at the data | 0:06:04 | 0:06:06 | |
and within 24 hours there'll be a | 0:06:06 | 0:06:09 | |
greater chance of getting her in calf. | 0:06:09 | 0:06:12 | |
Data-driven farming is now big business, | 0:06:12 | 0:06:15 | |
turning a centuries-old way of life into precision science. | 0:06:15 | 0:06:19 | |
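The step-count trick lends itself to a simple sketch: flag any day whose count sits well above a cow's usual range. The figures and the two-standard-deviation threshold below are illustrative assumptions, not the farm's actual method.

```python
import statistics

# One week of daily step counts for a single cow; day 6 is a spike.
steps = [5200, 4900, 5100, 5300, 5000, 9800, 5150]

mean = statistics.mean(steps)
sd = statistics.stdev(steps)

# Flag days more than two standard deviations above the mean.
alerts = [day for day, count in enumerate(steps, start=1)
          if count > mean + 2 * sd]
print(alerts)
```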
Pretty much every industry you can think of | 0:06:23 | 0:06:26 | |
now relies on data. | 0:06:26 | 0:06:27 | |
We all agree that we are undergoing | 0:06:32 | 0:06:34 | |
a major revolution in human history. | 0:06:34 | 0:06:36 | |
The digital world replacing the analogue world. | 0:06:46 | 0:06:48 | |
A world based on data, | 0:06:48 | 0:06:50 | |
a world made of code rather than a world made of biological or physical | 0:06:50 | 0:06:55 | |
matter, that is extraordinary. | 0:06:55 | 0:06:57 | |
Why philosophy at this stage? | 0:06:57 | 0:06:59 | |
Because when you face extraordinary challenges, | 0:06:59 | 0:07:02 | |
the worst thing you can do is to get close to it. | 0:07:02 | 0:07:05 | |
You need to take a long run-up. | 0:07:05 | 0:07:07 | |
The bigger the gap, the longer the run-up. | 0:07:07 | 0:07:09 | |
And the run-up is called philosophy. | 0:07:09 | 0:07:11 | |
In the spirit of taking a long run-up, | 0:07:12 | 0:07:15 | |
we'll start with the word itself. | 0:07:15 | 0:07:17 | |
"Data" is originally from the Latin "datum", | 0:07:20 | 0:07:23 | |
meaning "that which is given". | 0:07:23 | 0:07:25 | |
Data can be descriptions... | 0:07:25 | 0:07:28 | |
..counts, or measures... | 0:07:29 | 0:07:33 | |
..of anything... | 0:07:35 | 0:07:36 | |
..in any format. | 0:07:38 | 0:07:40 | |
It's anything that when analysed becomes information, | 0:07:43 | 0:07:46 | |
which in turn is the raw material for knowledge, | 0:07:46 | 0:07:49 | |
the only true path to wisdom. | 0:07:49 | 0:07:52 | |
Look at the data on data. | 0:07:54 | 0:07:55 | |
And before the scientific and industrial revolution, | 0:07:58 | 0:08:00 | |
the word barely gets a look-in in English. | 0:08:00 | 0:08:03 | |
But then it starts to appear in print | 0:08:04 | 0:08:07 | |
as scientists and the state gather, | 0:08:07 | 0:08:11 | |
observe and create more and more of it. | 0:08:11 | 0:08:13 | |
This arrival of the age of data would change everything. | 0:08:14 | 0:08:18 | |
Industrial Revolution Britain. | 0:08:24 | 0:08:27 | |
For Victorians, | 0:08:27 | 0:08:28 | |
booming industry and the growth of major cities were changing both the | 0:08:28 | 0:08:32 | |
landscape and daily life beyond recognition. | 0:08:32 | 0:08:36 | |
Into this scene stepped an unlikely man of numbers, | 0:08:37 | 0:08:41 | |
William Farr, | 0:08:41 | 0:08:42 | |
one of the first people to manage data on an industrial scale. | 0:08:42 | 0:08:47 | |
William Farr had a quite unusual upbringing, | 0:08:47 | 0:08:50 | |
in that he was actually the son of a farm labourer who had managed to | 0:08:50 | 0:08:53 | |
get a medical education, which was really very unusual for someone of his class. | 0:08:53 | 0:08:56 | |
Farr very quickly became absorbed in the study of statistics. | 0:09:05 | 0:09:08 | |
He was particularly interested, as you might expect for somebody with medical training, | 0:09:08 | 0:09:11 | |
in public health, life expectancy and about causes of death. | 0:09:11 | 0:09:14 | |
For anyone interested in statistics, | 0:09:15 | 0:09:18 | |
there was only one place to be. | 0:09:18 | 0:09:19 | |
Somerset House in London was home to | 0:09:21 | 0:09:24 | |
the General Register Office, | 0:09:24 | 0:09:26 | |
where, in 1839, | 0:09:26 | 0:09:27 | |
Farr found his dream job. | 0:09:27 | 0:09:30 | |
From up there in the north wing, William Farr, the apothecary, | 0:09:30 | 0:09:34 | |
medical journalist and top statistician, | 0:09:34 | 0:09:37 | |
would really rule the roost. | 0:09:37 | 0:09:38 | |
Now, this place was almost like a factory. | 0:09:38 | 0:09:41 | |
Here, they would collect, process and analyse | 0:09:41 | 0:09:44 | |
vast amounts of data. | 0:09:44 | 0:09:47 | |
So in would come the census returns, the records of every single birth, | 0:09:47 | 0:09:50 | |
death and marriage in the country, and out would come the big picture, | 0:09:50 | 0:09:55 | |
the usable information that could help | 0:09:55 | 0:09:58 | |
inform policy and reform society. | 0:09:58 | 0:10:00 | |
I think it's sometimes difficult for us to remember just how little people knew | 0:10:01 | 0:10:05 | |
in the early 19th century about the changes that Britain was going through. | 0:10:05 | 0:10:08 | |
So when Farr did an analysis of population density and death | 0:10:08 | 0:10:11 | |
rate, he was able to show that life expectancy in Liverpool | 0:10:11 | 0:10:15 | |
was absolutely atrocious. | 0:10:15 | 0:10:16 | |
It was far, far worse than the surrounding areas. | 0:10:16 | 0:10:18 | |
This came as a surprise to a lot of people who believed that Liverpool, | 0:10:18 | 0:10:21 | |
a coastal town, was actually quite a salubrious place to live. | 0:10:21 | 0:10:24 | |
At Somerset House, Farr spearheaded a revolution in the systematic | 0:10:27 | 0:10:32 | |
collection of data | 0:10:32 | 0:10:34 | |
to uncover the real picture of this changing society. | 0:10:34 | 0:10:37 | |
Its scale and ambition was described in a newspaper at the time. | 0:10:39 | 0:10:44 | |
"In arched chambers of immense strength and extent | 0:10:44 | 0:10:47 | |
"are, in many volumes, the | 0:10:47 | 0:10:48 | |
"genuine certificates of upwards of 28 million persons | 0:10:48 | 0:10:54 | |
"born into life, married or passed into the grave." | 0:10:54 | 0:10:59 | |
Here, every person was recorded equally. | 0:10:59 | 0:11:02 | |
A revolutionary idea. | 0:11:02 | 0:11:04 | |
"Here are to be found the records of nonentities, | 0:11:07 | 0:11:11 | |
"side-by-side with those once learned in the law | 0:11:11 | 0:11:15 | |
"or distinguished in | 0:11:15 | 0:11:16 | |
"literature, art or science." | 0:11:16 | 0:11:19 | |
But what really motivated William Farr was not just data collection, | 0:11:20 | 0:11:25 | |
it was the possibility that data gathered could be analysed to help | 0:11:25 | 0:11:30 | |
overcome society's greatest ill. | 0:11:30 | 0:11:33 | |
Cholera was probably the most feared of all of the Victorian diseases. | 0:11:35 | 0:11:40 | |
The terrifying thing was that you could wake up in the morning and feel absolutely fine, | 0:11:40 | 0:11:43 | |
and then be dead by the evening. | 0:11:43 | 0:11:45 | |
Between the 1830s and the 1860s, | 0:11:47 | 0:11:50 | |
tens of thousands died in London alone. | 0:11:50 | 0:11:53 | |
The control of infectious diseases like cholera, | 0:11:56 | 0:11:59 | |
which no-one fully understood, | 0:11:59 | 0:12:02 | |
became the greatest public-health issue of the time. | 0:12:02 | 0:12:05 | |
However great London might have looked back then, | 0:12:06 | 0:12:08 | |
it would have smelled absolutely terrible. | 0:12:08 | 0:12:12 | |
At that point, the Victorians didn't have really a great way of disposing | 0:12:12 | 0:12:16 | |
of human waste, so it would have flowed down the gutters | 0:12:16 | 0:12:19 | |
into open sewers | 0:12:19 | 0:12:21 | |
and out into the Thames. | 0:12:21 | 0:12:22 | |
Now, the city smelt so bad that it | 0:12:22 | 0:12:25 | |
was pretty plausible that the foul air | 0:12:25 | 0:12:29 | |
was responsible for carrying the disease. | 0:12:29 | 0:12:31 | |
Farr collected a huge range of data during each | 0:12:33 | 0:12:36 | |
cholera outbreak to try to | 0:12:36 | 0:12:38 | |
identify what put people most at risk from the bad air. | 0:12:38 | 0:12:43 | |
He used income-tax data to try and measure the affluence of the different | 0:12:44 | 0:12:47 | |
boroughs that were affected by cholera. | 0:12:47 | 0:12:49 | |
He asked his friends at the Royal Observatory to provide data on the | 0:12:49 | 0:12:53 | |
temperature and climatic conditions. | 0:12:53 | 0:12:55 | |
But the one that he thought was most convincing was about the topography. | 0:12:56 | 0:12:59 | |
It was about the elevation above the Thames. | 0:12:59 | 0:13:02 | |
Using the data, Farr suggested a mathematical law of elevation. | 0:13:02 | 0:13:07 | |
Its equations described how cholera mortality falls the higher you live | 0:13:07 | 0:13:12 | |
above the Thames. | 0:13:12 | 0:13:14 | |
Now, he published his report in 1852, | 0:13:15 | 0:13:18 | |
which the Lancet described as one of | 0:13:18 | 0:13:20 | |
the most remarkable productions of | 0:13:20 | 0:13:22 | |
type and pen in any age and country. | 0:13:22 | 0:13:25 | |
The only problem was that Farr's work, | 0:13:27 | 0:13:30 | |
although elegant and meticulous, | 0:13:30 | 0:13:33 | |
was fundamentally flawed. | 0:13:33 | 0:13:35 | |
Farr stuck to the prevailing theory | 0:13:37 | 0:13:40 | |
that cholera was spread by air. | 0:13:40 | 0:13:43 | |
Such is the power of the status quo. | 0:13:43 | 0:13:45 | |
But in 1866, | 0:13:45 | 0:13:47 | |
5,500 people died in just one square mile of London's East End, | 0:13:47 | 0:13:53 | |
and that data made Farr change his mind. | 0:13:53 | 0:13:56 | |
When Farr came to write his next report, | 0:13:58 | 0:14:01 | |
the data told a different story | 0:14:01 | 0:14:04 | |
which proved the turning point in | 0:14:04 | 0:14:05 | |
combating the disease. | 0:14:05 | 0:14:07 | |
The common factor among those who died was not elevation or air | 0:14:07 | 0:14:13 | |
but sewage-contaminated drinking water. | 0:14:13 | 0:14:16 | |
With this new report, | 0:14:18 | 0:14:20 | |
Farr may seem to have contradicted | 0:14:20 | 0:14:22 | |
much of his own work, but I think that | 0:14:22 | 0:14:25 | |
this is the perfect example of what data can do. | 0:14:25 | 0:14:29 | |
It provides that bridge essential to scientific discovery, | 0:14:29 | 0:14:33 | |
from theory to proof, problem to solution. | 0:14:33 | 0:14:36 | |
Good data, even in huge volumes, | 0:14:38 | 0:14:41 | |
does not guarantee that you will arrive at the truth. | 0:14:41 | 0:14:45 | |
But, eventually, when the weight of the data tips the balance, | 0:14:45 | 0:14:49 | |
even the strongest-held beliefs can be overcome. | 0:14:49 | 0:14:52 | |
Of course, it was the weight of the data itself which, | 0:14:55 | 0:14:59 | |
with the dawn of the 20th century, | 0:14:59 | 0:15:01 | |
was becoming increasingly hard to manage. | 0:15:01 | 0:15:04 | |
Data stored long form in things like census ledgers could take the best | 0:15:06 | 0:15:10 | |
part of a decade to process, | 0:15:10 | 0:15:13 | |
meaning the stats were often out of date. | 0:15:13 | 0:15:16 | |
When you're dealing with figures like these, it's one thing. | 0:15:17 | 0:15:19 | |
But when you're counting the population like this it's quite a different matter. | 0:15:22 | 0:15:26 | |
A deceptively simple solution got what's now called the | 0:15:26 | 0:15:30 | |
information revolution under way, | 0:15:30 | 0:15:33 | |
encoding data as holes punched in cards. | 0:15:33 | 0:15:38 | |
These cards are passed over sorting machines, | 0:15:38 | 0:15:41 | |
each of which handles 22,000 cards a minute. | 0:15:41 | 0:15:44 | |
By the 1950s, data processing and simple calculations were | 0:15:46 | 0:15:51 | |
routinely mechanised, | 0:15:51 | 0:15:53 | |
laying the groundwork for the next generation of | 0:15:53 | 0:15:55 | |
data-processing machines. | 0:15:55 | 0:15:57 | |
They would be put to pioneering work | 0:16:00 | 0:16:04 | |
in a rather unlikely place. | 0:16:04 | 0:16:06 | |
In a grand London dining hall, a group of men and women, | 0:16:08 | 0:16:11 | |
many in their 80s and 90s, have gathered for a special work reunion. | 0:16:11 | 0:16:16 | |
At its peak, their employer, J Lyons, | 0:16:18 | 0:16:22 | |
purveyor of fine British tea and cakes, | 0:16:22 | 0:16:25 | |
had hundreds of tea shops nationwide. | 0:16:25 | 0:16:27 | |
There are hundreds of items of food. | 0:16:28 | 0:16:30 | |
All these in a varying quantity each day are delivered to a precise | 0:16:30 | 0:16:34 | |
timetable to the tea shops. | 0:16:34 | 0:16:35 | |
These people aren't former J Lyons bakers or tea-shop managers. | 0:16:38 | 0:16:43 | |
They were hired for their mathematical skills. | 0:16:43 | 0:16:47 | |
Lyons had a huge amount of data which had to be processed, | 0:16:47 | 0:16:51 | |
often very low-value data. | 0:16:51 | 0:16:54 | |
So, for example, the transaction from a tea shop | 0:16:54 | 0:16:57 | |
would be a cup of tea. | 0:16:57 | 0:16:58 | |
But each one had a voucher and had to be recorded, | 0:16:59 | 0:17:02 | |
and had to go to the accounts | 0:17:02 | 0:17:04 | |
for business reasons and for management reasons. | 0:17:04 | 0:17:08 | |
Every calculation you did, not only did you have to do it twice, | 0:17:08 | 0:17:12 | |
but you had to get it checked by someone else as well. | 0:17:12 | 0:17:15 | |
The handling of these millions and millions of pieces of data, | 0:17:15 | 0:17:19 | |
the storage of that data, are the key of the business problem. | 0:17:19 | 0:17:23 | |
The Lyons team took the world by surprise when, in 1951, | 0:17:25 | 0:17:30 | |
they unveiled the Lyons Electronic Office, | 0:17:30 | 0:17:33 | |
or Leo for short. | 0:17:33 | 0:17:34 | |
At this point, only a handful of computers existed, | 0:17:38 | 0:17:42 | |
and they were used solely for scientific and military research, | 0:17:42 | 0:17:46 | |
so a business computer was a radical reimagining of what this brand-new | 0:17:46 | 0:17:51 | |
technology could be for. | 0:17:51 | 0:17:52 | |
Each manageress has a standing order depending on the day of the week. | 0:17:55 | 0:17:59 | |
She speaks by telephone to head office, where her variations are | 0:18:00 | 0:18:03 | |
taken quickly onto cards. | 0:18:03 | 0:18:05 | |
What the girl hears, she punches. | 0:18:05 | 0:18:08 | |
The programme is fed first, | 0:18:08 | 0:18:10 | |
laying down the sequence for the multiplicity of calculations Leo will perform. | 0:18:10 | 0:18:14 | |
It was the first | 0:18:15 | 0:18:16 | |
opportunity to process large volumes | 0:18:16 | 0:18:19 | |
of clerical work, | 0:18:19 | 0:18:20 | |
take all the hard work out of it, | 0:18:20 | 0:18:22 | |
and put it on an automatic system. | 0:18:22 | 0:18:25 | |
Before Leo, working out an employee's pay | 0:18:25 | 0:18:28 | |
took an experienced clerk eight minutes, | 0:18:28 | 0:18:31 | |
but with Leo that dropped to an astonishing one and a half seconds. | 0:18:31 | 0:18:35 | |
It was all so exciting | 0:18:39 | 0:18:40 | |
because we were breaking new ground the whole time. | 0:18:40 | 0:18:43 | |
Absolutely everything which we did had never been done before. | 0:17:43 | 0:17:48 | |
By anybody anywhere. | 0:18:49 | 0:18:51 | |
I don't think we realised the kind of transformation we were part of. | 0:18:51 | 0:18:55 | |
The post-war years saw a boom in the application of this new computing | 0:18:59 | 0:19:04 | |
technology. | 0:19:04 | 0:19:05 | |
Leo ran on paper tape and cards, | 0:19:06 | 0:19:09 | |
but soon machines with magnetic tape and disks were developed, | 0:19:09 | 0:19:13 | |
allowing for greater data storage and faster calculations. | 0:19:13 | 0:19:17 | |
As more businesses and institutions adopted these new machines, | 0:19:19 | 0:19:23 | |
application of mathematics to a whole host of new, | 0:19:23 | 0:19:27 | |
real-world challenges took off. | 0:19:27 | 0:19:29 | |
And the word "data" went from relatively obscure to ubiquitous. | 0:19:31 | 0:19:36 | |
"Data" has become almost a magic word for anything. | 0:19:44 | 0:19:48 | |
The truth is that it is a kind of interface | 0:19:48 | 0:19:51 | |
today between us and the rest of the world. | 0:19:51 | 0:19:54 | |
In fact, between us and ourselves, | 0:19:54 | 0:19:57 | |
we understand our bodies in terms of data, | 0:19:57 | 0:20:00 | |
we understand society in terms of data, | 0:20:00 | 0:20:03 | |
we understand the physics of the universe in terms of data. | 0:20:03 | 0:20:06 | |
The economy, social science, we play with data, | 0:20:06 | 0:20:09 | |
so essentially it is what we interact with | 0:20:09 | 0:20:12 | |
most regularly every day. | 0:20:12 | 0:20:14 | |
Data underpins all human communication, | 0:20:17 | 0:20:21 | |
regardless of the format. | 0:20:21 | 0:20:23 | |
And it was the desire to communicate effectively and efficiently that led | 0:20:23 | 0:20:28 | |
to one of the most important academic papers of the 20th century. | 0:20:28 | 0:20:33 | |
"A Mathematical Theory of Communication" | 0:21:35 | 0:21:38 | |
has justifiably been called the Magna Carta for the information age. | 0:20:38 | 0:20:45 | |
It was written by a very young and bright employee of Bell Laboratories, | 0:20:45 | 0:20:50 | |
the American centre for telecoms research that was founded by one of | 0:20:50 | 0:20:54 | |
the inventors of the telephone, Alexander Graham Bell. | 0:20:54 | 0:20:58 | |
Now, this paper was written by Claude Shannon in 1948 and it would | 0:20:58 | 0:21:04 | |
effectively lay out the theoretical framework for the data revolution | 0:21:04 | 0:21:09 | |
that was just beginning. | 0:21:09 | 0:21:12 | |
Those that knew him described | 0:21:12 | 0:21:14 | |
Shannon as a lifelong puzzle solver and inventor. | 0:21:14 | 0:21:18 | |
To define the correct path it registers the information in its memory. | 0:21:18 | 0:21:23 | |
Later, I can put him down in any part of the maze that he has already explored | 0:21:23 | 0:21:27 | |
and he will be able to go directly to the goal without making a single | 0:21:27 | 0:21:29 | |
false turn. | 0:21:29 | 0:21:32 | |
During World War II he worked on data-encryption systems, | 0:21:32 | 0:21:36 | |
including one used by Churchill and Roosevelt. | 0:21:36 | 0:21:40 | |
But at Bell Labs, | 0:21:41 | 0:21:43 | |
Claude Shannon was trying to solve the very civilian problem of noisy | 0:21:43 | 0:21:48 | |
telephone lines. | 0:21:48 | 0:21:49 | |
# There's a call, there's a call | 0:21:49 | 0:21:51 | |
# There's a call for you | 0:21:51 | 0:21:53 | |
# There's a call on the phone for you. # | 0:21:53 | 0:21:56 | |
In that analogue world of 20th-century phones, | 0:21:56 | 0:22:00 | |
your speech was converted into an electrical signal using a handset | 0:22:00 | 0:22:04 | |
like this and then transmitted down a series of wires. | 0:22:04 | 0:22:08 | |
The voice signals would travel along the wire, | 0:22:08 | 0:22:12 | |
be detected by the receiver at the other end and then be converted back | 0:22:12 | 0:22:16 | |
into sound waves to reach the ear of whoever had picked up. | 0:22:16 | 0:22:20 | |
The problem was, | 0:22:20 | 0:22:22 | |
the further the electrical signal travelled down the line, | 0:22:22 | 0:22:24 | |
the weaker it would get. | 0:22:24 | 0:22:26 | |
PHONE LINE CRACKLES Eventually you couldn't | 0:22:26 | 0:22:28 | |
even hear the conversation for the amount of noise on the line. | 0:22:28 | 0:22:32 | |
And you could boost the signal but it would mean boosting the noise, too. | 0:22:32 | 0:22:36 | |
Shannon's genius idea was just as simple as it was beautiful. | 0:22:37 | 0:22:42 | |
The breakthrough was converting speech | 0:22:43 | 0:22:46 | |
into an incredibly simple code. | 0:22:46 | 0:22:49 | |
ON PHONE: Hello? | 0:22:50 | 0:22:51 | |
First the audio wave is detected, then sampled. | 0:22:51 | 0:22:55 | |
Each point is assigned a code of ones and zeros | 0:22:55 | 0:23:00 | |
and the resulting long string of digits can then be sent down | 0:23:00 | 0:23:03 | |
the wire with the zeros as brief low-voltage signals | 0:23:03 | 0:23:07 | |
and ones as brief bursts of high voltage. | 0:23:07 | 0:23:11 | |
From this code, the original audio can be cleanly reconstructed and | 0:23:11 | 0:23:16 | |
regenerated at the other end. | 0:23:16 | 0:23:18 | |
ON PHONE: Hello? | 0:23:18 | 0:23:20 | |
Shannon was the first person to publish the name | 0:23:20 | 0:23:23 | |
for these ones and zeros, | 0:23:23 | 0:23:24 | |
the smallest possible pieces of information, | 0:23:24 | 0:23:27 | |
and they are called bits or binary digits, | 0:23:27 | 0:23:30 | |
and the real power of the bit and the mathematics behind it | 0:23:30 | 0:23:35 | |
applies way beyond telephones. | 0:23:35 | 0:23:37 | |
They offered a new way for everything, | 0:23:38 | 0:23:41 | |
including text and pictures, to be encoded as ones and zeros. | 0:23:41 | 0:23:48 | |
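The sample-and-encode idea can be sketched in a few lines. This is a minimal illustration with an arbitrary sample rate and bit depth, not any real telephone standard.

```python
import math

def digitise(duration=0.001, rate=8000, freq=440, bits=8):
    """Sample a sine wave and encode each sample as a string of bits."""
    levels = 2 ** bits
    out = []
    for n in range(int(duration * rate)):
        sample = math.sin(2 * math.pi * freq * n / rate)  # analogue value in [-1, 1]
        level = int((sample + 1) / 2 * (levels - 1))      # quantise to 0..levels-1
        out.append(format(level, f"0{bits}b"))            # write it as ones and zeros
    return "".join(out)

stream = digitise()
print(stream[:16])  # the first two samples as bits
```

At the far end the levels are read back from the bits, which is why the regenerated signal is clean: a slightly degraded high or low voltage is still unambiguously a one or a zero.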
The possibility to store and share data digitally in the form of bits | 0:23:51 | 0:23:57 | |
was clearly going to transform the world. | 0:23:57 | 0:24:00 | |
If anyone has to be identified as | 0:24:02 | 0:24:05 | |
the genius who developed the | 0:24:05 | 0:24:08 | |
foundational science of mathematics | 0:24:08 | 0:24:11 | |
for our age, that is certainly Claude Shannon. | 0:24:11 | 0:24:14 | |
Now, one thing has to be clarified, | 0:24:14 | 0:24:18 | |
the theory developed by Shannon is about data transmission and it has | 0:24:18 | 0:24:24 | |
nothing to do with meaning, truth, relevance, | 0:24:24 | 0:24:27 | |
importance of the data transmitted. | 0:24:27 | 0:24:30 | |
So it doesn't matter whether the zero and one represent | 0:24:30 | 0:24:35 | |
an answer to, "Heads or tails?", or to the question, "Will you marry me?", | 0:24:35 | 0:24:40 | |
for a theory of information it is just data, and if it is a 50-50 chance | 0:24:40 | 0:24:46 | |
that you will or will not marry me or that it is heads or tails, | 0:24:46 | 0:24:51 | |
the amount of information, the Shannon information, | 0:24:51 | 0:24:54 | |
communicated is the same. | 0:24:54 | 0:24:56 | |
Shannon information is not information like you or I might think about it. | 0:24:58 | 0:25:04 | |
Encoding any and every signal using just ones and zeros is a pretty | 0:25:04 | 0:25:08 | |
remarkable breakthrough. | 0:25:08 | 0:25:10 | |
However, Shannon also came up with a revolutionary bit of mathematics. | 0:25:10 | 0:25:16 | |
That equation there is the reason you can fit an entire HD movie on a | 0:25:16 | 0:25:21 | |
flimsy bit of plastic or the reason why you can stream films online. | 0:25:21 | 0:25:26 | |
I'll admit, it might not look too nice, but... | 0:25:26 | 0:25:30 | |
don't get put off yet, because I'm going to explain how this equation | 0:25:32 | 0:25:36 | |
works using Scrabble. | 0:25:36 | 0:25:38 | |
Imagine that I created a new alphabet | 0:25:40 | 0:25:43 | |
containing only the letter A. | 0:25:43 | 0:25:45 | |
This bag would only have A tiles inside it | 0:25:45 | 0:25:48 | |
and my chances of pulling out an A tile would be one. | 0:25:48 | 0:25:52 | |
You'd be completely certain of what was going to happen. | 0:25:52 | 0:25:54 | |
Using Shannon's maths, | 0:25:54 | 0:25:56 | |
the letter A contains zero bits of what's called Shannon information. | 0:25:56 | 0:26:01 | |
Let's say then I got a little bit more creative, but not much, | 0:26:03 | 0:26:05 | |
and had an alphabet with two letters, A and B, | 0:26:05 | 0:26:09 | |
and equal numbers of both in this bag. | 0:26:09 | 0:26:11 | |
Now my chances of pulling out an A are going to be a half | 0:26:11 | 0:26:15 | |
and each letter contains one bit of Shannon information. | 0:26:15 | 0:26:20 | |
Of course, when transmitting real messages, | 0:26:22 | 0:26:24 | |
you'll use the full alphabet. | 0:26:24 | 0:26:26 | |
But English, | 0:26:28 | 0:26:30 | |
as with every other language, has some letters that are used | 0:26:30 | 0:26:33 | |
more frequently than others. | 0:26:33 | 0:26:35 | |
If you take a quite common letter like H, | 0:26:35 | 0:26:38 | |
which appears about 5.9% of the time, | 0:26:38 | 0:26:42 | |
this will have a Shannon information of 4.1 bits. | 0:26:42 | 0:26:46 | |
And incidentally, a Scrabble score of four. | 0:26:48 | 0:26:52 | |
Of course, there are some much more exotic and rare letters, | 0:26:52 | 0:26:56 | |
like Z, for instance, which appears about 0.07% of the time. | 0:26:56 | 0:27:02 | |
That gives it 10.5 bits | 0:27:02 | 0:27:05 | |
and a Scrabble score of ten. | 0:27:05 | 0:27:07 | |
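The figures above all come from one small formula: the information in a symbol is the base-2 logarithm of one over its probability. A quick check, using the percentages quoted:

```python
import math

def shannon_bits(p):
    """Shannon information (surprisal) of a symbol with probability p, in bits."""
    return math.log2(1 / p)

print(shannon_bits(1.0))               # the all-A alphabet: 0 bits
print(shannon_bits(0.5))               # A or B, 50/50: 1 bit
print(round(shannon_bits(0.059), 1))   # H at 5.9%: 4.1 bits
print(round(shannon_bits(0.0007), 1))  # Z at 0.07%: 10.5 bits
```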
Bits measure our uncertainty. | 0:27:09 | 0:27:13 | |
If you're guessing a three-letter word and you know this letter is Z, | 0:27:13 | 0:27:17 | |
it gives you a lot of information about what the word could be. | 0:27:17 | 0:27:21 | |
But if you know it's H, | 0:27:22 | 0:27:24 | |
because it is a more common letter with less information, | 0:27:24 | 0:27:27 | |
you're more uncertain about the answer. | 0:27:27 | 0:27:30 | |
Now if you wrap up all that uncertainty together, | 0:27:32 | 0:27:35 | |
you end up with this, the Shannon entropy. | 0:27:35 | 0:27:38 | |
It's the sum of the probability of | 0:27:40 | 0:27:42 | |
each symbol turning up times the | 0:27:42 | 0:27:44 | |
number of bits in each symbol. | 0:27:44 | 0:27:47 | |
And this very clever bit of insight | 0:27:47 | 0:27:49 | |
and mathematics means that the code | 0:27:49 | 0:27:52 | |
for any message can be quantified. | 0:27:52 | 0:27:55 | |
Not every letter, or any other signal for that matter, | 0:27:55 | 0:27:58 | |
needs to be encoded equally. | 0:27:58 | 0:28:01 | |
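Written out, the entropy sum is short enough to compute directly. A sketch, using two-symbol alphabets like the Scrabble-bag examples:

```python
import math

def entropy(probs):
    """Shannon entropy: the sum over symbols of probability times bits of information."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))            # fair A/B bag: 1 bit per letter on average
print(round(entropy([0.9, 0.1]), 3))  # skewed bag: well under 1 bit per letter
```

The entropy is the average number of bits per symbol an ideal code needs, which is exactly why skewed alphabets, like the letters of English text, can be compressed.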
The digital code behind a movie like this one of my dog, Molly, | 0:28:03 | 0:28:07 | |
for example, | 0:28:07 | 0:28:08 | |
can usually be compressed by up to 50% | 0:28:08 | 0:28:12 | |
without losing any information. | 0:28:12 | 0:28:16 | |
But there's a limit. | 0:28:16 | 0:28:17 | |
Compressing more might make it easier to share or download, | 0:28:19 | 0:28:24 | |
but the quality can never be the same as the original. | 0:28:24 | 0:28:27 | |
DOG BARKS | 0:28:27 | 0:28:30 | |
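Lossless compression of the kind described, where nothing at all is lost, is easy to demonstrate with a general-purpose compressor. Using zlib here is purely an illustration; video formats use far more sophisticated schemes.

```python
import zlib

# Repetitive data carries little information per byte, so it shrinks a lot.
original = b"tea and cakes, " * 200
packed = zlib.compress(original)
print(len(original), len(packed))

# Decompressing recovers the original exactly: nothing is lost.
assert zlib.decompress(packed) == original
```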
You can't really overstate the impact that Shannon's work | 0:28:32 | 0:28:35 | |
has had, because without it we wouldn't have JPEGs or Zip files | 0:28:35 | 0:28:40 | |
or HD movies or digital communications. | 0:28:40 | 0:28:43 | |
But it doesn't just stop there, because while the mathematics of | 0:28:43 | 0:28:47 | |
information theory doesn't tell you anything about the meaning of data, | 0:28:47 | 0:28:51 | |
it does begin to open up a possibility | 0:28:51 | 0:28:54 | |
of how we can understand ourselves | 0:28:54 | 0:28:56 | |
and our society, because pretty much anything and everything can be | 0:28:56 | 0:29:01 | |
measured and encoded as data. | 0:29:01 | 0:29:04 | |
We say that signals flow through human society, | 0:29:10 | 0:29:13 | |
that people use signals to get things done, | 0:29:13 | 0:29:16 | |
that our social life is, in many ways, | 0:29:16 | 0:29:18 | |
the sending back and forth of signals. | 0:29:18 | 0:29:20 | |
So what is a signal? | 0:29:20 | 0:29:22 | |
It's, in one sense, just the reduction in uncertainty. | 0:29:22 | 0:29:26 | |
What it means to receive a signal is to be less uncertain than you were | 0:29:34 | 0:29:39 | |
before and so, another way to think of measuring or quantifying signal | 0:29:39 | 0:29:43 | |
is in that change in uncertainty. | 0:29:43 | 0:29:46 | |
Using Shannon's mathematics to quantify signals | 0:29:47 | 0:29:50 | |
is common in the world of complexity science. | 0:29:50 | 0:29:53 | |
It's rather less familiar to historians. | 0:29:53 | 0:29:57 | |
I love maths, I love its precision, I love its beauty. | 0:29:57 | 0:30:00 | |
I absolutely love | 0:30:08 | 0:30:11 | |
its certainty, and Simon can bring that mathematical worldview, | 0:30:11 | 0:30:19 | |
that mathematical certainty, to what I work with. | 0:30:19 | 0:30:22 | |
The reason behind this remarkable marriage between history and science | 0:30:24 | 0:30:28 | |
is the analysis of the largest single body of digital text | 0:30:28 | 0:30:32 | |
ever collated about ordinary people. | 0:30:32 | 0:30:35 | |
It's the Proceedings of London's Old Bailey, | 0:30:36 | 0:30:39 | |
the central criminal court of England and Wales, | 0:30:39 | 0:30:41 | |
which hosted close to 200,000 trials between 1674 and 1913. | 0:30:41 | 0:30:49 | |
There are 127 million words of everyday speech | 0:30:49 | 0:30:54 | |
in the mouths of orphans and women and servants and ne'er-do-wells, | 0:30:54 | 0:31:00 | |
of criminals, certainly, | 0:31:00 | 0:31:02 | |
but also people from every rank and station in society. | 0:31:02 | 0:31:06 | |
And that made them unique. | 0:31:06 | 0:31:09 | |
What's exciting about the Old Bailey and the size of the dataset, | 0:31:09 | 0:31:13 | |
the length and magnitude of it, | 0:31:13 | 0:31:15 | |
is that not only can we detect a signal, | 0:31:15 | 0:31:18 | |
but we are able to look at that signal's emergence over time. | 0:31:18 | 0:31:22 | |
Shannon's mathematics can be used to capture the amount of information in | 0:31:24 | 0:31:28 | |
every single word, | 0:31:28 | 0:31:30 | |
and like the alphabet, the less you expect a word, | 0:31:30 | 0:31:34 | |
the more bits of information it carries. | 0:31:34 | 0:31:37 | |
Imagine that you walk into a courtroom | 0:31:37 | 0:31:40 | |
at the time and you hear a single word, | 0:31:40 | 0:31:43 | |
the question we ask is how much information does that word carry | 0:31:43 | 0:31:47 | |
about the nature of the crime being tried? | 0:31:47 | 0:31:50 | |
You hear the word "the". | 0:31:52 | 0:31:55 | |
It's common across all trials and so gives you no bits of information. | 0:31:55 | 0:32:00 | |
Most words you hear are poor signals of what's going on. | 0:32:00 | 0:32:04 | |
But then you hear "purse". | 0:32:06 | 0:32:09 | |
It conveys real information. | 0:32:09 | 0:32:11 | |
Then comes "coin", | 0:32:12 | 0:32:15 | |
"grab" and "struck". | 0:32:15 | 0:32:17 | |
The more rare a word, the more bits of information it carries, | 0:32:17 | 0:32:22 | |
the stronger the signal becomes. | 0:32:22 | 0:32:24 | |
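The calculation behind "purse" carrying more bits than "the" can be sketched as follows. The word counts here are invented for illustration; the real analysis was run over the 127 million words of the Proceedings:

```python
import math

# Hypothetical word counts from a toy trial corpus (invented numbers):
counts = {"the": 50000, "and": 30000, "purse": 40, "coin": 25, "struck": 10}
total = sum(counts.values())

def surprisal_bits(word):
    """Information carried by hearing one word: -log2 of its probability."""
    return -math.log2(counts[word] / total)

for w in counts:
    print(f"{w:>8}: {surprisal_bits(w):5.1f} bits")
# Common words like "the" carry well under a bit; rare words like
# "struck" carry many, so they are much stronger signals of the crime.
```

In the published work the quantity of interest is how much a word reduces uncertainty about the category of trial, but the same principle applies: rarity is information.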
One of the clearest signals that we see in the Old Bailey, | 0:32:26 | 0:32:29 | |
one of the clearest processes that comes out, | 0:32:29 | 0:32:31 | |
is something that is known as the civilising process. | 0:32:31 | 0:32:35 | |
It's an increasing sensitivity to, and attention to, the | 0:32:35 | 0:32:42 | |
distinction between violent and nonviolent crime. | 0:32:42 | 0:32:46 | |
If, for example, somebody hit you and stole your handkerchief, | 0:32:46 | 0:32:52 | |
in the 18th-century context, in 1780, | 0:32:52 | 0:32:54 | |
you would concentrate on the handkerchief. | 0:32:54 | 0:32:57 | |
More worried about a few pence worth of dirty linen than the fact that | 0:32:57 | 0:33:01 | |
somebody just broke your nose or cracked a rib. | 0:33:01 | 0:33:05 | |
The fact that 100 years later, by 1880, | 0:33:05 | 0:33:09 | |
every concern, every focus, both in terms of the words used in court, | 0:33:09 | 0:33:14 | |
but also in terms of what people were brought to court for, | 0:33:14 | 0:33:18 | |
focus on that broken nose and that cracked rib, | 0:33:18 | 0:33:21 | |
speaks to a fundamental change in how we think about the world | 0:33:21 | 0:33:25 | |
and how we think about how social relations work. | 0:33:25 | 0:33:28 | |
Look at the strongest word signals for violent crime across the period. | 0:33:30 | 0:33:35 | |
In the 18th century, the age of highwaymen, | 0:33:35 | 0:33:38 | |
words relating to property theft dominate. | 0:33:38 | 0:33:41 | |
But by the 20th century, | 0:33:43 | 0:33:45 | |
it's physical violence itself and the impact on the victim | 0:33:45 | 0:33:49 | |
that carry the most weight. | 0:33:49 | 0:33:51 | |
That notion that one can trace change over time | 0:33:54 | 0:33:56 | |
by looking at language and how it's used, | 0:33:56 | 0:33:58 | |
who deploys it in what context, | 0:33:58 | 0:34:00 | |
that I think gives this kind of work its real power. | 0:34:00 | 0:34:04 | |
There are billions of words, there's all of Google Books, | 0:34:04 | 0:34:07 | |
there's every printed newspaper, | 0:34:07 | 0:34:10 | |
there is every speech made in Parliament, | 0:34:10 | 0:34:13 | |
every sermon given at most churches. | 0:34:13 | 0:34:15 | |
All of it is suddenly data and capable of being analysed. | 0:34:15 | 0:34:19 | |
The rapid development of computers in the mid 20th century | 0:34:24 | 0:34:27 | |
transformed our ability to encode, store and analyse data. | 0:34:27 | 0:34:31 | |
It took a little longer for us to work out how to share it. | 0:34:33 | 0:34:37 | |
This place is home to one of the most | 0:34:40 | 0:34:42 | |
important UK scientific institutions, | 0:34:42 | 0:34:46 | |
although it's one you've probably never heard of before. | 0:34:46 | 0:34:49 | |
But since the 1900s, this place has advanced all areas of physics, | 0:34:49 | 0:34:54 | |
radio communications, engineering, materials science, aeronautics, | 0:34:54 | 0:34:59 | |
even ship design. | 0:34:59 | 0:35:02 | |
NPL, the National Physical Laboratory, | 0:35:02 | 0:35:04 | |
in south-west London is where the first atomic clock was built | 0:35:04 | 0:35:08 | |
and where radar and the Automatic Computing Engine, or ACE, were invented. | 0:35:08 | 0:35:13 | |
The ACE computer was the brainchild of Alan Turing, | 0:35:14 | 0:35:19 | |
who came to work here right after the Second World War. | 0:35:19 | 0:35:22 | |
Now, Turing's contributions to the story of data are undoubtedly vast, | 0:35:22 | 0:35:26 | |
but more important for our story is another person who worked here with | 0:35:26 | 0:35:31 | |
Turing, someone who arguably is even less well known than this place, | 0:35:31 | 0:35:36 | |
Donald Davies. | 0:35:36 | 0:35:38 | |
Davies worked on secret British nuclear weapons research during the war... | 0:35:39 | 0:35:43 | |
..later joining Turing at NPL, | 0:35:45 | 0:35:48 | |
climbing the ranks to be put in charge of computing in 1966. | 0:35:48 | 0:35:52 | |
As well as the new digital computers, | 0:35:54 | 0:35:56 | |
Davies had a lifelong fascination with telephones and communication. | 0:35:56 | 0:36:02 | |
His mother had worked in the Post Office telephone exchange, | 0:36:02 | 0:36:05 | |
so even when he was a kid, | 0:36:05 | 0:36:06 | |
he had a real understanding of how these phone calls were routed and | 0:36:06 | 0:36:10 | |
rerouted through this growing network, | 0:36:10 | 0:36:12 | |
and that was the perfect training for what was to follow. | 0:36:12 | 0:36:16 | |
What was Donald Davies like, then? | 0:36:27 | 0:36:28 | |
He was a super boss because he was very approachable. | 0:36:28 | 0:36:32 | |
Everybody realised he'd got huge intellect but not difficult with it. | 0:36:33 | 0:36:39 | |
Very nice guy. | 0:36:39 | 0:36:40 | |
Davies' innovation was to develop, with his team, | 0:36:40 | 0:36:44 | |
a way of sharing data between computers, a prototype network. | 0:36:44 | 0:36:49 | |
Donald had spotted that there was a need to connect computers together | 0:36:49 | 0:36:53 | |
and to connect people to computers, not by punch cards or | 0:36:53 | 0:36:57 | |
paper tape or on a motorcycle, but over the wires, | 0:36:57 | 0:37:00 | |
where you can move files or programs, or | 0:37:00 | 0:37:03 | |
run a program remotely on another computer, | 0:37:03 | 0:37:05 | |
and the telephone network is not really suited for that. | 0:37:05 | 0:37:09 | |
In the pre-digital era, | 0:37:10 | 0:37:12 | |
sending an encoded file along a telephone line meant that the line | 0:37:12 | 0:37:16 | |
was engaged for as long as the transmission took. | 0:37:16 | 0:37:20 | |
So the opportunity here was because we owned the site, | 0:37:20 | 0:37:23 | |
78 acres with some 50 buildings, we could build a network. | 0:37:23 | 0:37:28 | |
Davies' team sidestepped the telephone problem | 0:37:28 | 0:37:31 | |
by laying high-bandwidth data cables before instituting a new way of | 0:37:31 | 0:37:36 | |
moving data around the network. | 0:37:36 | 0:37:39 | |
The technique he came up with was packet switching, | 0:37:41 | 0:37:44 | |
the idea being that you take whatever it is you're going to send, | 0:37:44 | 0:37:47 | |
you chop it up into uniform pieces, like having a standard envelope, | 0:37:47 | 0:37:52 | |
and you put the pieces into the envelope and you post them off and they go | 0:37:52 | 0:37:56 | |
separately through the network and get reassembled at the far end. | 0:37:56 | 0:37:59 | |
To demonstrate this idea, | 0:37:59 | 0:38:01 | |
Roger and I are convening NPL's | 0:38:01 | 0:38:03 | |
first-ever packet-switching data-dash... | 0:38:03 | 0:38:07 | |
..which is a bit more complicated than your average sports-day event. | 0:38:08 | 0:38:11 | |
The course is a data network. | 0:38:13 | 0:38:16 | |
There are two computers, represented | 0:38:16 | 0:38:19 | |
here as the start and finish signs. | 0:38:19 | 0:38:21 | |
Those computers are connected by a | 0:38:21 | 0:38:24 | |
series of network cables and nodes. | 0:38:24 | 0:38:27 | |
In our case, cables are lines of | 0:38:27 | 0:38:29 | |
cones and the connecting nodes are Hula Hoops. | 0:38:29 | 0:38:33 | |
Having built it, all we need now are some willing volunteers. | 0:38:35 | 0:38:40 | |
And here they are. | 0:38:40 | 0:38:41 | |
NPL's very own apprentices. | 0:38:41 | 0:38:44 | |
So welcome to our packet-switching sports day. | 0:38:47 | 0:38:50 | |
We've got two teams, red and blue. | 0:38:50 | 0:38:52 | |
'Both teams are pretending to be data | 0:38:52 | 0:38:55 | |
'and they're going to have to race.' | 0:38:55 | 0:38:58 | |
You're going to start over there | 0:38:58 | 0:38:59 | |
where it says "start", kind of obvious, | 0:38:59 | 0:39:01 | |
and you're trying to get through to the end as quickly as you possibly | 0:39:01 | 0:39:05 | |
can. You can't just go anywhere, | 0:39:05 | 0:39:07 | |
you have to go through these hoops to get to the finish line, | 0:39:07 | 0:39:12 | |
these little nodes in our network. | 0:39:12 | 0:39:13 | |
You're only allowed to travel along the lines of the cones, | 0:39:13 | 0:39:17 | |
but only if there's nobody else along that line. | 0:39:17 | 0:39:20 | |
All clear? OK, there is one catch. | 0:39:20 | 0:39:23 | |
All of you who are in the red team, | 0:39:23 | 0:39:25 | |
we are going to tie your feet together. | 0:39:25 | 0:39:28 | |
So you've got to travel round our network as one big chunk of data. | 0:39:29 | 0:39:34 | |
Those of you who are in blue, you are allowed to travel on your own, | 0:39:34 | 0:39:38 | |
so it's slightly easier. | 0:39:38 | 0:39:39 | |
'The objective is for both teams to deposit their beanbags | 0:39:39 | 0:39:43 | |
'in the goal in the right order, one to five.' | 0:39:43 | 0:39:47 | |
EXCITED CHATTER | 0:39:47 | 0:39:51 | |
Get in the hoop! Get in the hoop! | 0:39:51 | 0:39:53 | |
Bring out your competitive spirit here. | 0:39:53 | 0:39:55 | |
We've got packets versus big chunks of data. | 0:39:55 | 0:39:58 | |
I'm going to time you. Everyone ready? | 0:39:58 | 0:40:01 | |
OK, over to you, Roger. | 0:40:01 | 0:40:02 | |
TOOT! | 0:40:02 | 0:40:05 | |
Remember, you can't go down the route until it's clear. | 0:40:07 | 0:40:10 | |
'The red and blue teams are exactly the same size, | 0:40:10 | 0:40:12 | |
'let's say five megabytes each. | 0:40:12 | 0:40:15 | |
'But their progress through the network is clearly very different.' | 0:40:15 | 0:40:21 | |
THEY LAUGH | 0:40:21 | 0:40:24 | |
OK, blues, you took 13 seconds, pretty impressive. | 0:40:32 | 0:40:36 | |
Reds, 20 seconds. | 0:40:36 | 0:40:38 | |
That's a victory for the packet switchers. | 0:40:38 | 0:40:40 | |
Well done, you guys! Well done, you guys. | 0:40:40 | 0:40:43 | |
The impact that packet switching has had on the world, I mean, | 0:40:43 | 0:40:47 | |
it sort of came from here and then spread out elsewhere. | 0:40:47 | 0:40:50 | |
It did indeed, we gave the world packet switching, and the world, | 0:40:50 | 0:40:54 | |
of course, being America, they took it on and ran with it. | 0:40:54 | 0:40:58 | |
This little race, Donald Davies' packet switching, | 0:41:01 | 0:41:05 | |
was adopted by the people that would go on to build the internet, | 0:41:05 | 0:41:09 | |
and today, the whole thing still runs on this idea. | 0:41:09 | 0:41:13 | |
Let's say I want to e-mail you a picture of Molly. | 0:41:16 | 0:41:19 | |
First, it will be broken up into over 1,000 data packets. | 0:41:19 | 0:41:24 | |
Each one is stamped with the address of where it's from and where it's | 0:41:24 | 0:41:27 | |
going to, which routers check to keep the packets moving. | 0:41:27 | 0:41:32 | |
Regardless of the order they arrive, | 0:41:32 | 0:41:34 | |
the image is reassembled, and there she is. | 0:41:34 | 0:41:38 | |
This is quite a cool thing, right, | 0:41:41 | 0:41:42 | |
that you've got one of the original creators | 0:41:42 | 0:41:44 | |
of packet switching right here and you can ask him... | 0:41:44 | 0:41:48 | |
Every time you're like... Well, do anything, really. | 0:41:48 | 0:41:51 | |
"Why is my internet running so slowly?" | 0:41:51 | 0:41:54 | |
-THEY LAUGH -Don't ask me! | 0:41:54 | 0:41:57 | |
We've come a very long way in just a few decades. | 0:42:01 | 0:42:05 | |
Around 3.4 billion people now have access to the internet at home | 0:42:05 | 0:42:10 | |
and there are around four times the number of phones | 0:42:10 | 0:42:13 | |
and other data-sharing devices online, | 0:42:13 | 0:42:16 | |
the so-called Internet of Things. | 0:42:16 | 0:42:19 | |
Just by being alive in the 21st century | 0:42:21 | 0:42:24 | |
with our phones, our tablets, our smart devices, all of us are | 0:42:24 | 0:42:28 | |
familiar with data. | 0:42:28 | 0:42:30 | |
Really embrace your inner nerd here, because every time you wander around | 0:42:30 | 0:42:34 | |
looking at your screen, you are gobbling up and churning out | 0:42:34 | 0:42:38 | |
absolutely tons of the stuff. | 0:42:38 | 0:42:40 | |
Our relationship with data has really changed - | 0:42:40 | 0:42:43 | |
it's no longer just for specialists, it's for everyone. | 0:42:43 | 0:42:47 | |
There's one city in the UK that's putting the sharing and real-time | 0:42:48 | 0:42:52 | |
analysis of data at the heart of everything it does - | 0:42:52 | 0:42:57 | |
Bristol. | 0:42:57 | 0:42:58 | |
Using digital technology, we take the city's pulse. | 0:42:59 | 0:43:03 | |
This data is the route to an open, smart, liveable city, | 0:43:04 | 0:43:10 | |
a city where optical, wireless and mesh networks | 0:43:10 | 0:43:15 | |
combine to create an open, urban canopy of connectivity. | 0:43:15 | 0:43:20 | |
Taking the pulse of the city under a canopy of connectivity | 0:43:20 | 0:43:25 | |
might sound a bit sci-fi, or like something from a broadband advert. | 0:43:25 | 0:43:30 | |
But if you just hold on to your cynicism for a second, | 0:43:30 | 0:43:33 | |
because Bristol is trying to build a new type of data-sharing network for its citizens. | 0:43:33 | 0:43:39 | |
There's a city-centre area which now has next-generation | 0:43:41 | 0:43:44 | |
or maybe the generation after next | 0:43:44 | 0:43:47 | |
of superfast broadband and then that's coupled to a Wi-Fi network, as well. | 0:43:47 | 0:43:51 | |
The question is, what can you do with it? | 0:43:51 | 0:43:54 | |
We would have a wide area network of very simple Internet of Things | 0:44:01 | 0:44:06 | |
sensing devices that just monitor a simple signal like air quality | 0:44:06 | 0:44:10 | |
or traffic queued in a traffic jam. | 0:44:10 | 0:44:11 | |
Once you've got all this network infrastructure, | 0:44:11 | 0:44:14 | |
you can get an awful lot, a really huge amount of data | 0:44:14 | 0:44:17 | |
arriving to you in real time. | 0:44:17 | 0:44:20 | |
What's happening here is a city-scale experiment | 0:44:22 | 0:44:26 | |
to try and develop and test | 0:44:26 | 0:44:28 | |
what's going to be called the programmable city of the future. | 0:44:28 | 0:44:32 | |
It relies on Bristol's futuristic network, | 0:44:33 | 0:44:36 | |
vast amounts of data from as many sensors as possible | 0:44:36 | 0:44:40 | |
and a computer system that can simulate | 0:44:40 | 0:44:43 | |
and effectively reprogram the city. | 0:44:43 | 0:44:47 | |
The computer system can intervene. | 0:44:47 | 0:44:49 | |
It could reroute traffic and we can actually radio out to individuals, | 0:44:49 | 0:44:53 | |
so maybe they get a message on their smartphone | 0:44:53 | 0:44:55 | |
or perhaps a wrist-mounted device, | 0:44:55 | 0:44:57 | |
saying, "If you have asthma, perhaps you should get indoors." | 0:44:57 | 0:44:59 | |
Once you create that capacity for anything and everything in the city | 0:45:01 | 0:45:06 | |
to be connected together, | 0:45:06 | 0:45:08 | |
you can really start to re-imagine how a city might operate. | 0:45:08 | 0:45:11 | |
We are starting to experiment with driverless cars and, in order for | 0:45:11 | 0:45:16 | |
driverless cars to work, | 0:45:16 | 0:45:17 | |
they have to be able to communicate with the city infrastructure. | 0:45:17 | 0:45:21 | |
So, your car needs to speak to the traffic lights, | 0:45:21 | 0:45:23 | |
the traffic lights need to speak to the car, the cars to speak to each other. | 0:45:23 | 0:45:27 | |
All of that requires a completely different set of infrastructure. | 0:45:27 | 0:45:31 | |
Of course, as the amount of data a city can share grows, | 0:45:33 | 0:45:38 | |
the computing power needed to do something useful with it must grow, too. | 0:45:38 | 0:45:42 | |
And for that, we have the cloud. | 0:45:45 | 0:45:48 | |
For example, imagine trying to analyse all of Bristol's traffic data, | 0:45:49 | 0:45:54 | |
weather and pollution data on your home computer. | 0:45:54 | 0:45:57 | |
It could take a year. | 0:45:57 | 0:45:58 | |
Well, you could reduce that to a day by getting 364 more computers, | 0:46:01 | 0:46:07 | |
but that's expensive. | 0:46:07 | 0:46:09 | |
A cheaper option is sharing the analysis with other computers over the internet, | 0:46:09 | 0:46:14 | |
which Google worked out first, but they published the basics | 0:46:14 | 0:46:19 | |
and now free software exists to help anyone do the same. | 0:46:19 | 0:46:23 | |
Big online companies rent their spare computers | 0:46:23 | 0:46:26 | |
for a few pence an hour. | 0:46:26 | 0:46:28 | |
So, now anyone like me or you | 0:46:29 | 0:46:32 | |
can do big data analytics quickly for a few quid. | 0:46:32 | 0:46:36 | |
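The divide-the-work idea behind this, which Google published as MapReduce, can be sketched with a worker pool in the Python standard library. A thread pool is used here so the sketch runs anywhere; a real deployment would spread the same map and reduce steps across rented machines. The sensor logs are invented stand-ins for Bristol's real data feeds:

```python
from collections import Counter
from multiprocessing.dummy import Pool  # thread pool; clusters use real machines

def count_words(chunk):
    """'Map' step: each worker counts words in its own slice of the data."""
    return Counter(chunk.split())

# Invented sensor logs standing in for a city's real-time feeds:
logs = ["traffic heavy traffic", "air quality poor", "traffic light"] * 100

with Pool(4) as pool:                    # four workers, in miniature
    partials = pool.map(count_words, logs)

totals = sum(partials, Counter())        # 'Reduce' step: merge partial counts
print(totals.most_common(2))
```

Because each map call is independent, adding workers (or rented cloud machines) cuts the wall-clock time almost linearly, which is what turns a year-long analysis into a day's work for a few pounds.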
Such computing power is something we could never have dreamt of | 0:46:41 | 0:46:45 | |
just a few years ago, but it will only fulfil its potential | 0:46:45 | 0:46:49 | |
if we can share our own data in a safe and transparent way. | 0:46:49 | 0:46:54 | |
If Bristol Council wanted to know where your car was at all times | 0:46:55 | 0:47:01 | |
but could use that information to sort of minimise traffic jams, | 0:47:01 | 0:47:04 | |
how would you feel about something like that? | 0:47:04 | 0:47:06 | |
Er, I'm not sure if I'd particularly like it. | 0:47:06 | 0:47:09 | |
I think it is up to me where I leave my car. | 0:47:09 | 0:47:12 | |
I understand the idea of justifying it with all these great other ideas, | 0:47:12 | 0:47:15 | |
but I still probably wouldn't like it very much. | 0:47:15 | 0:47:18 | |
If they are using it for a better purpose, then yeah, | 0:47:18 | 0:47:20 | |
but one should know how they are using it and why they'll be using it, for what purpose. | 0:47:20 | 0:47:24 | |
I'd like to imagine a world in which all the data that was retained | 0:47:24 | 0:47:29 | |
was used for the greater good of mankind, | 0:47:29 | 0:47:31 | |
but I can't imagine a circumstance like that | 0:47:31 | 0:47:35 | |
in the world that we have today. | 0:47:35 | 0:47:36 | |
We live in a modern society, where if you don't | 0:47:36 | 0:47:40 | |
let your data out there, not in the public domain, | 0:47:40 | 0:47:43 | |
but in a secure business domain, | 0:47:43 | 0:47:45 | |
then you can't take part in society, really. | 0:47:45 | 0:47:47 | |
Unsurprisingly, people are pretty wary about what happens to their data. | 0:47:49 | 0:47:54 | |
We need to be careful that civil liberties are not eroded, | 0:47:55 | 0:47:58 | |
because otherwise the technology is likely to be rejected. | 0:47:58 | 0:48:02 | |
I think it's an area where we as a society have yet to sort of fully | 0:48:02 | 0:48:06 | |
understand what the correct way forward is | 0:48:06 | 0:48:11 | |
and therefore it is very much a discussion. | 0:48:11 | 0:48:13 | |
It's not a lecture, it's not a code, | 0:48:13 | 0:48:16 | |
it's one where we are co-producing and co-forming these sorts of rules | 0:48:16 | 0:48:20 | |
with people in the city, in order to sort of help us work out what the | 0:48:20 | 0:48:23 | |
right and wrong things to do are. | 0:48:23 | 0:48:26 | |
It will be intriguing to watch Bristol grapple with the technological | 0:48:26 | 0:48:31 | |
and ethical challenges of being our first data-centric city. | 0:48:31 | 0:48:35 | |
In all these contexts, Internet of Things... | 0:48:37 | 0:48:40 | |
..new forms of health care, smart cities, | 0:48:41 | 0:48:45 | |
what we're seeing is an increase in transparency. | 0:48:45 | 0:48:48 | |
You can see through the body, you can see through the house, | 0:48:49 | 0:48:52 | |
you can see through the city and the square, you can see through society. | 0:48:52 | 0:48:56 | |
Now, transparency may be good. | 0:48:56 | 0:48:58 | |
It's something that we may need to handle carefully in order to extract | 0:48:58 | 0:49:03 | |
the value from those data to improve | 0:49:03 | 0:49:06 | |
your lifestyle, your social interactions, | 0:49:06 | 0:49:10 | |
the way in which your city works and so on. | 0:49:10 | 0:49:12 | |
But it also needs to be carefully handled, because it's touching | 0:49:12 | 0:49:16 | |
the ultimate nerve of what it means to be human. | 0:49:16 | 0:49:19 | |
So how much data should you give away? | 0:49:22 | 0:49:24 | |
Traffic management is one thing but when it comes to health care, | 0:49:24 | 0:49:29 | |
the stakes, the risks and benefits are even higher. | 0:49:29 | 0:49:33 | |
And in Bristol, with a project called Sphere, | 0:49:35 | 0:49:38 | |
they're pushing the boundaries here, too. | 0:49:38 | 0:49:40 | |
The population is getting older, and an ageing population needs | 0:49:42 | 0:49:45 | |
more intense health care, but it's very difficult to pay for that health care | 0:49:45 | 0:49:50 | |
in institutions, paying for nurses and doctors. | 0:49:50 | 0:49:53 | |
So, the key insight of the Sphere team was that it's now possible | 0:49:53 | 0:49:57 | |
to arrange, in a house, lots of small devices | 0:49:57 | 0:50:00 | |
where each device is monitoring a simple set of signals | 0:50:00 | 0:50:03 | |
about what's going on in that house. | 0:50:03 | 0:50:05 | |
There might be monitors for your heart rate or your temperature, | 0:50:05 | 0:50:08 | |
but there might also be monitors that notice, as you're going up and down stairs, | 0:50:08 | 0:50:12 | |
whether you're limping or not. | 0:50:12 | 0:50:14 | |
They've invited me to go and spend a night in this | 0:50:15 | 0:50:19 | |
very experimental house, but unfortunately, I'm not allowed to tell you where it is. | 0:50:19 | 0:50:24 | |
The project is a live-in experiment and will soon roll out | 0:50:24 | 0:50:28 | |
to 100 homes across Bristol. | 0:50:28 | 0:50:30 | |
It's a gigantic data challenge, overseen by Professor Ian Craddock. | 0:50:30 | 0:50:35 | |
So, that's one up there, then? | 0:50:35 | 0:50:36 | |
Yes, that's one of the video sensors | 0:50:36 | 0:50:38 | |
and we have more sensors in the kitchen. | 0:50:38 | 0:50:41 | |
We have another video camera in the hall and some environmental sensors, | 0:50:41 | 0:50:44 | |
and a few more in here. | 0:50:44 | 0:50:46 | |
The house can generate 3-D video, | 0:50:47 | 0:50:51 | |
body position, location and movement data | 0:50:51 | 0:50:55 | |
from a special wearable. | 0:50:55 | 0:50:56 | |
How much data are you collecting, then? | 0:50:58 | 0:50:59 | |
So, when we scale from this house to 100 houses in Bristol, | 0:50:59 | 0:51:03 | |
in total we'll be storing over two petabytes of data for the project. | 0:51:03 | 0:51:06 | |
Lord. So, on my computer at home, I don't even have a terabyte hard drive | 0:51:06 | 0:51:12 | |
and you're talking about 20,000 of those. | 0:51:12 | 0:51:14 | |
Yes. I mean, you know, the interaction of people with their environment | 0:51:14 | 0:51:17 | |
and with each other is a very complicated and very variable thing | 0:51:17 | 0:51:21 | |
and that's why it is a very challenging area, | 0:51:21 | 0:51:23 | |
especially for data analysts, | 0:51:23 | 0:51:26 | |
machine learners, to make sense of this big mass of data. | 0:51:26 | 0:51:29 | |
I'm happy to find out that the research doesn't call | 0:51:31 | 0:51:34 | |
for cameras in the bedroom or bathroom, | 0:51:34 | 0:51:37 | |
but I do have to be left entirely on my own for the night. | 0:51:37 | 0:51:40 | |
The very first thing I'm going to do is pour myself a nice bloody big | 0:51:42 | 0:51:48 | |
glass of wine. There we go. | 0:51:48 | 0:51:49 | |
So, that nice glass of wine that I'm enjoying isn't completely guilt-free, | 0:51:52 | 0:51:55 | |
because I've got to admit to it to the University of Bristol. | 0:51:55 | 0:51:59 | |
I have to keep a log of everything I do, | 0:52:00 | 0:52:03 | |
so that the data from my stay can be labelled with what I actually got up to. | 0:52:03 | 0:52:08 | |
In this way, I'll be helping the process of machine learning, | 0:52:08 | 0:52:12 | |
teaching the team's computers how to automatically monitor things like | 0:52:12 | 0:52:16 | |
cooking, washing and sleeping, | 0:52:16 | 0:52:19 | |
signals in the data of normal behaviour. | 0:52:19 | 0:52:22 | |
In the interests of science. | 0:52:27 | 0:52:29 | |
'I was also asked to do some things that are less expected.' | 0:52:29 | 0:52:34 | |
Oh! I spilled my drink. | 0:52:34 | 0:52:37 | |
'The team need to learn to detect out-of-the-ordinary behaviour, too, | 0:52:37 | 0:52:40 | |
'if they want to, one day, spot specific signs of ill health.' | 0:52:40 | 0:52:44 | |
Right, I'm going to run this back to the kitchen now. | 0:52:46 | 0:52:50 | |
It's a fairly strange experience. | 0:52:52 | 0:52:55 | |
I think the temperature sensors, the humidity sensors, | 0:52:55 | 0:52:58 | |
the motion sensors, even the wearable | 0:52:58 | 0:53:01 | |
I don't have a problem with at all. | 0:53:01 | 0:53:03 | |
For some reason the body position is the one that's getting me. | 0:53:03 | 0:53:08 | |
On the flipside, though, I would go absolutely crazy to have this data. | 0:53:08 | 0:53:13 | |
This is the most wonderful... My goodness me. | 0:53:13 | 0:53:16 | |
Everything you could learn about humans. It would be so brilliant. | 0:53:16 | 0:53:20 | |
One thing I wanted to do was to do something completely crazy | 0:53:26 | 0:53:30 | |
just to see if they can spot it in the data. Just to kind of test them. | 0:53:30 | 0:53:33 | |
OK, ready? | 0:53:36 | 0:53:38 | |
I can't believe this is my life now. | 0:53:44 | 0:53:46 | |
'Anyone can get the data from my stay online if they fancy trying to find | 0:53:48 | 0:53:52 | |
'my below-the-radar escape. | 0:53:52 | 0:53:55 | |
'The man in charge of machine learning, Professor Peter Flach, | 0:53:55 | 0:53:59 | |
'has the first look.' | 0:53:59 | 0:54:00 | |
Between nine and ten, you were cooking. | 0:54:02 | 0:54:04 | |
Correct. | 0:54:04 | 0:54:05 | |
Then you went into the lounge. You had your meal in the lounge. | 0:54:05 | 0:54:09 | |
You know what? I ate on the sofa. | 0:54:09 | 0:54:11 | |
And you were watching crap television. | 0:54:11 | 0:54:12 | |
I was watching crap television? | 0:54:12 | 0:54:14 | |
I've been found out. | 0:54:14 | 0:54:15 | |
We didn't switch the crap-television sensor on. That's not on here, but OK. | 0:54:15 | 0:54:20 | |
-So, you were in the lounge sort of until 11:30. -Correct. | 0:54:20 | 0:54:25 | |
Then you went upstairs, there's a very clear signal here. | 0:54:27 | 0:54:30 | |
-And then, from then on, there isn't a lot of movement. -I was in bed. | 0:54:30 | 0:54:34 | |
So, I guess you were in bed. | 0:54:34 | 0:54:36 | |
Sleeping. | 0:54:36 | 0:54:37 | |
Normal activities, like cooking or being in bed, are relatively | 0:54:37 | 0:54:41 | |
straightforward to spot. | 0:54:41 | 0:54:43 | |
But what about the weird stuff? | 0:54:43 | 0:54:45 | |
This is yesterday, again. | 0:54:46 | 0:54:48 | |
I can see it. I can see the moment. | 0:54:48 | 0:54:51 | |
-You can see the moment? -I can see it, yeah. | 0:54:51 | 0:54:54 | |
There's something happening here which is sort of rather quick. | 0:54:54 | 0:54:58 | |
You've been in the lounge for quite a while and then, suddenly, | 0:54:58 | 0:55:02 | |
there's a brief move to the kitchen here | 0:55:02 | 0:55:06 | |
and then very quick cleaning up in the lounge. | 0:55:06 | 0:55:10 | |
-I wasted good wine on this experiment. -Good wine? | 0:55:10 | 0:55:14 | |
Humans are extraordinarily good at spotting most patterns. | 0:55:14 | 0:55:19 | |
For machines, the task is much more challenging, | 0:55:20 | 0:55:23 | |
but, once they've learned what to look for, | 0:55:23 | 0:55:26 | |
they can do it tirelessly. | 0:55:26 | 0:55:28 | |
I suppose, in the long run, | 0:55:30 | 0:55:31 | |
if you are going to scale this up to more houses, | 0:55:31 | 0:55:35 | |
you can't have people sifting through these graphs trying to find... | 0:55:35 | 0:55:40 | |
I mean, you have to train computers to do them. | 0:55:40 | 0:55:42 | |
You have to train computers to do them. One challenge that we are facing is that our models, | 0:55:42 | 0:55:47 | |
our machine learning classifiers and models, need to be robust | 0:55:47 | 0:55:51 | |
against changes in layout, changes in personal behaviour, | 0:55:51 | 0:55:55 | |
changes in the number of people that are in a house. | 0:55:55 | 0:55:58 | |
And maybe we are wildly optimistic about what it can do, | 0:55:58 | 0:56:02 | |
but we are in the process of trying to find out what it can do, | 0:56:02 | 0:56:06 | |
at what cost, at what... | 0:56:06 | 0:56:08 | |
..invasion into privacy, and then we can have a discussion about whether, | 0:56:09 | 0:56:14 | |
as a society, we want this or not. | 0:56:14 | 0:56:16 | |
If this type of technology rolls out, | 0:56:18 | 0:56:20 | |
machines will be modelling us in mathematical terms | 0:56:20 | 0:56:24 | |
and intervening to help keep us healthy in real time - | 0:56:24 | 0:56:28 | |
and that's completely new. | 0:56:28 | 0:56:30 | |
It's true that our fascination with machine, or artificial, intelligence | 0:56:32 | 0:56:37 | |
is as old as computers themselves. | 0:56:37 | 0:56:40 | |
Claude Shannon and Alan Turing both explored the possibilities | 0:56:40 | 0:56:44 | |
of machines that could learn. | 0:56:44 | 0:56:45 | |
But it's only today, | 0:56:47 | 0:56:49 | |
with torrents of data and pattern-finding algorithms, | 0:56:49 | 0:56:53 | |
that intelligent machines can realise their potential. | 0:56:53 | 0:56:56 | |
You'll hear a lot of heady stuff about what's going to happen when we | 0:57:00 | 0:57:03 | |
mix big data with artificial intelligence. | 0:57:03 | 0:57:06 | |
A lot of people, understandably, are very anxious about it. | 0:57:06 | 0:57:10 | |
But, for me, despite how much the world has changed, | 0:57:10 | 0:57:12 | |
the core challenge is the same as it always was. | 0:57:12 | 0:57:16 | |
It doesn't matter if you are William Farr in Victorian London trying to | 0:57:16 | 0:57:20 | |
understand cholera or in one of Bristol's wired-up houses, | 0:57:20 | 0:57:25 | |
all you're trying to do is to understand patterns in the data | 0:57:25 | 0:57:29 | |
using the language of mathematics. | 0:57:29 | 0:57:32 | |
And machines can certainly help us to find those patterns, | 0:57:32 | 0:57:36 | |
but it takes us to find the meaning in them. | 0:57:36 | 0:57:38 | |
We should be worried about what we're going to do with these smart technologies, | 0:57:40 | 0:57:43 | |
not about the smart technologies in themselves. | 0:57:43 | 0:57:47 | |
They are in our hands to shape our future. | 0:57:47 | 0:57:50 | |
They will not shape our future for us. | 0:57:51 | 0:57:53 | |
In the blink of an eye, we have gone from a world where data, | 0:58:00 | 0:58:04 | |
information and knowledge belonged only to the privileged few, | 0:58:04 | 0:58:09 | |
to what we have now, where it doesn't matter if you're trying to work out | 0:58:09 | 0:58:12 | |
where to go on holiday next or to research the best cancer treatments. | 0:58:12 | 0:58:17 | |
Data has really empowered all of us. | 0:58:17 | 0:58:20 | |
Now, of course, there are some concerns about big corporations | 0:58:20 | 0:58:23 | |
hoovering up the data traces that we all leave behind in our everyday lives, | 0:58:23 | 0:58:28 | |
but I, for one, am an optimist as well as a rationalist | 0:58:28 | 0:58:33 | |
and I think that if we can marshal together the power of data, | 0:58:33 | 0:58:37 | |
then the future lies in the hands of the many and not just the few. | 0:58:37 | 0:58:43 | |
And that, for me, is the real joy of data. | 0:58:43 | 0:58:47 | |
MUSIC: Good Vibrations by The Beach Boys | 0:58:47 | 0:58:48 | |
Subtitles by Ericsson | 0:59:06 | 0:59:10 |