Line | From | To | |
---|---|---|---|
Human language can be complex and bewildering. | 0:00:02 | 0:00:04 | |
PHONE RINGS | 0:00:04 | 0:00:05 | |
Oh, dear. Sorry, I've got to take this. | 0:00:05 | 0:00:08 | |
Hello? I can't talk now. | 0:00:10 | 0:00:12 | |
I'm doing the Christmas lectures! | 0:00:12 | 0:00:16 | |
What? | 0:00:16 | 0:00:17 | |
She said DAVID didn't take his money? | 0:00:19 | 0:00:21 | |
What? She said David didn't take HIS money? | 0:00:23 | 0:00:26 | |
Oh, she SAID David didn't take his money. | 0:00:27 | 0:00:31 | |
Why didn't you just say that, then? | 0:00:31 | 0:00:33 | |
Sorry. Now, what you have there is three very different meanings from | 0:00:33 | 0:00:37 | |
exactly the same sentence. | 0:00:37 | 0:00:40 | |
Will anything other than another human being ever be able to cope | 0:00:40 | 0:00:44 | |
with that level of complexity? | 0:00:44 | 0:00:45 | |
In this lecture, I'm going to find out what makes language the ultimate | 0:00:47 | 0:00:50 | |
communication tool and why humans are absolute masters of it. | 0:00:50 | 0:00:55 | |
Welcome to the third Royal Institution Christmas lecture | 0:01:19 | 0:01:22 | |
of 2017. | 0:01:22 | 0:01:24 | |
I'm Professor Sophie Scott. | 0:01:24 | 0:01:26 | |
Now humans have got an incredibly powerful ability - language. | 0:01:26 | 0:01:31 | |
I can convey very precise meanings to anyone within earshot if they | 0:01:31 | 0:01:35 | |
speak my language. To give you a taste, please let me introduce scientist, | 0:01:35 | 0:01:40 | |
comedian and rapper, Alex Lethbridge. | 0:01:40 | 0:01:43 | |
So I've listened to Doc Brown, | 0:01:54 | 0:01:56 | |
Akala and Syntax, they showed me how to flow off my grammar and syntax. | 0:01:56 | 0:02:00 | |
The RI told me, Alex, what's your language? | 0:02:00 | 0:02:02 | |
I checked my head, do you mean English, Fante or Spanish? | 0:02:02 | 0:02:05 | |
Now, my PhD's crazy, I'll do wordplay till it pays me. | 0:02:05 | 0:02:07 | |
And when I get bored, a language or two. | 0:02:07 | 0:02:09 | |
So while you're getting PTSD from your GCSEs | 0:02:09 | 0:02:11 | |
and wondering should I RSVP to GCHQ? | 0:02:11 | 0:02:13 | |
Now I'm not sure, Sophie says language is complex. | 0:02:13 | 0:02:15 | |
You've got the rules like subjects, verbs and objects. | 0:02:15 | 0:02:18 | |
It's more than words, you've got intonation and context. | 0:02:18 | 0:02:20 | |
Final lecture, we're learning all of these concepts. | 0:02:20 | 0:02:23 | |
APPLAUSE | 0:02:23 | 0:02:24 | |
I don't know about you, but when I'm listening to rap music, I like to count all the words. | 0:02:31 | 0:02:36 | |
And I reckon in about 25 seconds there, Alex, | 0:02:36 | 0:02:38 | |
you said about 110 words. | 0:02:38 | 0:02:40 | |
-Yeah, exactly. -And got over about 15 ideas, does that sound about right? | 0:02:40 | 0:02:45 | |
Amazing. Thank you very much, Alex. | 0:02:45 | 0:02:47 | |
No worries, thanks. | 0:02:47 | 0:02:48 | |
We're used to thinking of telepathy as a science-fiction concept, but | 0:02:54 | 0:02:58 | |
Alex just achieved the exact same result. | 0:02:58 | 0:03:01 | |
We share the content of our minds, our brains, | 0:03:01 | 0:03:04 | |
whenever we want to speak, or rap, to anyone. | 0:03:04 | 0:03:07 | |
Don't worry, I'm not going to rap! | 0:03:07 | 0:03:10 | |
Are we unique in having these skills or will we one day have a full | 0:03:10 | 0:03:14 | |
conversation with another species? | 0:03:14 | 0:03:16 | |
When I was a little girl, I so wanted to be able to talk to animals. | 0:03:16 | 0:03:20 | |
Will that ever happen? | 0:03:20 | 0:03:22 | |
And will computers ever be able to fully get their processors around our | 0:03:22 | 0:03:26 | |
language well enough to understand a joke? | 0:03:26 | 0:03:29 | |
Tonight, I'm going to explore what makes language so amazing | 0:03:29 | 0:03:33 | |
and so very difficult. | 0:03:33 | 0:03:36 | |
But what do we mean when we talk about language? | 0:03:36 | 0:03:40 | |
Languages can come in many forms. | 0:03:40 | 0:03:41 | |
We can talk, we can write, we can sign. | 0:03:41 | 0:03:44 | |
And I've got a very basic form of language here. | 0:03:44 | 0:03:47 | |
MORSE BEEPING | 0:03:47 | 0:03:50 | |
Anybody speak Morse? | 0:03:56 | 0:03:58 | |
That was a cry for help! | 0:03:59 | 0:04:01 | |
Now, I don't speak Morse beyond being able to do that. | 0:04:01 | 0:04:04 | |
But basically you can think of language like Morse code as being | 0:04:04 | 0:04:08 | |
a message which we're sending with a code. | 0:04:08 | 0:04:11 | |
And to make a code, | 0:04:11 | 0:04:13 | |
the first thing you need to do is to produce a signal that's got some | 0:04:13 | 0:04:17 | |
kind of structure. Now, that means a signal | 0:04:17 | 0:04:20 | |
that's not just a random stream of noises without any order, | 0:04:20 | 0:04:24 | |
nor can it be a very simple pattern just repeated again and again. | 0:04:24 | 0:04:29 | |
You need to have a capacity to send information like the short and long | 0:04:29 | 0:04:33 | |
patterns of the Morse. | 0:04:33 | 0:04:34 | |
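The dot-and-dash structure described here is easy to make concrete in code. Below is a minimal, purely illustrative sketch: the letter table is deliberately tiny (just enough for "SOS" and "HELP") rather than the full Morse alphabet.

```python
# Minimal sketch: encoding a message as Morse-style short/long patterns.
# Only a handful of letters are included, purely for illustration.
MORSE = {
    "S": "...", "O": "---",
    "H": "....", "E": ".", "L": ".-..", "P": ".--.",
}

def to_morse(message):
    """Turn a short message into dot/dash groups, one group per letter."""
    return " ".join(MORSE[ch] for ch in message.upper() if ch in MORSE)

print(to_morse("SOS"))   # ... --- ...
print(to_morse("HELP"))  # .... . .-.. .--.
```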
Now, humans do this when we speak aloud. | 0:04:34 | 0:04:37 | |
We're using the sounds of our voices that we use in our language to | 0:04:37 | 0:04:40 | |
express a code. | 0:04:40 | 0:04:42 | |
Can we find any signs of a similar kind of structure in other animals' | 0:04:42 | 0:04:46 | |
voices? And if so, | 0:04:46 | 0:04:48 | |
could we crack their code and have a proper conversation with them? | 0:04:48 | 0:04:51 | |
There are some animals who are very good candidates for being able to | 0:04:52 | 0:04:56 | |
produce these sorts of sounds with structured elements, | 0:04:56 | 0:04:59 | |
and those are birds. | 0:04:59 | 0:05:00 | |
Now, these guys, who are a couple of zebra finches, | 0:05:03 | 0:05:08 | |
and a couple of canaries, they're songbirds. | 0:05:08 | 0:05:11 | |
Songbirds, when they're babies, | 0:05:13 | 0:05:15 | |
they learn all the songs they're going to sing when they're adults. | 0:05:15 | 0:05:19 | |
The most impressive can learn over 1,000 different songs. | 0:05:19 | 0:05:23 | |
Could these songs contain coded information? | 0:05:23 | 0:05:26 | |
Now, I can hear a couple of cheeps coming out of here, | 0:05:26 | 0:05:29 | |
but I think we have a recording of one of the canaries. | 0:05:29 | 0:05:32 | |
Can we listen to that? | 0:05:32 | 0:05:33 | |
CANARY CHIRPS | 0:05:33 | 0:05:36 | |
It's a beautiful sound. | 0:05:44 | 0:05:45 | |
But does it contain enough structure in that signal that it could be used | 0:05:45 | 0:05:49 | |
to transmit a code? | 0:05:49 | 0:05:50 | |
Well, I've got an example of the canary's song here. | 0:05:52 | 0:05:55 | |
And what I'm showing you here is what's called a spectrogram. | 0:05:55 | 0:06:00 | |
Now, a spectrogram is a way of looking at the structure in a sound. | 0:06:00 | 0:06:04 | |
So, what you have along this direction is time. | 0:06:04 | 0:06:08 | |
So, this is the sound unfurling over time, | 0:06:08 | 0:06:11 | |
how it's changing over time. This direction, we've got frequency. | 0:06:11 | 0:06:15 | |
And that roughly goes from | 0:06:15 | 0:06:17 | |
low-pitched sounds up to high-pitched sounds. | 0:06:17 | 0:06:20 | |
And where the colours are warmer and brighter, | 0:06:20 | 0:06:22 | |
that's where there's more energy. | 0:06:22 | 0:06:24 | |
And we can see these individual elements, these little notes, | 0:06:24 | 0:06:27 | |
and we're seeing some quite structured elements to this. | 0:06:27 | 0:06:31 | |
We've got a similar sequence here and repeating there. | 0:06:31 | 0:06:33 | |
And then these sequences of lower and higher alternating notes. | 0:06:33 | 0:06:38 | |
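A spectrogram like the one on screen can be produced in a few lines of code. The sketch below uses SciPy and Matplotlib on a synthetic rising chirp standing in for the canary recording; as described, time runs along one axis, frequency along the other, and brighter colours mark more energy.

```python
# Minimal sketch: a spectrogram of a synthetic sound (a rising chirp),
# plotted as described: time along x, frequency along y,
# warmer/brighter colours where there is more energy.
import numpy as np
import matplotlib.pyplot as plt
from scipy.signal import chirp, spectrogram

fs = 16000                                   # sample rate in Hz
t = np.linspace(0, 2.0, int(2.0 * fs), endpoint=False)
signal = chirp(t, f0=300, t1=2.0, f1=3000)   # stand-in signal, not the canary

f, times, Sxx = spectrogram(signal, fs=fs, nperseg=512, noverlap=384)

plt.pcolormesh(times, f, 10 * np.log10(Sxx + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectrogram of a synthetic chirp")
plt.colorbar(label="Energy (dB)")
plt.show()
```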
Now, I need to compare this with another kind of voice. | 0:06:38 | 0:06:41 | |
So, I would like a human volunteer, please. | 0:06:41 | 0:06:44 | |
Can I have you in the middle there, with the penguin? | 0:06:44 | 0:06:47 | |
Thank you very much. | 0:06:47 | 0:06:48 | |
Now, what's your name? | 0:06:52 | 0:06:53 | |
-Ruth. -Ruth. I'm going to ask you to come over here and say the first two | 0:06:53 | 0:06:57 | |
lines of Humpty Dumpty into my computer. | 0:06:57 | 0:07:00 | |
OK, I'll tell you when to go. | 0:07:00 | 0:07:01 | |
If you could just stand about there. | 0:07:01 | 0:07:03 | |
Brilliant. And go now. | 0:07:03 | 0:07:05 | |
Humpty Dumpty sat on the wall. | 0:07:05 | 0:07:08 | |
Humpty Dumpty had a great fall. | 0:07:08 | 0:07:11 | |
Brilliant. Thank you very much, Ruth. | 0:07:11 | 0:07:12 | |
Exemplary, I think you'd agree. | 0:07:12 | 0:07:14 | |
So, here's Ruth's version of Humpty Dumpty shown in the same way | 0:07:21 | 0:07:24 | |
on a spectrogram. You can see immediately there are some differences. | 0:07:24 | 0:07:27 | |
There's the canary, there's Ruth. | 0:07:27 | 0:07:30 | |
You can also see some similarities. | 0:07:30 | 0:07:32 | |
And I'm talking in the most general sense here, | 0:07:32 | 0:07:34 | |
but Ruth is producing individual rhythm in the syllables of what she's saying, | 0:07:34 | 0:07:38 | |
and you're seeing a pattern of that over the sentence. | 0:07:38 | 0:07:41 | |
And you're seeing something broadly comparable in the canary. | 0:07:41 | 0:07:46 | |
We're seeing structure in those sounds. | 0:07:46 | 0:07:48 | |
The canary and the speech sounds have both got rhythm, pitch, | 0:07:48 | 0:07:52 | |
rate information in there. | 0:07:52 | 0:07:54 | |
So it's at least possible that the songbird | 0:07:54 | 0:07:57 | |
is producing something which has got | 0:07:57 | 0:07:59 | |
similarities to the way we code information in our speech. | 0:07:59 | 0:08:03 | |
Could it actually be a code, though? | 0:08:03 | 0:08:06 | |
Well, probably not. | 0:08:07 | 0:08:08 | |
It doesn't seem to be quite enough. | 0:08:08 | 0:08:11 | |
So if you look at how songbirds use song, | 0:08:11 | 0:08:14 | |
what you find is they don't generally change their songs | 0:08:14 | 0:08:17 | |
once they've learned them. | 0:08:17 | 0:08:18 | |
They will always sing the same whole song. | 0:08:18 | 0:08:20 | |
And the other thing they don't do is chop songs up and rearrange them | 0:08:20 | 0:08:24 | |
to make new songs. We do that all the time. | 0:08:24 | 0:08:27 | |
We can use words in lots of different, very novel orders. | 0:08:27 | 0:08:30 | |
The birds don't do this. | 0:08:30 | 0:08:31 | |
So it's entirely possible that, complex though the songbirds are, | 0:08:31 | 0:08:35 | |
they are not producing something that is conveying a complex, | 0:08:35 | 0:08:38 | |
coded meaning. | 0:08:38 | 0:08:40 | |
But there's another group of birds | 0:08:40 | 0:08:41 | |
that can learn to say human-coded signals - to use words. | 0:08:41 | 0:08:45 | |
And those are parrots. | 0:08:45 | 0:08:47 | |
Please meet Mike and his parrot. | 0:08:47 | 0:08:50 | |
-Hello. -Hello. | 0:08:53 | 0:08:56 | |
-Hi. -Hi. -Hi, Mike, who have you brought with you? | 0:08:59 | 0:09:02 | |
-So this is Helly... -You all right? | 0:09:02 | 0:09:03 | |
-You all right? -..and Helly is an Amazon parrot. | 0:09:03 | 0:09:05 | |
So a South American bird from the rainforests. | 0:09:05 | 0:09:08 | |
And she's already said hello, hasn't she? | 0:09:08 | 0:09:10 | |
She has, yes. She knows that's an introduction, | 0:09:10 | 0:09:12 | |
so it's the first port of call for a conversation | 0:09:12 | 0:09:14 | |
or attention gathering. | 0:09:14 | 0:09:16 | |
-How many other words does she use? -You all right? | 0:09:16 | 0:09:18 | |
She uses about 80 different sounds and words. | 0:09:18 | 0:09:21 | |
Yeah, human words, though? | 0:09:21 | 0:09:23 | |
Oh, human words, I would say she knows about five or ten. | 0:09:23 | 0:09:25 | |
-Yeah. -Can you say bye? | 0:09:25 | 0:09:27 | |
Bye! | 0:09:27 | 0:09:28 | |
So she's doing a set of human words. | 0:09:30 | 0:09:32 | |
Did you teach her those or did she just pick them up from seeing how they were used? | 0:09:32 | 0:09:36 | |
Yeah, she understands that "Hello" is a greeting because it's a | 0:09:36 | 0:09:39 | |
word that we would use on arrival. | 0:09:39 | 0:09:41 | |
And she would then understand that the "Bye" is a departure word. | 0:09:41 | 0:09:44 | |
So she places the words with timings, as well. | 0:09:44 | 0:09:47 | |
-Yeah. -But throughout her life she's had lots of trainers saying, | 0:09:47 | 0:09:50 | |
"Are you all right, are you all right?" | 0:09:50 | 0:09:51 | |
So if she actually gets worried in something like a studio environment, | 0:09:51 | 0:09:54 | |
-she will say "Are you all right?" -You all right? | 0:09:54 | 0:09:56 | |
And she knows that goes with almost an emotion, as well. | 0:09:56 | 0:09:58 | |
-Yes. -So she links these words with timings and emotions. | 0:09:58 | 0:10:03 | |
And she can do other sounds that aren't... she can do more than words, can't she? | 0:10:03 | 0:10:06 | |
Yes, she uses lots of different sounds as well as words. | 0:10:06 | 0:10:10 | |
Can we have a bomb? | 0:10:10 | 0:10:11 | |
WHISTLE, BOOM | 0:10:11 | 0:10:13 | |
LAUGHTER | 0:10:13 | 0:10:15 | |
They can also be copying other birds as well, so local birds in the wild. | 0:10:18 | 0:10:22 | |
And also other birds that we house. | 0:10:22 | 0:10:25 | |
Thank you very much, Mike. And thank you very much, Helly. | 0:10:25 | 0:10:27 | |
Lovely meeting you. Bye-bye. | 0:10:27 | 0:10:29 | |
What gives birds their amazing ability to learn to produce these sounds? | 0:10:39 | 0:10:43 | |
Well, the answer may lie within their brains. | 0:10:43 | 0:10:48 | |
Now, birds are more closely related to dinosaurs than they are to us. | 0:10:48 | 0:10:52 | |
But we share similarities in the ways that our brains control both | 0:10:52 | 0:10:56 | |
the learning and the production of sounds we make with our voices | 0:10:56 | 0:10:59 | |
and the genes that build those parts of the brain. | 0:10:59 | 0:11:02 | |
If we look... | 0:11:02 | 0:11:04 | |
..at a bird brain, | 0:11:05 | 0:11:06 | |
we can identify specific areas which are important | 0:11:06 | 0:11:10 | |
in the learning of song and in the control of singing. | 0:11:10 | 0:11:14 | |
And if we look at a human brain... | 0:11:14 | 0:11:17 | |
..we can see a similarity, in that there are specific networks | 0:11:19 | 0:11:23 | |
recruited when we are speaking. | 0:11:23 | 0:11:25 | |
When we're talking, these are human brains here, | 0:11:25 | 0:11:28 | |
this is the right side of the brain and the left side of the brain. | 0:11:28 | 0:11:32 | |
We see some areas which are strongly associated with the control of all | 0:11:32 | 0:11:35 | |
the work we have to do to make the sounds of speech. | 0:11:35 | 0:11:39 | |
We also find the very specific area just on the left side of the brain | 0:11:39 | 0:11:43 | |
which seems to be very important in planning speech. | 0:11:43 | 0:11:46 | |
And it's also important in learning new things to do with our voices. | 0:11:46 | 0:11:50 | |
How can I find out more about this system? | 0:11:51 | 0:11:54 | |
We can take these snapshots of the brain in action and we can work with | 0:11:54 | 0:11:58 | |
people who have had strokes and have damaged these brain areas. | 0:11:58 | 0:12:01 | |
But there's another technique that we can use where we can investigate | 0:12:01 | 0:12:05 | |
what would happen if we could turn off | 0:12:05 | 0:12:07 | |
that part of the brain in someone, and just that part of the brain. | 0:12:07 | 0:12:10 | |
What I'd like to do is introduce you to my colleague from UCL, | 0:12:10 | 0:12:13 | |
Dr Ricci Hannah, and comedian Robin Ince. | 0:12:13 | 0:12:16 | |
Hello Robin, hello. I've been waiting for this day. | 0:12:19 | 0:12:23 | |
Hello, Ricci. | 0:12:23 | 0:12:24 | |
-Hi. -Hi. | 0:12:24 | 0:12:25 | |
Now, Robin. | 0:12:25 | 0:12:27 | |
Robin, what we're going to do is sit you down here. | 0:12:27 | 0:12:30 | |
-Right. -And then I'm going to let Ricci explain what we're going to do next. OK? | 0:12:30 | 0:12:34 | |
I have to emphasise, this is a temporary state of affairs, OK? | 0:12:34 | 0:12:38 | |
We're not about to do some sort of terrible live brain surgery on you! | 0:12:38 | 0:12:42 | |
I was quite worried, because I was watching what else was going on and I thought, | 0:12:42 | 0:12:45 | |
"They're going to prove that I'm less intelligent than a parrot!" | 0:12:45 | 0:12:47 | |
So let's find out what happens. | 0:12:47 | 0:12:49 | |
Ricci, can you tell us what you've got here? | 0:12:49 | 0:12:51 | |
So this is a transcranial magnetic stimulator. | 0:12:51 | 0:12:55 | |
And what we can use it for is to probe | 0:12:55 | 0:12:57 | |
how different parts of the brain play a role | 0:12:57 | 0:12:59 | |
in different aspects of behaviour. | 0:12:59 | 0:13:01 | |
In this case, speech. | 0:13:01 | 0:13:03 | |
OK. So can we start just by pointing out, that the way that we get this, | 0:13:03 | 0:13:08 | |
we change Robin's brain activity by passing electrical current through | 0:13:08 | 0:13:13 | |
-that coil, is that right? -Mm-hmm. | 0:13:13 | 0:13:15 | |
And actually, according to the principles of electromagnetic induction, | 0:13:15 | 0:13:18 | |
which were first described here by Michael Faraday, | 0:13:18 | 0:13:21 | |
what that lets us do is induce currents inside Robin's brain | 0:13:21 | 0:13:25 | |
without actually having to get inside his brain. | 0:13:25 | 0:13:28 | |
It's absolutely amazing. | 0:13:28 | 0:13:29 | |
-So can we start by getting Robin talking and then seeing if we can stop him talking? -OK! | 0:13:29 | 0:13:36 | |
People have been trying this for years! | 0:13:37 | 0:13:39 | |
-It's going to be popular! -Can I get you to shuffle back, Robin? | 0:13:39 | 0:13:42 | |
-Yes. -There we go. | 0:13:42 | 0:13:43 | |
-Are you OK? -OK, so I'll just position the coil. | 0:13:43 | 0:13:48 | |
-What I want you to do is to say the months of the year really loudly and clearly. -OK. | 0:13:49 | 0:13:56 | |
When you're ready. | 0:13:56 | 0:13:58 | |
January, February, March, April, M... | 0:13:58 | 0:14:03 | |
Please start talking again! | 0:14:03 | 0:14:04 | |
That is weird! | 0:14:05 | 0:14:07 | |
That is very... | 0:14:07 | 0:14:09 | |
I think Professor Brian Cox who I do a radio show with is going to want a | 0:14:09 | 0:14:12 | |
beret that I wear with that in there, so he can stop me talking! | 0:14:12 | 0:14:15 | |
I don't know what it looked like to you, but it was like just... | 0:14:15 | 0:14:18 | |
It was like kind of Homer with a doughnut, sort of... | 0:14:20 | 0:14:23 | |
If you could bear it, can we try it again? | 0:14:24 | 0:14:26 | |
Yeah! It's really... I find it amazing. | 0:14:26 | 0:14:28 | |
It is extraordinary, isn't it? And then it just comes back. | 0:14:28 | 0:14:31 | |
I think I prefer quiet me. Let's do it again! | 0:14:31 | 0:14:33 | |
Shall I try, and see how far I can get in Jabberwocky, shall I try that? | 0:14:34 | 0:14:38 | |
Yes, excellent, yes. | 0:14:38 | 0:14:40 | |
-Tell me when you want me to. -When you're ready. | 0:14:40 | 0:14:42 | |
'Twas brillig, and the slithy toves. Did... | 0:14:42 | 0:14:44 | |
It's fantastic! | 0:14:48 | 0:14:49 | |
That's... | 0:14:54 | 0:14:55 | |
So what Ricci's very specifically | 0:15:01 | 0:15:03 | |
focusing on here is this part of the brain, | 0:15:03 | 0:15:05 | |
it's called the inferior frontal gyrus, on the left. | 0:15:05 | 0:15:08 | |
And in humans it's incredibly important for | 0:15:08 | 0:15:10 | |
planning and controlling speech. | 0:15:10 | 0:15:12 | |
If we move to a slightly different area, | 0:15:12 | 0:15:15 | |
even within the same network, | 0:15:15 | 0:15:16 | |
what we find is that Robin will be able to talk absolutely fine. | 0:15:16 | 0:15:20 | |
Would it be OK to try that? | 0:15:20 | 0:15:21 | |
-Yeah, sure. -OK. | 0:15:21 | 0:15:22 | |
-OK. -I didn't even look at the health and safety form for this! | 0:15:25 | 0:15:28 | |
-I sent this to you! -I know, I didn't look, just in case! | 0:15:28 | 0:15:31 | |
You're fine, you're safe. | 0:15:31 | 0:15:33 | |
OK, when you're ready. | 0:15:33 | 0:15:34 | |
'Twas brillig, and the slithy toves | 0:15:34 | 0:15:36 | |
Did gyre and gimble in the wabe | 0:15:36 | 0:15:37 | |
All mimsy were the borogoves And the mome... Yeah, that's... | 0:15:37 | 0:15:39 | |
There you go, there you go. | 0:15:39 | 0:15:41 | |
I think I preferred the one at the side! | 0:15:41 | 0:15:43 | |
I can't thank you enough for being prepared to come out in front of | 0:15:43 | 0:15:46 | |
everybody and have us try and zap your brain! | 0:15:46 | 0:15:48 | |
Thank you very much, Robin. Thank you very much, Ricci. | 0:15:48 | 0:15:50 | |
That was amazing, thank you. | 0:15:50 | 0:15:52 | |
It was brilliant, thank you. | 0:15:53 | 0:15:55 | |
Thank you. | 0:15:58 | 0:16:00 | |
So you could see how precise the effect of the transcranial magnetic | 0:16:02 | 0:16:07 | |
stimulation was. We were only seeing Robin stopping talking when we were | 0:16:07 | 0:16:12 | |
applying the TMS over his left inferior frontal gyrus. | 0:16:12 | 0:16:15 | |
When we went elsewhere in the brain, other things happened, | 0:16:15 | 0:16:18 | |
but it's not stopping him from talking. | 0:16:18 | 0:16:21 | |
So what we're seeing here in the humans and in the birds | 0:16:21 | 0:16:25 | |
are very dedicated brain regions | 0:16:25 | 0:16:27 | |
that are important in vocal control and vocal learning. | 0:16:27 | 0:16:30 | |
It's a strong hint that we really are seeing some commonality in | 0:16:30 | 0:16:34 | |
the brain areas that are to do with the learning | 0:16:34 | 0:16:36 | |
and the producing of vocal sounds. | 0:16:36 | 0:16:38 | |
But can any of these birds really understand the words they're saying? | 0:16:38 | 0:16:43 | |
Well, parrots don't just say human words. | 0:16:43 | 0:16:45 | |
They'll mimic pretty much anything they hear a lot of. | 0:16:45 | 0:16:48 | |
Car alarms, creaking doors, alarm clocks. | 0:16:48 | 0:16:51 | |
Why do they do this at all? | 0:16:51 | 0:16:53 | |
Well, birds are mimicking to show off to potential mates, | 0:16:53 | 0:16:56 | |
to get attention. | 0:16:56 | 0:16:57 | |
To impress other birds and other humans. | 0:16:57 | 0:16:59 | |
To defend their nesting sites. | 0:16:59 | 0:17:01 | |
Perhaps the more impressive a sound they can make, | 0:17:01 | 0:17:04 | |
the more likely they'll find a companion, | 0:17:04 | 0:17:06 | |
or scare off a rival. | 0:17:06 | 0:17:08 | |
So birds aren't showing a great ability to decode our language. | 0:17:08 | 0:17:12 | |
But what do I mean by decoding? | 0:17:12 | 0:17:15 | |
We humans are exceptionally good at working out what words are and what words mean. | 0:17:15 | 0:17:22 | |
I'd like you to watch and listen to a clip that I'm about to play you, | 0:17:22 | 0:17:26 | |
and there's going to be a short test afterwards. | 0:17:26 | 0:17:28 | |
And this will go down on your permanent record. | 0:17:28 | 0:17:31 | |
Back. Deaf. Gidge. | 0:17:32 | 0:17:34 | |
Hock. Fip. Nop. | 0:17:34 | 0:17:35 | |
Fib. Wreck. Sit. | 0:17:35 | 0:17:37 | |
They. Fip. Fip. | 0:17:37 | 0:17:38 | |
Hock. Fib. Gack. | 0:17:38 | 0:17:40 | |
Gin. Hock. Hock. | 0:17:40 | 0:17:41 | |
Lun. Fip. Nat. | 0:17:41 | 0:17:43 | |
Fip. Ros. Hock. | 0:17:43 | 0:17:44 | |
Yas. Beth. | 0:17:44 | 0:17:45 | |
OK. Short test. | 0:17:47 | 0:17:49 | |
What's this called? | 0:17:49 | 0:17:50 | |
-CROWD: -Fip. | 0:17:52 | 0:17:53 | |
It's a fip. What's this? | 0:17:54 | 0:17:57 | |
-CROWD: -Hock. -Hock, excellent. | 0:17:57 | 0:17:59 | |
Well done. Now, I didn't tell you to work out what was going on there. | 0:17:59 | 0:18:03 | |
What you were doing was decoding the information we gave you. | 0:18:03 | 0:18:07 | |
There were some novel sounds in there and they were being associated | 0:18:07 | 0:18:11 | |
in quite a regular way with visual information. | 0:18:11 | 0:18:13 | |
Even if you don't know that your brain is trying to do it, | 0:18:13 | 0:18:16 | |
you're always trying to spot words and work out what those words mean. | 0:18:16 | 0:18:20 | |
We are not the only animal who has an ability to do this. | 0:18:20 | 0:18:24 | |
There's an animal you're probably all very familiar with who's actually | 0:18:24 | 0:18:27 | |
really very, very good at sharing this ability with us. | 0:18:27 | 0:18:30 | |
And that's dogs. | 0:18:30 | 0:18:31 | |
Please give us a very nice doggy-friendly round of applause | 0:18:32 | 0:18:36 | |
for Gable and his owner, Sally. | 0:18:36 | 0:18:38 | |
-Hello. -Hello. -Hello. | 0:18:42 | 0:18:44 | |
Hello. Hey, Gable. | 0:18:44 | 0:18:48 | |
So Sally, what's different about Gable? | 0:18:49 | 0:18:51 | |
Gable's got the ability to identify a large number of objects and toys by name. | 0:18:52 | 0:19:00 | |
He currently knows about 150 different names for toys and objects and articles. | 0:19:00 | 0:19:04 | |
Goodness. | 0:19:04 | 0:19:06 | |
Now, can we see a demonstration of that? | 0:19:06 | 0:19:09 | |
So we've got some of Gable's toys here and what we're going to do is | 0:19:09 | 0:19:12 | |
get a random selection of those out and then see. | 0:19:12 | 0:19:14 | |
-OK. -I think there is somebody... | 0:19:14 | 0:19:17 | |
Hi, Sean. Now, we've spent a little bit of time with Sean, | 0:19:17 | 0:19:20 | |
getting Sean used to being with Gable and Gable used to being with Sean. | 0:19:20 | 0:19:23 | |
Can I bring you down? Thank you. | 0:19:23 | 0:19:25 | |
Now Sean, if you could come over here. | 0:19:27 | 0:19:30 | |
Can you pop these gloves on and can you just randomly select 15 toys | 0:19:30 | 0:19:35 | |
from these buckets and spread them out? | 0:19:35 | 0:19:37 | |
When did you first realise Gable could do this? | 0:19:37 | 0:19:40 | |
When he was a young puppy, actually. | 0:19:40 | 0:19:42 | |
He sort of invented the game. | 0:19:42 | 0:19:44 | |
I was trying to watch telly one evening and he kept pestering me. | 0:19:44 | 0:19:46 | |
He wanted to do something. | 0:19:46 | 0:19:48 | |
And I just remembered oh, his red toy was upstairs. | 0:19:48 | 0:19:51 | |
And I just said, "Oh, go and get your red toy." | 0:19:51 | 0:19:53 | |
Sort of idly just dismissing him, | 0:19:53 | 0:19:55 | |
hoping he'd be gone for ages because he couldn't find it. | 0:19:55 | 0:19:57 | |
And he came back with it and he put it in front of me and looked at me. | 0:19:57 | 0:20:00 | |
And I just thought, "I've actually never told you that's called Red Toy." | 0:20:00 | 0:20:04 | |
And so I just then thought, I wonder what happens | 0:20:04 | 0:20:06 | |
if I do teach him a name, and it's gone from there, really. | 0:20:06 | 0:20:09 | |
-Now... -Wow, look at all your toys. -..Sean, don't go anywhere, | 0:20:10 | 0:20:12 | |
I'm going to need you again. | 0:20:12 | 0:20:14 | |
Can I ask you, Sally, to tell Gable to pick one of these toys, please? | 0:20:14 | 0:20:19 | |
OK. Gable... | 0:20:19 | 0:20:20 | |
triceratops. | 0:20:23 | 0:20:24 | |
Get triceratops. | 0:20:26 | 0:20:28 | |
-Yes! Yes, good boy! -Good boy! | 0:20:32 | 0:20:36 | |
Good boy! Leave it. | 0:20:36 | 0:20:37 | |
-Come round here. -Fantastic. | 0:20:39 | 0:20:41 | |
So we've seen that Gable is really very good at working out what you're | 0:20:41 | 0:20:45 | |
saying and trying to work out from looking at you which one you mean. | 0:20:45 | 0:20:48 | |
But then Gable is your dog and he might be really, | 0:20:48 | 0:20:51 | |
really familiar with your voice. | 0:20:51 | 0:20:52 | |
It would be very interesting to know if we could see this happen with | 0:20:52 | 0:20:55 | |
someone who's got a very different voice. | 0:20:55 | 0:20:56 | |
So Sean, is it OK if Sean has a go? | 0:20:56 | 0:20:59 | |
-Yes. -And what you need to do is whisper into Sean's ear | 0:20:59 | 0:21:03 | |
which toy you'd like him to pick up. | 0:21:03 | 0:21:05 | |
Do you remember Sean? | 0:21:05 | 0:21:07 | |
-OK. -Gable, octopus. | 0:21:07 | 0:21:11 | |
Get octopus. | 0:21:11 | 0:21:12 | |
Go. | 0:21:12 | 0:21:13 | |
Yes! | 0:21:14 | 0:21:15 | |
Good boy, good boy! | 0:21:17 | 0:21:18 | |
-And well done, Sean. -Leave it. | 0:21:18 | 0:21:20 | |
-Can Sean do one more? -Of course, yes. | 0:21:22 | 0:21:23 | |
Is that OK, Sean? | 0:21:23 | 0:21:25 | |
Gable, get hammer. | 0:21:25 | 0:21:27 | |
Get hammer. | 0:21:27 | 0:21:28 | |
Yes! Amazing. | 0:21:31 | 0:21:34 | |
Good boy! | 0:21:34 | 0:21:36 | |
So this really does indicate that Gable must have some understanding of what | 0:21:37 | 0:21:41 | |
words mean above and beyond just associating your voice with things. | 0:21:41 | 0:21:44 | |
That's fantastic. Thank you so much, Sean, thank you so much, Sally, | 0:21:44 | 0:21:48 | |
and particularly thank you so much, Gable. | 0:21:48 | 0:21:50 | |
Come on, then! | 0:21:50 | 0:21:51 | |
That's amazing. | 0:21:55 | 0:21:57 | |
So, what's happening in Gable's brain so that he can do this? | 0:21:57 | 0:22:03 | |
Well, there's a group of scientists in Hungary who have been doing some | 0:22:03 | 0:22:06 | |
quite extraordinary experiments. | 0:22:06 | 0:22:09 | |
They've been training dogs to lie very still | 0:22:09 | 0:22:11 | |
and then they've been putting them into brain scanners. | 0:22:11 | 0:22:14 | |
The brain scanners don't work if you move so the dog has to stay very still. | 0:22:16 | 0:22:21 | |
And then what they are doing is taking pictures of the activity inside the | 0:22:22 | 0:22:25 | |
dog's brains while they're listening to different sounds and words. | 0:22:25 | 0:22:30 | |
When I scan people, they're not normally that happy! | 0:22:30 | 0:22:35 | |
So this is called functional magnetic resonance imaging and it lets us | 0:22:35 | 0:22:38 | |
take photographs of the brain in action. | 0:22:38 | 0:22:41 | |
And this is showing the results. | 0:22:42 | 0:22:44 | |
So a dog's brain, as you can see, | 0:22:44 | 0:22:46 | |
it's different in shape to a human brain, but it has some of the same | 0:22:46 | 0:22:49 | |
structures in there. And quite strikingly, | 0:22:49 | 0:22:54 | |
one of the things they are finding in the results is when dogs hear words they understand, | 0:22:54 | 0:22:58 | |
we see greater activation in the left side of the brain, | 0:22:58 | 0:23:01 | |
particularly in brain areas to do with processing sound. | 0:23:01 | 0:23:04 | |
And that is very similar to what you find in our brains. | 0:23:04 | 0:23:10 | |
So it looks like it's at least possible that more than any other animal | 0:23:10 | 0:23:14 | |
we've looked at, dogs might be sharing some of our ability | 0:23:14 | 0:23:18 | |
to decode spoken words and they may even be doing it in similar ways to us. | 0:23:18 | 0:23:23 | |
But of course, amazing as they are, dogs' brains do have their limits. | 0:23:24 | 0:23:30 | |
The largest number of words that any dog has been found to understand, | 0:23:30 | 0:23:34 | |
and this was an exceptional dog, is about 1,000 words. | 0:23:34 | 0:23:37 | |
Which sounds fantastic. | 0:23:37 | 0:23:39 | |
Gable knows about 150 words, | 0:23:39 | 0:23:41 | |
but every one of you could understand 1,000 words when you were three years old. | 0:23:41 | 0:23:46 | |
And of course human language is more than just single words. | 0:23:46 | 0:23:49 | |
We don't walk around going, "Steps, camera, man". | 0:23:49 | 0:23:54 | |
We are putting words together into sequences. | 0:23:54 | 0:23:57 | |
And when we put words into sequences, | 0:23:57 | 0:24:00 | |
we actually add in an extra level of coded meaning. | 0:24:00 | 0:24:04 | |
So perhaps, if we want to look for some more humanlike ability to put | 0:24:06 | 0:24:10 | |
symbols into sequences like we do with sentences, | 0:24:10 | 0:24:13 | |
we should be looking closer in the evolutionary tree. | 0:24:13 | 0:24:17 | |
Perhaps we should be looking to our closest cousins, other apes. | 0:24:17 | 0:24:21 | |
Now, because I'm not David Attenborough, | 0:24:21 | 0:24:24 | |
I don't get to play with the chimpanzee. | 0:24:24 | 0:24:25 | |
But we have the next best thing. | 0:24:27 | 0:24:29 | |
Please release the apes. | 0:24:29 | 0:24:32 | |
THEY IMITATE MONKEYS | 0:24:32 | 0:24:36 | |
This didn't happen to David Attenborough! | 0:24:50 | 0:24:52 | |
That's just brilliant. | 0:24:54 | 0:24:55 | |
These are human apes. | 0:24:57 | 0:24:58 | |
Please welcome Neil and Ace. | 0:25:01 | 0:25:03 | |
So you guys have been working a lot with our ape cousins? | 0:25:03 | 0:25:06 | |
We have, yeah. We were actually very fortunate to work with Andy Serkis | 0:25:06 | 0:25:09 | |
on Planet Of The Apes Last Frontier, which is an interactive movie. | 0:25:09 | 0:25:12 | |
Hold on a second, sorry. | 0:25:12 | 0:25:14 | |
OK, just reassurance. | 0:25:14 | 0:25:17 | |
We were very lucky, we got to study apes for quite a while before we | 0:25:17 | 0:25:20 | |
started creating our own characters from ape work. | 0:25:20 | 0:25:23 | |
We watched their movements, their behaviour patterns, hierarchies - | 0:25:23 | 0:25:27 | |
here's some of the work we did - and also the speech they use, | 0:25:27 | 0:25:30 | |
which is five different parts of language. | 0:25:30 | 0:25:33 | |
Grunts, barks, hoots, whimpers and screams, which they do use as chimps. | 0:25:33 | 0:25:37 | |
And this is my ape actually, Bryn. | 0:25:37 | 0:25:39 | |
-That's you? -Yes, that's me. | 0:25:39 | 0:25:41 | |
-That's amazing. -Yes. | 0:25:41 | 0:25:43 | |
So what this is letting you do is have ape actors which are based on humans, | 0:25:43 | 0:25:46 | |
so you can get them to do things which you could never actually ask | 0:25:46 | 0:25:49 | |
-another ape to do? -Absolutely, | 0:25:49 | 0:25:51 | |
and we get to create a bit of drama in the family. | 0:25:51 | 0:25:53 | |
It's actually really about family, this interactive film, | 0:25:53 | 0:25:56 | |
so it is centred on this particular family of apes that have splintered | 0:25:56 | 0:25:58 | |
away from Caesar's group. | 0:25:58 | 0:26:00 | |
So you've had to work really closely with the apes to actually learn | 0:26:00 | 0:26:02 | |
about their body language and movements. | 0:26:02 | 0:26:04 | |
Did you pick up on anything that felt like communication that you were looking at? | 0:26:04 | 0:26:08 | |
Yeah, we studied a lot of their gesticulation and body language, | 0:26:08 | 0:26:11 | |
especially between dominant and subservient male apes. | 0:26:11 | 0:26:15 | |
Because what you get is a lot of different behaviour patterns that | 0:26:15 | 0:26:18 | |
have been formed by a simple sign, for instance, the touch of hands. | 0:26:18 | 0:26:21 | |
If somebody is in the dominant position, | 0:26:21 | 0:26:22 | |
it's quite important to be able to... | 0:26:22 | 0:26:24 | |
So if you were offering, | 0:26:24 | 0:26:25 | |
if you're trying to get my attention and forgiveness, for instance. Go on. | 0:26:25 | 0:26:29 | |
He gives me respect and I give it back to him. | 0:26:29 | 0:26:33 | |
-They are communicating with gestures. -They are, yes. | 0:26:33 | 0:26:35 | |
Amazing. Thank you so much, thank you. | 0:26:35 | 0:26:37 | |
So Neil and Ace aren't just imitating the apes very well, | 0:26:48 | 0:26:51 | |
they are clearly picking up on some aspects of communication, | 0:26:51 | 0:26:54 | |
but is there really some kind of | 0:26:54 | 0:26:57 | |
ape conversation going on? | 0:26:57 | 0:26:59 | |
And would it look at all like the way we use language? | 0:26:59 | 0:27:02 | |
To find out, please welcome chimpanzee researcher from the University of St Andrews, | 0:27:03 | 0:27:08 | |
Dr Cat Hobaiter. | 0:27:08 | 0:27:10 | |
Hello. Lovely to meet you. | 0:27:12 | 0:27:14 | |
So you've been asking some really interesting questions about ape language | 0:27:16 | 0:27:20 | |
and can you tell us more about your work? | 0:27:20 | 0:27:22 | |
Yes, absolutely, | 0:27:22 | 0:27:23 | |
so what people have done in the past was try to teach apes our language, | 0:27:23 | 0:27:29 | |
so human language or sign language, | 0:27:29 | 0:27:31 | |
or they've looked at their vocalisations. | 0:27:31 | 0:27:33 | |
But what we've been looking at is their gestures. | 0:27:33 | 0:27:36 | |
-OK. -And it turns out, they've got a lot of different gestures. | 0:27:36 | 0:27:39 | |
So their own natural communication contains 60 or 70 | 0:27:39 | 0:27:44 | |
different hand and body movements that they're using every day to ask come here, go away, | 0:27:44 | 0:27:50 | |
I want that, all the little meanings. | 0:27:50 | 0:27:52 | |
And how do you work out what those gestures mean? | 0:27:52 | 0:27:55 | |
What we do is we look for a particular gesture | 0:27:55 | 0:27:58 | |
and then we are looking for what happens next. | 0:27:58 | 0:28:01 | |
And if I were to do this and you responded back to me, | 0:28:01 | 0:28:06 | |
one or two cases it could be anything or you could misunderstand me or I | 0:28:06 | 0:28:10 | |
could keep going. But what I'm looking for is what stops me from | 0:28:10 | 0:28:15 | |
signalling. So if I'm asking you for something, | 0:28:15 | 0:28:17 | |
then the thing you do that makes me happy as a signaller | 0:28:17 | 0:28:21 | |
-is the thing that I wanted. -Yes, so you have to look at the whole context. | 0:28:21 | 0:28:25 | |
Yes, so we need to look at the signaller, at the recipient, the gesture, | 0:28:25 | 0:28:29 | |
then we need to look at not just one or two but hundreds of cases so we | 0:28:29 | 0:28:33 | |
can see the patterns emerging in the behaviour. | 0:28:33 | 0:28:35 | |
And you've been using the general public to help you with your research as | 0:28:35 | 0:28:39 | |
-well, haven't you? -Yes, | 0:28:39 | 0:28:40 | |
so we were able over a few years to look at all the other apes but there | 0:28:40 | 0:28:44 | |
was one of them missing, which was us. | 0:28:44 | 0:28:46 | |
And we know we have language but we don't know if we still have access | 0:28:46 | 0:28:51 | |
to some of the communication that the apes also use. | 0:28:51 | 0:28:54 | |
The sort of gestural stuff they are using, yes. | 0:28:54 | 0:28:56 | |
Exactly. So whether, if we showed a member of the public a | 0:28:56 | 0:29:00 | |
particular gesture, they could guess what it meant. | 0:29:00 | 0:29:02 | |
And what we did was put up lots of videos online and ask everyone to | 0:29:02 | 0:29:06 | |
come along and sort of play the game and have a go. | 0:29:06 | 0:29:08 | |
Can we have a go at that now? | 0:29:08 | 0:29:10 | |
-Yes, please. -Fantastic, so I think we've got a clip to start with. | 0:29:10 | 0:29:12 | |
We are going to watch this closely because we are going to ask you what you think is going on. | 0:29:12 | 0:29:16 | |
So the gesture there was that kind of arm. | 0:29:20 | 0:29:22 | |
Little arm raise from Charlotte. | 0:29:22 | 0:29:24 | |
OK, and is that a baby? | 0:29:24 | 0:29:26 | |
Yes, she's just a little one. | 0:29:26 | 0:29:28 | |
Do we think that's A - I want that, B - move closer, or C - go away? | 0:29:28 | 0:29:33 | |
ALL: B. | 0:29:36 | 0:29:37 | |
Yes, you guys are all well versed in chimpanzee already. | 0:29:37 | 0:29:40 | |
That was move closer. | 0:29:40 | 0:29:42 | |
OK, can we try another one? | 0:29:42 | 0:29:43 | |
OK, we are just highlighting this because it happens very quickly. | 0:29:45 | 0:29:48 | |
Yes, this is a really subtle one so it's just that little foot movement | 0:29:48 | 0:29:52 | |
as she looks back and gives a little foot wiggle to her son. | 0:29:52 | 0:29:54 | |
What happens next? | 0:29:57 | 0:29:58 | |
Is that play with me, climb on me or come here? | 0:30:00 | 0:30:04 | |
ALL: B. | 0:30:04 | 0:30:05 | |
B, yeah. It's rather sweet, that one. Very, very swift. | 0:30:05 | 0:30:08 | |
-Yes. -Really subtle. | 0:30:08 | 0:30:09 | |
So this is really fascinating. | 0:30:10 | 0:30:12 | |
It does look like you are seeing a sign language, | 0:30:12 | 0:30:15 | |
you're seeing really gestural use of communication. | 0:30:15 | 0:30:18 | |
They're not just waving their arms around passionately, | 0:30:18 | 0:30:20 | |
there is a very precise meaning being conveyed here. | 0:30:20 | 0:30:23 | |
But of course the real power of human language is that we can combine our | 0:30:23 | 0:30:28 | |
words and symbols into sequences and that gives us a more complex level of meaning. | 0:30:28 | 0:30:33 | |
Can the chimps ever do this? | 0:30:33 | 0:30:35 | |
Do you ever see any kind of examples of sequences? | 0:30:35 | 0:30:39 | |
Definitely. So what we see sometimes are these one-off gestures but they | 0:30:39 | 0:30:43 | |
will also put gestures together. | 0:30:43 | 0:30:45 | |
Sometimes a couple at once, | 0:30:45 | 0:30:47 | |
that was a big scratch and shaking the object and he's actually waiting | 0:30:47 | 0:30:51 | |
there. And whatever that gesture was, | 0:30:51 | 0:30:53 | |
it doesn't seem to have done the trick because Rohara is ignoring him. | 0:30:53 | 0:30:57 | |
-He's going to do it again. -He's going to do it again. -And the arm went up. | 0:30:57 | 0:31:00 | |
So we've got a scratch, a tree wiggle and the arm going up. | 0:31:00 | 0:31:02 | |
Yes. | 0:31:02 | 0:31:04 | |
And it worked. Oh, all right then, I'll get down. | 0:31:04 | 0:31:07 | |
She's a bit of an old lady. | 0:31:07 | 0:31:09 | |
So the tree waving is the sort of "Come here, I'm in charge". | 0:31:09 | 0:31:14 | |
-The scratching? -Scratching usually means groom, let's groom, | 0:31:14 | 0:31:18 | |
let's get together and groom. | 0:31:18 | 0:31:20 | |
Our big question for us now is when they are putting these gestures into | 0:31:20 | 0:31:24 | |
sequences, if it was human words the order of the sequence would make a | 0:31:24 | 0:31:28 | |
difference to the meaning, and we are trying to work out now if that's the same for the chimps. | 0:31:28 | 0:31:32 | |
Amazing. | 0:31:32 | 0:31:33 | |
Thank you so much, Cat, thank you. | 0:31:33 | 0:31:35 | |
Now, why is this so exciting? | 0:31:42 | 0:31:45 | |
Well, if other apes can also combine symbols into a specific order to | 0:31:45 | 0:31:50 | |
produce something with complex meaning, | 0:31:50 | 0:31:53 | |
it unlocks a great deal more of the power of the language. | 0:31:53 | 0:31:57 | |
Combining symbols into sequences allows us to describe the world in | 0:31:57 | 0:32:01 | |
much more complex ways. | 0:32:01 | 0:32:03 | |
It gives us the power to share much more complex ideas. | 0:32:03 | 0:32:07 | |
It's a really useful skill but it's not in any way a simple skill. | 0:32:07 | 0:32:11 | |
And to look at exactly what kind of thing the chimps are up against in | 0:32:12 | 0:32:16 | |
terms of the difficulty of understanding sequences of symbols and decoding them | 0:32:16 | 0:32:21 | |
accurately, I need two volunteers. | 0:32:21 | 0:32:24 | |
Can I have you there in the Los Angeles... | 0:32:25 | 0:32:28 | |
Can I have you in the Santa - sorry, Rudolph. | 0:32:28 | 0:32:32 | |
Fantastic, thank you very much, thank you... | 0:32:32 | 0:32:34 | |
You come round here. | 0:32:35 | 0:32:36 | |
You come round this side. Thank you very much. | 0:32:38 | 0:32:40 | |
Now, what's your name? | 0:32:40 | 0:32:42 | |
-Gracie. -Gracie, lovely to meet you, Gracie. | 0:32:42 | 0:32:44 | |
-And what's your name? -Ryan. | 0:32:44 | 0:32:46 | |
-Ryan? -Yes. -Fantastic. | 0:32:46 | 0:32:47 | |
Now, Ryan and Gracie, | 0:32:47 | 0:32:48 | |
what we need to do is put some general covers on you. | 0:32:48 | 0:32:53 | |
OK, we are just going to, sort of, from top to toe, cover you up. | 0:32:53 | 0:32:55 | |
Thank you. So what we are going to do is I'm going to ask you one at a | 0:32:55 | 0:32:59 | |
time to read some instructions and to mix some things together. | 0:32:59 | 0:33:03 | |
And, Gracie, I'm going to ask you to go first. | 0:33:03 | 0:33:04 | |
So, Ryan, so that you don't see what's going on, | 0:33:04 | 0:33:08 | |
I need you to pop on a blindfold and some ear defenders, OK? | 0:33:08 | 0:33:11 | |
Just so that you don't give away the game. | 0:33:11 | 0:33:15 | |
Right, so in a second I'm going to step forward with you | 0:33:15 | 0:33:18 | |
and we are going to turn over your instructions. | 0:33:18 | 0:33:20 | |
An important thing to remember, when we get to the end | 0:33:20 | 0:33:23 | |
of the instructions, is you step back with me. | 0:33:23 | 0:33:25 | |
Are you all right? | 0:33:25 | 0:33:26 | |
OK, let's start. | 0:33:26 | 0:33:28 | |
I think you might need to... You've got it, you've got it. | 0:33:38 | 0:33:41 | |
Good code-cracking. | 0:33:41 | 0:33:43 | |
And step back. | 0:33:44 | 0:33:45 | |
To see an alarming change of colour. | 0:33:47 | 0:33:49 | |
Fantastic, Gracie. | 0:33:49 | 0:33:50 | |
OK, now don't go anywhere. | 0:33:50 | 0:33:52 | |
We are now going to turn to Ryan. | 0:33:52 | 0:33:53 | |
OK, so what we are going to do, we are going to turn over your instructions | 0:33:53 | 0:33:56 | |
in just a second and I want | 0:33:56 | 0:33:58 | |
you to follow them with these chemicals here. OK? | 0:33:58 | 0:34:01 | |
Yes. First mix A into B. | 0:34:01 | 0:34:05 | |
A into B. | 0:34:05 | 0:34:07 | |
-So I just put... -A into B. | 0:34:07 | 0:34:09 | |
So I just put them like this? | 0:34:09 | 0:34:11 | |
Yes, very good. | 0:34:11 | 0:34:12 | |
-OK. -Add C. | 0:34:19 | 0:34:21 | |
And then... | 0:34:22 | 0:34:23 | |
you step back. | 0:34:25 | 0:34:27 | |
LAUGHTER | 0:34:27 | 0:34:29 | |
Are you OK, there? | 0:34:32 | 0:34:35 | |
Amazing. | 0:34:35 | 0:34:36 | |
Well done. | 0:34:37 | 0:34:39 | |
Thank you so much. | 0:34:40 | 0:34:42 | |
Now, we had two different outcomes there. | 0:34:42 | 0:34:44 | |
We had either a dramatic change of colour or an actual foam explosion, | 0:34:44 | 0:34:48 | |
depending on the order in which the chemicals were mixed. | 0:34:48 | 0:34:51 | |
And the only reason why Gracie and Ryan mixed them in different orders | 0:34:51 | 0:34:55 | |
was just how we'd punctuated the instructions. | 0:34:55 | 0:34:57 | |
So they both have the same instructions, | 0:34:57 | 0:34:59 | |
and they were just understanding them differently based on where we | 0:34:59 | 0:35:02 | |
put the full stops. Thank you very much, Gracie, thank you very much, Ryan. | 0:35:02 | 0:35:05 | |
So you can see that the exact grammar, | 0:35:09 | 0:35:11 | |
in this case the punctuation of the sentences, | 0:35:11 | 0:35:13 | |
has completely changed the meaning of those sentences | 0:35:13 | 0:35:17 | |
and that's why this tiny glimpse of the ability to combine gestures | 0:35:17 | 0:35:21 | |
in the apes is so exciting. | 0:35:21 | 0:35:23 | |
So if we think about the animals that we have talked about this evening, | 0:35:23 | 0:35:28 | |
we've seen birds who are able to produce very complex sounds | 0:35:28 | 0:35:31 | |
and they can mimic many other sounds, but they don't seem to understand the meaning of | 0:35:31 | 0:35:36 | |
those words. We've seen dogs, | 0:35:36 | 0:35:38 | |
now dogs don't talk but dogs are incredibly good at working out | 0:35:38 | 0:35:42 | |
what human words mean and what they refer to. | 0:35:42 | 0:35:45 | |
And we've seen the chimpanzees, | 0:35:45 | 0:35:47 | |
they are both using gestures to have quite precise meanings and they are | 0:35:47 | 0:35:52 | |
also using sequences of those gestures to have more sentence-like | 0:35:52 | 0:35:58 | |
structures. But they are not the same as the kind of codes humans use. | 0:35:58 | 0:36:02 | |
Why are there these distinctions between humans and other animals? | 0:36:02 | 0:36:06 | |
Well, I think a big difference is likely to do with our brains. | 0:36:06 | 0:36:11 | |
I just want you to look at these different ones here. | 0:36:11 | 0:36:14 | |
So here we've got a bird brain, that's actually a chicken brain. | 0:36:14 | 0:36:18 | |
That's a dog's brain. | 0:36:18 | 0:36:19 | |
This is a replica chimpanzee brain, and this is a human brain. | 0:36:20 | 0:36:26 | |
Now, they look different. | 0:36:27 | 0:36:29 | |
One obvious difference is size, and the human brain is enormous, | 0:36:29 | 0:36:33 | |
much larger than any of the other brains on the table. | 0:36:33 | 0:36:37 | |
But big brains aren't everything. | 0:36:37 | 0:36:39 | |
To make an analogy to computers, | 0:36:39 | 0:36:41 | |
it's not perhaps just about the raw processing power of the brain, | 0:36:41 | 0:36:45 | |
maybe we need to think about the operating system as well. | 0:36:45 | 0:36:48 | |
And we seem to have a very efficient operating system | 0:36:48 | 0:36:52 | |
for dealing with symbol-based codes. | 0:36:52 | 0:36:55 | |
I always wanted the superpower of being able to talk to animals, | 0:36:56 | 0:37:00 | |
but maybe our mismatching brains mean that I never will. | 0:37:00 | 0:37:05 | |
Maybe there's something else in our world with which we communicate far more frequently, | 0:37:05 | 0:37:10 | |
and maybe I need to stop thinking so much about how I could have | 0:37:10 | 0:37:13 | |
conversations with other animals and maybe think a bit more about talking | 0:37:13 | 0:37:17 | |
to machines. | 0:37:17 | 0:37:18 | |
This year's big Christmas present gadgets are machines that we can talk to, | 0:37:20 | 0:37:23 | |
digital assistants. | 0:37:23 | 0:37:25 | |
A quarter of us now either command our phones via our voice | 0:37:25 | 0:37:28 | |
or use devices like these around the home. | 0:37:28 | 0:37:31 | |
Hey, computer, please tell me what the weather will be like in London tomorrow. | 0:37:33 | 0:37:37 | |
There will be showers there tomorrow with a high of nine and a low of three. | 0:37:37 | 0:37:41 | |
OK. | 0:37:41 | 0:37:43 | |
Now that's nowhere near being anything like a human brain, but when I was doing my PhD | 0:37:43 | 0:37:47 | |
on speech processing 25 years ago, | 0:37:47 | 0:37:50 | |
I would have never believed that one day we would be walking around with | 0:37:50 | 0:37:53 | |
phones in our pockets that we could talk to. | 0:37:53 | 0:37:56 | |
It's incredibly hard to overestimate | 0:37:56 | 0:37:59 | |
how quickly this field has developed. | 0:37:59 | 0:38:01 | |
What's happening inside that box? | 0:38:01 | 0:38:03 | |
How are computers managing to interact with us? | 0:38:03 | 0:38:06 | |
And is it ever having a proper conversation with us? | 0:38:06 | 0:38:09 | |
Are we ever going to have a meaningful dialogue with a computer? | 0:38:09 | 0:38:12 | |
Will it ever understand the jokes we tell? | 0:38:12 | 0:38:15 | |
Now this is an incredibly complex area and we could easily fill three | 0:38:15 | 0:38:19 | |
lectures just on this topic, so I'm going to give you a tiny insight | 0:38:19 | 0:38:23 | |
into aspects of how this works. | 0:38:23 | 0:38:25 | |
But first, let's break down the question. | 0:38:25 | 0:38:27 | |
What do computers need to do to be able to understand us? | 0:38:27 | 0:38:31 | |
And the first thing is it needs to take the sounds that we make and | 0:38:31 | 0:38:35 | |
decode that into words. | 0:38:35 | 0:38:37 | |
This is called speech recognition and it's what is known in science as | 0:38:37 | 0:38:41 | |
a ridiculously difficult question. | 0:38:41 | 0:38:43 | |
And that's because speech is hard and speech is complex. | 0:38:43 | 0:38:47 | |
I'm going to play you a sentence spoken in Estonian | 0:38:47 | 0:38:50 | |
and what I want you to do is count the words you can hear. | 0:38:50 | 0:38:54 | |
OK? So just listen out for words and see how many there are. | 0:38:54 | 0:38:57 | |
VOICE SPEAKS IN ESTONIAN | 0:38:57 | 0:39:00 | |
OK, how many words did you think were there? | 0:39:07 | 0:39:09 | |
Sorry? 13, that's a good guess. | 0:39:10 | 0:39:13 | |
Pink top, glasses. | 0:39:13 | 0:39:15 | |
-32? -32, OK, a big jump there, about twice as many. | 0:39:15 | 0:39:19 | |
OK. Let's see the actual sentence. | 0:39:19 | 0:39:21 | |
So we've got 22 words, so actually it's a meaningless question to ask you, | 0:39:21 | 0:39:26 | |
because if you don't speak Estonian why would the words stand out to you? | 0:39:26 | 0:39:30 | |
But this is what the computer is confronted with. | 0:39:30 | 0:39:34 | |
It's hearing this continuous flow of sound and it's got to find some way | 0:39:34 | 0:39:39 | |
of getting a toehold on what those words could be. | 0:39:39 | 0:39:42 | |
Now listen to this sentence. | 0:39:42 | 0:39:43 | |
What I'm going to do is record myself speaking. | 0:39:43 | 0:39:46 | |
Where are the words? | 0:39:49 | 0:39:50 | |
So this is the spectrogram of me saying "Where are the words" that I've just recorded. | 0:39:53 | 0:39:58 | |
But where ARE the words? | 0:39:58 | 0:40:00 | |
There's no individual words there at all. | 0:40:00 | 0:40:02 | |
What you're getting is a sort of smear of energy and what looks like | 0:40:02 | 0:40:05 | |
one thing happening at the end there is actually the "ss" sound at the end of words. | 0:40:05 | 0:40:11 | |
When you hear gaps between words when someone is talking to you, | 0:40:11 | 0:40:15 | |
that's because you understand those words. | 0:40:15 | 0:40:17 | |
If you don't understand the words, you don't hear the gaps, | 0:40:17 | 0:40:20 | |
as we saw with the Estonian. | 0:40:20 | 0:40:22 | |
So how do computers deal with this? | 0:40:22 | 0:40:25 | |
How do computers find the starts and ends of words when there actually | 0:40:25 | 0:40:29 | |
are no physical gaps there necessarily at all? | 0:40:29 | 0:40:32 | |
Well the first thing a computer does is it breaks up the incoming stream | 0:40:32 | 0:40:37 | |
of sound into smaller chunks of sound | 0:40:37 | 0:40:39 | |
and we've got a demonstration of this here. | 0:40:39 | 0:40:43 | |
So this is an incoming sentence and I don't know what this is, | 0:40:43 | 0:40:47 | |
I've just got the spectrogram to work off and I'm a computer for the | 0:40:47 | 0:40:50 | |
purposes of this point. | 0:40:50 | 0:40:52 | |
And what I'm going to do is split this up | 0:40:52 | 0:40:55 | |
into different slices of information. | 0:40:55 | 0:40:58 | |
And what I'm going to do then is take those different slices | 0:40:58 | 0:41:04 | |
and go to look them up in the library that I've got, | 0:41:04 | 0:41:07 | |
which will help me try and get the best estimate of what speech sound I'm probably trying to look at. | 0:41:07 | 0:41:13 | |
And my library has to sort of | 0:41:13 | 0:41:15 | |
have an idealised version of speech sounds because we all talk | 0:41:15 | 0:41:20 | |
differently. So instead of trying to find an exact match for the speech sound, | 0:41:20 | 0:41:24 | |
the computer is looking for almost like a best guess. | 0:41:24 | 0:41:27 | |
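The "look it up in the library and take the best guess" idea can be sketched as a nearest-template match: treat each slice of the spectrogram as a vector of energies and pick the closest idealised template. The labels and numbers below are invented for illustration; real recognisers use statistical acoustic models or neural networks rather than this toy distance.

```python
# Minimal sketch of the "best guess" lookup: each spectrogram slice is a
# vector of energies, and we pick the library template it is closest to.
# The feature values and labels here are made up for illustration only.
import numpy as np

# A tiny "library" of idealised speech sounds (label -> feature vector).
LIBRARY = {
    "T":  np.array([0.9, 0.6, 0.5, 0.4]),   # broad spread of energy
    "EE": np.array([0.8, 0.1, 0.7, 0.1]),   # energy concentrated in bands
    "S":  np.array([0.1, 0.2, 0.4, 0.9]),   # mostly high-frequency energy
}

def best_guess(slice_features):
    """Return the library label whose template is nearest to this slice."""
    return min(LIBRARY, key=lambda label: np.linalg.norm(LIBRARY[label] - slice_features))

# One incoming slice (again, invented numbers standing in for real energies).
incoming = np.array([0.85, 0.55, 0.45, 0.35])
print(best_guess(incoming))   # -> "T"
```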
So if we do this with the first speech sound in my sequence... | 0:41:27 | 0:41:30 | |
I've got a little slice of sound here. | 0:41:33 | 0:41:36 | |
Now if I look at that in isolation, what I can see | 0:41:38 | 0:41:42 | |
is that there is a broad sort of smear of energy there. | 0:41:43 | 0:41:47 | |
There's no bright hotspots. | 0:41:47 | 0:41:49 | |
Remember when you get a more intense bit of energy in a speech sound or | 0:41:49 | 0:41:54 | |
any sound in the spectrogram, you see it as a brighter colour? | 0:41:54 | 0:41:57 | |
There aren't really any bright colours there, but there's a big stretch of red | 0:41:57 | 0:42:01 | |
and it's going almost the whole length. | 0:42:01 | 0:42:03 | |
If I go over here, | 0:42:03 | 0:42:05 | |
I think I might be looking at a "T" so my best guess for that first sound is | 0:42:05 | 0:42:11 | |
that it is "T". Now who would like to help me guess the next sound? | 0:42:11 | 0:42:15 | |
Can I have you in the blue top, please? | 0:42:16 | 0:42:18 | |
Thank you very much. | 0:42:18 | 0:42:19 | |
-Now what's your name? -Joe. | 0:42:21 | 0:42:23 | |
-Sorry? -Joe. -Right, Joe, | 0:42:23 | 0:42:25 | |
you're going to help me be a computer processor and work out what the next sound is. | 0:42:25 | 0:42:28 | |
We've got one here. | 0:42:28 | 0:42:30 | |
Now, Joe, | 0:42:31 | 0:42:33 | |
you can see this looks quite different to that sound we were just looking for. | 0:42:33 | 0:42:37 | |
Now can we find anything in our library that looks like that? | 0:42:37 | 0:42:41 | |
I think you're right, | 0:42:41 | 0:42:44 | |
I think we are dealing with "EE." | 0:42:44 | 0:42:46 | |
Can you put that there for me? | 0:42:46 | 0:42:47 | |
Now not to labour the point, if we carry on like this, | 0:42:47 | 0:42:50 | |
you're not getting really whip-quick recognition of the speech, | 0:42:50 | 0:42:54 | |
are you? We are taking our time so what I'm going to do is throw more | 0:42:54 | 0:42:57 | |
processors at the problem, just as a computer would. | 0:42:57 | 0:43:00 | |
I'm going to take a couple more volunteers please, | 0:43:00 | 0:43:02 | |
so can I have you with the glasses, please, | 0:43:02 | 0:43:05 | |
and can I have you with the polo neck sweater. | 0:43:05 | 0:43:09 | |
Thank you very much. Can I have you? | 0:43:09 | 0:43:11 | |
Thank you very much. Got to have a unicorn. | 0:43:11 | 0:43:13 | |
OK, so, Unicorn, what's your name? | 0:43:17 | 0:43:21 | |
-Sasha. -Sasha, lovely to meet you, Sasha. | 0:43:21 | 0:43:22 | |
-Hi. -Evie. | 0:43:22 | 0:43:24 | |
Evie. And your name is? | 0:43:24 | 0:43:26 | |
-Kit. -Kit, excellent. | 0:43:26 | 0:43:27 | |
Now there's four speech sounds left. | 0:43:27 | 0:43:29 | |
I want you to each grab one and see if you can match it up with any of | 0:43:29 | 0:43:33 | |
the speech sounds here, OK? | 0:43:33 | 0:43:35 | |
There you go. | 0:43:35 | 0:43:36 | |
And what have you got here? | 0:43:36 | 0:43:38 | |
Look at this one. Can you see any that have got sloping shapes? | 0:43:38 | 0:43:40 | |
-Is it that one? -I think you might be right there. | 0:43:40 | 0:43:43 | |
So if you can remember, you came from number four. | 0:43:44 | 0:43:46 | |
I think I have an S sound. | 0:43:47 | 0:43:48 | |
I think you're absolutely right. | 0:43:48 | 0:43:50 | |
You grab an S sound and pop it up. | 0:43:50 | 0:43:52 | |
Now you have got the really difficult one here | 0:43:53 | 0:43:55 | |
because when the speaker said it, they really underarticulated it, | 0:43:55 | 0:43:59 | |
so what you have is this sound here | 0:43:59 | 0:44:04 | |
and that's going to have to be our best guess, OK? | 0:44:04 | 0:44:07 | |
OK, so if you can pop that in, we can see what we've got. | 0:44:07 | 0:44:09 | |
Thank you very much. | 0:44:09 | 0:44:11 | |
It could be team mates, it could be tea mate. | 0:44:11 | 0:44:16 | |
We don't know still what the words are but we've now got a good guess | 0:44:16 | 0:44:20 | |
what our speech sounds are and that takes us to our next stage. | 0:44:20 | 0:44:23 | |
What I want to do first is say thank you very much | 0:44:23 | 0:44:25 | |
to Joe, Sasha, Evie, Kit. | 0:44:25 | 0:44:27 | |
Thank you very much, thank you. | 0:44:27 | 0:44:29 | |
So the addition of processing power | 0:44:34 | 0:44:36 | |
has really helped us speed up this process | 0:44:36 | 0:44:38 | |
and that's why you can talk to your phone without it | 0:44:38 | 0:44:40 | |
taking that amount of time. | 0:44:40 | 0:44:43 | |
And what we are doing here is pulling out what the speech sounds are | 0:44:43 | 0:44:47 | |
but we still don't know what those words are, | 0:44:47 | 0:44:49 | |
we don't know where the edges are. | 0:44:49 | 0:44:51 | |
What we need to do is go to another level and what computers do | 0:44:51 | 0:44:54 | |
is go to what's called a language level | 0:44:54 | 0:44:57 | |
to start to break this stream of sounds up into words and into sentences. | 0:44:57 | 0:45:01 | |
I need two new volunteers to decode a stream of speech for me, just like | 0:45:01 | 0:45:06 | |
you were a computer. Can I have... | 0:45:06 | 0:45:09 | |
you in the middle with the blue T-shirt? | 0:45:09 | 0:45:12 | |
Yes, there you go, fantastic. | 0:45:12 | 0:45:14 | |
Can I have you? Thank you very much, thank you. | 0:45:14 | 0:45:16 | |
-Now, what's your name? -Max. | 0:45:22 | 0:45:24 | |
Lovely to meet you, Max. And your name is? | 0:45:24 | 0:45:26 | |
-Cammy. -Cammy, fantastic. | 0:45:26 | 0:45:28 | |
What I'm going to give you is the same task a computer has got when | 0:45:28 | 0:45:30 | |
it's worked out the speech sounds but it doesn't know what the words are. | 0:45:30 | 0:45:34 | |
What you're going to do is see a list of speech sounds and I want you | 0:45:34 | 0:45:37 | |
to try and work out what words could be in there. | 0:45:37 | 0:45:39 | |
Sometimes the computer doesn't know what a sound is at all. | 0:45:39 | 0:45:43 | |
You'll just see a question mark, maybe there was a cough, | 0:45:43 | 0:45:45 | |
maybe there was a noise, so sometimes you're going to have to guess, OK? | 0:45:45 | 0:45:49 | |
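The sketch below illustrates the kind of edge-finding task being set up here, under simplifying assumptions: a tiny invented dictionary, made-up sound symbols, and "?" standing for a sound the recogniser missed. Real systems search vocabularies of many thousands of words and score alternatives probabilistically rather than accepting the first exact fit.

```python
# Toy word segmentation: given a stream of speech sounds with no word
# boundaries, find a way of splitting it into dictionary words.
# '?' stands for a sound the recogniser could not identify (a cough,
# a noise) and is allowed to match anything.

DICTIONARY = {
    ("D", "R", "U", "M", "Z"): "drums",
    ("AH",): "are",
    ("R", "IA", "L", "IY"): "really",
    ("L", "OW", "D"): "loud",
}

def matches(chunk, word_sounds):
    """A chunk of the stream matches a dictionary entry if it is the
    same length and every sound agrees or is the unknown marker '?'."""
    return len(chunk) == len(word_sounds) and all(
        s == w or s == "?" for s, w in zip(chunk, word_sounds)
    )

def segment(stream):
    """Return one possible list of words covering the whole stream,
    or None if no segmentation fits (simple depth-first search)."""
    if not stream:
        return []
    for word_sounds, word in DICTIONARY.items():
        n = len(word_sounds)
        if matches(tuple(stream[:n]), word_sounds):
            rest = segment(stream[n:])
            if rest is not None:
                return [word] + rest
    return None

# "drums are really loud", with one sound the recogniser missed.
stream = ["D", "R", "U", "M", "Z", "AH", "R", "?", "L", "IY", "L", "OW", "D"]
print(segment(stream))
```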
So I can just ask you to step over here, Max, thank you very much. | 0:45:49 | 0:45:52 | |
OK and we will see it appear on the screen, OK? | 0:45:52 | 0:45:56 | |
So reading down in that direction, | 0:45:56 | 0:45:58 | |
try reading that aloud. | 0:45:59 | 0:46:01 | |
THEY READ WORDS ON SCREEN SLOWLY | 0:46:01 | 0:46:04 | |
Now, think about the building that we are in. | 0:46:17 | 0:46:20 | |
-The Royal Institution. -The Royal Institution, absolutely, | 0:46:20 | 0:46:23 | |
so context can help you. | 0:46:23 | 0:46:24 | |
If you go back to that, we are at the Royal Institution. | 0:46:24 | 0:46:27 | |
One last time. | 0:46:27 | 0:46:29 | |
TOGETHER: Drums are really | 0:46:29 | 0:46:32 | |
loud, so we have to | 0:46:32 | 0:46:37 | |
use ear | 0:46:37 | 0:46:42 | |
defenders at the Royal Institution. | 0:46:44 | 0:46:47 | |
Well done. | 0:46:47 | 0:46:49 | |
Thank you. Thank you, Cammy, thank you, Max. | 0:46:51 | 0:46:53 | |
Thank you. | 0:46:53 | 0:46:55 | |
So that's an extreme example, but what you saw there was essentially | 0:46:55 | 0:47:00 | |
the same problem a computer is trying to solve. | 0:47:00 | 0:47:02 | |
It's trying to find the edges, | 0:47:02 | 0:47:03 | |
it's trying to work out where the words could be, | 0:47:03 | 0:47:05 | |
and it's helped in this because it knows what the words are. | 0:47:05 | 0:47:08 | |
It has a database of thousands and thousands of words and it also knows | 0:47:08 | 0:47:12 | |
something about how sentences go together. | 0:47:12 | 0:47:15 | |
It knows how probable it is that a word will follow another word. | 0:47:15 | 0:47:19 | |
And if a word becomes more likely, | 0:47:19 | 0:47:21 | |
the probability of it occurring actually changes in the computer | 0:47:21 | 0:47:24 | |
so it's more easily activated. | 0:47:24 | 0:47:26 | |
And, in fact, our brains do something similar. | 0:47:26 | 0:47:28 | |
If you are listening to speech and you hear something which is highly | 0:47:28 | 0:47:32 | |
predictable like "the ship sailed across the bay" | 0:47:32 | 0:47:34 | |
then you will understand that sentence more easily. | 0:47:34 | 0:47:37 | |
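As a toy illustration of that word-to-word probability idea, the sketch below scores two candidate transcriptions with a tiny bigram table and keeps the more probable one. The probabilities are invented for the example; real language models are estimated from very large text collections.

```python
# A miniature bigram language model: how likely is each word to follow
# the previous word? Probabilities here are invented for illustration.

BIGRAM_PROB = {
    ("the", "ship"): 0.002,
    ("ship", "sailed"): 0.30,
    ("sailed", "across"): 0.25,
    ("across", "the"): 0.40,
    ("the", "bay"): 0.01,
    ("the", "bee"): 0.0001,   # "the ship sailed across the bee" is unlikely
}

def sequence_probability(words, floor=1e-6):
    """Multiply bigram probabilities along the sentence; unseen pairs
    get a small floor value instead of zero."""
    prob = 1.0
    for prev, word in zip(words, words[1:]):
        prob *= BIGRAM_PROB.get((prev, word), floor)
    return prob

candidates = [
    ["the", "ship", "sailed", "across", "the", "bay"],
    ["the", "ship", "sailed", "across", "the", "bee"],
]

# The recogniser prefers the candidate its language model finds more probable.
best = max(candidates, key=sequence_probability)
print(" ".join(best))
```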
So we are seeing, actually, in terms of a lot of the code of language, | 0:47:37 | 0:47:41 | |
humans and computers are not as far apart as they used to be. | 0:47:41 | 0:47:46 | |
Computers are catching up with a lot of our linguistic abilities. | 0:47:46 | 0:47:50 | |
Computers have become huge and very powerful, | 0:47:50 | 0:47:53 | |
they can search through very large databases very quickly, | 0:47:53 | 0:47:57 | |
and the internet means a little box like that can be connected to those | 0:47:57 | 0:48:00 | |
large online databases. | 0:48:00 | 0:48:02 | |
It doesn't need to have all the information there. | 0:48:02 | 0:48:04 | |
But, of course, this is not the full story of how human language works. | 0:48:04 | 0:48:10 | |
There is an entire level of communication | 0:48:10 | 0:48:13 | |
that we use all the time that we have barely mentioned. | 0:48:13 | 0:48:17 | |
Rather than just what we say when we're talking, | 0:48:17 | 0:48:20 | |
we are always sending out information in how we say words. | 0:48:20 | 0:48:24 | |
Everything I've talked about so far has been, if we think about brains, | 0:48:26 | 0:48:32 | |
associated with properties of the left side of the brain. | 0:48:32 | 0:48:35 | |
The left side of the brain in humans, for most people, is associated with | 0:48:37 | 0:48:42 | |
how we decode speech, how we control our own voices. | 0:48:42 | 0:48:45 | |
And the right side of the brain is a lot less interested in these | 0:48:46 | 0:48:50 | |
linguistic properties of communication like words and sentences, | 0:48:50 | 0:48:54 | |
and a lot more interested in all the other things that are going on when | 0:48:54 | 0:48:57 | |
we are talking to people: | 0:48:57 | 0:48:59 | |
who we are talking to. | 0:48:59 | 0:49:00 | |
Are they being emotional? | 0:49:00 | 0:49:02 | |
Are they telling a brilliant joke like, | 0:49:02 | 0:49:03 | |
what is round and sounds like a trumpet? | 0:49:03 | 0:49:06 | |
A crumpet. | 0:49:06 | 0:49:08 | |
GROANS | 0:49:08 | 0:49:09 | |
So to have a meaningful conversation with a machine or understand my | 0:49:09 | 0:49:14 | |
brilliant jokes, we need to do something much more difficult. | 0:49:14 | 0:49:18 | |
We need to add in to our computer the missing right half of its brain. | 0:49:18 | 0:49:24 | |
Because actually when we are talking, when you're listening to somebody, | 0:49:24 | 0:49:28 | |
there are very important aspects of our communication which you need to | 0:49:28 | 0:49:32 | |
pick up on to really understand what somebody is saying. | 0:49:32 | 0:49:34 | |
Not just when it's written down. | 0:49:34 | 0:49:36 | |
The information when someone is speaking is very often being expressed as | 0:49:36 | 0:49:39 | |
much by how they are talking as by what they are saying. | 0:49:39 | 0:49:42 | |
And this is often carried by something called intonation. | 0:49:42 | 0:49:45 | |
Intonation is how we vary the pitch, the speed, | 0:49:45 | 0:49:49 | |
the melody of our voice when we are speaking. | 0:49:49 | 0:49:52 | |
An example would be that we don't talk to each other like that. | 0:49:52 | 0:49:56 | |
We would consider it to be quite strange. | 0:49:56 | 0:49:58 | |
We are always using intonation to clarify, enhance, | 0:49:58 | 0:50:02 | |
add emphasis and emotion. | 0:50:02 | 0:50:04 | |
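The pitch contour that intonation rides on can be estimated fairly simply. Below is a rough autocorrelation-based sketch in Python of how one frame's pitch might be measured; real pitch trackers are considerably more robust to noise and to unvoiced sounds.

```python
import numpy as np

def estimate_pitch(frame, sample_rate, fmin=75.0, fmax=400.0):
    """Crude pitch (fundamental frequency) estimate for one frame of
    audio, using autocorrelation: the lag at which the signal best
    repeats itself corresponds to one pitch period."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo = int(sample_rate / fmax)          # shortest plausible period
    hi = int(sample_rate / fmin)          # longest plausible period
    lag = lo + int(np.argmax(corr[lo:hi]))
    return sample_rate / lag              # Hz

# Synthetic test: a 200 Hz tone whose pitch we try to recover.
sr = 16000
t = np.arange(0, 0.04, 1 / sr)
tone = np.sin(2 * np.pi * 200 * t)
print(round(estimate_pitch(tone, sr), 1))   # roughly 200.0
```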
In our brains, | 0:50:04 | 0:50:06 | |
intonation is processed very differently | 0:50:06 | 0:50:09 | |
from the words that you're listening to. | 0:50:09 | 0:50:11 | |
On the whole, intonation is found to be something | 0:50:11 | 0:50:14 | |
that the right hemisphere deals with and is interested in. | 0:50:14 | 0:50:18 | |
And often, it can be as important, if not more important, to the real | 0:50:18 | 0:50:22 | |
meaning of what somebody is saying. | 0:50:22 | 0:50:24 | |
We've got some recordings here of someone speaking in an emotional style | 0:50:25 | 0:50:29 | |
and we have stripped out all the auditory information that tells you about | 0:50:29 | 0:50:33 | |
the words they are saying, leaving you with just the intonation. | 0:50:33 | 0:50:35 | |
See if you can guess the emotion. | 0:50:35 | 0:50:37 | |
BUZZING | 0:50:37 | 0:50:40 | |
Any guesses, does that sound happy? | 0:50:40 | 0:50:42 | |
Angry? It sounds annoyed, doesn't it? | 0:50:42 | 0:50:45 | |
Yeah, that was angry. | 0:50:45 | 0:50:46 | |
That was very angry. | 0:50:46 | 0:50:47 | |
Another one. | 0:50:47 | 0:50:48 | |
BUZZING | 0:50:48 | 0:50:51 | |
Yeah, lots of downward inflections. | 0:50:53 | 0:50:57 | |
Sounds softer. | 0:50:57 | 0:50:59 | |
That was a sad voice so you are definitely using intonation to get aspects | 0:50:59 | 0:51:03 | |
of emotion out, but that's by no means the whole story. | 0:51:03 | 0:51:07 | |
And to help me demonstrate why intonation is so very important to | 0:51:07 | 0:51:10 | |
communication, please welcome news presenter and journalist | 0:51:10 | 0:51:13 | |
Krishnan Guru-Murthy. | 0:51:13 | 0:51:16 | |
Krishnan, obviously the words you say are absolutely critical to your job, | 0:51:22 | 0:51:26 | |
but how you are saying them must be something that you're always thinking about. | 0:51:26 | 0:51:30 | |
We use intonation to do all sorts of things on the news, all the | 0:51:30 | 0:51:34 | |
time, and it's very, very complex. | 0:51:34 | 0:51:35 | |
At the beginning of the news, we're saying, "This is urgent, | 0:51:35 | 0:51:39 | |
"it's exciting, you've got to watch, | 0:51:39 | 0:51:40 | |
"something really important happened today and I've got to tell you about | 0:51:40 | 0:51:43 | |
-"it." -Yeah. -And then once we get into the news, | 0:51:43 | 0:51:46 | |
we are trying to use the right intonation for the type of story | 0:51:46 | 0:51:50 | |
we are telling people, so we've got to try and hit the right note, literally. | 0:51:50 | 0:51:54 | |
So if it's something very surprising, | 0:51:54 | 0:51:57 | |
"The new president of the United States is Donald Trump!" | 0:51:57 | 0:52:00 | |
-That's quite surprising. -That is quite surprising. | 0:52:00 | 0:52:02 | |
If it's very serious or frightening news, something terrible has happened, | 0:52:02 | 0:52:07 | |
"The new president of the United States is Donald Trump." | 0:52:07 | 0:52:10 | |
It's a different thing. | 0:52:10 | 0:52:13 | |
You wouldn't normally do that, but if you were sort of talking about | 0:52:13 | 0:52:17 | |
an attack or a war or something like that, | 0:52:17 | 0:52:19 | |
this is a very serious thing and you're trying to get people in the | 0:52:19 | 0:52:22 | |
right frame of mind. But you're also trying not to frighten them | 0:52:22 | 0:52:25 | |
and you've got to use intonation for that. | 0:52:25 | 0:52:27 | |
Now we've got a very short experiment to do with Krishnan, | 0:52:27 | 0:52:30 | |
who's going to read out some football results. | 0:52:30 | 0:52:33 | |
Just by the intonation of his voice, I want you to... | 0:52:33 | 0:52:37 | |
He's going to stop before he gets to the final score and we're going to | 0:52:37 | 0:52:40 | |
work out if the last team had a higher score or a lower score than | 0:52:40 | 0:52:45 | |
the first team. OK, shall we try one? | 0:52:45 | 0:52:48 | |
Manchester City 2, West Bromwich Albion... | 0:52:48 | 0:52:52 | |
Higher or lower? | 0:52:52 | 0:52:53 | |
-Nil. -Nil, there you go, lower. | 0:52:54 | 0:52:56 | |
West Ham United 1, Tottenham Hotspur... | 0:52:58 | 0:53:01 | |
-Higher, yes. -Five. | 0:53:02 | 0:53:04 | |
Chelsea 1, Liverpool... | 0:53:05 | 0:53:07 | |
One - exactly, it's a draw. | 0:53:07 | 0:53:09 | |
This is absolutely amazing. | 0:53:11 | 0:53:12 | |
What Krishnan is doing, over the whole course of the sentence, | 0:53:12 | 0:53:15 | |
is a kind of dance with his intonation | 0:53:15 | 0:53:18 | |
that's completely keeping pace with the meaning of what he's saying. | 0:53:18 | 0:53:21 | |
He's weaving the intonation in and out of the words and you're | 0:53:21 | 0:53:24 | |
picking up on that: you know what that means. | 0:53:24 | 0:53:27 | |
Can you do an example of it incorrectly so we can hear what that sounds like? | 0:53:27 | 0:53:31 | |
Stoke City 4, Huddersfield... | 0:53:33 | 0:53:35 | |
five. | 0:53:36 | 0:53:37 | |
And when you say that, it sounds like it's the words that are wrong, whereas it's the | 0:53:39 | 0:53:43 | |
intonation that's wrong; it's really striking. | 0:53:43 | 0:53:45 | |
Thank you very, very much, Krishnan. | 0:53:45 | 0:53:47 | |
-Thank you. -Thank you, Sophie. | 0:53:47 | 0:53:48 | |
So we are using intonation all the time in regular conversation, | 0:53:53 | 0:53:56 | |
and sometimes we think that that's all we do to pick up on emotion in the voice. | 0:53:56 | 0:54:02 | |
Actually emotion in the voice is incredibly complex. | 0:54:02 | 0:54:05 | |
There's a great deal going on because emotions change how your body works | 0:54:05 | 0:54:08 | |
and how it feels. They can change many different aspects of your voice. | 0:54:08 | 0:54:11 | |
Up to now, | 0:54:11 | 0:54:13 | |
computers have really struggled with this kind of information. | 0:54:13 | 0:54:16 | |
But that's changing. | 0:54:16 | 0:54:20 | |
And people are starting to make more progress. | 0:54:20 | 0:54:23 | |
For our last demonstration, | 0:54:23 | 0:54:25 | |
I'm going to look at something really amazing - | 0:54:25 | 0:54:28 | |
a computer that can read emotions from human voices. | 0:54:28 | 0:54:31 | |
Now the acoustics in here are quite reverberant | 0:54:31 | 0:54:34 | |
and this computer has been built to work in the home, | 0:54:34 | 0:54:38 | |
and so what I'm going to do is just step outside where there's a lovely | 0:54:38 | 0:54:41 | |
carpet and it's a little bit more like being in someone's living room. | 0:54:41 | 0:54:43 | |
Let's go and meet Olly. | 0:54:43 | 0:54:45 | |
-Hi. Hi, Raymond. -Hello. | 0:54:47 | 0:54:50 | |
Now, tell me about Olly. | 0:54:50 | 0:54:54 | |
So Olly can actually understand what you're saying, as well as | 0:54:54 | 0:54:57 | |
the emotion in your voice, the tone of your voice. | 0:54:57 | 0:54:59 | |
Excellent, what kind of thing do you do with Olly? | 0:54:59 | 0:55:02 | |
Olly actually enhances the communication between humans and technology. | 0:55:02 | 0:55:06 | |
So he could be a digital assistant who's really understanding how you | 0:55:06 | 0:55:10 | |
feel, not just what you are saying? | 0:55:10 | 0:55:12 | |
-Exactly. -Yes. -Can we find out more about how he works? | 0:55:12 | 0:55:14 | |
-Sure. -Is it easiest to demonstrate that? | 0:55:14 | 0:55:16 | |
Yes, of course. So what you need to do is just stand a sensible distance | 0:55:16 | 0:55:19 | |
from Olly, speak normally, as usual, | 0:55:19 | 0:55:22 | |
-and then maybe you can add some emotion to it when you speak. -OK. | 0:55:22 | 0:55:25 | |
So, are you ready? | 0:55:25 | 0:55:27 | |
One, two, three. | 0:55:27 | 0:55:28 | |
SADLY: Hey, Olly, what's the weather like in London today? | 0:55:29 | 0:55:32 | |
Oh, there we go, yes. | 0:55:35 | 0:55:38 | |
So is Olly getting the emotion out of your voice by the sound of your | 0:55:40 | 0:55:44 | |
voice or the words you're saying, or is it both? | 0:55:44 | 0:55:47 | |
Actually it's both. | 0:55:47 | 0:55:48 | |
It recognises some acoustic components of what you said, | 0:55:48 | 0:55:52 | |
and it's also using the syntax and semantics of the words you used. | 0:55:52 | 0:55:56 | |
It's very clever. Thank you very much, | 0:55:56 | 0:55:58 | |
-thank you. -Thank you very much. | 0:55:58 | 0:56:00 | |
Olly is doing an amazing job of decoding emotion as well as words | 0:56:03 | 0:56:08 | |
from the human voice, and you can see from this just how hard emotion | 0:56:08 | 0:56:13 | |
is to read and how extremely nuanced it can be. | 0:56:13 | 0:56:17 | |
We find it very easy, because we've grown up around people using language, | 0:56:17 | 0:56:22 | |
emotion, social meaning in their voices all the time, | 0:56:22 | 0:56:25 | |
so it seems easy to us but it's absolutely vital to understanding language. | 0:56:25 | 0:56:29 | |
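As a very rough sketch of the idea Raymond describes, combining acoustic cues with the words themselves, here is a toy rule-based classifier. The thresholds, keyword lists and features are invented for illustration and bear no relation to how Olly actually works.

```python
# A deliberately simple sketch: combine an acoustic cue (average pitch
# and loudness) with a lexical cue (the words used) to guess an emotion.
# Real emotion recognition uses far richer features and learned models.

SAD_WORDS = {"tired", "lonely", "miss", "sorry"}
HAPPY_WORDS = {"great", "love", "brilliant", "thanks"}

def guess_emotion(mean_pitch_hz, mean_loudness, words):
    score = 0.0
    # Acoustic cues: low, quiet voices lean sad; high, loud voices lean happy.
    score += -1.0 if mean_pitch_hz < 150 else 1.0
    score += -1.0 if mean_loudness < 0.3 else 1.0
    # Lexical cues: count emotionally loaded words.
    words = {w.lower() for w in words}
    score += len(words & HAPPY_WORDS) - len(words & SAD_WORDS)
    return "happy" if score > 0 else "sad"

print(guess_emotion(120, 0.2, "hey Olly I feel so tired today".split()))
# -> 'sad'
```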
All around the world, | 0:56:34 | 0:56:36 | |
animals are communicating in very simple and very complex ways and we | 0:56:36 | 0:56:42 | |
have so much more to learn. | 0:56:42 | 0:56:44 | |
I've touched on just the surface of animal communication today. | 0:56:44 | 0:56:48 | |
Within this, humans do seem to be extraordinarily complex in their linguistic abilities. | 0:56:49 | 0:56:55 | |
But we're only really using language with each other. | 0:56:55 | 0:56:59 | |
Could there ever be another life form that could crack our code - | 0:56:59 | 0:57:02 | |
perhaps one we haven't even encountered yet? | 0:57:02 | 0:57:05 | |
Watching Carl Sagan describe his work on the Voyager space probes | 0:57:06 | 0:57:10 | |
40 years ago in the 1977 Christmas lectures | 0:57:10 | 0:57:13 | |
literally set me on the course to standing here today - | 0:57:13 | 0:57:16 | |
that's why I became a scientist. | 0:57:16 | 0:57:18 | |
The Voyagers contain golden records, | 0:57:19 | 0:57:22 | |
and the golden records contain the sounds of Earth, | 0:57:22 | 0:57:24 | |
including greetings in 55 different human languages. | 0:57:24 | 0:57:28 | |
GREETINGS IN DIFFERENT LANGUAGES | 0:57:28 | 0:57:30 | |
Now, the sounds of the Earth's languages | 0:57:35 | 0:57:39 | |
have long since left our solar system. | 0:57:39 | 0:57:41 | |
The Voyager probes are now 13 billion miles away from Earth. | 0:57:41 | 0:57:45 | |
They are the most distant objects ever created by humans. | 0:57:45 | 0:57:48 | |
Now, if those spacecraft ever encounter an alien life form, | 0:57:49 | 0:57:53 | |
maybe they can use these sounds to start to decode our language. | 0:57:53 | 0:57:57 | |
And if they have brains that work like ours, | 0:57:57 | 0:57:59 | |
maybe one day we could have conversations with them. | 0:57:59 | 0:58:03 | |
This year's lectures have explored our fundamental urge to communicate. | 0:58:03 | 0:58:07 | |
We've looked at where it comes from and why we're so very good at it, | 0:58:07 | 0:58:10 | |
and we just can't help but reach out to others. | 0:58:10 | 0:58:14 | |
The big question is whether there is another form of life out there | 0:58:14 | 0:58:18 | |
that could crack our code, | 0:58:18 | 0:58:19 | |
and I hope you've realised that it will need to have an incredible brain - | 0:58:19 | 0:58:23 | |
at least as incredible as your brains - to do so. | 0:58:23 | 0:58:27 | |
Thank you. | 0:58:27 | 0:58:29 |