Line | From | To | |
---|---|---|---|
Just before Christmas, I witnessed something extraordinary. I stood in | :01:11. | :01:15. | |
a room with two robots that were starting to do something distinctly | :01:15. | :01:20. | |
human. They were learning like young children do, when they | :01:20. | :01:27. | |
discover the world for the first time. They were developing a | :01:27. | :01:30. | |
language, one that I didn't understand. It's quite scary. Oh, | :01:30. | :01:36. | |
gosh, it's asked me to do something. More and more of the things we once | :01:36. | :01:46. | |
:01:46. | :01:47. | ||
thought made us human are now being taken on by machines. I want to | :01:47. | :01:53. | |
find out how close we are to creating artificial intelligence. | :01:53. | :01:59. | |
And what that might mean for us as humans. As a mathematician, should I be | :01:59. | :02:05. | |
worried that they may put me out of a job, or could they extend human | :02:05. | :02:10. | |
intelligence in unimaginable new ways? What are these machines? Or... | :02:10. | :02:20. | |
:02:20. | :02:41. | ||
In February 2011, New York was the scene for a very public test of how | :02:42. | :02:48. | |
clever machines have become. A showdown, watched by millions, took | :02:48. | :02:55. | |
place between two men and a supercomputer. The arena for this | :02:56. | :02:59. | |
battle was a quiz show called Jeopardy. It's one of the most | :02:59. | :03:04. | |
popular and long-running quiz shows in America. It's famous for its | :03:04. | :03:14. | |
:03:14. | :03:19. | ||
Fortunately, those in mankind's corner had no problem with these | :03:19. | :03:27. | |
questions. One of the two human contestants was Brad Rutter, the | :03:27. | :03:34. | |
show's biggest-ever money winner. He became a five-time champion, | :03:34. | :03:41. | |
earning a record $3.2 million. As long as I can remember, I grew up | :03:42. | :03:45. | |
with Jeopardy and by the time I was in high school I started realising | :03:45. | :03:48. | |
I knew the answers to most questions and if I ever get a | :03:48. | :03:52. | |
chance to try out, I think I can do pretty well. If you had told me | :03:52. | :03:58. | |
how well I would actually do I would have said you were crazy, but | :03:59. | :04:08. | |
$3.2 million later here I am. The newcomer: this is Watson. | :04:08. | :04:11. | |
Watson was one of the most sophisticated AI computers ever | :04:11. | :04:20. | |
built. Sitting in the audience was his creator. We wanted to do this | :04:20. | :04:23. | |
kind of experiment, this illustration of what it would take | :04:23. | :04:28. | |
for a computer to play Jeopardy against a human. A brain that fits | :04:28. | :04:34. | |
in a shoebox, powered on a glass of water and a sandwich; then you have a | :04:34. | :04:38. | |
computer needing 80 kilowatts of electricity and 20 tonnes of air | :04:38. | :04:43. | |
conditioning. This is what it took. In the lead-up to the match, Watson | :04:43. | :04:51. | |
had to learn how to play the game. In effect, it had to be coached. | :04:51. | :04:56. | |
One of the most difficult tasks it faced was to get to grips with the | :04:56. | :04:59. | |
cryptic format of the game, where the contestants are first given the | :04:59. | :05:09. | |
:05:09. | :05:11. | ||
answer and have to come up with the question. This queen fled to London. | :05:11. | :05:16. | |
It had to go through the 200 million pages of information it had | :05:16. | :05:23. | |
been programmed with and find the answer in just three seconds. | :05:23. | :05:30. | |
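Watson's real DeepQA pipeline was far more elaborate than this, but the reason a machine can even hope to search a huge corpus in seconds can be sketched with a toy inverted index. Everything below (documents, words, function names) is invented purely for illustration.

```python
# Toy inverted index: a sketch of why lookup across millions of
# documents can be sub-second. Documents and queries are invented
# for illustration; Watson's real DeepQA pipeline is far richer.
from collections import defaultdict

docs = {
    0: "elizabeth stuart queen of bohemia fled to london",
    1: "bozo the clown a famous television character",
    2: "the violin is a bowed string instrument",
}

# Build the index once: word -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[word].add(doc_id)

def candidate_docs(query):
    """Return ids of documents containing every word of the query."""
    sets = [index.get(w, set()) for w in query.split()]
    return set.intersection(*sets) if sets else set()

print(candidate_docs("queen fled london"))  # narrows three documents to one
```

Because the index is built once, answering a clue costs a handful of set intersections rather than a scan of every page, which is how the three-second budget becomes plausible.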
A famous clown or any incompetent fool? Who is Bozo? After four | :05:30. | :05:39. | |
years of development, Watson was considered ready for the challenge. | :05:39. | :05:46. | |
Let's play Jeopardy. Here we go. It was a really, really tense moment. | :05:46. | :05:52. | |
If I was going to fail we were going to fail in front of - This | :05:52. | :05:57. | |
was real jeopardy? This is real. You were putting yourself on the | :05:57. | :06:02. | |
line. This was real and live. People are going to root for the | :06:02. | :06:06. | |
people rather than the machine, so we felt a little bit of pressure, | :06:06. | :06:13. | |
but also a great opportunity to kind of represent humanity against | :06:13. | :06:18. | |
encroaching technology, shall we say? Four-letter word for a vantage | :06:18. | :06:25. | |
point or belief? What is a view? You have adrenaline going. It was | :06:25. | :06:32. | |
an open playing field. Anything could happen. Who can Grendell | :06:32. | :06:38. | |
is the 1930s? After the first round of the first game, Watson was tied | :06:38. | :06:46. | |
with Brad at about $5,000. Into a tie for the lead. I think everyone | :06:46. | :06:49. | |
sat there and thought at least we didn't lose. In other words, at | :06:50. | :06:55. | |
least at this point, the audience is understanding | :06:55. | :06:58. | |
that the system is good enough to tie a grand champion. Too | :06:58. | :07:03. | |
close for comfort. I remember that moment when I first heard about | :07:03. | :07:08. | |
Watson. I thought what a key moment for AI. We have already programmed | :07:08. | :07:12. | |
a computer to beat humans at chess, but chess is a vertical thought | :07:12. | :07:18. | |
process, where you are making deep, logical connections. Watson is | :07:18. | :07:24. | |
doing something completely different and thinking horizontally, | :07:24. | :07:27. | |
accessing a huge database. This is machine intelligence taken in a new | :07:27. | :07:33. | |
direction. The computer is starting to think in a multi-dimensional way. | :07:33. | :07:37. | |
How the game would unfold was to change the way many people think | :07:37. | :07:41. | |
about artificial intelligence, including me. But in order to | :07:41. | :07:46. | |
understand the real significance of Watson, I need to know where this | :07:46. | :07:56. | |
:07:56. | :08:00. | ||
dream to replicate our intelligence began. For thousands of years, we have | :08:00. | :08:06. | |
invented machines to do things for us. But there was one moment in | :08:06. | :08:10. | |
history where we realised they weren't just slaves, but had the | :08:10. | :08:20. | |
:08:20. | :08:24. | ||
potential to be our rivals. Or even... I've come to the place where this | :08:24. | :08:32. | |
realisation first came to life. It was in this building, that the | :08:32. | :08:36. | |
search for artificial intelligence began. What happened here still | :08:36. | :08:46. | |
:08:46. | :08:47. | ||
resonates today. The idea that machines could be intelligent like | :08:47. | :08:55. | |
us was sparked by one man. He was the brilliant Alan Turing, and | :08:55. | :09:04. | |
he's a real hero of mine. When he was a young man in his 20s he came | :09:04. | :09:08. | |
to work here at Bletchley Park, as a code breaker. This was the home | :09:08. | :09:13. | |
of British intelligence at the time. It was the 1930s and Britain was | :09:13. | :09:18. | |
gearing up to war with Germany. As well as needing the brute force of | :09:18. | :09:24. | |
the military, the Government were gathering a crack team here. And | :09:24. | :09:28. | |
his genius was precisely what was needed at this time of national | :09:28. | :09:38. | |
:09:38. | :09:54. | ||
The main problem that he had to wrestle with whilst here, was this. | :09:55. | :09:58. | |
It's an Enigma machine and it's an incredibly clever piece of kit. It | :09:59. | :10:02. | |
was used by the German military to scramble secret messages into code | :10:03. | :10:09. | |
that nobody could understand. Cracking this code was incredibly | :10:10. | :10:13. | |
difficult for humans. The Germans believed that the code was | :10:13. | :10:23. | |
:10:23. | :10:26. | ||
uncrackable. It was up to human minds to decipher the cryptic | :10:26. | :10:31. | |
communication that was picked up on the airwaves. This was an | :10:31. | :10:34. | |
impossibly long task. We were on the brink of war and didn't have | :10:34. | :10:44. | |
time to waste. The machine consists of a keyboard | :10:44. | :10:49. | |
that I will type my message on to try to encode it. If I press an R, | :10:49. | :10:53. | |
then the wires in the machine go through and light up a letter at | :10:53. | :10:58. | |
the top here, so the R was encoded by the letter N, but the incredible | :10:58. | :11:02. | |
thing is, about the machine, if I press R again it doesn't get | :11:02. | :11:07. | |
encoded by N the next time, because these rotors have kicked on one step | :11:07. | :11:11. | |
and so the wires are connecting up the letters in a completely | :11:11. | :11:17. | |
different way now. So when I now press R it is F. The operators had | :11:17. | :11:21. | |
different ways that they could set up the machine, so they can make a | :11:21. | :11:26. | |
choice, for example, of which rotors they were using, so there were five | :11:26. | :11:29. | |
different rotors and they would choose three and they could put | :11:29. | :11:33. | |
them inside in different orders. Then they would set each of the | :11:33. | :11:38. | |
rotors on one of 26 different settings and then not only that, | :11:38. | :11:42. | |
they had some plugs at the front here, which would also scramble up | :11:42. | :11:49. | |
all of the letters, so the combined effect is that there are over 150 | :11:49. | :11:53. | |
million, million, million different ways | :11:53. | :12:03. | |
that the machine can be set up. It | :12:03. | :12:13. | |
:12:13. | :12:13. | ||
really was human against machine. Alan Turing and his team realised | :12:13. | :12:17. | |
the human mind wasn't fast enough to do this and the only hope they | :12:17. | :12:24. | |
had of cracking the code was to create another machine to do it. | :12:24. | :12:28. | |
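The settings figure quoted above can be sanity-checked in a few lines. This is a sketch assuming the standard configuration described in the narration: three rotors chosen from five, 26 starting positions each, and ten plugboard cables.

```python
# Sanity check of the "over 150 million, million, million" figure,
# assuming the standard three-rotor Enigma with ten plugboard leads.
from math import factorial, perm

rotor_orders = perm(5, 3)      # pick 3 of 5 rotors, order matters: 60 ways
rotor_positions = 26 ** 3      # each rotor starts on one of 26 letters
# Ten plugboard cables pair up 20 of the 26 letters on the front panel.
plugboard = factorial(26) // (factorial(6) * factorial(10) * 2 ** 10)

total = rotor_orders * rotor_positions * plugboard
print(f"{total:.3e}")  # about 1.59e+20 -- over 150 million million million
```

Almost all of the combinatorial explosion comes from the plugboard term, which is why exhaustive human search was hopeless and a machine was the only option.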
This is the incredible machine that he and his colleagues created. It's | :12:28. | :12:33. | |
called the Bombe. It was from this mesh of cogs and switches held | :12:33. | :12:37. | |
together by over ten miles of wire that finally came the ability to | :12:37. | :12:41. | |
crack the code on a daily basis. Something that no human had come | :12:41. | :12:46. | |
close to being able to do. What this machine did changed the course | :12:46. | :12:49. | |
of the war, and many regard it as crucial to the Allied victory over | :12:49. | :12:59. | |
:12:59. | :13:03. | ||
In one sense, this machine was just a collection of cogs and wheels. | :13:03. | :13:06. | |
But it could do something that some of the finest minds in Britain | :13:06. | :13:14. | |
couldn't. It was an unsettling moment. Was this machine cleverer | :13:14. | :13:24. | |
than a human? That is an incredibly satisfying sound. It must have been | :13:24. | :13:27. | |
an amazing moment when they switched it on for the first time, | :13:27. | :13:30. | |
seeing the cogs clicking through all the different possibilities, | :13:30. | :13:33. | |
getting a possible setting in ten minutes what would have taken | :13:33. | :13:39. | |
humans weeks to try to find. To see this machine solve a problem that | :13:39. | :13:44. | |
many regarded as impossible, it must have really fired Alan | :13:44. | :13:48. | |
Turing's imagination. This was built to crack the enigma code, but | :13:48. | :13:51. | |
he must have begun to think about what was the limit of what machines | :13:51. | :14:01. | |
could do? Once the war was over, his ideas about machines that could | :14:01. | :14:10. | |
mimic the human mind really started to take shape. But it was when he | :14:10. | :14:15. | |
asked the question, "Can a machine think?", in a paper in 1950, that the | :14:15. | :14:19. | |
idea finally sparked people's imaginations. For many, it conjured | :14:19. | :14:25. | |
up terrifying visions of the future and sparked furious debate. But it | :14:25. | :14:30. | |
was this idea, above any other, that inspired scientists in labs | :14:30. | :14:38. | |
across the world to try to re-create what is essentially us. The | :14:38. | :14:42. | |
hunt for artificial intelligence has become tied up with one | :14:42. | :14:49. | |
technology, the computer. At the heart of it is the simple idea that | :14:49. | :14:58. | |
humans and computers aren't so very different. The central thesis of AI | :14:58. | :15:04. | |
is that actually all human thought is just computation, that the mind | :15:04. | :15:08. | |
and the brain are just a sophisticated sort of machine. So, | :15:08. | :15:13. | |
if you could make a big enough computer, with the best software, | :15:13. | :15:22. | |
you can make a copy of the human mind. That is why I was so riveted by the man versus | :15:22. | :15:27. | |
machine Jeopardy game. Because at its heart, it was so much more than | :15:27. | :15:32. | |
just a TV showdown. Here was a machine that could grapple with | :15:32. | :15:41. | |
natural language as well as any human, including puns, | :15:41. | :15:48. | |
riddles and complex questions across a broad range of knowledge. | :15:48. | :15:52. | |
Please welcome Watson. At the end of the first round, the | :15:52. | :15:58. | |
supercomputer Watson was neck and neck with human Brad. But as the | :15:58. | :16:03. | |
second round began, the computer started to show what it was really | :16:03. | :16:10. | |
capable of. Watson? Who is... Franz Liszt. Who is the Church Lady. It | :16:10. | :16:16. | |
all fell apart from Ken and me the second day. Watson went on a tear. | :16:16. | :16:23. | |
What is violin? It exceeded expectations. Watson dominated the | :16:23. | :16:30. | |
entire thing. The final blow for humankind was when Watson went on | :16:30. | :16:37. | |
to win a bonus question. Answer... which doubled his score. When | :16:37. | :16:40. | |
Watson hit the daily double that was pretty much the end of the road. | :16:40. | :16:46. | |
You could see the look on Ken's face, we knew, the guys in the team, | :16:46. | :16:53. | |
that "It was over, we've won." Watson's landslide victory made | :16:53. | :16:57. | |
headlines across America. I would have thought technology like this | :16:57. | :17:02. | |
was years away but it is here now. I have the bruised ego to prove it. | :17:02. | :17:08. | |
For days after, the internet was alive with chatter about the demise | :17:08. | :17:12. | |
of humanity. Here was a computer that was displaying uniquely human | :17:12. | :17:18. | |
qualities. This machine could understand all the complexities, | :17:18. | :17:25. | |
puns and riddles much faster than any human. But was this true | :17:25. | :17:35. | |
:17:35. | :17:36. | ||
artificial intelligence? Was Watson really thinking like us? In the 1980s, | :17:36. | :17:39. | |
the American philosopher John Searle challenged the idea that | :17:39. | :17:42. | |
a machine could ever really think. He came up with an interesting | :17:42. | :17:47. | |
thought experiment to try and prove his idea. It is called the Chinese | :17:47. | :17:52. | |
Room. I am going to try and put that into practice to explore his | :17:52. | :18:02. | |
:18:02. | :18:12. | ||
He imagined a non-Chinese speaking person like me, in a bare room, | :18:12. | :18:22. | |
:18:22. | :18:23. | ||
with an instruction manual, and a selection of Chinese characters. He | :18:23. | :18:26. | |
then imagined someone who could speak Chinese, to be outside the | :18:26. | :18:31. | |
room, communicating with me, by a series of messages written using | :18:31. | :18:41. | |
:18:41. | :18:49. | ||
Chinese characters. Can you speak Chinese? The messages are posted to | :18:49. | :18:56. | |
me through a door. And I have to come up with a response, even | :18:56. | :19:02. | |
though I don't understand a word of Chinese. OK I haven't got a clue | :19:02. | :19:08. | |
what this says. But if I follow the instructions in this book, I should | :19:08. | :19:13. | |
be able to give some sort of response to this question, which | :19:13. | :19:17. | |
will make some sort of sense, so let's have a look. What do I have | :19:17. | :19:25. | |
to find? There is a little one over X symbol. The book tells me to find | :19:25. | :19:30. | |
an exact match of the message on the top half of the page. And then | :19:30. | :19:34. | |
to reply by copying the Chinese characters from the bottom half of | :19:34. | :19:39. | |
the page. That is interesting. Some of the characters appear in the | :19:39. | :19:45. | |
answer, which makes some sense. Gosh, I don't know whether these | :19:45. | :19:50. | |
are the right way round. That looks like a little house with a roof on | :19:50. | :20:00. | |
:20:00. | :20:03. | ||
the top. I need this character here. Oh yes. Bingo! Who knows what I | :20:03. | :20:08. | |
have just said! Can you speak Chinese? He compared the person in | :20:08. | :20:13. | |
the room making up the Chinese message, to a computer reading a | :20:13. | :20:23. | |
:20:23. | :20:45. | ||
bit of code. Have you been to China? No, I haven't been to China but | :20:45. | :20:51. | |
would one day like to. I am just faking that I can speak Chinese. So just | :20:51. | :20:55. | |
how convincing was I? Your Chinese was perfect. If we were on | :20:55. | :20:59. | |
line having a chat like that, you would have been convinced you were | :20:59. | :21:03. | |
talking to... I really, really would. I would have thought you were | :21:03. | :21:10. | |
someone who spoke Chinese. The Chinese Room, I think, is an | :21:10. | :21:13. | |
important thought experiment. Not for the answers it offers but more | :21:13. | :21:17. | |
for the questions it raises, because I certainly didn't understand a | :21:17. | :21:20. | |
word of what I was writing down there, yet I seemed to be able to | :21:20. | :21:26. | |
fake the fact that I was speaking Mandarin, so if a computer is just | :21:26. | :21:31. | |
following instructions, it isn't really thinking, is it? But then | :21:31. | :21:36. | |
again, what is my mind doing when I am actually articulating words now? | :21:36. | :21:40. | |
It is following a set of instructions, so it raises the | :21:40. | :21:45. | |
question, where is the threshold? At what point can you say a machine | :21:45. | :21:54. | |
is really understanding and thinking? So, are bigger and faster | :21:54. | :21:58. | |
machines going to bring us closer to this threshold? Since John | :21:59. | :22:03. | |
Searle came up with the Chinese Room experiment, computer power has been | :22:03. | :22:10. | |
doubling every two years. I am off to see one of the most powerful | :22:10. | :22:16. | |
computers that exists in the world today. Before we enter the lab, I need to | :22:16. | :22:22. | |
warn you it will be very noisy. You will need some ear protection. | :22:22. | :22:27. | |
What is the noise? So that is all cold air blowing through the racks, | :22:27. | :22:31. | |
to cool off the computer chips, because they are working so hard. | :22:31. | :22:34. | |
This is the first time I have done | :22:34. | :22:42. | |
an interview with ear plugs in. This beast of a machine called Blue | :22:42. | :22:47. | |
Gene is a phenomenal piece of engineering. It is used to perform | :22:47. | :22:51. | |
the kind of calculations that my brain could never do, and it never | :22:52. | :22:59. | |
has to sleep. Fred Mintzer is the inventor and master of | :23:00. | :23:07. | |
Blue Gene. So this is a first-generation Blue Gene rack. | :23:07. | :23:13. | |
Its computational performance is 5.6 teraflops. A teraflop | :23:13. | :23:19. | |
is a million million operations per second. Faster than I can go. | :23:20. | :23:26. | |
Faster than you can go. How many have you got? We have 16 racks, | :23:26. | :23:32. | |
they all work in parallel. That is 60,000 processors and a performance | :23:32. | :23:41. | |
of 91 teraflops. Blue Gene is solving complex calculations | :23:41. | :23:45. | |
about biological processes. It would take me the rest of my life | :23:45. | :23:51. | |
to do what this computer can do in an hour. But is it really being | :23:51. | :23:59. | |
intelligent? Or is it just doing what it is told? The real | :23:59. | :24:06. | |
intelligence comes from the programmer. It is totally | :24:06. | :24:10. | |
programmed intelligence. Would you say it's a big number cruncher? It | :24:11. | :24:15. | |
is an enormous number cruncher. But is it doing the things our brains | :24:15. | :24:20. | |
could never get anywhere near? Computers are good at computation, | :24:20. | :24:30. | |
:24:30. | :24:34. | ||
brains do much more. It is impressive to see just how powerful | :24:34. | :24:37. | |
they have become. This firepower you have with Blue Gene is | :24:37. | :24:40. | |
something my brain could get nowhere near, but then again, | :24:40. | :24:45. | |
when I am doing maths it is not just number crunching. I am | :24:45. | :24:50. | |
doing more than that. The brain is working, making leaps, a lot of | :24:50. | :24:54. | |
imagination, there is a lot of choices involved, which often | :24:54. | :25:00. | |
involve a kind of aesthetic sensibility. I am not saying a machine | :25:00. | :25:04. | |
like that might not be able to do that, but at the moment it is | :25:05. | :25:08. | |
sheer brute-force crunching. It hasn't captured what my brain can | :25:08. | :25:18. | |
do. These other qualities that are inherent in human | :25:18. | :25:21. | |
intelligence, like creativity and imagination, are what make us so | :25:21. | :25:26. | |
unique. And they are the reason that the quest for artificial | :25:26. | :25:31. | |
intelligence is such an interesting and difficult challenge. There is | :25:31. | :25:34. | |
no reason to think that a computer couldn't one day develop these | :25:34. | :25:40. | |
qualities too. But to do this, scientists have got to get to grips | :25:40. | :25:50. | |
:25:50. | :26:01. | ||
with exactly how we develop them. At the heart of human intelligence | :26:01. | :26:07. | |
lies our ability to learn to do things we have never done before. Which is | :26:07. | :26:14. | |
what has brought me here. And that is not me, by the way. I have joined | :26:14. | :26:20. | |
the circus and my challenge is to walk the tightrope. Not something | :26:21. | :26:26. | |
mathematicians normally do. My instructor for the day is Lila. | :26:26. | :26:33. | |
You are nervous. It is not that. There we go. This is a training wire. | :26:33. | :26:39. | |
So... This isn't the real thing. It is. It is the same as you will feel on | :26:39. | :26:42. | |
the high wire. You don't have the fear factor because you can | :26:42. | :26:49. | |
stand over it safely. On the other one there is a bit of jeopardy. No, it | :26:49. | :26:59. | |
:26:59. | :27:00. | ||
is just... You could get a body double. He is wearing a red | :27:00. | :27:10. | |
:27:10. | :27:11. | ||
shirt! See, no movement. So once upon a time he couldn't do that. He | :27:11. | :27:14. | |
wasn't born being able to do that. That is something he learned. | :27:14. | :27:19. | |
I should have the confidence that I can learn this new skill. You know | :27:19. | :27:25. | |
the younger you learn something... What are you saying? I am saying it | :27:25. | :27:33. | |
is easier. You're saying I am too old? No, not at all. I am going to do | :27:33. | :27:41. | |
this. Lila can't programme me with a set of rules. Unlike a computer, | :27:41. | :27:47. | |
my brain and my body have to master this by themselves. As a | :27:47. | :27:52. | |
mathematician I know there is a formula to describe what is going | :27:52. | :27:56. | |
on here. But I can't calculate my way out of this. I just have to do it. | :27:56. | :28:06. | |
:28:06. | :28:13. | ||
Let your muscles do the thinking. Turn your brain off a bit. I have to | :28:13. | :28:18. | |
do things with my body and I have to think and counteract certain | :28:18. | :28:23. | |
things which I shouldn't do. There is a guy over there, who can just | :28:23. | :28:33. | |
:28:33. | :28:38. | ||
sort of skip and dance across. Hands up. Hands up. Hands up! | :28:38. | :28:42. | |
I am trying to concentrate on the wall over there and keep my hand up, I | :28:42. | :28:46. | |
am trying to keep this leg straight, and maybe I am thinking too much. | :28:46. | :28:50. | |
I think we have to get some of it into muscle memory so you don't | :28:50. | :28:54. | |
think about that, so that it comes naturally. As I flail around on the | :28:54. | :29:00. | |
tightrope, I can see the complicated physics I am asking my | :29:00. | :29:04. | |
brain and body to master with this new skill. It is this idea that the | :29:04. | :29:08. | |
brain and body work together that is taking the hunt for AI in a new direction. | :29:08. | :29:18. | |
:29:18. | :29:31. | ||
I have come to meet a scientist who has built the world's first | :29:31. | :29:41. | |
anthropomimetic robot, which means it mimics the human body. | :29:41. | :29:47. | |
He believes that our physical form actually helps shape the way | :29:47. | :29:51. | |
we think. Without a body, artificial intelligence cannot | :29:51. | :29:57. | |
exist. What we are interested in is how the brain controls the body, and | :29:57. | :30:01. | |
having built the body we have realised what a serious problem | :30:01. | :30:07. | |
that is. Can I interact with it? Would you like to shake hands with it? | :30:07. | :30:15. | |
Yes, provided it doesn't clonk me anywhere sensitive. Give its | :30:15. | :30:25. | |
:30:25. | :30:29. | ||
hand a good squeeze and a shake. It's squeezing back. It's really | :30:29. | :30:33. | |
responding to contact with me, picking up something. It gives | :30:33. | :30:37. | |
people the sensation of interacting with something that is in a way | :30:37. | :30:47. | |
:30:47. | :30:47. | ||
really like themselves. This robot called Ecce Robot has no blood or | :30:47. | :30:55. | |
skin or flesh, but has bones, joints, muscles and tendons that move like | :30:56. | :31:00. | |
ours, so it's able to have human interactions. This is an | :31:00. | :31:03. | |
extraordinary piece of engineering, but how does it help in getting | :31:03. | :31:09. | |
insight into intelligence? We are interested in what is known as | :31:09. | :31:14. | |
embodied intelligence. This is the idea that your intelligence is not | :31:14. | :31:18. | |
an abstract free intelligence, but tied very much to having a body and | :31:18. | :31:22. | |
in particular to having a human body. In order to investigate this, | :31:22. | :31:29. | |
what we had to do was build a robot that had as near to a human | :31:29. | :31:33. | |
body as we can manage and investigate the way it interacts | :31:33. | :31:37. | |
with the world and enables it to develop a particular sort of | :31:37. | :31:41. | |
intelligence for dealing with objects and so on. Once we get on | :31:41. | :31:48. | |
that track we'll be able to see how that intelligence, how that is | :31:48. | :31:52. | |
actually determined and conditioned by its body. You think it will be | :31:52. | :31:58. | |
distinct from the sort of intelligence that - does it know | :31:58. | :32:04. | |
I'm talking about it? The intelligence that, before, they were | :32:04. | :32:07. | |
creating in robots and computers, which doesn't have a sense of a | :32:07. | :32:11. | |
body, it's a box? What happened when people started trying to make | :32:11. | :32:15. | |
robots that were controlled by AI and computers, is that they found | :32:15. | :32:19. | |
that the things that we thought would be difficult like playing | :32:19. | :32:24. | |
draughts or something like that were extremely easy. What was difficult | :32:24. | :32:28. | |
was moving the pieces. The interaction with the world is very, | :32:28. | :32:31. | |
very difficult and requires a lot of computation, but we are | :32:31. | :32:37. | |
completely unaware of this. What about the ultimate goal of things | :32:37. | :32:40. | |
like imagination or consciousness, this would actually become | :32:40. | :32:45. | |
conscious of its own body? I like to think it's the most interesting | :32:45. | :32:48. | |
question of all, but there are lots of problems to overcome before we | :32:48. | :32:52. | |
get to that stage. I happen to believe it will be possible one day | :32:52. | :32:57. | |
to build a machine that will have a kind of consciousness, whether it's | :32:58. | :33:02. | |
the same consciousness as ours, whatever that means, will be an | :33:02. | :33:06. | |
issue. I don't really believe we are going to arrive at | :33:06. | :33:14. | |
consciousness with a disembodied computer. This robot suggests that | :33:14. | :33:17. | |
human intelligence simply cannot be divorced from the human body that | :33:17. | :33:27. | |
created it. The hunt for AI has ended up on an alternative path. | :33:27. | :33:30. | |
It's not about hardware and software, but about trying to | :33:30. | :33:40. | |
:33:40. | :33:41. | ||
replicate human biology. Back at the circus school, I'm still having | :33:41. | :33:44. | |
problems with my own body as I continue to struggle to get across | :33:44. | :33:50. | |
the tightrope. Take that leg off right away. Good. Up, up. Come on, | :33:50. | :33:57. | |
fight it. It's a battle. You think this isn't going to work, but the | :33:57. | :34:02. | |
minute you reach up you are pulling yourself back in line. It doesn't | :34:02. | :34:10. | |
seem logical. It really doesn't. I was thinking that will do that. I | :34:10. | :34:14. | |
feel there are lots of instructions going on here and I've got to try | :34:14. | :34:19. | |
to put them together and maybe I'm - well, I don't know, whether I'm | :34:19. | :34:27. | |
ever going to be able to do this. Is this | :34:27. | :34:33. | |
something instinctive I'll be able to learn, or whatever? Move, move, | :34:33. | :34:36. | |
move. If you are doing maths, you are thinking, thinking, you need to | :34:36. | :34:45. | |
clear that. This feels really alien. But when I take a step back I | :34:45. | :34:49. | |
realise that I've done something very similar to this before. That's | :34:49. | :34:55. | |
when I learnt to walk as a child. Really relax your brain. Hands up. | :34:55. | :35:01. | |
Step. Come on. I need to get mean now. Hands up. What's truly | :35:01. | :35:04. | |
remarkable about our brains is that everything we learn to do from | :35:04. | :35:08. | |
seeing, thinking, walking and talking, becomes automatic with | :35:08. | :35:18. | |
:35:18. | :35:19. | ||
practice. Go, go, go. Don't rush. Hands up. Hands up! We'll try a | :35:19. | :35:26. | |
little bit of military technique. Use a whip now! This ability to | :35:26. | :35:30. | |
learn to do something instinctively, without even thinking about it, is | :35:30. | :35:35. | |
what makes us human. Getting a machine to do this is one hell of a | :35:35. | :35:45. | |
:35:45. | :35:51. | ||
task. Getting a computer to see has become one of the most difficult | :35:51. | :35:59. | |
challenges in AI. Our vision is key to our intelligence. Without it how | :35:59. | :36:09. | |
:36:09. | :36:16. | ||
would we perceive the world around us? For one | :36:16. | :36:20. | |
scientist, the challenge has been to get a computer to be able to | :36:20. | :36:29. | |
monitor and interpret CCTV footage, like a human does. You would think | :36:29. | :36:32. | |
with the power of computers and a mainframe in here scanning all the | :36:32. | :36:36. | |
programmes we can write, you would be able to pick up... We haven't | :36:36. | :36:40. | |
written the programme that can do this sort of thing. It's so | :36:40. | :36:45. | |
difficult for a computer to pick out the needles in the haystacks. | :36:45. | :36:49. | |
Our brains have learnt to control our eye movements effortlessly, | :36:49. | :36:54. | |
taking in the whole scene. We instantly combine this with our | :36:54. | :36:59. | |
previous knowledge of what a car, bus, person or building looks like. | :36:59. | :37:03. | |
In less than a second we can make sense of all the activity that's | :37:03. | :37:09. | |
going on in busy Piccadilly Circus. For a computer to recognise just | :37:09. | :37:14. | |
one of these elements, is almost impossible. James wants to show me | :37:14. | :37:22. | |
why this is. This is a simple programme to demonstrate how | :37:22. | :37:30. | |
difficult vision is. What it does is remove our hardware, our vision | :37:30. | :37:33. | |
processing hardware from the problem, so we have to process each | :37:34. | :37:38. | |
bit of the image piece by piece, rather like a computer has to. | :37:38. | :37:43. | |
The challenge is to try to recognise a few familiar faces, looking through a | :37:43. | :37:48. | |
tiny dot on the screen. Is this somebody I should know? Yep, you | :37:48. | :37:55. | |
should know all of these. OK. I've found the eye. OK. I've got a bit | :37:55. | :38:00. | |
of clothing there. Check shirt. Exactly. You can put all this | :38:00. | :38:04. | |
together, Marcus. Then I should get it? Yeah. I don't know. I give in on | :38:05. | :38:12. | |
that one. Stephen Hawking. Again. Another face. It's weird, I | :38:12. | :38:22. | |
can pick out eyes and nose and mouth. Putting that all together in | :38:22. | :38:27. | |
bits, but it doesn't have enough hair for Einstein. It's relevant | :38:27. | :38:37. | |
for this programme. OK. Alan Turing. We'll have one more. Beyond picking | :38:38. | :38:46. | |
out that this is a face, I think just the little pieces kind of | :38:46. | :38:54. | |
piece by piece - all right, it's me. I don't spend ages looking at me. | :38:54. | :38:59. | |
It really demonstrates how difficult it is. A computer can | :38:59. | :39:04. | |
only interpret one small bit of the image at a time. But we have | :39:04. | :39:06. | |
learnt to piece all the pixels together and see the whole thing | :39:06. | :39:10. | |
with no effort at all. James has been wrestling with the problems | :39:10. | :39:15. | |
for years. He has done something really interesting, I think. He's | :39:16. | :39:20. | |
developed a piece of software that uses ground-breaking technology to | :39:20. | :39:26. | |
recognise individuals in a crowd. He wants to use me as a guinea pig. | :39:26. | :39:33. | |
What we'll do now is take you out on to the streets and track you out | :39:33. | :39:37. | |
and about. This means tracking you as you go within a camera and then | :39:37. | :39:40. | |
when you move on to a different view, we'll track you into the new | :39:40. | :39:44. | |
camera and try to find you there and carry on to see how far we can | :39:44. | :39:54. | |
take you. I'll be passing under the gaze of dozens of CCTV cameras in | :39:54. | :40:00. | |
and around Piccadilly Circus. It's difficult for a | :40:00. | :40:04. | |
computer to pick out, because as I walk through the different camera | :40:04. | :40:08. | |
shots my size and position changes all the time. Whereas we can | :40:08. | :40:16. | |
separate what is me and what isn't without even thinking about it. | :40:16. | :40:20. | |
James' computer system gives each person a sequence of numbers and | :40:20. | :40:22. | |
then attempts to recognise individuals by matching their | :40:22. | :40:28. | |
number between all the different camera views. On the right-hand | :40:28. | :40:31. | |
side of the screen we can see where the computer has successfully | :40:31. | :40:36. | |
picked me out of the crowd on my journey around central London. In | :40:36. | :40:40. | |
the end, I was tracked across five views as I walked a quarter of a | :40:40. | :40:50. | |
mile. James is making real progress in getting a computer to recognise | :40:51. | :40:58. | |
what it's looking at. But there's a deeper question at stake here. | :40:58. | :41:03. | |
It's not just about | :41:03. | :41:07. | |
naming objects, but understanding what we see. It's something that humans working | :41:07. | :41:12. | |
here don't have a problem with. I don't think we'll get rid of the | :41:13. | :41:17. | |
human intervention. On a Friday or Saturday night, we see lots of | :41:17. | :41:21. | |
people, the ones who have had a drink will mess about and play | :41:21. | :41:24. | |
fight. I don't think a computer would be able to tell the | :41:24. | :41:26. | |
difference between a real and play fight, because we can look at the | :41:26. | :41:31. | |
expressions on faces and see if they're smiling or if they're angry, | :41:31. | :41:35. | |
so I don't think we're at that stage. The chaos of the urban | :41:35. | :41:41. | |
jungle. Exactly. The human eye is still better. We are programmed to | :41:41. | :41:46. | |
pick out something unusual. That's weird. What have you found there? | :41:46. | :41:53. | |
A pair of shoes on a seat. I was curious as to why they are there. | :41:53. | :41:56. | |
It's kind of strange. I've looked around to see who they might belong | :41:56. | :42:00. | |
to and why they might be there, a couple of people have sat down next | :42:00. | :42:05. | |
to them and walked away, so you know they're not theirs, but I | :42:05. | :42:09. | |
don't think a computer can be programmed with that level of | :42:09. | :42:16. | |
curiosity, and that's a natural human emotion. This desire to | :42:16. | :42:19. | |
understand meaning is absolutely at the forefront of artificial | :42:19. | :42:25. | |
intelligence right now. Our brains do it by learning through | :42:25. | :42:35. | |
experience, and it's become a really | :42:39. | :42:43. | |
complex process. I'm being reminded | :42:44. | :42:47. | |
today what a great human thing it is to be able to learn something | :42:48. | :42:55. | |
through experience. You would learn this pretty quick if you thought a | :42:55. | :43:01. | |
crocodile was under you. Trying to get across a wire encapsulates all | :43:01. | :43:06. | |
the problems involved in getting a machine to do a new task. Through | :43:06. | :43:09. | |
the process of learning by experience we are able to combine | :43:09. | :43:16. | |
all the different abilities that are needed. Like body control, | :43:16. | :43:22. | |
reasoning and instinct. I've tried everything to get across this wire. | :43:22. | :43:30. | |
Nothing seems to be working. But then Leila comes up with one little | :43:30. | :43:37. | |
thing. Try a little song. It relaxes you, any little tune. I'm | :43:37. | :43:47. | |
chatting to you now to try to relax you a bit. La, la, la, la. Keep | :43:53. | :43:59. | |
coming, don't stop. Keep the leg down. Keep coming. Keep it small. | :43:59. | :44:08. | |
Hands up. Hands up. Come on! That was so easy. You were there. | :44:08. | :44:13. | |
So close, and then my hand went down. What you just did there, right to | :44:13. | :44:17. | |
that point, looked really easy. I thought you had cracked this. I | :44:17. | :44:20. | |
don't know, sometimes the last, the anticipation, it's the last step or | :44:20. | :44:25. | |
something. This is the last bit. You've got it, so keep that | :44:25. | :44:29. | |
momentum going. We can't fully explain how our brains learn in | :44:29. | :44:39. | |
this way. I don't know why the humming is helping, but it is. | :44:40. | :44:49. | |
Don't rush. Oh, yes, I'm rushing. Mm, mm. Hand up. Leg off, hand up. Hand up. | :45:09. | :45:19. | |
Little steps. Hands up. Hand up. Hand up. Yes! It wasn't elegant. | :45:21. | :45:25. | |
It was pretty elegant. Perfect. Well done. That is an | :45:25. | :45:29. | |
amazing feeling to be able to get from one end of the tightwire to | :45:29. | :45:32. | |
the other. But that is the amazing thing about human intelligence, | :45:32. | :45:38. | |
just how adaptable it is, how I can go from nothing to learning a new | :45:38. | :45:41. | |
skill like doing the tightwire. If you think about a computer, you | :45:41. | :45:44. | |
would have to programme it specially to be able to do such a task, then | :45:44. | :45:50. | |
reprogramme it for something different and the first skill would be lost again. Perhaps we can try | :45:50. | :46:00. | |
the trapeze now? Yes, sure, great! One other human quality we like to | :46:01. | :46:09. | |
celebrate, and often link to intelligence, is creativity. I have | :46:09. | :46:18. | |
come to Paris, where the streets are steeped in artistic culture. I | :46:18. | :46:25. | |
am here to visit an unusual artist's studio. Run by computer | :46:25. | :46:28. | |
scientist Simon Colton, who is normally based at Imperial College | :46:29. | :46:37. | |
London. All these images have been created not by him but by | :46:37. | :46:42. | |
a computer. Because Simon wants to find out if it is possible to break | :46:42. | :46:48. | |
down the creative process into a set of instructions. That is why I | :46:48. | :46:51. | |
started simulating paint strokes. And this is a good example of that. | :46:51. | :46:55. | |
As well as programming it with painting techniques, Simon gives | :46:55. | :47:00. | |
the computer rules about different artistic styles. We can choose a | :47:00. | :47:04. | |
painting style according to the emotion of the sitter. So it is | :47:05. | :47:10. | |
kind of feeding off its sitter and trying to tap in and appreciate its | :47:10. | :47:15. | |
mood. His or her mood, and express that through the art. I can see | :47:15. | :47:21. | |
what a difficult task this is. But there is one image that has really | :47:21. | :47:26. | |
caught my eye. I love this one. Lots of comments from people. | :47:26. | :47:31. | |
These are dancers? That is right. But these people don't really exist. | :47:31. | :47:38. | |
This isn't a photo, or a video it has looked at. It is taking some | :47:38. | :47:42. | |
concept of people fed into it. When you watch it being painted it | :47:42. | :47:46. | |
paints it all in one stroke. This is a single line. Is that is what | :47:46. | :47:50. | |
is giving it this sense of energy and motion? That is right. Yes. | :47:50. | :47:54. | |
That is a classic example of where you put in a rule and you get | :47:54. | :47:58. | |
more out of it than you put in. I had no idea that I would get this | :47:58. | :48:05. | |
kind of effect. I had no idea it would give such a dynamism. I say | :48:05. | :48:08. | |
that real creativity makes you think more, so the software we are | :48:08. | :48:12. | |
developing, that will make us think more not less. Just like every | :48:12. | :48:18. | |
artist or musician or poet wants us to think more. So, I see a | :48:18. | :48:21. | |
different future for AI, which is one where software is out there to | :48:21. | :48:28. | |
challenge you. So if these creations are meant to challenge us, | :48:28. | :48:36. | |
what do the city's artists make of them? There is something sexual in | :48:36. | :48:45. | |
this artwork. Interesting. Like Bacon, like monkey. You are seeing | :48:45. | :48:52. | |
Bacon and monkey? That is interesting. Is it a man or machine | :48:52. | :48:57. | |
who does the paintings? Why do you think that? Because it is done | :48:57. | :49:02. | |
automatically. How can you tell? You are right, but I am interested. | :49:02. | :49:09. | |
You can see it. For me I am not an artist, so I would be hard pushed. | :49:09. | :49:13. | |
What gives it away? There is no sensibility. Really? You can pick | :49:13. | :49:18. | |
that up. That is interesting. I quite like the image, or some of | :49:18. | :49:23. | |
them anyway. It is clear that trying to mimic something as | :49:23. | :49:28. | |
elusive as creativity remains a huge challenge. But | :49:28. | :49:38. | |
scientists are starting to put down the first important building blocks. | :49:39. | :49:42. | |
100 years after the birth of the visionary mathematician Alan Turing, | :49:42. | :49:49. | |
we are discovering the hunt for AI is a more difficult problem than he | :49:49. | :49:55. | |
or we ever imagined. Machines have surpassed us at specific tasks, but | :49:55. | :49:59. | |
the human brain sets the bar for artificial intelligence extremely | :49:59. | :50:06. | |
high, in its ability to constantly learn new skills. Hardware and | :50:06. | :50:16. | |
software are struggling to mimic our biology. But maybe there is | :50:18. | :50:25. | |
another way to create a thinking machine. That can capture the | :50:25. | :50:29. | |
versatility of the human brain. I have come to Berlin, for the | :50:29. | :50:35. | |
highlight of my journey. To meet a scientist who believes that for | :50:35. | :50:39. | |
something to be truly intelligent, it needs to develop and evolve like | :50:39. | :50:45. | |
we did at the very beginning of our lives and as a species. If you want | :50:45. | :50:49. | |
to understand how to create or understand intelligence in some way, | :50:49. | :50:55. | |
you have to capture this evolutionary adaptive aspect of it. | :50:55. | :50:59. | |
So we cannot programme intelligence? We cannot hope to | :50:59. | :51:03. | |
just give it the purpose and do the learning, as a lot of people think | :51:03. | :51:11. | |
today; we have to understand this continuous adaptive | :51:11. | :51:17. | |
nature, this flexibility, also this motivation to continue to learn, to | :51:17. | :51:25. | |
be excited about new things, to seek out new things. Luc is taking me to | :51:25. | :51:30. | |
a lab on the outskirts of the city. To show me some robots that are | :51:30. | :51:40. | |
starting to do something remarkable. Three of them. They are much | :51:40. | :51:45. | |
larger than I thought they were going to be. These robots | :51:45. | :51:55. | |
have been set up to develop. They are beginning to understand | :51:58. | :52:03. | |
how their bodies work, by looking at a reflection of themselves. You | :52:03. | :52:08. | |
have a mirror, so what is the.... Well, the experiment is that | :52:08. | :52:12. | |
the robot should learn something about its own body, because in | :52:12. | :52:18. | |
order to move in the world, in order to control it, in order to | :52:18. | :52:22. | |
also recognise the movements of another, you need to have some sort | :52:22. | :52:26. | |
of model of your own body. Right. The way that the model is going to | :52:26. | :52:33. | |
be built up is that the robot is doing actions, and watching itself. | :52:33. | :52:39. | |
Performing the actions. So it gets a relation between the visual image | :52:39. | :52:44. | |
and the movement of the motors. So here you see it looking at | :52:44. | :52:50. | |
its hand. You also see very much how it was trying to keep balance. | :52:50. | :52:55. | |
It is very impressive. How all the motor commands are in an early | :52:55. | :53:04. | |
phase, right. Well caught! It has just woken up. I know that feeling. | :53:04. | :53:11. | |
I think this was a beautiful example of feedback, and of finding | :53:11. | :53:15. | |
balance. It is extraordinary. It really does look like it is | :53:15. | :53:23. | |
encountering itself for the first time. But what is even more | :53:23. | :53:28. | |
remarkable about Luc's robots, is that once they are able to | :53:28. | :53:33. | |
recognise themselves, they start to evolve their own language and | :53:33. | :53:40. | |
communicate with each other. So what will happen now? Well, one of | :53:40. | :53:45. | |
them is going to speak, then he is going to ask the other one to do | :53:45. | :53:48. | |
an action. He is going to invent a word, because he doesn't yet have | :53:48. | :53:54. | |
the word to name that action. OK, so then he says the word and | :53:54. | :53:58. | |
this one isn't sure what it could mean, this is a brand-new | :53:58. | :54:05. | |
word, so he will make a guess, and if the guess is OK, totally by luck, | :54:05. | :54:08. | |
well they both know this word and they know for the future they can | :54:08. | :54:12. | |
use this word to communicate with each other. What if he gets it | :54:12. | :54:17. | |
wrong? If it gets it wrong this one will say this is not what I had in | :54:17. | :54:24. | |
mind. He will show it. OK. I mean I don't know which one is going to | :54:24. | :54:33. | |
speak first. OK, he is speaking first. He is doing the action. | :54:33. | :54:40. | |
This is fantastic. OK. Notice how he looked. So he is... He is doing the | :54:40. | :54:43. | |
real action. So now there is another interaction going to happen | :54:43. | :54:50. | |
again. I don't know which one is going to speak. OK. Is that the | :54:50. | :54:57. | |
word it just learned? Kimato? He is doing it. He will say yes, | :54:57. | :55:03. | |
presumably. Yes. So would they interact with me, to learn this | :55:03. | :55:10. | |
language? Yes, we could try it. Do you remember any one of the | :55:10. | :55:19. | |
words? We will see. It is scary. Gosh, it has asked me to do | :55:19. | :55:24. | |
something. Kimato. What was that? Was it that? Have I got it right? | :55:24. | :55:34. | |
OK, you tell me what Kimato is. OK, that is right. All right. Tomima. | :55:36. | :55:41. | |
Tomima. I think that was lifting my right arm. Did I get it right? Yes. | :55:41. | :55:49. | |
I am a learning robot! What would you say to people who would suggest | :55:49. | :55:53. | |
that, well, this looks simple. They are doing a few actions, swapping | :55:53. | :55:57. | |
words, what is the big deal? The first thing I would say to people is, | :55:57. | :56:03. | |
just try it yourself, OK, just try to build a robot that stands up. | :56:03. | :56:08. | |
And then you will already see it's extraordinarily complicated. How fast | :56:08. | :56:12. | |
do you think the development of the intelligence of these | :56:12. | :56:16. | |
robots will be? They are beginning to create a new language between | :56:16. | :56:20. | |
themselves, but how long will it be before they are, I don't know, | :56:20. | :56:25. | |
creating great works of art, the sort of things that we can do? | :56:25. | :56:30. | |
You know, if we have these two robots and we leave them alone, we come back | :56:30. | :56:36. | |
at the end of the day and they will have developed a vocabulary | :56:36. | :56:39. | |
for co-ordinating their action, once they start talking about | :56:39. | :56:44. | |
shapes and colours and we don't know when they say Bobita do they | :56:44. | :56:48. | |
mean a colour or a position in space. You will be going through | :56:49. | :56:53. | |
the same process that the robot did to acquire that language, I mean... | :56:53. | :57:01. | |
Yeah. The exciting thing about the robots, is they have their own | :57:01. | :57:05. | |
evolutionary journey to go on. Tomima. Which could happen much | :57:05. | :57:11. | |
quicker than ours. And they open up the possibility for intelligences | :57:11. | :57:18. | |
very different from our own to emerge. That was a striking example | :57:18. | :57:22. | |
of emerging intelligence, something which wasn't | :57:22. | :57:25. | |
preprogrammed. Those robots really started with a pretty blank sheet, | :57:25. | :57:30. | |
and by the end of the day just talking to each other, they had | :57:30. | :57:35. | |
evolved their own language. A language I couldn't really understand. | :57:35. | :57:39. | |
Turing asked, can machines think? I think he would be excited by what I | :57:39. | :57:49. | |
have found. I have discovered that computers | :57:50. | :57:54. | |
have far surpassed humans in many fields but are unlikely to put us | :57:54. | :58:00. | |
out of business yet. Because these super machines haven't come close | :58:00. | :58:07. | |
to capturing the true nature of our intelligence. But I also learned | :58:07. | :58:11. | |
how closely our intelligence is linked to our biology. And that | :58:11. | :58:15. | |
perhaps the secret to creating thinking machines lies in letting them | :58:15. | :58:25. | |
learn and evolve for themselves. | :58:25. | :58:33. | |
But the one thought that has been creeping up on me throughout all of | :58:33. | :58:39. | |
this, is that maybe we have become obsessed with our own abilities. | :58:39. | :58:44. | |
And are blinded by being human. It does raise the question about | :58:44. | :58:48. | |
whether we should be bothering to try and recreate what is us, when | :58:48. | :58:54. | |
machines can do something so different to what human beings can. | :58:54. | :58:59. |