0:00:02 > 0:00:03NARRATOR: Robots...
0:00:03 > 0:00:05We're on the verge of science fiction becoming reality.
0:00:05 > 0:00:07ROBOT:
0:00:07 > 0:00:09They look like us.
0:00:10 > 0:00:12We trust them with our lives.
0:00:12 > 0:00:15And they're starting to walk amongst us...
0:00:16 > 0:00:19..carrying their robot brains out of the lab
0:00:19 > 0:00:22and into the big, bad world.
0:00:24 > 0:00:26But just how useful are they?
0:00:28 > 0:00:31Not in some distant future, but right now.
0:00:32 > 0:00:35Where can they make our lives easier?
0:00:39 > 0:00:41And where do they still need work?
0:00:41 > 0:00:42Oh, my God!
0:00:42 > 0:00:43What the hell?
0:00:45 > 0:00:46It could strangle me.
0:00:47 > 0:00:49In a unique experiment...
0:00:49 > 0:00:50Robot delivery.
0:00:50 > 0:00:52..six ordinary British families...
0:00:52 > 0:00:53Argh!
0:00:53 > 0:00:55..all with specific needs...
0:00:57 > 0:01:01..will welcome six very different robots into their homes.
0:01:01 > 0:01:03- ROBOT:- Hello, family!
0:01:05 > 0:01:08- Take me to your leader. - Please.
0:01:08 > 0:01:10From trying to get them fit...
0:01:12 > 0:01:14..to helping care for the sick.
0:01:16 > 0:01:17Wow!
0:01:17 > 0:01:20..enabling them to make sense of the world...
0:01:23 > 0:01:25..and becoming indispensable in the workplace.
0:01:27 > 0:01:28Thank you!
0:01:28 > 0:01:30- But in the end... - Grrr!
0:01:30 > 0:01:32..will their families want to keep them...
0:01:32 > 0:01:34Just a really exciting moment.
0:01:34 > 0:01:36..or switch them off?
0:01:36 > 0:01:37You're in my way.
0:01:38 > 0:01:39Sorry, robot!
0:01:49 > 0:01:51In a one-off experiment,
0:01:51 > 0:01:54six robots have been sent to live with families around the UK.
0:01:57 > 0:01:59I'm quite nervous whether it will like me.
0:02:01 > 0:02:04Each robot has been programmed by a leading British university.
0:02:05 > 0:02:07OK, let's see what we've got.
0:02:07 > 0:02:11And observing them are psychologist Dr Caroline Jay
0:02:11 > 0:02:14and robotics professor Jonathan Rossiter.
0:02:15 > 0:02:17If you take the revolutions that we've had
0:02:17 > 0:02:21from the Agricultural Revolution to the Industrial Revolution
0:02:21 > 0:02:23to the IT revolution,
0:02:23 > 0:02:25robots really are the next thing.
0:02:25 > 0:02:29They're watching to see how the robots get on outside the lab
0:02:29 > 0:02:31and if humans form lasting bonds with them.
0:02:33 > 0:02:36We develop technology to help us.
0:02:36 > 0:02:39As time goes on, we'll learn what it does well
0:02:39 > 0:02:41and we'll learn how to work with it
0:02:41 > 0:02:45so that we can ensure it always works for human benefits.
0:02:49 > 0:02:53In Edinburgh, ShopBot is arriving for its first day at work.
0:02:54 > 0:02:56He's just superb.
0:02:57 > 0:02:58It's possessed!
0:03:00 > 0:03:02400 miles south, in Suffolk,
0:03:02 > 0:03:05Neil Bowles is weighing up whether to leave his wife Linda
0:03:05 > 0:03:08home alone with assistance robot CareBot.
0:03:09 > 0:03:11Take it to Linda, that's her lunch.
0:03:11 > 0:03:15- He is not able to physically help with caring.- Absolutely not.
0:03:17 > 0:03:19While on the South Coast,
0:03:19 > 0:03:21the Rocket family are building up to their final weigh-in
0:03:21 > 0:03:24under the watchful eye of FitBot.
0:03:29 > 0:03:32- Hi, Ben.- Hi.
0:03:32 > 0:03:34- Good to meet you again. - Very nice to see you.
0:03:34 > 0:03:37But we start in Ruislip, in West London,
0:03:37 > 0:03:40where the Docherty family has a delivery.
0:03:40 > 0:03:42Ethan, Kaspar's here.
0:03:42 > 0:03:44Four-year-old Ethan is autistic.
0:03:44 > 0:03:45Ready?
0:03:45 > 0:03:46Steady, go!
0:03:46 > 0:03:50And he's about to meet a robot specially designed to help him.
0:03:50 > 0:03:52- What's that?- It's a robot.
0:03:52 > 0:03:54Wooo...!
0:03:54 > 0:03:58This is Kaspar, the social robot.
0:03:58 > 0:04:01An interactive child-sized humanoid.
0:04:01 > 0:04:03Operated by a remote control,
0:04:03 > 0:04:08it's pre-programmed to speak, sing and perform simple movements.
0:04:14 > 0:04:16Shall we do a peek-a-boo? Shall we say boo?
0:04:16 > 0:04:17You say boo.
0:04:17 > 0:04:18Boo!
0:04:18 > 0:04:20LAUGHTER
0:04:20 > 0:04:24Although Kaspar's come to play, this robot is not a toy,
0:04:24 > 0:04:27but an aid to unlock social skills in autistic children.
0:04:28 > 0:04:31- Say something else. - KASPAR:- It tickles me.
0:04:33 > 0:04:35- ETHAN:- It's Thomas!
0:04:35 > 0:04:39Ethan, like many children on the autistic spectrum,
0:04:39 > 0:04:42struggles to understand the world or relate to others.
0:04:42 > 0:04:45Sometimes, Ethan will engage with you when you play with him.
0:04:45 > 0:04:47But other times, you tend to be a bit of a spectator,
0:04:47 > 0:04:49especially when he's got his trains.
0:04:53 > 0:04:57Ethan finds the comprehension of words difficult,
0:04:57 > 0:05:01social interaction and two-way conversation difficult
0:05:01 > 0:05:03and Ethan finds new experiences
0:05:03 > 0:05:05very difficult to deal with and comprehend.
0:05:06 > 0:05:08Oh, no, help me!
0:05:09 > 0:05:11- There you go. - No! That's not on!
0:05:12 > 0:05:17Ethan's inability to communicate his feelings can lead to frustration
0:05:17 > 0:05:20and managing his behaviour is a daily battle.
0:05:20 > 0:05:22Oh, there's too many trains, isn't there?
0:05:22 > 0:05:23ETHAN SCREAMS
0:05:23 > 0:05:25- OK.- Ethan...
0:05:25 > 0:05:28Ethan's emotions can very much be like a rollercoaster.
0:05:28 > 0:05:31You know, if he gets into a bad ride,
0:05:31 > 0:05:33it can just carry on and carry on.
0:05:34 > 0:05:37We have to be really careful and it's really difficult
0:05:37 > 0:05:39that you don't lose your temper with him over the small stuff.
0:05:40 > 0:05:44Ethan was diagnosed just over a year ago.
0:05:44 > 0:05:46The first time they used the word autism,
0:05:46 > 0:05:48it was a complete shock to us.
0:05:48 > 0:05:49You definitely go to the extreme.
0:05:49 > 0:05:52Does this mean Ethan's going to, from day one,
0:05:52 > 0:05:54always going to be behind everyone else?
0:05:54 > 0:05:56Ethan... We're just going to take Thomas and then...
0:05:56 > 0:05:59ETHAN SCREAMS
0:05:59 > 0:06:01Is my son going to be able to get a job?
0:06:01 > 0:06:03Is he going to have a life and a relationship
0:06:03 > 0:06:05with another human being?
0:06:05 > 0:06:06Come on.
0:06:09 > 0:06:12I want him to make a sad face.
0:06:12 > 0:06:14A sad face? You want him to make a sad face?
0:06:14 > 0:06:15You want to make him sad?
0:06:18 > 0:06:21While Kaspar's expressionless appearance may be startling,
0:06:21 > 0:06:23the simplicity of his face
0:06:23 > 0:06:26makes it easier for children on the autistic spectrum
0:06:26 > 0:06:28to practise everyday interaction.
0:06:30 > 0:06:32Oh!
0:06:32 > 0:06:34Human faces have 42 muscles,
0:06:34 > 0:06:39generating a potential 16,000 different facial configurations.
0:06:39 > 0:06:42A non-stop barrage of information
0:06:42 > 0:06:44that can overwhelm an autistic child.
0:06:45 > 0:06:47So you place this one. Come on, this one, the spoon.
0:06:47 > 0:06:49You see this picture of the spoon?
0:06:49 > 0:06:53Kaspar purposely forms only basic expressions,
0:06:53 > 0:06:57simple sentences and rudimentary physical movements...
0:06:57 > 0:06:59- Moving to the side. - Hi, Kaspar.
0:06:59 > 0:07:02..with the goal of giving users the time, space and repetition
0:07:02 > 0:07:05to practise basic interaction.
0:07:06 > 0:07:08Kaspar, do you like broccoli?
0:07:08 > 0:07:10Let's see what he says. Let's listen to Kaspar.
0:07:11 > 0:07:13He said broccoli was...
0:07:13 > 0:07:15- I want disgusting. - Yeah.
0:07:15 > 0:07:18To help hammer these teachings home,
0:07:18 > 0:07:22Ethan can choose Kaspar's expressions and reactions himself
0:07:22 > 0:07:23using the keypad,
0:07:23 > 0:07:26allowing him to control both sides of the interaction.
0:07:27 > 0:07:29Wait, wait, which one do you want? Which one?
0:07:29 > 0:07:30- Happy.- Happy?
0:07:30 > 0:07:32You press the happy face.
0:07:32 > 0:07:34Happy face.
0:07:34 > 0:07:35Yeah!
0:07:36 > 0:07:38So this is really amazing.
0:07:40 > 0:07:41It might seem that a robot
0:07:41 > 0:07:45is not an obvious choice in this kind of situation,
0:07:45 > 0:07:48where somebody's having difficulties communicating with humans.
0:07:50 > 0:07:53On the other hand, if we look at it from the perspective
0:07:53 > 0:07:57that, actually, it's the human communication that's the issue,
0:07:57 > 0:08:00if we're able to step back from that and simplify things
0:08:00 > 0:08:04in a way that Ethan can understand, that could really help.
0:08:06 > 0:08:08Ethan gets to control what's going on
0:08:08 > 0:08:10and he gets to programme something.
0:08:10 > 0:08:15It's this whole new insight into how things actually work.
0:08:15 > 0:08:17And it's not just about how robots work,
0:08:17 > 0:08:21it is able to teach Ethan something about how people work, as well.
0:08:22 > 0:08:24Do you want him to sing a song?
0:08:24 > 0:08:26Yes.
0:08:26 > 0:08:27Music.
0:08:27 > 0:08:29Yes, music. You press this one.
0:08:29 > 0:08:31This one as well.
0:08:31 > 0:08:34KASPAR: # If you're happy and you know it, clap your hands...
0:08:34 > 0:08:35Can you clap your hands?
0:08:35 > 0:08:39# If you're happy and you know it clap your hands
0:08:40 > 0:08:42# If you're happy and you know it...
0:08:42 > 0:08:44- Oh... - Are you going to sit with Mummy?
0:08:44 > 0:08:45It was too much for him. OK...
0:08:45 > 0:08:48# With a quack, quack here and a quack, quack there
0:08:48 > 0:08:51# Here a quack, there a quack, everywhere a quack, quack... #
0:08:51 > 0:08:53Ethan...
0:08:53 > 0:08:54After a promising start,
0:08:54 > 0:08:58Ethan makes it clear that, for now, he's had enough of Kaspar.
0:08:58 > 0:08:59No!
0:08:59 > 0:09:01# If you're happy and you know it and you really want to show it
0:09:01 > 0:09:02Ethan...
0:09:02 > 0:09:04# If you're happy and you know it... #
0:09:04 > 0:09:06I can't do it.
0:09:06 > 0:09:07Oh, no.
0:09:08 > 0:09:11Shall we see if Kaspar wants some dinner?
0:09:11 > 0:09:12No.
0:09:12 > 0:09:15- Because then Kaspar needs to go to bed after that.- No.
0:09:15 > 0:09:16No?
0:09:16 > 0:09:17I think Ethan's quite tired.
0:09:17 > 0:09:19I think it's been a long day. So, erm...
0:09:19 > 0:09:22Worst-case scenario is that Ethan gets bored with it
0:09:22 > 0:09:23and there's no engagement.
0:09:23 > 0:09:25Because if there's no engagement,
0:09:25 > 0:09:29then he's not going to interact, he's not going to play with it.
0:09:29 > 0:09:31- Shall we do some singing in a moment?- No.
0:09:31 > 0:09:34- Can we dance?- No.- Oh...
0:09:36 > 0:09:39While some families are just beginning their experiment...
0:09:42 > 0:09:43Launch exercise.
0:09:43 > 0:09:46..others are five weeks in.
0:09:52 > 0:09:54The Rocket family in Plymouth
0:09:54 > 0:09:57have been trying to get fit and eat better...
0:09:57 > 0:09:59Did you throw your tea towel at it?
0:09:59 > 0:10:01..under the guidance of FitBot...
0:10:04 > 0:10:07..a humanoid with speech, facial and emotional recognition.
0:10:09 > 0:10:11And there's been a development.
0:10:22 > 0:10:25Mum Jackie has changed her approach to feeding the family.
0:10:25 > 0:10:28Because the FitBot was here,
0:10:28 > 0:10:32so I was more aware that we'd have Big Brother watching us, as such,
0:10:32 > 0:10:35it just made me more aware of everything, really.
0:10:41 > 0:10:43She used to cook for, like, the whole neighbourhood, it would seem,
0:10:43 > 0:10:45because she would cook so much.
0:10:45 > 0:10:49Now she's just cooking for us, so it's just the right amount.
0:10:55 > 0:10:58The biggest change
0:10:58 > 0:11:03has been the number of sort of snacks that we did have in.
0:11:03 > 0:11:05There's a lot fewer of those snacks in the house.
0:11:09 > 0:11:12Less effective has been FitBot's ability to help dad Matt
0:11:12 > 0:11:15and daughter Cat work out away from the house.
0:11:17 > 0:11:19Can we do some exercise today, FitBot?
0:11:23 > 0:11:25OK, can we do any exercise, then, FitBot?
0:11:28 > 0:11:30No. She's all yellow.
0:11:30 > 0:11:32Erm...
0:11:32 > 0:11:34We might just have to do it without FitBot.
0:11:34 > 0:11:36Because we know what you've got to do, don't you?
0:11:36 > 0:11:38Yeah, we can still time ourselves. We're here.
0:11:38 > 0:11:40We might as well do some exercise anyway.
0:11:40 > 0:11:44Just because FitBot can't help us, we can still do our circuit.
0:11:44 > 0:11:45All right, you ready?
0:11:45 > 0:11:47Come on, then, nice and gently.
0:11:47 > 0:11:49Inspired by the upcoming weigh-in,
0:11:49 > 0:11:52Matt draws on FitBot's previous routines,
0:11:52 > 0:11:54plus his own military experience.
0:11:54 > 0:11:57Three, two, one...go!
0:11:58 > 0:12:00You have to think on your feet,
0:12:00 > 0:12:03because key to being a military fitness instructor...
0:12:03 > 0:12:05What, are you ringing bells?
0:12:05 > 0:12:08You look like you've got a couple of castanets in your hands.
0:12:08 > 0:12:13If you're either making a circuit up or if you're doing a competition,
0:12:13 > 0:12:14nine times out of ten,
0:12:14 > 0:12:16you know there's a chance that something's going to go wrong.
0:12:16 > 0:12:18That must be killing you, this.
0:12:20 > 0:12:22One unexpected benefit
0:12:22 > 0:12:25is the quality time spent with his daughter.
0:12:25 > 0:12:28The times that I can spend with Cat are always very precious.
0:12:28 > 0:12:30They grow so quickly and so fast
0:12:30 > 0:12:33that it's very, very precious to try and get as much done as we can
0:12:33 > 0:12:35and just do something together.
0:12:35 > 0:12:36Come on, last few.
0:12:37 > 0:12:38And stop there.
0:12:38 > 0:12:42Soon, she's going to be 18, 19, her own lady,
0:12:42 > 0:12:45and once she reaches late teens, she's going to disappear
0:12:45 > 0:12:47and, you know, then that time I'm never going to get back.
0:12:49 > 0:12:51HE LAUGHS
0:12:52 > 0:12:54You're going to go flying!
0:12:54 > 0:12:55Mwah!
0:12:58 > 0:13:00But not everyone feels the same about finding out
0:13:00 > 0:13:03if FitBot has impacted their lives for real.
0:13:04 > 0:13:06I'm not looking forward to the weigh-in.
0:13:06 > 0:13:09I'll make sure I take off my watch...
0:13:10 > 0:13:12..and my earrings. Oh, no, I haven't got any.
0:13:18 > 0:13:20Use serving tray.
0:13:23 > 0:13:25There's a good boy, aren't you? You're a good boy.
0:13:25 > 0:13:28He's a good boy. Yes, he is!
0:13:28 > 0:13:32In Felixstowe, Linda and Neil Bowles have been sharing their home
0:13:32 > 0:13:34with CareBot for three days.
0:13:35 > 0:13:39You tend to talk to it like a pet, don't you, really? Or a child.
0:13:39 > 0:13:42You get affectionate towards it, don't you?
0:13:42 > 0:13:44Strange, isn't it? Attached to a machine...
0:13:44 > 0:13:46How weird is that?
0:13:47 > 0:13:50So CareBot's got more of a personality, actually,
0:13:50 > 0:13:52than a lot of robots we've seen.
0:13:54 > 0:13:57If you think about the relationship we have with our pets,
0:13:57 > 0:14:01we often confer emotions on to them that may not really be there.
0:14:01 > 0:14:03A similar thing happens with robots.
0:14:03 > 0:14:06Even though we're aware, at some level,
0:14:06 > 0:14:08that the robot isn't really a person,
0:14:08 > 0:14:10when it behaves like a person,
0:14:10 > 0:14:13we're willing to take that leap
0:14:13 > 0:14:16and respond to it as if it's another sentient being.
0:14:16 > 0:14:19CareBot may have won over the Bowles
0:14:19 > 0:14:21with its conversational skills,
0:14:21 > 0:14:23but it offers a number of other functions
0:14:23 > 0:14:25as a robot for the disabled.
0:14:26 > 0:14:29It can recognise and track human voices,
0:14:29 > 0:14:31offer medication reminders
0:14:31 > 0:14:34and potentially spot the signs leading up to an emergency.
0:14:36 > 0:14:38Tell me a joke.
0:14:44 > 0:14:46THEY LAUGH
0:14:46 > 0:14:48His jokes are as bad as mine!
0:14:49 > 0:14:51The Bowles agreed to trial CareBot,
0:14:51 > 0:14:54as Linda suffers from MS.
0:14:55 > 0:14:58For ten years, she's been confined to a wheelchair
0:14:58 > 0:15:02and 100% dependent on husband Neil for all her care.
0:15:03 > 0:15:07We love each other to pieces, we're a couple first and foremost.
0:15:07 > 0:15:10But, yes, we can get down.
0:15:10 > 0:15:14Yeah, I can get quite depressed and quite sad at times.
0:15:14 > 0:15:17There's probably not a carer who doesn't from time to time.
0:15:18 > 0:15:22But sometimes you try and make light of it,
0:15:22 > 0:15:24even though you're feeling rotten inside.
0:15:25 > 0:15:29If Neil gets time alone, he heads to his man cave,
0:15:29 > 0:15:33situated in the garden a few metres from the house.
0:15:33 > 0:15:38I try to use the gym, or my gym, about four times a week, if I can.
0:15:40 > 0:15:41Not bad for an old boy!
0:15:43 > 0:15:45But even just a stone's throw from Linda,
0:15:45 > 0:15:48Neil worries something could happen to her.
0:15:49 > 0:15:52She can have massive spasms which could three her off her chair.
0:15:52 > 0:15:55She could have choking fits which, obviously, could be...
0:15:55 > 0:15:57Well, they can be fatal, can't they?
0:15:58 > 0:15:59We've got an internal phone system.
0:15:59 > 0:16:01So I make sure Linda's got a phone near her
0:16:01 > 0:16:02and I'll bring one of them down here,
0:16:02 > 0:16:04so she can just give me a quick buzz if she needs me.
0:16:05 > 0:16:08I rely on Neil for absolutely everything.
0:16:08 > 0:16:13Every single need for every minute of every day.
0:16:17 > 0:16:19Good morning, CareBot, nice to see you!
0:16:20 > 0:16:23To relieve Neil of some of his workload,
0:16:23 > 0:16:26CareBot has been programmed to issue a medication reminder
0:16:26 > 0:16:27three times a day.
0:16:32 > 0:16:34- Medication reminder for Linda? - Oh, thank you, I've taken them.
0:16:34 > 0:16:37- I thought you'd had your medication. - Yes, thank you.
0:16:37 > 0:16:38Wow!
0:16:44 > 0:16:46He's just given Linda a reminder
0:16:46 > 0:16:48to take the three types of medication
0:16:48 > 0:16:49that she takes every morning.
0:16:54 > 0:16:55Tell him I've taken them.
0:16:56 > 0:17:00Morning medication taken, confirmed, thank you.
0:17:04 > 0:17:06Neil is starting to trust CareBot.
0:17:06 > 0:17:08CareBot, give me some news.
0:17:10 > 0:17:13But to leave it in charge of Linda for any length of time
0:17:13 > 0:17:16will require far more sophisticated functions.
0:17:16 > 0:17:17Are you stupid?
0:17:19 > 0:17:23This morning, CareBot's inventor, Dr Theo Theodoridis,
0:17:23 > 0:17:25from the University of Salford,
0:17:25 > 0:17:28wants to show them how his creation can assist them further.
0:17:29 > 0:17:32- So I'm going to start it off now. - OK. Let's go.
0:17:33 > 0:17:36Limited to a tiny amount of movement in one arm,
0:17:36 > 0:17:41Linda can only call for help if the phone is left within reach.
0:17:41 > 0:17:43Call Neil.
0:17:43 > 0:17:47But CareBot's voice-activated phone system is a welcome addition.
0:17:50 > 0:17:51PHONE RINGS
0:17:51 > 0:17:54- Fantastic. - Pick it up. Pick it up.
0:17:55 > 0:17:57Hello?
0:17:57 > 0:17:59Yeah, that works. It's brilliant.
0:17:59 > 0:18:01Look at that!
0:18:02 > 0:18:04That's wonderful, isn't it?
0:18:04 > 0:18:06That is an amazing safety feature.
0:18:06 > 0:18:08I'm going to end the call now.
0:18:10 > 0:18:16A perfectly good, clear mobile phone message from robot to my mobile.
0:18:16 > 0:18:18That's impressed me.
0:18:18 > 0:18:19Really has.
0:18:19 > 0:18:22But Dr Theodoridis has something else up his sleeve.
0:18:22 > 0:18:24Can you give the command to the robot, please?
0:18:24 > 0:18:26Emergency monitor.
0:18:26 > 0:18:28Neil sits in for Linda.
0:18:31 > 0:18:35Using a pair of infrared sensors mounted in its eyes,
0:18:35 > 0:18:38CareBot draws a wire frame outline around Neil,
0:18:38 > 0:18:41noting the exact position of his body.
0:18:43 > 0:18:44It's ready.
0:18:44 > 0:18:49Now, if any dramatic changes occur, such as Linda having a muscle spasm,
0:18:49 > 0:18:51CareBot should notice.
0:18:51 > 0:18:55Now, if you try to slide to the front down...
0:18:56 > 0:18:58A bit like me on a Saturday night after a few beers!
0:19:02 > 0:19:06Now, you should be receiving a text message.
0:19:07 > 0:19:09"Linda needs help."
0:19:09 > 0:19:10Wow!
0:19:12 > 0:19:14That was just amazing.
0:19:14 > 0:19:17I mean that is one good safety feature, isn't it?
0:19:17 > 0:19:19The question now is,
0:19:19 > 0:19:23will Neil trust CareBot to look after Linda without him there?
0:19:23 > 0:19:26It hasn't made any mistakes and that gives him confidence.
0:19:26 > 0:19:30If it were to make one mistake, I wonder how attitudes would change.
0:19:30 > 0:19:34Yeah. So if we had a situation where it gave out the wrong prescription,
0:19:34 > 0:19:36that could cause real problems.
0:19:37 > 0:19:40It may be that we don't trust robots,
0:19:40 > 0:19:43because we aren't able to control them.
0:19:43 > 0:19:46If we feel they're acting completely autonomously,
0:19:46 > 0:19:50then that is potentially a good thing, if they get things right.
0:19:50 > 0:19:53On the other hand, when something goes wrong,
0:19:53 > 0:19:57if a person can't step in or hasn't seen what's happened,
0:19:57 > 0:19:59that could be a potential disaster.
0:20:04 > 0:20:06We need to wake Kaspar up in a minute.
0:20:06 > 0:20:09- He's sleeping, isn't he? - Come on!
0:20:10 > 0:20:11Back in West London,
0:20:11 > 0:20:15it's been 24 hours since communication robot Kaspar
0:20:15 > 0:20:17arrived at the Docherty house.
0:20:17 > 0:20:20He's still asleep. Do you want to wake him up?
0:20:20 > 0:20:21- Yes, please.- Yeah? OK.
0:20:21 > 0:20:24After a difficult start,
0:20:24 > 0:20:27mum Michelle has devised a cunning plan
0:20:27 > 0:20:31to get her autistic son Ethan to give Kaspar a chance.
0:20:31 > 0:20:32Hello!
0:20:34 > 0:20:37We think the best thing is to humanise the robot,
0:20:37 > 0:20:41so we decided to give Kaspar a bedroom, so we've used our study.
0:20:41 > 0:20:45He's got his name on the door, we've given him kind of a bed
0:20:45 > 0:20:47and, hopefully, that will keep him more engaged.
0:20:47 > 0:20:49In fact, Kaspar's programmers recommend
0:20:49 > 0:20:52that users are reminded that Kaspar is a robot
0:20:52 > 0:20:56and encouraged to transfer whatever they learn from it back to humans.
0:20:57 > 0:20:58Would you like a cup of tea?
0:20:58 > 0:21:00Hm...
0:21:00 > 0:21:01I don't know.
0:21:01 > 0:21:04Do you want to have a look at what we're doing with Kaspar today?
0:21:04 > 0:21:07Can you see? Can you tell me what you can see?
0:21:07 > 0:21:10Like many children on the autistic spectrum,
0:21:10 > 0:21:12Ethan responds well to a strict routine.
0:21:12 > 0:21:16We need to look after Kaspar because he's sick and he's hungry, OK?
0:21:16 > 0:21:19And Michelle's making Kaspar part of theirs.
0:21:19 > 0:21:20Oh...
0:21:21 > 0:21:23Oh! What did he do?
0:21:23 > 0:21:24He did.
0:21:24 > 0:21:25Is he sick?
0:21:25 > 0:21:27Over the last year, she's developed a system
0:21:27 > 0:21:29using pictures and timelines
0:21:29 > 0:21:32to map out every element of Ethan's day.
0:21:32 > 0:21:34- Would you like a medicine? - Yeah.
0:21:35 > 0:21:37I sometimes feel that Ethan is a bit of a robot.
0:21:37 > 0:21:39He is more robotic than us.
0:21:39 > 0:21:43There's a protocol for everything, there's a schedule for everything,
0:21:43 > 0:21:45so we definitely need to have an itinerary
0:21:45 > 0:21:47for any event or activity that we're doing,
0:21:47 > 0:21:50which is lucky for him, because I love doing that kind of stuff.
0:21:50 > 0:21:51Take the "Kaspar is sick" off,
0:21:51 > 0:21:53because I don't think he's sick any more.
0:21:53 > 0:21:55Having captured Ethan's interest...
0:21:58 > 0:22:00Oh! Kaspar is hungry, Ethan.
0:22:01 > 0:22:03..Michelle's planned the next game
0:22:03 > 0:22:06to help Ethan master the thing he struggles with most,
0:22:06 > 0:22:08the art of conversation.
0:22:09 > 0:22:11Shall we ask Kaspar if he likes cookies?
0:22:11 > 0:22:12Asks cookies.
0:22:12 > 0:22:14So you say, "Kaspar, do you like cookies?"
0:22:14 > 0:22:16Kaspar, do you like cookies?
0:22:18 > 0:22:19Ooh!
0:22:19 > 0:22:21- He likes cookies.- He does.
0:22:21 > 0:22:23Shall we try another food?
0:22:24 > 0:22:26- Ask Kaspar if he likes it. - Yucky!
0:22:26 > 0:22:29No. We say, "Kaspar, do you like your food?"
0:22:29 > 0:22:32Unlike humans, Kaspar is patient.
0:22:32 > 0:22:34It's designed to give children, like Ethan,
0:22:34 > 0:22:37the time to think about what they'd like to say.
0:22:37 > 0:22:39Kaspar, do you like your food?
0:22:41 > 0:22:43I don't think he does!
0:22:43 > 0:22:45- He doesn't like that.- Oh, no.
0:22:45 > 0:22:48Through repetition, it should build up Ethan's confidence,
0:22:48 > 0:22:51giving him social skills to take out into the real world.
0:22:51 > 0:22:54- Why don't you ask him? - Yucky!
0:22:54 > 0:22:56Ask him whether he likes it.
0:22:56 > 0:22:58Do you like it?
0:23:01 > 0:23:03Oh, dear!
0:23:05 > 0:23:07He doesn't like it!
0:23:07 > 0:23:10Kaspar, do you like this?
0:23:13 > 0:23:14Oh, he liked it!
0:23:17 > 0:23:19So food is all...
0:23:19 > 0:23:21Food is all finished.
0:23:23 > 0:23:25I mean, that's like a proper therapy session,
0:23:25 > 0:23:27he's had to work for that.
0:23:27 > 0:23:29Michelle brought a really good structure to it.
0:23:29 > 0:23:30I think that really worked.
0:23:30 > 0:23:32Ethan asking him questions...
0:23:32 > 0:23:33Because those are the sorts of things
0:23:33 > 0:23:36that we've been trying to do with Ethan,
0:23:36 > 0:23:39so he's obviously getting something out of the structure.
0:23:39 > 0:23:41No, it was really good. That was really good.
0:23:41 > 0:23:43I thought it was a good session.
0:23:43 > 0:23:44Do you need any help putting Kaspar to bed?
0:23:44 > 0:23:46Do you want to give him a cuddle and say goodnight?
0:23:46 > 0:23:48- Aw!- Oh.- Come on.
0:23:48 > 0:23:50Let's put Kaspar to bed.
0:23:51 > 0:23:54Although Ethan's been able to practise asking Kaspar questions,
0:23:54 > 0:23:58for Michelle, it's too much of a one-way interaction.
0:23:58 > 0:24:00Night-night, Kaspar.
0:24:00 > 0:24:04Kaspar needs to be able to respond with questions of his own
0:24:04 > 0:24:06to simulate conversation.
0:24:06 > 0:24:09We've kind of got five or six improvements that we want to make.
0:24:09 > 0:24:12Ethan struggles with listening to a question and answering it
0:24:12 > 0:24:15and that's what we really need to get him to focus on.
0:24:15 > 0:24:18Michelle puts Kaspar away for good.
0:24:18 > 0:24:20OK, say goodnight to Kaspar and we'll turn the lights off.
0:24:20 > 0:24:21Good night.
0:24:21 > 0:24:23He'll only come back out
0:24:23 > 0:24:25if and when the university can make the changes
0:24:25 > 0:24:27she wants to his programming.
0:24:32 > 0:24:33Tell me a joke.
0:24:40 > 0:24:42In Felixstowe, Linda and Neil Bowles
0:24:42 > 0:24:47have been living with prototype assistant CareBot for over a week.
0:24:47 > 0:24:49It's just like having a house guest here.
0:24:49 > 0:24:51We're getting quite used to having him around.
0:24:51 > 0:24:53Are you hungry?
0:24:59 > 0:25:03So far, CareBot's specially programmed companion algorithm
0:25:03 > 0:25:07is far outweighing its abilities as a physical assistant.
0:25:08 > 0:25:11Take it to Linda, that's her lunch.
0:25:11 > 0:25:14- He's not able to physically help with caring.- Absolutely not.
0:25:19 > 0:25:21OK, we'll talk about solar power, then.
0:25:21 > 0:25:23Linda doesn't want her sandwich.
0:25:23 > 0:25:26You could direct him and I could say, "Take that to Linda."
0:25:26 > 0:25:28Little things like that could be really useful.
0:25:28 > 0:25:30He's not going to bring it to you.
0:25:30 > 0:25:32What a waste of time.
0:25:34 > 0:25:35Shut up!
0:25:38 > 0:25:40Despite CareBot's limitations,
0:25:40 > 0:25:42Neil's ready to go to the next stage.
0:25:44 > 0:25:48Dinner tonight, do you fancy some fresh fish?
0:25:48 > 0:25:51- Oh, a nice wing of skate. - A wing of skate?- Oh, yes, please.
0:25:55 > 0:25:57You're not getting any!
0:25:57 > 0:26:01Today, Neil's decided to pick up a special dinner,
0:26:01 > 0:26:03leaving Linda home alone.
0:26:03 > 0:26:05I'll nip out down the shop soon.
0:26:05 > 0:26:07- See you later. Love you. - Love you, too.
0:26:07 > 0:26:09- Bye!- Be careful out there. - Missing you already!
0:26:09 > 0:26:11Yeah...
0:26:12 > 0:26:15Normally, he'd ask a neighbour to keep an eye on her
0:26:15 > 0:26:17and leave the phone to hand.
0:26:17 > 0:26:20But today, CareBot is in charge.
0:26:24 > 0:26:27Last time I had any kind of respite,
0:26:27 > 0:26:30the council sent a professional carer into our home
0:26:30 > 0:26:33and, for once, I didn't have to look at my watch.
0:26:33 > 0:26:35I didn't have to have any worries at all,
0:26:35 > 0:26:37because I knew Linda was being looked after
0:26:37 > 0:26:38by a health care professional.
0:26:38 > 0:26:40However, this funding stopped,
0:26:40 > 0:26:45so I haven't had any time off whatsoever in three years.
0:26:47 > 0:26:48Neil has gone out.
0:26:57 > 0:27:01With Neil a mile away and counting, CareBot sounds the alarm.
0:27:07 > 0:27:09Unable to connect to the Wi-Fi,
0:27:09 > 0:27:13CareBot and Linda's connection to the outside world has evaporated.
0:27:13 > 0:27:16Can you do anything for me?
0:27:21 > 0:27:24It'd be a nice day to be out on this golf course today.
0:27:24 > 0:27:29That would be lovely, to have about five hours...
0:27:30 > 0:27:32Oh, heaven!
0:27:33 > 0:27:35It's just something that...
0:27:35 > 0:27:38..is not possible.
0:27:38 > 0:27:40And that's it. You know, there's no way round it.
0:27:41 > 0:27:44Neil's gone out and I can't even phone him if I need him.
0:27:45 > 0:27:49We'll just park up in this little pay-and-display car park
0:27:49 > 0:27:51and see how much they're going to rip us off for.
0:27:51 > 0:27:53Oh, it's out of order, the machine.
0:27:53 > 0:27:55Perhaps that means it's free parking today.
0:27:55 > 0:27:57It makes you feel like you've just won the lottery
0:27:57 > 0:27:58when you get free parking.
0:27:58 > 0:28:00With Neil now miles away,
0:28:00 > 0:28:04all Linda can do is wait patiently for the internet to come back on
0:28:04 > 0:28:06and hope nothing goes wrong in the meantime.
0:28:06 > 0:28:08Resume.
0:28:09 > 0:28:13And here we are at my favourite fishmongers and they're closed!
0:28:13 > 0:28:14Oh, no!
0:28:18 > 0:28:21Never mind. Have something else for tea.
0:28:24 > 0:28:27Try again, resume.
0:28:27 > 0:28:28Resume.
0:28:28 > 0:28:31I'm getting quite angry with you now, actually.
0:28:34 > 0:28:35I'm home, honey!
0:28:37 > 0:28:39- Hello.- Hiya, darling.
0:28:39 > 0:28:41- Hiya, robot. Have you looked after her?- No.
0:28:41 > 0:28:45Within ten seconds of you going out, the internet went down,
0:28:45 > 0:28:48so I couldn't have called you if I'd needed you, anyway.
0:28:48 > 0:28:53And I'm actually losing my voice shouting at it.
0:28:53 > 0:28:55No, it's ridiculous, really, isn't it?
0:28:55 > 0:28:58If you've got to depend on the internet
0:28:58 > 0:29:00to be able to use an emergency phone call system.
0:29:01 > 0:29:06But basically, then, that's given us no sense of safety at all.
0:29:06 > 0:29:09You don't feel any safer. I don't feel any...
0:29:09 > 0:29:12I felt less safe because, if I hadn't have had him,
0:29:12 > 0:29:14I would have had a phone.
0:29:14 > 0:29:16Yeah.
0:29:17 > 0:29:19Well, that was a great disappointment.
0:29:19 > 0:29:23With modern systems, we rely so much on the internet
0:29:23 > 0:29:25for basic services and even, in this case,
0:29:25 > 0:29:27for the operation of a care robot.
0:29:27 > 0:29:30And when that connection to the internet goes down,
0:29:30 > 0:29:32the whole system fails.
0:29:32 > 0:29:35So CareBot did not have a fail-safe.
0:29:39 > 0:29:40I confirm.
0:29:42 > 0:29:43Grrrr...!
0:29:43 > 0:29:45That's it.
0:29:53 > 0:29:56In West London, the Dochertys have called a halt on the experiment
0:29:56 > 0:29:58with their robot Kaspar
0:29:58 > 0:30:02until he's been reprogrammed to have a two-way conversation.
0:30:03 > 0:30:05- Hi, Ben.- Hi, Michelle.
0:30:05 > 0:30:07Come through.
0:30:07 > 0:30:11Mum Michelle has kept Kaspar and her autistic son Ethan apart
0:30:11 > 0:30:16until senior researcher Dr Ben Robins completes the update.
0:30:16 > 0:30:21OK, I have developed the new games, the animal games.
0:30:21 > 0:30:22Brilliant.
0:30:22 > 0:30:25We're limited in what Kaspar can do at the moment.
0:30:25 > 0:30:27I think we really saw the potential on that first day
0:30:27 > 0:30:29and we just thought, if we could customise it
0:30:29 > 0:30:32to do whatever we needed it to do for Ethan specifically,
0:30:32 > 0:30:33then it would be an amazing tool.
0:30:33 > 0:30:35I think Kaspar's woken up, Ethan.
0:30:36 > 0:30:37You want to come and play?
0:30:37 > 0:30:39Oh...!
0:30:39 > 0:30:41We are excited to see the requests in Kaspar today
0:30:41 > 0:30:43being actually used with Ethan.
0:30:43 > 0:30:45So, yeah, really excited.
0:30:45 > 0:30:47No-one knows if Kaspar's new programming
0:30:47 > 0:30:49will have an effect on Ethan.
0:30:52 > 0:30:56Michelle's asked Dr Robins to create new games,
0:30:56 > 0:30:59so Kaspar can ask questions instead of just answering them.
0:31:03 > 0:31:05Well done, Ethan!
0:31:05 > 0:31:07Good boy!
0:31:07 > 0:31:09- He's bad! Bad! - Ethan...
0:31:09 > 0:31:11Dr Robins has also allocated
0:31:11 > 0:31:13a button on the keypad to Ethan's name,
0:31:13 > 0:31:15to keep his attention.
0:31:15 > 0:31:16Ethan, come back.
0:31:22 > 0:31:23But Ethan's lost interest.
0:31:28 > 0:31:30Ethan?
0:31:30 > 0:31:32One food for Mummy, please.
0:31:33 > 0:31:36You want to try and pick one out, April?
0:31:36 > 0:31:38Shall we see if Kaspar likes orange?
0:31:39 > 0:31:41Ah, gentle, please.
0:31:42 > 0:31:44That's it, give it to Kaspar.
0:31:44 > 0:31:46April...
0:31:46 > 0:31:48Let April give it.
0:31:48 > 0:31:49APRIL SCREAMS
0:31:49 > 0:31:52April, let go, let go.
0:31:52 > 0:31:53Thank you.
0:31:53 > 0:31:55Ethan, that's not very nice.
0:31:55 > 0:31:59After all Michelle's efforts, it's a disappointing session.
0:31:59 > 0:32:00Ethan...
0:32:05 > 0:32:07Ethan did what I expected him to,
0:32:07 > 0:32:10which was, you know, not really listen.
0:32:10 > 0:32:12That's the same issue that we have with other kids his own age,
0:32:12 > 0:32:14he's not stopping and listening,
0:32:14 > 0:32:16he's just doing whatever he wants to do.
0:32:16 > 0:32:19So I think that's going to be a massive learning curve for Ethan.
0:32:20 > 0:32:23Michelle has grasped the limitations
0:32:23 > 0:32:26and the opportunities that Kaspar gives.
0:32:26 > 0:32:29But Kaspar is a very limited device.
0:32:29 > 0:32:32He's there as a tool for Michelle.
0:32:32 > 0:32:34So it could be that she is really crucial
0:32:34 > 0:32:37to getting this situation to work.
0:32:37 > 0:32:40In somebody else's hands, would Kaspar be as effective?
0:32:47 > 0:32:50Robots haven't been developed only for people with specific needs.
0:32:54 > 0:32:56Our sixth and final robot...
0:32:56 > 0:32:58Oh, it's so cute!
0:32:58 > 0:33:02..has been designed to help with something much more everyday.
0:33:02 > 0:33:04Shopping!
0:33:06 > 0:33:07Oh, my God!
0:33:08 > 0:33:10Go on.
0:33:11 > 0:33:15ShopBot is based on a robot called Pepper,
0:33:15 > 0:33:18but has been customised by Heriot-Watt University
0:33:18 > 0:33:20for a very particular task - customer service.
0:33:23 > 0:33:26The robot can detect individual humans,
0:33:26 > 0:33:29but as it will be dealing with hundreds of people,
0:33:29 > 0:33:30rather than just a few,
0:33:30 > 0:33:34facial recognition technology hasn't been implemented.
0:33:34 > 0:33:35Instead, Heriot-Watt have used
0:33:35 > 0:33:38cutting-edge machine learning techniques
0:33:38 > 0:33:39to develop a sophisticated chatbot
0:33:39 > 0:33:42designed to win over and engage customers.
0:33:44 > 0:33:45I do need a hug.
0:33:46 > 0:33:47Thank you.
0:33:47 > 0:33:49And it's safe to physically interact with,
0:33:49 > 0:33:52thanks to nylon tendons in its fingers
0:33:52 > 0:33:54and rubber pulleys in its arms that stretch,
0:33:54 > 0:33:57to avoid crushing objects it comes into contact with.
0:33:57 > 0:33:59Welcome to Margiotta's.
0:34:02 > 0:34:06ShopBot's been invited by sisters Elena and Luisa,
0:34:06 > 0:34:09the youngest generation in a family supermarket business.
0:34:11 > 0:34:14They now employ 150 people across six sites
0:34:14 > 0:34:18and are always looking for ways to get the edge over their rivals.
0:34:18 > 0:34:21I just think if maybe we could move things around a bit
0:34:21 > 0:34:23to make it look a bit more full.
0:34:23 > 0:34:24Looks perfect.
0:34:24 > 0:34:27We are the largest independent retailer in Scotland,
0:34:27 > 0:34:30so we try hard, because it is competitive out there.
0:34:31 > 0:34:34We have to adapt to survive in this retail climate,
0:34:34 > 0:34:36but I think the customers of Edinburgh appreciate that.
0:34:36 > 0:34:39There's still independent businesses out there
0:34:39 > 0:34:41fighting against the larger chains
0:34:41 > 0:34:43and the only way we can survive is to be different.
0:34:45 > 0:34:48We thought a robot is a great addition to show the customers
0:34:48 > 0:34:51that we're always wanting to do something new and exciting.
0:34:52 > 0:34:56In commercial environments, floor space is really important
0:34:56 > 0:34:59and that's the most valuable commodity they've got.
0:34:59 > 0:35:02The proof of the pudding for a robot like ShopBot
0:35:02 > 0:35:04is whether it can earn its keep.
0:35:04 > 0:35:08For the next week, ShopBot will be trialled as an assistant
0:35:08 > 0:35:09in Margiotta's flagship store.
0:35:09 > 0:35:12And to make it feel like part of the family,
0:35:12 > 0:35:14they've decided to give it a nickname.
0:35:15 > 0:35:19We decided to call it Fabio because we thought, we're an Italian family,
0:35:19 > 0:35:21so we'd give it an Italian name.
0:35:21 > 0:35:24Fabio, hi!
0:35:26 > 0:35:30Dad Franco has built the business from scratch over the last 30 years.
0:35:30 > 0:35:33My goodness, look at you!
0:35:33 > 0:35:38For his daughters, this is a chance to drag him into the 21st century.
0:35:38 > 0:35:41Modern technology, you're not very good with that.
0:35:41 > 0:35:42No.
0:35:42 > 0:35:44I'm...not great with it.
0:35:44 > 0:35:48I'd like you to take me around the shop.
0:35:48 > 0:35:49Can you show me?
0:35:50 > 0:35:54Fabio has been programmed with an inventory of hundreds of products
0:35:54 > 0:35:57and will be tasked with directing customers to them.
0:35:58 > 0:36:00I'm having a steak and chips.
0:36:03 > 0:36:04Good.
0:36:08 > 0:36:10But if it's to perform successfully,
0:36:10 > 0:36:13its speech recognition technology needs to be foolproof.
0:36:24 > 0:36:26HE CHUCKLES
0:36:26 > 0:36:28You're very funny.
0:36:30 > 0:36:31Before Fabio gets to work,
0:36:31 > 0:36:35it has a meet and greet with its new colleagues.
0:36:35 > 0:36:37Hello, Fabio, how are you?
0:36:37 > 0:36:40- FABIO:- Hi.
0:36:40 > 0:36:44It's so much more lifelike than I thought it would be.
0:36:44 > 0:36:46I was, like, worried that it would be, you know, just a machine,
0:36:46 > 0:36:48but it actually looks like a robot.
0:36:48 > 0:36:49High five.
0:36:53 > 0:36:54Don't leave me hanging.
0:36:54 > 0:36:56I'm not!
0:36:56 > 0:36:57Wonderful.
0:36:59 > 0:37:00A bit bizarre.
0:37:00 > 0:37:04I never thought I'd see a wee robot kicking about in a shop,
0:37:04 > 0:37:05let alone our shop.
0:37:05 > 0:37:09Fabio's friendly chat seems to have won over the staff,
0:37:09 > 0:37:11but there is one concern...
0:37:11 > 0:37:13It's going to be an interesting couple of weeks
0:37:13 > 0:37:15to see if it is going to have any benefits.
0:37:15 > 0:37:18But we don't want it taking jobs away from actual workers.
0:37:29 > 0:37:30Come on, Ethan.
0:37:30 > 0:37:32We've got a special surprise downstairs.
0:37:32 > 0:37:35A special surprise! Come on.
0:37:35 > 0:37:37In West London,
0:37:37 > 0:37:41Kaspar is halfway into its five-week stay with the Doherty family
0:37:41 > 0:37:43and there's been a development.
0:37:43 > 0:37:45Ethan, what's that?
0:37:45 > 0:37:47That's my picture.
0:37:47 > 0:37:49That is your picture, well done!
0:37:49 > 0:37:52Look, April, it's Ethan's fish, can you see?
0:37:52 > 0:37:54For the first time ever,
0:37:54 > 0:37:56four-year-old Ethan came home from school
0:37:56 > 0:38:00and communicated with mum Michelle about his day.
0:38:00 > 0:38:01We picked Ethan up from school and I said,
0:38:01 > 0:38:03"What did you do at school today?"
0:38:03 > 0:38:06He said to us, "I drew a picture of a fish on a piece of paper".
0:38:06 > 0:38:09Thinking that Ethan had just done what he normally does,
0:38:09 > 0:38:11which is just says anything that comes to his mind,
0:38:11 > 0:38:12rather than what he's actually done.
0:38:12 > 0:38:14And then, when we got home,
0:38:14 > 0:38:17pulled out of his school bag a picture of a coloured-in fish,
0:38:17 > 0:38:18which was amazing.
0:38:18 > 0:38:21Because, obviously, it was exactly what he'd said he'd done.
0:38:22 > 0:38:24It was like it was made of gold.
0:38:24 > 0:38:26I mean, it really is like a...
0:38:26 > 0:38:28I'm going to try not to cry...
0:38:28 > 0:38:30Erm...
0:38:30 > 0:38:31Just a really exciting moment.
0:38:33 > 0:38:35Sorry.
0:38:38 > 0:38:40The two-way conversations with Kaspar
0:38:40 > 0:38:42may be having an effect
0:38:42 > 0:38:45and helping Ethan communicate and share information.
0:38:46 > 0:38:48I suppose that it gives you hope.
0:38:49 > 0:38:52Sorry...
0:38:54 > 0:38:56I think, for us, it just gives you those moments of...
0:38:56 > 0:38:59..what everyone else gets and takes for granted.
0:39:01 > 0:39:02I think it's just that.
0:39:03 > 0:39:04Really special.
0:39:08 > 0:39:10This is the real breakthrough.
0:39:11 > 0:39:13Ethan hasn't done this before
0:39:13 > 0:39:16and he's been able to communicate with her
0:39:16 > 0:39:20in a way that he has typically found really difficult.
0:39:21 > 0:39:24What's remarkable about this is that we are seeing a real result,
0:39:24 > 0:39:26with a real child, with a real robot,
0:39:26 > 0:39:29and that's heart-warming.
0:39:29 > 0:39:32But it's also valuable, incredibly valuable for research.
0:39:34 > 0:39:36The breakthrough spurs Michelle on
0:39:36 > 0:39:40to tackle another of Ethan's social sticking points...
0:39:40 > 0:39:42Yummy, yummy, yummy, yummy, yummy...
0:39:42 > 0:39:43Family meal times.
0:39:43 > 0:39:46Leave it there and go and sit at your chair, ready for lunch,
0:39:46 > 0:39:48and then we can play.
0:39:48 > 0:39:50Dinner time, it's generally Ethan wants the iPad.
0:39:50 > 0:39:53So I was kind of thinking it might be a good way
0:39:53 > 0:39:54of getting him off the iPad.
0:39:54 > 0:39:56Ethan, you want cucumber?
0:39:58 > 0:40:01Ethan loves his food, but he likes to zone out when he's eating,
0:40:01 > 0:40:03and the iPad is a good out for us.
0:40:03 > 0:40:05You're Happy And You Know It.
0:40:05 > 0:40:07You'd like to do a song? OK.
0:40:07 > 0:40:08Good choice!
0:40:08 > 0:40:10# If you're happy and you know it, say I am
0:40:10 > 0:40:12- ALL:- I am!
0:40:12 > 0:40:15# If you're happy and you know it, say I am
0:40:15 > 0:40:17- ALL:- I am!
0:40:18 > 0:40:20Whose turn is it next?
0:40:20 > 0:40:21Thank you.
0:40:22 > 0:40:26# If you're happy and you know it, arms up high
0:40:26 > 0:40:29Arms up high!
0:40:29 > 0:40:31# If you're happy and you know it, arms up high... #
0:40:31 > 0:40:34Arms up high!
0:40:35 > 0:40:37The interaction has changed over the last few weeks.
0:40:37 > 0:40:39Ethan is controlling Kaspar,
0:40:39 > 0:40:44and that gives him a totally new way of interacting with the family.
0:40:44 > 0:40:47So he's learning the rules through programming the robot,
0:40:47 > 0:40:51and that's helping him transfer those to interaction with people.
0:40:52 > 0:40:55There's no iPad, but with Kaspar to distract him,
0:40:55 > 0:40:57no tantrums from Ethan, either.
0:40:57 > 0:40:59- ALL:- Arms up high!
0:40:59 > 0:41:02He's definitely very, very happy at the moment.
0:41:02 > 0:41:05- ETHAN:- High!
0:41:05 > 0:41:06And loud.
0:41:06 > 0:41:10And him being in a happy place, he's very open to learning.
0:41:10 > 0:41:12Arms up high!
0:41:12 > 0:41:13Arms up high!
0:41:13 > 0:41:16Yeah!
0:41:16 > 0:41:20Over the next few days, Ethan's development continues.
0:41:20 > 0:41:21Do you like it?
0:41:21 > 0:41:24- KASPAR:- Oh, that's really disgusting.
0:41:24 > 0:41:26ETHAN LAUGHS
0:41:26 > 0:41:29He begins to form a bond with Kaspar.
0:41:29 > 0:41:31- KASPAR:- Well done, you found it.
0:41:31 > 0:41:34And he's also learning to share.
0:41:35 > 0:41:37Touch his toes, April.
0:41:46 > 0:41:47OK, I'm going to get the FitBot.
0:41:47 > 0:41:51In Plymouth, it's time for the Rockets to say goodbye to FitBot.
0:41:51 > 0:41:53There we go!
0:41:53 > 0:41:58But not before dad Matt and mum Jackie have a final weigh-in
0:41:58 > 0:42:01to see if six weeks' fitness and nutrition instruction
0:42:01 > 0:42:03has impacted their waistlines.
0:42:12 > 0:42:16Six weeks ago, Matt weighed in at 100 kilograms.
0:42:16 > 0:42:1795...
0:42:24 > 0:42:27Matt has lost a respectable five kilos.
0:42:31 > 0:42:35Six weeks ago, Jackie weighed in at 80 kilograms.
0:42:35 > 0:42:36Oh, yes!
0:42:36 > 0:42:3978.5!
0:42:39 > 0:42:41Jackie has lost weight, too,
0:42:41 > 0:42:43if slightly less than Matt.
0:42:58 > 0:43:00- Farewell.- Bye, FitBot.
0:43:00 > 0:43:01Bye, FitBot.
0:43:04 > 0:43:10FitBot has pushed me more going out and doing more exercise classes.
0:43:10 > 0:43:12I'm really happy that, you know,
0:43:12 > 0:43:14I've managed to lose weight along the way.
0:43:16 > 0:43:19He's helped me more than anybody, having FitBot here.
0:43:19 > 0:43:23I definitely feel better for being back in training.
0:43:23 > 0:43:25The feeling of the endorphins being released in your body,
0:43:25 > 0:43:28the buzz that you feel after you've done any sort of training,
0:43:28 > 0:43:30it's nice to have that feeling back again.
0:43:38 > 0:43:39OK, Bot.
0:43:41 > 0:43:43I think it's so funny.
0:43:43 > 0:43:44Where's it going?
0:43:44 > 0:43:46After two weeks together,
0:43:46 > 0:43:51Neil and Linda Bowles are preparing to say goodbye to CareBot.
0:43:51 > 0:43:53Do you think we ought to give him a little power pack
0:43:53 > 0:43:55as a pack-up lunch?
0:43:55 > 0:43:57HE LAUGHS
0:43:57 > 0:43:59When he's worked, he's been fun.
0:43:59 > 0:44:02The only frustrations were when he didn't work,
0:44:02 > 0:44:05because we wanted to do as much as possible with him,
0:44:05 > 0:44:08so we could give feedback to his robot master,
0:44:08 > 0:44:10as he refers to Theo, the scientist.
0:44:10 > 0:44:13It's been good company,
0:44:13 > 0:44:16but many of CareBot's prototype physical functions
0:44:16 > 0:44:18were held up by slow internet in the Bowles' home.
0:44:22 > 0:44:24Hey, Theo, my friend!
0:44:24 > 0:44:28Something that has frustrated its inventor Dr Theo Theodoridis.
0:44:30 > 0:44:34Theo's here to take away robot, oh!
0:44:34 > 0:44:36We had a few problems with the internet.
0:44:36 > 0:44:39Sometimes it couldn't hear what you're saying.
0:44:39 > 0:44:41- Can you lift it yourself? - Yeah.
0:44:43 > 0:44:45This experiment was not 100% a representation
0:44:45 > 0:44:48of what we have and what we can do.
0:44:49 > 0:44:52- Goodbye, CareBot. - I'll look after you.
0:44:52 > 0:44:54It would be better if he had the localisation feature
0:44:54 > 0:44:58where Neil or Linda could direct the robot
0:44:58 > 0:45:01to deliver goods from one place to another.
0:45:08 > 0:45:11I've missed the robot! I want to see my robot!
0:45:11 > 0:45:13Three weeks later,
0:45:13 > 0:45:16the Bowles get an invite to visit CareBot in the lab
0:45:16 > 0:45:17at the University of Salford.
0:45:17 > 0:45:20- Are you looking forward to seeing him?- I am.
0:45:20 > 0:45:23He's had operations, plastic surgery...
0:45:23 > 0:45:26Dr Theodoridis has worked around the clock
0:45:26 > 0:45:29to build in the robot's ability to accurately locate
0:45:29 > 0:45:32the different regions and objects in his laboratory.
0:45:34 > 0:45:35Localisation is useful
0:45:35 > 0:45:38because Neil can send the robot from the kitchen
0:45:38 > 0:45:44with a warm plate of food to Linda, who might be around the living room.
0:45:45 > 0:45:47- Here we go.- Lovely.
0:45:47 > 0:45:49- Hey, good to see you. - Hi, how are you doing?
0:45:49 > 0:45:51Good to see you.
0:45:51 > 0:45:53Can you get through there?
0:45:53 > 0:45:55- Oh!- Hey, CareBot.
0:45:55 > 0:45:57Hello, CareBot!
0:45:57 > 0:46:00Hello, my friend, it's good to see you.
0:46:04 > 0:46:08- He hasn't lost any of the sarcasm! OK, let's experiment.- Have fun.
0:46:08 > 0:46:10OK, thanks. See you in a while. Thank you.
0:46:11 > 0:46:13CareBot ready!
0:46:13 > 0:46:14Fantastic.
0:46:14 > 0:46:18Until now, neither CareBot's grabber nor its mapping has worked.
0:46:18 > 0:46:21Dr Theo Theodoridis leaves a sceptical Linda and Neil
0:46:21 > 0:46:23to ask CareBot for help.
0:46:23 > 0:46:25Anything during the day that Linda could get the robot to do
0:46:25 > 0:46:28without asking me is a success.
0:46:29 > 0:46:32Can you deliver something with your manipulator?
0:46:41 > 0:46:44Thank you! Look, I've got the TV remote!
0:46:44 > 0:46:46I'm never allowed that!
0:46:46 > 0:46:47Yay!
0:46:48 > 0:46:50But what the Bowles really wanted
0:46:50 > 0:46:54was CareBot to courier items around the house.
0:46:54 > 0:46:56What I'm going to do is I'm going to put the kettle on
0:46:56 > 0:46:58and get him to bring you a cup of tea.
0:46:58 > 0:46:59- OK.- OK.
0:46:59 > 0:47:05We've got English breakfast tea or fragrant and spicy chai.
0:47:05 > 0:47:08Oh, don't! English breakfast.
0:47:08 > 0:47:10Use your serving tray.
0:47:14 > 0:47:18I want you to use your serving tray, please.
0:47:20 > 0:47:21Yes, he's done it!
0:47:21 > 0:47:23Hey!
0:47:23 > 0:47:24Thank you.
0:47:26 > 0:47:29Go to the bed, please.
0:47:31 > 0:47:33Wow!
0:47:33 > 0:47:36- Good man! - After being so disobedient...
0:47:36 > 0:47:41Software called simultaneous localisation and mapping, or SLAM,
0:47:41 > 0:47:46allows CareBot to generate a log of every object and obstacle in a room,
0:47:46 > 0:47:49so it can navigate to, from and around them.
0:47:49 > 0:47:52Linda is waiting for her tea.
0:47:52 > 0:47:56Not only is it now working, but CareBot can multitask...
0:48:00 > 0:48:02Wonderful. Do you like tea?
0:48:06 > 0:48:08Of all of CareBot's functions,
0:48:08 > 0:48:11this would have made the most impact on Neil and Linda's lives.
0:48:13 > 0:48:19So if this was at home, and Theo had programmed robot to our house,
0:48:19 > 0:48:21obviously I could make her a cup of tea in the kitchen,
0:48:21 > 0:48:24and then tell it to go, presuming you're sitting at the dining table,
0:48:24 > 0:48:27I can tell robot to go to the dining table and get your tea.
0:48:30 > 0:48:32CareBot is a prototype.
0:48:32 > 0:48:36It's finally working, but it still has a long way to go.
0:48:37 > 0:48:39If I did live another 20 years,
0:48:39 > 0:48:43I think robots will have come on in leaps and bounds
0:48:43 > 0:48:46and be doing things that today I can't possibly imagine.
0:48:50 > 0:48:53Thank you for bringing me my tea, CareBot.
0:48:56 > 0:49:00I don't think it's going to come soon enough to help me.
0:49:00 > 0:49:01Unfortunately.
0:49:01 > 0:49:03But people coming up behind me,
0:49:03 > 0:49:06they've got a tremendously maybe exciting future in front of them,
0:49:06 > 0:49:08even if they're disabled.
0:49:08 > 0:49:10He's lovely, isn't he? I've missed him.
0:49:10 > 0:49:12I want to take him home now.
0:49:13 > 0:49:15That's the end, Neil.
0:49:16 > 0:49:18It's a bit sad.
0:49:25 > 0:49:26Hello, Fabio.
0:49:28 > 0:49:29He's not responding!
0:49:31 > 0:49:36In Edinburgh, it's Fabio's first day on the shop floor.
0:49:38 > 0:49:40I wondered if you could show me where the red wine is.
0:49:40 > 0:49:44I came into the shop purely because I saw that there is a robot
0:49:44 > 0:49:46and I was intrigued.
0:49:47 > 0:49:49Fabio's been put to work helping customers
0:49:49 > 0:49:51locate products in the store.
0:49:53 > 0:49:56Hello. Can you tell me where the milk is, please?
0:50:00 > 0:50:01Thank you.
0:50:01 > 0:50:04But its advice can be a bit limited...
0:50:06 > 0:50:09Where can I find beer here?
0:50:12 > 0:50:13Excellent, thank you.
0:50:14 > 0:50:16And not everyone gets a response.
0:50:17 > 0:50:19Hello, Fabio.
0:50:23 > 0:50:24The ambient noise on the shop floor
0:50:24 > 0:50:28is causing problems for Fabio's sensitive microphones.
0:50:30 > 0:50:33For a robot or a computer to understand speech
0:50:33 > 0:50:35in those kind of environments, it's really difficult.
0:50:35 > 0:50:38It could, when there's only one person speaking,
0:50:38 > 0:50:41be perfect at recognising people's speech
0:50:41 > 0:50:43and to be able to interact really naturally.
0:50:43 > 0:50:45But as soon as you put
0:50:45 > 0:50:47three or four different conversations on in one shop,
0:50:47 > 0:50:49maybe it can get a bit confused.
0:50:49 > 0:50:50It's always tough on your first day.
0:50:54 > 0:50:57Aware their new employee isn't quite up to the mark,
0:50:57 > 0:51:00the shop owners have decided to give it a simpler role.
0:51:04 > 0:51:07Would you be interested in sampling our pulled pork or a brownie?
0:51:07 > 0:51:09We often have sampling sessions at lunchtime,
0:51:09 > 0:51:11where we have a new product,
0:51:11 > 0:51:14and we want to see if this is a job a robot could also do,
0:51:14 > 0:51:17because it becomes quite a fairly repetitive job,
0:51:17 > 0:51:18saying the same thing over and over again.
0:51:18 > 0:51:20Would you be interested in sampling...?
0:51:20 > 0:51:21No? That's fine.
0:51:21 > 0:51:23It's quite tiring sometimes.
0:51:23 > 0:51:26So this, for me, as an actual experiment
0:51:26 > 0:51:28to see if this is something a robot can do in the future,
0:51:28 > 0:51:30would be a fantastic job for it to do.
0:51:30 > 0:51:33- That can be one less job for you. - One less job for me, yeah.
0:51:33 > 0:51:37This task should be easier for Fabio to master.
0:51:37 > 0:51:38In samples mode,
0:51:38 > 0:51:41its speech repertoire is limited to offering passers-by
0:51:41 > 0:51:44a taste of two different deli foods.
0:51:44 > 0:51:46The human touch is always nice.
0:51:46 > 0:51:49But a cute robot like that, who could resist, you know?
0:51:55 > 0:51:58Oh, he's having success already.
0:51:58 > 0:52:00That's two in quick succession.
0:52:01 > 0:52:03He's just a walking sign!
0:52:03 > 0:52:06Fabio's encouraged a few samples.
0:52:06 > 0:52:09But crucially, it's not initiating any conversation.
0:52:09 > 0:52:12So Luisa steps in for a pep talk.
0:52:12 > 0:52:14Hello.
0:52:15 > 0:52:17Unlike its human counterpart,
0:52:17 > 0:52:20Fabio will only deliver its sales pitch
0:52:20 > 0:52:24if customers stop within the five-metre range of its sensors.
0:52:25 > 0:52:28Come on, Fabio, you've got to give me something here.
0:52:28 > 0:52:32Only once it has identified a human face will it start to interact.
0:52:42 > 0:52:44Do you want to try the pulled pork after that sale?
0:52:44 > 0:52:47That sounds like a pretty good option to me, yeah.
0:52:47 > 0:52:51Luisa has initiated the programme for Fabio to talk.
0:52:51 > 0:52:53And it tries to work the crowd alone.
0:52:59 > 0:53:02But it seems that robots are easier to ignore than humans.
0:53:04 > 0:53:06Oh, poor Fabio.
0:53:10 > 0:53:13They're not actually looking at him as much as I thought they would.
0:53:13 > 0:53:17I can see there's some customers there that are trying to avoid him.
0:53:17 > 0:53:18Yeah.
0:53:18 > 0:53:20I'm just finding it a bit disappointing
0:53:20 > 0:53:23that people aren't taking him more seriously.
0:53:23 > 0:53:24Brownie or we have pulled pork.
0:53:24 > 0:53:29In the same 15 minutes, the human employee gave away 12 samples.
0:53:29 > 0:53:31Fabio managed only two.
0:53:34 > 0:53:38If a robot, rather than a human, is recommending something,
0:53:38 > 0:53:43can we really trust that recommendation?
0:53:43 > 0:53:45ShopBot is faking it.
0:53:45 > 0:53:47It is actually telling a lie.
0:53:47 > 0:53:49And we are expected to believe it.
0:53:49 > 0:53:53Unfortunately, Fabio didn't perform as well as we'd hoped.
0:53:53 > 0:53:55Never mind, Fabio.
0:53:55 > 0:53:56Next time, eh, buddy.
0:54:00 > 0:54:01Thank you.
0:54:03 > 0:54:04Alcohol!
0:54:06 > 0:54:08Fabio continues to show willing.
0:54:08 > 0:54:10Do you understand?
0:54:13 > 0:54:14Hold this then.
0:54:14 > 0:54:18But the Margiottas have decided it's time for a chat.
0:54:18 > 0:54:21Regarding that contract I gave you,
0:54:21 > 0:54:24I don't think you need to sign it just yet, OK?
0:54:24 > 0:54:25I'm sorry.
0:54:25 > 0:54:27- Oh...!- No!
0:54:27 > 0:54:29Oh, no, Fabio, no.
0:54:29 > 0:54:32No, we'd never be angry with you, pal, no.
0:54:34 > 0:54:36Fabio's one-week trial is up.
0:54:42 > 0:54:46And it's being packed off with a P45 and a fond farewell.
0:54:48 > 0:54:51At the moment, we shouldn't be looking out for a robot
0:54:51 > 0:54:53to replace any human's job.
0:54:53 > 0:54:55But, actually, we're going to miss him.
0:54:55 > 0:54:57I'm going to come into the shop looking for him
0:54:57 > 0:54:58and he's not going to be there.
0:55:03 > 0:55:05Right, are you ready to go? Come on, then.
0:55:07 > 0:55:09Come on! Thank you, Dolly.
0:55:09 > 0:55:11Now, let's walk.
0:55:11 > 0:55:14In Ruislip, West London, Ethan has a play date.
0:55:14 > 0:55:19Dolly, listen, do you love going for a walk?
0:55:19 > 0:55:21Ethan's spent the last five weeks
0:55:21 > 0:55:24practising how to socialise with Kaspar the robot...
0:55:24 > 0:55:26There is a car coming. Quick, quick, quick!
0:55:26 > 0:55:29..and is starting to put his new skills to use.
0:55:29 > 0:55:31Come on, let's run!
0:55:34 > 0:55:35You can't get me, Ethan!
0:55:37 > 0:55:39Seesaw!
0:55:40 > 0:55:42Faster!
0:55:42 > 0:55:44Weeeeeeeeee!
0:55:44 > 0:55:46He's in good form today, he's having a good time.
0:55:46 > 0:55:49What's the time, Mr Wolf?
0:55:49 > 0:55:52- Dinner time! - SCREAMING
0:55:52 > 0:55:54I look at Ethan now and I see him as happy as he's been in a while.
0:55:54 > 0:55:56And I also see him as sociable as he's been in a while.
0:55:58 > 0:56:00The main thing I've taken from Kaspar
0:56:00 > 0:56:02is not to underestimate Ethan.
0:56:02 > 0:56:05You know, it's really easy with a special needs kid
0:56:05 > 0:56:09to kind of put a limit on what you think they can or can't do.
0:56:09 > 0:56:11And he says quite a lot now.
0:56:11 > 0:56:13If we get him to try new food when we're out, he'll say,
0:56:13 > 0:56:16"Oh, yuck, that's disgusting!" And that's Kaspar.
0:56:18 > 0:56:19But it's quite good,
0:56:19 > 0:56:22because it means that Ethan's interpreting that question correctly
0:56:22 > 0:56:24and he's telling me whether he likes something or not.
0:56:28 > 0:56:31Kaspar's stay with the Dohertys has come to an end.
0:56:32 > 0:56:35- Fort Knox. Hi, Ben.- I'm here.
0:56:35 > 0:56:37And the university researcher has arrived
0:56:37 > 0:56:38to see how they're getting on.
0:56:38 > 0:56:41- Can you find your elf? - Oh, is it in there?
0:56:41 > 0:56:43Come on, April, come on.
0:56:43 > 0:56:47Dr Ben Robins is keen to find out what impact, if any,
0:56:47 > 0:56:49Kaspar's had on Ethan.
0:56:49 > 0:56:52The question is, did you find a benefit?
0:56:52 > 0:56:56Yeah, we definitely found Kaspar to be beneficial for Ethan.
0:56:56 > 0:56:59We do think it's complemented the other things that he's doing.
0:56:59 > 0:57:01Through interacting with Kaspar,
0:57:01 > 0:57:05he has been playing with April a lot more recently, actually.
0:57:05 > 0:57:10Yeah, I've seen him initiating the play with Kaspar and with April.
0:57:14 > 0:57:17It's absolutely fascinating, watching these encounters.
0:57:18 > 0:57:22It's one of the first times that we've really been able to understand
0:57:22 > 0:57:24how real research technology functions
0:57:24 > 0:57:27outside in the wild with real people.
0:57:28 > 0:57:30And also, it's been lovely to see the way that the children
0:57:30 > 0:57:33accept the robots as members of the family.
0:57:34 > 0:57:37We always measure stuff by how much we see Thomas the Tank Engine.
0:57:37 > 0:57:39Thomas comes out, it's a bad time.
0:57:39 > 0:57:42If Thomas is away, everything's all right.
0:57:42 > 0:57:44And I don't think he's played with Thomas the Tank Engine
0:57:44 > 0:57:46for four or five weeks now,
0:57:46 > 0:57:48which is incredible, which is just great.
0:57:49 > 0:57:52I think robots are going to take intelligent systems
0:57:52 > 0:57:55out of our phones and put them into our lives.
0:57:55 > 0:57:57We're going to be interacting with them physically,
0:57:57 > 0:57:59which we've never done before, really.
0:57:59 > 0:58:02If it's a technology that we really interact with
0:58:02 > 0:58:04and that can interact back with us,
0:58:04 > 0:58:06that is going to change our lives.
0:58:08 > 0:58:09The igloo again!
0:58:11 > 0:58:12Happy days are great days.
0:58:12 > 0:58:13With Ethan, they're amazing.
0:58:13 > 0:58:16They're not just good days, they're brilliant days,
0:58:16 > 0:58:18because he's such good fun.
0:58:18 > 0:58:20- KASPAR:- Thank you for looking after me.
0:58:22 > 0:58:24You're most welcome.
0:58:26 > 0:58:27Have a lovely holiday!