On today's programme: Are street grooming gangs a Muslim problem, as was claimed in a tabloid this week, or is the language we use around the issue racist and divisive? Equality campaigners propose all jobs be open to part-time working. Will such measures help to close the gender pay gap? We ask: is equality impossible in the workplace? Also on today's show, will we take over the world? I don't know about that, Sanbot, but I hope you're not looking... As tech billionaires battle over machines that can think for themselves, should we be worried? Emma Barnett is here ready to help you join in the debates.
And you should be worried about that robot! We want to hear from you about all our discussions, especially the first one about grooming gangs. You can contact us via Facebook and Twitter. Don't forget to use the hashtag #bbcsml. Or text SML followed by your message to 60011. Texts are charged at your standard message rate. Or email us at sundaymorninglive@bbc.co.uk. However you choose to get in touch, please don't forget to include your name so I can get you involved in our discussions. And later we'll meet a pioneering RAF pilot who has...

The fact is that there are already thousands of people serving in the US military who are transgender. I am living proof that being transgender does not affect how you do your job.

This week Labour MP
for Rotherham, Sarah Champion, quit her role in the Shadow Cabinet after criticism over a newspaper article she wrote about grooming gangs. "Britain has a problem with British Pakistani men raping and exploiting white girls," she wrote in The Sun on Friday. "Or am I just prepared to call out this horrifying..." She has since apologised for what she described as an extremely poor choice of words. In the same newspaper, another article by former Sun political editor Trevor Kavanagh referred to "the Muslim problem". It was condemned in an open letter signed by more than 100 MPs. So should race be talked about when tackling grooming gangs, or are we in danger of alienating a community? Joining us now are Nazir Afzal, former Chief Crown Prosecutor for North West England; Michelle Dewberry, a broadcaster; Kieran Yates, a journalist and writer; and Mohammed Shafiq, chief executive of the Ramadhan Foundation. Kieran, has some of the language that has been used been racist and divisive? I think it has definitely stoked the fire of prejudice and racism that already exists in the UK. What it has also done is derail the conversation away from the victims, by somehow suggesting there is something inherently sexually deviant about British Pakistani men, which is disgusting.
What about future victims? If we don't talk about this properly, we will never solve the problem. Of course any investigation has got to unpick and discuss the issues of culture and religion, but what Trevor Kavanagh and Sarah Champion have done is make the conversation simply about that, devoid of nuance. How did you view the apology? I don't think it was necessary for her to do that. What Sarah Champion did was speak the truth. It is not racist to say that Pakistani men are grooming and abusing and raping white ladies or girls, if that is what they are doing. Grouping them all together? Some Pakistani men are doing that. She never said all in her article. People are leaping on this and suggesting that somehow anybody with a brain cell is walking around thinking that all Pakistani people are doing this. Nobody has suggested that all Pakistani men are doing anything. Growing up as a mixed-race boy in 1980s Britain, people talked about black people committing crimes, and that affected me. I was no more likely to commit a crime than my white colleagues. Pakistani men are feeling finger-pointed. Nobody is suggesting that all Pakistani men are doing anything. Sarah Champion's article was correct. She should not have resigned and I think she was forced into that. She should have stood by her words and her position. You have
been involved in prosecuting gangs. Does race and religion play a part in these crimes? As people have said already, you can't get away from the fact that British Pakistani men are disproportionately involved in street grooming. British white paedophiles have a different approach: they work in institutions and they organise themselves in a different way. The question you should be asking is why we are focusing on race and not sexism, when we haven't talked about the victims and what they have gone through. I am with Michelle. I have known Sarah for a long time and she is a champion for victims, and everybody involved in this is a champion for victims. We can't get away from the fact that British Pakistani men are disproportionately involved in group grooming. Is it a race crime? It is about the availability and vulnerability of young girls. Is it about the way they view white girls? Part of that is true. They also targeted Pakistani girls and girls from other ethnicities. I prosecuted the ringleader of Rochdale for his abuse. But a smaller amount? I totally get that. That is right. The fact is that victims would be targeted by predators from all communities. We can't get away from the fact that Pakistani men are disproportionately more involved in street grooming white girls, can we?
You can't, and the facts speak for themselves. I have always been very consistent in saying that as a community we have got to deal with this. What has been difficult over the last ten days has been when it has been politicised, politicians and commentators turning it into a massive debate. As my colleague said. And we have forgotten the reality, the victims. I have met these victims. I have spoken to them. Do you know the most distressing thing? Not a single person commenting on this is talking about their experiences. I think we should be talking about that. In terms of Sarah Champion, I think it was right that she resigned. She told an untruth when she said that she had the article changed by The Sun. It was signed off by her office, not The Sun. This inflammatory talking about race feeds paranoia among the far right. I am standing up against these gangs and I have been campaigning against them since 2007. But when you make inflammatory comments, as she did, as shadow equalities secretary, her position is untenable. We are talking to one of the victims now. Yes, it is often said that the victim's voice gets lost in these conversations.
I'm joined now by Sammy Woodhouse, a victim of the Rotherham grooming gang, and now a campaigner for victims of abuse. Good morning. Do you think race and religion played any role in the way that these gangs operated? You had first-hand experience of it. I think it has in some cases, yes. We can't get away from that. There has been evidence in court cases throughout the country. As a country, we are open to talking about white men raping children, but we are in 2017 and we can't say Pakistani or Muslim and child abuse in the same sentence. There are lots of factors involved, not just race and religion. I was targeted because of my sex, I was a girl. You are targeted for different reasons. There were different races involved. The fact that I was in the wrong place at the wrong time, or he was. He was very open about getting away with it and he thought it was normal. If I can just break in, do you feel the fact that you are white played a role in the fact that you were seen as vulnerable? I think it did. That is part of why we were ignored and it was covered up. That is happening throughout the country, a pattern. We can't make it all about race, especially not pin it on one race, but we also can't get away from the facts. We can't tackle it unless we discuss everything. Do you
get frustrated when you see somebody like Sarah Champion having to resign from the Shadow Cabinet, or feeling like she has got to, because she has written what are facts, as she sees them, in a newspaper article? Definitely. As a country we should be able to have an open and honest discussion about race and religion. I don't think Sarah did anything wrong. I don't see eye to eye with her myself, but putting that to one side, I think she said something that is the truth. What Jeremy Corbyn did by sacking her, or her resigning, whichever it was, I think that sends a far more dangerous message, saying that we can't have these discussions. If you were to summarise what role Pakistani culture or Islamic culture potentially played in the fact that you were chosen to be groomed as a young white girl, how would you summarise that from your point of view and your experience? I think there are a few factors involved. There are some Muslims that view women in a poor light, especially if you are not a Muslim girl or Muslim woman yourself. I think we need to be open to talking about it. I don't understand why it has got to be a secret. Let's tackle it from all angles. I appreciate you talking to us this morning. Thank you. You have
:10:54. > :10:56.views. Please keep doing so. Ruth says why should this be an
:10:57. > :11:00.uncomfortable truth? Discuss it openly without the fear of being
:11:01. > :11:04.called racist. It is impossible not to bring race and religion into
:11:05. > :11:08.these conversations. Doreen: The reality is that a group of men
:11:09. > :11:11.exploited and raped young, vulnerable girls. Is it racist to
:11:12. > :11:17.mention that the girls were white British and the men were of Asian
:11:18. > :11:21.origin? These are the facts. Suzanne says that all of the offenders were
:11:22. > :11:25.British-born. Does that mean that Britain is a nation of sex
:11:26. > :11:29.offenders? You can't tar everybody with the same brush. And this one
:11:30. > :11:34.says, the question is not whether the MP is racist but why a large
:11:35. > :11:39.number of men from one community are preying on vulnerable girls. Very
:11:40. > :11:43.interesting. Sammy says in 2017 we can't use the words child abuse and
:11:44. > :11:48.Pakistani men in the same sentence. Is that political correctness gone
:11:49. > :11:52.mad? I think she was saying that we can't make this completely about
:11:53. > :11:56.race. And what we have seen, we are having a discussion about media
:11:57. > :12:00.responses, and British Pakistani men are always part of the conversation
:12:01. > :12:05.when we are talking about this. They have been intertwined into this
:12:06. > :12:09.narrative, and that is why Sarah Champion was right to stand down.
:12:10. > :12:12.When we are talking about this case in particular, there is a systemic
:12:13. > :12:16.failure of Greater Manchester Police who did not believe the victims.
There is something we need to unpick and discuss there without derailing
:12:21. > :12:24.the conversation. We discuss all parts of it, so why can't we talk
:12:25. > :12:29.about Pakistani men who are highly involved in this? What about talking
:12:30. > :12:32.about Polish truck drivers abusing young black girls? We would talk
:12:33. > :12:36.about Polish truck drivers and the way they view young black women. We
:12:37. > :12:39.are talking about Pakistani men disproportionately involved in
:12:40. > :12:43.abusing young, white women. Why can't we talk about the way that
:12:44. > :12:47.Pakistani men are brought up to view women who do not fit the mould that
:12:48. > :12:50.they are expecting and don't act in the way they are expecting? Why
:12:51. > :12:53.can't we talk about that? Among these criminals there is a mindset
:12:54. > :13:07.that white girls are worthless. I say it time and time
again. Is it the criminals or is it the culture? Are they brought up to think like that? It is not the culture. I am a Muslim and a Pakistani and there is no way in my faith and in my community that we were brought up to say you can rape white girls. When you see a young white girl who is vulnerable, maybe drinking, on the streets, there is a problem in that some Pakistani men are viewing them in a different way. Are they just criminals or is it a cultural thing? Answer the question. What I am saying is that it is an issue and we need to address the issue. Why can't we talk about this in terms of race? Show me some respect. I have been campaigning about this since 2007, even when the police and everybody else were finding excuses for the grooming. I was on the street campaigning, and all we have been talking about is race and grooming for the last ten years. We need to talk about this in terms of race. I know, but are we also going to talk about the huge endemic problem of child abuse in the church? And Jimmy Savile? Let's just get real on these issues and remember that was one victim. I have met many, and I am sure he has. More than I have had breakfast! With Jimmy Savile, the focus and the finger was pointed at a lot of male celebrities, and rightly so. Somebody in my position cannot do what he did and we talked about it. We discussed it. With the Catholic Church we discussed it. The judgment here is that we cannot talk about it in terms of race. I have prosecuted dozens of these cases; there is always this conversation for a few weeks after the trial, then we forget about it. I am with you, we should have this conversation day in, day out. Some British Pakistani men have a problem with white girls on the street; there is violent misogyny going on, and misogyny more than anything drives their attitude towards women and girls. We need to tackle it in every community, including the British Pakistani community. I am never afraid of saying who is responsible; if they are responsible they need to do something. We have focused on this issue for ten days, and we have not focused on the victims, why they were not believed and why they continue to suffer. Michelle, what can we do? Have honest conversations. I fundamentally disagree with you both when you say Sarah Champion should have resigned. You should be able to have those conversations, have the courage of your convictions, hold your position once you have spoken those truths. The second thing is talk to these people; there have been so many convictions now. Talk to these people and understand what is going on: how can we stop this? There is some form of cultural influence; it has to be talked about and understood. I am afraid we are out of time, thank you.
Caroline Paige served in the RAF for more than three decades as a jet and helicopter navigator, but her career is notable for more than just her remarkable service and battlefield expertise. In 1999 she became the first transgender officer to transition and serve openly in the British armed forces. Wendy Robbins went on a tour of duty to find out more about Caroline's story.

When Caroline Paige joined the Royal Air Force in 1980, her colleagues knew her as a man called Eric, an identity she had struggled with all her life. I saw myself as a female who just happened to have been born with a boy's body for some reason, but I knew that I was now joining an organisation whereby if it was discovered that I was transgender, then I would be thrown out.
Caroline kept her secret for 19 years of her flying career. It was something she had lived with since the age of five, when she saw a dress on her mother's bed. I felt, well, what a beautiful dress, I need to wear the dress. I put it on and it felt wonderful, natural, it was me, but it was a little bit tight. And I heard footsteps approaching, and it was my father, and I panicked and I could not get the dress off. I found it very difficult because he shouted at me and made it very clear it was completely wrong, and that frightened me. So I felt the best thing to do was hide it. My life then became one of hopes and dreams; each night I went to bed and I would hope that the following morning I would wake up and it would be fixed, I would be the girl that I knew I was.
You wanted to be a girl, you were wearing your mum's clothes, yet you also wanted to fly fast jets. It goes back to my days in Malaya, when we lived on an army base and the helicopters used to come backwards and forwards and land in the field next to the house. At 15 I learned to fly gliders. I was in the sky at 2,000 feet on my own and I felt it was wonderful, fantastic. It was the first time I had seen there was something I could do. You joined the RAF, then a very macho environment. It was quite a contrast. There were no females allowed to fly the fast jets, the combat aircraft, in the 80s, not until 1991. It was a very male environment and, of course, it was an extra pressure because I knew that if I was discovered I would be dismissed from the air force immediately and I would be outed. What led you after 19 years to reveal your true
identity and tell the authorities? I was always living with the fear of getting caught, and I realised, you know what? I have to stand up, accept the consequences and see where it goes. I went to see the medical officer and I sat down and told her, right, I need to tell you something. When I did, she cleared her appointments for the rest of the afternoon. You must have expected to be thrown out? Yes, I expected: thank you very much, we no longer require your services, get out. That was going to be disappointing, but a decision came back that they wanted to keep me in service, which was absolutely amazingly wonderful. Well, I never dreamt that this would be possible, and here I am now, Caroline Paige, female officer in the RAF.
In the end, the military reacted better than your parents? Unfortunately my brothers never spoke to me again. My dad initially turned round and said, as far as I'm concerned, you are dead. That is why I kept the secret so long from my family. I anticipated losing them. My father was really lovely, I loved him to bits. I think he was slowly coming to terms with it and realising I was still his child and he wanted to support me. I think that was happening, but unfortunately he had a heart attack and died. You were not allowed to go to the funeral? The family decided they did not want me at the funeral, but I agreed with a padre at the base and I had my own private service, but it was very sad.

In 2000 a tabloid newspaper discovered Caroline's story. She then experienced hostility from some RAF colleagues. People came up to me and said, get out of our air force, our military, what on earth are you doing serving? It made me more determined to want to get to the front line and prove them wrong.
:21:12. > :21:18.Caroline was already experienced on the front line before transitioning.
:21:19. > :21:21.She went on to serve a further 16 years flying battlefield helicopters
:21:22. > :21:25.in Afghanistan and Iraq, winning several commendations for her work.
:21:26. > :21:29.She forged lasting friendships with colleagues during that time,
:21:30. > :21:32.including Andy, with whom she flew for a decade.
We have been hearing Caroline's story; I wonder what you made of what happened to her? We have flown together on operations from Yugoslavia, Iraq, Afghanistan, on many dangerous missions. She is the same as everyone else, mucks in, great sense of humour, helps us through the hard times, I think. Last month Trump tweeted his intention to ban transgender military personnel; part of his rationale said it would degrade military readiness and demoralise the troops. What did you make of that? It is appalling. There are already thousands of people serving in the US military who are transgender, and they have been on the front line, they have stood next to their colleagues doing the job as well as anybody else. I am living proof that being transgender does not affect how you do your job. Wendy Robbins talking to Caroline
Paige. Still to come on Sunday Morning Live: as robots become more developed, could they ever take over from humans? In the long term there is an existential risk: do you create a form of being that is more intelligent than us?

The latest in a string of terror attacks to hit Europe this year occurred in Spain this week, as the Catalonia region of the country was struck twice by people driving cars into crowds. The first happened on Thursday, as a white van smashed into people on the famous boulevard and tourist hotspot of Las Ramblas while it was packed with holidaymakers. Emergency services were quickly on the scene, but the devastating attack left 13 people dead and more than 100 injured. One of the visitors to Barcelona at the time was the security expert Will Geddes, who was one of the first on the scene. I know you were not directly there when the car hit, but what did you see in the aftermath, what did you hear? As soon as I heard the attack was taking place I got down to the cordon as quickly as possible. I had been at Las Ramblas only a couple of hours previous to the attack, so I knew the area quite well. When I got there the Spanish authorities had been very quick in establishing a cordon; lots of the blue light services had arrived and obviously lots of visitors were being moved out of the location. They seemed to have good control very quickly. As a security expert, you were there on business, which is rather ironic given what happened. Why is this being described as the new normal, vehicles driving into crowds? Lots of people use the term low-tech; I think that gives too much sophistication to something which we can all fundamentally access, a vehicle. In every city centre, unless they start a wholesale ban on vehicles being able to access it, we will always face this risk. The problem is that the camouflage of a vehicle being used as a delivery device for an attack will only be determined at the last minute, when it starts to strike
members of the public. In terms of what you can do in these situations, many people are abroad in tourist hotspots at the moment; have you any tips? As lots of people will no doubt assess, be aware of your surroundings, the people in your surroundings, try to establish as much as you can in your gut as to how you feel. And stop looking at your mobile phone? Absolutely. You are not giving yourself a chance to detect the problem before it happens. The second thing is to look for points of escape. If you are walking down the boulevard, clock the side streets, shops, restaurants: if you needed to take cover quickly, where would you do that? If they are not available, street furniture like parked cars or a row of motorcycles. It might not be the most robust coverage, but at least it is some level of coverage. As a family, with your partner or children, try to have an emergency plan. If you get separated, where do you regroup? Back at the hotel or somewhere else? Would you stop going to tourist hotspots? Not at all. All that has happened in Barcelona is to reiterate that there is no secure location across Europe exempt from these threats. Thank you very much,
Will. Every job should be open to part-time working, according to the Equality and Human Rights Commission. The commission claims such measures would help close the gender pay gap, as well as address the pay differences affecting ethnic minorities and the disabled across all industries in the UK. Part-time hours, job sharing and other flexible working schemes could help reduce the motherhood penalty that many women face after having children, the commission says. But MP Philip Davies, a member of the Commons women and equalities committee, called the proposals left-wing claptrap and nanny state nonsense. So should employers be more flexible? Or would such changes make it unfeasible to run a business? Joining me now are Stefan Stern, who writes about management, and Christopher Snowdon from the Institute of Economic Affairs. And still with us are the journalist and writer Kieran Yates and broadcaster and businesswoman Michelle Dewberry. Stefan, starting with you: some
business leaders have said implementing these plans would be impossible. Would they be bad for business? I think we all want flexibility. Bosses have been telling us about the flexible labour force they need, but that has to work both ways. Businesses are losing out on the talents and abilities of all sorts of people who cannot get the hours that they want or need, hours which suit the demands of their lives, whether that is part-time working, term-time working, or certain times of the year when they do not want to work. We are underperforming as a country: low productivity, lower wages, frustration about the economic circumstances. When something is not working, we need more flexibility so more of us can give our best at work. Michelle, you are a
businesswoman, would these proposals help women into higher-paid and higher-ranking jobs? There is a difference between part-time and flexible working; they are slightly different. To say that all jobs, regardless of sector or responsibilities and workload, can be offered with flexible working is just not practical in many work situations, it is just not. If you offer every potential employee flexible working, for example if I say I do not want to work on a Friday, and I have to hire a replacement for the Friday and he or she says they don't want to do Fridays either, do I get a third or fourth person? It is just not doable for all jobs to be flexible and/or part-time, I don't believe. Kieran? We need to completely change the way we talk about flexible working hours. There are a couple of things at play: the gig economy and the high numbers of self-employed people have really changed the face of the British workforce. More women are taking up an increased role in the labour market, and a bigger share of the division of domestic labour in the house means that women have had to start galvanising this campaign towards equal pay. I think there are a couple of things to discuss, and I do not think that demonising women for needing flexible hours, for whatever reason, is the way to go. Kieran
:28:59. > :29:04.talked about the gender pay gap, Christopher, how big an issue is it?
:29:05. > :29:08.It is tremendously misrepresented. It exists, but not for the reasons
:29:09. > :29:13.many people assume. It is not endemic sexism in the workplace. As
:29:14. > :29:16.is explained every year, it is three things: one, women are much more
:29:17. > :29:22.likely to be in part-time work, which is generally paid less by the
:29:23. > :29:25.hour, they tend on average to go towards slightly less well paying
:29:26. > :29:35.occupations, things like childcare rather than engineering, for
:29:36. > :29:39.example, and... I forgot what the... The gender pay gap, often we talk
:29:40. > :29:45.about it as being about the same job? A woman doing the same job as a man? That
:29:46. > :29:51.ruins your argument. That has been illegal for 42 years. But there is a
:29:52. > :29:54.gender pay gap, on average, because more women are in part-time
:29:58. > :29:59.jobs and in lower paid professions. If you look at like-for-like,
:30:00. > :30:03.comparing a female engineer with a male one, they have to be paid the
:30:04. > :30:11.same. If they were not, the courts would be overrun. It is the mistake
:30:12. > :30:16.of looking at averages when you need to look at individuals. It comes
:30:17. > :30:19.down to choices. The third one, which I have remembered, is bringing up
:30:20. > :30:22.families. If you are going to take several years out of the labour
:30:23. > :30:28.market you will obviously be paid less when you return to it, so these
:30:29. > :30:31.are down to choices. It depends on the context in which these
:30:32. > :30:34.apparently free choices are being made. If you look at the
:30:35. > :30:43.professional services, accountants and lawyers, at graduate level,
:30:44. > :30:48.there are as many women and men, but further up the chain the women have
:30:49. > :30:55.gone. Maybe because they have had families. But it is perfectly possible
:30:56. > :30:58.to have families and go to work. Maybe we are
:30:59. > :31:04.putting pressure on women to go to work and not bring up their
:31:05. > :31:11.children. Of course but there is research that shows that women would
:31:12. > :31:14.cut their hours by half as much if they had genuine flexibility. From
:31:15. > :31:18.an economic point of view this is a waste of potential and talent. You
:31:19. > :31:24.are creating insecure workers. Women are insecure workers because they
:31:25. > :31:27.are not represented. Now we have an interesting guest on this.
:31:28. > :31:29.I'm joined by Danielle Ayers, a solicitor specialising
:31:30. > :31:30.in pregnancy, maternity and sex discrimination, who
:31:31. > :31:32.advises both employers and employees in these areas.
:31:33. > :31:41.Good morning. Have we still got a great deal of discrimination for
:31:42. > :31:46.women? Have you seen stories of late that you couldn't quite believe?
:31:47. > :31:49.Unfortunately, yes. You think in this day and age that employers
:31:50. > :31:53.would know what they have got to do in these situations and they would
:31:54. > :31:57.be up to date on the law, and they would not fall foul of these rules
:31:58. > :32:01.and regulations, but unfortunately it seems to be becoming more
:32:02. > :32:05.blatant. One of the main examples I can give you is equal pay. Women are
:32:06. > :32:10.being paid less to do the same job as men. That is different to looking
:32:11. > :32:13.at the gender pay gap, which is taking an average of your female
:32:14. > :32:19.workers and your male workers and providing an average of their wages.
:32:20. > :32:22.Equal pay is looking at men and women doing the same job, where
:32:23. > :32:27.there can be no argument that women should not be paid the same. Why is
:32:28. > :32:30.it happening? It is down to employers not knowing what they
:32:31. > :32:33.should be doing in these situations. There was a report by the Equality
:32:34. > :32:37.and Human Rights Commission last year which said that more needs to
:32:38. > :32:40.be done to support employers. More needs to be done to give them the
:32:41. > :32:43.knowledge that they need to make sure that their employees are
:32:44. > :32:50.supported and doing the right things. What knowledge do they need?
:32:51. > :32:53.The law is the law, isn't it? Why is it happening? You would think so. I
:32:54. > :32:58.think employers have become more bolshy over time and a lot has gone
:32:59. > :33:03.in their favour. Just a couple of weeks ago, the employment tribunal
:33:04. > :33:07.fees have been scrapped, which is a great thing, but a lot of things
:33:08. > :33:11.have happened. ACAS conciliation has been brought in, unfair dismissal,
:33:12. > :33:15.you can't bring an unfair dismissal complaint unless you have been
:33:16. > :33:19.employed for two years. They are becoming more bolshy in making
:33:20. > :33:23.decisions around staff. They have rights on their side. This idea that
:33:24. > :33:27.everybody should be offered flexible working, surely you can see the
:33:28. > :33:33.issue with that for bosses? It is hardly workable for all companies.
:33:34. > :33:36.No, I don't think it is and I agree with those in the studio saying that
:33:37. > :33:42.but that is not what we are asking for here. I am asking that if you
:33:43. > :33:45.make an application that there is a sensible and informed discussion
:33:46. > :33:50.between the parties to see whether or not there is a workable solution
:33:51. > :33:54.for both. There cannot be a case where it is costing an employer so
:33:55. > :33:57.much just to put flexible working in place for an employee, however in
:33:58. > :34:05.the vast majority of cases there is a solution for both parties. Thank
:34:06. > :34:08.you for that. We have lots of responses here. Thank you. Mavis
:34:09. > :34:10.says that equality sounds fine on paper but so does communism.
:34:11. > :34:14.Different people work at different rates and some work better than
:34:15. > :34:18.others. Should they all get the same money? And Andrew says that all work
:34:19. > :34:21.should be opened up to allow part-time positions and people
:34:22. > :34:24.should be allowed to decide how many hours they are happy working. It has
:34:25. > :34:33.big benefits for the employer as well. That sounds nice! Now,
:34:34. > :34:39.Danielle spoke about businesses becoming bolshy. You have mentioned
:34:40. > :34:42.that women have become insecure. Yes, women are an increasingly
:34:43. > :34:47.insecure workforce because they are aware of discrimination. The gender
:34:48. > :34:53.pay gap, the same money for the same job, that is clear evidence of
:34:54. > :34:55.discrimination. British African women and Bangladeshi and Pakistani
:34:56. > :34:58.women are disproportionately affected. This is something that
:34:59. > :35:02.does exist and it is happening and we need to talk about it.
:35:03. > :35:07.Christopher says it doesn't exist. It exists but not for those reasons.
:35:08. > :35:11.The Office for National Statistics looked at this again this year and
:35:12. > :35:14.it said the three reasons that I gave before are the full explanation of
:35:15. > :35:17.it. There might be sexism here and there and that is why we have got
:35:18. > :35:22.the courts to look into these things. Sometimes it might be sexism
:35:23. > :35:25.against men, you don't know. These things are going to court on the
:35:26. > :35:31.occasions that they happen. It does not explain the 10% pay gap, which
:35:32. > :35:35.is just an average. It tells you nothing whatsoever. Flexible
:35:36. > :35:40.working, it is incredibly inefficient actually. It might be
:35:41. > :35:44.all right on a factory line or something but in most jobs doesn't
:35:45. > :35:48.work. And most people don't want part-time work. There are always
:35:49. > :35:52.more people who want full-time work but are working part-time currently
:35:53. > :35:56.than the other way round. Flexible working is inefficient? Bad
:35:57. > :36:01.management is inefficient and bosses who don't listen to their workforce
:36:02. > :36:07.are inefficient. Not giving employees
:36:08. > :36:14.the chance to be flexible is inefficient. That explains chronic
:36:15. > :36:19.low productivity in this country. You couldn't work flexibly because
:36:20. > :36:24.then you couldn't be on this show. It works in some jobs and not
:36:25. > :36:28.others. That is right. In the NHS, they work long shifts and we need them to
:36:29. > :36:32.be there and it has got to be managed. This is part of the job of
:36:33. > :36:37.being a good manager, using people properly. It is down to the
:36:38. > :36:41.managers? I think this conversation, this report calling for flexibility,
:36:42. > :36:47.it is just think tanks sitting in a corner somewhere dreaming up things
:36:48. > :36:50.to be more politically correct, to create equality, where actually it
:36:51. > :36:55.is not realistic in the real world, in the world of business for
:36:56. > :36:59.example, as one sector, you cannot turn around to all employers and say
:37:00. > :37:03.that all jobs need to be flexible and part-time. I want to touch on
:37:04. > :37:09.what Chris is saying. I agree with so much of what you are saying about
:37:10. > :37:12.the gender pay gap. You are talking about ladies feeling discriminated
:37:13. > :37:15.against in the workplace. I don't feel discriminated against in the
:37:16. > :37:19.workplace. I believe in myself and my abilities and I will negotiate
:37:20. > :37:23.hard. We need to stop saying to women that if you go into the world
:37:24. > :37:29.of work that you will be automatically earning 80p to the
:37:30. > :37:32.pound compared to men. Shouldn't we highlight a problem? That statistic,
:37:33. > :37:35.it is a headline figure which takes all the men and women in this room
:37:36. > :37:41.and says she earns 80p and he earns £1 and now we have got a problem.
:37:42. > :37:45.You can only compare pay and discuss whether it is fair when you
:37:46. > :37:49.take two people doing the same job with the same experience. So there
:37:50. > :37:57.is no gender pay gap problem? I don't feel that taking the headline
:37:58. > :38:04.figure and saying that women earn 80p and men £1 is helpful. And unequal pay
:38:05. > :38:08.is illegal. Yes I know, is there a gender pay gap problem? Not in the
:38:09. > :38:09.way you're trying to describe with that headline figure, no. Thank you
:38:10. > :38:12.very much. A recent University of Oxford
:38:13. > :38:14.study concluded that artificial
:38:15. > :38:16.will be better than humans at all tasks within 45 years,
:38:17. > :38:20.and many people, including Stephen Hawking, believe humans
:38:21. > :38:22.will be in trouble in the future if our ambitions don't match
:38:23. > :38:25.with those of machines. Now two tech titans are engaged
:38:26. > :38:28.in a public disagreement about AI. Billionaires Elon Musk
:38:29. > :38:34.and Mark Zuckerberg have differing opinions of the future possibility
:38:35. > :38:36.of machines that can think for themselves. Tesla and SpaceX founder
:38:37. > :38:43.Musk has been warning of the dangers. A few days ago he tweeted "If you're
:38:44. > :38:48.not concerned about AI safety, you should be." Facebook's Zuckerberg thinks
:38:49. > :38:55.it's all fear-mongering. Responding to Musk's
:38:56. > :38:59.warnings he said, "I think it is really negative and in some ways I actually think
:39:00. > :39:03.it is pretty irresponsible". To explore the issue further,
:39:04. > :39:19.we sent Samanthi Flanagan. Whether we realise it or not, robots
:39:20. > :39:23.and artificial intelligence are becoming more and more part of our
:39:24. > :39:26.everyday lives. I am here to meet the curator of the robots
:39:27. > :39:32.exhibition, Ben Russell, who introduced me to a rather theatrical
:39:33. > :39:39.robot. This is RoboThespian, built by a UK company. He is quite
:39:40. > :39:43.remarkable. They decided to build a robot actor. His movements are
:39:44. > :39:48.naturalistic and characterful and the expression, on his face, a
:39:49. > :39:52.wonderful thing. Surely a robot couldn't replace the human actor.
:39:53. > :39:56.They are not capable of emotions. You can programme stuff in. Whether
:39:57. > :40:02.or not he is emoting is the question. It is following a piece of
:40:03. > :40:08.code, effectively. They can't improvise. They find it difficult!
:40:09. > :40:14.Is it something you are working towards? At the moment they are very
:40:15. > :40:18.good at doing a carefully defined job in a very specific way
:40:19. > :40:22.repeatedly, but not acting in the human environment of flexibility.
:40:23. > :40:27.Now a robot worker called Baxter, who with no programming can be
:40:28. > :40:33.taught new tasks and is designed to work alongside humans. It takes bits
:40:34. > :40:37.of human nonverbal behaviour. If it moves its arm, it looks at where its
:40:38. > :40:42.arm is going. If it encounters an obstacle, it frowns. So much of
:40:43. > :40:47.human conversation is not verbal and it is these physical cues that we
:40:48. > :40:51.pick up on and so it is very smart in this respect. Like a young child,
:40:52. > :40:57.Baxter learns to pick up the objects through trial and error, improving
:40:58. > :41:03.with each attempt. There are 3000 Baxters out there. They all learn
:41:04. > :41:07.and upload that information centrally. If one robot picks an object up
:41:08. > :41:11.successfully, it uploads that information and in theory all the
:41:12. > :41:15.other robots should be able to pick up the same object. Sharing
:41:16. > :41:20.information between machines is very important, machine learning. Do you
:41:21. > :41:23.think we need to monitor the development of AI? Absolutely.
:41:24. > :41:27.Everybody concentrates on the big red thing physically doing things,
:41:28. > :41:29.the robot, but there is a lot of software behind-the-scenes that we
:41:30. > :41:33.don't think about and that is the important thing. We are taking our
:41:34. > :41:39.eye off the ball we are thinking about the robot. As complex as
:41:40. > :41:45.Baxter is, he doesn't look very human but this robot is uncannily
:41:46. > :41:48.lifelike and has found work as a newsreader. How humanlike do you
:41:49. > :41:52.want your robots to be? There is something called the uncanny valley.
:41:53. > :41:56.You build a machine and it gets a bit lifelike and we are happy with
:41:57. > :42:00.that. As soon as it is too lifelike, we get weird associations with
:42:01. > :42:06.corpses and other stuff, and people back away, and that is uncanny
:42:07. > :42:09.valley where robots plummet. What if we don't monitor the development of
:42:10. > :42:15.AI? In the long term there is existential risk of creating a being
:42:16. > :42:18.that is more intelligent than us. Generally, how does that
:42:19. > :42:22.intelligence work in a physical form? That is where we come back to the
:42:23. > :42:25.more mundane things which can trip us over. How do they interact with
:42:26. > :42:30.the world that we take entirely for granted and they find very
:42:31. > :42:33.difficult? That is a challenge. AI is a very different thing. I am
:42:34. > :42:39.going to argue that she will not replace me as a presenter and I will
:42:40. > :42:42.stick to that! Time will tell. Samanthi Flanagan visiting
:42:43. > :42:44.an exhibition which runs Samanthi showed some concern
:42:45. > :42:47.about her job being taken by robots. So over to our future overlord
:42:48. > :42:52.Sanbot with the question. Should you be worried
:42:53. > :42:54.about the rise of robots? I don't think so,
:42:55. > :42:56.but what do you think? To answer that question,
:42:57. > :42:58.I'm now joined by Luke Robert Mason,
:42:59. > :43:00.a science commentator, Michelle Hanson,
:43:01. > :43:04.a journalist and author, Kriti Sharma, vice
:43:05. > :43:06.president of AI at Sage, and Oliver Moody, science
:43:07. > :43:16.correspondent for The Times. Michelle, you have written about
:43:17. > :43:19.your concerns over the rise of artificial intelligence. What
:43:20. > :43:24.worries you the most? I am worried about this taking jobs, as she
:43:25. > :43:29.suggested. It has already taken over shop assistants. You can't go into
:43:30. > :43:32.Boots and talk to anybody. You have people standing around like odd
:43:33. > :43:39.socks with no job, just showing people how to use machines. All
:43:40. > :43:42.right, if they are going to take over thousands of jobs and people
:43:43. > :43:47.will be unemployed and people are going to get a basic wage, I bet it
:43:48. > :43:51.will be measly, not generous or luxurious. There are thousands of
:43:52. > :43:55.people out of work sitting at home with nothing to do. I expect there
:43:56. > :44:00.will be social unrest then, don't you? But robots are there to help
:44:01. > :44:04.us. Many people in history have worried about the advancement of
:44:05. > :44:14.technology. You're just like them, aren't you? No, it is much worse and
:44:15. > :44:17.out of hand now. The genie is out of the bottle. We have got to use them
:44:18. > :44:20.but we have got to control them. If you imagine it will be benign forces
:44:21. > :44:22.in charge of these robots, OK, but it won't be. It will be greedy
:44:23. > :44:24.people wanting to make lots of money. We are not a benign species
:44:25. > :44:34.and it is very risky. It seems dangerous building machines
:44:35. > :44:38.which will outsmart us and take our jobs? He is talking about super
:44:39. > :44:44.intelligent AI in the image and likeness of us as humans. Where a
:44:45. > :44:46.lot of the concern comes from around superintelligent AI is that
:44:47. > :44:51.we are equating it to human intelligence. We see intelligence as
:44:52. > :44:57.what makes us as humans the paragon of animals, but we have realised
:44:58. > :45:02.that humans do not necessarily always do the best thing for
:45:03. > :45:06.ourselves. Intelligence exists on a spectrum and if you look at where we
:45:07. > :45:11.exist here and chickens down here and dogs here, if we have something
:45:12. > :45:15.more intelligent than us we worry that we might be treated like the
:45:16. > :45:21.apes, if we take seriously the notion that we are an evolved
:45:22. > :45:23.version... That is worrying? Considering how humans are
:45:24. > :45:28.destroying the environment of animals and other entities. They
:45:29. > :45:33.would destroy us? We are concerned we will put the sorts of values that
:45:34. > :45:38.we have whereby humans have caused so much trauma and harm, we will
:45:39. > :45:43.translate some of that to our artificially intelligent machines.
:45:44. > :45:47.Are you worried? I am to a degree, but I don't think we need to be as
:45:48. > :45:51.worried as some make out. Because it will not affect you, it
:45:52. > :45:57.will affect your grandchildren. Partially because I do not think it
:45:58. > :46:02.will happen. Will it happen? How clever are these machines? As someone who
:46:03. > :46:09.works in AI every day, we are quite far from machines taking over the
:46:10. > :46:13.world, but AI is already everywhere in our daily lives. If you have ever
:46:14. > :46:20.used Google search and it prompts the next few words, it ranks the results
:46:21. > :46:25.based on what you see at the top, or you have talked to Siri, that is
:46:26. > :46:31.AI. It does not have to be a robot taking over the world, thank you to
:46:32. > :46:34.Hollywood for that image! We are creating AI good at certain
:46:35. > :46:40.specialised tasks, it is doing a good job at augmenting and supporting
:46:41. > :46:44.humans. We need to design AI correctly. It is like any other
:46:45. > :46:48.industrial revolution but the difference is this time AI learns on
:46:49. > :46:53.its own. It is the job of the creators of AI as well as the humans
:46:54. > :46:57.teaching AI or talking to it to do it right. Oliver, is it worrying
:46:58. > :47:02.that we are making something that will be more clever than us? They
:47:03. > :47:06.will never have the same ability to generate emotion, art or music, so
:47:07. > :47:12.shouldn't we say we will always be better than them? Super intelligence
:47:13. > :47:17.is a bit like the Apocalypse of computing. It has been predicted so
:47:18. > :47:22.many times; in the 1960s an academic predicted it would happen within a
:47:23. > :47:26.generation. I think there are much bigger reasons to be worried about
:47:27. > :47:30.the impact AI is having on society now. There is a common misconception
:47:31. > :47:35.that AI is a magical process by which computers can make
:47:36. > :47:38.superhumanly perfect decisions about the world. That raises a serious
:47:39. > :47:43.enough moral question when it works, but what is worse is that it often
:47:44. > :47:46.does not and you are taking limited information about what people have
:47:47. > :47:51.done in the past and using it to make decisions about who gets a
:47:52. > :47:58.mortgage, who will reoffend when they leave prison? You are doing it
:47:59. > :48:00.very badly and it is a recipe for biased and arbitrary decisions that
:48:01. > :48:05.the computers cannot even explain. Many applications of AI are computer
:48:06. > :48:09.says no on steroids. I would love to say that Emma is talking to a robot
:48:10. > :48:10.but I do not think a robot would be able to deal with your difficult
:48:11. > :48:12.questions. I'm joined now by Noel Sharkey
:48:13. > :48:23.a professor of robotics. And a real humanoid, I am told. We
:48:24. > :48:27.want to talk about jobs, lots of messages coming in, when will a
:48:28. > :48:32.robot take our jobs? Do you see that any time soon? It has already
:48:33. > :48:37.happened for some time, it is just the number of jobs they take. There
:48:38. > :48:40.is a lot of talk that perhaps there will be new jobs, and there
:48:41. > :48:46.certainly will be, but I fear that the new jobs will not be as many as
:48:47. > :48:51.we currently have, we will see a big job reduction, particularly in the
:48:52. > :48:56.service industry, making burgers, driving trucks and cars, taxis,
:48:57. > :48:59.those kinds of things will go first. AI is sweeping through many areas
:49:00. > :49:05.and it is difficult to really predict the future because there are
:49:06. > :49:10.new laws coming out in Europe and in the UK, I believe, where we are
:49:11. > :49:16.giving too many decisions to AI programmes. People are now looking
:49:17. > :49:19.at the idea that if these decisions impact on people they should be
:49:20. > :49:23.transparent and give a good explanation. That is difficult when
:49:24. > :49:26.you have machine learning because you are left with big matrices of
:49:27. > :49:33.numbers. We might see many of those things become law. On to emotion and
:49:34. > :49:38.nuance, when will we get to a point where robots and artificial
:49:39. > :49:42.intelligence develops that side of things? I can't see it myself. It
:49:43. > :49:46.might turn out that you need an organic body to feel emotions, I
:49:47. > :49:53.don't know, nobody knows. We are also chemical machines; despite the
:49:54. > :49:57.metaphor we have of our brains as computers, they are quite
:49:58. > :50:01.chemical in origin and we might be able to simulate emotions, which we
:50:02. > :50:07.can do pretty well, we can make a robot smile and frown, we can get
:50:08. > :50:12.them to perceive emotions in a sense that they can classify whether we
:50:13. > :50:16.are happy or sad, but getting them to understand sadness, a child
:50:17. > :50:24.crying, did it drop its lollipop down the toilet or did its mother
:50:25. > :50:28.just die? They are not good at subtle contextual stuff. As far as
:50:29. > :50:33.feeling emotions we are nowhere with that, we do not understand that.
:50:34. > :50:36.Thank you for painting a picture and doing a very good robot impression.
:50:37. > :50:41.One viewer says robots will only be as bad as we make them, without
:50:42. > :50:46.making them in our own image and our own reflection is frightening. If we
:50:47. > :50:51.give them our intellect, power and inhumanity they will be an adversary
:50:52. > :50:56.to humanity. Another person says why are we afraid of something better than us?
:50:57. > :50:59.Why not say, yes, they can do better than us. Another person says we
:51:00. > :51:03.should appreciate the rise of robots, solving problems for the
:51:04. > :51:06.future and automating more processes. Michael says we should
:51:07. > :51:12.only be afraid of the people behind the robots. We should be very afraid
:51:13. > :51:15.of them. Michelle, we are talking about robots being the answer to the
:51:16. > :51:20.social care crisis, they could help to look after the elderly, people
:51:21. > :51:24.with dementia? Surely even you can see that is good? That really
:51:25. > :51:28.frightens me because I am so old. The one in Japan, the 24-fingered robot
:51:29. > :51:34.that washes your hair, I do not want that. We do not have enough people
:51:35. > :51:41.to do that. Get more people, pay them, sort out social care and stop
:51:42. > :51:45.wasting money on robots. Kriti, a highly respected professor at
:51:46. > :51:49.Southampton University says it is not artificial intelligence that
:51:50. > :51:54.worries me, it is human stupidity. How far do you agree? We heard that
:51:55. > :51:58.from the e-mails? Humans are interacting with robots and
:51:59. > :52:02.surprisingly they are asking them out on dates, robots are trying to
:52:03. > :52:06.be as human as they can be. It is about designing the right solutions
:52:07. > :52:11.for the right problems. Lots of work needs to be done with education,
:52:12. > :52:16.health care, transportation, AI can help. Rather than focusing on the
:52:17. > :52:20.interactions that are stupid, we need to focus on problems to be
:52:21. > :52:26.solved. Can it solve our problems? That is less of an issue. What Noel
:52:27. > :52:29.Sharkey is talking about is key. We are already seeing artificial entities
:52:30. > :52:34.with the agency of human beings, they are called artificial persons,
:52:35. > :52:38.otherwise known as corporations, we give them a degree of agency to
:52:39. > :52:44.cause a great deal of harm. This is not science fiction, it is already
:52:45. > :52:51.here. Ten seconds, is it a good thing? Is what? AI. It has the
:52:52. > :52:55.potential to do enormous good, but now is the time to focus on the
:52:56. > :52:59.social problems that it may create. In the future I will have this debate
:53:00. > :53:02.with robots. Or cyborgs. Yes. Thank you.
:53:03. > :53:04.A celebration of the Hindu God Krishna was marked this week.
:53:05. > :53:07.He is depicted in many ways, from an innocent child,
:53:08. > :53:09.a conquering hero, a lover, a cattle herder and a musician.
:53:10. > :53:12.Mehreen Baig went to Watford to join thousands of devotees and join
:53:13. > :53:27.I am at the home of the
:53:28. > :53:31.International Society for Krishna Consciousness, better known as the
:53:32. > :53:35.Hare Krishna movement, when the house and the grounds were gifted to
:53:36. > :53:40.them by George Harrison. The manor, just outside Watford, is one of the
:53:41. > :53:45.UK's most popular Hindu pilgrimage sites, and I am here for the biggest
:53:46. > :53:50.event of the year. More than 50,000 pilgrims are expected here over the
:53:51. > :53:54.course of two days to celebrate Janmashtami. This man has been a
:53:55. > :54:04.devotee of Krishna for 15 years. It is one of the most wonderful and
:54:05. > :54:08.joyous occasions in our calendars, Sri Krishna Janmashtami,
:54:09. > :54:13.the birthday of Krishna, who we regard as supreme God. Krishna has
:54:14. > :54:17.many manifestations over the ages but the fountainhead, the source of
:54:18. > :54:20.all incarnations, is said to be Krishna.
:54:21. > :54:24.Krishna is one of the most popular Hindu gods, so Hindus from many
:54:25. > :54:29.traditions gathered to mark his birthday. The spiritual focus of the
:54:30. > :54:34.day is to offer a prayer to Krishna in the temple. I can't wait to see
:54:35. > :54:44.inside. It is a very long queue. I can hear
:54:45. > :54:47.chanting, music. I have seen things like this in the
:54:48. > :54:53.Bollywood movies we watch at home, but never in real life. It is an
:54:54. > :54:58.incredible sight to see devotees making their offerings. Food is at
:54:59. > :55:05.the heart of the celebrations, everyone who attends the festival
:55:06. > :55:09.gets a free meal, and all the food is vegetarian. This woman, who hosts
:55:10. > :55:14.a vegetarian cooking demonstration, explains why. From the spiritual
:55:15. > :55:18.perspective, Krishna says if one offers me with love and devotion a
:55:19. > :55:26.leaf, a flower, fruit or water I will accept it. He said in the
:55:27. > :55:31.Bhagavad-Gita that killing animals is not permitted. So we are
:55:32. > :55:37.vegetarian. An army of volunteers work behind the scenes in the
:55:38. > :55:41.kitchen, preparing over 50,000 plates of the free food offered to
:55:42. > :55:47.pilgrims. After I have had a quick makeover, it is time for an explanation.
:55:48. > :55:50.My mother will be very proud. I can smell some amazing things around
:55:51. > :56:00.me. Can you tell me what is going on?
:56:01. > :56:03.We have been preparing for four weeks, the last four days have been
:56:04. > :56:08.very intense, we have been chopping coriander and peppers. We offer the
:56:09. > :56:15.food to Lord Krishna once it has been sanctified. Then we give it to
:56:16. > :56:22.the pilgrims free of charge. What is the importance of feeding people?
:56:23. > :56:33.When you go to temples, you get sanctified food. It is about being a
:56:34. > :56:37.compassionate person. I am not a natural in the kitchen
:56:38. > :56:43.but my helpers welcome me all the same. I can do that. This is not me
:56:44. > :56:49.in my prime.
:56:50. > :56:56.As well as food and worship there are all kinds of entertainment to choose
:56:57. > :57:03.from. When I spot a henna artist, I can't
:57:04. > :57:14.resist getting my hand painted. I am a devotee of Krishna, so every year
:57:15. > :57:19.for the last 17 years I have been doing mehndi for Krishna. In the
:57:20. > :57:23.Scripture it says that a person who has any talent given to you by Lord
:57:24. > :57:28.Krishna, you should offer that at festival time. This is
:57:29. > :57:36.the reason why I do mehndi every year. We are spreading the message
:57:37. > :57:41.of love, humility. Just give something to earth which God has
:57:42. > :57:47.given you. All too soon it is time to go home, but I am not leaving
:57:48. > :57:56.until I collect my plate. The best part of the day! Thank you.
:57:57. > :58:01.And that is a very special end to a very special day. For me personally,
:58:02. > :58:06.this is a community I have barely had an insight into, so seeing what a
:58:07. > :58:09.strong sense of community there is, people who have been volunteering
:58:10. > :58:12.for weeks, coming after work to make sure everyone has a really great
:58:13. > :58:14.time, that was particularly heart-warming.
:58:15. > :58:15.That's all from us for this week.
:58:16. > :58:18.Many thanks to all our guests and you at home
:58:19. > :58:21.Emma will be carrying on the conversation online.
:58:22. > :58:23.Yes, I'll be talking to Kriti Sharma about the future
:58:24. > :58:25.Log on to facebook.com/bbcSundayMorningLive
:58:26. > :58:33.In the meantime, from everyone here in the studio and the whole