Line | From | To | |
---|---|---|---|
On today's programme: Are street grooming gangs a Muslim problem, | :00:00. | :00:13. | |
as was claimed in a tabloid this week, or is the language | :00:14. | :00:16. | |
we use around the issue racist and divisive? | :00:17. | :00:18. | |
Equality campaigners propose all jobs be open to part-time working. | :00:19. | :00:20. | |
Will such measures help to close the gender pay gap | :00:21. | :00:24. | |
We ask is equality impossible in the workplace? | :00:25. | :00:29. | |
Also on today's show, will we take over the world? | :00:30. | :00:33. | |
I don't know about that, Sanbot, but I hope you're not looking | :00:34. | :00:35. | |
As tech billionaires battle over machines that can | :00:36. | :00:39. | |
think for themselves, should we be worried | :00:40. | :00:40. | |
Emma Barnett is here ready to help you join in the debates. | :00:41. | :00:45. | |
And you should be worried about that robot! We want to hear from you | :00:46. | :01:00. | |
about all our discussions, especially the first one about | :01:01. | :01:01. | |
grooming gangs. You can contact us by | :01:02. | :01:03. | |
Facebook and Twitter. Don't forget to use | :01:04. | :01:05. | |
the hashtag #bbcsml. Or text SML followed | :01:06. | :01:07. | |
by your message to 60011. Texts are charged at your | :01:08. | :01:09. | |
standard message rate. Or email us at | :01:10. | :01:11. | |
[email protected]. However you choose to get in touch, | :01:12. | :01:16. | |
please don't forget to include your name so I can get you involved | :01:17. | :01:19. | |
in our discussions. And later we'll meet | :01:20. | :01:21. | |
a pioneering RAF pilot who has... The fact is that there are already | :01:22. | :01:34. | |
thousands of people serving in the US military who are transgender. I | :01:35. | :01:37. | |
am living proof that being transgender does not affect how you | :01:38. | :01:38. | |
do your job. This week Labour MP | :01:39. | :01:49. | |
for Rotherham Sarah Champion quit her role on the Shadow Cabinet | :01:50. | :01:52. | |
after criticism over a newspaper article she wrote | :01:53. | :01:54. | |
about grooming gangs. "Britain has a problem | :01:55. | :02:00. | |
with British Pakistani men raping and exploiting white girls," | :02:01. | :02:03. | |
she wrote in The Sun on Friday. "Or am I just prepared | :02:04. | :02:06. | |
to call out this horrifying..." She has since apologised for what | :02:07. | :02:16. | |
she described as an extremely poor choice of words. | :02:17. | :02:21. | |
In the same newspaper, another article by former Sun | :02:22. | :02:23. | |
political editor Trevor Kavanagh referred to "the Muslim problem". | :02:24. | :02:25. | |
It was condemned in an open letter signed by more than 100 MPs. | :02:26. | :02:28. | |
So should race be talked about when tackling grooming gangs or are we in | :02:29. | :02:37. | |
danger of alienating a community? Joining us now are Nazir Afzal, | :02:38. | :02:43. | |
former Chief Crown Prosecutor for North West England, | :02:44. | :02:45. | |
Michelle Dewberry, a broadcaster, Kieran Yates, | :02:46. | :02:47. | |
a journalist and writer, and Mohammed Shafiq, chief executive | :02:48. | :02:50. | |
of the Ramadhan Foundation. Kieran, has some of the language | :02:51. | :03:00. | |
that has been used been racist and divisive? I think it has definitely | :03:01. | :03:04. | |
stoked the fire of prejudice and racism that already exists in the | :03:05. | :03:08. | |
UK. What it has also done is derailed the conversation away from | :03:09. | :03:12. | |
the victims, by somehow suggesting there is something inherently | :03:13. | :03:16. | |
sexually deviant about British Pakistani men, which is disgusting. | :03:17. | :03:20. | |
What about future victims? If we don't talk about this properly, we | :03:21. | :03:24. | |
will never solve the problem. Of course any investigation has got to | :03:25. | :03:29. | |
unpick and discuss the issues of culture and religion, but what | :03:30. | :03:31. | |
Trevor Kavanagh and Sarah Champion have done is make the conversation | :03:32. | :03:36. | |
simply about that and devoid of nuance. How did you view the | :03:37. | :03:40. | |
apology? I think it was necessary for her to do that. What Sarah | :03:41. | :03:45. | |
Champion did was speak the truth. It is not racist to say that Pakistani | :03:46. | :03:50. | |
men are grooming and abusing and raping white ladies or girls, if | :03:51. | :03:55. | |
that is what they are doing. Grouping them all together? Some | :03:56. | :03:59. | |
Pakistani men are doing that. Some Pakistani men. She never said all in | :04:00. | :04:06. | |
her article. People are leaping on this and suggesting that somehow | :04:07. | :04:09. | |
anybody with a brain cell is walking around thinking that all Pakistani | :04:10. | :04:13. | |
people are doing this. Nobody has suggested that all Pakistani men are | :04:14. | :04:18. | |
doing anything. Nobody has suggested that all Pakistani men are | :04:19. | :04:23. | |
doing anything. Growing up as a mixed-race boy in 1980s Britain, | :04:24. | :04:27. | |
people talked about black people committing crimes, and that affected | :04:28. | :04:31. | |
me. I was no more likely to commit a crime than my white colleagues. | :04:32. | :04:36. | |
Pakistani men are feeling fingers pointed at them. Nobody is suggesting that | :04:37. | :04:40. | |
all Pakistani men are doing anything. Sarah Champion's article | :04:41. | :04:43. | |
was correct. She should not have resigned and I think she was forced | :04:44. | :04:47. | |
into that. She should have stood by her words and her position. You have | :04:48. | :04:52. | |
been involved in prosecuting gangs. Does race and religion play a part | :04:53. | :04:57. | |
in these crimes? As people have said already, you can't get away from the | :04:58. | :05:01. | |
fact that British Pakistani men are disproportionately involved in | :05:02. | :05:05. | |
street grooming. British white paedophiles have a different | :05:06. | :05:13. | |
approach. They work in institutions and they organise themselves in a | :05:14. | :05:15. | |
different way. The question that you should be asking is why are we focusing | :05:16. | :05:18. | |
on race and not sexism, when we haven't talked about the victims and | :05:19. | :05:26. | |
what they have gone through. I am with Michelle. I have known Sarah for a | :05:27. | :05:29. | |
long time and she is a champion for victims and everybody involved in | :05:30. | :05:35. | |
this is a champion for victims. We can't get away from the fact that | :05:36. | :05:38. | |
British Pakistani men are disproportionately involved in group | :05:39. | :05:44. | |
grooming. Is it a race crime? It is about the availability and | :05:45. | :05:52. | |
vulnerability of young girls. Is it about the way they view white girls? | :05:53. | :05:57. | |
Part of that is true. They also targeted Pakistani girls and girls | :05:58. | :06:01. | |
from other ethnicities. I prosecuted the ringleader of Rochdale for his | :06:02. | :06:06. | |
abuse. But a smaller amount. I totally get that. That is right. The | :06:07. | :06:12. | |
fact is that victims would be targeted by predators from all | :06:13. | :06:16. | |
communities. We can't get away from the fact that Pakistani men are | :06:17. | :06:22. | |
disproportionately more involved in street grooming white girls, can we? | :06:23. | :06:27. | |
You can't and the facts speak for themselves. I have always been very | :06:28. | :06:30. | |
consistent in saying that as a community we have got to deal with | :06:31. | :06:33. | |
this. What has been difficult over the last ten days has been when it | :06:34. | :06:37. | |
has been politicised. Politicians and commentators turning it into a | :06:38. | :06:41. | |
massive debate. As my colleague said. And we have forgotten the | :06:42. | :06:46. | |
reality, the victims. I have met these victims. I have spoken to | :06:47. | :06:50. | |
them. Do you know the most distressing thing? Not a single | :06:51. | :06:54. | |
person commenting on this is talking about their experiences. I think we | :06:55. | :06:57. | |
should be talking about that. In terms of Sarah Champion, I think it | :06:58. | :07:00. | |
was right that she resigned. She deliberately told an untruth | :07:01. | :07:10. | |
when she said that she had the article changed by The Sun. | :07:11. | :07:17. | |
It was signed off by her office, not The Sun. This | :07:18. | :07:22. | |
inflammatory talk about race feeds paranoia among the far right. I am | :07:23. | :07:27. | |
standing up against these gangs and I have been campaigning against them | :07:28. | :07:31. | |
since 2007. But when you make inflammatory comments, as she did, | :07:32. | :07:37. | |
as a shadow equalities secretary, her position is untenable. You are | :07:38. | :07:43. | |
talking to one of the victims now. Yes, it is often said that the | :07:44. | :07:47. | |
victim's voice gets lost in these conversations. | :07:48. | :07:50. | |
I'm joined now by Sammy Woodhouse, a victim of the Rochdale grooming | :07:51. | :07:53. | |
gang, and now a campaigner for victims of abuse. | :07:54. | :07:55. | |
Good morning. Do you think race and religion played any role in the way | :07:56. | :08:02. | |
that these gangs operated? You had first-hand experience of it. I think | :08:03. | :08:07. | |
it has in some cases, yes. We can't get away from that. There has been | :08:08. | :08:10. | |
evidence in court cases throughout the country. As a country, we are | :08:11. | :08:20. | |
open to talking about white men raping children, but we are in 2017 | :08:21. | :08:25. | |
and we can't say Pakistani or Muslim and child abuse in the same | :08:26. | :08:30. | |
sentence. There are lots of factors involved, not just race and | :08:31. | :08:36. | |
religion. I was targeted because of my sex, I was a girl. You are | :08:37. | :08:41. | |
targeted for different reasons. There were different races involved. | :08:42. | :08:47. | |
The fact that I was in the wrong place at the wrong time, or he was. | :08:48. | :08:51. | |
He was very open to getting away with it and he thought it was | :08:52. | :08:56. | |
normal. If I can just break in, do you feel the fact that you are white | :08:57. | :09:00. | |
played a role in the fact that you were seen as vulnerable? I think it | :09:01. | :09:06. | |
did. That is part of why we were ignored and it was covered up. That | :09:07. | :09:12. | |
is happening throughout the country, a pattern. We can't make it all | :09:13. | :09:16. | |
about race, especially on to one race, but we also can't get away | :09:17. | :09:23. | |
from the facts. We can't tackle it unless we discuss everything. Do you | :09:24. | :09:27. | |
get frustrated when you see somebody like Sarah Champion having to resign | :09:28. | :09:31. | |
from the Shadow Cabinet or feeling like she has got to because she has | :09:32. | :09:37. | |
written what are facts, as she sees them, in a newspaper article? | :09:38. | :09:41. | |
Definitely. As a country we should be able to have an open and honest discussion about | :09:42. | :09:45. | |
race and religion. I don't think Sarah did anything wrong and I don't | :09:46. | :09:50. | |
see eye to eye with her myself, but putting that to one side, I think | :09:51. | :09:55. | |
she said something that is the truth. What Jeremy Corbyn did by | :09:56. | :10:00. | |
sacking her, or her resigning, whichever it was, I think that sends a far | :10:01. | :10:04. | |
more dangerous message, saying that we can't have these discussions. If | :10:05. | :10:12. | |
you were to summarise what role Pakistani culture or Islamic culture | :10:13. | :10:15. | |
potentially played in the fact that you were chosen to be groomed as a | :10:16. | :10:19. | |
young white girl, how would you summarise that from your point of | :10:20. | :10:23. | |
view and your experience? I think there are a few factors involved. | :10:24. | :10:28. | |
There are some Muslims that view women in a poor light, especially if | :10:29. | :10:33. | |
you are not a Muslim girl or Muslim woman yourself. I think we need to | :10:34. | :10:39. | |
be open to talking about it. I don't understand why it has got to be a | :10:40. | :10:44. | |
secret. Let's tackle it from all angles. I appreciate you talking to | :10:45. | :10:49. | |
us this morning. Thank you. You have been getting in touch with your | :10:50. | :10:53. | |
views. Please keep doing so. Ruth says why should this be an | :10:54. | :10:56. | |
uncomfortable truth? Discuss it openly without the fear of being | :10:57. | :11:00. | |
called racist. It is impossible not to bring race and religion into | :11:01. | :11:04. | |
these conversations. Doreen: The reality is that a group of men | :11:05. | :11:08. | |
exploited and raped young, vulnerable girls. Is it racist to | :11:09. | :11:11. | |
mention that the girls were white British and the men were of Asian | :11:12. | :11:17. | |
origin? These are the facts. Suzanne says that all of the offenders were | :11:18. | :11:21. | |
British-born. Does that mean that Britain is a nation of sex | :11:22. | :11:25. | |
offenders? You can't tar everybody with the same brush. And this one | :11:26. | :11:29. | |
says, the question is not whether the MP is racist but why a large | :11:30. | :11:34. | |
number of men from one community are preying on vulnerable girls. Very | :11:35. | :11:39. | |
interesting. Sammy says in 2017 we can't use the words child abuse and | :11:40. | :11:43. | |
Pakistani men in the same sentence. Is that political correctness gone | :11:44. | :11:48. | |
mad? I think she was saying that we can't make this completely about | :11:49. | :11:52. | |
race. And what we have seen, we are having a discussion about media | :11:53. | :11:56. | |
responses, and British Pakistani men are always part of the conversation | :11:57. | :12:00. | |
when we are talking about this. They have been intertwined into this | :12:01. | :12:05. | |
narrative, and that is why Sarah Champion was right to stand down. | :12:06. | :12:09. | |
When we are talking about this case in particular, there is a systemic | :12:10. | :12:12. | |
failure of Greater Manchester Police who did not believe the victims. | :12:13. | :12:16. | |
There is something we need to unpick and discuss there without derailing | :12:17. | :12:20. | |
the conversation. We discuss all parts of it, so why can't we talk | :12:21. | :12:24. | |
about Pakistani men who are highly involved in this? What about talking | :12:25. | :12:29. | |
about Polish truck drivers abusing young black girls? We would talk | :12:30. | :12:32. | |
about Polish truck drivers and the way they view young black women. We | :12:33. | :12:36. | |
are talking about Pakistani men disproportionately involved in | :12:37. | :12:39. | |
abusing young, white women. Why can't we talk about the way that | :12:40. | :12:43. | |
Pakistani men are brought up to view women who do not fit the mould that | :12:44. | :12:47. | |
they are expecting and don't act in the way they are expecting? Why | :12:48. | :12:50. | |
can't we talk about that? Among these criminals there is a mindset | :12:51. | :12:53. | |
that white girls are worthless. I say it time and time | :12:54. | :13:07. | |
again. Is it the criminals or is it the culture? Are they brought up to | :13:08. | :13:10. | |
think like that? It is not the culture. I am a Muslim and a | :13:11. | :13:12. | |
Pakistani and there is no way in my faith and in my community that we | :13:13. | :13:15. | |
were brought up to say you can rape white girls. When you see a young | :13:16. | :13:19. | |
white girl who is vulnerable, maybe drinking, on the streets, there is a | :13:20. | :13:24. | |
problem in that some Pakistani men are viewing them | :13:25. | :13:27. | |
in a different way. Are they just criminals or is it a cultural thing? | :13:28. | :13:32. | |
Answer the question. What I am saying is that it is an issue and we | :13:33. | :13:40. | |
need to address the issue. Why can't we talk about this in terms of race? | :13:41. | :13:43. | |
Show me some respect. I have been campaigning about this since 2007, | :13:44. | :13:46. | |
even when the police and everybody else was finding excuse for the | :13:47. | :13:50. | |
grooming. I was on the street campaigning and all we have been | :13:51. | :13:53. | |
talking about is race and grooming for the last ten years. We need to | :13:54. | :13:58. | |
talk about this in terms of race. I know but are we also going to talk | :13:59. | :14:01. | |
about the huge endemic problem of child abuse in the church? And Jimmy | :14:02. | :14:09. | |
Savile? Let's just get real on these issues and remember that was one | :14:10. | :14:14. | |
victim. I have met many, and I am sure he has. More than I have had | :14:15. | :14:20. | |
breakfasts! With Jimmy Savile, the focus and the finger was pointed at | :14:21. | :14:23. | |
a lot of male celebrities and rightly so. Somebody in my position | :14:24. | :14:28. | |
cannot do what he did and we talked about it. We discussed it. With the | :14:29. | :14:32. | |
Catholic Church we discussed it. The argument here is that we cannot talk | :14:33. | :14:34. | |
about it in terms of race. I have prosecuted dozens of these | :14:35. | :14:43. | |
cases, there is always this conversation for a few weeks after | :14:44. | :14:48. | |
the trial, then we forget about it. I am with you, we should have this | :14:49. | :14:51. | |
conversation day in, day out. Some British Pakistani men have a problem | :14:52. | :14:55. | |
with white girls on the street, there is violent misogyny going on, | :14:56. | :14:59. | |
misogyny more than anything drives their attitude towards women and | :15:00. | :15:04. | |
girls. We need to tackle it, every community, including the British | :15:05. | :15:08. | |
Pakistani community. I am never afraid of | :15:09. | :15:20. | |
saying who is responsible, if they are responsible they need to do | :15:21. | :15:23. | |
something. We have focused on this issue for ten days, we have not | :15:24. | :15:26. | |
focused about the victims, why they were not believed and why they | :15:27. | :15:28. | |
continue to suffer. Michelle, what can we do? Have honest | :15:29. | :15:30. | |
conversations, I fundamentally disagree with you both when you say | :15:31. | :15:32. | |
Sarah Champion should have resigned. You should be able to have those | :15:33. | :15:36. | |
conversations, have the courage of your convictions, hold your position | :15:37. | :15:41. | |
once you have spoken those truths. The second thing is talk to these | :15:42. | :15:46. | |
people, there have been so many convictions now, talk to these | :15:47. | :15:50. | |
people and understand what is going on, how can we stop this? There is | :15:51. | :15:55. | |
some form of cultural influence, it has to be talked about and | :15:56. | :15:59. | |
understood. I am afraid we are out of time, thank you. | :16:00. | :16:01. | |
Caroline Paige served in the RAF for more than three decades as a jet | :16:02. | :16:05. | |
and helicopter navigator but her career is notable for more | :16:06. | :16:07. | |
than just her remarkable service and battlefield expertise. | :16:08. | :16:09. | |
In 1999 she became the first transgender officer to transition. | :17:10. | :17:11. | |
Wendy Robbins went on a tour of duty to find out more about Caroline's story. | :17:12. | :17:18. | |
When Caroline Paige joined the Royal Air Force in 1980, her colleagues | :16:19. | :16:42. | |
knew her as a man called Eric, an identity she had struggled with all | :16:43. | :16:45. | |
her life. I saw myself as a female that just | :16:46. | :16:49. | |
happened to have been born with a boy's body for some reason, but I | :16:50. | :16:54. | |
knew that I was now joining an organisation where, if it was | :16:55. | :17:02. | |
discovered that I was transgender, I would be thrown out. | :17:03. | :17:07. | |
Caroline kept her secret for 19 years of her flying career. It was | :17:08. | :17:11. | |
something she had lived with since the age of five, when she saw a | :17:12. | :17:15. | |
dress on her mother's bed. I felt, well, what a beautiful | :17:16. | :17:20. | |
dress, I need to wear the dress. I put it on and it felt wonderful, | :17:21. | :17:25. | |
natural, it was me, but it was a little bit tight. And I heard | :17:26. | :17:32. | |
footsteps approaching and it was my father, and I panicked and I could | :17:33. | :17:36. | |
not get the dress off. I found it very difficult because he shouted at me | :17:37. | :17:40. | |
and made it very clear it was completely wrong, and that | :17:41. | :17:43. | |
frightened me. So I felt the best thing to do was hide it. My life | :17:44. | :17:48. | |
then became one of hopes and dreams, each night I went to bed and I would | :17:49. | :17:51. | |
hope that the following morning I would wake up and it would be fixed, | :17:52. | :18:01. | |
I would be the girl that I knew I was. | :18:02. | :18:02. | |
You wanted to be a girl, you were wearing your mum's clothes, yet you | :18:03. | :18:05. | |
also wanted to fly fast jets. It goes back to my days in Malaya when | :18:06. | :18:09. | |
we lived on an army base and the helicopters used to come backwards | :18:10. | :18:13. | |
and forwards and land in the field next to the house. | :18:14. | :18:20. | |
At 15 I learned to fly gliders, I was in the sky at 2,000 | :18:21. | :18:24. | |
feet on my own and I felt it was wonderful, fantastic. It was the | :18:25. | :18:29. | |
first time I had seen there was something I could do. You joined the | :18:30. | :18:34. | |
RAF, then a very macho environment. It was quite a contrast. | :18:35. | :18:45. | |
There were no females allowed to fly the fast jets, combat aircraft, | :18:46. | :18:53. | |
in the 80s. Not until 1991. It was a very male environment and, of | :18:54. | :18:57. | |
course, it was an extra pressure because I knew that if I was | :18:58. | :19:01. | |
discovered I would be dismissed from the air force immediately and I | :19:02. | :19:06. | |
would be outed. What led you after 19 years to reveal your true | :19:07. | :19:11. | |
identity and tell the authorities? I was always living with the fear of | :19:12. | :19:15. | |
getting caught, and I realised, you know what? I have to stand up, I | :19:16. | :19:19. | |
accept the consequences and see where it goes. I went to see the | :19:20. | :19:27. | |
medical officer and I sat down and told her, right, I need to tell you | :19:28. | :19:31. | |
something. When I did, she cleared her appointments for the rest of | :19:32. | :19:34. | |
the afternoon. You must have expected to be thrown out? Yes, I | :19:35. | :19:39. | |
expected thank you very much, we no longer require your services, get | :19:40. | :19:44. | |
out. That was going to be disappointing, but a decision came | :19:45. | :19:49. | |
back that they wanted to keep me in service, which was absolutely | :19:50. | :19:54. | |
amazingly wonderful. Well, I never dreamt that this would be possible, | :19:55. | :19:59. | |
and here I am now, Caroline Paige, female officer in the RAF. | :20:00. | :20:03. | |
In the end, the military reacted better than your parents? | :20:04. | :20:07. | |
Unfortunately my brothers never spoke to me again, my dad initially turned | :20:08. | :20:12. | |
round and said, as far as I'm concerned, you are dead. | :20:13. | :20:15. | |
That is why I | :20:16. | :20:19. | |
kept the secret so long for my family. I anticipated losing them. | :20:20. | :20:23. | |
My father was really lovely, I loved him to bits, I think he was | :20:24. | :20:27. | |
slowly coming to terms and realising I was still his child and he wanted | :20:28. | :20:32. | |
to support me, I think that was happening, but unfortunately he had | :20:33. | :20:35. | |
a heart attack and died. You were not allowed to go to the funeral? | :20:36. | :20:43. | |
The family decided they did not want me at the funeral, but I agreed with | :20:44. | :20:46. | |
a padre at the base and I had my private service, but it was very | :20:47. | :20:50. | |
sad. In 2000 a tabloid newspaper discovered | :20:51. | :20:54. | |
Caroline's story. She then experienced hostility from some RAF | :20:55. | :20:59. | |
colleagues. People came up to me and said get out of our air force, our | :21:00. | :21:02. | |
military, what on earth are you doing serving? It made me more | :21:03. | :21:07. | |
determined to get to the front line and prove them wrong. | :21:08. | :21:11. | |
Caroline was already experienced on the front line before transitioning. | :21:12. | :21:18. | |
She went on to serve a further 16 years flying battlefield helicopters | :21:19. | :21:21. | |
in Afghanistan and Iraq, winning several commendations for her work. | :21:22. | :21:25. | |
She forged lasting friendships with colleagues during that time, | :21:26. | :21:29. | |
including Andy, with whom she flew for a decade. | :21:30. | :21:32. | |
We have been hearing Caroline's story, I wonder what you made of | :21:33. | :21:38. | |
what happened to her? We have flown together on operations in | :21:39. | :21:41. | |
Yugoslavia, Iraq, Afghanistan, on many dangerous missions. She is the | :21:42. | :21:46. | |
same as everyone else, mucks in, great sense of humour, helps us through | :21:47. | :21:51. | |
the hard times, I think. Last | :21:52. | :21:57. | |
month Trump tweeted his intention to ban transgender military personnel, | :21:58. | :22:02. | |
part of his rationale being that it would degrade military readiness and | :22:03. | :22:07. | |
demoralise the troops. What did you make of that? It is appalling. There | :22:08. | :22:13. | |
are already thousands of people serving in the US military who are | :22:14. | :22:17. | |
transgender, and they have been on the front line, they have been stood | :22:18. | :22:20. | |
next to their colleagues doing the job as well as anybody else. I am | :22:21. | :22:24. | |
living proof that being transgender does not affect how you | :22:25. | :22:28. | |
do your job. Wendy Robbins talking to Caroline | :22:29. | :22:30. | |
Paige. Still to come on Sunday Morning | :22:31. | :22:30. | |
Live: As robots are becoming more developed could they ever take | :22:31. | :22:33. | |
over from humans? In the long term there is an | :22:34. | :22:43. | |
existential risk: do you create a form of being that is more | :22:44. | :22:44. | |
intelligent than us? The latest in a string of terror | :22:45. | :22:47. | |
attacks to hit Europe this year occurred in Spain this week, | :22:48. | :22:50. | |
as the Catalonia region of the country was struck twice | :22:51. | :22:52. | |
by people driving cars into crowds. The first happened on Thursday | :22:53. | :22:54. | |
as a white van smashed into people on the famous boulevard and tourist | :22:55. | :22:58. | |
hotspot of Las Ramblas, while it was packed | :22:59. | :23:01. | |
with holiday makers. Emergency services were quickly on | :23:02. | :23:12. | |
the scene but the devastating attack left 13 people dead and more than | :23:13. | :23:13. | |
100 injured. One of the visitors to Barcelona | :23:14. | :23:14. | |
at the time was the security expert Will Geddes, who was one | :23:15. | :23:17. | |
of the first on the scene. I know you were not directly there | :23:18. | :23:25. | |
when the car hit, but what did you see in the aftermath, what did you | :23:26. | :23:28. | |
hear? As soon as I heard the attack was taking place I got down | :23:29. | :23:32. | |
to the cordon as quickly as possible, I had been at Las Ramblas | :23:33. | :23:35. | |
only a couple of hours previous to the attack so I knew the area quite | :23:36. | :23:40. | |
well. When I got there the Spanish authorities had been very quick in | :23:41. | :23:44. | |
establishing a cordon, lots of the blue light services had arrived and | :23:45. | :23:50. | |
obviously lots of visitors were moved out of the location. | :23:51. | :23:53. | |
They seemed to have good control very | :23:54. | :23:58. | |
quickly. As a security expert, you were there on business, which is | :23:59. | :24:02. | |
rather ironic given what happened, why is this being described as the | :24:03. | :24:06. | |
new normal, vehicles driving into crowds? Lots of people use the term | :24:07. | :24:10. | |
low-tech, I think that gives too much sophistication to something | :24:11. | :24:15. | |
which we can fundamentally all access, a vehicle. In every city | :24:16. | :24:19. | |
centre, unless they start banning vehicles wholesale from being able to | :24:20. | :24:23. | |
access, we will always face this risk. The problem is the camouflage | :24:24. | :24:27. | |
of a vehicle being used as a delivery device for an attack will | :24:28. | :24:31. | |
only be determined at the last minute when it starts to strike | :24:32. | :24:35. | |
members of the public. In terms of what you can do in these situations, | :24:36. | :24:40. | |
many people are abroad in tourist hotspots at the moment, have you any | :24:41. | :24:47. | |
tips? As lots of people will no doubt assess, be aware of your | :24:48. | :24:51. | |
surroundings, the people in your surroundings, try to establish as | :24:52. | :24:55. | |
much as you can in your gut as to how you feel. And stop looking at | :24:56. | :25:01. | |
your mobile phone? Absolutely. You are not giving yourself a chance to | :25:02. | :25:05. | |
detect the problem before it happens. The second thing is to look | :25:06. | :25:10. | |
for points of escape, if you are walking down the boulevard, check | :25:11. | :25:13. | |
the side streets, shops, restaurants, if you needed to take | :25:14. | :25:16. | |
cover quickly, where would you do that? If they are not available, | :25:17. | :25:21. | |
street furniture like parked cars or a row of motorcycles. It might not | :25:22. | :25:25. | |
be the most robust coverage but at least it is some level of coverage. | :25:26. | :25:35. | |
As a family, with your partner or children, try to have an emergency | :25:36. | :25:38. | |
plan. If you get separated, where do you regroup? Back at the hotel or | :25:39. | :25:42. | |
somewhere else? Would you stop going to tourist hotspots? Not at all, all | :25:43. | :25:46. | |
that has happened in Barcelona is to reiterate that there is no | :25:47. | :25:49. | |
location across Europe exempt from these threats. Thank you very much, | :25:50. | :25:51. | |
Will. Every job should be open | :25:52. | :25:54. | |
to part-time working according to the Equality | :25:55. | :25:56. | |
and Human Rights Commission. The agency claims such | :25:57. | :25:57. | |
measures would help reduce the gender pay gap as well as address the pay | :25:58. | :25:59. | |
differences affecting ethnic minorities and the disabled | :26:00. | :26:03. | |
across all industries in the UK. Part-time hours, job sharing | :26:04. | :26:05. | |
and other flexible working schemes could help reduce the motherhood | :26:06. | :26:07. | |
penalty that many women face after having children, | :26:08. | :26:10. | |
the commission says. But MP Philip Davies, | :26:11. | :26:12. | |
a member of the Commons women and equalities committee, | :26:13. | :26:15. | |
called the proposals left-wing claptrap and | :26:16. | :26:17. | |
nanny state nonsense. So should employers | :26:18. | :26:18. | |
be more flexible? Or would such changes make it | :26:19. | :26:24. | |
unfeasible to run a business? Joining me now are Stefan Stern, | :26:25. | :26:26. | |
who writes about management, and Christopher Snowdon | :26:27. | :26:29. | |
from the Institute of Economic Affairs. And still | :26:30. | :26:30. | |
with us are the journalist and writer Kieran Yates | :26:31. | :26:33. | |
and broadcaster and businesswoman Stefan, starting with you, some | :26:34. | :26:43. | |
business leaders have said implementing these plans would be | :26:44. | :26:48. | |
impossible. Would they be bad for business? I think we all want | :26:49. | :26:51. | |
flexibility. Bosses have been telling us about the flexible labour | :26:52. | :26:54. | |
force they need but that has to go both ways. Businesses are losing | :26:55. | :26:58. | |
out on the talents and abilities of all sort of people who cannot get | :26:59. | :27:01. | |
the hours that they want or need which suits the demands of their | :27:02. | :27:05. | |
life, whether part-time working, term time working, certain times of | :27:06. | :27:14. | |
the year they do not want to work. We are underperforming as a country, | :27:15. | :27:16. | |
low productivity, lower wages, frustration about the economic | :27:17. | :27:19. | |
circumstances. When something is not working, we need more flexibility so | :27:20. | :27:22. | |
more of us can give our best at work. Michelle, you are a | :27:23. | :27:26. | |
businesswoman, would these proposals help women into higher paid and | :27:27. | :27:30. | |
high-ranking jobs? Well, there is a difference between part-time and | :27:31. | :27:33. | |
flexible working, they are slightly different. To say that all jobs | :27:34. | :27:38. | |
regardless of sector or responsibilities and workload can be | :27:39. | :27:41. | |
offered with flexible working is just not practical in many work | :27:42. | :27:48. | |
situations, it is just not. If you offer every potential employee | :27:49. | :27:52. | |
flexible working, for example if I say I do not want to work on a | :27:53. | :27:56. | |
Friday, if I had to hire a replacement for a Friday and he or | :27:57. | :28:00. | |
she says they don't want to do Fridays either, do I get a third or | :28:01. | :28:04. | |
fourth person? It is just not doable for all jobs to be flexible and/or | :28:05. | :28:10. | |
part-time, I don't believe. Kieran? We need to completely change the way | :28:11. | :28:19. | |
we talk about flexible working hours. There are a couple of things at | :28:20. | :28:24. | |
play, the gig economy and the high numbers of self-employed people have | :28:25. | :28:27. | |
really changed the face of the British workforce. More women are | :28:28. | :28:33. | |
taking up an increased role in the labour market, and a bigger share of | :28:34. | :28:40. | |
the division of domestic labour in the house means that women have had | :28:41. | :28:44. | |
to start galvanising this campaign towards equal pay. I think that there are a | :28:45. | :28:51. | |
couple of things to discuss and I do not think that demonising women for | :28:52. | :28:53. | |
needing flexible hours for whatever reason is the way to go. Kieran | :28:54. | :28:58. | |
talked about the gender pay gap, Christopher, how big an issue is it? | :28:59. | :29:04. | |
It is tremendously misrepresented. It exists, but not for the reasons | :29:05. | :29:08. | |
many people assume. It is not endemic sexism in the workplace. As | :29:09. | :29:13. | |
is explained every year, it comes down to three things. One, women are much more | :29:14. | :29:16. | |
likely to be in part-time work, which is generally paid less by the | :29:17. | :29:22. | |
hour, they tend on average to go towards slightly less well paying | :29:23. | :29:25. | |
occupations, things like childcare rather than engineering, for | :29:26. | :29:35. | |
example, and... I forgot what the... The gender pay gap, often we talk | :29:36. | :29:39. | |
about it as being about the same job? A woman doing the same job as a man? That | :29:40. | :29:45. | |
ruins your argument. That has been illegal for 42 years. But there is a | :29:46. | :29:51. | |
gender pay gap. On average, because more women are in part-time jobs and | :29:52. | :29:54. | |
in lower paid professions. If you look at like-for-like, | :29:58. | :29:59. | |
comparing a female engineer with a male one, they have to be paid the | :30:00. | :30:03. | |
same. If they were not, the courts would be overrun. It is the mistake | :30:04. | :30:11. | |
of looking at averages when you need to look at individuals. It comes | :30:12. | :30:16. | |
down to choices. The third one that I have remembered is bringing up | :30:17. | :30:19. | |
families. If you are going to take several years out of the labour | :30:20. | :30:22. | |
market you will obviously be paid less when you return to it, so these | :30:23. | :30:28. | |
are down to choices. It depends on the context in which these | :30:29. | :30:31. | |
apparently free choices are being made. If you look at the | :30:32. | :30:34. | |
professional services, accountants and lawyers, at graduate level, | :30:35. | :30:43. | |
there are as many women and men, but further up the chain the women have | :30:44. | :30:48. | |
gone. Maybe because they have had families. But it is perfectly possible | :30:49. | :30:55. | |
to have families and go to work. Maybe we are | :30:56. | :30:58. | |
putting pressure on women to go to work and not bring up their | :30:59. | :31:04. | |
children. Of course but there is research that shows that women would | :31:05. | :31:11. | |
cut their hours by half as much if they had genuine flexibility. From | :31:12. | :31:14. | |
an economic point of view this is a waste of potential and talent. You | :31:15. | :31:18. | |
are creating insecure workers. Women are insecure workers because they | :31:19. | :31:24. | |
are not represented. Now we have an interesting guest on this. | :31:25. | :31:27. | |
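The point about averages versus individuals debated above can be illustrated with a short sketch. The wage figures below are made up purely for illustration: even when men and women in the same occupation are paid identically, an average "pay gap" appears if more women work in the lower-paid occupation.

```python
# Illustrative only: made-up hourly wages showing how an average "gap"
# can arise from occupational composition, with equal pay within each job.
workers = [
    # (sex, job, hourly_wage) -- pay is identical within each occupation
    ("F", "engineer", 20.0), ("M", "engineer", 20.0),
    ("M", "engineer", 20.0), ("M", "engineer", 20.0),
    ("F", "childcare", 10.0), ("F", "childcare", 10.0),
    ("F", "childcare", 10.0), ("M", "childcare", 10.0),
]

def avg_wage(sex):
    # Mean hourly wage across everyone of the given sex, regardless of job.
    wages = [w for s, _, w in workers if s == sex]
    return sum(wages) / len(wages)

men, women = avg_wage("M"), avg_wage("F")
gap = (men - women) / men * 100
print(f"men {men:.2f}, women {women:.2f}, gap {gap:.1f}%")
# prints: men 17.50, women 12.50, gap 28.6%
```

A headline figure computed this way says nothing by itself about whether any two people doing the same job are paid differently, which is the distinction both guests draw.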
I'm joined by Danielle Ayers, a solicitor specialising | :31:28. | :31:29. | |
in pregnancy, maternity and sex discrimination, who | :31:30. | :31:30. | |
advises both employers and employees in these areas. | :31:31. | :31:32. | |
Good morning. Have we still got a great deal of discrimination against | :33:33. | :33:41. | |
women? Have you seen stories of late that you couldn't quite believe? | :31:42. | :31:46. | |
Unfortunately, yes. You think in this day and age that employers | :31:47. | :31:49. | |
would know what they have got to do in these situations and they would | :31:50. | :31:53. | |
be up to date on the law, and they would not fall foul of these rules | :31:54. | :31:57. | |
and regulations, but unfortunately it seems to be becoming more | :31:58. | :32:01. | |
blatant. One of the main examples I can give you is equal pay. Women are | :32:02. | :32:05. | |
being paid less to do the same job as men. That is different to looking | :32:06. | :32:10. | |
at the gender pay gap, which is taking an average of your female | :32:11. | :32:13. | |
workers and your male workers and providing an average of their wages. | :32:14. | :32:19. | |
Equal pay is looking at men and women doing the same job, where | :32:20. | :32:22. | |
there can be no argument that women should not be paid the same. Why is | :32:23. | :32:27. | |
it happening? It is down to employers not knowing what they | :32:28. | :32:30. | |
should be doing in these situations. There was a report by the equality | :32:31. | :32:33. | |
and human rights commission last year which said that more needs to | :32:34. | :32:37. | |
be done to support employers. More needs to be done to give them the | :32:38. | :32:40. | |
knowledge that they need to make sure that their employees are | :32:41. | :32:43. | |
supported and doing the right things. What knowledge do they need? | :32:44. | :32:50. | |
The law is the law, isn't it? Why is it happening? You would think so. I | :32:51. | :32:53. | |
think employers have become more bolshy over time and a lot has gone | :32:54. | :32:58. | |
in their favour. Just a couple of weeks ago, the employment tribunal | :32:59. | :33:03. | |
fees have been scrapped, which is a great thing, but a lot of things | :33:04. | :33:07. | |
have happened. ACAS conciliation has been brought in, unfair dismissal, | :33:08. | :33:11. | |
you can't bring an unfair dismissal complaint unless you have been | :33:12. | :33:15. | |
employed for two years. They are becoming more bolshy in making | :33:16. | :33:19. | |
decisions around staff. They have rights on their side. This idea that | :33:20. | :33:23. | |
everybody should be offered flexible working, surely you can see the | :33:24. | :33:27. | |
issue with that for businesses? It is hardly workable for all companies. | :33:28. | :33:33. | |
No, I don't think it is and I agree with those in the studio saying that | :33:34. | :33:36. | |
but that is not what we are asking for here. I am asking that if you | :33:37. | :33:42. | |
make an application that there is a sensible and informed discussion | :33:43. | :33:45. | |
between the parties to see whether or not there is a workable solution | :33:46. | :33:50. | |
for both. There cannot be a case where it is costing an employer so | :33:51. | :33:54. | |
much just to put flexible working in place for an employee, however in | :33:55. | :33:57. | |
the vast majority of cases there is a solution for both parties. Thank | :33:58. | :34:05. | |
you for that. We have lots of responses here. Thank you. Mavis | :34:06. | :34:08. | |
says that equality sounds fine on paper but so does communism. | :34:09. | :34:10. | |
Different people work at different rates and some work better than | :34:11. | :34:14. | |
others. Should they all get the same money? And Andrew says that all work | :34:15. | :34:18. | |
should be opened up to allow part-time positions and people | :34:19. | :34:21. | |
should be allowed to decide how many hours they are happy working. It has | :34:22. | :34:24. | |
big benefits for the employer as well. That sounds nice! Now, | :34:25. | :34:33. | |
Danielle spoke about businesses becoming bolshy. You have mentioned | :34:34. | :34:39. | |
that women have become insecure. Yes, women are an increasingly | :34:40. | :34:42. | |
insecure workforce because they are aware of discrimination. The gender | :34:43. | :34:47. | |
pay gap, the same money for the same job, that is clear evidence of | :34:48. | :34:53. | |
discrimination. British African women and Bangladeshi and Pakistani | :34:54. | :34:55. | |
women are disproportionately affected. This is something that | :34:56. | :34:58. | |
does exist and it is happening and we need to talk about it. | :34:59. | :35:02. | |
Christopher says it doesn't exist. It exists but not for those reasons. | :35:03. | :35:07. | |
The Office for National Statistics looked at this again this year and | :35:08. | :35:11. | |
it said the three reasons that I gave before are the full explanation of | :35:12. | :35:14. | |
it. There might be sexism here and there and that is why we have got | :35:15. | :35:17. | |
the courts to look into these things. Sometimes it might be sexism | :35:18. | :35:22. | |
against men, you don't know. These things are going to court on the | :35:23. | :35:25. | |
occasions that they happen. It does not explain the 10% pay gap, which | :35:26. | :35:31. | |
is just an average. It tells you nothing whatsoever. Flexible | :35:32. | :35:35. | |
working, it is incredibly inefficient actually. It might be | :35:36. | :35:40. | |
all right on a factory line or something but in most jobs doesn't | :35:41. | :35:44. | |
work. And most people don't want part-time work. There are always | :35:45. | :35:48. | |
more people who want full-time work but are working part-time currently | :35:49. | :35:52. | |
than the other way round. Flexible working is inefficient? Bad | :35:53. | :35:56. | |
management is inefficient and bosses who don't listen to their workforce | :35:57. | :36:01. | |
are inefficient. Not giving employees | :36:02. | :36:07. | |
the chance to be flexible is inefficient. That explains chronic | :36:08. | :36:14. | |
low productivity in this country. You couldn't work flexibly because | :36:15. | :36:19. | |
then you couldn't be on this show. It works in some jobs and not | :36:20. | :36:24. | |
others. That is right. In the NHS, they work long shifts; we need them to | :36:25. | :36:28. | |
be there and it has got to be managed. This is part of the job of | :36:29. | :36:32. | |
being a good manager, using people properly. It is down to the | :36:33. | :36:37. | |
managers? I think this conversation, this report calling for flexibility, | :36:38. | :36:41. | |
it is just think tanks sitting in a corner somewhere dreaming up things | :36:42. | :36:47. | |
to be more politically correct, to create equality, where actually it | :36:48. | :36:50. | |
is not realistic in the real world, in the world of business for | :36:51. | :36:55. | |
example, as one sector, you cannot turn around to all employers and say | :36:56. | :36:59. | |
that all jobs need to be flexible and part-time. I want to touch on | :37:00. | :37:03. | |
what Chris is saying. I agree with so much of what you are saying about | :37:04. | :37:09. | |
the gender pay gap. You are talking about ladies feeling discriminated | :37:10. | :37:12. | |
against in the workplace. I don't feel discriminated against in the | :37:13. | :37:15. | |
workplace. I believe in myself and my abilities and I will negotiate | :37:16. | :37:19. | |
hard. We need to stop saying to women that if you go into the world | :37:20. | :37:23. | |
of work that you will be automatically earning 80p to the | :37:24. | :37:29. | |
pound compared to men. Shouldn't we highlight a problem? That statistic, | :37:30. | :37:32. | |
it is a headline figure which takes all the men and women in this room | :37:33. | :37:35. | |
and says she earns 80p and he earns £1 and now we have got a problem. | :37:36. | :37:41. | |
You can only compare pay | :37:42. | :37:45. | |
take two people doing the same job with the same experience. So there | :37:46. | :37:49. | |
is no gender pay gap problem? I don't feel that taking the headline | :37:50. | :37:57. | |
figure and saying that women earn 80p and men £1 is helpful. And unequal pay | :37:58. | :38:04. | |
is illegal. Yes, I know. Is there a gender pay gap problem? Not in the | :38:05. | :38:08. | |
way you're trying to describe with that headline figure, no. Thank you | :38:09. | :38:09. | |
very much. A recent University of Oxford | :38:10. | :38:12. | |
study concluded that artificial intelligence, or AI, | :38:13. | :38:14. | |
will be better than humans at all tasks within 45 years, | :38:15. | :38:16. | |
and many people, including Stephen Hawking, believe humans | :38:17. | :38:20. | |
will be in trouble in the future if our ambitions don't match | :38:21. | :38:22. | |
with those of machines. Now two tech titans are engaged | :38:23. | :38:25. | |
in a public disagreement about AI. Billionaires Elon Musk | :38:26. | :38:28. | |
and Mark Zuckerberg have differing opinions of the future possibility | :38:29. | :38:34. | |
of machines that can think for themselves. Tesla and SpaceX founder | :38:35. | :38:36. | |
Musk has been warning about the dangers of AI for years. A few days ago he tweeted, "If you're | :38:37. | :38:43. | |
not concerned about AI safety, you should be." Facebook's Zuckerberg thinks | :38:44. | :38:48. | |
it's all fear-mongering. Responding to Musk's | :38:49. | :38:55. | |
warnings he said, "I think it is really negative and in some ways I actually think | :38:56. | :38:59. | |
it is pretty irresponsible". To explore the issue further, | :39:00. | :39:03. | |
we sent Samanthi Flanagan to a robots exhibition. Whether we realise it or not, robots | :39:04. | :39:19. | |
and artificial intelligence are becoming more and more part of our | :39:20. | :39:23. | |
everyday lives. I am here to meet the curator of the robots | :39:24. | :39:26. | |
exhibition, Ben Russell, who introduced me to a rather theatrical | :39:27. | :39:32. | |
robot. This is RoboThespian, built by a UK company. He is quite | :39:33. | :39:39. | |
remarkable. They decided to build a robot actor. His movements are | :39:40. | :39:43. | |
naturalistic and characterful and the expression, on his face, a | :39:44. | :39:48. | |
wonderful thing. Surely a robot couldn't replace the human actor. | :39:49. | :39:52. | |
They are not capable of emotions. You can programme stuff in. Whether | :39:53. | :39:56. | |
or not he is emoting is the question. It is following a piece of | :39:57. | :40:02. | |
code, effectively. They can't improvise. They find it difficult! | :40:03. | :40:08. | |
Is it something you are working towards? At the moment they are very | :40:09. | :40:14. | |
good at doing a carefully defined job in a very specific way | :40:15. | :40:18. | |
repeatedly, but not acting in the human environment of flexibility. | :40:19. | :40:22. | |
Now a robot worker called Baxter, who with no programming can be taught | :40:23. | :40:27. | |
new tasks and is designed to work alongside humans. It takes bits | :40:28. | :40:33. | |
of human nonverbal behaviour. If it moves its arm, it looks at where its | :40:34. | :40:37. | |
arm is going. If it meets an obstacle, it frowns. So much of | :40:38. | :40:42. | |
human conversation is not verbal and it is these physical cues that we | :40:43. | :40:47. | |
pick up on, and so it is very smart in this respect. Like a young child, | :40:48. | :40:51. | |
Baxter learns to pick up the objects through trial and error, improving | :40:52. | :40:57. | |
with each attempt. There are 3000 Baxters out there. They all learn | :40:58. | :41:03. | |
and upload that information centrally. If one robot picks an object up | :41:04. | :41:07. | |
successfully, it uploads that information and in theory all the | :41:08. | :41:11. | |
other robots should be able to pick up the same object. Sharing | :41:12. | :41:15. | |
information between machines is very important, machine learning. Do you | :41:16. | :41:20. | |
think we need to monitor the development of AI? Absolutely. | :41:21. | :41:23. | |
Everybody concentrates on the big red thing physically doing things, | :41:24. | :41:27. | |
the robot, but there is a lot of software behind-the-scenes that we | :41:28. | :41:29. | |
don't think about and that is the important thing. We are taking our | :41:30. | :41:33. | |
eye off the ball when we are thinking about the robot. As complex as | :41:34. | :41:39. | |
Baxter is, he doesn't look very human but this robot is uncannily | :41:40. | :41:45. | |
lifelike and has found work as a newsreader. How humanlike do you | :41:46. | :41:48. | |
want your robots to be? There is something called the uncanny valley. | :41:49. | :41:52. | |
You build a machine and it gets a bit lifelike and we are happy with | :41:53. | :41:56. | |
that. As soon as it is too lifelike, we get weird associations with | :41:57. | :42:00. | |
corpses and other stuff, and people back away, and that is uncanny | :42:01. | :42:06. | |
valley where robots plummet. What if we don't monitor the development of | :42:07. | :42:09. | |
AI? In the long term there is existential risk of creating a being | :42:10. | :42:15. | |
that is more intelligent than us. Generally, how does that | :42:16. | :42:18. | |
intelligence work in a physical form? That is where we come back to the | :42:19. | :42:22. | |
more mundane things which can trip us over. How do they interact with | :42:23. | :42:25. | |
the world that we take entirely for granted and they find very | :42:26. | :42:30. | |
difficult? That is a challenge. AI is a very different thing. I am | :42:31. | :42:33. | |
going to argue that she will not replace me as a presenter and I will | :42:34. | :42:39. | |
stick to that! Time will tell. Samanthi Flanagan visiting | :42:40. | :42:42. | |
an exhibition which is on now. Samanthi showed some concern | :42:43. | :42:44. | |
about her job being taken by robots, so over to our future overlord | :42:45. | :42:47. | |
Sanbot with the question. Should you be worried | :42:48. | :42:52. | |
about the rise of robots? I don't think so, | :42:53. | :42:54. | |
but what do you think? To answer that question, | :42:55. | :42:56. | |
I'm now joined by Luke Robert Mason, | :42:57. | :42:58. | |
a science commentator, Michelle Hanson, | :42:59. | :43:00. | |
a journalist and author, Kriti Sharma, vice | :43:01. | :43:04. | |
president of AI at Sage, and Oliver Moody, science | :43:05. | :43:06. | |
correspondent for The Times. Michelle, you have written about | :43:07. | :43:16. | |
your concerns over the rise of artificial intelligence. What | :43:17. | :43:19. | |
worries you the most? I am worried about this taking jobs, as she | :43:20. | :43:24. | |
suggested. It has already taken over shop assistants. You can't go into | :43:25. | :43:29. | |
Boots and talk to anybody. You have people standing around like odd | :43:30. | :43:32. | |
socks with no job, just showing people how to use machines. All | :43:33. | :43:39. | |
right, if they are going to take over thousands of jobs and people | :43:40. | :43:42. | |
will be unemployed and people are going to get a basic wage, I bet it | :43:43. | :43:47. | |
will be measly, not generous or luxurious. There are thousands of | :43:48. | :43:51. | |
people out of work sitting at home with nothing to do. I expect there | :43:52. | :43:55. | |
will be social unrest then, don't you? But robots are there to help | :43:56. | :44:00. | |
us. Many people in history have worried about the advancement of | :44:01. | :44:04. | |
technology. You're just like them, aren't you? No, it is much worse and | :44:05. | :44:14. | |
out of hand now. The genie is out of the bottle. We have got to use them | :44:15. | :44:17. | |
but we have got to control them. If you imagine it will be benign forces | :44:18. | :44:20. | |
in charge of these robots, OK, but it won't be. It will be greedy | :44:21. | :44:22. | |
people wanting to make lots of money. We are not a benign species | :44:23. | :44:24. | |
and it is very risky. It seems dangerous building machines | :44:25. | :44:34. | |
which will outsmart us and take our jobs? He is talking about super | :44:35. | :44:38. | |
intelligent AI in the image and likeness of us as humans. Where a | :44:39. | :44:44. | |
lot of the concern around superintelligent AI comes from is that | :44:45. | :44:46. | |
we are equating it to human intelligence. We see intelligence as | :44:47. | :44:51. | |
what makes us as humans the paragon of animals, but we have realised | :44:52. | :44:57. | |
that humans do not necessarily always do the best thing for | :44:58. | :45:02. | |
ourselves. Intelligence exists on a spectrum and if you look at where we | :45:03. | :45:06. | |
exist here and chickens down here and dogs here, if we have something | :45:07. | :45:11. | |
more intelligent than us, we worry that we might be treated like the | :45:12. | :45:15. | |
apes, if we take seriously the notion that we are an evolved | :45:16. | :45:21. | |
version... That is worrying? Considering how humans are | :45:22. | :45:23. | |
destroying the environment of animals and other entities. They | :45:24. | :45:28. | |
would destroy us? We are concerned we will put the sorts of values that | :45:29. | :45:33. | |
we have whereby humans have caused so much trauma and harm, we will | :45:34. | :45:38. | |
translate some of that to our artificially intelligent machines. | :45:39. | :45:43. | |
Are you worried? I am to a degree, but I don't think we need to be as | :45:44. | :45:47. | |
worried as some make out. Because it will not affect you, it | :45:48. | :45:51. | |
will affect your grandchildren. Partially because I do not think it | :45:52. | :45:57. | |
will happen. Will it happen? How clever are these machines? As someone who | :45:58. | :46:02. | |
works in AI every day, we are quite far from machines taking over the | :46:03. | :46:09. | |
world, but AI is already everywhere in our daily lives. If you have ever | :46:10. | :46:13. | |
used Google search and it prompts the next few words, or ranks the results | :46:14. | :46:20. | |
you see at the top, or you talk to Siri, that is | :46:21. | :46:25. | |
AI. It does not have to be a robot taking over the world, thank you to | :46:26. | :46:31. | |
Hollywood for that image! We are creating AI good at certain | :46:32. | :46:34. | |
specialised tasks; it is doing a good job at augmenting and supporting | :46:35. | :46:40. | |
humans. We need to design AI correctly. It is like any other | :46:41. | :46:44. | |
industrial revolution but the difference this time is that AI learns on | :46:45. | :46:48. | |
its own. It is the job of the creators of AI as well as the humans | :46:49. | :46:53. | |
teaching AI or talking to it to do it right. Oliver, is it worrying | :46:54. | :46:57. | |
that we are making something that will be more clever than us? They | :46:58. | :47:02. | |
will never have the same ability to generate emotion, art or music, so | :47:03. | :47:06. | |
shouldn't we say we will always be better than them? Super intelligence | :47:07. | :47:12. | |
is a bit like the Apocalypse of computing. It has been predicted so | :47:13. | :47:17. | |
many times; in the 1960s an academic predicted it would happen within a | :47:18. | :47:22. | |
generation. I think there are much bigger reasons to be worried about | :47:23. | :47:26. | |
the impact AI is having on society now. There is a common misconception | :47:27. | :47:30. | |
that AI is a magical process by which computers can make | :47:31. | :47:35. | |
superhumanly perfect decisions about the world. That raises a serious | :47:36. | :47:38. | |
enough moral question when it works, but what is worse is that it often | :47:39. | :47:43. | |
does not and you are taking limited information about what people have | :47:44. | :47:46. | |
done in the past and using it to make decisions about who gets a | :47:47. | :47:51. | |
mortgage, who will reoffend when they leave prison? You are doing it | :47:52. | :47:58. | |
very badly and it is a recipe for biased and arbitrary decisions that | :47:59. | :48:00. | |
the computers cannot even explain. Many applications of AI are computer | :48:01. | :48:05. | |
says no on steroids. I would love to say that Emma is talking to a robot | :48:06. | :48:09. | |
but I do not think a robot would be able to deal with your difficult | :48:10. | :48:10. | |
questions. I'm joined now by Noel Sharkey, | :48:11. | :48:12. | |
a professor of robotics. And a real humanoid, I am told. We | :48:13. | :48:23. | |
want to talk about jobs, lots of messages coming in, when will a | :48:24. | :48:27. | |
robot take our jobs? Do you see that any time soon? It has already | :48:28. | :48:32. | |
happened for some time, it is just the number of jobs they take. There | :48:33. | :48:37. | |
is a lot of talk that perhaps there will be new jobs, and there | :48:38. | :48:40. | |
certainly will be, but I fear that the new jobs will not be as many as | :48:41. | :48:46. | |
we currently have, we will see a big job reduction, particularly in the | :48:47. | :48:51. | |
service industry, making burgers, driving trucks and cars, taxis, | :48:52. | :48:56. | |
those kinds of things will go first. AI is sweeping through many areas | :48:57. | :48:59. | |
and it is difficult to really predict the future because there are | :49:00. | :49:05. | |
new laws coming out in Europe and in the UK, I believe, where we are | :49:06. | :49:10. | |
giving too many decisions to AI programmes. People are now looking | :49:11. | :49:16. | |
at the idea that if these decisions impact on people they should be | :49:17. | :49:19. | |
transparent and give a good explanation. That is difficult when | :49:20. | :49:23. | |
you have machine learning because you are left with big matrices of | :49:24. | :49:26. | |
numbers. We might see many of those things become law. On to emotion and | :49:27. | :49:33. | |
nuance, when will we get to a point where robots and artificial | :49:34. | :49:38. | |
intelligence develop that side of things? I can't see it myself. It | :49:39. | :49:42. | |
might turn out that you need an organic body to feel emotions, I | :49:43. | :49:46. | |
don't know, nobody knows. We are also chemical machines; despite the | :49:47. | :49:53. | |
metaphor we have for our brains as computers, they are quite | :49:54. | :49:57. | |
chemical in origin, and we might be able to simulate emotions, which we | :49:58. | :50:01. | |
can do pretty well, we can make a robot smile and frown, we can get | :50:02. | :50:07. | |
them to perceive emotions in a sense that they can classify whether we | :50:08. | :50:12. | |
are happy or sad, but getting them to understand sadness, a child | :50:13. | :50:16. | |
crying, did it drop its lollipop down the toilet or did its mother | :50:17. | :50:24. | |
just die? They are not good at subtle contextual stuff. As far as | :50:25. | :50:28. | |
feeling emotions we are nowhere with that, we do not understand that. | :50:29. | :50:33. | |
Thank you for painting a picture and doing a very good robot impression. | :50:34. | :50:36. | |
One viewer says robots will only be as bad as we make them, and making | :50:37. | :50:41. | |
them in our own image and our own reflection is frightening. If we | :50:42. | :50:46. | |
give them our intellect, power and inhumanity they will be an adversary | :50:47. | :50:51. | |
to humanity. Another person asks why we are afraid of something better than us. | :50:52. | :50:56. | |
Why not say, yes, they can do better than us? Another person says we | :50:57. | :50:59. | |
should appreciate the rise of robots, solving problems for the | :51:00. | :51:03. | |
future and automating more processes. Michael says we should | :51:04. | :51:06. | |
only be afraid of the people behind the robots. We should be very afraid | :51:07. | :51:12. | |
of them. Michelle, we are talking about robots being the answer to the | :51:13. | :51:15. | |
social care crisis, they could help to look after the elderly, people | :51:16. | :51:20. | |
with dementia? Surely even you can see that is good? That really | :51:21. | :51:24. | |
frightens me because I am so old. The one in Japan, the 24-fingered robot | :51:25. | :51:28. | |
that washes your hair, I do not want that. We do not have enough people | :51:29. | :51:34. | |
to do that. Get more people, pay them, sort out social care and stop | :51:35. | :51:41. | |
wasting money on robots. Kriti, a highly respected professor at | :51:42. | :51:45. | |
Southampton University says it is not artificial intelligence that | :51:46. | :51:49. | |
worries me, it is human stupidity. How far do you agree? We heard that | :51:50. | :51:54. | |
in the emails too. Humans are interacting with robots and | :51:55. | :51:58. | |
surprisingly they are asking them out on dates, robots are trying to | :51:59. | :52:02. | |
be as human as they can be. It is about designing the right solutions | :52:03. | :52:06. | |
for the right problems. Lots of work needs to be done with education, | :52:07. | :52:11. | |
health care, transportation, AI can help. Rather than focusing on the | :52:12. | :52:16. | |
interactions that are stupid, we need to focus on problems to be | :52:17. | :52:20. | |
solved. Can it solve our problems? That is less of an issue. What Noel | :52:21. | :52:26. | |
Sharkey is talking about is key. We are already seeing artificial entities | :52:27. | :52:29. | |
with agency over human beings; they are called artificial persons, | :52:30. | :52:34. | |
otherwise known as corporations, we give them a degree of agency to | :52:35. | :52:38. | |
cause a great deal of harm. This is not science fiction, it is already | :52:39. | :52:44. | |
here. Ten seconds, is it a good thing? Is what? AI. It has the | :52:45. | :52:51. | |
potential to do enormous good, but now is the time to focus on the | :52:52. | :52:55. | |
social problems that it may create. In the future I will have this debate | :52:56. | :52:59. | |
with four robots. Or cyborgs. Yes. Thank you. | :53:00. | :53:02. | |
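Noel Sharkey's point that machine-learned decisions reduce to "big matrices of numbers" can be sketched in a few lines. The weights and feature names below are hypothetical, chosen purely for illustration: the model produces a decision, but the numbers themselves offer no human-readable reason, which is the transparency worry raised in the discussion.

```python
# Illustrative only: hypothetical weights standing in for values a
# machine-learning system might have learned. The decision is just
# arithmetic over opaque numbers; nothing here explains itself.
WEIGHTS = [0.8, -1.2, 0.4]   # made-up "learned" parameters
BIAS = -0.5

def decide(features):
    # features: hypothetical [income_score, past_defaults, years_at_address]
    score = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return "approve" if score > 0 else "decline"

print(decide([1.0, 0.0, 0.5]))  # -0.5 + 0.8 + 0.0 + 0.2 = 0.5 -> approve
print(decide([1.0, 1.0, 0.5]))  # 0.5 - 1.2 = -0.7 -> decline
```

One past default flips the outcome here, but a real system may have thousands of such weights, which is why the proposed rules on explanation that Sharkey mentions are hard to satisfy.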
A celebration of the Hindu God Krishna was marked this week. | :53:03. | :53:04. | |
He is depicted in many ways: as an innocent child, | :53:05. | :53:07. | |
a conquering hero, a lover, a cattle herder and a musician. | :53:08. | :53:09. | |
Mehreen Baig went to Watford to join thousands of devotees and join | :53:10. | :53:12. | |
in the celebrations.
I am at the home of the | :53:13. | :53:27. | |
International Society For Krishna Consciousness, better known as the | :53:28. | :53:31. | |
Hare Krishna movement; the house and the grounds were gifted to | :53:32. | :53:35. | |
them by George Harrison. The Manor, just outside Watford, is one of the | :53:36. | :53:40. | |
UK's most popular Hindu pilgrimage sites, and I am here for the biggest | :53:41. | :53:45. | |
event of the year. More than 50,000 pilgrims are expected here over the | :53:46. | :53:50. | |
course of two days to celebrate Janmashtami. This man has been a | :53:51. | :53:54. | |
devotee of Krishna for 15 years. It is one of the most wonderful and | :53:55. | :54:04. | |
joyous occasions in our calendar, Sri Krishna Janmashtami, | :54:05. | :54:08. | |
the birthday of Krishna, who we regard as supreme God. Krishna has | :54:09. | :54:13. | |
many manifestations over the ages, but the fountainhead, the source of | :54:14. | :54:17. | |
all incarnations, is said to be Krishna. | :54:18. | :54:20. | |
Krishna is one of the most popular Hindu gods, so Hindus from many | :54:21. | :54:24. | |
traditions gathered to mark his birthday. The spiritual focus of the | :54:25. | :54:29. | |
day is to offer a prayer to Krishna in the temple. I can't wait to see | :54:30. | :54:34. | |
inside. It is a very long queue. I can hear | :54:35. | :54:44. | |
chanting, music. I have seen things like this in the | :54:45. | :54:47. | |
Bollywood movies we watch at home, but never in real life. It is an | :54:48. | :54:53. | |
incredible sight to see devotees making their offerings. Food is at | :54:54. | :54:58. | |
the heart of the celebrations, everyone who attends the festival | :54:59. | :55:05. | |
gets a free meal, and all the food is vegetarian. This woman, who hosts | :55:06. | :55:09. | |
a vegetarian cooking demonstration, explains why. From the spiritual | :55:10. | :55:14. | |
perspective, Krishna says if one offers me with love and devotion a | :55:15. | :55:18. | |
leaf, a flower, fruit or water I will accept it. He said in the | :55:19. | :55:26. | |
Bhagavad-Gita that killing animals is not permitted. So we are | :55:27. | :55:31. | |
vegetarian. An army of volunteers work behind the scenes in the | :55:32. | :55:37. | |
kitchen, preparing over 50,000 plates of the free food offered to | :55:38. | :55:41. | |
pilgrims. After I have had a quick makeover, it is time for more of an | :55:42. | :55:47. | |
explanation. My mother will be very proud. I can smell some amazing things | :55:48. | :55:50. | |
around me. Can you tell me what is going on? | :55:51. | :56:00. | |
We have been preparing for four weeks, the last four days have been | :56:01. | :56:03. | |
very intense, we have been chopping coriander and peppers. We offer the | :56:04. | :56:08. | |
food to Lord Krishna once it has been sanctified. Then we give it to | :56:09. | :56:15. | |
the pilgrims free of charge. What is the importance of feeding people? | :56:16. | :56:22. | |
When you go to temples, you get sanctified food. It is about being a | :56:23. | :56:33. | |
compassionate person. I am not a natural in the kitchen | :56:34. | :56:37. | |
but my helpers welcome me all the same. I can do that. This is not me | :56:38. | :56:43. | |
at my prime. | :56:44. | :56:49. | |
As well as food and worship, there are all kinds of entertainment to choose | :56:50. | :56:56. | |
from. When I spot a henna artist, I can't | :56:57. | :57:03. | |
resist getting my hand painted. I am a devotee of Krishna, so every year | :57:04. | :57:14. | |
for the last 17 years I have been doing mehndi for Krishna. In the | :57:15. | :57:19. | |
Scripture it says that if you have any talent given to you by Lord | :57:20. | :57:23. | |
Krishna, you should offer that at festival time. This is | :57:24. | :57:28. | |
the reason why I do mehndi every year. We are spreading the message | :57:29. | :57:36. | |
of love and humility. Just give back something which God has | :57:37. | :57:41. | |
given you. All too soon it is time to go home, but I am not leaving | :57:42. | :57:47. | |
until I collect my plate. The best part of the day! Thank you. | :57:48. | :57:56. | |
And that is a very special end to a very special day. For me personally, | :57:57. | :58:01. | |
this is a community I have barely had an insight into, so seeing what a | :58:02. | :58:06. | |
strong sense of community there is, people who have been volunteering | :58:07. | :58:09. | |
for weeks, coming after work to make sure everyone has a really great | :58:10. | :58:12. | |
time, that was particularly heart-warming. | :58:13. | :58:14. | |
That's all from us for this week. | :58:15. | :58:15. | |
Many thanks to all our guests and you at home | :58:16. | :58:18. | |
Emma will be carrying on the conversation online. | :58:19. | :58:21. | |
Yes, I'll be talking to Kriti Sharma about the future | :58:22. | :58:23. | |
Log on to facebook.com/bbcSundayMorningLive | :58:24. | :58:25. | |
In the meantime, from everyone here in the studio and the whole | :58:26. | :58:33. |