:00:00. > :00:07.I'm Robbie Savage - retired footballer,
:00:08. > :00:20.Social media can be great, but I have also learned
:00:21. > :00:32.I woke up the next morning, and it had completely blown out
:00:33. > :00:40.We take our questions to Twitter headquarters.
:00:41. > :00:43.And we find out how badly wrong it can go.
:00:44. > :00:46.If only they did not do these stupid things on Facebook,
:00:47. > :01:02.Every day I am amazed what some people will do on social media,
:01:03. > :01:06.without considering the consequences to themselves or to others.
:01:07. > :01:13.This video has had more than 8 million views so far
:01:14. > :01:20.Right now I have spikes in my hand, the top and bottom of my mouth,
:01:21. > :01:24.Likes and shares are being put before safety and privacy.
:01:25. > :01:32.Especially when it comes to pranking members of the public.
:01:33. > :01:46.That is what I do to make you guys laugh.
:01:47. > :01:57.For loads of us, life has become eat, sleep, social media.
:01:58. > :02:03.This new world brings opportunities, but also pitfalls.
:02:04. > :02:07.It is now easy to put your message out there for all the world to see.
:02:08. > :02:14.And once you put something online, you cannot be sure what will happen
:02:15. > :02:18.This video was posted on Facebook last June
:02:19. > :02:36.We quite like to go out to these high places and see the view
:02:37. > :02:40.We had been looking at this bridge for a while,
:02:41. > :02:47.We managed to get up the inside, and got to the top.
:02:48. > :02:50.Because it was quite a struggle to get up the inside, we thought it was
:02:51. > :02:56.But as soon as you are on the wires, and sat down, it would just be
:02:57. > :03:10.Jake says he knew what he was doing because he is
:03:11. > :03:14.Free running started out as a martial art called parkour.
:03:15. > :03:18.It was more a case of getting from A to B as fast as you can.
:03:19. > :03:23.And then it evolved into more of a stylish kind of sport.
:03:24. > :03:27.When he posted this video last June, he says he could never have
:03:28. > :03:34.We thought it would get us a few likes, get us a bit of recognition.
:03:35. > :03:37.We woke up the next morning and it had completely blown out
:03:38. > :03:39.of proportion, with thousands and thousands of views.
:03:40. > :03:45.And we were on the front cover of the newspaper.
:03:46. > :03:48.At the time, it was reported that Jake's free running group
:03:49. > :03:54.Others said it might encourage dangerous copycat behaviour.
:03:55. > :03:57.A lot of parents were understandably worried that we might be encouraging
:03:58. > :03:59.their younger children to do this kind of thing,
:04:00. > :04:08.We were not thinking about that at the time.
:04:09. > :04:13.I never would have done that if I did not have the ability to do so.
:04:14. > :04:22.Social media had turned him into a target for criticism.
:04:23. > :04:28.Social media is not your best friend.
:04:29. > :04:39.And yet social media keeps on growing.
:04:40. > :04:53.One student might be wishing they had never started using it.
:04:54. > :04:57.It shared pictures of student life and nights out,
:04:58. > :05:11.I wanted to experiment and I wanted to see if it would actually work.
:05:12. > :05:14.Probably the second, I wanted a little bit of fame,
:05:15. > :05:22.Nobody knew who I was, and I got a thrill out of that.
:05:23. > :05:29.But being in charge of this account, with loads
:05:30. > :05:32.of people following it, was a bit weird, I suppose.
:05:33. > :05:35.Alex copied the idea from a group called Cardiff Uni.
:05:36. > :05:39.That account was shut down, and the story ran in the national press.
:05:40. > :05:42.The university said the account had nothing to do with them.
:05:43. > :05:48.They sent out a strong warning to students about social media.
:05:49. > :05:50.Despite that, Alex started up Dipsnaps and encouraged
:05:51. > :06:01.What was uploaded to the group was his choice.
:06:02. > :06:05.A lot of men liked sending rude photos across, which was awkward,
:06:06. > :06:09.but I found I just had to do it because people were asking for it.
:06:10. > :06:15.Photos like this were posted uncensored, but some people were
:06:16. > :06:23.You had various drug photos, which I know the first account, Cardiff Uni,
:06:24. > :06:28.got taken down for having students taking different drugs.
:06:29. > :06:31.So I avoided putting pictures or videos of people doing all sorts
:06:32. > :06:38.There was a couple of photos of people I think trying to advertise
:06:39. > :06:50.I had a couple of photos of people with boxes full of laughing gas.
:06:51. > :06:54.I think they were trying to use the account to sell it or something,
:06:55. > :06:58.So I did not upload those photos, because obviously, I thought that
:06:59. > :07:07.It is quite shocking that people do not realise how dangerous
:07:08. > :07:13.This doctor is an expert in social media who has studied how
:07:14. > :07:19.I think if they realised how difficult it was to remove this
:07:20. > :07:21.content when it gets shared and put on other websites and hosted
:07:22. > :07:24.in different countries, which can happen pretty easily, they probably
:07:25. > :07:30.They do not realise that these images could stay online
:07:31. > :07:37.And you could have a lot of problems with removing them.
:07:38. > :07:40.As well as the risks of being featured on a video
:07:41. > :07:43.like this, the doctor feels that being responsible for creating it
:07:44. > :07:51.It would be interesting to know if the person who was running that
:07:52. > :07:53.account actually knew the ages of the people involved, especially when
:07:54. > :08:02.There is no guarantee that everybody in that film was of a certain age.
:08:03. > :08:09.Did people know that their images were actually being shared?
:08:10. > :08:12.With Snapchat and how it worked, there was no real way
:08:13. > :08:19.I assumed from how the account operates that if you send
:08:20. > :08:25.in a snap to that account, I assume that you want it to be on there.
:08:26. > :08:30.Some of the explicit and offensive pictures could cause all kinds of
:08:31. > :08:33.damage to people's reputations and careers.
:08:34. > :08:41.I had a lot come through, and we had a couple which outright said
:08:42. > :08:46.They would take a picture of someone and say,
:08:47. > :08:54.I did not upload those, because they are just really offensive.
:08:55. > :08:59.Most of them were friends taking snaps of other friends,
:09:00. > :09:04.and I just assumed that they all gave their consent to do that.
:09:05. > :09:14.How the entire thing operates, you cannot confirm it 100%.
:09:15. > :09:16.Explicit content and social media are a very dangerous mix, especially
:09:17. > :09:24.As shocking as it sounds, North Wales police say kids as young as 11
:09:25. > :09:37.A young girl who showed pictures of herself, naked pictures
:09:38. > :09:45.That particular person, the victim, is very vulnerable,
:09:46. > :09:50.and has caused some significant distress to her and her family.
:09:51. > :09:52.The force says there are even more sinister risks
:09:53. > :10:01.These children don't understand the dangers and risks they are putting
:10:02. > :10:04.themselves at when they are sharing pictures, whether it be with their
:10:05. > :10:10.friends or a complete stranger, but they perceive they know them.
:10:11. > :10:15.What they have got to understand is that a 17-year-old person
:10:16. > :10:26.who is purporting to be young John from the USA may well be an old man
:10:27. > :10:30.And it is even more disturbing when you look
:10:31. > :10:33.at how often it is happening and the ages of the children involved.
:10:34. > :10:35.Predominantly between 11 and 18, sharing pictures now.
:10:36. > :10:38.Young children, young girls, young boys sharing pictures as well.
:10:39. > :10:42.And it is happening on a daily basis.
:10:43. > :10:53.And that is just from the ones which are getting reported to us.
:10:54. > :10:55.Social media can definitely be funny, entertaining and informative.
:10:56. > :10:57.But things can get out of hand, especially
:10:58. > :11:00.when it comes to challenges or crazes which encourage people to
:11:01. > :11:15.risk their lives, performing more and more dangerous stunts.
:11:16. > :11:18.It has to be crazy to get noticed in the first place.
:11:19. > :11:21.I think that is one of the key problems or driving
:11:22. > :11:24.factors, I should say, in these extreme videos, you have to
:11:25. > :11:26.do something that is more dangerous than the person before.
:11:27. > :11:29.Some of the challenges that have been online are ridiculous.
:11:30. > :11:39.This challenge involves salt, and causes burns.
:11:40. > :11:45.Others have gone further by setting themselves on fire.
:11:46. > :12:01.it was reported that his mother was arrested for helping to film it!
:12:02. > :12:14.These crazes can start innocently enough.
:12:15. > :12:18.This one started for charity - the Ice Bucket Challenge,
:12:19. > :12:21.This man was heavily criticised for jumping from a bridge in
:12:22. > :12:25.Police called it reckless and said it should not be copied.
:12:26. > :12:29.The local MP said he could have been killed and that no one could
:12:30. > :12:33.Although he survived, a couple of weeks later, an 18-year-old
:12:34. > :12:37.It was reported that his death was linked to the Ice Bucket Challenge.
:12:38. > :12:41.In Wales, people have died as a result of social media challenges.
:12:42. > :12:54.He was funny, he was kind, and he just loved everyone.
:12:55. > :12:57.This was a drinking game that swept through social media,
:12:58. > :13:00.people filmed themselves downing a drink, nominating someone else and
:13:01. > :13:06.I got a phone call from my son, Paul, saying mum you need to come
:13:07. > :13:13.Obviously I got into the car and drove over to see him.
:13:14. > :13:16.By the time I got there, it was too late.
:13:17. > :13:25.Nothing. Nothing.
:13:26. > :13:33.It was the first time I had heard about the game.
:13:34. > :13:36.Paula's son Stephen downed a bottle of vodka and died at
:13:37. > :13:49.Paula feels angry that her son died because of the social media craze.
:13:50. > :14:06.I just wish that day never happened.
:14:07. > :14:14.If only he didn't do those stupid things on Facebook, the internet,
:14:15. > :14:21.Despite her anger, Paula has still found solace in social media.
:14:22. > :14:24.A Facebook page has been set up to remember Stephen
:14:25. > :14:28.I get messages every day on the page.
:14:29. > :14:33.I enjoy looking at all the messages, all his friends, different
:14:34. > :14:36.pictures, messages, tributes and songs.
:14:37. > :14:43.Last month, balloons were released on the anniversary of Stephen's
:14:44. > :14:50.death and the pictures were shared on Facebook.
:14:51. > :14:53.Even when people are no longer here, their stories continue
:14:54. > :15:06.I used to love social media, fair enough,
:15:07. > :15:10.I got a lot of stick but I accepted that, it goes with the territory of
:15:11. > :15:17.But a few years ago, on the day my father died, it went too far.
:15:18. > :15:27.He had passed away less than a day earlier, and someone tweeted,
:15:28. > :15:31.How someone could sit and type those nasty things is beyond me.
:15:32. > :15:40.For that minute, I was in a fit of rage and wanted to go round.
:15:41. > :15:46.I don't know what would have happened.
:15:47. > :15:48.You just have to use restraint and control
:15:49. > :15:52.and you have to think that these people are just absolute idiots.
:15:53. > :16:03.So for me now social media is all about promoting the things I am
:16:04. > :16:06.associated with - whether a phone-in, a newspaper column, being a
:16:07. > :16:17.It's a good tool to use as a broadcaster, but in terms of
:16:18. > :16:19.being a normal human being I don't go anywhere near it.
:16:20. > :16:28.Abusive comments on social media are known as trolling.
:16:29. > :16:31.There seems to be no limit to what people will say.
:16:32. > :16:34."Did you ever consider killing yourself?
:16:35. > :16:42."If I ever see you, I will BLEEP hurt you."
:16:43. > :16:48."I know where you live, you want to be careful.
:16:49. > :16:50."Go and kill yourself, you ugly BLEEP."
:16:51. > :16:53.This is the kind of abuse that Wayne got all the time.
:16:54. > :17:00.He is a YouTube celebrity with 2.5 million followers.
:17:01. > :17:03.Twice a week, he records new videos with tips on make-up.
:17:04. > :17:08.He says 90% of the reaction to his videos is positive
:17:09. > :17:29.No doubt you'll put them in the comments section and I will say,
:17:30. > :17:31.that's nice, oh, that's not very nice.
:17:32. > :17:43.It's funny that you can have 100 very positive comments,
:17:44. > :17:46.and then one bad one can cancel out the other 100 that made
:17:47. > :17:52.I totally understand why if you put yourself out there you should expect
:17:53. > :17:55.to get something back, but should you expect to get death threats from
:17:56. > :18:00.Should that create such anger that you have to accept that
:18:01. > :18:03.Wayne runs his channel from South Wales,
:18:04. > :18:08.I mean, I've had people post pictures of my house.
:18:09. > :18:09.Location, the address, my phone number...
:18:10. > :18:19.After he slid down this bridge in South Wales,
:18:20. > :18:35.People told us that they hoped we would fall and die, which was
:18:36. > :18:41.quite upsetting in a way because we were just doing what we love doing.
:18:42. > :18:43.An extreme sport, which is something we have been
:18:44. > :18:45.doing for a long time, something a lot of people do.
:18:46. > :18:50.I felt quite hurt by it and very shocked at the time and it made me
:18:51. > :18:52.feel quite angry, but I also understood where they were coming
:18:53. > :18:58.from, because from their perspective what we were doing was ridiculous.
:18:59. > :19:01.Trolling can be difficult to cope with and even harder to stop.
:19:02. > :19:04.The Crown Prosecution Service is working on new guidelines
:19:05. > :19:21.Wayne wants social media platforms to take responsibility.
:19:22. > :19:24.I would like them to be more vocal in how
:19:25. > :19:28.And for them to follow through with it.
:19:29. > :19:31.It's one thing to say this person has been blocked, but if this person
:19:32. > :19:35.is threatening to kill you, you want them to be more than blocked.
:19:36. > :19:37.They need to be reported to the police.
:19:38. > :19:38.I suppose you could do that yourself.
:19:39. > :19:40.But they have more behind-the-scenes knowledge
:19:41. > :19:43.of where that person is and what part of the country they are in.
:19:44. > :19:46.They can help the authorities to deal with that
:19:47. > :19:57.We contacted Facebook and YouTube, both declined to be interviewed.
:19:58. > :20:00.We also approached Twitter, who invited us to their offices
:20:01. > :20:02.We asked if there was more that social media
:20:03. > :20:08.If someone feels in danger for their safety in the real world, then
:20:09. > :20:14.One of the things that might happen is that someone might
:20:15. > :20:22.So the words on the screen and whether you feel in danger or not
:20:23. > :20:28.So we're going to take action and close accounts down, even
:20:29. > :20:29.where you might not feel in danger.
:20:30. > :20:34.We have a team that works with the police on a daily basis dealing
:20:35. > :20:37.with requests from law enforcement agencies, for a range of offences.
:20:38. > :20:39.It's never something we think will be done
:20:40. > :20:46.It's one of the main priorities for our business to make sure that
:20:47. > :20:49.people feel confident that when they use Twitter the clear rules are
:20:50. > :20:53.there and if rules are broken, the people who break them will have
:20:54. > :20:56.We also asked if Twitter was concerned about people taking risks
:20:57. > :21:22.a video sharing platform, then you can say, this is dangerous,
:21:23. > :21:32.to do this for a few of your friends seeing a video online?
:21:33. > :21:50.It can be a picture, a quick snap, or a quick tweet and a short video
:21:51. > :21:54.and before you know it, it hasn't disappeared and it is not just a
:21:55. > :21:58.moment in time, it creates a legacy and it is damaging your online
:21:59. > :22:05.profile in the future, or you are damaging other people's
:22:06. > :22:10.profiles. There are various organisations working to educate
:22:11. > :22:26.people about the risks of social media through videos like this one.
:22:27. > :22:29.Paula Brooks is trying to get the message out after her son died
:22:30. > :22:32.taking part in an online drinking game and she is warning everyone
:22:33. > :22:36.Anyone asks you to do anything silly, to drink, jump off bridges,
:22:37. > :22:42.Because it could be your mum who could be sitting here telling
:22:43. > :22:45.the story that I'm telling, or your father, brothers or sisters.
:22:46. > :22:54.As for Alex Jones, his Snapchat account is now closed and Cardiff
:22:55. > :23:05.I wish I didn't make it, to be honest.
:23:06. > :23:09.At the time I thought it was really cool
:23:10. > :23:17.You risk upsetting the lives of people, if they do get caught out.
:23:18. > :23:25.Social media is the most beautiful and ridiculous place to be.
:23:26. > :23:28.It can create amazing things and, at the click of a finger,
:23:29. > :23:35.Personally, I'll never stop using social media, but my