Line | From | To | |
---|---|---|---|
including racist abuse. Social networking sites have changed the | 0:00:01 | 0:00:06 | |
way we talk to one another. But what happens when you're the one | 0:00:06 | 0:00:09 | |
being talked about and you don't like what's being said? There's | 0:00:09 | 0:00:15 | |
been many times where I've cried myself to sleep. I'm very scared | 0:00:15 | 0:00:20 | |
and I've never, ever said that. I like to think that people think I'm | 0:00:20 | 0:00:28 | |
strong. Tonight on Panorama, hunting the internet bullies. Who | 0:00:28 | 0:00:33 | |
they target... You guys have ruined my life. Why they do it. Loads of | 0:00:33 | 0:00:37 | |
people are doing it. It's purely to get some kind of reaction to people. | 0:00:37 | 0:00:44 | |
And what happens when we confront them. Facebook is an open forum. | 0:00:44 | 0:00:48 | |
You're entitled to your own opinion. Some of it is racist. People have | 0:00:48 | 0:00:58 | |
been convicted for doing less than you. | 0:00:58 | 0:01:04 | |
In 2010, Cher Lloyd went from being an | 0:01:04 | 0:01:09 | |
anonymous teenager to pop star. She reached the final four of the X | 0:01:09 | 0:01:13 | |
Factor. It was a dream come true. After her very first appearance on | 0:01:13 | 0:01:18 | |
the show, she noticed something remarkable happening on her | 0:01:18 | 0:01:22 | |
Facebook profile. Within minutes my Facebook page exceeded the amount | 0:01:22 | 0:01:28 | |
of friends you could have. It was excitement for me. I just kept | 0:01:28 | 0:01:30 | |
clicking "confirm" because I couldn't believe how many people | 0:01:30 | 0:01:33 | |
wanted to be my friend. Since the X Factor she's sold over 400,000 | 0:01:33 | 0:01:39 | |
singles. But there was a price to pay - nasty, sometimes racist | 0:01:39 | 0:01:46 | |
comments posted about her online. Her mother is of Roma origin. That | 0:01:46 | 0:01:50 | |
was used to attack her and it continues to this day. I must get | 0:01:50 | 0:01:57 | |
at least ten tweets a day saying that I'm a dirty pikey, yeah. I | 0:01:57 | 0:02:06 | |
must get that. Why do they say that? Erm, I think because they | 0:02:06 | 0:02:13 | |
know it's going to get to me or because they know they can. There | 0:02:13 | 0:02:16 | |
has even been a death threat to a member of her family. There's been | 0:02:16 | 0:02:22 | |
many times where I've cried myself to sleep. I'm very scared and I | 0:02:22 | 0:02:27 | |
have never, ever said that because I like to think that people think | 0:02:27 | 0:02:31 | |
I'm strong. Because if they don't, they might not think I can do it. | 0:02:31 | 0:02:38 | |
But I am in some ways, but I think some days I just wish that people | 0:02:38 | 0:02:45 | |
would leave me alone for a little bit, wish for there to be a hole to | 0:02:45 | 0:02:52 | |
suck me in. You've been really affected by this, haven't you? | 0:02:52 | 0:02:56 | |
I never thought that I could talk to anyone about it, because I didn't | 0:02:56 | 0:03:00 | |
feel like I needed to. I thought that I'd be strong enough to get | 0:03:00 | 0:03:03 | |
through it on my own. I think that's the worst thing you can | 0:03:03 | 0:03:13 | |
possibly do. In the last five years, social networks have become a fact | 0:03:13 | 0:03:17 | |
of life. Facebook now has 30 million users in the UK. That's | 0:03:17 | 0:03:22 | |
almost half the population of the country. But for some, the runaway | 0:03:23 | 0:03:29 | |
success of social networking has come at a cost. Natasha MacBryde was | 0:03:29 | 0:03:35 | |
15 when, last year, she was bullied not on Facebook but on a smaller | 0:03:35 | 0:03:38 | |
social networking site. Tash came home from school one day, floods of | 0:03:38 | 0:03:43 | |
tears and saying that she never wanted to go back to school again. | 0:03:43 | 0:03:46 | |
It was quite serious. It wasn't just a fall out with friends that | 0:03:46 | 0:03:49 | |
she'd had and I believe that her mother did look into possibly | 0:03:49 | 0:03:53 | |
transferring schools etc. So it shows the seriousness of the nature. | 0:03:54 | 0:04:01 | |
What began as a disagreement in school soon went online. Panorama | 0:04:01 | 0:04:05 | |
understands at the height of the problem there were 30 pages of | 0:04:05 | 0:04:09 | |
critical comments about Natasha on one social network. Her father | 0:04:09 | 0:04:14 | |
believes it contributed to what happened next. It was a perfectly | 0:04:14 | 0:04:17 | |
normal evening. There had been discussions at tea. There had been | 0:04:17 | 0:04:23 | |
a bit of a fall out with one of her friends and then at 9.30pm, she | 0:04:23 | 0:04:30 | |
just walked out. Something had happened that she just decided | 0:04:30 | 0:04:38 | |
enough's enough and yeah, she walked out of the house. Early the | 0:04:38 | 0:04:42 | |
next morning, Shanie Erwin, an officer with the British Transport | 0:04:42 | 0:04:48 | |
Police, was called to the scene of an incident on the railway line. | 0:04:48 | 0:04:53 | |
About 2.30am on February 14th, a train driver discovered the body of | 0:04:53 | 0:04:58 | |
a young female on the railway line, just further down here. It was just | 0:04:58 | 0:05:04 | |
up here, yeah? Yes, just on the bridge here in front of us. | 0:05:04 | 0:05:09 | |
The first thing I knew about it was about 6.30am and suddenly, I got a | 0:05:09 | 0:05:12 | |
phone call from my son James that there had been an incident on the railway | 0:05:12 | 0:05:16 | |
and the lines were closed and that we needed to go down to the local | 0:05:16 | 0:05:23 | |
bridge. Natasha had committed suicide by throwing herself in | 0:05:23 | 0:05:27 | |
front of a train. Earlier that evening, she had read nasty | 0:05:27 | 0:05:33 | |
comments about herself online. Cases like this are still very rare, | 0:05:33 | 0:05:39 | |
but cyberbullying is becoming more and more common. Earlier today, the | 0:05:39 | 0:05:45 | |
charity Beatbullying released a major report which says that 28% of | 0:05:45 | 0:05:50 | |
11 to 16-year-olds have experienced bullying or harassment online. | 0:05:50 | 0:05:54 | |
Research shows that cyberbullying, online bullying, is relatively | 0:05:54 | 0:05:58 | |
common. A large number of children receive abusive, vicious messages, | 0:05:58 | 0:06:02 | |
have comments posted about them online, have hate pages set up | 0:06:02 | 0:06:06 | |
about them online and for some, this happens relentlessly over | 0:06:06 | 0:06:11 | |
months and months. Social networking means that bullying has | 0:06:11 | 0:06:15 | |
become more intrusive. Where once it might have ended at the school | 0:06:15 | 0:06:20 | |
gate, now it can happen around the clock. And when it comes to social | 0:06:20 | 0:06:25 | |
networks, one site is now the daddy of them all. What social network | 0:06:25 | 0:06:30 | |
site is the big one, where does it all happen? Facebook. Facebook. | 0:06:30 | 0:06:34 | |
Facebook. For the vast majority of teenagers, Facebook is now just a | 0:06:34 | 0:06:40 | |
part of life. But this isn't just any group of teenagers. They all | 0:06:41 | 0:06:46 | |
work in their spare time for the charity Beatbullying. They're | 0:06:46 | 0:06:50 | |
what's known as cybermentors. They give advice to young people who are | 0:06:50 | 0:06:53 | |
being bullied online. What kind of things do you experience or have | 0:06:53 | 0:06:58 | |
you seen on a day-to-day basis? I've seen many people get told | 0:06:58 | 0:07:02 | |
they're not allowed to come to school, no-one loves you, your mum | 0:07:02 | 0:07:05 | |
don't love you, you look like you've got a disease. So many | 0:07:05 | 0:07:11 | |
horrible things. The cybermentors said they would like to take up some | 0:07:11 | 0:07:14 | |
of the issues they had with Facebook but didn't know who to | 0:07:14 | 0:07:17 | |
call. We recorded some of their questions and asked Facebook if | 0:07:17 | 0:07:25 | |
they would answer them. Their director of policy, Richard Allan, | 0:07:25 | 0:07:29 | |
agreed. I tried to report a cyberbully and found out it was | 0:07:29 | 0:07:34 | |
hard to do. My friends struggle with it as well. We feel the report | 0:07:34 | 0:07:39 | |
button does not work. What are you going to do? Is there someone to | 0:07:39 | 0:07:45 | |
speak to at Facebook if an incident of bullying happens? What happens | 0:07:45 | 0:07:50 | |
when you press the report button, where does it go? The report goes | 0:07:50 | 0:07:54 | |
into a system where it's categorised according to the | 0:07:54 | 0:07:57 | |
seriousness of the offence. We've tried to improve the reporting tool | 0:07:57 | 0:08:01 | |
to make it as accurate as possible in terms of categorising reports. | 0:08:01 | 0:08:05 | |
Are you saying it's just too difficult, Facebook just isn't up | 0:08:05 | 0:08:11 | |
to the job of answering phone calls? You're a multibillion dollar | 0:08:11 | 0:08:14 | |
company growing every year. Companies that operate on the scale | 0:08:14 | 0:08:19 | |
we do across the internet, it's incumbent on us to make sure we | 0:08:19 | 0:08:23 | |
have systems in place to allow us to be safe and secure. The only way | 0:08:23 | 0:08:27 | |
to do that realistically and maintain services that are free to | 0:08:27 | 0:08:31 | |
the user, which is their expectation, is to put in place a | 0:08:31 | 0:08:35 | |
large number of automated systems, backed up by trained staff to look | 0:08:35 | 0:08:38 | |
at the most serious cases, which is what we do. To some of the experts | 0:08:39 | 0:08:45 | |
in this area, that's not enough. All of us need to demand more from | 0:08:45 | 0:08:49 | |
social networking sites and hold them to account to make sure that A, | 0:08:49 | 0:08:55 | |
their sites are as safe as they can possibly be and B, that they | 0:08:55 | 0:08:58 | |
actually work more and support organisations that deal with the | 0:08:58 | 0:09:04 | |
consequences of what happens on their sites. Facebook does at least | 0:09:04 | 0:09:09 | |
try to promote a real name policy. It urges users to post under their | 0:09:09 | 0:09:17 | |
genuine identities. But on some smaller social networking sites, it | 0:09:17 | 0:09:21 | |
seems the attraction lies in the fact you can post comments | 0:09:21 | 0:09:26 | |
anonymously. At Reigate College in Surrey one site created havoc in | 0:09:26 | 0:09:30 | |
the school. You all know from come together college that we won't have | 0:09:30 | 0:09:34 | |
any student here who has been made to feel uncomfortable or unwelcome | 0:09:34 | 0:09:39 | |
because of someone else making their life difficult. The website | 0:09:39 | 0:09:43 | |
is called Little Gossip. You may not have heard of it, but they have. | 0:09:43 | 0:09:48 | |
Can you put your hand up if you've been on Little Gossip, I mean just | 0:09:48 | 0:09:52 | |
looked at it? Pretty much everybody. Can you now put your hand up if | 0:09:52 | 0:09:57 | |
you've been named on it, if someone has commented about you? So quite a | 0:09:57 | 0:10:07 | |
few. It works by allowing people to leave anonymous comments about | 0:10:07 | 0:10:12 | |
anyone they like. That's bad enough, but it's all organised around | 0:10:12 | 0:10:16 | |
schools and colleges, which makes it even easier for the people | 0:10:16 | 0:10:19 | |
around you every day to see what's being written about you. No wonder | 0:10:19 | 0:10:25 | |
this site has been called a recipe for cyberbullying. A quick look at | 0:10:25 | 0:10:29 | |
the site for Reigate College reveals a list of horrendous | 0:10:29 | 0:10:32 | |
comments about real, named individuals. Like this one. | 0:10:32 | 0:10:41 | |
When it comes to nasty design features, Little Gossip seems to be | 0:10:41 | 0:10:45 | |
the site that has thought of everything. One feature means that | 0:10:45 | 0:10:49 | |
your peers can vote on whether a piece of gossip about you is true | 0:10:49 | 0:10:59 | |
or false. All you're giving them is your e-mail address. This man | 0:10:59 | 0:11:03 | |
has recently retired as head of Reigate. If there's a comment | 0:11:03 | 0:11:07 | |
about you, it's a comment that says you're unpleasant, nobody wants you | 0:11:07 | 0:11:13 | |
around, and 38 people have said "true" and no-one's said "false", it makes | 0:11:13 | 0:11:16 | |
it so much worse. It will build up the opinion that you're not | 0:11:16 | 0:11:24 | |
wanted in that community. We wanted to find out more about Little | 0:11:24 | 0:11:27 | |
Gossip. The first step was to find the person who invented it, and he | 0:11:27 | 0:11:34 | |
agreed to talk to us. Ted Nash is now a 20-year-old app | 0:11:34 | 0:11:37 | |
designer. Little Gossip was all his idea. But these days he has nothing | 0:11:37 | 0:11:42 | |
to do with it. He originally set it up from his bedroom as a chat forum | 0:11:42 | 0:11:46 | |
for his friends. What was the thinking behind it? I mean, how | 0:11:46 | 0:11:50 | |
many people did you expect would be involved with it? There's no way I | 0:11:50 | 0:11:54 | |
expected it to be as big as it was. It had 33,000 hits within the first | 0:11:54 | 0:12:00 | |
hour. Ted Nash did set up controls so that people could report abusive | 0:12:00 | 0:12:07 | |
comments and bullying. The trouble was, once the traffic took off, how | 0:12:07 | 0:12:12 | |
could one man cope with the 60,000 pieces of gossip coming in? It got to | 0:12:12 | 0:12:16 | |
a stage where I deleted everything coming through because it was too | 0:12:16 | 0:12:19 | |
fast. I took the domain offline. Through those four days I was | 0:12:19 | 0:12:22 | |
approached by a number of people who said, you know, can we take the | 0:12:23 | 0:12:29 | |
site off you? Within days, Ted sold the site on. It was difficult to | 0:12:29 | 0:12:35 | |
find any information on who owns Little Gossip now. | 0:12:35 | 0:12:42 | |
Until very recently, it was hosted by a company called Leaseweb, | 0:12:42 | 0:12:46 | |
based in the Netherlands. Leaseweb says it recently stopped hosting | 0:12:46 | 0:12:50 | |
the site and said it couldn't tell us who owned it. We did discover | 0:12:50 | 0:12:55 | |
that it was recently registered with a Russian internet service by | 0:12:55 | 0:13:02 | |
an anonymous person, who gave a contact number in Belarus, a | 0:13:02 | 0:13:05 | |
Communist-style dictatorship in Eastern Europe. | 0:13:05 | 0:13:10 | |
This is the problem with regulating sites like Little Gossip: the site's | 0:13:10 | 0:13:14 | |
users are in one country. It's hosted in another country. It's run | 0:13:14 | 0:13:18 | |
from yet another country. The owners of the site don't have to | 0:13:18 | 0:13:22 | |
tell anybody who they are or where they are. How on earth, do you | 0:13:22 | 0:13:27 | |
regulate that? Anonymity is a big problem when it comes to | 0:13:27 | 0:13:33 | |
cyberbullying. Getting offensive comments from people you know is | 0:13:33 | 0:13:39 | |
one thing, but getting them anonymously is another. | 0:13:39 | 0:13:48 | |
This is Formspring. You can ask questions and others answer | 0:13:48 | 0:13:53 | |
them. Users can post under their real name or anonymously. Natasha MacBryde was | 0:13:53 | 0:13:58 | |
bullied on Formspring. It was being abused by a lot of people in that | 0:13:58 | 0:14:05 | |
they were sending anonymous postings to Tash with some quite | 0:14:05 | 0:14:10 | |
vindictive and nasty things being said about her, which obviously | 0:14:10 | 0:14:15 | |
undermined her confidence and made her naturally very upset. Because | 0:14:15 | 0:14:19 | |
the comments were all anonymous Natasha didn't know who in the | 0:14:19 | 0:14:22 | |
school she could trust and who she couldn't. Because she didn't know | 0:14:22 | 0:14:26 | |
who it was, and probably it was one of her friends, she didn't know | 0:14:26 | 0:14:30 | |
who to turn to. There was a bad apple amongst her friends, but she | 0:14:30 | 0:14:36 | |
didn't know which one. Formspring is integrated with Facebook, so | 0:14:36 | 0:14:41 | |
that it's easy for users to connect from one site to the other. Why is | 0:14:41 | 0:14:45 | |
it that Facebook, which has a real name policy, integrates and links | 0:14:45 | 0:14:51 | |
with a site like Formspring, which revels in anonymity? We are | 0:14:51 | 0:14:55 | |
responsible for the content on our service. If a third party starts | 0:14:55 | 0:14:59 | |
abusing or breaking the terms and conditions for the use of Facebook, | 0:14:59 | 0:15:02 | |
then we'll make sure that they're not able to access our service. But | 0:15:02 | 0:15:06 | |
we're not responsible for what they do within their own environment. | 0:15:06 | 0:15:11 | |
But why link to them? Why integrate with them? | 0:15:11 | 0:15:17 | |
People on Facebook can link to most | 0:15:17 | 0:15:22 | |
of the internet. We can't have a rule where you can't link to the | 0:15:22 | 0:15:25 | |
internet. We asked Formspring for an interview. They declined. | 0:15:25 | 0:15:44 | |
For Natasha MacBryde's father, the crucial point is that anonymity on | 0:15:44 | 0:15:51 | |
the internet can be dangerous. I think the first thing is to remove | 0:15:51 | 0:15:54 | |
the anonymity of people on social network sites because an awful lot | 0:15:54 | 0:15:57 | |
of bullying is done because people can say it, because they think they | 0:15:57 | 0:16:02 | |
will not be caught and therefore that's why they say what they like. | 0:16:02 | 0:16:05 | |
And do you think anonymity played a big part in what happened to | 0:16:05 | 0:16:10 | |
Natasha? Absolutely. Anonymity - for many people, it just gives them | 0:16:10 | 0:16:16 | |
a chance to let off steam online. But for one internet subculture, | 0:16:16 | 0:16:24 | |
being anonymous is a way of life. They are internet trolls. | 0:16:24 | 0:16:34 | |
Trolls set out to provoke a reaction, often by upsetting and | 0:16:34 | 0:16:41 | |
offending people as much as they can. You guys are bitches, you know | 0:16:41 | 0:16:44 | |
what, you don't faze me. Jessica Leonhardt, an 11-year-old girl from | 0:16:44 | 0:16:47 | |
Florida, became one of the most well-known trolling victims of | 0:16:47 | 0:16:52 | |
recent years. Her story shows what can happen when you inflame the | 0:16:52 | 0:16:57 | |
global community of internet trolls. I'm more pretty than you, I have | 0:16:57 | 0:17:02 | |
more friends. It all started when Jessi went online to post a video | 0:17:02 | 0:17:06 | |
of herself, using her online name, Jessi Slaughter. I'm happy with my | 0:17:06 | 0:17:10 | |
life, OK. If you can't, like, realise that and stop hating, you | 0:17:10 | 0:17:15 | |
know, I'll pop a Glock in your mouth and make a brain slushy, OK. | 0:17:15 | 0:17:19 | |
This video also went viral. Internet trolls thought that | 0:17:19 | 0:17:24 | |
11-year-old Jessi needed a comeuppance. So they went to work. Malcolm | 0:17:24 | 0:17:28 | |
Blackman saw what happened up close. Today, he's part of the activist | 0:17:28 | 0:17:31 | |
group Anonymous UK, but when the Jessi Slaughter campaign was at its | 0:17:31 | 0:17:34 | |
height, he was in close contact with the people who were attacking her. | 0:17:34 | 0:17:42 | |
I spent quite a lot of time with several notorious world trolls | 0:17:42 | 0:17:47 | |
online. I spent some time in their company. Obviously not in their | 0:17:47 | 0:17:51 | |
personal company but online. I gained a modicum of respect from | 0:17:51 | 0:17:56 | |
them for what I do. And they, in turn, over the time I ran with | 0:17:56 | 0:18:00 | |
their pack, gained my respect. So what happened to | 0:18:01 | 0:18:03 | |
Jessi? Well, Jessi became the target of pretty much a worldwide | 0:18:03 | 0:18:08 | |
viral hate campaign. It wasn't just the trolls. I mean, it became a | 0:18:08 | 0:18:13 | |
vehicle for anyone to jump on to to really slaughter Jessi Slaughter. | 0:18:13 | 0:18:19 | |
And if you hate me. The trolls got what they wanted. Jessi posted | 0:18:19 | 0:18:23 | |
another video a few days later, this time in great distress. | 0:18:23 | 0:18:28 | |
Remember, she's 11 years old. This is Jessi Slaughter. I just need | 0:18:28 | 0:18:32 | |
to tell you guys that you have ruined my life. My house has been | 0:18:32 | 0:18:39 | |
torn apart. I'm not going to commit suicide. I am not. Jessi's father intervened as well. | 0:18:39 | 0:18:42 | |
You guys have ruined my life. I'm going to tell you right now. This | 0:18:42 | 0:18:47 | |
is from her father. You bunch of lying no-good punks. But this video | 0:18:47 | 0:18:51 | |
didn't stop the campaign, it just made it worse. By now, trolls were | 0:18:52 | 0:18:54 | |
invading every aspect of Jessi's life hijacking her social | 0:18:54 | 0:18:56 | |
networking sites, publishing her real address online, and even | 0:18:56 | 0:19:06 | |
sending constant pizza deliveries to her house. I am torn. I have had | 0:19:08 | 0:19:13 | |
emotional breakdowns one after the other. I guess the point is that | 0:19:13 | 0:19:17 | |
the internet community is merciless in a sense. It is, yeah. It doesn't | 0:19:17 | 0:19:21 | |
take account of age. It's just there are people there who are | 0:19:21 | 0:19:24 | |
going to... Well, because they can literally get away with being... | 0:19:24 | 0:19:27 | |
It's a great term, merciless it is, they can be thoroughly ruthless. | 0:19:27 | 0:19:30 | |
Malcolm says he's out to work against trolls, but he admits he | 0:19:30 | 0:19:34 | |
associates with them regularly, and he can even see why they do what | 0:19:34 | 0:19:40 | |
they do. It's the thrill, it's the win, being able to claim a victory, | 0:19:40 | 0:19:43 | |
being able to infuriate someone to the point where they log off or | 0:19:43 | 0:19:52 | |
they disconnect their computer. To a troll, that is a win. For many, | 0:19:52 | 0:19:55 | |
the worst kind of trolling happens on tribute sites to those who have | 0:19:55 | 0:20:00 | |
died. It's surprisingly common - there's even a name for it: RIP | 0:20:00 | 0:20:05 | |
trolling. Just hours after her death, Natasha MacBryde's family | 0:20:06 | 0:20:08 | |
were coming to terms with the idea that cyber-bullying may have played | 0:20:09 | 0:20:14 | |
a part in her decision to commit suicide. But nothing could have | 0:20:14 | 0:20:19 | |
prepared them for what happened next. Pretty much immediately after | 0:20:19 | 0:20:24 | |
Tash had died, a tribute site was set up on Facebook. I had a phone | 0:20:25 | 0:20:28 | |
call from a friend saying had I seen it, because there were | 0:20:28 | 0:20:33 | |
particularly nasty comments being put up. What had been posted on | 0:20:33 | 0:20:35 | |
Natasha's tribute website was a range of offensive and obscene | 0:20:35 | 0:20:41 | |
remarks. Someone had even made an animated video of trains, using a | 0:20:41 | 0:20:47 | |
picture of Natasha. How soon after her death were you reading this? | 0:20:47 | 0:20:51 | |
That was within about 24 hours. You are in the initial traumatic | 0:20:51 | 0:20:56 | |
stages of grief. I was still in a state of shock at the time when we | 0:20:56 | 0:21:00 | |
were seeing this traumatic stuff being posted about your daughter. | 0:21:00 | 0:21:04 | |
Some of it was disgusting and disgraceful. Most people have seen | 0:21:04 | 0:21:10 | |
horrendous comments in the media, but to actually see them about your own daughter... | 0:21:10 | 0:21:12 | |
Shanie Erwin of the British Transport Police was investigating | 0:21:12 | 0:21:18 | |
the death of Natasha when told of the Facebook posts. I had never | 0:21:18 | 0:21:21 | |
heard of the term trolling before so we worked with Facebook and | 0:21:21 | 0:21:23 | |
YouTube and later with the internet service providers. Enquiries | 0:21:24 | 0:21:33 | |
led us to Sean Duffy, a 25-year-old loner who | 0:21:33 | 0:21:36 | |
suffers from Asperger's syndrome. He was responsible for many of the | 0:21:36 | 0:21:42 | |
posts. Panorama has obtained exclusive footage of his police | 0:21:42 | 0:21:52 | |
interview. Apology for the loss of subtitles for 52 seconds | 0:21:52 | 0:22:45 | |
Duffy was charged under the | 0:22:45 | 0:22:48 | |
Malicious Communications Act. It's a piece of legislation that | 0:22:48 | 0:22:51 | |
predates the internet, but it's still one of the best tools the | 0:22:51 | 0:22:57 | |
police have for prosecuting trolls. He was convicted and sentenced to | 0:22:57 | 0:23:03 | |
18 weeks in prison. He served nine weeks. Most trolls are never caught, | 0:23:03 | 0:23:12 | |
but we decided to try and track one down. We met a source who's | 0:23:13 | 0:23:15 | |
familiar with the trolling community, and he gave us | 0:23:15 | 0:23:20 | |
information about one notorious troll. The screen name of that | 0:23:20 | 0:23:26 | |
troll is Nimrod Severn. On this pen drive is the evidence of what this | 0:23:26 | 0:23:31 | |
troll Nimrod Severn has been up to. And really he's just popping up on | 0:23:31 | 0:23:36 | |
loads of RIP tribute sites and leaving really offensive messages. | 0:23:36 | 0:23:45 | |
But some of them are extremely racist. On Boxing Day last year, Lancaster | 0:23:45 | 0:23:55 | |
University student Anuj Bidve was murdered here. A memorial page was | 0:23:55 | 0:23:58 | |
posted up for him on Facebook and within days Nimrod Severn posted an | 0:23:58 | 0:24:02 | |
offensive comment. "Rot in piss". But on the tribute site of US | 0:24:02 | 0:24:05 | |
rapper Dolla, who was shot dead, he used one of the most racist terms | 0:24:05 | 0:24:15 | |
We wanted to meet the person behind Nimrod Severn. And it turns out we | 0:24:19 | 0:24:27 | |
were able to put a face to a name. His real name is Darren Burton. | 0:24:27 | 0:24:36 | |
We have managed to track down Darren Burton, we've got an address | 0:24:36 | 0:24:42 | |
for him. He lives here, in Cardiff. We're going to go and try to find | 0:24:42 | 0:24:46 | |
him today and ask him the big question, which is: why? How can he | 0:24:46 | 0:24:56 | |
justify it? | 0:24:56 | 0:24:58 | |
Hi, Declan Lawn, BBC Panorama. Can I ask you about the internet | 0:24:58 | 0:25:04 | |
trolling you do? What do I suggest? Go away. But you are Nimrod Severn, | 0:25:04 | 0:25:09 | |
can I ask you about it? Can I ask you how you justify it and why you | 0:25:09 | 0:25:13 | |
do it? What I want to ask you, Darren, is have you ever thought | 0:25:13 | 0:25:17 | |
about the people that you are hurting? Have you ever considered | 0:25:17 | 0:25:22 | |
that? Have you ever considered it? Do you think about the effect it | 0:25:22 | 0:25:26 | |
has on them? That's my question. Yeah. And what do I think? Yeah, | 0:25:26 | 0:25:32 | |
what do you think? I think (BLEEP) 'em. That's what I think. Darren, | 0:25:32 | 0:25:36 | |
how do you justify it? Justify what? The trolling. The trolling. | 0:25:36 | 0:25:40 | |
How do you justify it? How do you justify the trolling? It's | 0:25:40 | 0:25:44 | |
extremely offensive and hurtful for people. Is it breaking the law? | 0:25:44 | 0:25:48 | |
Well, some of it is racist, so some of it is and some of it is clearly | 0:25:48 | 0:25:51 | |
an incitement for racial hatred. You are entitled to your own | 0:25:51 | 0:25:54 | |
opinion. Facebook is an open forum. Facebook is an open forum and | 0:25:54 | 0:25:57 | |
you're entitled to your own opinion. Some of it is racist. If people | 0:25:57 | 0:26:00 | |
don't like it then fair enough. There's things on there I don't | 0:26:00 | 0:26:03 | |
like and I don't kick off, though. People have been convicted. People | 0:26:03 | 0:26:06 | |
have been convicted for doing less than you've done. Like what? Nine | 0:26:06 | 0:26:09 | |
weeks? People have been convicted. Nine weeks in jail? Yeah. What's | 0:26:09 | 0:26:15 | |
that? So there you go, an internet troll. That's what they look like. | 0:26:15 | 0:26:20 | |
The big question here is that the law seems unevenly applied. Sean | 0:26:21 | 0:26:25 | |
Duffy wrote on Natasha MacBryde's RIP site and was convicted for it. | 0:26:25 | 0:26:32 | |
Darren here does pretty much the same. Darren Burton told us before we | 0:26:32 | 0:26:35 | |
went to see him that South Wales Police had already visited him | 0:26:35 | 0:26:40 | |
about his trolling, but says they never took it further. So we asked | 0:26:40 | 0:26:44 | |
South Wales Police about it. They acknowledged they had visited | 0:26:44 | 0:26:47 | |
Darren Burton back in November 2010 to give him what they called | 0:26:47 | 0:26:50 | |
suitable advice about his future activities on the internet and that | 0:26:50 | 0:26:57 | |
at that time there was no evidence that a crime had been committed. | 0:26:57 | 0:27:05 | |
But clearly it didn't stop Darren Burton. I think there is also a variation | 0:27:05 | 0:27:09 | |
in how the law is enforced in different cases and the level of | 0:27:09 | 0:27:11 | |
knowledge between police officers and between police forces, in terms | 0:27:11 | 0:27:16 | |
of how to deal with cyber-bullying, varies greatly. There's just no | 0:27:16 | 0:27:19 | |
joined-up approach to dealing with it, is there? There's no joined up | 0:27:19 | 0:27:22 | |
approach. Perhaps what they need is some training. And some | 0:27:23 | 0:27:25 | |
understanding and some confidence in how to deal with cyber-bullying. | 0:27:25 | 0:27:28 | |
What actual protection the law, or the different laws, that are | 0:27:28 | 0:27:31 | |
involved in this affords to the victim and what as a police officer | 0:27:31 | 0:27:36 | |
you can do. Cher Lloyd told us she feels there's no point complaining | 0:27:36 | 0:27:40 | |
to the police about the people who are making her life a misery and | 0:27:40 | 0:27:45 | |
that there's not much she can do about it. Let's say you had become | 0:27:45 | 0:27:51 | |
well known ten years ago, before social networks. Do you think you'd | 0:27:51 | 0:27:59 | |
be happier about what you were doing? Yeah. I think that... I | 0:27:59 | 0:28:06 | |
think I would be a lot happier. I'd be protected a lot more. A few | 0:28:06 | 0:28:12 | |
comments to a young girl, that's all. When it comes to the social | 0:28:12 | 0:28:15 | |
networking revolution, all of us are trying to keep up with a | 0:28:15 | 0:28:22 | |
changing world. Most of us have no problems online, but some do. And | 0:28:22 | 0:28:29 | |
when they do, they're not always protected. | 0:28:29 | 0:28:31 | |