

Click or Treat

This week on Click, spotting fake news and debunking the people in power, and wandering the ruins of a nightclub. And fancy a virtual reality fright this Halloween?



Transcript



Now on BBC News,

it's time for Click.

0:00:000:00:02

This week, spotting fake news and

debunking the people in power.

0:00:090:00:15

Wandering the ruins. And

something wicked this way comes.

0:00:150:00:21

Going into space has long been the

dream of many a sci-fi fan and for

0:00:440:00:48

one BBC presenter that dream is

about to come true. In a world first

0:00:480:00:54

for the broadcast industry, Spencer

Kelly, who fronts the BBC technology

0:00:540:00:58

programme Click, has been accepted

by NASA to visit and report from

0:00:580:01:02

the International Space Station. During

his stay onboard he will present

0:01:020:01:06

several episodes of Click. He says

he has always harboured ambitions to

0:01:060:01:11

leave planet Earth and will test how

the latest technology performs in

0:01:110:01:14

zero gravity. He says he is looking

forward to the months of training

0:01:140:01:19

ahead of him.

That's not true. I'm

so sorry. That shouldn't be on the

0:01:190:01:24

autocue. That's my Christmas fantasy

list.

That's fake news! We are

0:01:240:01:30

fighting the fake news. It's fake,

phoney.

The fake media tried to

0:01:300:01:35

stop... Everyone is using the term

these days. The problem is it now

0:01:350:01:40

seems to mean anything from actual

lies to something you simply don't

0:01:400:01:43

agree with. And the technology world

is agonising over how to sort truth

0:01:430:01:48

from fiction. From opinion,

from satire, from highly skewed and

0:01:480:01:54

misleading headlines... As a result,

fact checking organisations are now

0:01:540:02:00

working to counter the fake news

threat. The First Draft coalition

0:02:000:02:05

operates around the world and in

Germany it is working alongside

0:02:050:02:10

journalists from a group to help

improve online transparency. In the

0:02:100:02:16

run-up to the recent election here

they published a daily

0:02:160:02:23

newsletter, looking at stories

suspected to be false or misleading.

0:02:230:02:26

You look at an incident. Is what is

being claimed in the captions and

0:02:260:02:35

descriptions what is actually being

insinuated? One video showed a

0:02:350:02:40

couple of maybe not traditional

northern European skinhead guys

0:02:400:02:45

waving passports. This was claimed

to be smug immigrants trampling all

0:02:450:02:51

over German people's feelings.

The

tweets said they were insulting

0:02:510:02:56

local Germans and provoking them.

Using simple tools such as reverse

0:02:560:03:01

image searches to verify the

original sources of videos and in

0:03:010:03:04

this case a facility called watch

frame by frame, journalists were

0:03:040:03:09

able to identify the street name.

The thing that helped me: there is

0:03:090:03:16

a police officer walking through the

video, in the background.

After locating the

0:03:160:03:24

police officer in question, they were

able to get an eyewitness account of

0:03:240:03:27

what happened, not just in front of

the camera but also behind.

Actually

0:03:270:03:33

we discovered that behind the camera

there are like 100 people insulting

0:03:330:03:39

these three to four guys in the

first place.

They were if anything

0:03:390:03:43

just reacting.
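The verification work described above ultimately comes down to comparing a video frame against candidate originals found by reverse image search. A toy sketch of one common matching technique, a difference hash, in pure Python (the nested lists stand in for tiny greyscale images; real tools resize full photos first, and none of this is the First Draft teams' actual code):

```python
# Toy difference hash (dHash), a technique widely used for
# near-duplicate image matching. The "images" here are nested lists
# of brightness values, purely for illustration.

def dhash(pixels):
    """Hash each row by whether brightness rises left-to-right."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if right > left else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 20, 30], [30, 20, 10]]
recompressed = [[11, 22, 29], [31, 19, 12]]  # same scene, re-encoded
unrelated = [[5, 5, 5], [9, 1, 9]]

print(hamming(dhash(original), dhash(recompressed)))  # → 0
print(hamming(dhash(original), dhash(unrelated)))     # → 3
```

Because the hash depends only on relative brightness gradients, a recompressed or slightly resized copy of a frame still matches its original, which is exactly what makes reverse image search useful for tracing a video's source.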

Another story

debunked by the group involved what

0:03:430:03:51

looked like a number of Muslims

standing at a bus stop.

The headline

0:03:510:03:57

was: this is how an Islamic society,

or an Islamistic society, would look

0:03:570:04:02

like, or does look like, and we are

heading towards this.

We are taking a

0:04:020:04:09

closer look at this... Narrowing

down where buses met, journalists

0:04:090:04:15

were able to pinpoint the spot and

the fact that the group had just

0:04:150:04:21

come out of a Christian church.

They

confirmed that they were refugees

0:04:210:04:25

and they knew the guys in the video

and they were just coming back from

0:04:250:04:29

a baptism and they were trying to

celebrate the baptism and were just

0:04:290:04:35

heading for lunch.

So this was

really misleading information and

0:04:350:04:44

trying to manipulate people and make

them worried about whether we are

0:04:440:04:47

overrun by other cultures.

The

problem is that anything can look

0:04:470:04:53

believable when it is published

online. And there is an ongoing

0:04:530:04:56

debate about whether the platforms

on which the stories are published

0:04:560:04:59

should be the ones to police them.

Making sure that quality content and

0:04:590:05:05

quality journalism is on top is a

big mission. So that's why we work

0:05:050:05:12

very closely with fact checking

organisations and media

0:05:120:05:16

organisations around the world. Just

a couple of months ago we changed

0:05:160:05:21

our ads policy around misleading

news websites.

As tech giants fight

0:05:210:05:27

the rising tide of fake news, one

thing is for certain. Ultimately we

0:05:270:05:32

are going to need an automated fact

checking system. Back in the UK, a

0:05:320:05:39

stone's throw from Westminster lies

Full Fact. This is an organisation

0:05:390:05:45

that first came to the public

attention around the time of the EU

0:05:450:05:49

referendum. These guys have some

pretty interesting fact checking

0:05:490:05:52

tools. In this session of Prime

Minister's Questions, the group is

0:05:520:05:58

verifying claims using a mixture of

manual and automated fact checking.

0:05:580:06:02

One of the automated tools being

developed looks at the trends behind

0:06:020:06:07

a claim such as where and how many

times any statement was repeated.

0:06:070:06:13

Another tool will take text from TV

subtitles and check it in real

0:06:130:06:18

time against reliable databases,

such as the Office for National

0:06:180:06:22

Statistics. Using a combination of

AI and machine learning, the

0:06:220:06:28

algorithm will perform calculations

and check facts with primary

0:06:280:06:33

sources. Eventually it could be used

in a scenario such as this.

There

0:06:330:06:36

are 10,000 more training places

available for nurses in the NHS, but

0:06:360:06:43

the right honourable gentleman...

That's not right. That's an ambition

0:06:430:06:47

for 2020; it is currently not true.

How cool would it be to debunk

0:06:470:06:52

claims like that on the spot? The

system would also be able to challenge

0:06:520:06:55

more subtle claims with lots of

caveats, such as the statement that

0:06:550:06:59

the NHS is in crisis. Nor will it

provide simple yes and no answers.

0:06:590:07:04

GDP is rising. It's kind of like

Shazam for facts. The

0:07:040:07:18

tool that I'm most excited about is

the text checking. When somebody is

0:07:180:07:23

talking live and it takes you in

real time to the primary sources. So

0:07:230:07:28

if a journalist is in a press

conference or if they are

0:07:280:07:32

interviewing someone, they can see

straightaway if there is something

0:07:320:07:36

that the person in front of them has

said is true or false, which is

0:07:360:07:40

particularly cool.

I so want that.

Have you used it? I haven't used it

0:07:400:07:44

in anger yet.

When will it be ready?

It's ready now, but it can only do

0:07:440:07:50

one sentence at a time.

You think

public figures will have to change

0:07:500:07:54

the way they behave?

There's no

debate that can happen eventually

0:07:540:07:59

without hitting on numbers, and it's

important that they are correct and aren't

0:07:590:08:04

being neglected. That's the place we

are starting from in the world we

0:08:040:08:07

want to create.
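The subtitle-checking idea described above can be sketched very roughly: pull a numeric claim out of a sentence and compare it with a reference figure. Everything in this example, the reference table, the claim format, the topic keywords, is invented for illustration; a real system would query primary sources such as the Office for National Statistics.

```python
import re

# Hypothetical reference figures -- stand-ins for a primary-source
# database such as ONS statistics. The values are made up.
REFERENCE = {
    "nurse training places": 10_000,
    "mobile internet users": 400_000_000,
}

def check_claim(sentence):
    """Return (topic, claimed, reference, matches) for the first known
    topic mentioned, or None if no checkable claim is found."""
    number = re.search(r"[\d][\d,]*", sentence)
    if not number:
        return None
    claimed = int(number.group().replace(",", ""))
    for topic, reference in REFERENCE.items():
        if topic in sentence.lower():
            return (topic, claimed, reference, claimed == reference)
    return None

print(check_claim("There are 10,000 more nurse training places."))
```

Even this toy shows why Full Fact's tool answers one sentence at a time and avoids simple yes/no verdicts: matching a claim to the right statistic, and deciding what "matches" means, is the hard part.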

From fakery of news to fakery of

0:08:070:08:11

images now. That's not exactly the

spin that Adobe would like us to put

0:08:110:08:18

on the way its products are used,

but at an event in Las Vegas it has

0:08:180:08:25

just unveiled some pretty nifty

tools to do just that. We sent

0:08:250:08:28

Richard Taylor along to have a look.

12,000 creatives under one roof, all

0:08:280:08:35

geared up to find out what's next

from the outfit that literally invented

0:08:350:08:39

photoshopping. The answer: AI as

we've never known it before.

0:08:390:08:46

Take this image of Denver

where an entire neighbourhood is

0:08:460:08:51

expunged in a flash and replaced

with something more aesthetically

0:08:510:08:55

pleasing, instead of just trying to

fill in the area with surrounding

0:08:550:08:58

pixels the software can now extract

meaning from the image and make a

0:08:580:09:04

substitute from its library of 100

million others. A similar principle

0:09:040:09:07

is at play here. The plaster now

intelligently removed as the

0:09:070:09:13

software can understand the

protrusion in the middle of

0:09:130:09:16

someone's face as a nose. And say

you want to remove something or

0:09:160:09:20

someone from a video. Right now you

could try it frame by painstaking

0:09:200:09:23

frame. The chances are the result

would look crude. But this demo is

0:09:230:09:29

real. A research project we may see

in a future version of Adobe's

0:09:290:09:34

products. In this era of fake news the

implications of being able to easily

0:09:340:09:39

fool your audience are of course

potentially troublesome, but Adobe

0:09:390:09:42

is more interested in the creative

potential.

We are trying to

0:09:420:09:48

reimagine the entire creative

process so you can create the way

0:09:480:09:51

you want. Machines can see patterns

and possibilities that we may not be

0:09:510:09:56

able to see immediately.

Adobe says

AI should allow creatives more time

0:09:560:10:01

for artistic expression and to be

creative rather than doing boring

0:10:010:10:07

and repetitive tasks. They

say the creative process should be

0:10:070:10:12

more efficient and AI can

potentially even second-guess our

0:10:120:10:15

next moves. To illustrate, check out

this Photoshop prototype which has

0:10:150:10:22

Adobe's creative assistant AI

built in.

Find some images based on

0:10:220:10:26

my sketch.

And within seconds,

options appear based on your very

0:10:260:10:32

rough sketch of a woman in a

spaceship. Now, you might be

0:10:320:10:35

thinking all of this is pretty

similar to the AI used by Google and

0:10:350:10:40

Apple.

We have decades and decades

of understanding of how people use

0:10:400:10:46

our tools. When one of the best

creative artists launches Photoshop

0:10:460:10:51

and they know what's creatively

pleasing and aesthetically pleasing,

0:10:510:10:54

we are learning from that. So we are

not training on just images of cats

0:10:540:10:59

or dogs, we are training with the

world's best people.

I'm certainly

0:10:590:11:04

impressed at how the AI could, for

example take an image of me and

0:11:040:11:08

within seconds return matches and

then further refine them. The

0:11:080:11:15

tech also understands 3D; you don't

have to be an artist to

0:11:150:11:19

understand it. Few people would

argue AI is fantastic in terms of

0:11:190:11:23

creating efficiencies, but

overreliance on our machines, instead

0:11:230:11:28

of amplifying the creative process,

could eventually end up supplanting

0:11:280:11:32

it.

I actually don't think so.

Creatives are distracted by all of

0:11:320:11:37

the things that take multiple steps,

make them suddenly move out of a

0:11:370:11:45

right-brain mode into a procedural

left-brain mode. I think of AI as

0:11:450:11:50

being this muse at the elbow.

And

that's the prevailing view amongst

0:11:500:11:55

creatives here, keen to embrace the

possibilities of an AI world.

0:11:550:12:01

Welcome to the Week in Tech. It was

the week that the Hawaiian city of

0:12:070:12:11

Honolulu began fining people up to

$99 for paying too much attention to

0:12:110:12:15

their smartphone while crossing the

road. Microsoft announced it has

0:12:150:12:19

stopped manufacturing its Kinect

motion-sensing controller. And Japanese

0:12:190:12:25

company Toyota showed off a concept

car with the airbags on the outside

0:12:250:12:29

in Tokyo. Meanwhile, Nissan revealed

the artificially created noise its

0:12:290:12:35

electric cars will emit. US

authorities are insisting all hybrid

0:12:350:12:39

and electric cars will have to emit

sound for safety reasons. Amazon now

0:12:390:12:44

wants to enter its customers' homes

when making deliveries. The system

0:12:440:12:48

is called Amazon Key. Trustworthy

types who sign up will allow

0:12:480:12:54

deliveries to be left inside their

homes. Suspicious people will be

0:12:540:12:58

able to view the delivery on a smart

camera that they've left at home.

0:12:580:13:03

What could possibly go wrong? And

creepy or cute? You decide. Sony has

0:13:030:13:08

developed a new winking robot

assistant. The robot communicates

0:13:080:13:12

with users using endearing gestures.

It is hoping the cuteness will

0:13:120:13:19

challenge Amazon's Echo range. And

researchers at Harvard have

0:13:190:13:24

developed a tiny robot that can swim

and fly. Its flapping wings

0:13:240:13:28

are used to propel it around when it

is underwater. The creators hope one

0:13:280:13:34

day similar technology could be used

in search and rescue robots.

0:13:340:13:39

This is art in the 21st century.

Trust me, it is.

0:13:420:13:50

And it actually looks and sounds

great when you are standing in the

0:13:500:13:56

middle. I'm thinking it has a

specific sound. The buzzing sound is

0:13:560:14:01

the electric current that lights the

LEDs, and it is being translated

0:14:010:14:06

into a kind of compositional

concert. It's got its own groove to

0:14:060:14:10

it. I like it. This week I'm

wandering the halls of a brand-new

0:14:100:14:15

installation in the heart of London.

It's called Everything At Once. And

0:14:150:14:21

if it doesn't actually have

everything, it certainly has a lot.

0:14:210:14:24

It's a mixture of dynamic works like

the black pot and static pieces by

0:14:240:14:28

renowned artists like Ai Weiwei and

Anish Kapoor. There are also

0:14:280:14:36

faceless voices describing their

near-death experiences.

0:14:360:14:46

I was in hospital and my

heart stopped beating.

The

0:14:460:14:49

centrepiece of the exhibition is

even more unsettling. I'm about to

0:14:490:14:53

be subjected to intensely fast

flashing images. Now, if you'd

0:14:530:14:58

rather not see them, please look

away now and come back in a couple

0:14:580:15:02

of minutes. Because I'm about to

walk through Test Pattern

0:15:020:15:08

No. 12. It's by the Japanese

electronic artist Ryoji Ikeda. The experience is

0:15:080:15:18

overwhelming. The video moves at

more than 100 frames a second and in

0:15:180:15:23

fact we've had to doctor our footage

in order to be allowed to show it on

0:15:230:15:27

TV. The video frame rate is so high

that the black and white is

0:15:270:15:36

flickering incredibly fast. I can

actually see colours in between the

0:15:360:15:41

black and white, they're moving so

fast, there's greys, I'm starting to

0:15:410:15:46

see yellow and red, maybe that's

just because my eyes are exploding,

0:15:460:15:50

I don't know. Ikeda has taken

digital files and broken them down

0:15:500:15:56

into their native zeros and ones.

It's these binary patterns that are

0:15:560:16:00

then blasted onto the viewer. After

that, time for a drink in a

0:16:000:16:06

nightclub called Ruin. Only it looks

like I've arrived after the after

0:16:060:16:11

party.
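Ikeda's raw material, a digital file reduced to its native zeros and ones, is simple to reproduce. A minimal sketch of the idea, purely illustrative rather than the artist's actual process:

```python
def to_bit_rows(data: bytes, width: int = 8):
    """Render raw bytes as rows of '0'/'1' characters -- the kind of
    binary pattern the installation blasts at the viewer."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return [bits[i:i + width] for i in range(0, len(bits), width)]

# Any file's bytes can be fed in; here a short literal stands in.
for row in to_bit_rows(b"art", width=12):
    print(row)
```

Scaled up to millions of bytes and flashed at over 100 frames a second, patterns like these become the strobing black-and-white fields described above.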

Now, earlier we were looking at

0:16:110:16:16

attempts to combat fake news and so

often these days that means the US

0:16:160:16:21

elections, Russia and the like. But

it's actually a problem all around

0:16:210:16:25

the world in different ways. David

Reid has been looking at the

0:16:250:16:28

particular issues in India.

This summer mob violence in the

0:16:280:16:36

eastern state of Jharkhand was

sparked by a rumour on WhatsApp that

0:16:360:16:41

child abductors were targeting a

tribal community. The story wasn't

0:16:410:16:47

true but still seven people died in

the violence. It doesn't take much here

0:16:470:16:51

for long simmering conflicts to boil

over and fake news like this can be

0:16:510:16:56

just the trigger for it. Stories

like these are very powerful and can

0:16:560:17:04

potentially threaten India's often

tempestuous communal relations. So

0:17:040:17:08

much so that now even the police are

getting involved in tackling fake

0:17:080:17:12

news.

I visited one of the country's main

0:17:120:17:20

cyber crime units in Hyderabad, the

capital of the southern state of

0:17:200:17:24

Telangana. Here cyber cops are

worried about the threat to law and

0:17:240:17:28

order by fake stories with the

potential to spark riots. Police

0:17:280:17:34

here investigate false and inflammatory

stories, try to get them taken down

0:17:340:17:40

and then attempt to prosecute those

producing them, but much of India's

0:17:400:17:45

fake news is spread through the

mobile communication platform

0:17:450:17:49

WhatsApp and because it's encrypted,

for police here it's a wall.

It

0:17:490:17:55

appears to be a communication

problem with WhatsApp, we try with

0:17:550:18:00

WhatsApp and they say it's not a

messaging facility. We need a date

0:18:000:18:04

and time stamp to prove a case, so

that's also not there.

0:18:040:18:10

Something like 200 million people

in India use WhatsApp. For some the

0:18:100:18:14

stories shared on the platform are

their only or main source of news.

0:18:140:18:19

If the police are hitting a road

block with WhatsApp's

0:18:190:18:26

end-to-end encryption, others are

trying to neutralise fake stories by

0:18:260:18:30

debunking them. This man is based in

Gujarat. His website Alt News roots

0:18:300:18:36

out and reveals what's wrong on the

web.

My guess is it often starts on

0:18:360:18:41

WhatsApp because those who put it on

WhatsApp know it's difficult to

0:18:410:18:45

track them down. The people who

circulate these videos, they are

0:18:450:18:49

very well aware that it's a fake

video. There's no doubt.

Videos like

0:18:490:18:57

this one purported to show a woman

being killed in India by a Muslim

0:18:570:19:01

mob.

It's one of the most grotesque,

stomach churning videos you will

0:19:010:19:05

see.

But the harrowing incident it

depicts actually took place in

0:19:050:19:09

Central America.

This video was easy

to debunk. For a lot of videos what

0:19:090:19:17

we do is break it up into frames,

we use Google reverse image search

0:19:170:19:21

and the first Google result is that

of this girl who was killed in

0:19:210:19:29

Guatemala. She was accused of being

an accomplice in a murder, she got

0:19:290:19:32

caught in a mob and she died.

And

yet many who saw the video took its

0:19:320:19:37

claims to be true. The reason is

that in India hundreds of millions

0:19:370:19:42

are encountering the Internet for

the first time and they lack the

0:19:420:19:46

media literacy to assess if the news

is actually true.

We have more than

0:19:460:19:53

400 million mobile Internet users.

50% of them are using WhatsApp.

0:19:530:20:00

WhatsApp is the main medium for

promoting the fake news. But how

0:20:000:20:04

many people are aware of the stuff

they are forwarding, whether it is

0:20:040:20:11

true or not and whether it's a

problem or not, we are not equipped

0:20:110:20:15

to deal with that and this is an

epidemic-like situation.

It's still

0:20:150:20:20

early days for the Internet in India

and as police and journalists battle

0:20:200:20:24

the fake content that can trigger

conflict, many are still prone to

0:20:240:20:29

manipulation from the lies in their

inbox. That was David in India.

0:20:290:20:36

Now, with Halloween fast

approaching, there are plenty of

0:20:360:20:39

scary movies around but none of them

will be as immersive as a virtual

0:20:390:20:42

reality horror show. And that's the

event that we've sent Nick Kwek to

0:20:420:20:49

in Covent Garden.

On the way to see a film, a movie,

0:20:490:20:53

but not as we know it: in

virtual reality. I think this is it.

0:20:530:21:03

Not your standard cinema.

The cinema is downstairs. There are people

0:21:030:21:12

down there wearing VR headsets.

Virtual reality film is super

0:21:120:21:20

exciting but right now you can only

enjoy it in the comfort of your own

0:21:200:21:24

home and it's not a social

experience, we want to bring people

0:21:240:21:27

together so they can enjoy VR with

their friends, family and partner.

0:21:270:21:32

Where am I sitting? Excellent. Can

we get any popcorn? Up and over...

0:21:320:21:40

Popcorn, excellent. Everyone ready

to go?

Yes.

Let's do it! Showtime!

0:21:400:21:50

Scary suburbia. I'm looking down, I

don't have any legs or anything, I'm

0:21:500:21:56

not a person... I've been directed

to look down the drain. Oh my goodness!

0:21:560:22:03

That is It down the drain. I've just

left that scene.

Look behind you...

0:22:030:22:19

A bit unnerving!

Someone has just

appeared in front of me. OK.

We've

0:22:190:22:27

got a collection of films five to

ten minutes each and we're showing

0:22:270:22:31

them back-to-back in a 40-minute

montage.


It's the first VR

0:22:340:22:35

cinema pop-up and it's not cutting-edge

hardware but they have got custom...

0:22:350:22:46

People can have the shared cinema

experience of being shocked all at

0:22:460:22:50

the same time.

0:22:500:22:52

This is not good. This is going to

end well, this is going to end very

0:23:020:23:09

well.

0:23:090:23:10

I'm burning alive. With several

showings starting at the top of the

0:23:120:23:19

hour, the headsets need to be taken

away for charging. It's all very

0:23:190:23:24

pop-up. But the chaps hope it will

get enough hearts racing so they

0:23:240:23:27

can open up a permanent VR cinema

later this year. Is this all just a

0:23:270:23:32

novelty, though?

This is kind of a

nightclub in Glasgow. OK, that's

0:23:320:23:39

horrendous. Horrendous! That's

enough.

Actually it was quite fun to

0:23:390:23:47

bring a group of friends together.

To go out and have a shared

0:23:470:23:51

experience, another group is coming

in right now? OK.

Masters of

0:23:510:23:55

turnaround.

We'd better be on our

way. Nick Kwek, always up for an

0:23:550:24:03

experience and regularly needing a

lie down after a shoot because of

0:24:030:24:06

it. That's it from me, in Ruin, for this

week. Don't forget we're live on

0:24:060:24:11

Facebook and Twitter throughout the

week, @BBCclick is where you'll find

0:24:110:24:15

us. Thanks for watching and we'll

see you soon.

0:24:150:24:20


