Line | From | To | |
---|---|---|---|
Now on BBC News,
it's time for Click. | 0:00:00 | 0:00:02 | |
This week, spotting fake news and
debunking the people in power. | 0:00:09 | 0:00:15 | |
Wandering in the ruins. And
something wicked this way comes. | 0:00:15 | 0:00:21 | |
Going into space has long been the
dream of many a sci-fi fan and for | 0:00:44 | 0:00:48 | |
one BBC presenter that dream is
about to come true. In a world first | 0:00:48 | 0:00:54 | |
for the broadcast industry, Spencer
Kelly, who fronts the BBC technology | 0:00:54 | 0:00:58 | |
programme Click, has been accepted
by NASA to visit and report from | 0:00:58 | 0:01:02 | |
the International Space Station. During
his stay onboard he will present | 0:01:02 | 0:01:06 | |
several episodes of Click. He says
he has always harboured ambitions to | 0:01:06 | 0:01:11 | |
leave planet Earth and will test how
the latest technology performs in | 0:01:11 | 0:01:14 | |
zero gravity. He says he is looking
forward to the months of training | 0:01:14 | 0:01:19 | |
ahead of him. That's not true. I'm
so sorry. That shouldn't be on the | 0:01:19 | 0:01:24 | |
autocue. That's my Christmas fantasy
list. That's fake news! We are | 0:01:24 | 0:01:30 | |
fighting the fake news. It's fake,
phoney. The fake media tried to | 0:01:30 | 0:01:35 | |
stop... Everyone is using the term
these days. The problem is it now | 0:01:35 | 0:01:40 | |
seems to mean anything from actual
lies to something you simply don't | 0:01:40 | 0:01:43 | |
agree with. And the technology world
is anguishing over how to sort fact | 0:01:43 | 0:01:48 | |
from fiction. From opinion,
from satire, from highly skewed and | 0:01:48 | 0:01:54 | |
misleading headlines... As a result,
fact checking organisations are now | 0:01:54 | 0:02:00 | |
working to counter the fake news
threat. The First Draft coalition | 0:02:00 | 0:02:05 | |
operates around the world and in
Germany it is working alongside | 0:02:05 | 0:02:10 | |
journalists from a group to help
improve online transparency. In the | 0:02:10 | 0:02:16 | |
run-up to the recent election here
they published a daily | 0:02:16 | 0:02:23 | |
newsletter, looking at stories
suspected to be false or misleading. | 0:02:23 | 0:02:26 | |
You look at an incident. Is what is
being claimed in the captions and | 0:02:26 | 0:02:35 | |
descriptions what is actually being
insinuated? One which showed a | 0:02:35 | 0:02:40 | |
couple of maybe not traditional
northern European skinhead guys | 0:02:40 | 0:02:45 | |
waving passports. This was claimed
to be smug immigrants trampling all
over German people's feelings. The
tweets said they were insulting | 0:02:51 | 0:02:56 | |
local Germans and provoking them.
Using simple tools such as reverse | 0:02:56 | 0:03:01 | |
image searches to verify the
original sources of videos and in | 0:03:01 | 0:03:04 | |
this case a tool called Watch
Frame by Frame, journalists were | 0:03:04 | 0:03:09 | |
able to identify the street name.
The thing that helped me is there is | 0:03:09 | 0:03:16 | |
a police officer walking through the
video, back there. After locating the | 0:03:16 | 0:03:24 | |
police officer in question they were
able to get an eyewitness account of | 0:03:24 | 0:03:27 | |
what happened, not just in front of
the camera but also behind. Actually | 0:03:27 | 0:03:33 | |
we discovered that behind the camera
there are like 100 people insulting | 0:03:33 | 0:03:39 | |
these three to four guys in the
first place. They were if anything | 0:03:39 | 0:03:43 | |
just reacting. Another story
debunked by the group involved what | 0:03:43 | 0:03:51 | |
looked like a number of Muslims
standing at a bus stop. The headline | 0:03:51 | 0:03:57 | |
was, this is how Islamic society or
an Islamistic society would look | 0:03:57 | 0:04:02 | |
like or does look like and we are
heading to this. We are taking a | 0:04:02 | 0:04:09 | |
closer look at this... Narrowing
down where the buses met, journalists
were able to pinpoint the spot and
the fact that the group had just | 0:04:15 | 0:04:21 | |
come out of a Christian church. They
confirmed that they were refugees | 0:04:21 | 0:04:25 | |
and they knew the guys in the video
and they were just coming back from | 0:04:25 | 0:04:29 | |
a baptism and they were trying to
celebrate the baptism and were just | 0:04:29 | 0:04:35 | |
heading for lunch. So this was
really misleading information and | 0:04:35 | 0:04:44 | |
trying to manipulate people and make
them worried about whether we are | 0:04:44 | 0:04:47 | |
overrun by other cultures. The
problem is that anything can look | 0:04:47 | 0:04:53 | |
believable when it is published
online. And there is an ongoing | 0:04:53 | 0:04:56 | |
debate about whether the platforms
on which the stories are published | 0:04:56 | 0:04:59 | |
should be the ones to police them.
Making sure that quality content and | 0:04:59 | 0:05:05 | |
quality journalism is on top is a
big mission. So that's why we work | 0:05:05 | 0:05:12 | |
very closely with fact checking
organisations and media | 0:05:12 | 0:05:16 | |
organisations around the world. Just
a couple of months ago we changed | 0:05:16 | 0:05:21 | |
our ads policy around misleading
news websites. As they are fighting | 0:05:21 | 0:05:27 | |
the rising tide of fake news, one
thing is for certain. Ultimately we | 0:05:27 | 0:05:32 | |
are going to need an automated fact
checking system. Back in the UK, a | 0:05:32 | 0:05:39 | |
stone's throw from Westminster lies
Full Fact. This is an organisation | 0:05:39 | 0:05:45 | |
that first came to public
attention around the time of the EU | 0:05:45 | 0:05:49 | |
referendum. These guys have some
pretty interesting fact checking | 0:05:49 | 0:05:52 | |
tools. In this session of Prime
Minister's Questions, the group is | 0:05:52 | 0:05:58 | |
verifying claims using a mixture of
manual and automated fact checking. | 0:05:58 | 0:06:02 | |
One of the automated tools being
developed looks at the trends behind | 0:06:02 | 0:06:07 | |
a claim such as where and how many
times any statement was repeated. | 0:06:07 | 0:06:13 | |
Another tool will take text from TV
subtitles and check it off in real | 0:06:13 | 0:06:18 | |
time against reliable databases,
such as the Office for National | 0:06:18 | 0:06:22 | |
Statistics. Using a combination of
AI and machine learning, the | 0:06:22 | 0:06:28 | |
algorithm will perform calculations
and check facts with primary | 0:06:28 | 0:06:33 | |
sources. Eventually it could be used
in a scenario such as this. There | 0:06:33 | 0:06:36 | |
are 10,000 more training places
available for nurses in the NHS, but | 0:06:36 | 0:06:43 | |
the right honourable gentleman...
That's not right. That's an ambition | 0:06:43 | 0:06:47 | |
for 2020; it is currently not true.
How cool would it be to debunk | 0:06:47 | 0:06:52 | |
claims like that on the spot? And the
system would be able to challenge | 0:06:52 | 0:06:55 | |
more subtle claims with lots of
caveats, such as the statement that
the NHS is in crisis. Nor will it
provide simple yes and no answers. | 0:06:59 | 0:07:04 | |
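The tool described here pairs claim detection in live subtitle text with a lookup against official statistics. Below is a minimal sketch of that idea, assuming a toy reference table; the `REFERENCE_DATA` keys and figures and the regex are illustrative placeholders, not Full Fact's actual system or real ONS data.

```python
import re

# Toy stand-in for an official statistics source (ONS-style). The keys
# and figures here are illustrative placeholders, not real data.
REFERENCE_DATA = {
    "nurse training places": 10000,
}

# Very rough pattern for claims of the form "N more X available/added".
CLAIM_PATTERN = re.compile(
    r"(?P<number>\d[\d,]*)\s+(?:more\s+)?(?P<topic>[a-z ]+?)\s+(?:available|added)",
    re.IGNORECASE,
)

def check_claim(subtitle_text: str) -> str:
    """Match a numeric claim in a subtitle line against reference data."""
    match = CLAIM_PATTERN.search(subtitle_text)
    if not match:
        return "no checkable claim found"
    number = int(match.group("number").replace(",", ""))
    topic = match.group("topic").strip().lower()
    for key, reference in REFERENCE_DATA.items():
        # crude topic matching: substring overlap in either direction
        if topic in key or key in topic:
            verdict = "consistent" if number == reference else "inconsistent"
            return f"claim of {number} for '{key}' is {verdict} with reference ({reference})"
    return f"no reference data for '{topic}'"
```

A real system would add entity linking, date ranges, and the kind of caveat handling the statement "the NHS is in crisis" demands; this only shows the shape of the pipeline.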
GDP is rising. It's kind of like
Shazam for facts. The | 0:07:04 | 0:07:18 | |
tool that I'm most excited about is
the text checking. When somebody is | 0:07:18 | 0:07:23 | |
talking live and it takes you in
real time to the primary sources. So | 0:07:23 | 0:07:28 | |
if a journalist is in a press
conference or if they are | 0:07:28 | 0:07:32 | |
interviewing someone, they can see
straightaway whether something
that the person in front of them has
said is true or false, which is | 0:07:36 | 0:07:40 | |
particularly cool. I so want that.
Have you used it? I haven't used it | 0:07:40 | 0:07:44 | |
in anger yet. When will it be ready?
It's ready now, but it can only do | 0:07:44 | 0:07:50 | |
one sentence at a time. You think
public figures will have to change | 0:07:50 | 0:07:54 | |
the way they behave? There's no
debate that can happen eventually | 0:07:54 | 0:07:59 | |
without hitting on numbers and it's
important they are correct and aren't | 0:07:59 | 0:08:04 | |
being neglected. That's the place we
are starting from in the world we | 0:08:04 | 0:08:07 | |
want to create.
From fakery of news to fakery of | 0:08:07 | 0:08:11 | |
images now. That's not exactly the
spin that Adobe would like us to put | 0:08:11 | 0:08:18 | |
on the way its products are used,
but at an event in Las Vegas it has | 0:08:18 | 0:08:25 | |
just unveiled some pretty nifty
tools to do just that. We sent | 0:08:25 | 0:08:28 | |
Richard Taylor along to have a look.
12,000 creatives under one roof, all | 0:08:28 | 0:08:35 | |
geared up to find out what's next
from the company that literally | 0:08:35 | 0:08:39 | |
invented photo shopping. The answer:
AI as we've never known it before. | 0:08:39 | 0:08:46 | |
Take this image of Denver
where an entire neighbourhood is | 0:08:46 | 0:08:51 | |
expunged in a flash and replaced
with something more aesthetically | 0:08:51 | 0:08:55 | |
pleasing, instead of just trying to
fill in the area with surrounding | 0:08:55 | 0:08:58 | |
pixels the software can now extract
meaning from the image and make a | 0:08:58 | 0:09:04 | |
substitute from its library of 100
million others. A similar principle | 0:09:04 | 0:09:07 | |
is at play here. The plaster is now
intelligently removed as the | 0:09:07 | 0:09:13 | |
software can understand the
protrusion in the middle of | 0:09:13 | 0:09:16 | |
someone's face as a nose. And say
you want to remove something or | 0:09:16 | 0:09:20 | |
someone from a video. Right now you
could try it frame by painstaking | 0:09:20 | 0:09:23 | |
frame. The chances are the result
would look crude. But this demo is | 0:09:23 | 0:09:29 | |
real. A research project we may see
in a future version of Adobe's | 0:09:29 | 0:09:34 | |
products. In this era of fake news the
implications of being able to easily | 0:09:34 | 0:09:39 | |
fool your audience are of course
potentially troublesome, but Adobe | 0:09:39 | 0:09:42 | |
is more interested in the creative
potential. We are trying to | 0:09:42 | 0:09:48 | |
reimagine the entire creative
process so you can create the way | 0:09:48 | 0:09:51 | |
you want. Machines can see patterns
and possibilities that we may not be | 0:09:51 | 0:09:56 | |
able to see immediately. Adobe says
AI should allow creatives more time | 0:09:56 | 0:10:01 | |
for artistic expression and to be
creative rather than doing boring | 0:10:01 | 0:10:07 | |
and repetitive tasks. They
say the creative process should be | 0:10:07 | 0:10:12 | |
more efficient and AI can
potentially even second-guess our | 0:10:12 | 0:10:15 | |
next moves. To illustrate, check out
this Photoshop prototype which has | 0:10:15 | 0:10:22 | |
Adobe's creative assistance
built in. Find some images based on | 0:10:22 | 0:10:26 | |
my sketch. And within seconds,
options appear based on your very | 0:10:26 | 0:10:32 | |
rough sketch of a woman in a
spaceship. Now you might be
thinking all of this is pretty
similar to the AI used by Google and | 0:10:35 | 0:10:40 | |
Apple. We have decades and decades
of understanding of how people use | 0:10:40 | 0:10:46 | |
our tools. When one of the best
creative artists launches Photoshop | 0:10:46 | 0:10:51 | |
and they know what's creatively
pleasing and aesthetically pleasing, | 0:10:51 | 0:10:54 | |
we are learning from that. So we are
not training on just images of cats | 0:10:54 | 0:10:59 | |
or dogs, we are training with the
world's best people. I'm certainly | 0:10:59 | 0:11:04 | |
impressed at how the AI could for
example take an image of me and | 0:11:04 | 0:11:08 | |
within seconds return matches and
then further refine them. The | 0:11:08 | 0:11:15 | |
tech also understands 3D; you don't
have to be an artist to | 0:11:15 | 0:11:19 | |
understand it. Few people would
deny AI is fantastic in terms of | 0:11:19 | 0:11:23 | |
creating efficiencies, but
overreliance on our machines instead | 0:11:23 | 0:11:28 | |
of amplifying the creative process
could eventually end up supplanting | 0:11:28 | 0:11:32 | |
it. I actually don't think so.
Creatives are distracted by all of | 0:11:32 | 0:11:37 | |
the things that take multiple steps,
that make them suddenly move out of a | 0:11:37 | 0:11:45 | |
right brain mode into a procedural
left brain mode. I think AI can | 0:11:45 | 0:11:50 | |
be this muse at the elbow. And
that's the prevailing view amongst | 0:11:50 | 0:11:55 | |
creatives here, keen to embrace the
possibilities of an AI world. | 0:11:55 | 0:12:01 | |
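The content-aware fill described above replaces a region using semantic understanding of the image; the older baseline it improves on just borrows surrounding pixels. A toy sketch of that baseline for contrast (not Adobe's algorithm; the function name and data are invented for illustration):

```python
# Naive "fill from surrounding pixels" baseline: paint the masked
# region with the mean of the unmasked pixels bordering it. Semantic
# replacement, by contrast, would substitute plausible content.

def naive_fill(image, mask):
    """Fill masked cells (True) with the mean of bordering unmasked values.

    image: list of rows of numbers; mask: same shape, True where removed.
    """
    rows, cols = len(image), len(image[0])
    border = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c]:
                continue
            # an unmasked pixel adjacent to the masked region is "border"
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and mask[rr][cc]:
                    border.append(image[r][c])
                    break
    fill_value = sum(border) / len(border)
    return [[fill_value if mask[r][c] else image[r][c] for c in range(cols)]
            for r in range(rows)]
```

On a smooth background this looks fine; on a structured scene like the Denver neighbourhood it smears, which is exactly the gap the semantic, library-backed substitution is meant to close.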
Welcome to the week in tech. It was
the week that the Hawaiian city of | 0:12:07 | 0:12:11 | |
Honolulu began fining people
for paying too much attention to | 0:12:11 | 0:12:15 | |
their smartphone while crossing the
road. Microsoft announced it has | 0:12:15 | 0:12:19 | |
stopped manufacturing its motion
sensing controller. And Japanese | 0:12:19 | 0:12:25 | |
company Toyota showed off a concept
car with the airbags on the outside | 0:12:25 | 0:12:29 | |
in Tokyo. Meanwhile, Nissan revealed
the artificially created noise its | 0:12:29 | 0:12:35 | |
electric cars will emit. US
authorities are insisting all hybrid | 0:12:35 | 0:12:39 | |
and electric cars will have to emit
sound for safety reasons. Amazon now | 0:12:39 | 0:12:44 | |
wants to enter its customers' homes
when making deliveries. The system | 0:12:44 | 0:12:48 | |
is called Amazon Key. Trustworthy
types who sign up will allow | 0:12:48 | 0:12:54 | |
deliveries to be left inside their
homes. Suspicious people will be | 0:12:54 | 0:12:58 | |
able to view the delivery on a smart
camera that they've left at home. | 0:12:58 | 0:13:03 | |
What could possibly go wrong? And
creepy or cute? You decide. Sony has | 0:13:03 | 0:13:08 | |
developed a new winking robot
assistant. The robot communicates | 0:13:08 | 0:13:12 | |
with users using endearing gestures.
It is hoping the cuteness will | 0:13:12 | 0:13:19 | |
challenge Amazon's Echo range. And
researchers at Harvard have | 0:13:19 | 0:13:24 | |
developed a tiny robot that can swim
and fly. Its flapping wings | 0:13:24 | 0:13:28 | |
are used to propel it around when it
is underwater. The creators hope one | 0:13:28 | 0:13:34 | |
day similar technology could be used
in search and rescue robots. | 0:13:34 | 0:13:39 | |
This is art in the 21st-century.
Trust me, it is. | 0:13:42 | 0:13:50 | |
And it actually looks and sounds
great when you are standing in the | 0:13:50 | 0:13:56 | |
middle. I'm thinking it has a
specific sound. The buzzing sound is | 0:13:56 | 0:14:01 | |
the electric current that lights the
LEDs, and it is being translated | 0:14:01 | 0:14:06 | |
into a kind of compositional
concert. It's got its own groove to | 0:14:06 | 0:14:10 | |
it. I like it. This week I'm
wandering the halls of a brand-new | 0:14:10 | 0:14:15 | |
installation in the heart of London.
It's called Everything At Once. And | 0:14:15 | 0:14:21 | |
if it doesn't actually have
everything, it certainly has a lot. | 0:14:21 | 0:14:24 | |
It's a mixture of dynamic works like
the black pot and static pieces by | 0:14:24 | 0:14:28 | |
renowned artists like Ai Weiwei and
Anish Kapoor. There are also | 0:14:28 | 0:14:36 | |
faceless voices describing their
near death experiences. | 0:14:36 | 0:14:46 | |
I was in hospital and my
heart stopped beating. The | 0:14:46 | 0:14:49 | |
centrepiece of the exhibition is
even more unsettling. I'm about to | 0:14:49 | 0:14:53 | |
be subjected to intensely fast
flashing images. Now, if you'd | 0:14:53 | 0:14:58 | |
rather not see them, please look
away now and come back in a couple | 0:14:58 | 0:15:02 | |
of minutes. Because I'm about to
walk through test pattern | 0:15:02 | 0:15:08 | |
number 12. It's by a Japanese
electronic artist. The experience is | 0:15:08 | 0:15:18 | |
overwhelming. The video moves at
more than 100 frames a second and in | 0:15:18 | 0:15:23 | |
fact we've had to doctor our footage
in order to be allowed to show it on | 0:15:23 | 0:15:27 | |
TV. The video frame rate is so high
that the black and white is | 0:15:27 | 0:15:36 | |
flickering incredibly fast. I can
actually see colours in between the | 0:15:36 | 0:15:41 | |
black and white, they're moving so
fast, there's greys, I'm starting to | 0:15:41 | 0:15:46 | |
see yellow and red, maybe that's
just because my eyes are exploding, | 0:15:46 | 0:15:50 | |
I don't know. Ikeda has taken
digital files and broken them down | 0:15:50 | 0:15:56 | |
into their native zeros and ones.
It's these binary patterns that are | 0:15:56 | 0:16:00 | |
then blasted onto the viewer. After
that, time for a drink in a | 0:16:00 | 0:16:06 | |
nightclub called Ruin. Only it looks
like I've arrived after the after | 0:16:06 | 0:16:11 | |
party.
Now, earlier we were looking at | 0:16:11 | 0:16:16 | |
attempts to combat fake news and so
often these days that means the US | 0:16:16 | 0:16:21 | |
elections, Russia and the like. But
it's actually a problem all around | 0:16:21 | 0:16:25 | |
the world in different ways. David
Reid has been looking at the | 0:16:25 | 0:16:28 | |
particular issues in India.
This summer, mob violence in the | 0:16:28 | 0:16:36 | |
eastern state of Jharkhand was
sparked by a rumour on WhatsApp that | 0:16:36 | 0:16:41 | |
child abductors were targeting a
tribal community. The story wasn't | 0:16:41 | 0:16:47 | |
true but still seven people died in
the violence. It doesn't take much here | 0:16:47 | 0:16:51 | |
for long simmering conflicts to boil
over and fake news like this can be | 0:16:51 | 0:16:56 | |
just the trigger for it. Stories
like these are very powerful and can | 0:16:56 | 0:17:04 | |
potentially threaten India's often
tempestuous communal relations. So | 0:17:04 | 0:17:08 | |
much so that now even the police are
getting involved in tackling fake | 0:17:08 | 0:17:12 | |
news.
I visited one of the country's main | 0:17:12 | 0:17:20 | |
cyber crime units in Hyderabad, the
capital of the southern state of | 0:17:20 | 0:17:24 | |
Telangana. Here cyber cops are
worried about the threat to law and | 0:17:24 | 0:17:28 | |
order by fake stories with the
potential to spark riots. Police | 0:17:28 | 0:17:34 | |
here investigate false and inflammatory
stories, try to get them taken down | 0:17:34 | 0:17:40 | |
and then attempt to prosecute those
producing them, but much of India's | 0:17:40 | 0:17:45 | |
fake news is spread through the
mobile communication platform | 0:17:45 | 0:17:49 | |
WhatsApp and because it's encrypted,
for police here it's a wall. It | 0:17:49 | 0:17:55 | |
appears to be a communication
problem with WhatsApp, we try with | 0:17:55 | 0:18:00 | |
WhatsApp and they say it's only a
messaging facility. We need a date | 0:18:00 | 0:18:04 | |
and time stamp to prove a case, so
that's also not there. | 0:18:04 | 0:18:10 | |
Something like 200 million people
in India use WhatsApp. For some the | 0:18:10 | 0:18:14 | |
stories shared on the platform are
their only or main source of news. | 0:18:14 | 0:18:19 | |
If the police are hitting a
roadblock with WhatsApp's | 0:18:19 | 0:18:26 | |
end to end encryption, others are
trying to neutralise fake stories by | 0:18:26 | 0:18:30 | |
debunking them. This man is based in
Gujarat. His website Alt News roots | 0:18:30 | 0:18:36 | |
out and reveals what's wrong on the
web. My guess is it often starts on | 0:18:36 | 0:18:41 | |
WhatsApp because those who put it on
WhatsApp know it's difficult to | 0:18:41 | 0:18:45 | |
track them down. The people who
circulate these videos, they are | 0:18:45 | 0:18:49 | |
very well aware that it's a fake
video. There's no doubt. Videos like | 0:18:49 | 0:18:57 | |
this one purported to show a woman
being killed in India by a Muslim | 0:18:57 | 0:19:01 | |
mob. It's one of the most grotesque,
stomach churning videos you will | 0:19:01 | 0:19:05 | |
see. But the harrowing incident it
depicts actually took place in | 0:19:05 | 0:19:09 | |
central America. This video was easy
to debunk. For a lot of videos what | 0:19:09 | 0:19:17 | |
we do is break it up into frames,
we use Google reverse image search | 0:19:17 | 0:19:21 | |
and the first Google result is that
of this girl who was killed in | 0:19:21 | 0:19:29 | |
Guatemala. She was accused of being
an accomplice in a murder, she got | 0:19:29 | 0:19:32 | |
caught in a mob and she died. And
yet many who saw the video took its | 0:19:32 | 0:19:37 | |
claims to be true. The reason is
that in India hundreds of millions | 0:19:37 | 0:19:42 | |
are encountering the Internet for
the first time and they lack the | 0:19:42 | 0:19:46 | |
media literacy to assess if the news
is actually true. We have more than | 0:19:46 | 0:19:53 | |
400 million mobile Internet users.
50% of them are using WhatsApp. | 0:19:53 | 0:20:00 | |
WhatsApp is the main medium for
promoting the fake news. But how | 0:20:00 | 0:20:04 | |
many people are aware of the stuff
they are forwarding whether it is | 0:20:04 | 0:20:11 | |
true or not and whether it's a
problem or not, we are not equipped | 0:20:11 | 0:20:15 | |
to deal with that and this is an
epidemic-like situation. It's still | 0:20:15 | 0:20:20 | |
early days for the Internet in India
and as police and journalists battle | 0:20:20 | 0:20:24 | |
the fake content that can trigger
conflict, many are still prone to | 0:20:24 | 0:20:29 | |
manipulation from the lies in their
inbox. That was David in India. | 0:20:29 | 0:20:36 | |
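The debunking workflow described in that report, splitting a video into frames and reverse-image-searching each one, rests on image matching. A minimal sketch of the matching principle using a toy average hash; the archive names and pixel data are invented for illustration, and real services use far richer features than this:

```python
# Reverse image search, reduced to a toy "average hash": each image
# becomes a bit string, and near-duplicate frames land a small
# Hamming distance apart in that space.

def average_hash(pixels):
    """Hash a grayscale image (rows of 0-255 ints) to a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(h1, h2):
    """Count positions where two equal-length bit strings differ."""
    return sum(a != b for a, b in zip(h1, h2))

def best_match(frame, index, max_distance=3):
    """Return the indexed image closest to the frame, if close enough."""
    frame_hash = average_hash(frame)
    name, h = min(index.items(), key=lambda kv: hamming(frame_hash, kv[1]))
    return name if hamming(frame_hash, h) <= max_distance else None

# Tiny 2x4 "images" standing in for an indexed archive and a video frame.
archive = {
    "guatemala_clip": average_hash([[10, 10, 200, 200], [10, 10, 200, 200]]),
    "unrelated_clip": average_hash([[200, 200, 10, 10], [200, 200, 10, 10]]),
}
frame = [[12, 11, 198, 205], [9, 13, 201, 199]]  # near-duplicate of the first
```

Matching a frame against an indexed archive is how a recycled video surfaces its true origin, even when the re-upload has been recompressed or slightly altered.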
Now, with Halloween fast
approaching, there are plenty of | 0:20:36 | 0:20:39 | |
scary movies around but none of them
will be as immersive as a virtual | 0:20:39 | 0:20:42 | |
reality horror show. And that's the
event that we've sent Nick Kwek to | 0:20:42 | 0:20:49 | |
in Covent Garden.
On the way to see a film, a movie, | 0:20:49 | 0:20:53 | |
but not as we know it, in
virtual reality. I think this is it. | 0:20:53 | 0:21:03 | |
Come on, not your standard cinema.
Cinema's downstairs. There's people | 0:21:03 | 0:21:12 | |
down there wearing VR headsets.
Virtual reality film is super | 0:21:12 | 0:21:20 | |
exciting but right now you can only
enjoy it in the comfort of your own | 0:21:20 | 0:21:24 | |
home and it's not a social
experience, we want to bring people | 0:21:24 | 0:21:27 | |
together so they can enjoy VR with
their friends, family and partner. | 0:21:27 | 0:21:32 | |
Where am I sitting? Excellent. Can
we get any popcorn? Up and over... | 0:21:32 | 0:21:40 | |
Popcorn, excellent. Everyone ready
to go? Yes. Let's do it! Showtime! | 0:21:40 | 0:21:50 | |
Scary suburbia. I'm looking down, I
don't have any legs or anything, I'm | 0:21:50 | 0:21:56 | |
not a person... I've been directed
to go down the drain. Oh my goodness! | 0:21:56 | 0:22:03 | |
That is It down the drain. I've just
left that scene. Look behind you... | 0:22:03 | 0:22:19 | |
A bit unnerving! Someone has just
appeared in front of me. OK. We've | 0:22:19 | 0:22:27 | |
got a collection of films five to
ten minutes each and we're showing | 0:22:27 | 0:22:31 | |
them back-to-back in a 40 minute
montage. It's | 0:22:31 | 0:22:34 | |
the first VR
cinema pop-up and it's not | 0:22:34 | 0:22:35 | |
cutting-edge
hardware but they have got custom... | 0:22:35 | 0:22:46 | |
People can have the shared cinema
experience of being shocked all at | 0:22:46 | 0:22:50 | |
the same time. | 0:22:50 | 0:22:52 | |
This is not good. This is going to
end well, this is going to end very | 0:23:02 | 0:23:09 | |
well. | 0:23:09 | 0:23:10 | |
I'm burning alive. With several
showings starting at the top of the | 0:23:12 | 0:23:19 | |
hour, the headsets need to be taken
away for charging. It's all very | 0:23:19 | 0:23:24 | |
pop-up. But the chaps hope it will
get enough hearts racing so they | 0:23:24 | 0:23:27 | |
can open up a permanent VR cinema
later this year. Is this all just a | 0:23:27 | 0:23:32 | |
novelty, though? This is kind of like a
nightclub in Glasgow. OK, that's | 0:23:32 | 0:23:39 | |
horrendous. Horrendous! That's
enough. Actually it was quite fun to | 0:23:39 | 0:23:47 | |
bring a group of friends together.
To go out and have a shared | 0:23:47 | 0:23:51 | |
experience, another group is coming
in right now? OK. Masters of | 0:23:51 | 0:23:55 | |
turnaround. We'd better be on our
way. Nick Kwek, always up for an | 0:23:55 | 0:24:03 | |
experience and regularly needing a
lie down after a shoot because of | 0:24:03 | 0:24:06 | |
it. That's it from me in Ruin for this
week, don't forget we're on | 0:24:06 | 0:24:11 | |
Facebook and Twitter throughout the
week, @BBCclick is where you'll find | 0:24:11 | 0:24:15 | |
us. Thanks for watching and we'll
see you soon. | 0:24:15 | 0:24:20 |