Click: 28/05/2016

Transcript



You are up to date on the headlines. It is time now for Click.

3D projected people come to life, and the web goes ad-free. All that after these important messages.

This is East London, painfully trendy and somewhere I… Down Shoreditch way is the famous Brick Lane, an area known for the colourful street art that you will find lurking around every corner. It is a real work of art, which we can all enjoy at no cost. The creator of this really didn't expect to get paid.

That is rarely the case with other kinds of content. Online news, video content: as we know, their creators are in it for the money, money that often comes from advertisers' pockets. The problem is, our love affair with doing this is spilling online.

The way to skip these ads on the web is to install a browser extension. These are known as ad blockers, stopping annoying things popping up. Not everything that pops up on your screen is annoying, case in point. You have been testing some of these ad blockers, haven't you?

While some people might not mind random interruptions, for others advertisements are a real nuisance, particularly on mobile devices, because people don't want to use their data or battery power downloading them. I guess it is good news for us, definitely bad news for the advertisers by the sound of it.

Absolutely, and although ad blocking is nothing new, people are learning more about it and there are more ways of blocking adverts. It seems it will only become a bigger business.

Adblock Plus is leading the way with 100 million users, and once you install it, you should be free of advertisements. Not quite: it creates a whitelist, a list of sites that adhere to what it calls acceptable ads. Acceptable ads is a set of criteria, and those are criteria we developed. If advertisers or publishers have certain ads that meet those criteria, they can be whitelisted, and then the top 10% of those advertisers are also charged a licensing fee for the work that goes into doing it.
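
To make the whitelisting mechanics concrete, here is a minimal sketch in Python. The filter patterns and domain names are invented for the example, and real ad blockers match requests against large, community-maintained filter lists rather than a four-entry list like this one.

```python
# Minimal sketch of an ad blocker with an "acceptable ads" whitelist.
# Patterns and domains are hypothetical examples, not real filter rules.

BLOCK_PATTERNS = ["/ads/", "doubleclick", "banner", "popup"]
WHITELIST = {"example-news.com"}   # sites whose compliant ads are let through

def is_ad(url: str) -> bool:
    """Crude filter: treat any URL containing a known ad pattern as an ad."""
    return any(pattern in url for pattern in BLOCK_PATTERNS)

def allow_request(page_domain: str, request_url: str) -> bool:
    """Allow everything on whitelisted sites; elsewhere, strip ad requests."""
    if page_domain in WHITELIST:
        return True
    return not is_ad(request_url)

print(allow_request("example-news.com", "https://cdn.example/ads/banner.png"))  # True
print(allow_request("other-site.com", "https://cdn.example/ads/banner.png"))    # False
```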

It is not just about bursting their bubble. The issue is the general dislike of adverts, only about 10% of the company's… They don't like distractions, or the idea of companies tracking their data and habits, something ad blockers also aim to reduce. Some content providers are going as far as blocking user access if they detect that ad blocking software is in use.

Now, Opera has an ad blocker built into its latest browser, Apple recently allowed them to function on its mobile operating system, and others are… It is not all doom and gloom, at least not yet. We haven't all downloaded ad blockers, but there does seem to be a growing trend.

Some try to shield themselves by going behind paywalls, and if things don't get better we may start seeing subscription models, where you sign up for access to multiple publications in one place, or even cash trails, where you pay a nominal amount each time. Either of those would put an end to the freely available…

I have come to one of the biggest online news publishers in the UK, where the issue of whether the news should be free has been… The Guardian is one of the most read online news services, which is free to access, and a significant portion of its income comes from advertisers.

I sat down with the man in charge of the digital site to see how much of an issue ad blocking really is. It is a significant issue that we take very seriously. In Germany, for example, 80% of millennial males, people aged 18-34, are blocking ads. That gives us a sense of where this could go.

Readers enjoy the content and probably don't think very much about the fact that it is advertising that is paying for the creation of amazing investigative journalism like the Panama Papers or WikiLeaks.

Then there is the idea of the ad blocker blocker, which is something that broadcasters or publishers can install to stop the ad blockers working. John Whittingdale, the Culture Secretary, referred to the ad blocking business model as a modern-day protection racket, and that is because ad blockers will charge publishers and other media owners to be able to punch through the ad blocker and still deliver ads to the user who has chosen to block them. They are offering the opportunity to be on a whitelist.

Even crazier, there are further services offered to the user to block the ad blocker blockers. Going down an arms race route doesn't seem viable: it will just continue to escalate, and we don't want to circumvent users' choice not to see ads. We would much rather engage in a conversation with the user which says: hopefully the content you are reading has some value to you, and if you value it we would like you to contribute to that, either by seeing advertising or by paying us.

At least one company is attempting to unite it all and change… PageFair is one of those ad blocker blockers, one of several out there, giving back to those who are stung by the rise of ad blockers. The company says their philosophy is more about bridging the gap between publishers and users.

What users will start seeing is far fewer ads: respectful ads that aren't snooping on their data, that aren't jumping around the page, that aren't interrupting the traffic by taking so long to download, and also ads that behave in such a simple way that they can't expose the machine to malware. Companies need to use advertising because it pays the bills, but they need to reduce the number of ads such that the ones that are…

Once you can decide what ads you want to show, beyond that blocking, in this now uncluttered space where people have blocked out all the other ads, the answer is to limit the quantity. What that will do, I think, is cure what is… Whether the ad blockers or the ad blocker blockers win, it seems we are the ones who will come out glorious, with a less cluttered web.

It was the week the US Navy tested firing drone swarms up into the sky, and the creepiest robot ever, complete with whiskers, was unveiled.

The European Commission proposed that a fifth of the shows offered by services such as Netflix and Amazon Prime Video should be made locally and given good visibility on the platform.

And, after giving Nokia the heave-ho last week, Microsoft announced a cut of over 1,800 jobs as part of its retreat from the smartphone business. This comes just two years after it paid £5 billion for the once-dominant phone business.

The week also brought us a most unlikely internet star. She popped to the shops to buy yoga pants, came out with a Chewbacca mask, and an amazing 140 million views later had become an internet sensation.

If you think your touchscreen is playing up, MIT has unveiled its newest shape-shifting interface, which has been programmed to mimic… It can bend and flex, giving users what researchers call…

In last week's show we talked to leading academics about what our lives might be like in a more automated future. Now, here is their view on the ethical and moral questions we may face as robots become part of our society.

In many years to come, if we build human-level AI, would it have consciousness or would it be just a tool?

We know how to make little electronic neurons. If we put together hundreds of thousands of these, organised in the same way as the brain of a small animal, then it seems to me that, if they are governed by the same laws of physics, they should behave in the same way as the animal behaves. There is no reason why we shouldn't be able to replicate all the abilities of a human being.

There may be some aspects of human cognition that are not computable. If I slap myself across the cheek, I know what it feels like to be slapped. It is not obvious to me how the execution of any computer programme, on no matter how powerful hardware, will be sufficient to bring forth that kind of feeling.

What every person has is this amazing complexity inside of them. We have subjectivity, agency, thoughts and feelings.

It is actually impossible to have the kind of relationship with a machine as you might with another person, because there is a profound distinction between a person and a thing. I know there are lots of people that make claims that if you get machines to imitate behaviours, and humans recognise that and start relating to it in a particular kind of way, that shows some relationship. But the only reason why a human being is able to appreciate those behaviours is because, first and foremost, they are human beings.

I think we could build things that are capable of suffering… We could also build things that are very sophisticated and capable, but that don't actually experience emotions of their own. We don't have a moral duty towards that kind of thing, so…

You could say that is fine, people want to beat up robots; provided they pay for them, that is not a worry. But consider the extreme case, where people are commissioning robots that look exactly like their ex-girlfriend or their boss… It is incredibly easy to make a robot that can pretend to suffer.

If a robot pretends to cry, then 98% of people can't help responding to it. That is a massive opportunity for manufacturers… We really need to discuss this before we are all…

If you have people who are lonely or disaffected with a relationship, and you can actually build robots to meet that need, then you… If you can keep creating new needs and desires…

The elderly care argument is one that is often in the background for a lot of this work. If you say to people, "Would you like to be cared for by a robot?"… If, instead, you say, "Would you like us to develop sophisticated artificial intelligence technology which will enable you to remain independent in your own home for longer?" Now, in many ways, those are different ways of describing the same thing, but I think it is really important that we use the latter description because, with today's technology, the robots don't really care; they simply do a job that resembles caring.

Twitter has been facing some tough questions of late. Questions like: are you even relevant anymore in this rapidly expanding universe of social media?

Last year its boss, Jack Dorsey, left the company, only to come back saying he was the man who could save it. Dave Lee gave him more than a sentence or two to explain himself. Jack, you're making changes to Twitter. What exactly is going to change? What are users going to notice is different?

It's going to be a whole lot simpler. It's going to be a lot simpler to use. So we're focusing a lot of our energy on making sure that… And we've noticed how people have been tweeting over ten years, and one of the things that we noticed is that there's this little hidden rule where, if you put an @name at the start of your tweet, it's actually only seen by that person you are referencing and people that follow the both of you. It doesn't make sense to anyone, and people have had to work around it: put a dot before the @ and then the name, and that… We're removing that, making it a whole lot simpler.
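
For anyone who never bumped into that hidden rule, here is a small Python sketch of the old behaviour under an assumed toy follower graph; the names, data structures and function are invented for illustration, not Twitter's actual code.

```python
# Sketch of Twitter's old @-reply visibility rule, as described above.
# The follower graph and account names are hypothetical.

followers = {
    "alice": {"carol", "dave"},   # people who follow alice
    "bob":   {"carol", "erin"},   # people who follow bob
}

def timeline_sees_tweet(viewer: str, author: str, text: str) -> bool:
    """Old rule: a tweet starting with '@name' only reached people
    who followed BOTH the author and the mentioned account."""
    if not text.startswith("@"):
        return viewer in followers[author]
    mentioned = text.split()[0][1:]   # '@bob hi' -> 'bob'
    return (viewer in followers[author]) and (viewer in followers[mentioned])

# The '.@name' workaround: the tweet no longer *starts* with '@',
# so it goes to all of the author's followers.
print(timeline_sees_tweet("dave", "alice", "@bob hello"))   # False: dave doesn't follow bob
print(timeline_sees_tweet("dave", "alice", ".@bob hello"))  # True
```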

The other thing that was happening: people are tweeting out images, or videos, polls, GIFs, and what was happening is they'd add an image and then the character count goes down, and it doesn't really make sense, because they don't see a URL or any other text, they see an image. So we removed counting against the limit any time you add an image, or media, or a poll, to make it a whole lot more visible and intuitive.
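
In other words, attachments stop eating into the 140-character limit. A toy model of the before-and-after, where 140 is the real limit of the time and the 24-character media cost is only an approximation of the old link overhead:

```python
# Toy model of the 2016 change: media attachments stop counting
# against the 140-character limit. The 24-character media cost is
# an approximation of the old t.co link overhead, not an exact figure.

LIMIT = 140
MEDIA_COST_OLD = 24   # approximate characters an attachment used to consume

def chars_remaining(text: str, has_media: bool, new_rules: bool) -> int:
    used = len(text)
    if has_media and not new_rules:
        used += MEDIA_COST_OLD   # old behaviour: media ate into the count
    return LIMIT - used

tweet = "Look at this!"
print(chars_remaining(tweet, has_media=True, new_rules=False))  # 103
print(chars_remaining(tweet, has_media=True, new_rules=True))   # 127
```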

Many people's reluctance to be involved in Twitter is still this sense that, in many cases, it's not a nice place to be on the internet. More so than on any other social network, you're seeing abuse… People might look at the change today and say the length of the… There is something wrong with the fabric of Twitter, in that it kind of… How much of a concern is that, and what can you do about it?

I don't think the negativity and abuse and harassment is unique to Twitter. I think it's an industry-wide, internet-wide issue that we all need to solve, and we did make it a priority for the company. So we have five priorities this year. One is refining and simplifying our product. Two is around live video. Three is around making sure that creators are coming to Twitter and they have the best tools to express themselves. Four is around safety, and making sure that people feel safe: give them very easy tools to mute, to block and to report. So all these parts correspond to making Twitter better, and making Twitter a better place to express oneself.

The amount of tweets is, by some measures, going down; user growth is either small or even non-existent in some quarters. What do you need to do to improve that, beyond these kinds of… What is it about Twitter as a social network, the fabric of Twitter, what can you do to turn around what's happening?

Well, a lot of these things may look small to people, but they are really dramatic in terms of how people use them. A series of small things can end up being massive in terms of their impact when taken into consideration in full. So, you know, I do believe we are focused on the right things to make Twitter successful and to help Twitter thrive.

I feel like every week I read a rumour or story about someone wanting to buy Twitter. As the chief executive, are you open to Twitter being bought? We are focused on making Twitter amazing, making something that people want to use every single day. We are an independent company and we are thriving, and we want to… If Google came with a chequebook and said, we are going to buy you out, we are going to take you off the market so you do not have to worry about that kind of pressure, would that be appealing to you? We are focused on building our service.

We are all used to seeing a lot of CGI, computer generated imagery… One thing that has become a lot more common in recent years is the idea of the virtual actor: computer generated humans doing things that… It turns out that the Hollywood minds behind virtual actors are now turning their attention to something much more serious. It is this technology that allowed Marc Cieslak to meet a virtual Holocaust survivor.

The University of Southern California is home to the Institute for Creative Technologies, a place where cutting-edge technology and Hollywood meet. The ICT has developed a whole host of kit used to create visual effects. For example, the light stage: this is used to capture an actor's face… This data can then be used to create virtual renders of the actor.

It's a pretty spectacular sight, I have to say. But this kit has applications way beyond the multiplex. A similar setup has been used as part of the process of recording interviews with Holocaust survivors for a project which could eventually end up in museums and even classrooms. Do you remember any songs from your youth?

So, this is what we call the automultiscopic projector array. We have about 200 video projectors behind the screen. You can walk around the display through 130 degrees and see a 3D image sitting in front of you.
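
Taking those figures at face value, roughly 200 projectors across a 130-degree arc, a quick back-of-the-envelope calculation suggests why each eye can receive a different image; the 2-metre viewing distance is an assumption:

```python
import math

# Back-of-the-envelope numbers from the piece: ~200 projectors, 130 degrees.
projectors = 200
arc_degrees = 130

view_pitch = arc_degrees / projectors          # degrees between adjacent views
print(f"{view_pitch:.3f} degrees per view")    # ~0.650

# At an assumed 2 m viewing distance, the ~6.5 cm between human eyes
# spans this many degrees of the arc:
eye_separation_m = 0.065
viewing_distance_m = 2.0
eye_angle = math.degrees(2 * math.atan(eye_separation_m / (2 * viewing_distance_m)))
print(f"eyes are ~{eye_angle:.2f} degrees apart")   # ~1.86

# Several projector views fall between the two eyes, so each eye
# receives a different image and the brain fuses them into 3D.
print(f"~{eye_angle / view_pitch:.1f} views between the eyes")   # ~2.9
```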

This man was an 84-year-old Holocaust survivor. He was in five concentration camps throughout the war. The most dangerous thing that happened is being in the camps, because being in the camps, every minute, every second of your existence, you were at the mercy of someone who could actually take your life away, who could torture you, who could beat you, who…

It took a week to film these interviews, and over 20 hours of footage was recorded. In order to record these interviews with Holocaust survivors, 30 cameras were arranged around the interviewees in a semicircle. The images they captured are projected out of the projector array, providing the illusion of three dimensions. It really does feel like this man is in the room with you.

But what really brings these 3D images to life is the audience's ability to interact with the interviewees, asking them questions.

We have just under 2,000 responses. They cover all the top questions that people tend to ask Holocaust survivors, and also things specific to the story. So when you ask a question, we use a natural language understanding system, similar to how Siri on your phone may translate your speech into text. The system then uses an algorithm to find the best video response. As you ask a question, we play the video response and then transition back to the listening pose, so it feels like you are having an actual conversation.
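
The programme doesn't say which algorithm picks the clip, so the sketch below only illustrates the general idea: score each pre-recorded answer against the transcribed question and play the best match. The index and the word-overlap scoring are invented stand-ins for the real system.

```python
# Minimal sketch of matching a spoken question (already transcribed to
# text) to one of ~2,000 pre-recorded video responses. The scoring here
# is simple word overlap; the real system's algorithm isn't described.

responses = {   # hypothetical index: trigger phrase -> video clip id
    "where were you born": "clip_0001",
    "do you remember any songs from your youth": "clip_0812",
    "how did you survive the camps": "clip_1204",
}

def score(question: str, key: str) -> int:
    """Count words shared between the question and a stored phrase."""
    return len(set(question.lower().split()) & set(key.split()))

def best_response(question: str) -> str:
    """Return the stored phrase whose word overlap with the question is highest."""
    return max(responses, key=lambda k: score(question, k))

q = "Can you remember songs from when you were young?"
print(responses[best_response(q)])   # clip_0812
```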

You can read about the Holocaust in a book, or see it in a movie or on TV, but until you actually interact with someone who has lived through it… The reality that this thing actually happened, and how horrendous it was…

They say if you ignore history you are doomed to repeat it, and as the survivors of the Holocaust age, interactive recordings like this one have the ability to keep their stories alive long after they are gone. From the beginning of the Nazis taking over Poland, it was a nightmare from which you didn't wake up.

That was Marc, and that's it for this week, I'm afraid. We'll give you regular digests of tech news and all the behind-the-scenes nonsense and pictures that you…

Wednesday may have been the first of June, but it wasn't exactly… Take the eastern side of England, for example: on the Lincolnshire coast, temperatures struggled to reach just 11 degrees.
