

50 Shades of Hay

Click visits the Hay Festival and chats to some of the biggest names in technology. Plus a look at how social media is being used in the general election.



Transcript


Now on BBC News, it is time for Click.

This week: swaying votes on Facebook. It's geek versus geek as Spen meets Fry. And lots of people shout in a big tent.

Summer is on the way and, well, it wouldn't be a British summer without a visit to a good old-fashioned festival. Known as the Town of Books, Hay-on-Wye, in Wales, is a literary Mecca, an annual gathering of artists, authors, Daleks and, yep, even Royals. It's even been called the Woodstock of the Mind by none other than former US President Bill Clinton. This year it's the 30th Hay Festival and the line-up is pretty stellar.

Well, for the second year in a row, we've been invited to share some of our favourite experiences and show off some really good tech, all in front of a real, live audience of actual people. A packed tent waited; all we had to do was wow them!

We had robots falling over, experiments in haptic feedback and demos in binaural sound, but that was nothing compared to the climax: a Click-created wavy, shouty game built using artificial intelligence.

In the meantime, it can't have escaped your attention that around the UK things are getting a touch political. As the general election looms, politicians are using increasingly sophisticated techniques in order to learn more about us.

The advertising reach of Facebook has long been an open secret, but now it's something the political parties are getting in on too. In fact, both the Trump campaign and the Leave.EU groups credited Facebook as being a vital part of their electioneering.

We know that the personal details that you give to social networks allow them to send you relevant, targeted content, and it goes much deeper than just your basic demographics. There are now data analytics companies claiming to be able to micro-target and micro-tweak messages for individual readers. If you know the personality of the people you're targeting, you can nuance your messaging to resonate more effectively.

What's also emerging is that political parties have been using this data to reach potential voters on a very granular level. So who is being targeted on Facebook, and how?

Well, until now, there's been nothing around to analyse any of this, but the snap general election galvanised Louis Knight-Webb and Sam Jeffers to develop Who Targets Me, a plug-in to tell each of us how we're being targeted. When you install the plug-in for the first time, it asks for your age, your gender and your location, and then it starts scouring your Facebook feed looking for adverts.

So once you've installed the plug-in, it works in the background to extract the whole advert. It pulls out the headline, the subtitle and any related videos. We also get the reaction - so how many likes, how many comments, how many shares - so we can see which messages resonate. Are they particularly clandestine messages, are they slightly subversive, are they even fake news?

But how do data companies get the information in the first place? A lot of the quizzes you fill out on Facebook - you know, you open a survey and it asks for your Facebook login. Sometimes you'll notice that there are a lot of permissions attached, and as soon as you click yes, all of your data is mined, and it's then sold on to data brokers who then, eventually, sell it to the political parties for use in their campaigns.

Although Facebook says it doesn't sell our information on, data brokers can overlay any details they mine from the site with other datasets that they have on people, based on their email addresses. The next step after that, of course, is to find similar users on Facebook and then target adverts, from the advertiser that supplied the email addresses, to those users.
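
As an illustration of that overlaying step, here is a small, hypothetical Python sketch: two datasets are joined on a normalised, hashed email address, which is broadly how ad platforms and brokers match records without exchanging raw emails. Every name and field below is invented.

```python
import hashlib

def norm_hash(email: str) -> str:
    """Normalise an email and hash it, as matching systems typically require."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Hypothetical broker dataset, keyed by hashed email.
broker_data = {norm_hash("jo@example.com"): {"age_band": "25-34", "issue": "housing"}}

def overlay(mined_profiles: list[dict]) -> list[dict]:
    """Enrich profiles mined from one source with a broker's dataset."""
    enriched = []
    for profile in mined_profiles:
        extra = broker_data.get(norm_hash(profile["email"]), {})
        enriched.append({**profile, **extra})
    return enriched

print(overlay([{"email": "Jo@Example.com ", "likes_pages": ["local news"]}]))
```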

There are just some people that you don't find on Twitter. The very nature of the fact that I can't see your adverts and you can't see my adverts means that this approach had to be crowd-sourced. It's a first of its kind anywhere in the world on this scale, giving us citizens some transparency into what we're being shown.

Do you think people wouldn't know that certain things are adverts? A lot of the time people are scrolling through Facebook and the adverts fit into this weird intersection of friends' posts and news stories. It's quite easy to miss the adverts on Facebook.

So far, Who Targets Me has some 6,700 users in 620 constituencies, and it's rising as the election approaches. On the down side, it's only as good as the data it's managed to crowd-source, so it isn't necessarily representative, and it also doesn't work with mobile Facebook.

So we're seeing a mixture of two things. We're seeing, firstly, A/B testing, which is where I try out two different messages with the same group. I see which one gets the best reaction and then run with that one. We're also seeing targeting, which is where I pick a particular demographic of people, and then I send a message designed for that group. So, for example, it might be young people targeted with a particular message.
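
The arithmetic behind that kind of A/B test fits in a few lines of Python: compare the click-through rates of the two messages, then use a two-proportion z-test to check the winner isn't just noise. The figures below are made up for illustration.

```python
import math

def ab_test(clicks_a, views_a, clicks_b, views_b):
    """Compare two ad variants by click-through rate (two-proportion z-test)."""
    rate_a = clicks_a / views_a
    rate_b = clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return rate_a, rate_b, (rate_a - rate_b) / se

# Variant A: 120 clicks from 5,000 views; variant B: 90 clicks from 5,000 views.
ra, rb, z = ab_test(120, 5000, 90, 5000)
print(f"A: {ra:.2%}  B: {rb:.2%}  z = {z:.2f}")  # |z| > 1.96 ~ significant at 5%
```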

The data from Who Targets Me is also being pored over by analysts. One aspect of their research is collecting dark posts: ads which are here one day and gone the next. It gives us the ability to create a repository of those dark posts. So if promises are being made on Facebook, in ads which will disappear the day after you see them, we should be able to go back to those after the election, look at them, evaluate them and maybe discuss them in the cold light of day.
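
A repository like that only needs to remember each ad the first time any user's plug-in reports it, plus when it was last seen. A minimal sketch, with an invented schema, might look like this:

```python
import sqlite3
import datetime

# Tiny sketch of a "dark post" repository: ads are stored the first time
# they are seen, so they can be revisited after they disappear.
db = sqlite3.connect("dark_posts.db")
db.execute("""CREATE TABLE IF NOT EXISTS ads (
    ad_id TEXT PRIMARY KEY, headline TEXT,
    first_seen TEXT, last_seen TEXT)""")

def record(ad_id: str, headline: str) -> None:
    """Insert a newly seen ad, or refresh last_seen if we already hold it."""
    now = datetime.datetime.now(datetime.timezone.utc).isoformat()
    db.execute("""INSERT INTO ads VALUES (?, ?, ?, ?)
                  ON CONFLICT(ad_id) DO UPDATE SET last_seen = ?""",
               (ad_id, headline, now, now, now))
    db.commit()

record("ad-123", "A promise made only to you")  # hypothetical ad
```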

And the irony is that, as we demand more transparency from public bodies, the whole basis of political propaganda could be on the brink of a revolutionary change.

What's interesting, I think, about the new environment is the potential for using paid advertising and other techniques to create individual propaganda bubbles around individual voters. And that's not about controlling the market as a whole; it's about using smart targeting which, in a sense, creates such a compelling and overarching information environment for individual people that it in some ways constrains what they do. I think that's why some academic commentators and others are beginning to think some of this is a bit spooky.

But politicians aren't the only ones with Facebook on their minds. The social network was one of many topics on the very large brain of national treasure and tech geek Stephen Fry. I met up with him after he gave a lecture at the Hay Festival highlighting how he thinks the world is being changed by social media.

The very current conversation is whether Facebook and platforms like them should actually be considered publishers. Should they take responsibility for what ends up on the site?

They are aware there is a problem, a serious problem. If 80%, some people have said, is the proportion of people who get their news from Facebook rather than from mainstream media, then surely it is incumbent upon someone who is providing 80% of their news sources to make sure that those news sources are not defamatory, blatant lies, propaganda, the wrong kind of...

I would posit there that a publisher is responsible for all the people who write for it; they are employed by that publisher, and Facebook is clearly not that. So do we need a third definition, a third thing? I think there is a medium sort of definition, and it's not beyond the wit of lawyers of the right kind to find that.

Everyone knows you are a famous early adopter. You seem to try everything that comes out?

Yeah, I'm perpetually driven by curiosity. It may even be a kind of addictive greed, which is a sort of rather graver example of the consumer greed that we all share, really. Maybe I'm just more of a jackdaw, who likes shiny things, and I just... I still love the, you know, the first kind of... I get short of breath and I pant slightly.

Your presentation was a warning that people should prepare for the changes that are coming, for example, artificial intelligence.

I said that it was a sort of transformation. You know, it's an obsolescence of certain types of job, but that doesn't mean forced redundancy of millions of workers. I mentioned one of the pleasing things about AI and robotics, and that is what's known as Moravec's paradox: that what we're incredibly bad at, as individuals, machines tend to be very good at. Complicated sums, rapid and incredible access of memory from a database of a kind that we could never manage, sorting and swapping of information and cataloguing. But things we can do without even thinking, like walk across the room or pick up a glass and have a sip of water, machines find extremely hard. But that's fine, because we don't want them to do that for us. Where it gets difficult is in medium sort of service jobs, I think.

Well, you could go the etymological route, and you could say legere is 'read' and inter is 'between' - intelligere - and that's pretty good, actually: just being able to see connections in things. And people are talking about the moment that we arrive at AGI, artificial general intelligence, and that's when the various types of pattern recognition - you know, numbers, data, certain faces and things like that - all come together so that they can be intelligent across these different things.

If you've got an artificial intelligence that's good at that, and another one that's good at that, and another one that's good at that, and you get enough of them, and then you put something on top that goes, "Oh, you want to know about a language problem, go over here. You want to know about walking and recognising objects, go over there." Surely, just a collection of specialist intelligences under one umbrella is a general intelligence. It doesn't have to be a breakthrough, it just has to be... I think that's very likely to be the way it goes.
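
Fry's "umbrella" idea is essentially a router sitting on top of narrow systems. A toy Python sketch, with stubbed-out specialists standing in for real models, shows the shape of it:

```python
# Toy illustration of an "umbrella of specialists": narrow systems behind
# a router that picks the right one per query. The specialists are stubs.
SPECIALISTS = {
    "language": lambda q: f"[language model answers: {q}]",
    "vision":   lambda q: f"[vision model answers: {q}]",
    "movement": lambda q: f"[robotics planner answers: {q}]",
}

def route(query: str, topic: str) -> str:
    """Dispatch a query to the specialist registered for its topic."""
    handler = SPECIALISTS.get(topic)
    return handler(query) if handler else "no specialist available"

print(route("translate this sentence", "language"))
```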

Part of our intelligence, a huge part, is our emotional intelligence. Einstein would say - he did say - the same thing. It's not just the cerebral power; it is something to do with our blends of fury, inquisitive desire... These are very, very much more primal and much harder to reduce to a table of calculations and memories. But maybe you give it something else, some other kind of reward system. Its reward is similar to our reward system. It's tryptophan and serotonin and endorphins of various kinds that reward us, and then we have a pain system to deter us, and there's nothing to stop us giving that to a machine.

Stephen, thank you so much for your time.

Thank you for having us at your place.

It was the week where Android creator Andy Rubin unveiled his new Essential smartphone. Co-founder of Google Larry Page had a couple of novices take his Kitty Hawk Flyer for a spin. And Intel popped swing-sensing chips into cricket bats for the ICC Champions Trophy.

Now, whenever you go into a church you expect to see a man of God or a woman of God, but not a robot of God! A Protestant church in Germany has unveiled a robotic priest, called BlessU-2, to mark 500 years since the Reformation. The machine delivers various blessings in eight languages.

Researchers at MIT have come up with a unique way to let the visually impaired know more about what's around them. A 3D camera recognises stuff and feeds that detail back using vibrations through a belt and an electronic Braille interface.

Microsoft showed off a range of mixed reality headsets. They're designed to blend the real world and virtual content into one, where physical and digitally created objects co-exist and interact.

And Google Maps now doesn't just take you to your destination. Click on an art museum and take a peek around, with each artwork displayed with the gallery's notations, plus extra detail.

Now, as you've just been hearing, Microsoft may be leading the way in mixed or augmented reality, but they're not the only company that's trying to make you see things that aren't really there. I'm at the Augmented World Expo, which has doubled in size since last year, a sign of the excitement this relatively new technology is generating.

Now you have a hologram in front of you. When you open your fist, you release it and then you can look at it. Now you can stick your hand beside it. Place it anywhere in space, anywhere you want. I'm going to put it on top of Cody's head, behind the camera.

I feel like this is going to strain my eyes if I use it for any length of time.

Funny that you say that - we have a team of neuroscientists on staff who actually do user testing with us. We're confirming with our neuroscientist team that it does not cause any eye strain or neck pain or anything when you wear it.

The problem with all of these devices is that most of them require expensive new gadgets, like these $799 glasses, or big bulky devices that soon become uncomfortable to wear. But I did find one idea that I think could genuinely help many people.

Project Chalk is an app that uses augmented reality to let you remotely help someone use a gadget or fix an object. So next time your parents ring, you can show them what to do, not just tell them. You can see the clever thing about this app is that, even if I move around the machine, we still have the instructions overlaid in the same place, which means it can be much more precise than if it was just drawn on a normal screen.

Project Chalk is designed to work on all types of devices: phones, tablets and what we call digital eyewear, which you can also... So this will be available on a free basis, which means anyone will be able to start by downloading it from the app store and using it. But once people have to pay, do you think they'll go back to just using Skype or just talking on the phone to get help?

Now, last week the world's best player of the Chinese board game Go was pretty soundly beaten by Google's artificial intelligence. For those following the rapid development of AI, the result came as little surprise. 20 years ago, however, a victory of machine over man shocked the world, prompting apocalyptic warnings about our own role in society and questions about whether humanity could survive in a world where we could be out-thought by a computer. The man at the centre of that match was chess grandmaster Garry Kasparov. The machine was IBM's supercomputer Deep Blue.

You talk about the fact that it took you quite a long time to get over that loss. Can you explain why it took that long, and how you got over it?

It was my first loss, and losing this match... I mean, I knew I made mistakes and I didn't play well, and I could kick myself in my head for not preparing properly.

Since then, you've become an advocate of AI?

At the end of the day, the future's a self-fulfilling prophecy. So if we're negative, something wrong will happen. I'm not telling you that if we're positive we can avoid it, but at least we have a chance to turn it into an adventure. Since it's happening anyway, and I could see what was happening in the game of chess, I have been looking for ways to collaborate with machines. I believe that there's so many things we can do, and when people say, "We don't know what's ahead" - that's exactly the reason for us to move there. Because we don't know: that was one of the main driving forces in the history of humanity.

The AlphaGo software that's just won the match at Go works in a very different way: it teaches itself the rules. How do you feel about that kind of AI? Is that the way that AI should go - learning about the rules of the world?

One should say that Deep Blue was anything but intelligent. It was as intelligent as your alarm clock. A very expensive alarm clock, a $10 million alarm clock, but an alarm clock. Very powerful brute force, with little chess knowledge - but chess proved to be vulnerable to this brute force. With 200 million positions per second, roughly the average speed of Deep Blue, it showed very... So it gave us almost no insight into the mysteries of human intelligence. AlphaGo is different because it's a self-teaching programme, and if we're looking for a definition of AI, it's still hard to really understand whether AlphaGo is AI, because now we're moving... So we still don't understand exactly what human intelligence is.

What do you think of the grand challenges in AI? What are the things that maybe aren't being talked about enough, or that people don't think about enough, when it comes to accepting and implementing artificial intelligence?

One thing that's important to remember is that everything that we do and we know how to do, machines will do better eventually - everything we do and know how we do. But there's so many things that we don't know how we do. Let's concentrate on that, because machines have algorithms and they're getting better and better, but machines have no curiosity, no passion and, most importantly, machines have no purpose. Maybe the good thing is that we know we have purpose, but we don't know exactly what the purpose is. So that guarantees that we'll stay around for quite a while.

Now, it's time for artificial intelligence to be put through its paces. For Click's show at this year's Hay Festival, we challenged Click code monkey Stephen Beckett to create a game the whole audience could play at once.

We're going to split the audience in two now, right down there. This half is going to play against this half. Our teams have to be the last one to fall off this plank, whether they do that by canny balancing or more boisterous tactics. They're in two teams and, by each moving their arms left and right, they can together control where their player goes. And there's one more surprise feature: by shouting out special key words, they can make their player get bigger and therefore heavier.

So that's the rules of the game, but how can it see what you're doing and hear what you're shouting? Behind the scenes, it's running in a game engine called Unity. You can download this for free, and it's a great way to get started with making games. There's a massive community out there offering free tutorials; I've shared a few of my favourites over on Twitter.

Unity is looking after the graphics and rules of the game, and it's also hooked into IBM's artificially intelligent Watson system, which is turning the speech from the audience into text. Watson runs in the cloud, so my little laptop can have the smarts of a supercomputer just using a Wi-Fi connection.
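
For a sense of how a laptop borrows Watson's smarts over the network, here is a sketch using IBM's current Python SDK (ibm-watson). The show's own integration ran inside Unity, so this is only illustrative, and the API key, service URL and audio file below are placeholders.

```python
from ibm_watson import SpeechToTextV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials: substitute your own IBM Cloud API key and URL.
stt = SpeechToTextV1(authenticator=IAMAuthenticator("YOUR_API_KEY"))
stt.set_service_url("https://api.eu-gb.speech-to-text.watson.cloud.ibm.com")

# Send a recorded clip to the cloud and print what Watson heard.
with open("crowd.wav", "rb") as audio:
    result = stt.recognize(audio=audio, content_type="audio/wav").get_result()

for chunk in result["results"]:
    print(chunk["alternatives"][0]["transcript"])
```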

If you've ever used a speech recognition service like Siri, Alexa or Google, you'll know that sometimes they can be hit and miss. IBM's Watson is no exception: things like background noise and music can throw it off, leading to some odd transcriptions. But to make the game more fun, I've set it up to also look for the closest-sounding words. So words like say, stay and may will also count as a shout for hey.
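
One simple way to approximate that fuzziness is to accept any transcribed word that rhymes with the keyword. The hand-rolled suffix check below is my own assumption about the approach, not the show's actual code:

```python
# Minimal sketch of fuzzy keyword matching as described: any word that
# rhymes with "hey" counts as a shout of "hey". The suffix list is a
# hand-rolled approximation for illustration.
SOUND_ALIKE_SUFFIXES = ("ey", "ay", "eigh")

def counts_as_hey(word: str) -> bool:
    """True if a transcribed word plausibly sounds like 'hey'."""
    return word.lower().endswith(SOUND_ALIKE_SUFFIXES)

for heard in ["hey", "say", "stay", "may", "table"]:
    print(heard, counts_as_hey(heard))  # table -> False, the rest -> True
```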

Despite this, I didn't anticipate just how loud our audience would be. They're so loud, in fact, that the voice recognition struggled to hear all the words, so there wasn't quite as much growing as planned.

So that's the voice, but how do we work out what your arms are doing? Well, we're using the openFrameworks and OpenCV libraries. These are two beefy bits of software to help you build creative coding projects. openFrameworks is more complicated than Unity, but there are lots of great guides out there if you already have some coding experience. This programme takes the feed from our cameras, looks for edges and then tries to find lines. So if you move your arm left or right, that counts as a move in that direction.
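
Here is a rough Python equivalent of that pipeline (the show used openFrameworks in C++; the function choices here are my own): Canny edge detection, a Hough transform to fit lines to the edges, then the average line angle as a crude left/right signal.

```python
import cv2
import numpy as np

def crowd_direction(frame: np.ndarray) -> float:
    """Estimate which way arms are tilting from the mean angle of detected lines."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                      # edge detection
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 50,
                            minLineLength=40, maxLineGap=10)
    if lines is None:
        return 0.0
    angles = [np.arctan2(y2 - y1, x2 - x1)
              for x1, y1, x2, y2 in lines[:, 0]]
    return float(np.mean(angles))  # sign indicates lean direction

cap = cv2.VideoCapture(0)          # default webcam
ok, frame = cap.read()
if ok:
    print("mean line angle (radians):", crowd_direction(frame))
cap.release()
```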

And that is all it takes to get 300 people in a tent playing one giant game. Give yourself a huge round of applause!

Yeah, never a dull moment when you put us in a tent. That's it from us at the Hay Festival for this year. Hopefully, we didn't break too much and they'll have us back. In the meantime, why don't you follow us on Facebook for loads of extra content, and of course on Twitter too, @BBCClick. Thanks for watching and, one more time: hey!

