:00:00. > :00:00.Now on BBC News, it is time for Click.
:00:00. > :00:07.This week: Swaying votes on Facebook.
:00:08. > :00:12.It's geek versus geek as Spen meets Fry.
:00:13. > :00:39.And lots of people shout in a big tent.
:00:40. > :00:42.Summer is on the way and, well, it wouldn't be a British summer
:00:43. > :00:44.without a visit to a good old fashioned festival.
:00:45. > :00:56.Known as the Town of Books, Hay-on-Wye, in Wales,
:00:57. > :01:05.is a literary Mecca, an annual gathering of artists,
:01:06. > :01:08.authors, Daleks and, yep, even Royals.
:01:09. > :01:11.It's even been called the Woodstock of the Mind by none other
:01:12. > :01:15.than former US President Bill Clinton.
:01:16. > :01:20.This year it's the 30th Hay Festival and the line-up is pretty stellar.
:01:21. > :01:24.Well, for the second year in a row, we've been invited to share some
:01:25. > :01:26.of our favourite experiences and show off some really good tech,
:01:27. > :01:29.all in front of a real, live audience of actual people.
:01:30. > :01:44.A packed tent waited; all we had to do was wow them!
:01:45. > :01:56.We had robots falling over, experiments in haptic feedback
:01:57. > :01:58.and demos in binaural sound, but that was nothing
:01:59. > :02:00.compared to the climax - a Click-created wavy,
:02:01. > :02:02.shouty game built using artificial intelligence.
:02:03. > :02:07.In the meantime, it can't have escaped your attention that around
:02:08. > :02:09.the UK things are getting a touch political.
:02:10. > :02:12.As the general election looms, those politicians are using increasingly
:02:13. > :02:20.sophisticated techniques in order to learn more about us.
:02:21. > :02:27.The advertising reach of Facebook has long been an open secret,
:02:28. > :02:31.but now it's something the political parties are getting in on too.
:02:32. > :02:35.In fact, both the Trump campaign and the Leave.EU groups credited
:02:36. > :02:37.Facebook as being a vital part of their electioneering.
:02:38. > :02:40.We know that the personal details that you give to social networks
:02:41. > :02:43.allow them to send you relevant, targeted content, and it goes
:02:44. > :02:51.much deeper than just your basic demographics.
:02:52. > :02:54.There are now data analytics companies claiming to be able
:02:55. > :02:56.to micro-target and micro-tweak messages for individual readers,
:02:57. > :03:07.If you know the personality of the people you're targeting,
:03:08. > :03:09.you can nuance your messaging to resonate more effectively
:03:10. > :03:14.What's also emerging is that political parties have been
:03:15. > :03:17.using this data to reach potential voters on a very granular level.
:03:18. > :03:26.So who is being targeted on Facebook and how?
:03:27. > :03:28.Well, until now, there's been nothing around to analyse any
:03:29. > :03:32.of this, but the snap general election galvanised
:03:33. > :03:34.Louis Knight-Webb and Sam Jeffers to develop Who Targets Me,
:03:35. > :03:41.a plug-in to tell each of us how we're being targeted.
:03:42. > :03:43.When you install the plug-in for the first time,
:03:44. > :03:46.it asks for your age, your gender and your location,
:03:47. > :03:48.and then it starts scouring your Facebook feed looking for adverts
:03:49. > :03:51.So once you've installed the plug-in, it works
:03:52. > :03:54.in the background to extract the whole advert.
:03:55. > :03:57.So it pulls out the headline, the subtitle, any related videos,
:03:58. > :04:05.We also get the reaction - so how many likes, how many
:04:06. > :04:08.comments, how many shares - so we can see which messages
:04:09. > :04:10.Are they particularly clandestine messages,
:04:11. > :04:14.are they slightly subversive, are they even fake news?
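As described above, the plug-in pulls each advert apart into its component pieces along with its reaction counts. A minimal sketch of what one such record might look like - with hypothetical field names, not Who Targets Me's actual data model - could be:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ScrapedAdvert:
    """One advert captured from a social feed (hypothetical fields)."""
    headline: str
    subtitle: str
    videos: List[str]   # URLs of any related videos
    likes: int
    comments: int
    shares: int

    def engagement(self) -> int:
        # A crude single number for "which messages resonate":
        # the total of all reaction types.
        return self.likes + self.comments + self.shares

ad = ScrapedAdvert("A stronger economy", "Read our plan", [], 120, 30, 45)
print(ad.engagement())
```

Comparing `engagement()` across captured adverts is one simple way to rank which messages are getting the biggest reaction.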
:04:15. > :04:16.But how do data companies get the information in the first place?
:04:17. > :04:19.A lot of the quizzes you fill out on Facebook or,
:04:20. > :04:21.you know, you open a survey, it asks your Facebook
:04:22. > :04:26.Sometimes you'll notice that there's a lot of permissions attached
:04:27. > :04:29.and as soon as you click yes, all of your data is mined,
:04:30. > :04:31.and it's then sold on to data brokers who then, eventually,
:04:32. > :04:38.sell it to the political parties for use in their campaigns.
:04:39. > :04:42.Although Facebook says it doesn't sell our information on,
:04:43. > :04:45.data brokers can overlay any details they mine from the site with other
:04:46. > :04:51.datasets that they have on people based on their email addresses.
:04:52. > :04:55.The next step after that of course is to find similar users that
:04:56. > :04:57.are using Facebook and then target adverts, from that advertiser
:04:58. > :05:00.that supplied the email addresses, to those users.
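The overlay-and-lookalike step described above can be illustrated with a toy example: both sides hash their email lists so the lists can be compared without exchanging raw addresses. This is a generic sketch of list matching, not Facebook's or any broker's actual process:

```python
import hashlib

def fingerprint(email: str) -> str:
    # Normalise then hash, so two parties can compare lists
    # without revealing the raw addresses.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Hypothetical broker list and platform user base
broker_list = {fingerprint(e) for e in ["alice@example.com", "bob@example.com"]}
platform_users = {
    fingerprint("alice@example.com"): "Alice",
    fingerprint("carol@example.com"): "Carol",
}

# Users present in both datasets can then be targeted directly,
# and their profiles used to find "similar" users.
matched = sorted(name for digest, name in platform_users.items()
                 if digest in broker_list)
print(matched)
```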
:05:01. > :05:09.There are just some people that you don't find on Twitter.
:05:10. > :05:12.The very nature of the fact that I can't see your adverts,
:05:13. > :05:15.you can't see my adverts, means that this approach had to be crowd-sourced.
:05:16. > :05:23.It's a first of its kind anywhere in the world on this scale,
:05:24. > :05:26.giving us citizens some transparency into what we're being shown,
:05:27. > :05:33.Do you think that people wouldn't know that certain things are adverts?
:05:34. > :05:43.A lot of the time people are scrolling through Facebook
:05:44. > :05:45.and the adverts fit into this weird intersection of friend
:05:46. > :05:51.It's quite easy to miss the adverts on Facebook.
:05:52. > :05:53.So far, Who Targets Me has some 6,700 users
:05:54. > :05:54.in 620 constituencies, and it's rising as the election approaches.
:05:55. > :06:06.On the down side, it's only as good as the data it's
:06:07. > :06:07.managed to crowd-source, so it isn't necessarily
:06:08. > :06:09.representative, and it also doesn't work with mobile Facebook,
:06:10. > :06:14.So we're seeing a mixture of two things.
:06:15. > :06:16.We're seeing, firstly, A/B testing, which is where I try out
:06:17. > :06:18.two different messages with the same group.
:06:19. > :06:21.I see which one gets the best reaction and then use that one more widely.
:06:22. > :06:26.We're also seeing targeting, which is where I pick a particular
:06:27. > :06:28.demographic of people, and then I send a message
:06:29. > :06:35.So, for example, it might be young people targeted
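The A/B test described above boils down to comparing reaction rates between two versions of a message and keeping the better one. A minimal sketch, with made-up numbers:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    # The fraction of people shown the message who reacted to it
    return clicks / impressions if impressions else 0.0

# Hypothetical results of showing two message variants to the same group
variants = {
    "A": (120, 4000),  # (clicks, impressions)
    "B": (150, 4000),
}

# Keep the variant with the better reaction and use it more widely
winner = max(variants, key=lambda v: click_through_rate(*variants[v]))
print(winner)
```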
:06:36. > :06:44.The data from Who Targets Me is also being pored over by analysts
:06:45. > :06:47.One aspect of their research is collecting dark posts,
:06:48. > :06:51.ads which are here one day and gone the next.
:06:52. > :06:54.It gives us the ability to create a repository of those dark posts.
:06:55. > :06:57.So if promises are being made on Facebook, in ads which will
:06:58. > :07:00.disappear the day after you use them, we should be able to go back
:07:01. > :07:02.to those after the election, look at them, evaluate them
:07:03. > :07:08.and maybe discuss them in the cold light of day.
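The dark-post repository described above amounts to recording which ads were visible on each crawl and flagging the ones that vanish. A toy sketch of that bookkeeping (a hypothetical structure, not the analysts' actual system):

```python
# Each observation: (ad_id, day_seen), as crowd-sourced from users' feeds
observations = [
    ("ad-1", 1), ("ad-2", 1),
    ("ad-1", 2),              # ad-2 was not seen again after day 1
]

# Group the sightings by day
seen_on = {}
for ad_id, day in observations:
    seen_on.setdefault(day, set()).add(ad_id)

# Ads that were here one day and gone the next are "dark post" candidates,
# preserved in the repository for scrutiny after the election
dark_candidates = sorted(seen_on[1] - seen_on[2])
print(dark_candidates)
```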
:07:09. > :07:10.And the irony is that, as we demand more transparency
:07:11. > :07:12.from public bodies, the whole basis of political propaganda
:07:13. > :07:19.could be on the brink of a revolutionary change.
:07:20. > :07:21.What's interesting, I think, about the new environment
:07:22. > :07:23.is the potential for using paid advertising and other techniques
:07:24. > :07:25.to create individual propaganda bubbles around individual voters.
:07:26. > :07:28.And that's not about controlling the market as a whole,
:07:29. > :07:31.but it's about using smart targeting which, in a sense, creates such
:07:32. > :07:32.a compelling and overarching information environment
:07:33. > :07:35.for individual people that that in some ways constrains what they do
:07:36. > :07:47.I think that's why some academic commentators and others
:07:48. > :07:59.are beginning to think some of this is a bit spooky.
:08:00. > :08:03.But politicians aren't the only ones with Facebook on their minds.
:08:04. > :08:07.The social network was one of many topics on the very large brain
:08:08. > :08:08.of national treasure and tech geek Stephen Fry.
:08:09. > :08:12.I met up with him after he gave a lecture at the Hay Festival
:08:13. > :08:14.highlighting how he thinks the world is being changed by social media.
:08:15. > :08:18.The very current conversation is whether Facebook and platforms
:08:19. > :08:19.like them should actually be considered publishers?
:08:20. > :08:31.Should they take responsibility for what ends up on the site?
:08:32. > :08:34.They are aware there is a problem, a serious problem.
:08:35. > :08:36.If 80%, as some people have said, is the...
:08:37. > :08:39.You know, the proportion of people who get their news from Facebook
:08:40. > :08:41.rather than from mainstream media, then surely it is incumbent
:08:42. > :08:44.upon someone who is providing 80% of their news sources to make sure
:08:45. > :08:46.that those news sources are not defamatory,
:08:47. > :08:47.blatant lies, propaganda, the wrong kind of,
:08:48. > :08:59.I would posit there that a publisher is responsible for all the people who write for it.
:09:00. > :09:03.They are employed by that publisher, and Facebook is clearly not that.
:09:04. > :09:07.So do we need a third definition, a third thing?
:09:08. > :09:17.I think there is a medium sort of definition that it's not
:09:18. > :09:20.beyond the wit of lawyers of the right kind to find that.
:09:21. > :09:22.Everyone knows you are a famous early adopter.
:09:23. > :09:24.You seem to try everything that comes out?
:09:25. > :09:26.Yeah, I'm perpetually driven by curiosity.
:09:27. > :09:30.It may even be a kind of addictive greed, which is a sort of rather
:09:31. > :09:33.graver example of the consumer greed that we all share, really.
:09:34. > :09:36.Maybe I'm just more of a jackdaw, who likes shiny things and I just...
:09:37. > :09:51.I still love the, you know, the first kind of...
:09:52. > :09:53.I get short of breath and I pant slightly.
:09:54. > :09:55.Your presentation was a warning that people should prepare
:09:56. > :09:57.for the changes that are coming, for example, artificial intelligence.
:09:58. > :10:01.I said that it was a sort of transformation of
:10:02. > :10:04.You know, it's an obsolescence of certain types of job,
:10:05. > :10:06.but that doesn't mean forced redundancy of millions of workers.
:10:07. > :10:09.I mentioned one of the pleasing things about AI and robotics,
:10:10. > :10:11.and that is what's known as Moravec's paradox,
:10:12. > :10:14.that what we're incredibly bad at, as individuals, machines tend to be very good at.
:10:15. > :10:20.Complicated sums, rapid and incredible access of memory
:10:21. > :10:22.from a database of a kind that we could never do,
:10:23. > :10:26.sorting and swapping of information and cataloguing
:10:27. > :10:41.But things we can do without even thinking,
:10:42. > :10:45.like walk across the room or pick up a glass and have a sip of water, machines find very hard.
:10:46. > :10:49.But that's fine, because we don't want them to do that for us.
:10:50. > :10:52.Where it gets difficult is in medium sort of service jobs, I think.
:10:53. > :10:56.Well, you could go the etymological route, and you could say
:10:57. > :11:03.Legere is 'to read', and inter... interlegere - and that's pretty good, actually,
:11:04. > :11:14.Just being able to see connections in things. And people are talking
:11:15. > :11:18.about the moment that we arrive at AGI, artificial general
:11:19. > :11:21.intelligence, and that's when the various types of pattern
:11:22. > :11:22.recognition, you know, numbers, data, you know,
:11:23. > :11:25.certain faces and things like that, they all come together
:11:26. > :11:32.so that they can be intelligent across these different things.
:11:33. > :11:35.If you've got an artificial intelligence that's good at that
:11:36. > :11:38.and another one that's good at that and another one that's good at that
:11:39. > :11:41.and you get enough of them, and then you put something on top
:11:42. > :11:44.that goes, "Oh, you want to know about a language problem,
:11:45. > :11:47.You want to know about walking and recognising objects,
:11:48. > :11:50.Surely, just a collection of specialists of intelligences
:11:51. > :11:52.under one umbrella is a general intelligence.
:11:53. > :11:55.It doesn't have to be a breakthrough, it just has to be
:11:56. > :12:01.I think that's very likely to be the way it goes.
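Fry's "collection of specialists under one umbrella" can be pictured as a simple router that hands each problem to the relevant narrow system. This is purely illustrative, with made-up specialists:

```python
# Hypothetical narrow "specialists", each good at exactly one thing
def language_specialist(query: str) -> str:
    return f"parsed language: {query!r}"

def vision_specialist(query: str) -> str:
    return f"recognised objects in: {query!r}"

SPECIALISTS = {
    "language": language_specialist,
    "vision": vision_specialist,
}

def umbrella(topic: str, query: str) -> str:
    # The "something on top" that sends each problem
    # to the right narrow intelligence.
    return SPECIALISTS[topic](query)

print(umbrella("language", "translate this sentence"))
```

Whether such a dispatcher would really amount to general intelligence is, of course, exactly the open question being discussed.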
:12:02. > :12:04.Part of our intelligence, a huge part, is our emotional intelligence.
:12:05. > :12:06.Einstein said the same thing.
:12:07. > :12:08.It's not just the cerebral power, it is something to do
:12:09. > :12:10.with our blends of fury, inquisitive desire,
:12:11. > :12:13.These are very, very much more primal and much harder
:12:14. > :12:15.to reduce to a table of calculations and memories.
:12:16. > :12:18.But maybe you give it something else, some other kind
:12:19. > :12:21.Its reward is similar to our reward system
:12:22. > :12:25.It's tryptophan and serotonin and endorphins of various kinds that
:12:26. > :12:28.reward us and then we have a pain system to deter us,
:12:29. > :12:33.and there's nothing to stop us giving that to a machine.
:12:34. > :12:35.Stephen, thank you so much for your time.
:12:36. > :12:40.Thank you for having us at your place.
:12:41. > :12:49.It was the week where Android creator Andy Rubin unveiled his Essential smartphone.
:12:50. > :12:53.Co-founder of Google Larry Page had a couple of novices
:12:54. > :12:55.take his Kitty Hawk Flyer for a spin.
:12:56. > :12:57.And Intel popped swing-sensing chips into bats for the ICC Champions Trophy.
:12:58. > :13:00.Now, whenever you go into a church you expect to see a man
:13:01. > :13:12.of God or a woman of God, but not a robot of God!
:13:13. > :13:15.A Protestant church in Germany has unveiled a robotic priest,
:13:16. > :13:17.called BlessU-2, to mark 500 years since the Reformation.
:13:18. > :13:28.The machine delivers various blessings in eight languages.
:13:29. > :13:30.Researchers at MIT have come up with a unique way to let
:13:31. > :13:32.the visually impaired know more about what's around them.
:13:33. > :13:35.A 3D camera recognises stuff and feeds that detail back
:13:36. > :13:37.using vibrations through a belt and an electronic Braille interface.
:13:38. > :13:39.Microsoft showed off a range of mixed reality headsets,
:13:40. > :13:42.designed to blend the real world and virtual content into one
:13:43. > :13:52.where physical and digitally created objects co-exist and interact.
:13:53. > :13:59.Google Maps now doesn't just take you to your destination, it can take you inside too.
:14:00. > :14:11.Click on an art museum and take a peek around with each artwork
:14:12. > :14:13.displayed with the gallery's notations, plus extra
:14:14. > :14:23.Now, as you've just been hearing, Microsoft may be leading the way
:14:24. > :14:26.in mixed or augmented reality, but they're not the only company
:14:27. > :14:33.that's trying to make you see things that aren't really there.
:14:34. > :14:36.I'm at the Augmented World Expo which has doubled in size since last
:14:37. > :14:38.year, a sign of the excitement this relatively new technology is generating.
:14:39. > :14:58.Now you have a hologram in front of you.
:14:59. > :15:06.When you open your fist, you release it and then you can look at it.
:15:07. > :15:10.Now you can stick your hand beside it.
:15:11. > :15:15.Place it anywhere in space, anywhere you want.
:15:16. > :15:20.I'm going to put it on top of Cody's head, behind the camera.
:15:21. > :15:26.I feel like this is going to strain my eyes if I use it for any length of time.
:15:27. > :15:31.Funny you should say that - we have a team of neuroscientists
:15:32. > :15:33.on staff who actually do user testing with us.
:15:34. > :15:35.We're confirming with our team of neuroscientists that it does
:15:36. > :15:38.not cause any eye strain or neck pain or anything when it's worn.
:15:39. > :15:44.The problem with all of these devices is that most of them require
:15:45. > :15:46.expensive new gadgets, like these $799 glasses or big bulky
:15:47. > :15:48.devices that soon become uncomfortable to wear.
:15:49. > :15:51.But I did find one idea that I think could genuinely make many people's lives easier.
:15:52. > :15:54.Project Chalk is an app that uses augmented reality to let
:15:55. > :15:57.you remotely help someone use a gadget or fix an object.
:15:58. > :16:00.So next time your parents ring, you can show them what to do,
:16:01. > :16:09.You can see the clever thing about this app is that,
:16:10. > :16:12.even if I move around the machine, we still have the instructions
:16:13. > :16:15.overlaid in the same place which means it can be much more
:16:16. > :16:26.precise than if it was just drawn on a normal screen.
:16:27. > :16:28.Project Chalk's designed to work on all types of devices -
:16:29. > :16:31.phones, tablets and what we call digital eyewear, which you can also
:16:32. > :16:36.So this will be available on a free basis which means anyone will be
:16:37. > :16:39.able to start by downloading it from the app store and using
:16:40. > :16:45.But once people have to pay, do you think they'll go back to just
:16:46. > :16:48.using Skype or just talking on the phone to get help?
:16:49. > :17:03.Now last week, the world's best player of the Chinese board game Go
:17:04. > :17:04.was pretty soundly beaten by Google's artificial intelligence.
:17:05. > :17:17.For those following the rapid development of AI, the result
:17:18. > :17:20.20 years ago, however, a victory of machine over
:17:21. > :17:21.man shocked the world, prompting apocalyptic
:17:22. > :17:23.warnings about our own role in society and questions
:17:24. > :17:25.about whether humanity could survive in a world
:17:26. > :17:27.where we could be outthought by a computer.
:17:28. > :17:30.The man at the centre of that match was chess world champion Garry Kasparov.
:17:31. > :17:38.The machine was IBM's supercomputer Deep Blue.
:17:39. > :17:42.You talk about the fact that it took you quite a long time to get over that defeat.
:17:43. > :17:50.Can you explain why it took that long and how you got past it?
:17:51. > :17:53.It was my first loss and losing this match,
:17:54. > :17:57.I mean, I knew I made mistakes and I didn't play well
:17:58. > :18:00.and I could kick myself in my head for not preparing
:18:01. > :18:08.Since then, you've become an advocate of AI?
:18:09. > :18:10.At the end of the day, the future's a self-fulfilling prophecy.
:18:11. > :18:12.So if we're negative, something wrong will happen.
:18:13. > :18:15.I'm not telling you that if we're positive we can avoid it,
:18:16. > :18:18.but at least we have a chance to turn it into an adventure.
:18:19. > :18:20.Since it's happening anyway - I could see what's happening
:18:21. > :18:23.in the game of chess - I have been looking for ways
:18:24. > :18:32.I believe that there's so many things we can do
:18:33. > :18:34.and when people say, "We don't know what's
:18:35. > :18:38.That's exactly the reason for us to move there because we don't know,
:18:39. > :18:44.that was one of the main driving forces in the history of humanity.
:18:45. > :18:46.The AlphaGo software, which has just won the match at Go,
:18:47. > :18:49.works in a very different way: it teaches itself the rules.
:18:50. > :18:51.How do you feel about that kind of AI?
:18:52. > :18:57.Is that the way that AI should go, learn about the rules of the world?
:18:58. > :18:59.One should say that Deep Blue was anything but intelligent.
:19:00. > :19:01.It was as intelligent as your alarm clock.
:19:02. > :19:03.Very expensive alarm clock, $10 million alarm clock,
:19:04. > :19:14.Very powerful, brute force, with little chess knowledge,
:19:15. > :19:16.but chess proved to be vulnerable to this brute force,
:19:17. > :19:21.But with 200 million positions per second,
:19:22. > :19:23.that's roughly the average speed of Deep Blue, it showed very
:19:24. > :19:27.So it gave us almost no insight into the mysteries of human intelligence.
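The brute-force approach Kasparov describes - scoring enormous numbers of positions with a relatively simple evaluation - is classically implemented as minimax search. Here is a toy version on a hand-made game tree; it illustrates only the principle, not Deep Blue's actual code, which also relied on pruning and chess-specific hardware:

```python
def minimax(node, maximising=True):
    """Brute-force game-tree search: score every position reachable
    from here and pick the best, assuming the opponent does the same."""
    if isinstance(node, int):   # leaf: a position's evaluation score
        return node
    scores = [minimax(child, not maximising) for child in node]
    return max(scores) if maximising else min(scores)

# A tiny hand-made tree: lists are choice points, ints are evaluations.
# Deep Blue explored trees like this at ~200 million positions a second.
tree = [[3, 5], [2, 9]]
print(minimax(tree))
```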
:19:28. > :19:34.AlphaGo is different because it's a self-teaching programme,
:19:35. > :19:36.and if we're looking for an AI definition, it's still hard
:19:37. > :19:39.to really understand whether AlphaGo is AI because now we're moving
:19:40. > :19:55.So we still don't understand exactly what human intelligence is.
:19:56. > :19:58.What do you think of the grand challenges of AI?
:19:59. > :20:02.What are the things that maybe aren't being talked about enough
:20:03. > :20:05.or that people don't think about enough when it comes
:20:06. > :20:12.to accepting and implementing artificial intelligence?
:20:13. > :20:14.One thing that's important to remember is that everything
:20:15. > :20:17.that we do and we know how to do, machines will do better eventually.
:20:18. > :20:21.Everything we do and we know how we do.
:20:22. > :20:26.But there's so many things that we don't know how we do.
:20:27. > :20:28.Let's concentrate on that because machines have algorithms
:20:29. > :20:31.and they're getting better and better, but machines have no
:20:32. > :20:32.curiosity, no passion and, most importantly, machines have no purpose.
:20:33. > :20:36.Maybe the good thing is that we know we have a purpose,
:20:37. > :20:38.but we don't know exactly what the purpose is.
:20:39. > :20:51.So that guarantees that we'll stay around for quite a while.
:20:52. > :20:54.Now, it's time for artificial intelligence to be put to the test.
:20:55. > :20:58.For Click's show at this year's Hay Festival,
:20:59. > :21:00.we challenged Click code monkey Stephen Beckett to create a game powered by artificial intelligence.
:21:01. > :21:11.We're going to split the audience in two now, right down there.
:21:12. > :21:13.This half is going to play against this half.
:21:14. > :21:24.Our teams have to be the last one to fall off this plank,
:21:25. > :21:26.whether they do that by canny balancing or more boisterous tactics.
:21:27. > :21:32.They're in two teams and by each moving their arms left and right,
:21:33. > :21:35.they can altogether control where their player goes.
:21:36. > :21:38.And there's one more surprise feature, by shouting out special key
:21:39. > :21:41.words they can make their player get bigger and therefore heavier.
:21:42. > :21:44.So that's the rules of the game, but how can it see what you're doing?
:21:45. > :21:52.Behind the scenes, it's running in a game engine called Unity.
:21:53. > :21:56.You can download this for free, and it's a great way to get started.
:21:57. > :21:59.There's a massive community out there offering free tutorials.
:22:00. > :22:04.I've shared a few of my favourites over on Twitter.
:22:05. > :22:07.Unity is looking after the graphics and rules of the game and it's also
:22:08. > :22:09.hooked into IBM's artificially intelligent Watson system,
:22:10. > :22:11.which is turning the speech from the audience into text
:22:12. > :22:20.Watson runs in the cloud, so my little laptop can
:22:21. > :22:22.have the smarts of a supercomputer just using a Wi-Fi connection.
:22:23. > :22:25.If you've ever used a speech recognition service like Siri,
:22:26. > :22:27.Alexa or Google, you'll know that sometimes they can be a bit hit and miss.
:22:28. > :22:36.IBM's Watson is no exception: things like background noise
:22:37. > :22:39.and music can throw it off, leading to some odd results.
:22:40. > :22:47.But to make the game more fun, I've set it up to also look for the most similar-sounding words.
:22:48. > :22:52.So words like say, stay and may will also count as a shout for hey.
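That "similar-sounding words" trick can be approximated by accepting any transcribed word that shares the target's rhyming ending. This is a crude stand-in for whatever matching the show actually used:

```python
# Crude stand-in for "sounds like 'hey'": words ending in -ay or -ey
RHYME_ENDINGS = ("ay", "ey")

def counts_as_hey(word: str) -> bool:
    # Accept "hey" itself plus near-rhymes like "say", "stay", "may"
    return word.strip().lower().endswith(RHYME_ENDINGS)

heard = ["say", "stay", "may", "hello", "hey"]   # hypothetical transcript
shouts = [w for w in heard if counts_as_hey(w)]
print(shouts)
```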
:22:53. > :22:54.Despite this, I didn't anticipate just how loud our audience would be.
:22:55. > :23:01.They're so loud in fact that the voice recognition struggled
:23:02. > :23:04.to hear all the words, so there wasn't quite as much growing as we'd planned.
:23:05. > :23:12.So that's the voice, but how do we work out the movement?
:23:13. > :23:15.Well, we're using the openFrameworks and OpenCV libraries.
:23:16. > :23:18.These are two beefy bits of software to help you build creative coding projects.
:23:19. > :23:23.OpenFrameworks is more complicated than Unity,
:23:24. > :23:26.but there are lots of great guides out there if you already have some coding experience.
:23:27. > :23:31.This programme takes the feed from our cameras,
:23:32. > :23:33.looks for edges and then tries to find lines by
:23:34. > :23:37.So if you move your arm left or right, that counts towards your team's direction.
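Once line segments have been found (for example with OpenCV's Hough transform - the exact method isn't specified above), each segment's tilt can be read as a left or right lean. A pure-Python sketch with hypothetical segment coordinates:

```python
import math

def vote(segment):
    """Classify one detected line segment as a 'left' or 'right' lean
    from its angle. Coordinates are (x1, y1, x2, y2) in image space,
    where y grows downwards as in OpenCV, so we flip it."""
    x1, y1, x2, y2 = segment
    angle = math.degrees(math.atan2(y1 - y2, x2 - x1))
    return "right" if 0 < angle < 90 else "left"

# Hypothetical segments for a few raised arms
segments = [(10, 80, 40, 20), (50, 30, 90, 70), (5, 90, 30, 40)]
tally = {"left": 0, "right": 0}
for s in segments:
    tally[vote(s)] += 1
print(max(tally, key=tally.get))  # the crowd's overall lean
```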
:23:38. > :23:41.And that is all it takes to get 300 people in a tent playing one giant game.
:23:42. > :23:46.Give yourself a huge round of applause.
:23:47. > :23:50.Yeah, never a dull moment when you put us in a tent.
:23:51. > :23:53.That's it from us for the Hay Festival for this year.
:23:54. > :23:55.Hopefully, we didn't break too much and they'll have us back next year.
:23:56. > :24:05.In the meantime, why don't you follow us on Facebook for loads
:24:06. > :24:07.of extra content and of course on Twitter too @BBCClick.
:24:08. > :24:09.Thanks for watching and, one more time - hey, why.