Fake News

Transcript

:00:00. > :00:10.Did false stories have an impact on the outcome of the US presidential election?

:00:11. > :00:14.Ever since that question came up, fake news has become a story in its own right.

:00:15. > :00:23.On this week's Talking Business, we examine how fake news is spread

:00:24. > :00:50.and what, if anything, should be done to rein it in.

:00:51. > :00:57.Welcome to the programme. I'm Michelle Fleury in Washington, DC.

:00:58. > :00:58.From politics to the media to technology companies, lately a lot

:00:59. > :01:09.of people have got worked up about the spreading of

:01:10. > :01:12.lies as news. The problem is made worse

:01:13. > :01:14.because there is money to be made in the fake news business. This is how

:01:15. > :02:14.it works. So, who is responsible for the boom,

:02:15. > :02:19.and the containment of fake news? Here to help me investigate,

:02:20. > :02:23.we're joined from San Francisco by a web and technology writer

:02:24. > :02:30.and part-time entrepreneur. In the studio with me

:02:31. > :02:33.are the director of the museum in Washington,

:02:34. > :02:37.as well as a Wikipedia editor. Gentlemen, thank you

:02:38. > :02:46.very much for joining me. If I could start with a question

:02:47. > :02:50.to you, Andrew... How should we be

:02:51. > :02:54.defining fake news? I talked about it

:02:55. > :02:56.as being about lies. It is too broad

:02:57. > :03:01.a definition in many ways. One of the problems with the term

:03:02. > :03:04.fake news is it encompasses many different types of news sources,

:03:05. > :03:10.and websites that you find. So, I think, one of the better

:03:11. > :03:16.pieces written about this tried to break down the whole

:03:17. > :03:19.phenomenon of fake news: folks that might be intentionally

:03:20. > :03:22.creating news for profit - we saw there were some operations

:03:23. > :03:26.out of Macedonia that did this. You have partisan sites,

:03:27. > :03:29.which are not purely in it for profit. A whole bunch of different sites

:03:30. > :03:33.fall under this umbrella. It's important to understand there

:03:34. > :03:37.are different motivations for these. It is hard to paint everything

:03:38. > :03:41.as one fake news phenomenon. I think we're only very early

:03:42. > :03:51.in this news ecosystem, where a lot more people have

:03:52. > :03:56.their voices heard. But it also opens this up

:03:57. > :03:59.to the possibility of people deliberately polluting

:04:00. > :04:03.or sabotaging the news. I think that's a long-term

:04:04. > :04:08.concern in many ways but it's been brought about,

:04:09. > :04:10.been highlighted, in recent times because of the speed

:04:11. > :04:13.and velocity of social media. You heard about the speed

:04:14. > :04:19.and velocity of social media. I think fake news, as Andrew

:04:20. > :04:29.defined it, or tried to define it, is actually the right way

:04:30. > :04:31.to think about it. We have fake news coming out

:04:32. > :04:35.of places like Macedonia, which is essentially no

:04:36. > :04:39.different from spam. The rest of it is essentially

:04:40. > :04:41.propaganda. I think that is the reality

:04:42. > :04:49.of the news ecosystem right now, that it works at a much faster pace

:04:50. > :04:52.and at a much larger scale. So, it kind of blurs the boundaries

:04:53. > :05:06.between real and fake and real and unreal,

:05:07. > :05:08.and truths and half-truths. I think part of the new era

:05:09. > :05:12.is that there are no gatekeepers. It used to be, only a generation

:05:13. > :05:22.ago, that you would have Walter Cronkite and a few

:05:23. > :05:24.other anchors, and a few very well

:05:25. > :05:30.respected national newspapers in the United States, telling you: this is likely to be true,

:05:31. > :05:35.this is likely not to be. Now, for all kinds of good reasons,

:05:36. > :05:38.those gatekeepers have been torn down and we have a much more vibrant

:05:39. > :05:41.and dynamic news system. In the early days of this new era,

:05:42. > :05:44.we don't have the gatekeepers. In many ways, it's going to be up

:05:45. > :05:50.to the individual now to be a much more intentional news

:05:51. > :06:00.consumer than in the past. Aren't you letting the

:06:01. > :06:02.platforms, the Facebooks and Googles, off the hook? I think they are now beginning

:06:03. > :06:10.to acknowledge their new role in this information ecosystem,

:06:11. > :06:14.that they're not just pipes by which they distribute information

:06:15. > :06:17.and posts to people, but that they mediate,

:06:18. > :06:20.they edit, they curate the news. At the end of the day,

:06:21. > :06:24.I think that the social media platforms can do a significant

:06:25. > :06:27.amount, but it is really up to consumers. If consumers demand good quality

:06:28. > :06:31.news, the algorithms and the platforms will

:06:32. > :06:33.provide it to them. One thing that the traditional

:06:34. > :06:37.gatekeepers have to do is to make themselves relevant in a new age

:06:38. > :06:43.of technology and social media. As was mentioned, things

:06:44. > :06:45.are happening faster on social media. The most important thing

:06:46. > :06:52.is that it is uncontextualised. These are tweets, these

:06:53. > :07:00.are Instagram posts, they are Facebook utterances

:07:01. > :07:02.and they are not 3,000-5,000-word pieces. They come in isolation, in little bits,

:07:03. > :07:05.in parts of the internet. What you are seeing

:07:06. > :07:08.are some experiments, like the Washington Post

:07:09. > :07:09.for example. It says: we are providing a new

:07:10. > :07:18.plug-in for the Chrome browser. So, whenever you look

:07:19. > :07:20.at a tweet from Donald Trump, we're going to put a little fact

:07:21. > :07:23.check right underneath it. Not too many people will

:07:24. > :07:27.use this, obviously. How many people are using a Chrome

:07:28. > :07:30.desktop browser and installing plug-ins? It's kind of interesting

:07:31. > :07:32.that they are experimenting. We have also seen that Google

:07:33. > :07:36.and Facebook have realised that just being a platform,

:07:37. > :07:38.and just link sharing with no responsibility,

:07:39. > :07:40.is also not a great thing. They are almost what we call

:07:41. > :07:42.accidental or unintentional publishers. Because they are sharing the news

:07:43. > :07:46.but they are not doing the fact checking or flagging of content

:07:47. > :07:49.in the way that we would expect. A lot of the moves we are seeing

:07:50. > :07:53.from Facebook and Google right now, annotating and flagging, trying

:07:54. > :07:55.to get more information from the reader because readers

:07:56. > :07:57.are demanding that now. I do believe the new gatekeepers

:07:58. > :08:02.are the social media platforms. They have to make sure that

:08:03. > :08:08.nefarious information does not spread. They need to figure out the stuff

:08:09. > :08:21.coming out of places like Macedonia; the for-profit spam sites should be

:08:22. > :08:28.stopped right away - they have

:08:29. > :08:31.the infrastructure to do that.
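
[Editor's illustration: a minimal Python sketch of the "treat it like spam" point above - the sender-reputation-plus-content-heuristics machinery platforms already run for email could score for-profit fake-news sites the same way. The domains, weights and threshold are invented for the example, not any platform's actual filter.]

```python
# Hypothetical spam-style filter for for-profit fake-news stories.
# Every domain, weight and threshold below is made up for illustration.

KNOWN_SPAM_DOMAINS = {"totally-real-news.example", "clicks4cash.example"}

def spam_score(story):
    """Score a story the way an email filter scores a message:
    sender reputation first, then cheap content heuristics."""
    score = 0.0
    if story["domain"] in KNOWN_SPAM_DOMAINS:
        score += 1.0          # known for-profit fake-news source
    if story["headline"].isupper():
        score += 0.3          # all-caps, shouting headline
    if story["ad_count"] > 10:
        score += 0.4          # ad-stuffed page built to monetise clicks
    return score

def should_suppress(story, threshold=0.8):
    """Demote or drop the story before it enters anyone's feed."""
    return spam_score(story) >= threshold

print(should_suppress({"domain": "clicks4cash.example",
                       "headline": "SHOCKING TRUTH REVEALED",
                       "ad_count": 14}))    # -> True
```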

:08:32. > :08:34.The fact they did not do it tells me they dropped the ball. Even just to kind of say

:08:35. > :08:37.that we are looking into the fake news problem, I think, is a lot

:08:38. > :08:40.of lip service. In their universe, more attention

:08:41. > :08:42.to whatever people are doing, or reading on their platform,

:08:43. > :08:44.equals more profit. I want to talk about

:08:45. > :08:48.the decentralisation of news. From rolling news to

:08:49. > :08:52.technology companies becoming the new gatekeepers. How much of a role has that played

:08:53. > :09:00.in what we have seen today? I think we now see an

:09:01. > :09:03.abundance of news sites. In many ways this is

:09:04. > :09:09.the golden age of news. More people have access

:09:10. > :09:11.to more sites and more information than ever before,

:09:12. > :09:13.if they are willing to undertake the effort. If they just want to be passive

:09:14. > :09:24.and have things come at them, that is another matter. I agree that the technology

:09:25. > :09:27.companies have a role. I don't want teenagers

:09:28. > :09:29.in Macedonia flooding our news feeds. But I don't want to assign too much

:09:30. > :09:33.authority to Facebook and Google. They are doing what's best

:09:34. > :09:36.for their shareholders. But the algorithmically-derived

:09:37. > :09:44.results they produce are not particularly transparent and no one

:09:45. > :09:47.elected them to become the gatekeepers. I don't think you can mitigate

:09:48. > :09:58.the responsibility that individuals have. With all this access,

:09:59. > :10:01.we're now going to have to pay much closer attention. Later in the programme, can

:10:02. > :10:15.algorithms help wipe out fake news? Algorithms help determine

:10:16. > :10:17.which stories take precedence over others. Our comedy consultant has been

:10:18. > :10:23.taking a closer look. In this week's Talking Point,

:10:24. > :10:25.he examines the role played by algorithms. You've probably seen it written more

:10:26. > :10:34.than you heard it said. Usually written by someone

:10:35. > :10:37.on Facebook, virtually shouting. Now, sometimes people shout fake

:10:38. > :10:46.news in a Facebook comment purely because they disagree

:10:47. > :10:49.with the previous Facebook comment. But, real fake news,

:10:50. > :10:53.if you know what I mean, that's where a company,

:10:54. > :10:56.or just two guys with a laptop, set up a website that looks

:10:57. > :10:59.like a plausible news website and churn out stories that

:11:00. > :11:02.look sort of possible. How does that fake news

:11:03. > :11:05.and those made-up stories get spread? A former vice president at Intel

:11:06. > :11:12.here in Ireland and something of a tech and start-up guru tells me

:11:13. > :11:15.it's something to do with the algorithms. The beauty of algorithms

:11:16. > :11:21.is they neither know, nor care, what they are serving up. Likewise, in the news space,

:11:22. > :11:29.if there are news articles which are being clicked on out

:11:30. > :11:32.there and people with interests like mine are looking at real news

:11:33. > :11:36.items, or they are looking at false news items, or they are looking

:11:37. > :11:46.at who knows what news items. As far as the algorithm is

:11:47. > :11:58.concerned, this is a high hit rate for somebody like Philip,

:11:59. > :12:00.so I'm going to present that to him.
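
[Editor's illustration: a minimal Python sketch of the engagement-only ranking the interviewee describes - the scorer looks solely at the hit rate among users with similar interests and never asks whether a story is true. All names and data are invented.]

```python
# Hypothetical click-driven ranker: truth never enters the calculation.
from collections import defaultdict

def rank_stories(stories, click_log, my_interests):
    """Order stories by click-through among users whose interests overlap mine."""
    clicks, shown = defaultdict(int), defaultdict(int)
    for interests, story_id, clicked in click_log:
        if interests & my_interests:        # "people with interests like mine"
            shown[story_id] += 1
            clicks[story_id] += int(clicked)
    def hit_rate(story_id):
        return clicks[story_id] / shown[story_id] if shown[story_id] else 0.0
    return sorted(stories, key=hit_rate, reverse=True)

# A false story with a high hit rate outranks an accurate one with a low one.
log = [({"politics"}, "false-but-viral", True),
       ({"politics"}, "false-but-viral", True),
       ({"politics"}, "accurate-report", False),
       ({"politics"}, "accurate-report", True)]
print(rank_stories(["accurate-report", "false-but-viral"], log, {"politics"}))
# -> ['false-but-viral', 'accurate-report']
```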

:12:01. > :12:04.Wouldn't you think, though, that rational-thinking human beings would do a little bit of verification

:12:05. > :12:07.on the stories that they are served? The problem is, we don't

:12:08. > :12:12.do that verification. We're too happy to have our biases

:12:13. > :12:15.confirmed so when we see something plausible on our phones

:12:16. > :12:17.we just share it. So, controlling fake news

:12:18. > :12:19.is going to have to be automated. Fact Matter is a Google-backed

:12:20. > :12:25.project for automated fact checking. Our mission is to solve

:12:26. > :12:28.misinformation in online content. Solving online misinformation

:12:29. > :12:30.sounds like a noble cause. We're focusing on sentence

:12:31. > :12:33.level fact checking. This is identifying claims

:12:34. > :12:35.that appear in text Machine intelligence

:12:36. > :12:42.comes in using natural language processing,

:12:43. > :12:43.to understand, effectively, what the sources are that

:12:44. > :12:54.you could use to check that claim - what that claim is

:12:55. > :12:57.about - and the hardest part, which is linking

:12:58. > :12:59.that claim to a fact. It will be a browser extension

:13:00. > :13:02.which allows you to see highlighted sentence level claims

:13:03. > :13:05.about statistics, and it will link them to sources such as,

:13:06. > :13:08.for example, the World Bank, enabling readers to verify

:13:09. > :13:14.those claims immediately. Our extension could basically look

:13:15. > :13:20.at a piece of content and say this is likely to be misleading because,

:13:21. > :13:24.overall, this piece of news contains too many claims that do not check out.
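
[Editor's illustration: a hypothetical Python sketch of the sentence-level fact checking described - split text into sentences, flag statistic-like claims, and suggest a source (the World Bank is the example given) that could verify them. The regex and source table are crude stand-ins, not the project's actual system.]

```python
import re

# Hypothetical lookup: which public dataset could check a claim on a topic.
SOURCES = {"unemployment": "https://data.worldbank.org",
           "gdp": "https://data.worldbank.org"}

# Crude stand-in for the NLP step: a sentence containing a number plus a
# unit is treated as a checkable statistical claim.
CLAIM = re.compile(r"\b\d+(?:\.\d+)?\s*(?:%|percent|million|billion)", re.I)

def find_claims(text):
    """Yield (sentence, suggested_source) pairs for statistic-like sentences."""
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        if CLAIM.search(sentence):
            topic = next((t for t in SOURCES if t in sentence.lower()), None)
            # Linking the claim to an actual fact is, as the founder says,
            # the hardest part; here we only suggest where to look.
            yield sentence, SOURCES.get(topic)

article = "Unemployment fell to 3.2% last year. The minister was pleased."
for claim, source in find_claims(article):
    print(f"CLAIM: {claim!r} -> check against {source}")
```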

:13:25. > :13:33.The only problem with this solution is it may not be able to tell the difference between fake news

:13:34. > :13:36.and when someone is just trying to be funny, which is a problem

:13:37. > :13:43.I face myself quite often. Touching on the role of algorithms

:13:44. > :13:47.in fake news there. You can see more of his short films

:13:48. > :13:54.on our website. Back here in Washington,

:13:55. > :13:56.in the rapid and seemingly uncontrollable spread of fake news,

:13:57. > :14:02.modern technology as Could it also turn out to be

:14:03. > :14:06.part of the solution? Doctor Geoffrey from the Museum,

:14:07. > :14:18.Professor Andrew Lee of Wikipedia and the technology writer

:14:19. > :14:20.from San Francisco. Are we right to blame algorithms

:14:21. > :14:35.for the problem of fake news? You know, algorithms only do what we

:14:36. > :14:39.specify. They are just a set of instructions we have written.

:14:40. > :14:43.Of course, coupled with machine learning, there are all types of

:14:44. > :14:47.things happening we may not know of. It is about the machine making

:14:48. > :14:49.determinations without human eyeballs double-checking them or

:14:50. > :14:53.steering them. We had a problem recently where Google was giving

:14:54. > :14:59.search results for 'did the Holocaust happen?' with some very

:15:00. > :15:03.unsavoury sites at the top of those rankings. Google has repeatedly over

:15:04. > :15:06.the years said, sorry, that is our algorithm. If you want to see

:15:07. > :15:11.something different at the top, you're going to have to go on the

:15:12. > :15:15.internet and make it happen. Google did tweak the

:15:16. > :15:19.algorithm at the end of the day so Holocaust deniers were not at the

:15:20. > :15:23.top of the search results. I'm sympathetic to the companies. They

:15:24. > :15:27.have to recognise this is very early on. Three or four years ago I

:15:28. > :15:30.do not think Facebook was thinking, we are going to become the platform

:15:31. > :15:35.by which most millennials get their news. I do not think that

:15:36. > :15:44.was in their heads. It has happened at such a speed and has caused a

:15:45. > :15:47.significant change in demands, forcing them to respond. We also have to

:15:48. > :15:53.recognise these are companies unlike any we have ever seen in that they

:15:54. > :15:58.are truly global. The next million customers of Facebook will

:15:59. > :16:03.probably be in South Asia, Africa and East Asia. They have to be

:16:04. > :16:11.responsive to all of that as well. We rightly demand more of the

:16:12. > :16:17.companies. I am quite sympathetic. I respectfully disagree with you.

:16:18. > :16:25.Facebook had specifically targeted the media, the news business. They

:16:26. > :16:31.replicated the idea of Twitter because they knew that is how people

:16:32. > :16:36.keep coming back to the Facebook platform. News is one of the most

:16:37. > :16:41.addictive things for people to come back to a platform or a service or a

:16:42. > :16:49.brand. They intentionally went ahead and targeted the news business.

:16:50. > :17:00.Between Google and Facebook, 85% of online advertising is going to

:17:01. > :17:07.these two companies. These are not babes in the woods. They are private

:17:08. > :17:13.entities. Private entities have responsibilities. It is not just

:17:14. > :17:18.Facebook. Facebook, Google, Twitter, Microsoft, which owns the Bing search

:17:19. > :17:24.engine and other entities like them need to sit down and figure out what

:17:25. > :17:28.is the right answer. I don't think the individual should bear the

:17:29. > :17:35.burden of trying to make decisions on every single piece of information

:17:36. > :17:42.they access on Facebook or any other service. I want to bring in Andrew.

:17:43. > :17:47.Wikipedia was mentioned. You are keen to jump in. What is the role

:17:48. > :17:52.for Wikipedia, and the role for humans? Wikipedia has been around

:17:53. > :17:56.for 16 years; it first started in 2001. It was seen as an odd

:17:57. > :18:01.experiment on the side of the internet. You were not sure if you

:18:02. > :18:08.could trust it. But it has been in the top ten most visited websites in the

:18:09. > :18:12.world. Wikipedia has stayed at the top. There are debates about how much

:18:13. > :18:17.you should trust it, but we have found that over the years it is

:18:18. > :18:21.still the go-to place, including for folks like Google, who mine it for

:18:22. > :18:25.information. Every time you do a Google search you will find

:18:26. > :18:30.references to Wikipedia. Facebook and Google are employing people who

:18:31. > :18:37.are learning from Wikipedia's lessons. When you read an article you will see a

:18:38. > :18:42.warning: the neutrality of this section is disputed, or: we need more

:18:43. > :18:47.sources to verify this. Wikipedia has said, this is the best we know of

:18:48. > :18:53.this topic. We could use better sourcing and better facts. Facebook

:18:54. > :18:57.in Germany is trying out this experiment of annotating and putting

:18:58. > :19:01.warnings around content. Not necessarily saying this is true or

:19:02. > :19:06.untrue, but to give more guidance to the reader.
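
[Editor's illustration: a minimal Python sketch of the annotate-don't-delete model discussed here - content is never ruled true or untrue; it simply accumulates warnings, Wikipedia-style, that give the reader more guidance. The labels and fields are illustrative.]

```python
from dataclasses import dataclass, field

@dataclass
class AnnotatedPost:
    """A post that is never edited or removed - it only collects warnings."""
    text: str
    warnings: list = field(default_factory=list)

    def flag(self, notice):
        """Attach guidance (e.g. 'neutrality disputed') without touching the text."""
        self.warnings.append(notice)

    def render(self):
        banner = "".join(f"[!] {w}\n" for w in self.warnings)
        return banner + self.text

post = AnnotatedPost("A sensational claim about the election...")
post.flag("The neutrality of this section is disputed.")      # Wikipedia-style
post.flag("Disputed by third-party fact checkers.")           # Facebook's German trial
print(post.render())
```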

:19:07. > :19:14.What do you think of the idea of crowdsourcing? All of these things should be applied. This is so

:19:15. > :19:17.complicated and it is so early, frankly, in the new ecosystem that

:19:18. > :19:22.we should be testing all of these things out. On the supply side, I

:19:23. > :19:25.think the platforms should do more. I think the responsibility of

:19:26. > :19:32.citizens also has to be explored. The Wikipedia example is a good one.

:19:33. > :19:36.The point that it has lasted longer than most tech start-ups or most tech

:19:37. > :19:39.companies shows that if you provide a service the public

:19:40. > :19:45.respects and demands, there is a market out there. Wikipedia has a

:19:46. > :19:50.financial model. I think we will have to explore all of these things.

:19:51. > :19:57.What we will see in two or three years is that there will be more

:19:58. > :20:03.clashes and that they will become more sophisticated. What do you think

:20:04. > :20:10.about sites that debunk fake news? It is like racing a human against a jet

:20:11. > :20:17.plane. That is what we are dealing with. Fake news is happening at machine

:20:18. > :20:25.scale and fact checking at human scale. We will always have that

:20:26. > :20:28.problem. I am all for it. If we find ourselves here a year from now

:20:29. > :20:33.having this conversation, what do you think we will be talking about

:20:34. > :20:36.when it comes to this issue? I would say we will be talking about the

:20:37. > :20:42.same issue. The problem is much bigger. Other platforms are still

:20:43. > :20:49.dragging their feet. I agree we will still be talking about the problems.

:20:50. > :20:55.I think we will be a little bit more focused on crises and how public

:20:56. > :20:58.opinion can be manipulated in short-duration, high-stakes crises where

:20:59. > :21:02.many of the safeguards we are talking about, which may work over a

:21:03. > :21:07.few days and weeks, will be useless when there is an issue like this.

:21:08. > :21:14.That is what concerns me the most. Can we, as a society, or other

:21:15. > :21:20.societies, be played into an economic crisis? All the things we have been

:21:21. > :21:24.talking about work better with time. We're not there yet in terms of an

:21:25. > :21:33.immediate response to a deliberate destabilisation. Can you get this to

:21:34. > :21:37.work at scale? We know the classic quote: a lie can go halfway

:21:38. > :21:42.around the world before the truth gets its shoes on. Can you

:21:43. > :21:46.make networks of annotation and fact checking that get the good

:21:47. > :21:53.signal to consumers quicker and in a more useful way? It is

:21:54. > :21:57.a big challenge. Twitter is a platform that is proprietary and closed,

:21:58. > :22:00.Facebook is closed. Will they work together to combat fake and

:22:01. > :22:05.inaccurate information? Right now, Wikipedia does it.

:22:06. > :22:11.That is where things end. There are little things going on like the

:22:12. > :22:16.Washington Post but are they going to work together? I am not so sure.

:22:17. > :22:24.I look forward to revisiting this at another time. Goodbye from all of us

:22:25. > :22:30.here in the studio. That is it from this edition of Talking Business.

:22:31. > :22:36.Next week Tania Beckett will be in London talking about how brands can

:22:37. > :22:39.survive in an era of algorithms and fake content. For now, it is

:22:40. > :22:55.goodbye.