Line | From | To | |
---|---|---|---|
Did false stories have an impact on the outcome | :00:00. | :00:10. | |
Ever since that question came up, fake news has become, | :00:11. | :00:14. | |
On this week's Talking Business, we examine how fake news is spread | :00:15. | :00:23. | |
and what, if anything, should be done to rein it in. | :00:24. | :00:50. | |
Welcome to the programme. I'm Michelle Fleury in Washington, DC. | :00:51. | :00:57. | |
From politics to the media to technology companies, lately a lot | :00:58. | :00:58. | |
of people are worked up about the spreading of | :00:59. | :01:09. | |
lies as news. The problem is made worse | :01:10. | :01:12. | |
because there is money to be made in the fake news business. This is how | :01:13. | :01:14. | |
it works. So, who is responsible for the boom, | :01:15. | :02:14. | |
and the containment of fake news? Here to help me investigate, | :02:15. | :02:19. | |
we're joined from San Francisco by a web and technology writer | :02:20. | :02:23. | |
and part-time entrepreneur. In the studio with me | :02:24. | :02:30. | |
are the director of the museum in Washington, | :02:31. | :02:33. | |
as well as a Wikipedia editor. Gentlemen, thank you | :02:34. | :02:37. | |
very much for joining me. If I could start with a question, | :02:38. | :02:46. | |
Andrew, to you... How should we be | :02:47. | :02:50. | |
defining fake news? I talked about it | :02:51. | :02:54. | |
as being about lies. It is too broad | :02:55. | :02:56. | |
a definition in many ways. One of the problems with the term | :02:57. | :03:01. | |
fake news is it encompasses many different types of news sources, | :03:02. | :03:04. | |
websites you find. So, I think, one of the better | :03:05. | :03:10. | |
pieces written about this tried to break down the whole | :03:11. | :03:16. | |
phenomenon of fake news into folks that might be intentionally | :03:17. | :03:19. | |
creating news for profit, so we saw there were some operations | :03:20. | :03:22. | |
out of Macedonia that did this. You have partisan sites. | :03:23. | :03:26. | |
A whole bunch of different sites | :03:27. | :03:29. | |
fall under this. It's important to understand there | :03:30. | :03:33. | |
are different motivations for these. It is hard to paint everything | :03:34. | :03:37. | |
as one fake news phenomenon. I think we're only very early | :03:38. | :03:41. | |
in this news ecosystem, where a lot more people have | :03:42. | :03:51. | |
their voices heard. But it also opens this up | :03:52. | :03:56. | |
to the possibility of people deliberately polluting | :03:57. | :03:59. | |
or sabotaging the news. I think that's a long-term | :04:00. | :04:03. | |
concern in many ways but it's been brought about, | :04:04. | :04:08. | |
been highlighted, in recent times because of the speed | :04:09. | :04:10. | |
and velocity of social media. You heard about the speed | :04:11. | :04:13. | |
and velocity of social media. I think fake news, as Andrew | :04:14. | :04:19. | |
defined, or tried to define, is actually the right way | :04:20. | :04:29. | |
to think about it. We have fake news coming out | :04:30. | :04:31. | |
of places like Macedonia, which is essentially no | :04:32. | :04:35. | |
different from spam. The rest of it is essentially | :04:36. | :04:39. | |
propaganda. I think that is the reality | :04:40. | :04:41. | |
of the news ecosystem right now, that it works at a much faster pace | :04:42. | :04:49. | |
and at a much larger scale. So, it kind of blurs the boundaries | :04:50. | :04:52. | |
between real and fake and real and unreal, | :04:53. | :05:06. | |
and truths and half truths. I think part of the new era | :05:07. | :05:08. | |
is that there are no gatekeepers. It used to be only a generation | :05:09. | :05:12. | |
ago that you would have Walter Cronkite, and a few | :05:13. | :05:22. | |
other anchors, and in the United States a few very well | :05:23. | :05:24. | |
respected national newspapers. This is likely to be true, | :05:25. | :05:30. | |
this is likely not to be. Now, for all kinds of good reasons, | :05:31. | :05:35. | |
those gatekeepers have been torn down and we have a much more vibrant | :05:36. | :05:38. | |
and dynamic news system. In the early days of this new era, | :05:39. | :05:41. | |
we don't have the gatekeepers. In many ways, it's going to be up | :05:42. | :05:44. | |
to the individual now to be a much more intentional news | :05:45. | :05:50. | |
consumer than in the past. Aren't you letting the | :05:51. | :06:00. | |
platforms, the Facebooks and the Googles, off the hook? I think they are now beginning | :06:01. | :06:02. | |
to acknowledge their new role in this information ecosystem, | :06:03. | :06:10. | |
that they're not just pipes by which they distribute information | :06:11. | :06:14. | |
and posts to people, but that they mediate, | :06:15. | :06:17. | |
they edit, curate the news. At the end of the day, | :06:18. | :06:20. | |
I think that the social media platforms can do a significant | :06:21. | :06:24. | |
amount. If consumers demand good quality | :06:25. | :06:27. | |
news, the algorithms and the | :06:28. | :06:31. | |
platforms will provide it to them. One thing that the traditional | :06:32. | :06:33. | |
gatekeepers have to do is to make themselves relevant in a new age | :06:34. | :06:37. | |
of technology and social media. As was mentioned, things | :06:38. | :06:43. | |
are happening faster on social media. The most important thing | :06:44. | :06:45. | |
is that it is uncontextualised. These are tweets, these | :06:46. | :06:52. | |
are Instagram posts, they are Facebook utterances | :06:53. | :07:00. | |
and they are not 3,000-5,000 word articles. They come in isolation, in little bits, | :07:01. | :07:02. | |
in parts of the internet. What you are seeing | :07:03. | :07:05. | |
are some experiments, like the Washington Post | :07:06. | :07:08. | |
for example. It says we are providing a new | :07:09. | :07:09. | |
plug-in for the Chrome browser. So, whenever you look | :07:10. | :07:18. | |
at a tweet from Donald Trump, we're going to put a little fact | :07:19. | :07:20. | |
check right underneath it. Not too many people will | :07:21. | :07:23. | |
use this, obviously. How many people are using a Chrome | :07:24. | :07:27. | |
desktop browser and are installing plug-ins? It's kind of interesting | :07:28. | :07:30. | |
that they are experimenting. We have also seen that Google | :07:31. | :07:32. | |
and Facebook have realised that just being a platform, | :07:33. | :07:36. | |
and just link sharing with no responsibility, | :07:37. | :07:38. | |
is also not a great thing. They are almost what we call | :07:39. | :07:40. | |
accidental or unintentional gatekeepers. Because they are sharing the news | :07:41. | :07:42. | |
but they are not doing the fact checking or flagging of content | :07:43. | :07:46. | |
in a way that we see. A lot of the moves we are seeing | :07:47. | :07:49. | |
from Facebook and Google right now, annotating and flagging, are trying | :07:50. | :07:53. | |
to get more information to the reader, because readers | :07:54. | :07:55. | |
are demanding that now. I do believe the new gatekeepers | :07:56. | :07:57. | |
are the social media platforms. They have to make sure that | :07:58. | :08:02. | |
nefarious information does not spread. They need to figure out the stuff | :08:03. | :08:08. | |
coming out of places like Macedonia. The for-profit spam sites should be | :08:09. | :08:21. | |
stopped right away - they have | :08:22. | :08:28. | |
the infrastructure to do that. The fact they did not do it tells me | :08:29. | :08:31. | |
they dropped the ball. Even just to kind of say | :08:32. | :08:34. | |
that we are looking into the fake news, I think it is a lot | :08:35. | :08:37. | |
of lip service. In their universe, more attention | :08:38. | :08:40. | |
to whatever people are doing, or reading on their platform, | :08:41. | :08:42. | |
equals more profit. I want to talk about | :08:43. | :08:44. | |
the decentralisation of news. From rolling news to | :08:45. | :08:48. | |
technology companies becoming the new gatekeepers. How much of a role has that played | :08:49. | :08:52. | |
in what we have seen today? I think we see the | :08:53. | :09:00. | |
abundance of news sites. In many ways this is | :09:01. | :09:03. | |
the golden age of news. More people have access | :09:04. | :09:09. | |
to more sites and more information than ever before, | :09:10. | :09:11. | |
if they are willing to undertake the effort. If they just want to be passive | :09:12. | :09:13. | |
and have things come at them, that is a problem. I agree that the technology | :09:14. | :09:24. | |
companies have a role. I don't want teenagers | :09:25. | :09:27. | |
in Macedonia flooding our news feeds. But I don't want to assign too much | :09:28. | :09:29. | |
authority to Facebook and Google. They are doing what's best | :09:30. | :09:33. | |
for their shareholders. But the algorithmically-derived | :09:34. | :09:36. | |
results they produce are not particularly transparent and no one | :09:37. | :09:44. | |
elected them to become the gatekeepers. I don't think you can mitigate | :09:45. | :09:47. | |
the responsibility that individuals have with all this access, | :09:48. | :09:58. | |
we're now going to have to pay more attention. Later in the programme, can | :09:59. | :10:01. | |
algorithms help wipe out fake news? Algorithms help determine | :10:02. | :10:15. | |
which stories take precedence over others. Our comedy consultant has been | :10:16. | :10:17. | |
taking a closer look. In this week's Talking Point, | :10:18. | :10:23. | |
he examines the role played by algorithms. You've probably seen it written more | :10:24. | :10:25. | |
than you heard it said. Usually written by someone | :10:26. | :10:34. | |
on Facebook shouting virtually. Now, sometimes people shout fake | :10:35. | :10:37. | |
news in a Facebook comment purely because they disagree | :10:38. | :10:46. | |
with the previous Facebook comment. But, real fake news, | :10:47. | :10:49. | |
if you know what I mean, that's where a company, | :10:50. | :10:53. | |
or just two guys with a laptop, set up a website that looks | :10:54. | :10:56. | |
like a plausible news website and churn out stories that | :10:57. | :10:59. | |
look sort of possible. How does that fake news | :11:00. | :11:02. | |
and those made up stories get spread? A former vice president at Intel | :11:03. | :11:05. | |
here in Ireland and something of a tech and start-up guru tells me | :11:06. | :11:12. | |
it's something to do with algorithms. The beauty of algorithms | :11:13. | :11:15. | |
is they neither know, nor care. Likewise, in the news space, | :11:16. | :11:21. | |
if there are news articles which are being clicked on out | :11:22. | :11:29. | |
there and people with interests like mine are looking at real news | :11:30. | :11:32. | |
items, or they are looking at false news items, or they are looking | :11:33. | :11:36. | |
at who knows what news items. As far as the algorithm is | :11:37. | :11:46. | |
concerned, this is a high hit rate for somebody like Philip, | :11:47. | :11:58. | |
so I'm going to present that to him. Wouldn't you think though that | :11:59. | :12:00. | |
rational thinking human beings would do a little bit of verification | :12:01. | :12:04. | |
on the stories that they are served? The problem is, we don't | :12:05. | :12:07. | |
do that verification. We're too happy to have our biases | :12:08. | :12:12. | |
confirmed so when we see something plausible on our phones | :12:13. | :12:15. | |
we just share it. So, controlling fake news | :12:16. | :12:17. | |
is going to have to be automated. Fact Matter is a Google-backed | :12:18. | :12:19. | |
project for automated fact checking. Our mission is to solve | :12:20. | :12:25. | |
misinformation in online content. Solving online misinformation | :12:26. | :12:28. | |
sounds like a noble cause. We're focusing on sentence | :12:29. | :12:30. | |
level fact checking. This is identifying claims | :12:31. | :12:33. | |
that appear in text. Machine intelligence | :12:34. | :12:35. | |
comes in, using natural language processing, | :12:36. | :12:42. | |
to understand effectively what that claim is | :12:43. | :12:43. | |
about, what the sources are that you could use to check that claim, | :12:44. | :12:54. | |
and the hardest part, which is linking | :12:55. | :12:57. | |
that claim to a fact. It will be a browser extension | :12:58. | :12:59. | |
which allows you to see highlighted sentence level claims | :13:00. | :13:02. | |
about statistics and it will link them to sources, | :13:03. | :13:05. | |
for example the World Bank, enabling you to verify | :13:06. | :13:08. | |
those claims immediately. Our extension could basically look | :13:09. | :13:14. | |
at a piece of content and say this is likely to be misleading because, | :13:15. | :13:20. | |
overall, this piece of news contains a number of misleading claims. The only problem with this solution | :13:21. | :13:24. | |
is it may not be able to tell the difference between fake news | :13:25. | :13:33. | |
and when someone is just trying to be funny, which is a problem | :13:34. | :13:36. | |
I face myself quite often. Touching on the role of algorithms | :13:37. | :13:43. | |
in fake news there. You can see more of his short films | :13:44. | :13:47. | |
on our website. Back here in Washington, | :13:48. | :13:54. | |
in the rapid and seemingly uncontrollable spread of fake news, | :13:55. | :13:56. | |
modern technology has played a part. Could it also turn out to be | :13:57. | :14:02. | |
part of the solution? Doctor Geoffrey from the Museum, | :14:03. | :14:06. | |
Professor Andrew Lee of Wikipedia and the technology writer | :14:07. | :14:18. | |
from San Francisco. Are we right to blame algorithms | :14:19. | :14:20. | |
for the problem of fake news? You know, algorithms only do what we | :14:21. | :14:35. | |
specify. They are just a set of instructions we have written. | :14:36. | :14:39. | |
But coupled with machine learning, there are all types of | :14:40. | :14:43. | |
things happening we may not know of. It is about the machine making | :14:44. | :14:47. | |
determinations without human eyeballs double-checking them or | :14:48. | :14:49. | |
steering them. We had a problem recently where Google was giving | :14:50. | :14:53. | |
search results for "did the Holocaust happen?" with some very | :14:54. | :14:59. | |
unsavoury sites at the top of those rankings. Google has repeatedly over | :15:00. | :15:03. | |
the years said, sorry, that is our algorithm. If you want to see | :15:04. | :15:06. | |
something different at the top, people are going to have to go on the | :15:07. | :15:11. | |
internet and make it happen. Google did tweak the | :15:12. | :15:15. | |
algorithm in the end so Holocaust deniers were not at the | :15:16. | :15:19. | |
top of the search results. I'm sympathetic to the companies. They | :15:20. | :15:23. | |
have to recognise this is very early on. Three or four years ago I | :15:24. | :15:27. | |
do not think Facebook was thinking, we are going to become the platform | :15:28. | :15:30. | |
by which most millennials get their news. I do not think that | :15:31. | :15:35. | |
was in their heads. It has happened at such a speed and has caused a | :15:36. | :15:44. | |
significant change in demands, forcing them to respond. We also have to | :15:45. | :15:47. | |
recognise these are companies unlike any we have ever seen in that they | :15:48. | :15:53. | |
are truly global. The next million customers of Facebook will | :15:54. | :15:58. | |
probably be in South Asia, Africa and East Asia. They have to be | :15:59. | :16:03. | |
responsive to all of that. But we also rightly demand more of the | :16:04. | :16:11. | |
companies. I am quite sympathetic. I respectfully disagree with you. | :16:12. | :16:17. | |
Facebook had specifically targeted the media, the news business. They | :16:18. | :16:25. | |
replicated the idea of Twitter because they knew that is how people | :16:26. | :16:31. | |
keep coming back to the Facebook platform. News is one of the most | :16:32. | :16:36. | |
addictive things for people to come back to a platform or a service or a | :16:37. | :16:41. | |
brand. They intentionally went ahead and targeted the news business. | :16:42. | :16:49. | |
Between Google and Facebook, 85% of online advertising is going to | :16:50. | :17:00. | |
these two companies. These are not babes in the woods. They are private | :17:01. | :17:07. | |
entities, and private entities have responsibilities. It is not just | :17:08. | :17:13. | |
Facebook. Facebook, Google, Twitter, Microsoft, which owns the Bing search | :17:14. | :17:18. | |
engine and other entities like them need to sit down and figure out what | :17:19. | :17:24. | |
is the right answer. I don't think the individual should bear the | :17:25. | :17:28. | |
burden of trying to make decisions on every single piece of information | :17:29. | :17:35. | |
they access on Facebook or any other service. I want to bring in Andrew. | :17:36. | :17:42. | |
Wikipedia was mentioned. You are keen to jump in. What is the role | :17:43. | :17:47. | |
for Wikipedia and the role for humans? Wikipedia has been around | :17:48. | :17:52. | |
for 16 years. When it first started in 2001, it was seen as an odd | :17:53. | :17:56. | |
experiment on the side of the internet. People were not sure if they | :17:57. | :18:01. | |
could trust it. It has been in the top ten most visited websites in the | :18:02. | :18:08. | |
world. Wikipedia has stayed at the top. There are debates about how much | :18:09. | :18:12. | |
you should trust it, but we have found that over the years it is | :18:13. | :18:17. | |
still the go-to place, including for folks like Google, who mine it for | :18:18. | :18:21. | |
information. Every time you do a Google search you will find | :18:22. | :18:25. | |
references to Wikipedia. Facebook and Google are employing people who | :18:26. | :18:30. | |
are learning from Wikipedia's lessons. When you read an article you will see a | :18:31. | :18:37. | |
warning that the neutrality of a section is disputed, or: we need more | :18:38. | :18:42. | |
sources to verify this. Wikipedia has said, this is the best we know of | :18:43. | :18:47. | |
this topic. We could use better sourcing and better facts. Facebook | :18:48. | :18:53. | |
in Germany is trying out this experiment of annotating and putting | :18:54. | :18:57. | |
warnings around content. Not necessarily saying this is true or | :18:58. | :19:01. | |
untrue but to give more guidance to the reader. What do you think of the | :19:02. | :19:06. | |
idea of crowd sourcing? All of these things should be applied. This is so | :19:07. | :19:14. | |
complicated and it is so early, frankly, in the news ecosystem that | :19:15. | :19:17. | |
we should be testing all of these things out. On the supply side, I | :19:18. | :19:22. | |
think the platforms should do more. I think the responsibility of | :19:23. | :19:25. | |
citizens also has to be explored. The Wikipedia example is a good one. | :19:26. | :19:32. | |
The fact it has lasted longer than most tech start-ups or most tech | :19:33. | :19:36. | |
companies shows that if people provide a service the public | :19:37. | :19:39. | |
respects and demands, there is a market out there. Wikipedia has a | :19:40. | :19:45. | |
financial model. I think we will have to explore all of these things. | :19:46. | :19:50. | |
What we will see in two or three years is that there will be more | :19:51. | :19:57. | |
clashes and that they will become more sophisticated. What do you think | :19:58. | :20:03. | |
about sites who debunk fake news? It is like racing a human against a jet | :20:04. | :20:10. | |
plane. That is what we are dealing with. Fake news is happening at that | :20:11. | :20:17. | |
scale and fact checking at human scale. We will always have that | :20:18. | :20:25. | |
problem. I am all for it. If we find ourselves here a year from now | :20:26. | :20:28. | |
having this conversation, what do you think would be talking about | :20:29. | :20:33. | |
when it comes to this issue? I would say we will be talking about the | :20:34. | :20:36. | |
same issue. The problem is much bigger. Other platforms are still | :20:37. | :20:42. | |
dragging their feet. I agree we will still be talking about the problems. | :20:43. | :20:49. | |
I think we will be a little bit more focused on crises and how public | :20:50. | :20:55. | |
opinion can be manipulated in short duration, high stake crises where | :20:56. | :20:58. | |
many of the safeguards we are talking about which may work over a | :20:59. | :21:02. | |
few days and weeks will be useless when there is an issue like this. | :21:03. | :21:07. | |
That is what concerns me the most. Can we, as a society, or other | :21:08. | :21:14. | |
societies be played in an economic crisis? All the things we have been | :21:15. | :21:20. | |
talking about work better with time. We're not there yet in terms of an | :21:21. | :21:24. | |
immediate response to a deliberate destabilisation. Can you get this to | :21:25. | :21:33. | |
work at scale? We know the classic quote: a lie can go halfway | :21:34. | :21:37. | |
around the world before the truth gets its shoes on. Can you | :21:38. | :21:42. | |
make networks of annotation and fact checking that get the good | :21:43. | :21:46. | |
signal to consumers quicker and in a more useful way? It is | :21:47. | :21:53. | |
a big challenge. Twitter is a platform that is proprietary and closed, | :21:54. | :21:57. | |
Facebook is closed. Will they work together to combat fake and | :21:58. | :22:00. | |
inaccurate information? Right now Wikipedia does it. | :22:01. | :22:05. | |
That is where things end. There are little things going on like the | :22:06. | :22:11. | |
Washington Post but are they going to work together? I am not so sure. | :22:12. | :22:16. | |
I look forward to revisiting this at another time. Goodbye from all of us | :22:17. | :22:24. | |
here in the studio. That is it from this edition of Talking Business. | :22:25. | :22:30. | |
Next week Tania Beckett will be in London talking about how brands can | :22:31. | :22:36. | |
survive in an era of algorithms and fake content. For now, it is | :22:37. | :22:39. | |
goodbye. | :22:40. | :22:55. | |