Line | From | To | |
---|---|---|---|
Welcome to HARDtalk. Thanks to Edward Snowden's leaking of American | :00:11. | :00:18. | |
intelligence secrets, the whole world now knows the extent of US and | :00:19. | :00:23. | |
UK surveillance of global phone and internet traffic. Have those | :00:24. | :00:30. | |
revelations flagged up a corrosive infringement of Civil Liberties or | :00:31. | :00:34. | |
undermined efforts to protect the world from terrorism? My guest is | :00:35. | :00:38. | |
Glenn Greenwald, who broke the Snowden story. His mission, he says, | :00:39. | :00:43. | |
is to hold power to account. But is this a journalistic crusade that has | :00:44. | :00:47. | |
gone too far? Glenn Greenwald in Rio, welcome to | :00:48. | :01:23. | |
HARDtalk. Great to be with you. Let me start not with the specifics of | :01:24. | :01:28. | |
the Edward Snowden story, but just asking you about the way you see | :01:29. | :01:31. | |
your role as a journalist. Is it fair to say that you don't believe | :01:32. | :01:37. | |
in the impartiality or the objectivity of the journalist? I | :01:38. | :01:44. | |
don't believe any human beings are impartial, I think we all view the | :01:45. | :01:48. | |
world through subjective prisms that are the by-product of a whole | :01:49. | :01:52. | |
variety of factors, cultural, socioeconomic and the like. The | :01:53. | :01:55. | |
question is not if we have opinions or not, the question is if we are | :01:56. | :02:00. | |
honest about the assumptions we have embraced or do we dishonestly | :02:01. | :02:05. | |
pretend we are something we are not? So, honestly, what is your own | :02:06. | :02:12. | |
subjective prism? I think individuals should not be monitored | :02:13. | :02:16. | |
and have dossiers compiled about them, or be analysed by the state, | :02:17. | :02:23. | |
unless there is evidence that they have engaged in wrongdoing or there is | :02:24. | :02:26. | |
reason to believe that they are planning to do so. That is what it | :02:27. | :02:32. | |
means to be a free individual: to have a private realm that the state does not | :02:33. | :02:36. | |
intrude into. That's an important point, we will come back to it. | :02:37. | :02:40. | |
Maybe an even bigger question, at the beginning, do you believe there | :02:41. | :02:46. | |
should be an assumption of trust between the citizen and those | :02:47. | :02:53. | |
responsible for national security in a democracy like the United States | :02:54. | :02:57. | |
or the United Kingdom? Absolutely not. If you look at the people that | :02:58. | :03:01. | |
have founded the United States, what they were most worried about was | :03:02. | :03:06. | |
having a system of government in which the power of leaders was | :03:07. | :03:13. | |
constrained by constitutions, legal institutions and checks and | :03:14. | :03:16. | |
balances, not by citizens simply assuming that they were good people | :03:17. | :03:20. | |
and would not abuse the power, even if they were acting in the dark. | :03:21. | :03:26. | |
They experienced the exact opposite, as have the Enlightenment and | :03:27. | :03:31. | |
centuries of political science: that institutions run by human | :03:32. | :03:35. | |
beings cannot be trusted to exercise power in the dark, without | :03:36. | :03:39. | |
accountability, without abusing it. Your default position when it comes | :03:40. | :03:44. | |
to senior executives in positions of power, your default position is that | :03:45. | :03:51. | |
you do not trust a word they say? My default position is, as a journalist | :03:52. | :03:55. | |
and as a rational human being, that when people in power make | :03:56. | :03:59. | |
claims they ought to have evidence to support those claims or they | :04:00. | :04:04. | |
should be treated with great scepticism. The role of the media, | :04:05. | :04:09. | |
journalism, is to investigate those claims made by people in power, subject | :04:10. | :04:13. | |
them to critical scrutiny and investigate to determine if they are | :04:14. | :04:16. | |
true, rather than blindly assuming that they are true. Let's get to | :04:17. | :04:21. | |
specifics and the sense of investigation that now surrounds | :04:22. | :04:24. | |
Edward Snowden, everything he has revealed about the activities of not | :04:25. | :04:32. | |
only the National Security Agency in the United States but GCHQ and other key | :04:33. | :04:41. | |
institutions in the US and the UK. Just explain to me how Snowden | :04:42. | :04:46. | |
reached out to you and how you decided he was the ultimate credible | :04:47. | :04:51. | |
source. He reached out to me in December 2012, asking if I could | :04:52. | :04:57. | |
install encryption technology so we could communicate securely. We | :04:58. | :05:00. | |
talked for about a month about the prospect of doing so. I never | :05:01. | :05:05. | |
actually ended up doing that. He went to Laura Poitras, who did have | :05:06. | :05:09. | |
that technology. He spoke to her for a month or so and asked for me to become | :05:10. | :05:15. | |
involved. I asked him to provide me with documents that would verify the | :05:16. | :05:17. | |
authenticity of the claims he was making. He sent me two dozen or so | :05:18. | :05:25. | |
secret NSA documents that were quite shocking in what they | :05:26. | :05:29. | |
revealed. I flew to Hong Kong, I met with him and vetted him | :05:30. | :05:33. | |
extensively. I determined that he was who he said he was and that the | :05:34. | :05:39. | |
documents were valid. You vetted him extensively. You make it sound | :05:40. | :05:45. | |
forensic. But how on earth did you ultimately decide, not | :05:46. | :05:48. | |
just that this guy was giving you credible information, that is the relatively | :05:49. | :05:53. | |
easy part, but that it was the right thing to do, both for | :05:54. | :05:57. | |
him and for you, to press ahead with publication? Because I saw in the | :05:58. | :06:05. | |
documents shocking revelations about what is being done to people's | :06:06. | :06:09. | |
privacy. Not just in my country, but around the world. That this massive | :06:10. | :06:13. | |
system of surveillance and spying had been built without the people and | :06:14. | :06:16. | |
the countries whose governments were building it having any idea that it | :06:17. | :06:21. | |
was taking place. For me, it was a very easy call as a journalist to | :06:22. | :06:25. | |
understand that these documents are critically important for people in | :06:26. | :06:28. | |
democracies to learn about and to know about. We understood that they | :06:29. | :06:34. | |
had to be reported carefully. We have reported on them carefully. The | :06:35. | :06:40. | |
question of whether people around the world should know that all the | :06:41. | :06:43. | |
communications are being collected by foreign governments and their own, | :06:44. | :06:48. | |
of course that is not up for debate, we have a right to know that. I wonder | :06:49. | :06:51. | |
if it gave you pause, Bob Woodward, one of the famous investigative | :06:52. | :06:57. | |
journalists of the modern era, he looked at how you handled it and | :06:58. | :07:00. | |
concluded he would do things very differently. He said he would have | :07:01. | :07:06. | |
pretty much insisted that Edward Snowden remain anonymous. He said it | :07:07. | :07:09. | |
would be more effective and better for him if he remained a protected, | :07:10. | :07:13. | |
anonymous source. He also says, again, given the weight of | :07:14. | :07:17. | |
everything that was being revealed, that you should have taken more time | :07:18. | :07:21. | |
and more care, been more strategic about the way that you put | :07:22. | :07:28. | |
information into the public domain. If Bob Woodward approved of the | :07:29. | :07:32. | |
reporting I was doing I would be extremely alarmed. This is somebody | :07:33. | :07:35. | |
who has become very, very rich, probably the world's richest | :07:36. | :07:41. | |
journalist, by doing little more than spilling America's top secrets | :07:42. | :07:45. | |
in his books, fed to him by government officials designed to | :07:46. | :07:48. | |
venerate the US government and the policies it is pursuing. He has | :07:49. | :07:54. | |
become the ultimate establishment mouthpiece. There is no doubt Edward | :07:55. | :07:59. | |
Snowden thought you were the go-to guy for this. What you have told the | :08:00. | :08:03. | |
world is that there has been extensive, globalised secret | :08:04. | :08:09. | |
spying, engineered by the United States and the UK. In different | :08:10. | :08:12. | |
times, in different ways, it has involved some of the biggest web-based | :08:13. | :08:17. | |
and communications corporations in the world. It has | :08:18. | :08:21. | |
also, at different times, targeted key individuals, some of them key | :08:22. | :08:25. | |
individuals that are friends of the US and the UK. Fascinating stories. | :08:26. | :08:29. | |
Here is what Edward Snowden says about the import of what he has | :08:30. | :08:34. | |
revealed. What we show is a dragnet, mass surveillance that puts entire | :08:35. | :08:44. | |
populations under an all-seeing eye. It hurts our country, it hurts our | :08:45. | :08:51. | |
freedom to speak and communicate freely. Do you believe that to be | :08:52. | :08:57. | |
true? There is no question it is true. What the documents | :08:58. | :09:01. | |
reveal, what the ultimate point is, leaving aside the individual | :09:02. | :09:04. | |
details and all of the individual stories, is that the goal of the United | :09:05. | :09:08. | |
States government and the UK Government, its closest ally, is to | :09:09. | :09:13. | |
eliminate all privacy globally. By which I mean to make every form of | :09:14. | :09:17. | |
electronic communication between all human beings collected, stored, | :09:18. | :09:26. | |
analysed and monitored by the US and its four English-speaking partners | :09:27. | :09:29. | |
in the surveillance world. Let me just be clear. You are saying | :09:30. | :09:35. | |
important things. But they are not going through my e-mail, your e-mail | :09:36. | :09:38. | |
or any other citizen's e-mail and looking at the content? Apart from | :09:39. | :09:44. | |
anything else, it would be utterly impossible to analyse, your word, | :09:45. | :09:48. | |
analyse the content of all this data? What you just said is | :09:49. | :09:55. | |
completely factually false. They absolutely are looking at the | :09:56. | :09:59. | |
content of my e-mail. I know that because of the documents that were | :10:00. | :10:03. | |
filed by the UK Government. You may be a special case, I will grant you | :10:04. | :10:07. | |
that straightaway. You almost certainly are a special case now. | :10:08. | :10:12. | |
But I am thinking of my mother, my father, the millions and billions of | :10:13. | :10:16. | |
people across the world who now use the internet. They are not all, to | :10:17. | :10:21. | |
go back to Edward Snowden's words, they are not now having their | :10:22. | :10:23. | |
ability to speak, think and associate freely threatened by this | :10:24. | :10:31. | |
metadata analysis. You are absolutely wrong. If you look at | :10:32. | :10:37. | |
what communications experts say, they will say that metadata is a | :10:38. | :10:43. | |
more invasive form of surveillance than even listening to the content | :10:44. | :10:46. | |
because of what can be done with metadata at the moment. Think about | :10:47. | :10:49. | |
if your daughter decides that she wants to get an abortion, if your | :10:50. | :11:01. | |
best friend has HIV. If they listen to the telephone calls, they | :11:02. | :11:07. | |
will hear them talking to the doctor, whose specialisation they | :11:08. | :11:10. | |
will not even know. If they collect metadata, they will see that the | :11:11. | :11:14. | |
woman called an abortion clinic, or a friend has called somebody | :11:15. | :11:18. | |
specialising in HIV treatment, or somebody has called a drug | :11:19. | :11:23. | |
addiction or suicide hotline. There is a comprehensive picture that can | :11:24. | :11:26. | |
be assembled about you by knowing who is e-mailing you, who you are talking | :11:27. | :11:32. | |
to, how long you are speaking; these patterns that emerge are more | :11:33. | :11:38. | |
revealing than even the content. That is to say nothing of having your | :11:39. | :11:42. | |
internet history monitored, your Google search terms collected, everything | :11:43. | :11:48. | |
that gives an indication of what you are interested in, what you are | :11:49. | :11:52. | |
reading, an incredibly invasive picture. Let's talk about the | :11:53. | :11:56. | |
process of how it was decided what to publish and what not to. You | :11:57. | :12:00. | |
worked particularly with the Guardian newspaper, you are no | :12:01. | :12:03. | |
longer with them but you worked with them for a long time. You talked | :12:04. | :12:06. | |
about a process involving editors, outside advisers. Was there absolute | :12:07. | :12:13. | |
consensus on what to publish and what not? | :12:14. | :12:20. | |
There was ultimately consensus. We sometimes began with different views | :12:21. | :12:28. | |
on which parts of documents should be withheld or published, but at the | :12:29. | :12:36. | |
end of the day, we met with editors, journalists, consulted with | :12:37. | :12:40. | |
experts and reached a consensus about the most responsible way to | :12:41. | :12:47. | |
report. Let me insert one critical fact. We have in our possession many | :12:48. | :12:54. | |
many thousands of documents we got from Mr Snowden. I believe the grand | :12:55. | :13:00. | |
total of documents we published so far is 250, a tiny fraction of the | :13:01. | :13:07. | |
amount of material, which shows how responsible we have been. | :13:08. | :13:11. | |
That raises important questions. You have only published a tiny fraction | :13:12. | :13:17. | |
of the huge number of top-secret documents that you have in your | :13:18. | :13:24. | |
possession. So, question number one, who owns those documents | :13:25. | :13:28. | |
intellectually, who has ultimate control? You no longer work for the | :13:29. | :13:33. | |
Guardian, Edward Snowden is stuck in Moscow. Who actually controls this | :13:34. | :13:38. | |
top-secret information which has yet to be published? | :13:39. | :13:46. | |
The journalists whom he trusted. You? Myself, Laura Poitras, the | :13:47. | :13:58. | |
Guardian, the Washington Post. The world's largest and most respected | :13:59. | :14:05. | |
Western newspapers. But, you were the first he turned | :14:06. | :14:12. | |
to. Do you have most of the documents? | :14:13. | :14:18. | |
Myself and Laura Poitras have the full set of documents, others have | :14:19. | :14:23. | |
portions of them. Do you believe you have the right, you no longer work | :14:24. | :14:33. | |
for the Guardian, do you believe you and Laura Poitras have the right to | :14:34. | :14:36. | |
decide going forward what further to publish? | :14:37. | :14:44. | |
We are working with media outlets in making those choices. Even though I | :14:45. | :14:50. | |
am no longer at the Guardian, we have started our own media outlet | :14:51. | :14:55. | |
with the most experienced and respected editors and journalists | :14:56. | :14:59. | |
already working with our organisation. When I reported in | :15:00. | :15:04. | |
foreign countries, I worked with some of the largest and most | :15:05. | :15:10. | |
respected media establishments in those countries. I worked with their | :15:11. | :15:15. | |
editors, lawyers, to make these decisions collaboratively. There has | :15:16. | :15:19. | |
never been a single document published because I, myself, have | :15:20. | :15:25. | |
decided. We worked in a journalistic structure. | :15:26. | :15:29. | |
On your own admission, you have, in your possession, thousands and | :15:30. | :15:33. | |
thousands of top-secret documents which, at least for now, | :15:34. | :15:38. | |
you believe are so sensitive that they should not be released into the | :15:39. | :15:43. | |
public domain. If they were, it would harm the security of key | :15:44. | :15:48. | |
countries, even the US. If they fell into the wrong hands, they could be | :15:49. | :15:54. | |
extremely dangerous. That puts you in an extraordinarily difficult, | :15:55. | :15:58. | |
some would say vulnerable and exposed, position. Do you | :15:59. | :16:01. | |
acknowledge that? Sure, when you do journalism, you | :16:02. | :16:08. | |
are in difficult positions, you challenge powerful factions, you go | :16:09. | :16:13. | |
to war zones. It is a dangerous profession. It isn't about you as a | :16:14. | :16:17. | |
journalist. You control information which is of | :16:18. | :16:24. | |
vital national security interest to hundreds of millions of people who | :16:25. | :16:29. | |
live in the US, UK and other countries. Surely the wisest course | :16:30. | :16:36. | |
of action for you as a human being and journalist, is to return that | :16:37. | :16:41. | |
information to where it came from? You have decided the information is so | :16:42. | :16:44. | |
sensitive it can never be published, so should you not get it | :16:45. | :16:49. | |
out of your possession and return it? | :16:50. | :16:54. | |
No, first of all, the people who can't be trusted to safeguard the | :16:55. | :16:58. | |
security of that information are called the NSA and GCHQ, who are so | :16:59. | :17:04. | |
reckless they put it on systems accessed by tens of thousands of | :17:05. | :17:08. | |
people, and lost control. We have maintained tight control. Secondly, | :17:09. | :17:15. | |
I never said all of the documents which have not been published should not | :17:16. | :17:20. | |
be published. There is a lot of reporting I intend to do in | :17:21. | :17:24. | |
publishing these documents. How much more? I don't have an exact | :17:25. | :17:31. | |
number but I can tell you if I had to guess, we are still in the first | :17:32. | :17:37. | |
half of the reporting, with the majority of the reporting on these documents still to come. | :17:38. | :17:46. | |
I asked so much about the security and ownership of the secrets that | :17:47. | :17:53. | |
you possess, because the argument of the security chiefs in the US and | :17:54. | :17:58. | |
here in the UK is that they have to assume that the secrets you now | :17:59. | :18:03. | |
possess are no longer secure. They point to things, for example, you | :18:04. | :18:07. | |
have told the New York Times, "I am borderline illiterate on matters of | :18:08. | :18:12. | |
computer encryption", you said in the summer. You said you have someone | :18:13. | :18:16. | |
well regarded handling your computer security for you. When they hear that, and | :18:17. | :18:23. | |
see your partner David Miranda carrying flash drives with thousands | :18:24. | :18:28. | |
of secrets on airlines, passwords on paper, they assume that the | :18:29. | :18:32. | |
secrets you have cannot be regarded as secret any more, do you | :18:33. | :18:36. | |
understand that? No, I don't understand that and I | :18:37. | :18:41. | |
will tell you why. Everyone can toss around all sorts of inflammatory | :18:42. | :18:48. | |
accusations, instinctively attacking journalists who in any way defy | :18:49. | :18:57. | |
what it is they want. I can make all types of accusations as well, but | :18:58. | :19:01. | |
ultimately the way rational people decide what is true is through the | :19:02. | :19:07. | |
evidence, reality. As I said, the reality is there is only one group | :19:08. | :19:11. | |
of people whose security measures were so reckless and sloppy that | :19:12. | :19:16. | |
they caused these documents to be lost, those are the NSA and GCHQ. | :19:17. | :19:21. | |
The journalists who have worked on this case have never lost control of | :19:22. | :19:26. | |
a single piece of paper, because even though it is true that six months | :19:27. | :19:29. | |
ago when I first began I didn't have a very good grasp of encryption, I | :19:30. | :19:34. | |
consulted with experts, which is the responsible thing to do. Just like | :19:35. | :19:39. | |
the New York Times and Washington Post. We used the most sophisticated | :19:40. | :19:44. | |
forms of encryption to ensure that there would be no way to access | :19:45. | :19:49. | |
them. Let us talk about trust. You do not | :19:50. | :19:55. | |
trust the security chiefs who have said what you have done has | :19:56. | :20:02. | |
fundamentally undermined security and aided and abetted terrorism. | :20:03. | :20:06. | |
Oliver Robbins, deputy national-security adviser for | :20:07. | :20:09. | |
intelligence in the UK Cabinet Office, who says, in written evidence | :20:10. | :20:16. | |
in court, that it is known the seized material contains personal | :20:17. | :20:19. | |
information which would allow intelligence staff to be identified. | :20:20. | :20:24. | |
He says the government has had to assume copies of information held by | :20:25. | :20:29. | |
Mr Snowden may now be held by one or more other states. You are saying he | :20:30. | :20:37. | |
is not telling the truth? There is this thing called the Iraq | :20:38. | :20:42. | |
war, in which the US and UK governments persuaded their media | :20:43. | :20:46. | |
outlets and populations to support an aggressive attack on another | :20:47. | :20:50. | |
country by making one false claim after the next to scare the | :20:51. | :20:53. | |
population into believing there was a security threat which did not | :20:54. | :20:58. | |
exist. That they had to go to war to stop it. What journalism is about is | :20:59. | :21:03. | |
based on the premise that when people like Mr Robbins and others who | :21:04. | :21:07. | |
exercise power in the dark make those kinds of claims to justify | :21:08. | :21:11. | |
their own power, they are often lying, they often tell things to the | :21:12. | :21:16. | |
population which turn out to be untrue. The job of a journalist is | :21:17. | :21:20. | |
not to investigate other journalists, it is to try to be | :21:21. | :21:25. | |
responsible when telling their readers what government officials | :21:26. | :21:29. | |
are saying, and to assess whether there is evidence. That is my role | :21:30. | :21:33. | |
as a journalist. I have to challenge that, you are | :21:34. | :21:37. | |
accusing the most senior intelligence officials on both sides | :21:38. | :21:42. | |
of the Atlantic of routine and systematic lying. What is your | :21:43. | :21:47. | |
evidence? You say look at the Iraq war. What is your evidence, when the | :21:48. | :21:53. | |
head of MI6 says there is real evidence that since your | :21:54. | :21:58. | |
revelations, the sorts of communications conducted by | :21:59. | :22:00. | |
terrorists have changed because they have adapted to what they have | :22:01. | :22:04. | |
learned from you, where is your evidence that these intelligence | :22:05. | :22:07. | |
chiefs are lying? First of all, I think the Iraq war | :22:08. | :22:13. | |
is a pretty significant example. | :22:14. | :22:20. | |
If you want to scream at me, I can disconnect. If | :22:21. | :22:24. | |
you want to ask the question, you need to give me time to answer. The | :22:25. | :22:28. | |
evidence government officials lied is found in history with things like | :22:29. | :22:35. | |
the Iraq war, when the government destroyed a country of 26 million people based on | :22:36. | :22:38. | |
lies they told over the course of two years to their population. If | :22:39. | :22:44. | |
you look at countries where there are constitutional guarantees of | :22:45. | :22:48. | |
a free press, which includes most Western democracies, you find the people | :22:49. | :22:52. | |
who have created those protections have done so based on | :22:53. | :22:59. | |
the recognition that people in power will routinely lie to their population. | :23:00. | :23:03. | |
The evidence I have is that three Democratic senators two weeks ago in | :23:04. | :23:08. | |
the US on the intelligence committee, with access to classified | :23:09. | :23:14. | |
information, said there is no evidence for the claims of NSA officials | :23:15. | :23:18. | |
that these programmes have stopped terrorist plots. | :23:19. | :23:23. | |
You are backed by Pierre Omidyar, who is worth billions of | :23:24. | :23:34. | |
dollars. You say you will conduct journalism | :23:35. | :23:39. | |
in the future which is free from the glaring subservience to political | :23:40. | :23:45. | |
powers. So, should we expect that you will use most of the material | :23:46. | :23:52. | |
Edward Snowden gave you, ultimately? I am going to use most of the | :23:53. | :23:57. | |
material which is newsworthy. How much of the pile that is, I | :23:58. | :24:02. | |
cannot quantify right now. But there is a lot more reporting to do. We're | :24:03. | :24:05. | |
not the journalists who repeat what the government says and demand that | :24:06. | :24:10. | |
everyone accepts it without evidence. We believe the way you | :24:11. | :24:15. | |
hold power accountable is by reporting the truth, and the documents | :24:16. | :24:19. | |
reveal that. Glenn Greenwald, thank you for | :24:20. | :24:23. | |
joining me from Rio, thank you for being on HARDtalk. | :24:24. | :24:29. | |
Thank you, my pleasure. | :24:30. | :24:31.