Line | From | To | |
---|---|---|---|
President Trump has put Iran on notice after its | :00:12. | :00:13. | |
missile test, responding with limited sanctions. So who is testing whom? | :00:14. | :00:16. | |
And is this the beginning of the unravelling of President Obama's nuclear deal? | :00:17. | :00:19. | |
I'll be speaking to a former Deputy Prime Minister of Iran. | :00:20. | :00:23. | |
We are able to manipulate YouTube videos in real-time. | :00:24. | :00:26. | |
Here we demonstrate our method in a live | :00:27. | :00:28. | |
Is new technology, which can put the wrong words in your mouth, | :00:29. | :00:36. | |
a giant leap for fake news and alternative facts? | :00:37. | :00:42. | |
How do we sort out the truth, half-truth, and lies? | :00:43. | :00:44. | |
Africa is no longer the colonial subject. | :00:45. | :00:59. | |
President Trump today announced his first sanctions | :01:00. | :01:04. | |
against Iran over its ballistic missile test on Sunday. | :01:05. | :01:08. | |
The new president has been a long-time critic of the nuclear deal. | :01:09. | :01:10. | |
Trump tweeted that Iran is playing with fire. | :01:11. | :01:16. | |
They don't appreciate how "kind" President Obama was to them. | :01:17. | :01:18. | |
The US National Security Advisor said the administration | :01:19. | :01:25. | |
was putting Iran on notice, and then the Treasury department | :01:26. | :01:27. | |
announced sanctions against thirteen people and a dozen "entities". | :01:28. | :01:30. | |
Iran's Foreign Minister responded, also on Twitter, saying "Iran | :01:31. | :01:35. | |
unmoved by threats as we derive security from our people. | :01:36. | :01:39. | |
We will never initiate war, but we can only rely on our own means of defence. | :01:40. | :01:41. | |
Is this a harbinger of much worse to come like a handbrake turn on Iran | :01:42. | :01:57. | |
policy? In a sense it is unfinished business from the last months of the | :01:58. | :02:02. | |
Obama administration. These missiles, with a range of over 1,000 miles and not | :02:03. | :02:07. | |
very accurate, are, in the general view of intelligence experts, | :02:08. | :02:11. | |
being tested as nuclear delivery systems, and that is counter to United | :02:12. | :02:17. | |
Nations agreements. Another problem is that some people connected with | :02:18. | :02:26. | |
these are linked to the firing on warships by rebels in recent months. Another | :02:27. | :02:30. | |
issue is that basically apart from one counterstrike against those | :02:31. | :02:33. | |
batteries President Obama kicked the can down the road. So emphasis | :02:34. | :02:37. | |
tonight from US officials is that this doesn't mean the end of the | :02:38. | :02:41. | |
nuclear agreement, it is a separate issue but we have to do something. | :02:42. | :02:46. | |
The Trump rhetoric is heavy but these are not, as you say, | :02:47. | :02:52. | |
heavy-duty sanctions. Well, here's the problem which is that President | :02:53. | :02:56. | |
Trump and his national security adviser Mike Flynn are very hard | :02:57. | :03:01. | |
line on Iran. So these aspects of sanctions about these two separate | :03:02. | :03:05. | |
issues, which you might see as business as usual for the National | :03:06. | :03:09. | |
Security system in Washington are coming at a time when President | :03:10. | :03:12. | |
Trump has said all options are on the table now for Iran, Mike Flynn | :03:13. | :03:18. | |
has said they are on notice. There are dangers of misperception: the | :03:19. | :03:21. | |
Iranians have said that they will carry on testing, and President Trump, | :03:22. | :03:26. | |
feeling he has drawn a red line - remember how he criticised President | :03:27. | :03:30. | |
Obama for not enforcing the red line in Syria - may feel he has to | :03:31. | :03:35. | |
defend it. The scope for sliding into conflict now is considerable, I | :03:36. | :03:37. | |
would say. Thank you very much. Mohsen Sazegara was Iran's | :03:38. | :04:01. | |
Deputy Prime Minister in the late 80s but became | :04:02. | :04:06. | |
disillusioned with the regime and is now a political activist based in | :04:07. | :04:07. | |
Washington, DC. These 25 persons and institutions, I | :04:08. | :04:16. | |
expected it because of an escalation of tensions between these countries. | :04:17. | :04:19. | |
It started from the White House three nights ago by Flynn and Trump, | :04:20. | :04:29. | |
both of them. Do you sense the Iranians pushing a little, nibbling | :04:30. | :04:33. | |
away at Donald Trump with this test on Sunday? I think that right now, | :04:34. | :04:43. | |
the top officials in Iran, the leader, I mean, and the | :04:44. | :04:50. | |
revolutionary guard, they are very cautious and they prefer not to | :04:51. | :04:56. | |
escalate the tension, and more than that, I think, they are waiting for | :04:57. | :05:05. | |
the results of the President's trip to Moscow to see what President | :05:06. | :05:10. | |
Putin can do for them, because they expect that President Putin can | :05:11. | :05:13. | |
reduce the tensions between Washington and Iran. It is not | :05:14. | :05:24. | |
necessarily to the advantage of Tehran to have Putin and Trump | :05:25. | :05:30. | |
close, is it? On one side, it is not to their benefit; and on the | :05:31. | :05:35. | |
other side, if there are new sanctions against Putin and Russia, | :05:36. | :05:41. | |
if these are passed by the Congress, it shows that Putin can't solve his | :05:42. | :05:51. | |
own problem. So both sides are not good signals for Iran. Let's look | :05:52. | :05:57. | |
separately at the sanctions and also the visa restrictions, two separate | :05:58. | :06:02. | |
things. Your reaction to the visa restrictions? Visa restrictions | :06:03. | :06:08. | |
definitely harm a big group of Iranians who live in the USA, and | :06:09. | :06:15. | |
their families' and relatives' travel, and help the regime of Iran with its | :06:16. | :06:20. | |
own propaganda, mobilising people to support the regime. But the | :06:21. | :06:27. | |
sanctions against the Revolutionary guard, or the leader, and the | :06:28. | :06:30. | |
institutions, most of them are corrupt in Iran as well, I think | :06:31. | :06:34. | |
this is something else that the people of Iran may like. It is | :06:35. | :06:41. | |
interesting what you say because there is a strong entrepreneurial | :06:42. | :06:44. | |
community of American Iranians who do not like what is happening with | :06:45. | :06:49. | |
this visa. It looks as if President Trump is prepared to put up with | :06:50. | :06:54. | |
that in order to pursue this heavy policy on visas. He doesn't actually | :06:55. | :07:00. | |
care about the impact that will have on the attitudes of American | :07:01. | :07:06. | |
Iranians. By the way I am definitely against such type of restrictions | :07:07. | :07:10. | |
for Iranian citizens, or any sanction which harms all the people | :07:11. | :07:23. | |
of Iran. But I support the sanctions which are targeted and are smart | :07:24. | :07:28. | |
against the top officials and the people who abuse and violate human | :07:29. | :07:34. | |
rights. Is it your sense that President Trump is a hawk, or just | :07:35. | :07:41. | |
sounding like a hawk? I feel | :07:42. | :07:49. | |
that sometimes he is unstable and I cannot rely on his stances, but the | :07:50. | :07:56. | |
guys that he has picked for his administration, they are hawks. | :07:57. | :08:02. | |
Especially with respect to Iran, they are too tough. Mohsen Sazegara, | :08:03. | :08:06. | |
thank you for joining us tonight. Should we trust our leaders to tell | :08:07. | :08:08. | |
the truth, and is there something materially different about truth | :08:09. | :08:13. | |
in the technological age? How do we gauge what's true, | :08:14. | :08:14. | |
half true, or false? In the past month, new phrases have | :08:15. | :08:18. | |
entered the political lexicon, in particular, "fake news" | :08:19. | :08:21. | |
and "alternative facts". But what if there's another, | :08:22. | :08:23. | |
explosive ingredient in this febrile mix - the ability, | :08:24. | :08:25. | |
literally, to manipulate the words that come out of people's mouths? Here's our technology | :08:26. | :08:28. | |
editor, David Grossman. How do we know that | :08:29. | :08:40. | |
something happened? That we are not being | :08:41. | :08:44. | |
fooled by fakes? Some TV trickery is | :08:45. | :08:50. | |
pretty familiar to us. This technology, green screen | :08:51. | :08:56. | |
or colour separation overlay, has been around in some | :08:57. | :08:58. | |
form for decades. It allows us to convincingly give | :08:59. | :09:04. | |
real people backdrops of virtual worlds. However, we are about to cross | :09:05. | :09:07. | |
the threshold into a new world where it is possible to convincingly | :09:08. | :09:16. | |
recreate known real people - famous people like politicians - | :09:17. | :09:20. | |
and have them say or do more or less anything. We're a few years, but not many | :09:21. | :09:23. | |
years, away from a situation now where we can not only create | :09:24. | :09:32. | |
a pretty sort of realistic environment for people, | :09:33. | :09:39. | |
but we can also do things like manipulate their voices | :09:40. | :09:41. | |
and manipulate their facial expressions and modulate | :09:42. | :09:44. | |
their speech in real-time. Here we demonstrate our | :09:45. | :09:49. | |
method in a live setup. This is the Face2Face | :09:50. | :10:02. | |
Project, a collaboration between Stanford University, | :10:03. | :10:12. | |
the Max Planck Institute and the University | :10:13. | :10:13. | |
of Erlangen-Nuremberg. As we can see, we are able | :10:14. | :10:15. | |
to generate a realistic re-enactment. They can take the facial expressions | :10:16. | :10:18. | |
of one person and match them onto the features | :10:19. | :10:22. | |
of another in real-time. The results are already | :10:23. | :10:24. | |
amazing and only going to get better. It seems like they are being | :10:25. | :10:26. | |
developed without a specific ethical framework that would help them | :10:27. | :10:29. | |
to actually assess, before the technologies are developed, | :10:30. | :10:32. | |
the actual implications. The perfect environment for fake | :10:33. | :10:33. | |
news, for widespread deception. This hugely damaging image | :10:34. | :10:40. | |
of John Kerry supposedly sharing a stage with Vietnam protester | :10:41. | :10:45. | |
Jane Fonda was actually a fake. Now the company that | :10:46. | :10:54. | |
invented Photoshop, Adobe, has unveiled a new, | :10:55. | :11:01. | |
potentially game-changing audio tool. At this event in November, | :11:02. | :11:03. | |
Adobe demonstrated Voco, which, loaded with 20 minutes | :11:04. | :11:07. | |
of real sample voice, can then make someone | :11:08. | :11:10. | |
say anything just by typing it in. Adobe said the audio will be | :11:11. | :11:13. | |
watermarked so that fakes are easy to spot but that may not | :11:14. | :11:30. | |
stop them spreading. Often the things we see as fake news | :11:31. | :11:32. | |
or false stories are actually very easily debunked | :11:33. | :11:35. | |
in a matter of seconds. But it doesn't necessarily stop | :11:36. | :11:37. | |
them from spreading, partly because I think | :11:38. | :11:39. | |
the mechanisms now are so quick in terms of how virality is created | :11:40. | :11:44. | |
online, but also because people want to believe them. But is there a flip side | :11:45. | :11:48. | |
to this technology? If it makes the fake seem real, | :11:49. | :11:54. | |
what does it do to our perception of what is real? Will it allow those | :11:55. | :11:58. | |
intent on deceiving us to dismiss cold, solid, hard video | :11:59. | :12:05. | |
evidence as mere trickery? It becomes a term that can be used | :12:06. | :12:10. | |
by anyone who wants to call out something that they don't | :12:11. | :12:23. | |
like, and spread doubt about it. And when you have a high level | :12:24. | :12:24. | |
of distrust in stories, and you have a high level | :12:25. | :12:33. | |
of distrust in institutions, which we do at the moment, | :12:34. | :12:38. | |
then a term like fake news becomes almost meaningless, because it's | :12:39. | :12:41. | |
deployed in so many ways which actually describe | :12:42. | :12:49. | |
things that are perfectly legitimate. We've grown more sophisticated | :12:50. | :12:50. | |
in our ability to discern fact from fiction. However, accelerating technological | :12:51. | :12:53. | |
change means we'll need to quickly refine how we weigh the evidence | :12:54. | :13:00. | |
of our senses. We're joined now by Claire Wardle, | :13:01. | :13:02. | |
Research Director at First Draft News, which aims | :13:03. | :13:12. | |
to improve the standard of online reporting and the philosopher, | :13:13. | :13:14. | |
Simon Blackburn, who wrote Truth: A Guide. Good evening to you both in London | :13:15. | :13:24. | |
and New York. Simon, is truth just about the most important thing? It | :13:25. | :13:30. | |
is very important in our day-to-day lives, our senses are adapted to | :13:31. | :13:34. | |
telling us how the world around us is and if we don't know how the | :13:35. | :13:38. | |
world around us is we will not behave well in it. Give an example | :13:39. | :13:44. | |
of how senses are adapted. I'm pretty good at knowing if a bus is | :13:45. | :13:49. | |
bearing down on me and pretty good at not crossing the road if I can | :13:50. | :13:53. | |
see one bearing down on me. I would be much worse in life if I could not | :13:54. | :13:57. | |
see that was a bus bearing down on me so I need the truth about that | :13:58. | :14:01. | |
kind of thing and that is true of all kinds of ways I behave in my | :14:02. | :14:04. | |
environment. I need to know whether the food I am looking at is | :14:05. | :14:10. | |
poisonous, I need to be able to rely on various deliveries of sense, | :14:11. | :14:15. | |
sound, and of course trust in things that people tell me. But it was ever | :14:16. | :14:23. | |
thus. Is there a difference now as technology changes things? | :14:24. | :14:30. | |
Communication has exploded so we get communications from very different | :14:31. | :14:38. | |
parts of the world, not just from our neighbours, our parents. Big | :14:39. | :14:43. | |
communications from media outlets, fake media outlets and so on. | :14:44. | :14:49. | |
Sifting what we are told, whether it is trustworthy, becomes much harder. | :14:50. | :14:54. | |
We cannot go behind the scenes. I cannot see what the truth is about | :14:55. | :15:01. | |
what Bush is saying if someone else shows me some bizarre things. In a | :15:02. | :15:08. | |
way, you make it your mission not to be a cynic but to find a way | :15:09. | :15:14. | |
in which we can engage with the truth. Do you think the technology that has | :15:15. | :15:19. | |
just been explained in that film will make a huge difference? People | :15:20. | :15:25. | |
look at people's faces and think they can trust their eyes, trust | :15:26. | :15:30. | |
what they see, and it is false. Absolutely. In the same way that | :15:31. | :15:36. | |
photo editing software and video editing software is on a laptop, | :15:37. | :15:40. | |
anyone in the world can create visuals. Because of technology they | :15:41. | :15:46. | |
move at huge speed across the world. Our brains are adapted to trust | :15:47. | :15:50. | |
visuals more. As technology becomes easier and cheaper that is why we | :15:51. | :15:55. | |
have this explosion of false information. I suppose the more | :15:56. | :16:01. | |
people there are checking to find out if it is false. How do you get at | :16:02. | :16:06. | |
the truth? We are having lots of people talking about news literacy | :16:07. | :16:11. | |
projects and educating people to stop and check. We're looking at our | :16:12. | :16:16. | |
phones and scrolling quickly. Although we might know to be | :16:17. | :16:19. | |
critical, sometimes things that are too good to be true, it is very easy | :16:20. | :16:25. | |
to click share. We do not stop and check when we should do. Politicians | :16:26. | :16:31. | |
particularly through the centuries have all tried to manipulate the | :16:32. | :16:35. | |
truth one way or another at different times. In a sense, is it | :16:36. | :16:40. | |
not easy because you can sift through and make decisions yourself? | :16:41. | :16:43. | |
Is it not easier to get access to information? As human beings, we | :16:44. | :16:50. | |
want information to make us feel better. We are in a polarised world. | :16:51. | :16:57. | |
You sit in groups of people you'd think are the same as you and you | :16:58. | :17:00. | |
want information to make you feel better. It is easier to double check | :17:01. | :17:06. | |
and Google something, but that does not necessarily mean we are doing that. | :17:07. | :17:13. | |
What will it do to us? I find it destabilising sometimes if I do not | :17:14. | :17:19. | |
know what is true and what is false. It is difficult to predict. If | :17:20. | :17:27. | |
technologies do proliferate in the way described and they become very | :17:28. | :17:30. | |
popular and everyone is using them, I should have thought one possible | :17:31. | :17:34. | |
reaction, my own reaction for example, would be in a sense to | :17:35. | :17:39. | |
retreat. That is very bad for democracy. If I say I am not going | :17:40. | :17:43. | |
to believe anything about President Trump, I do not believe anything | :17:44. | :17:49. | |
about Theresa May, that means I am retreating from my historic duties | :17:50. | :17:52. | |
as a citizen, which is to inform myself about policy and what these | :17:53. | :17:59. | |
people are offering. The advent of global communication could actually | :18:00. | :18:05. | |
signal a retreat. It could indeed. It is a rational response to a world | :18:06. | :18:10. | |
in which nothing is trustworthy. If you cannot trust anything, do not | :18:11. | :18:16. | |
believe anything. Do not act. That basis for action, does that come | :18:17. | :18:22. | |
from early education? How do you get a basis for action? Our senses tell us | :18:23. | :18:27. | |
how the world is. We are very good at using them, but relying on | :18:28. | :18:31. | |
other people is something you learn - when they are trustworthy and | :18:32. | :18:36. | |
when they are not. Unless you can get some experience in both sides of | :18:37. | :18:40. | |
it, you're not going to be a fully performing, fully active adult. Does | :18:41. | :18:49. | |
that fill you with dread? I have to say, it is a pretty troubling time | :18:50. | :18:54. | |
over here in the US. We are seeing people retreat and say they are not | :18:55. | :18:58. | |
looking at the news. People are already starting to say I am | :18:59. | :19:02. | |
confused, worried and scared and I do worry about what that means. Is | :19:03. | :19:08. | |
this now about the loss of control? There is so much of people's lives | :19:09. | :19:12. | |
which is not in their control. It is another worrying aspect of modern | :19:13. | :19:19. | |
life. I think certainly people feel overwhelmed by technology because it | :19:20. | :19:22. | |
comes to them even when they are not ready for it. You see an update on | :19:23. | :19:26. | |
your phone about something you did not expect. People feel out of | :19:27. | :19:32. | |
control and overwhelmed. There is a huge proliferation. There used to be | :19:33. | :19:36. | |
big blocks of media you could go to - four different things. You knew what | :19:37. | :19:40. | |
they did. Now there is a proliferation of all sorts of websites. A lot of them | :19:41. | :19:46. | |
are high-resolution, high-technology sites. They look very | :19:47. | :19:51. | |
real. How are you meant to know if they are real or false? You're not meant | :19:52. | :19:57. | |
to know. That is the point. There are very systematic campaigns now | :19:58. | :20:01. | |
to ensure people see the same messages. Over time we are seeing | :20:02. | :20:05. | |
networks of information and systematic campaigns to try to | :20:06. | :20:08. | |
persuade people. It is very sophisticated. As much as we try to | :20:09. | :20:13. | |
teach people to be critical, a lot of these things are really easy to | :20:14. | :20:19. | |
fall for. The people who benefit are dictators, people who manipulate | :20:20. | :20:22. | |
the news for their own ends. Absolutely. We can see, even within | :20:23. | :20:27. | |
Europe and the elections that are coming up in France, Germany and | :20:28. | :20:31. | |
the Netherlands, that there are huge concerns about systematic campaigns | :20:32. | :20:35. | |
using social networks to change public opinion. That is definitely | :20:36. | :20:39. | |
what is on the cards. How do you counter that? Thank God for the BBC. | :20:40. | :20:53. | |
You look at the gold standard. Touch wood, we have the Times, the BBC, | :20:54. | :21:02. | |
ITV and our colleagues. Now, of course, how long that will remain | :21:03. | :21:09. | |
and whether indeed the BBC will, for example, remain independent in the | :21:10. | :21:14. | |
way that Donald Trump has ensured virtually no State Department can be | :21:15. | :21:19. | |
independent in the USA. That kind of dictatorship, that kind of change in | :21:20. | :21:22. | |
Democratic politics is very worrying. Then we really do lose our | :21:23. | :21:30. | |
morals. She told EU leaders she wanted | :21:31. | :21:32. | |
to build a "strong partnership" with the EU and pledged the UK | :21:33. | :21:36. | |
would be a "good friend This went down well | :21:37. | :21:39. | |
with Chancellor Merkel. Better than her relationship | :21:40. | :21:42. | |
with Donald Trump. After publishing the Brexit White | :21:43. | :21:44. | |
Paper this week, we have a decent idea of what the government wants | :21:45. | :21:48. | |
to get out of its negotiation. But what about those | :21:49. | :21:51. | |
on the other side? Or policy editor, Chris Cook, has | :21:52. | :21:52. | |
been speaking to the EU Competition Trying to find out why we still know | :21:53. | :21:55. | |
so little. I think it is important | :21:56. | :22:06. | |
that we leave some things for the people who will be | :22:07. | :22:09. | |
in the room. The EU negotiator, the UK, | :22:10. | :22:11. | |
and its negotiating team, because they will have to put | :22:12. | :22:13. | |
together a new puzzle, because some of the obvious signals from the UK | :22:14. | :22:18. | |
Government are that they want a new relationship, not | :22:19. | :22:23. | |
the Norwegian, not the Swiss model. Therefore I am very careful not | :22:24. | :22:25. | |
to prejudge things because I think the people in the room, | :22:26. | :22:34. | |
they will have a task which is sufficiently difficult | :22:35. | :22:40. | |
without the rest of us trying to... You have an insight | :22:41. | :22:43. | |
into what Britain is planning which lots of people don't have | :22:44. | :22:51. | |
because you've actually seen the undertakings they gave | :22:52. | :22:55. | |
to Nissan and you've judged them. Well, of course we stay in touch | :22:56. | :22:57. | |
with the UK Government on issues of this kind, | :22:58. | :23:05. | |
just as well as with a number of other governments. The letter in itself, | :23:06. | :23:08. | |
we don't have concerns of state aid. You don't think that, while we're EU | :23:09. | :23:12. | |
members, the British Government is committing funds to Nissan | :23:13. | :23:24. | |
which wouldn't be available otherwise? That would be a very | :23:25. | :23:26. | |
broad thing to answer. In the letter and the debates | :23:27. | :23:31. | |
we have no concerns. Is there any public | :23:32. | :23:33. | |
spending involved in this? Well, I think, eventually, | :23:34. | :23:35. | |
probably you will know. But, for us, having seen | :23:36. | :23:40. | |
the letter, we have no concerns. That is, just as we stay in touch | :23:41. | :23:42. | |
with a number of other governments. Is your understanding | :23:43. | :23:51. | |
still that the commission's intention is that we'll have | :23:52. | :23:57. | |
the divorce proceeding first, in Brexit terms, and then | :23:58. | :24:12. | |
a trade negotiation? That is the most simple approach. | :24:13. | :24:20. | |
To figure out where you stand and how you move on foot of it is a | :24:21. | :24:25. | |
complex thing - to be divorced and to have some kind of partnership at | :24:26. | :24:30. | |
the same time. Isn't part of the problem with this that the two | :24:31. | :24:34. | |
things are not going to run neatly one after the other? Nothing will be decided | :24:35. | :24:41. | |
until everything is decided. There will be things we want in the | :24:42. | :24:44. | |
subsequent relationship but you might want to settle early. It seems | :24:45. | :24:50. | |
very hard to disentangle these two things. These negotiations will be | :24:51. | :24:54. | |
extremely complicated. Maybe you get something in between. To some | :24:55. | :25:00. | |
degree, you can say this will have to be solved also in a future | :25:01. | :25:07. | |
relationship. This is definitely something where we will have a clean | :25:08. | :25:12. | |
divide - this can be done now. I think all of these details, they | :25:13. | :25:17. | |
will have to be solved in the room. Not because of discussions around the | :25:18. | :25:21. | |
room. But because the responsibility of getting it right is exactly for | :25:22. | :25:25. | |
the people who will be asked to do the deal. | :25:26. | :25:29. | |
Time now for Viewsnight, a new feature of Newsnight. | :25:30. | :25:31. | |
All week we've been bringing you new thoughts | :25:32. | :25:33. | |
and ideas from a range of opinionated people. | :25:34. | :25:35. | |
You might agree with them, disagree, or think again. | :25:36. | :25:37. | |
Tonight, the British Senegalese activist and business | :25:38. | :25:40. | |
The widely respected head of the American organisation | :25:41. | :27:36. | |
Human Rights Watch, Ken Roth, expressed his surprise this | :27:37. | :27:39. | |
week at the appointment of the new deputy head of the CIA. | :27:40. | :27:43. | |
As you might have noticed, we follow Trump's moves closely on this | :27:44. | :27:46. | |
programme, and at the risk of breaking BBC rules on bias, we can | :27:47. | :27:50. | |
Mr Roth was concerned that the new deputy had previously | :27:51. | :27:57. | |
been connected to running a CIA black site for torture. | :27:58. | :28:01. | |
We were more concerned at the double identity of those | :28:02. | :28:04. | |
Emily has issued a kind of denial, but whatever | :28:05. | :28:11. | |
the truth of the allegations, we here at Newsnight fully support | :28:12. | :28:15. | |
Before we go, you may have read in the papers about the vegetable shortage. | :28:16. | :28:22. | |
Storms in the Med have left us without fancy foreign | :28:23. | :28:26. | |
imports like lettuce, broccoli and aubergines. | :28:27. | :28:27. | |
This sceptred isle has weathered fiercer storms than this. | :28:28. | :28:31. | |
There was once even a time when, yes, we had no bananas. | :28:32. | :28:33. | |
We've ransacked the archive to find some advice. | :28:34. | :28:35. | |
So, here are some useful pointers when attempting | :28:36. | :28:49. | |
to cook that decent, honest, British vegetable that never | :28:50. | :28:52. | |
goes out of season or fashion - the humble cabbage. | :28:53. | :28:54. | |
Sally's cabbage has been cooked in little water with the lid on, | :28:55. | :28:58. | |
retaining the full value of the vegetable. | :28:59. | :28:59. | |
Sally carefully pours the water into a cup. | :29:00. | :29:01. | |
Jane, on the other hand, has drowned her cabbage in water and | :29:02. | :29:06. | |
apart from losing the goodness, that cabbage is a wet soggy mess with no | :29:07. | :29:10. | |
Now, never put so much water in the cabbage again, it makes it | :29:11. | :29:36. | |
But you'll do it quite all right tomorrow. | :29:37. | :29:43. | |
Good evening. Still some unpleasant weather out there if you are | :29:44. | :30:04. | |
travelling. It could well be icy as rain, sleet and snow meander | :30:05. | :30:11. | |
northwards. More rain in the South later. Let's take a look. | :30:12. | :30:12. | |