:00:15. > :00:29.I take pride in the words "Ich bin ein Berliner". The fundamental
:00:30. > :00:32.question of our time is whether the West has the will to survive.
:00:33. > :00:34.The President says the West may not survive.
:00:35. > :00:38.Is he right, or is he part of the problem?
:00:39. > :00:40.I think the large English-speaking democracies, Britain,
:00:41. > :00:42.and the United States, are really moving rapidly
:00:43. > :00:48.We'll discuss how the Alliance can weather these storms.
:00:49. > :00:50.Also tonight, for some Remainers, the cause endures,
:00:51. > :00:52.the hope still lives, and the dream shall never die.
:00:53. > :00:56.The head of the CBI will make the case for a Brexit so soft,
:00:57. > :01:09.And just how close is artificial intelligence?
:01:10. > :01:12.It's literally in the past year we went from a place where it
:01:13. > :01:15.would get it right about 80% of the time to a point where now
:01:16. > :01:26.it's actually achieved human parity in speech recognition.
:01:27. > :01:32.Something may have been lost in translation but Donald Trump
:01:33. > :01:34.spent much of his Presidential campaign proudly proclaiming
:01:35. > :01:45.that Vladimir Putin had described him as a genius.
:01:46. > :01:47.This lofty regard was apparently mutual -
:01:48. > :01:48.with Trump regularly expressing his admiration
:01:49. > :01:51.Today, however, the American President seemed to place his
:01:52. > :01:54.Russian counterpart on the other side of a purported war
:01:55. > :02:00.During a speech in the Polish capital, Warsaw, he called on Russia
:02:01. > :02:04.to stop destabilising Ukraine and other countries and to end
:02:05. > :02:08.support for hostile regimes such as those in Syria and Iran.
:02:09. > :02:11.With the pair due to meet tomorrow at the G20 summit in Hamburg,
:02:12. > :02:13.Newsnight's Diplomatic Editor Mark Urban has been exploring
:02:14. > :02:25.the American President's apocalyptic warning.
:02:26. > :02:28.It's the President's second visit to Europe and today's
:02:29. > :02:34.speech was billed as a big foreign policy moment.
:02:35. > :02:38.Given in Warsaw's Krasinski Square in front of a memorial to the 1944
:02:39. > :02:40.uprising against the Nazis, an appreciative audience had been
:02:41. > :02:54.It fell to the First Lady to do the warm up.
:02:55. > :03:00.The president of the United States, Donald J Trump.
:03:01. > :03:03.And with that, Trump set out his stall of a West
:03:04. > :03:05.in existential crisis and his formula for success
:03:06. > :03:09.While we will always welcome new citizens who share our values
:03:10. > :03:11.and love our people, our borders will always
:03:12. > :03:24.be closed to terrorism and extremism of any kind.
:03:25. > :03:26.Today, the West is also confronted by the powers that
:03:27. > :03:28.seek to test our will, undermine our confidence
:03:29. > :03:38.To meet new forms of aggression, including propaganda,
:03:39. > :03:42.financial crimes and cyber warfare, we must adapt our alliance
:03:43. > :03:51.to compete effectively in new ways and on all new battlefields.
:03:52. > :03:54.And here, having alluded to the Russian and Chinese threats,
:03:55. > :03:56.he did at last state his commitment to Nato's mutual defence
:03:57. > :04:05.But it was a distinctly Trumpian formula that shed little light
:04:06. > :04:08.on the issue of how the West revives its fortunes economically.
:04:09. > :04:22.A large part of the answer to that question depends on whether Macron
:04:23. > :04:25.and Merkel can reignite the Franco-German motor,
:04:26. > :04:27.rewrite Europe's fiscal rules and really generate growth again
:04:28. > :04:34.That is where the hope lies and, if you like, the glass of champagne
:04:35. > :04:38.is half full at the moment in Paris and in Berlin.
:04:39. > :04:44.Today's speech owes much to White House strategy boss Steve Bannon.
:04:45. > :04:47.You have an expansionist Islam and an expansionist China, right?
:04:48. > :04:49.They are motivated, they are arrogant,
:04:50. > :04:51.they are on the march and they think the Judaeo-Christian West
:04:52. > :04:58.His view of the world revolves around hard power and the need
:04:59. > :05:06.Even so, many more mainstream conservatives
:05:07. > :05:12.I think the president struck the right tone on Polish soil today,
:05:13. > :05:17.a strong reiteration, I think, of the importance
:05:18. > :05:19.of the transatlantic alliance and a reminder of the values that
:05:20. > :05:29.The illiberality of this message and emphasis on religious faith
:05:30. > :05:31.worked well for this Polish audience, but it's out of kilter
:05:32. > :05:39.It was very significant, not only that he chose Poland,
:05:40. > :05:43.you know, which has got that law and justice government,
:05:44. > :05:45.a right-wing government, a very Christian government that
:05:46. > :05:47.refuses to take refugees from the Middle East
:05:48. > :05:50.and is being sued by the EU over that, but it's very significant
:05:51. > :05:55.that he, in his speech in Warsaw, did not use the word democracy once.
:05:56. > :05:58.The President today claimed that billions and billions of extra
:05:59. > :06:00.defence spending was now pouring into Nato as a result
:06:01. > :06:10.So typical transactional Trump, having got what he wanted,
:06:11. > :06:13.he gave the Europeans what he thought they were after.
:06:14. > :06:25.That's all very well, but it hardly builds Western unity.
:06:26. > :06:27.After today's Warsaw event, Hamburg looked very different this
:06:28. > :06:32.evening as the President arrived for a G20 meeting.
:06:33. > :06:39.Violent protests happened pre-Trump, of course,
:06:40. > :06:41.but in tone and substance, the President's message is hardly
:06:42. > :06:46.I'm joined now by Pulitzer prize winning historian,
:06:47. > :06:48.Eric Foner and Susan Glasser - former Foreign Policy editor
:06:49. > :06:58.in chief and the first editor of Politico magazine.
:06:59. > :07:05.Susan, was it significant, do you think, or how significant was it that the
:07:06. > :07:10.word democracy did not appear at all in that speech?
:07:11. > :07:14.Significant but not a surprise. Trump doesn't use the word democracy
:07:15. > :07:22.often. Some people here were likening his speech to a European
:07:23. > :07:23.version of his American carnage inauguration speech.
:07:24. > :07:39.You get a clash of civilisations, harking
:07:40. > :07:44.back to Samuel Huntington's 1993 work where Islam replaces Russia
:07:45. > :07:51.as the enemy of Western domination in the world. Does that tally with
:07:52. > :07:55.what you heard today? What was interesting was Trump was laying out
:07:56. > :08:01.this apocalyptic vision of the world divided into the forces of light
:08:02. > :08:05.and darkness, and it gives you an insight into what you might call the
:08:06. > :08:11.intellectual origins of Trump's outlook. It may seem absurd to put
:08:12. > :08:16.intellectual and Trump in the same sentence, because he doesn't read
:08:17. > :08:19.books, he has no literary curiosity. But with people like
:08:20. > :08:23.Steve Bannon around him, this is their view of the world, that it has
:08:24. > :08:29.always been these clashes of civilisations. That our whole
:08:30. > :08:32.civilisation is under assault from either Isis or radical Islam, as
:08:33. > :08:43.they call it, maybe the Chinese in the future rising. This is a view
:08:44. > :08:46.which isn't particularly conducive to compromise, to negotiation. Steve
:08:47. > :08:51.Bannon seems to think we are living back in the age of the Crusades
:08:52. > :08:56.where Christianity and Islam are at war. And for the
:08:57. > :08:59.future of the world. If you look at Isis, it is ridiculous, it is a
:09:00. > :09:06.small group of violent criminal people but they don't pose a threat
:09:07. > :09:09.to the US or the UK. I mean, in the Cold War, the existence of these
:09:10. > :09:17.countries was under threat, you know? From nuclear warfare. But, you
:09:18. > :09:23.know, this apocalyptic vision is not really an accurate representation of
:09:24. > :09:28.the way the world is today. Yet the rhetoric, Susan, of an assault on
:09:29. > :09:32.Western values, it puts bums on seats, doesn't it? But might he
:09:33. > :09:36.realistically be able to persuade Americans they are being
:09:37. > :09:48.threatened by a resurgent China, or an expansionist Islam? It is murky.
:09:49. > :09:51.What exactly is the clash of civilisations here? That is why
:09:52. > :09:56.Trump's speech today is probably really unlikely to amount to much in
:09:57. > :10:02.terms of policy. I was struck by the fact that you know who it reminds me
:10:03. > :10:06.of? Vladimir Putin's rhetoric. You captured earlier in the programme
:10:07. > :10:12.the tension of this on the one hand critical language towards Russia you
:10:13. > :10:20.haven't always seen Trump use. He suggested that they stop shoring up
:10:21. > :10:25.Assad. But that is different to the full-throated, bombastic even,
:10:26. > :10:29.rhetorical nature of this speech. It is actually Vladimir Putin who often
:10:30. > :10:34.talks in terms very much like this. He says the number one threat Russia
:10:35. > :10:39.and Europe faces is from terrorism. He said that from the beginning of
:10:40. > :10:42.his tenure as Russia's leader. And he talks about restoring
:10:43. > :10:47.conservative values in a vague way. I think Trump was unclear exactly
:10:48. > :10:52.what the existential threat is right now. Do you think he knows himself
:10:53. > :10:58.what the existential threat is? Or are you casting him in the role of
:10:59. > :11:02.Steve Bannon's glove puppet? It's Steve Bannon and what we call the
:11:03. > :11:07.alternative right in the US. There is another forebear of Trump. You
:11:08. > :11:12.didn't mention this. But in his speech he started denouncing
:11:13. > :11:18.bureaucracy. Nobody likes to defend bureaucracy, but this goes back to
:11:19. > :11:22.an obscure radical, James Burnham, who wrote a book in the early 1940s,
:11:23. > :11:29.which has been picked up again on these obscure right-wing websites to
:11:30. > :11:32.argue that the threat today comes from the
:11:33. > :11:41.administrative state. Trump attacks what they call regulation, or that
:11:42. > :11:44.kind of thing. That is a trope writers on the extreme right are fond of using,
:11:45. > :11:51.that it is the state itself which is the danger to Western freedoms.
:11:52. > :11:58.Burnham contended that communism and capitalism were essentially two
:11:59. > :12:04.sides of the same undesirable coin. What word would you employ to
:12:05. > :12:13.describe whatever alternative it is that they want to replace the old
:12:14. > :12:17.world order with? What you did not mention is that beneath this is an
:12:18. > :12:22.exclusive vision of what American civilisation or Western civilisation
:12:23. > :12:25.is. It is fundamentally Christian. It is fundamentally white. Other
:12:26. > :12:30.peoples don't have a role in it according to them. You could call it
:12:31. > :12:35.a white nationalism. That is what we often call it in the US. It is
:12:36. > :12:39.explicit now. Not in this speech but on the right-wing websites and
:12:40. > :12:46.call-in radio. The racial element here. And the religious element is
:12:47. > :12:49.very strong. That goes all the way against the traditions of American
:12:50. > :12:54.values of separation of church and state and pluralism, and tolerance.
:12:55. > :12:59.Those are threats to our civilisation right now. They are
:13:00. > :13:02.coming from within. Susan, the meeting with Vladimir Putin
:13:03. > :13:08.tomorrow, do you speculate on what a positive outcome might be? I would
:13:09. > :13:10.caution people against thinking this is a definitive moment of
:13:11. > :13:15.confrontation when we will find out once and for all just what is the
:13:16. > :13:20.deal between Trump and Putin, or even find out what our policy is.
:13:21. > :13:25.We've just heard there is only going to be Donald Trump and Rex Tillerson
:13:26. > :13:29.and their translators in the meeting with Sergei Lavrov, the Foreign
:13:30. > :13:34.Minister, and President Putin himself. It is going to be an hour
:13:35. > :13:38.or less. Once you add the translation in, it amounts to a
:13:39. > :13:44.short chat between two countries. Even if they are talking, as has been
:13:45. > :13:50.reported, about anti-terrorism moves, can you imagine any major significant
:13:51. > :13:59.arrangement being agreed to in half an hour?
:14:00. > :14:00.Forgive me, I need to move on. Thank you both so much for your time this
:14:01. > :14:02.evening. Staying with Trump, Russia -
:14:03. > :14:04.and, indeed, Ukraine - I spoke earlier to the Hungarian foreign
:14:05. > :14:08.minister, Peter Szijjarto, about being positioned both
:14:09. > :14:10.politically and geographically right in the middle of the changing
:14:11. > :14:12.political landscape. We also discussed Brexit, of course,
:14:13. > :14:15.but I began by asking him about his government's perceived
:14:16. > :14:17.proximity to the Kremlin and possible problems this poses
:14:18. > :14:25.for Hungarian citizens in Ukraine. I don't like this kind
:14:26. > :14:29.of stigmatisation. And I don't like this kind
:14:30. > :14:33.of simplification of things. It was fair to say it is
:14:34. > :14:36.a friendly relationship. And if you live in Central Europe
:14:37. > :14:44.you know that you can't afford Because it's not just
:14:45. > :14:47.European countries that I'd love to know, where do
:14:48. > :14:54.you think, from what he said since becoming President,
:14:55. > :14:56.Donald Trump sits on that scale? Well, you know, actually,
:14:57. > :14:58.we cross our fingers for him to be
:14:59. > :15:03.able to build a balanced relationship with Russia
:15:04. > :15:05.because you know, as I told you, we are living in Central Europe
:15:06. > :15:08.and we have a very clear historical lesson, which says that whenever
:15:09. > :15:11.there was a conflict between East and West,
:15:12. > :15:14.Central Europe always lost. And we don't want to
:15:15. > :15:18.be losers any more. So, when we argue, or when we hope
:15:19. > :15:23.for a better relationship between the US and Russia,
:15:24. > :15:25.it's not because we are pro-Russia or pro-US,
:15:26. > :15:31.it's because we are pro-Hungarian. Did you agree with him
:15:32. > :15:33.when he said earlier today I totally agree with the position
:15:34. > :15:37.that the civilised world The better the relationship between
:15:38. > :15:42.the US and Russia is, the better for us. The worse the relationship between the US
:15:43. > :15:47.and Russia is, the worse for us. You know, we are living
:15:48. > :15:49.in central Europe, OK? Is it fair to describe
:15:50. > :15:55.Viktor Orban's government as being one of the more Eurosceptic
:15:56. > :15:57.in the European Union? No, Hungarian people,
:15:58. > :16:01.including the Hungarian government, are not. But what I can tell
:16:02. > :16:05.you is the following, that we are absolutely pro-European,
:16:06. > :16:11.we want strong European Union because Hungary can be really strong
:16:12. > :16:14.in a strong European Union. 80% of our trade goes
:16:15. > :16:16.on with the EU countries. So we are interested
:16:17. > :16:18.in a strong European Union. But we have a serious
:16:19. > :16:21.debate with Brussels, with some other member states,
:16:22. > :16:24.about how to get there. So we say that the federalist
:16:25. > :16:29.approach will not work out. So we are rather on a sovereignty
:16:30. > :16:32.path, saying that strong European Union must be based
:16:33. > :16:35.on strong member states. You know, to be very honest,
:16:36. > :16:40.we regretted the decision. Because it's a big
:16:41. > :16:45.political and economic loss for the European Union,
:16:46. > :16:47.because you had a very strong voice in the debate
:16:48. > :16:51.about the future of Europe. So this debate will now be
:16:52. > :17:00.unbalanced because the leader of one camp, or the strongest voice of one
:17:01. > :17:03.camp, falls out. In the meantime, here
:17:04. > :17:05.we have a nightmare scenario: if there is no deal,
:17:06. > :17:10.if there is no comprehensive economic trade and investment
:17:11. > :17:17.agreement, then we will be in big trouble in Europe,
:17:18. > :17:21.because the last time we were able to implement a free trade
:17:22. > :17:24.agreement was in 2011. So the problem is that the EU is
:17:25. > :17:31.very slow on free trade agreements. And if Britain gets a free hand,
:17:32. > :17:34.then you will be able to sign free trade agreements with India,
:17:35. > :17:39.with Turkey, with the US, with Australia, with
:17:40. > :17:41.which we do not have. I mean, the European Union doesn't
:17:42. > :17:44.have free trade agreements. So if this is the case, then it
:17:45. > :17:47.will harm our competitiveness, harm the competitiveness
:17:48. > :17:52.of European Union furthermore. So that's why we are pushing
:17:53. > :17:55.for a fair, I don't like this Do you understand the
:17:56. > :18:02.categorisation, because I don't. You don't understand, OK,
:18:03. > :18:06.so that's a common point. We want fair Brexit,
:18:07. > :18:08.that's for sure. Balanced, fair Brexit,
:18:09. > :18:11.which will end up in mutual benefits But we want the most comprehensive
:18:12. > :18:18.economic trade and investment partnership with the UK
:18:19. > :18:20.in the future. But I think that we are
:18:21. > :18:23.on the right track. I hope European institutions
:18:24. > :18:26.are ready to negotiate in a, Because what we don't
:18:27. > :18:32.want is the following, that you look back to the time
:18:33. > :18:34.of your referendum. Then some of the reactions came
:18:35. > :18:37.on behalf of European institutions, where,
:18:38. > :18:41.like, as those people took it And we don't want any
:18:42. > :18:47.European institutions to sit at the negotiating table as a group
:18:48. > :18:51.of insulted people. And we don't want the European
:18:52. > :18:54.negotiators or EU negotiators What we want is to have a good
:18:55. > :19:01.deal at the end, a fair deal, Earlier this evening,
:19:02. > :19:08.the director-general of the CBI Carolyn Fairbairn warned
:19:09. > :19:12.in a lecture at the LSE that Brexit uncertainty is starting
:19:13. > :19:19.to damage the UK economy. She cited companies changing plans
:19:20. > :19:24.and slowing investment in anticipation of what she called
:19:25. > :19:26.the "serious disruption" that would ensue if the UK were to leave
:19:27. > :19:29.the EU without a deal. Her comments came as International
:19:30. > :19:32.Trade Minister Liam Fox appeared to add his weight to his Cabinet
:19:33. > :19:36.colleague Andrea Leadsom's recent contention that reporting unwelcome
:19:37. > :19:39.statistics about Brexit was unpatriotic. He claimed in the Commons that some
:19:40. > :19:45.elements of the media would rather see Britain fail
:19:46. > :20:06.than Brexit succeed. It speaks perhaps to a difficult
:20:07. > :20:09.truth to you, which is when you describe an environment you consider
:20:10. > :20:12.less than conducive to business, you run the risk of making that
:20:13. > :20:16.environment even less conducive to business, talking the country down,
:20:17. > :20:22.if you like? One of the things that is really important to have now is a
:20:23. > :20:27.realistic debate. When we hear from firms across the country large and
:20:28. > :20:29.small about the way uncertainty is beginning to affect investment
:20:30. > :20:32.decisions, I think it is very important that we say that but also
:20:33. > :20:36.that we put ideas on the table, so what we're doing today is putting an
:20:37. > :20:47.idea on the table which is not about the whether of Brexit, it is about the
:20:48. > :20:52.how. Whether with an H. It is about a Brexit that protects jobs and
:20:53. > :20:57.investment, that is what we are tabling and a proposal that means
:20:58. > :21:01.the UK would stay in the customs union and the single market as a
:21:02. > :21:06.bridge to a future deal, it has the added advantage that there will be
:21:07. > :21:11.only one transition. How long is the bridge? As short as is possible. It
:21:12. > :21:15.is very difficult to tell. It depends on what the final point is; the
:21:16. > :21:18.final point is very different from today. The Canada free trade
:21:19. > :21:24.deal took seven years; we hope it would be very much shorter than
:21:25. > :21:28.that. It is very important to say business has no interest in anything
:21:29. > :21:30.that is open-ended, in more uncertainty, so as short as
:21:31. > :21:34.practically possible but something that gives businesses the time to
:21:35. > :21:41.adapt. You say it is important to have the debate now, why now and not
:21:42. > :21:44.a year ago after the Lancaster house speech? There is something very
:21:45. > :21:47.important about that because we are heading into the time when companies
:21:48. > :21:50.plan their investment and every sector, every company, has a
:21:51. > :21:54.different point at which they start planning things. So a bakery in
:21:55. > :21:57.Northern Ireland, we know it would take them 20 months if they wanted
:21:58. > :22:03.to relocate to the Republic because of tariffs, so they are starting to
:22:04. > :22:06.think now about what they are having to do. Airlines, it is a year before
:22:07. > :22:10.because they are thinking about passenger reservations. So every
:22:11. > :22:14.company has a tipping point and we are heading into that period and
:22:15. > :22:18.that is why we are beginning to hear more concern from our members about
:22:19. > :22:23.those cliff edges. You did mention the election but you wouldn't be
:22:24. > :22:28.making the speech if Theresa May had secured a three-figure majority. I
:22:29. > :22:31.think we would have done. Word for word? I think so because we have an
:22:32. > :22:35.important role to play at the moment, talking about what grassroot
:22:36. > :22:39.businesses, large and small, across the country are saying and they are
:22:40. > :22:42.saying it is beginning to bite and it is important that we are able to
:22:43. > :22:46.say that but also that we have a simple solution on the table. But it
:22:47. > :22:49.is not a solution in the strictest sense of the word, it is a holding
:22:50. > :22:54.tactic, a postponement of either pain or the unknown. I think the
:22:55. > :22:59.question it answers is how you give more confidence to business now
:23:00. > :23:03.to invest for the future. The economy
:23:04. > :23:08.is a flywheel, so investment today is jobs in the future and I think
:23:09. > :23:11.our priority today is that, it is so important for growth in the future
:23:12. > :23:14.so let's deal with that problem first. The almost irresistible
:23:15. > :23:18.subtext of all of this is when we reach the end of the bridge,
:23:19. > :23:23.things at that end can't be as good as they were at this end. I think
:23:24. > :23:27.that is an area where we should be optimistic. I think we can still say
:23:28. > :23:32.that we need to get to an in principle agreement by March 2019.
:23:33. > :23:36.One of the important benefits of the proposal we put on the table today
:23:37. > :23:39.is that you can focus all the effort on that final deal, you are not
:23:40. > :23:44.talking about some interim other transitional arrangement which would
:23:45. > :23:50.take up a lot of time, so we think it would make it more likely to get
:23:51. > :23:54.to that outline deal by March 2019. Except of course March 2019 is the
:23:55. > :23:57.date on which an awful lot of people would be expecting freedom of
:23:58. > :24:03.movement to end immediately and your proposal would, well, continue it
:24:04. > :24:08.indefinitely. Not indefinitely. Indefinitely as in you can't tell me
:24:09. > :24:12.how long your bridge is. Firms accept that freedom of movement will
:24:13. > :24:15.end and, again, this is about trade-offs and about timing. Firms
:24:16. > :24:18.are committed to, we know we are going to need to increase training
:24:19. > :24:23.and we are going to need to scale up to fill the gaps that are created.
:24:24. > :24:26.That is going to take time, so the other thing that the bridge to the
:24:27. > :24:30.future will give us is the chance to prepare, the chance to get ready, so
:24:31. > :24:33.I think that is a transition as well. I think a lot of people
:24:34. > :24:37.watching may be thinking that you would quite like to stay on the
:24:38. > :24:44.bridge for ever, and see you as one of these uncrushed saboteurs.
:24:45. > :24:48.I am really clear that that is not the case. I will go back to the point
:24:49. > :24:51.that business wants certainty, not some open-ended period of
:24:52. > :24:56.uncertainty, so as short as practically possible but long enough
:24:57. > :25:00.for the Government, for firms, for people to adapt. Businesses do think
:25:01. > :25:03.in years and they will need time to get ready, so it is a practical
:25:04. > :25:08.proposal that gives the certainty now and that bridge to the future.
:25:09. > :25:16.It is practical but it is almost completely apolitical. Is that the
:25:17. > :25:18.definition of your role, you represent the interests of your
:25:19. > :25:20.members and don't worry about the difficulty that a Prime Minister may
:25:21. > :25:23.have in delivering the plan you describe? Well, I think everybody
:25:24. > :25:26.has an interest in the success of the economy and jobs and prosperity
:25:27. > :25:30.and I think one of the things we have seen since the election that is
:25:31. > :25:34.very welcome is the economy back centre stage, people are talking
:25:35. > :25:37.about it and how we will pay for the public services, about the way we
:25:38. > :25:41.have jobs for our children, so I think that is where this comes
:25:42. > :25:45.together. That is why I do think we have a responsibility as businesses
:25:46. > :25:48.to talk about investment today, jobs in the future. So I'm hoping
:25:49. > :25:52.politics and economics can come together in this. Fingers crossed.
:25:53. > :25:56.Carolyn Fairbairn, many thanks indeed.
:25:57. > :26:01.There is always an element of chance in predicting the future, obviously,
:26:02. > :26:06.but the broad consensus among tech watchers is the biggest of all next
:26:07. > :26:10.big things will be AI, or artificial intelligence. Machines will be able
:26:11. > :26:16.to do things that, for millennia, we have blithely presumed would always
:26:17. > :26:17.be the exclusive domain of humanity. Reasoning, recognising speech, text,
:26:18. > :26:20.images, collaboration. The impact from jobs to healthcare,
:26:21. > :26:24.from transport to education is likely to be as profound
:26:25. > :26:27.as the industrial and information revolutions. Our technology editor David Grossman
:26:28. > :26:36.has been given exclusive access to Microsoft's AI labs in Seattle
:26:37. > :26:38.to see how this There's nothing perhaps that
:26:39. > :26:49.looks quite so dated Seattle's salute to science
:26:50. > :26:58.in the century to come. See how man will live and work
:26:59. > :27:03.and play in the year 2000. Seattle's Space Needle and monorail,
:27:04. > :27:06.built for the '62 World's Fair, probably tell us more
:27:07. > :27:09.about the assumptions of that time. Most often, predictions miss
:27:10. > :27:18.the really profound shifts. What Eve will look like in A.D.
:27:19. > :27:21.2000. Like this pre-war assumption
:27:22. > :27:24.that the 21st century woman of fashion would still have a lady's
:27:25. > :27:29.maid to help her dress. Shoes will have cantilever heels
:27:30. > :27:32.and an electric belt will adapt Pity then the people that work here,
:27:33. > :27:40.this is Building 99 In here, the predictions
:27:41. > :27:47.that they make determine the future of the company and perhaps,
:27:48. > :27:50.if they are right, We are betting the company
:27:51. > :27:57.on advances in AI. I've been given exclusive access
:27:58. > :28:04.to meet the people and see the projects that Microsoft believe
:28:05. > :28:08.will shape the future. It reached the point now
:28:09. > :28:11.where people can have, you know, very natural conversations
:28:12. > :28:13.with software and software can I look at how they were
:28:14. > :28:22.actually walking... Eric Horvitz is head
:28:23. > :28:28.of Microsoft's AI programme. Even the lifts here run
:28:29. > :28:34.on this new technology. So much of our civilisation,
:28:35. > :28:37.what we think is special about humans, is based
:28:38. > :28:39.on our intellects, on our ability to see and understand, reason,
:28:40. > :28:46.converse and collaborate and for the first time in history,
:28:47. > :28:49.we are getting close to building machines that have some
:28:50. > :28:50.of that intellect. We went from a place
:28:51. > :28:53.where we would get it right about 80% of the time to a point
:28:54. > :28:58.where, now, it's actually achieved human parity in speech recognition
:28:59. > :29:00.and that's something that just You could probably make sense
:29:01. > :29:08.of the jumble of colours and shapes in this photograph almost instantly,
:29:12. > :29:17.even though chances are you've never seen it before. But consider what it would take
:29:12. > :29:17.for a machine to do that. We've taken natural language
:29:26. > :29:32.processing research, computer vision research and had
:29:33. > :29:36.people from those two fields work together to be able to generate
:29:35. > :29:36.sentences about pictures. Here, the sentence that we generated
:29:37. > :29:40.with no context other than the contents
:29:41. > :29:42.of the image here is "A man swimming
:29:43. > :29:44.in a pool of water." You know, it used to be the case
:29:45. > :29:47.that it took thousands and thousands of images and hours
:29:48. > :29:49.and hours to train. Now we are down to dozens of images
:29:50. > :29:52.and minutes and seconds. So the building blocks for an AI
:29:53. > :30:00.world are almost complete. Computers can now not only recognise
:30:01. > :30:02.pictures and objects, but gestures and video and speech
:30:03. > :30:06.and text, faces and even emotion. All of these skills can be used
:30:07. > :30:09.by developers in an almost infinite variety of combinations
:30:10. > :30:17.to create new applications. There's been research that
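As an aside, the "composition" idea described here - a vision step feeding a language step - can be sketched as a deliberately tiny toy. Everything below is invented for illustration (the function names and the fake tag data are stand-ins for trained models, not Microsoft's actual APIs):

```python
# Toy illustration of composing a "vision" skill with a "language" skill.
# Both functions are hand-written stand-ins for trained models.

def detect_tags(image_name):
    """Stand-in for a computer-vision model: maps known images to tags."""
    fake_vision_model = {
        "pool.jpg": ["man", "swimming", "pool"],
        "park.jpg": ["dog", "running", "park"],
    }
    return fake_vision_model.get(image_name, [])

def generate_caption(tags):
    """Stand-in for a language model: fills a subject-verb-scene template."""
    if len(tags) < 3:
        return "An image."
    subject, verb, scene = tags[:3]
    return f"A {subject} {verb} in a {scene}."

# Chaining the two "skills" produces a sentence from an image name.
print(generate_caption(detect_tags("pool.jpg")))  # A man swimming in a pool.
```

The real systems replace both hand-written functions with learned models, but the pipeline shape - recognition output becoming language-generation input - is the same.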
:30:18. > :30:24.Microsoft's been doing. Only recently are we seeing these
:30:25. > :30:30.services at the level of quality that a developer can actually build
:30:31. > :30:33.on and have reliable experiences from, because,
:30:34. > :30:35.you know, prior to that, the amount of data required to truly
:30:36. > :30:37.make high-confidence predictions from artificial intelligence wasn't
:30:38. > :30:39.there, nor was the computing power. But a world of super intelligent
:30:40. > :30:52.computers understanding everything isn't everyone's idea
:30:53. > :30:53.of technological perfection. Does any part of this
:30:54. > :31:02.future terrify you at all? I'm concerned with potential
:31:03. > :31:06.misuse of this technology by malevolent forces,
:31:07. > :31:11.by people with ill will. By state and non-state actors
:31:12. > :31:14.who can gain strong powers. But I also think that the answer
:31:15. > :31:20.to some of that is the AI itself, because there is no better defence
:31:21. > :31:26.and no better detector of And very soon, we might forget
:31:27. > :31:38.we are talking to computers at all. AI systems can have human facing
:31:39. > :31:41.front ends, known as bots. For example, Xiaoice has been
:31:42. > :31:49.developed by Microsoft to interact with people on Chinese social media
:31:50. > :31:52.and with every conversation, Xiaoice learns both
:31:53. > :31:57.about the individual and humanity. Absolutely, that is what they
:31:58. > :32:03.call me around here. More officially, Dan Driscoll
:32:04. > :32:05.is Development Manager and Principal Architect
:32:06. > :32:06.of the Microsoft Bot Framework. They form emotional connections
:32:07. > :32:14.with some of these chatbots and have long conversations. I think the average for Xiaoice
:32:15. > :32:20.is 23 turns per conversation, so people will chat, will say,
:32:21. > :32:22."Hey, how are you doing?" "I am having a good day,
:32:23. > :32:26.how are you doing?" They form a kind of emotional
:32:27. > :32:29.relationship, and that is one reason why so many bots have both
:32:30. > :32:33.a sort of factual IQ component and an emotional
:32:34. > :32:41.or personality EQ component. AI will not only be able
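The "IQ plus EQ" design Driscoll describes can be sketched as two components behind one reply function. Everything here, the lookup tables and the fallback line, is an invented placeholder for illustration, not how Xiaoice or the Microsoft Bot Framework actually works:

```python
# A toy bot with a factual (IQ) component and a small-talk (EQ) component.
# Both tables are invented placeholders.

FACTS = {  # IQ side: keyword-matched factual answers
    "capital of france": "Paris is the capital of France.",
}

SMALL_TALK = {  # EQ side: conversational, personality-driven replies
    "hey, how are you doing?": "I am having a good day, how are you doing?",
}

def reply(message: str) -> str:
    normalised = message.strip().lower()
    # Try the factual (IQ) component first...
    text = normalised.rstrip("?").strip()
    for key, answer in FACTS.items():
        if key in text:
            return answer
    # ...then fall back to the emotional (EQ) component,
    # with an open-ended prompt to keep the conversation's turns going.
    return SMALL_TALK.get(normalised, "Tell me more about that.")

print(reply("Hey, how are you doing?"))  # → I am having a good day, how are you doing?
```

Keeping the EQ fallback open-ended is one reason such bots rack up many turns per conversation: there is always something to answer.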
:32:42. > :32:43.to know and recognise everything and everyone,
:32:44. > :32:46.it will know how to charm us. It will know how to reassure us
:32:47. > :32:49.and how to frighten us. Instead of us operating
:32:50. > :32:52.the computers, the computers will be operating us. Whoever controls the AI probably
:32:53. > :32:55.controls the future. There's already disquiet about using
:32:56. > :33:02.Big Data to target voters. Well, imagine what an all-seeing,
:33:03. > :33:22.all-knowing AI could do. Are you concerned at all,
:33:23. > :33:24.for example, about AI in elections? AI systems can be designed
:33:25. > :33:28.to persuade, to... in an algorithmic view, to optimise
:33:29. > :33:30.goals of changing someone's beliefs or enhancing their beliefs about one
:33:31. > :33:36.thing or another. The prospect that some day,
:33:37. > :33:40.data mining, data analysis and very close targeting of particular
:33:41. > :33:42.demographics can be used to influence
:33:43. > :33:44.elections is a very real concern. On the one hand, we can see
:33:45. > :33:51.and we can imagine how authoritarian regimes can use these technologies
:33:52. > :33:53.through tracking, surveillance, persuasion, that would
:33:54. > :33:54.strengthen this authoritarianism. On the other hand, these
:33:55. > :34:08.techniques of AI also open up the world for pluralism,
:34:09. > :34:10.for discussion and collaboration, understanding and tracking,
:34:11. > :34:12.you know, understanding the sources of persuasion and signalling
:34:13. > :34:14.coming into one's life. So we see this prospect
:34:15. > :34:27.going in different directions. So we shouldn't ignore
:34:28. > :34:29.the huge potential benefits. About 30 miles outside Seattle,
:34:30. > :34:31.I saw Microsoft's AI farm. Data-driven farming
:34:32. > :34:33.could revolutionise how food is grown. However, measuring precise moisture
:34:34. > :34:41.and nutrient levels for each part of the field would require thousands
:34:42. > :34:44.of sensors and the cost would be prohibitive. Instead, an AI model of the farm can
:34:45. > :34:50.be built with just a few sensors in the ground and a few photographs
:34:51. > :34:56.from the air. This is going to help
:34:57. > :34:58.the farmers reduce costs, use much less water,
:34:59. > :35:00.use much less lime, use less fertiliser,
:35:01. > :35:04.use less nutrients and stuff. So this is definitely
:35:05. > :35:07.going to have an impact on reducing the cost as well as less harm
:35:08. > :35:16.on the environment. And the early indications
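The farm model described above, estimating conditions everywhere from a few sensors, can be illustrated with inverse-distance weighting, a standard spatial interpolation method. The sensor positions and moisture values below are invented for illustration:

```python
# A minimal sketch of estimating soil moisture across a whole field from
# a handful of ground sensors, using inverse-distance weighting (IDW).

def idw(x, y, sensors, power=2):
    """Inverse-distance-weighted estimate at (x, y) from (sx, sy, value) sensors."""
    num = den = 0.0
    for sx, sy, value in sensors:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return value  # exactly on a sensor: use its reading directly
        weight = 1.0 / d2 ** (power / 2)
        num += weight * value
        den += weight
    return num / den

# Three sensors across a 100 m field; moisture as a fraction.
sensors = [(0, 0, 0.30), (100, 0, 0.10), (50, 100, 0.20)]

print(round(idw(0, 0, sensors), 2))    # → 0.3 (on a sensor)
print(round(idw(50, 33, sensors), 2))  # interpolated mid-field estimate
```

A real system would refine such an interpolated map with aerial photographs; the sketch shows only why a few well-placed sensors can stand in for thousands.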
:35:17. > :35:18.are that yields will rise. Using cheap cameras
:35:19. > :35:22.and tethered helium balloons, AI could revolutionise subsistence
:35:23. > :35:27.farming in the developing world. Artificial intelligence is growing
:35:28. > :35:33.fast, getting smarter all the time. While some fear it could end
:35:34. > :35:36.of our species, others believe it holds huge promise. Very soon, AI will take off
:35:37. > :35:41.and we will find out if we control it. It's a mighty tome but when you
:35:42. > :35:53.consider that its author Ibram X Kendi aspires to provide
:35:54. > :35:55.the definitive history of racist ideas in America,
:35:56. > :35:57.it's perhaps surprising that Stamped From The Beginning only runs
:35:58. > :36:05.to just north of 500 pages. The title comes from a speech
:36:06. > :36:07.given to Congress in 1860 by Jefferson Davis, the Mississippi
:36:08. > :36:10.senator who went on to serve as president of the Confederate
:36:11. > :36:15.states of America. He argued that so-called 'black
:36:16. > :36:17.inferiority' had been stamped from the beginning on the bodies
:36:18. > :36:20.of Africans. Ibram X Kendi joins me down
:36:21. > :36:38.the line from Florida. It is a history book obviously, but
:36:39. > :36:45.it's motivation seems very of the moment. It is because I think I
:36:46. > :36:51.wanted to show readers that we have been engaged in a racial debate, the
:36:52. > :36:57.same racial debate we are engaged in right now, really for hundreds of
:36:58. > :37:02.years. That racial debate seeks to answer the question, why does racial
:37:03. > :37:06.inequality exist? Why do racial disparities exist in our societies?
:37:07. > :37:09.This book really takes the reader through hundreds of years of
:37:10. > :37:15.different people answering that question. And those that have
:37:16. > :37:19.expressed racist ideas have stated racial inequalities exist because black
:37:20. > :37:24.people are inferior, and those who have expressed anti-racist ideas have
:37:25. > :37:29.stated black people have been suffering as a result of racial discrimination. Many people would
:37:30. > :37:32.point to the double election of Barack Obama as perhaps the
:37:33. > :37:37.beginning of the end of the history of racism. The one you describe. Yet
:37:38. > :37:42.you describe him as following in the racist footsteps of every
:37:43. > :37:48.president since Richard Nixon. One of the things I wanted to do is
:37:49. > :37:52.state a very clear definition of a racist idea. And then apply that
:37:53. > :37:57.definition to many different thinkers. And before
:37:58. > :38:01.applying it to anyone else, I ended up applying it to myself and realising
:38:02. > :38:07.that I had even expressed racist ideas. And people I admire like
:38:08. > :38:10.Frederick Douglass, and even Barack Obama, expressed racist ideas,
:38:11. > :38:16.suggesting there was something wrong and inferior about black people. I
:38:17. > :38:19.think that's how powerful and how widespread and how believable these
:38:20. > :38:25.ideas have been throughout American history. You also address the issue
:38:26. > :38:31.of why people in power choose to invoke the fear of a black man, of
:38:32. > :38:38.the black person, in the minds of white people, what answers did you
:38:39. > :38:44.arrive at? I think the underlying, sort of, thesis of the text is
:38:45. > :38:48.showing the ways in which racist ideas are emerging and people are
:38:49. > :38:54.consuming those ideas and becoming fearful, becoming hateful, becoming
:38:55. > :38:59.ignorant, that these people are creating and producing these racist
:39:00. > :39:03.ideas to justify racist policies. I think people can understand if you
:39:04. > :39:06.are a slave owner and you make money from owning slaves, black slaves,
:39:07. > :39:11.you are going to create racist ideas to convince others that black people
:39:12. > :39:13.should be enslaved. That black people are so barbaric that if they
:39:14. > :39:18.are not enslaved they will just ravage society. Then you have people
:39:19. > :39:22.who consume those and then begin believing those ideas. That anecdote
:39:23. > :39:27.is indicative of the way racist ideas have functioned throughout
:39:28. > :39:32.American history. Do you worry you may have unwittingly created a
:39:33. > :39:38.compendium of inspirational racists? You cite so much verbatim evidence
:39:39. > :39:41.from historical American political giants, from Abraham Lincoln to even
:39:42. > :39:51.Theodore Roosevelt, expressing, well, an explicit fear so that
:39:52. > :39:58.people on the right can say, we are right, even Abraham Lincoln agrees.
:39:59. > :40:03.Unfortunately, as a scholar, I didn't have the opportunity to think
:40:04. > :40:06.of the effect of this definition. I wanted to create a definition of a
:40:07. > :40:11.racist idea which is very simple, any idea that suggests a racial
:40:12. > :40:16.group is superior or inferior to another racial group in any way. That
:40:17. > :40:21.definition ended up being applied to people I didn't realise it would
:40:22. > :40:28.apply to. But again, I think that has been one of the problems. That
:40:29. > :40:32.we... So many people have tried to define their ideas outside of
:40:33. > :40:43.racism. And it has left us with an
:40:44. > :40:46.inaccurate idea of it. INAUDIBLE
:40:47. > :40:55.If we actually look at American history, during the enslavement era,
:40:56. > :40:58.by the time of the end of slavery, 4 million, 5 million poor whites were
:40:59. > :41:02.largely kept in poverty due to the riches of slaveholders. Then you
:41:03. > :41:07.have the Reconstruction era which was a boon for many working class
:41:08. > :41:12.and poor whites, as it was for freed blacks. But then that era was of
:41:13. > :41:21.course undermined by the rise of Jim Crow. Then white poverty rose just as
:41:22. > :41:24.black poverty rose. The civil rights movement was great for black people
:41:25. > :41:32.and also great for many Americans.
:41:33. > :41:36.This led to spiralling inequality in white
:41:37. > :41:41.America. Ultimately you see this history of not only racism being bad
:41:42. > :41:42.for black people, but bad for almost everyone. Many thanks for your time
:41:43. > :41:44.this evening. Before we go, it has
:41:45. > :41:47.been ordained that today is International Kissing Day. Yet by whom, and for what purpose,
:41:48. > :41:51.other than to assist news producers in their quest to fill the gaping
:41:52. > :41:54.void marked "content", remains unclear. Marking International Kissing Day
:41:55. > :42:00.will no doubt become a tradition. Remember to tune in tomorrow,
:42:01. > :42:07.when Evan will be in the chair. That's what's wrong with you -
:42:08. > :42:12.you should be kissed and often. # Woah Baby
:42:13. > :42:15.# (Kiss me Baby) # Woah Baby
:42:16. > :42:18.# (Love to hold you) # Woah Baby
:42:19. > :42:22.# (Kiss me baby) # Woah Baby
:42:23. > :42:28.# (Love to hold you) # Woah Baby
:42:29. > :42:33.# (Kiss me Baby) # Woah Baby
:42:34. > :42:35.# (Love to hold you) # Woah Baby
:42:36. > :42:37.# (Kiss me baby) # Woah Baby
:42:38. > :42:41.# (Love to hold you) Maybe we should kiss just
:42:42. > :43:01.to break the tension? Friday promises to be a quieter day.
:43:02. > :43:02.A damp start with the weather front drifting across Scotland, dragging
:43:03. > :43:03.its