Click: Health


This week: an AI robot, a robochef, and some loud, noisy animals meet the locals.

The Design Museum in London has moved into a new home, and it is suitably stunning. I have come to see Fear and Love, an exhibition of 11 designers' reactions to our increasingly complex world. The most animated star, though, has to be an industrial robot arm which is here to present a more friendly face to robotics and maybe help us empathise with the machines of the future. It senses where you are and comes bounding over to see you, but if it gets bored it will turn its attention to someone else. It is like an excitable puppy, actually. Who knows, installations like this may help to allay our fears of being around giant machines like this. I have to say, it will still be a while before I trust this thing with a scalpel, for example. That said, computers are increasingly being used in healthcare around the world. There is plenty of research into how artificial intelligence can help doctors better look after patients.


We have been taking a look at some of the latest developments.


Around the world, hospitals are facing a backlog of patients, ageing populations and a shortage of specialist staff. Some hospitals are teaming up with artificial intelligence research teams to see if there are ways high-tech solutions can supplement or even enhance healthcare in the face of these challenges.

Singapore has a nursing crisis. Its health minister says it will need more than 30,000 new nurses before 2020, and will have to completely rethink the way it cares for its ageing population. So when the CEO of one of its largest private hospital networks approached IBM's Watson team, they came up with a pilot project to try to help nurses working with the most critically ill patients. This is the intensive care unit at Mount Elizabeth Hospital. It is where four beds are connected to artificial intelligence nursing systems, collecting vital signs from the patients and giving nurses a more complete picture of who needs the most care. In one of the first trials of its kind in the
world, the AI is constantly monitoring output and making connections across a vast range of data, including a commonly used scoring scale. Higher scores correspond to a higher incidence of death, and the score is particularly important in the first 24 hours after admission. This patient has four lines, so if you don't see anything flashing, it means he needs monitoring. One of the patients is at the high end of the alert scale, and nurses can quickly access the information in real time and look at patterns in their vital signs to see if they are at greater risk of infections like sepsis. Next, AI could help with medical imaging, which is the focus of research between Google and the NHS. The
Royal College of Radiologists says 99% of hospitals are struggling to keep up with demand, and the UK has the third lowest number of specialists in Europe who can interpret scans: seven per 100,000 people. The large amount of data is overwhelming a health service stretched to the limit. If you can use algorithms or machine learning or artificial intelligence to set an alert that triggers to say something has happened, you need to go and see this, this is urgent and you need to deal with it in the next hour or so, when you may otherwise not have known about it, then I think it will improve quality of care and actually improve equity across the system. One of the first
areas where the NHS is testing artificial intelligence is at Moorfields, one of the busiest eye hospitals in the world. Google is applying the same machine learning technology behind its winning AlphaGo computer programme, which beat the world's best human player by computing tens of thousands of positions per second. We started out to develop general-purpose learning systems and to use those systems to make the world a better place. It was obvious to us a few years ago that there is a massive opportunity to deliver really meaningful, improved benefits to many patients and people across the world, using our techniques to try to improve the way we diagnose and treat patients at risk of all sorts of diseases. The Moorfields Hospital research is
using scans from this OCT, or optical coherence tomography, machine, which takes a three-dimensional image. It is used to diagnose diseases like macular degeneration and diabetic retinopathy, two leading causes of sight loss. DeepMind is trying to develop an algorithm to read these scans. They were chosen because of the high density of information and the way they can be broken down into pixels showing areas where damage has occurred. I was especially
attracted to speaking to DeepMind because I thought their algorithms would have the best ability to deal with 3D imaging of an extremely high-resolution form such as OCT. This is such a delicate area of the eye that any sort of disruption of the normal architecture has really amazingly severe consequences. I believe healthcare could be at a pivotal moment in history, where these advances in technology such as artificial intelligence will fundamentally change the way medicine is practised, and have huge benefits for patients. If you think about it, the best humans in the
world will have seen only a fraction of the number of cases that we can show to an algorithm. Imagine we took all of the cases that many of the top ophthalmologists in the world have seen themselves, and aggregated them all in one place. Now the algorithm can sample from all of the case studies that have been seen by various humans and deliver a much higher standard, more consistently, when making a diagnosis. All of these projects are still in the research or pilot stage, but it is fascinating to see how artificial intelligence could transform healthcare and lead to better and faster treatment in the future.


Hello, and welcome to The Week in Tech.


It was the week that Amazon completed its first drone delivery. Taking 30 minutes from order to delivery, plus three years if you factor in research and development, the elaborately orchestrated trial involved an Amazon product and a bag of popcorn.


It was also the week that Super Mario came to the iPhone, Pokemon Go got an upgrade, and a UK surgeon filmed an operation wearing Snapchat Spectacles.


And mere hours after hitting the road in San Francisco, Uber has been ordered to stop offering passengers rides in its self-driving cars. Regulators have warned the company it requires a state permit. The order comes after footage emerged of a self-driving car running a red light.


And finally, Stanford students put teeny goggles on tiny parrots.


But this was to protect the birds' eyes as they were trained to fly through a sheet of laser-illuminated particles. The new technique has allowed scientists to gain a greater
understanding of how birds fly by analysing the movement


of particles around their flight paths.


It is hoped the work will improve flying robots of the future.
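Analysing the movement of particles between successive video frames, as described here, is usually done by cross-correlating small image patches, the core idea of particle image velocimetry. Below is a minimal sketch of that idea; the function name and the synthetic frames are illustrative, not taken from the Stanford study:

```python
import numpy as np

def estimate_displacement(patch_a, patch_b, search=5):
    """Estimate the (dy, dx) shift of patch_b relative to patch_a by
    brute-force cross-correlation over a small search window."""
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # Undo the candidate shift and see how well the patches line up.
            shifted = np.roll(np.roll(patch_b, -dy, axis=0), -dx, axis=1)
            score = np.sum(patch_a * shifted)
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift

# Synthetic stand-in for two consecutive frames of drifting particles:
# frame_b is frame_a shifted down 2 pixels and right 3 pixels.
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(np.roll(frame_a, 2, axis=0), 3, axis=1)
print(estimate_displacement(frame_a, frame_b))  # recovers (2, 3)
```

Repeating this over a grid of patches yields a velocity field for the air around the bird.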


MUSIC PLAYS. Whether you love or loathe cooking, sometimes it would be nice to just make it a little bit quicker and easier. So I have been testing some of the latest gadgets that aim to come to the rescue. I have also called in a bit of help from a friend. This
prototype robotic kitchen is making crab bisque today. It learnt the slick moves from a professional chef, whose motions were tracked in the same space, making the same dish, using sensors and cameras. This is actually quite extraordinary to watch, and that is the first drop of mess that I have seen. It seems to be pretty clean and tidy. The only issue is it doesn't do the washing up. That's right, I am not doing it! And no, I am not drinking that. Everything needs to be precisely
prepared beforehand, although some form of ingredient recognition is claimed to be within its abilities before it goes on sale, which, as you might imagine, will be at quite a cost. A figure of around £100,000 is being thrown around. While Moley gets on with things, I will use my devices to make all of this, and there is nobody to do the prep for me. I had better get on. First up, the sous vide cooker, to make some miso salmon. For anyone who doesn't know what this method is, like me a few
weeks ago, it involves sealing food in a bag and cooking it in water at a precise temperature for a specific amount of time, so it should end up perfectly and evenly cooked all the way through. This device can connect to a smartphone app where you will find the recipes and instructions you need. Once you have prepared the food, and that is the salmon in the bag, quite literally, you pop it in any suitably sized pot with the Anova attached and confirm you are ready to go.


With this model, which is Wi-Fi enabled, you can set it remotely, although you would need to have everything prepared, of course. So that is the main bit of the cooking done. But it does still need searing for one minute in a frying pan. This needs to cook for just one minute on each side, so things might heat up! Now for the moment of truth. The flavour is great. It feels evenly cooked throughout. I probably miss the fact it is not crispy from the pan; I could have left it in to do that, but I followed the instructions. But the taste is fantastic and the flavour is really good. A smart
frying pan could have dealt with that issue. And funnily enough, that's just what Pantelligent is. I thought the idea was daft to start with. Who needs a Bluetooth frying pan that connects to your mobile to tell you how long to cook things for? I do, it seems, as I perfected some dishes that may otherwise have been compromised. This is great. It tells you how many degrees lower it needs to be. The pan's temperature sensor keeps track of the heat, so you are regularly reminded to turn it up and down! You are told when to stir and add other ingredients. That is really good. I was concerned the potato wouldn't be cooked all the way through, but if I had done it without this smart frying pan, that would have been a risk. But that was fantastic. Spot-on, I would say. Back to the
soup and it seems to be ready. This was the only dish it had on offer for us today, but eventually it should be able to learn as many recipes as it gets taught. A great bit of theatre, but I am very irritated by this mark on the bowl. But there is nothing to clean it up with. And the soup needs trying. But I don't eat crab, which is an issue. I am giving it a go. Oh, crab. It's really nice. I will be a while.

That was Lara. Meanwhile, back at the Design Museum in London, some of the most beautiful 3D printing I think I have seen. These are one artist's
suggestion about how we might revive the ancient culture of making death masks. I wouldn't mind one, because it would make me look like I was in the film Alien. Next, what would happen if you scaled that technology right up? What if you were to let it loose on our homes, our cities and our architecture? The buildings around us don't look the way they do by accident. The design, the shape and the structure are all the result of designers, what we need the buildings to do, and the practical limitations of the materials and building techniques we've discovered. This is very much the age of concrete, steel and glass. But with new technology and techniques, what could the next wave of our buildings look like? The building industry is still using 19th century technology. It hasn't really evolved like other disciplines, and if you look now at the speed at which cities are growing, the technology is really lagging behind. Industrial scale 3D printing has
already been put to use to print full-scale buildings, like this housing project in China. But researchers are now turning to computers not just to create buildings but to help design them. And the results? Well, a little unusual. This is a prototype that's been 3D printed here at University College London. We basically used a computer and used algorithms to generate these forms for us. They may look strange, but they are highly optimised. So these
forms attempt to save material and become more efficient, but at the same time they produce a sort of aesthetic that is very appealing to us as architects, and it really doesn't look like a normal building any more. Normal 3D printing creates objects by building up thousands of thin layers, which as you can imagine takes a fair while. The idea here is to save time by printing just what you need, which means rather than printing flat layers it instead prints with shapes, like pyramids. The software they've created can take this a step further by figuring out which bits are structurally essential and getting rid of the rest. Before computers we had to build with our hands, and now we can create algorithms that make these calculations for us. But that doesn't mean we don't design; we just optimise the process, and we can create things that we couldn't ever think of before. 3D printing will
allow architecture to be much more detailed, much more fine and also much more efficient. You can 3D print exactly the material that you need in a specific part of the building, and you will make it perform much more efficiently. Before these new techniques can be put to use, they first need to be proven to be strong and safe. Case in point: this bridge project aims to 3D print a usable steel bridge right in the centre of Amsterdam. Created using similar generative algorithms, the project has been held up while the company proves to the regulators that the design is structurally sound. The actual bridge now isn't slated to appear until next year. Techniques like these promise to spice up our city skylines, but it could still be a while before we see 3D printers on building sites. That was Steve. Now, earlier this
year we shot an entire programme in 360 degrees. To get those shots we had to use six GoPro cameras strapped together, and let me tell you, the postproduction was a nightmare. But since March more than a dozen much cheaper consumer cameras have gone on sale, so we wanted to see if they were any good, and we sent our top team on a mission: go to central Africa, see if the cameras can cope, and above all keep calm! It almost went to plan.


We're driving through Rwanda. I've come to shoot some of the highlights of this landlocked country in 360, including a beach... We are close to the border with Congo at Rwanda's very own lake. I found my way to the beach and I have to try this first of all. This has two 180-degree cameras that are stuck together on the device. It is almost too easy to use and super quick. We actually aren't here to shoot the beach, we are here to capture something quite special. Meet some of this acrobatic squad, who have taken an interest in my new camera. I'm not sure this is a good idea. It stitches the two 180-degree shots together really well, with a few aberrations near the edges of each lens. There is no post, so as soon as it is shot you can watch it back or share it. Time to try something different. We are leaving the beach and on our way to the mountains. It
is supposed to be a beautiful journey, so we will use this camera to try to capture the beauty of the Rwandan countryside. Dashboard cameras are typically used to record any accident that might happen, but we made use of this super HD wide-angle dash cam as the perfect scenery camera; each file has its GPS information attached. Before we set off, we set up another 360 camera just in case we spotted a filming opportunity. The LG 360 Cam is the cheapest of the four we brought with us. It takes a 200-degree shot, two of them, which are then stitched together.


We arrived at the volcano mountains, ready for some unexpected guests. Unlike the Insta360, the LG cam connects wirelessly to your phone, so you can leave it in the middle of the action and then sit back and watch. The picture wasn't as crisp and colourful as the Insta360. The camera is lightweight, and the built-in battery didn't last long. But the three microphones offered good surround sound, something you will appreciate more if you watch your movies through a VR headset. As the light
faded, we decided to prepare the serious kit that we would be using to film with high up on the mountain early the next morning. I brought the 360fly, which looks like a golf ball with an eye. It is a camera with a 240-degree superwide lens. That means there is no stitching together of shots, and that should mean a smooth and clean picture. She used the Kodak dual-camera setup. The two cameras need to be synchronised, so they are started by a remote-control watch so they record at the same time. The images from the two cameras need to be stitched together later with Kodak software; if the stitching works well we should get winning results.

We'd been told Rwanda was stunning, so we decided to trek 3,000 metres up to take a look. A fellow adventurer kindly agreed to be our cameraman, which means we strapped the golf ball to his head, and it soon became apparent what the limitation of his single-lens camera was: a great, murky black pit at the bottom of the picture. Once in the jungle it looked awful. To be fair, it can be cropped out later, leaving a better view that is actually 360 degrees horizontally, but you can't look down. The superwide angle made everything seem far away. Anything close up looked great, but the sound quality was ruined. As we trudged through the undergrowth, we decided it was time to swap over to the Kodak. It was then the adventure really took off. The air got thinner, and this camera looked like it would capture anything we came across. Or anything that came across us.


By having two super high-definition cameras, we weren't just able to capture these incredible creatures wherever they went, we also had the resolution to zoom in as well. On the downside, the two cameras didn't automatically stitch well together. After fiddling with it using Kodak's own software, we decided one shot was running behind the other. After a calculated tweak we got this much better result. The picture quality was the best of the bunch. 360 cameras can allow you to capture everything in one go, but finer details still elude even the best of them, meaning it will still be a while before you feel like you're right there.

That was Dan Simmons, clearly angling to be the 360 David Attenborough. That's it from the Design Museum in London. Next week, it is the Click Christmas party, so be prepared for, well, anything! Plus a look back at our best bits of 2016. In the meantime, we live on Twitter. Thanks for watching!

