The team visit the US to report on how NASA and others plan to transform how we explore and experience space.
Coming up next, Click, followed by Newswatch.
We've long fantasised about the possibility
of life on other planets.
But it was only in 1995 that we actually found the first
planet outside of our solar system.
These exoplanets are hard to find.
Of course they are, they're relatively tiny.
And so far they've mainly been detected indirectly,
either by the incredibly slight dimming of a star's light
as the planet moves in front of it, or by the wobble of the star
caused by something orbiting it.
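The two indirect methods described above can be sanity-checked with some back-of-the-envelope arithmetic. This is an illustrative sketch, not anything from the programme; the constants are standard astronomical values.

```python
# Rough scale of the two indirect detection methods: transit dimming
# and the radial-velocity 'wobble'. Figures are for illustration only.

# Transit method: the fractional dimming equals the ratio of disc areas.
R_SUN = 696_000.0   # solar radius, km
R_EARTH = 6_371.0   # Earth radius, km

transit_depth = (R_EARTH / R_SUN) ** 2
print(f"Earth transiting the Sun dims it by {transit_depth:.4%}")
# roughly 0.008% - which is why the dimming is 'incredibly slight'

# Wobble method: star and planet orbit their common centre of mass,
# so the star moves with speed v_star = v_planet * (m_planet / m_star).
M_SUN = 1.989e30    # kg
M_EARTH = 5.972e24  # kg
V_EARTH = 29_780.0  # Earth's orbital speed, m/s

wobble = V_EARTH * (M_EARTH / M_SUN)
print(f"The Sun's reflex wobble due to Earth is about {wobble:.2f} m/s")
```

An Earth-sized planet produces a dimming of less than one part in ten thousand and a stellar wobble of under a tenth of a metre per second, which is why detections to date have favoured larger planets on closer orbits.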
In the last 20 years we've detected about 2000 exoplanets,
but we haven't actually seen many at all.
And this is why.
Well, the planets are very, very faint compared to their star,
and they're very close to it.
The kind of planets where we might find life, an Earth-like planet
orbiting a star, would be 10 billion times fainter than the star.
But if you can see the planets, you can start to look for evidence
of life on their surfaces.
What you need is something to block out the light of a star.
What you need is a star shade.
Due to go into space in the middle of the next decade,
it is a crazy-sounding thing that can be flown in between a space
telescope and the star to precisely block out the star's light
and reveal any planets.
It'll be a few tens of metres in diameter, and in order to block
out just the light from that distant star, it'll need to be
about 40,000 kilometres away from the telescope.
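The geometry behind those numbers can be sketched in a few lines. The exact shade diameter is an assumption here (the piece only says "a few tens of metres"); the point is that at 40,000 km the shade subtends an angle comparable to the tiny star-planet separations a telescope needs to probe.

```python
import math

# Why the starshade must fly so far from the telescope.
shade_diameter_m = 34.0   # assumed; the piece says 'a few tens of metres'
separation_m = 40_000e3   # 40,000 km, as quoted

# Angle the shade subtends as seen from the telescope.
angle_rad = shade_diameter_m / separation_m
angle_mas = math.degrees(angle_rad) * 3600 * 1000  # milliarcseconds
print(f"The shade subtends about {angle_mas:.0f} milliarcseconds")

# For comparison: the Earth-Sun separation viewed from 10 parsecs away.
AU = 1.496e11   # metres
PC = 3.086e16   # metres
earth_sun_mas = math.degrees(AU / (10 * PC)) * 3600 * 1000
print(f"Earth-Sun separation at 10 parsecs: about {earth_sun_mas:.0f} mas")
```

A tens-of-metres disc held close to the telescope would blot out the planet along with the star; pushed 40,000 km away, its shadow covers only the star while light from a planet a tenth of an arcsecond away slips past.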
And this is not even the maddest part of the scheme.
See, there's a problem.
The star shade won't fit in a rocket.
And that's why a big part of the work being done here
at Nasa's Jet Propulsion Laboratory, in Pasadena, and the beautiful
solution they've come up with, is all about fitting the thing
into a tight space and then unfurling it once in space.
And the inspiration comes from origami.
It's really quite impressive.
At the end you can see how large an area you can fill with such
a small volume of material.
But this is only half of it, because you have petals
which come out here as well?
Yes, exactly. Oh, my goodness.
This cardboard model is the latest test to make sure the shade
can unfurl perfectly when it's all alone.
The flower shape blocks out the light better than a circle,
and those outer petals need to be made to an accuracy
of 50 to 100 microns.
If I may say, this sounds crazy!
This sounds like we want to spot some planets,
what are we going to do?
We're going to put a shade in space and we're going to fire
it 40,000 kms from the telescope?
That sounds insane.
Yeah, what's really cool is that you have this insane
concept of flying this
massive shade so far away, 40,000 kilometres from the telescope,
but once you start breaking it down into little problems,
you test and build a petal, you build the truss,
you build the shield, and you realise piece by piece
what engineering needs to go into that problem to solve it.
So we just break it down into little problems that we can solve
in a piecewise fashion.
Yeah, and isn't that a great motto for life?
Take an impossible problem and break it down into more possible chunks.
I love the fact that at JPL you can just wander into a random room
and it is called something like the Extreme Terrain Mobility lab.
That's what they're doing here.
They're making robots to cope with extreme terrain.
This is Axel, which is a robot with a pair of wheels that can be
lowered down cliffs.
And this is Fido and Athena.
These are the prototypes for the Mars rovers
Spirit and Opportunity.
Of course the point about robots is they can do things that humans
might want to do but in places that humans can't go.
All of these have fairly familiar designs, wheels here,
some robots have legs.
But Kate Russell has found one that looks like nothing
I have ever seen before.
In 2012 the world watched with bated breath as Nasa deployed
a rover on the surface of Mars using a sky crane.
This kind of science is incredibly expensive.
The rover weighed 900 kilograms, as much as a full-grown giraffe.
But the equipment required to land it gently had to be able to take
the weight of 32 giraffes.
It would have been much cheaper if Curiosity was lightweight,
came flat-packed and was sturdy enough just to be dropped
on the red planet's surface.
Meet Super Ball, a tensegrity robot in development at Nasa Ames.
This lightweight sphere-like matrix can be packed down flat,
taking up minimal space in a rocket and vastly reducing launch costs.
Because of the unique structure of this robot and the fact
that it can deform and reform itself and take massive impacts,
eventually Nasa will be able to literally throw it at the surface
of a planet and its scientific payload in the middle
will be protected.
Once deployed, Super Ball can handle much rougher terrain than a rover,
rolling right over obstacles and up and down hills.
Tendon wires connecting the struts spool in and out to create momentum,
in much the same way as flexing your muscles
moves your limbs.
If it bumps into anything solid, it'll just bounce back.
It should even be able to survive falling off a cliff.
The next step for Super Ball is to redesign the robot such
that it can actually survive at least a one-storey drop.
You can expect to see a system like this on an actual Nasa mission
probably in 15 or 20 years' time.
Over at JPL, they are working on limbed robots.
This research spawned from the DARPA Robotics Challenge, where teams
competed to create highly mobile and dextrous robots that can move,
explore and build things without human intervention.
The plan is for King Louis to be sent into space to build stuff,
guided by visual codes a bit like QR codes.
We have a structured environment.
We know what we are putting together so we put signposts
onto all the bits and pieces of the structure we are putting
together, that tell the robot a few things.
Most importantly, it tells the robot where those things
it is manipulating are in space, literally and figuratively,
so it can align itself better.
The codes will also include construction information
like which bits go together and how much torque to apply to a bolt.
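The kind of information such a marker might carry can be sketched as a small record combining pose with assembly instructions. None of these field names come from JPL; they are hypothetical, chosen to illustrate the three things the quote mentions: where the part is, which bits go together, and how much torque to apply.

```python
from dataclasses import dataclass

@dataclass
class AssemblyMarker:
    """Hypothetical payload for a QR-like construction fiducial."""
    part_id: str           # the structural piece this tag is fixed to
    mates_with: str        # the part it joins to ('which bits go together')
    pose_hint: tuple       # (x, y, z) of the tag in the structure's frame,
                           # so the robot can align itself
    bolt_torque_nm: float  # 'how much torque to apply to a bolt'

# A robot reading this tag learns everything it needs for one joint.
marker = AssemblyMarker(
    part_id="truss-segment-07",
    mates_with="node-B3",
    pose_hint=(1.2, 0.0, 3.4),
    bolt_torque_nm=25.0,
)
print(marker.part_id, "->", marker.mates_with)
```

Packing the instructions into the environment rather than the robot is what makes teams of simple, autonomous machines feasible: each robot only has to read tags and execute local steps.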
This will allow robots to work autonomously in teams,
building space stations or planetary habitats faster
and more economically than previously possible.
But Nasa hasn't completely given up on our four-wheeled space helpers.
Here we've tried to develop new kinds of robots
for future space exploration.
This robot, for example, is called K-Rex.
It's one of our main research robots that we develop and test here
in the Roverscape at Nasa Ames.
This is a large play area for robots, a proving ground
that we use to really try to develop things like navigation
or do the mission simulations.
So, the biggest question perhaps of the day for me,
can I drive K-Rex?
Let's have you do that.
Now lots of you think we Click reporters have the best jobs
in the world, but after spending a day at the roverscape testing
ground, I think there is another contender for that title.
I've had some really engaging virtual reality experiences.
One of them was simply set in an office, but it seems if you are
entering a VR world, you might as well go somewhere really
exciting, like space.
That's where Home: A VR Spacewalk takes you.
Inspired by Nasa's training programme, it aims to bring
a mission in space to the masses.
After getting used to your new surroundings, you undertake
an emergency mission.
Whilst enjoying views of Earth from afar, a friendly hand
from a fellow astronaut helps to get you on your way.
Ah, I can hold a hand.
I feel a strange sense of safety that there is another astronaut here.
The BBC commissioned the experience last year,
as its first steps into the world of virtual reality content.
We've taken all the storytelling power of the BBC and applied that
behind it, so there's a great script, a great narrative, and then
we've looked at all the cutting-edge explorations people are doing
around VR, in terms of bio-monitoring, haptic feedback and so on,
and tried to bring that into it as a massive piece of learning, really.
My preview here on the HTC Vive saw it set up with a chair providing
haptic feedback and a heart rate monitor which resulted
in my being sent back to base if readings went too high.
But apparently I'm very calm in space.
In March it will be released for Vive on Steam as well as Oculus.
Wow, this is incredible.
Oh, goodness! I feel most disorientated!
Wow, the depth of it I think was the thing
that was most surprising.
You really got a sense of being up high, seeing things
really, really far away.
It took a while to get to grips with what I was meant to be doing,
but just the fact that I was moving around within space
was quite incredible.
Whilst it wasn't possible to create a sense of weightlessness,
the pictures were amazing, but obviously, I can't vouch for how
true to life they are.