Series in which Sophie Raworth reveals how household products are tested, putting the makers' claims on trial and showing how to get the best value for money.
Take a look around your home.
Can you be sure that every appliance is safe?
Is everything a company tells you about a product true?
And are you getting the best value for your money?
With the help of the country's top experts, we're going to see
what it takes to test the household products we use every day.
We'll discover how they're pushed to their limits...
..we'll put the makers' claims on trial...
..and show you how to make your money go further.
You'll find these products in any ordinary house,
but this is no ordinary house,
and no ordinary street.
This is the Watchdog Test House.
Hello. We're deep inside one of Britain's leading science centres.
Here at the Building Research Establishment, some of the products
and materials that we use every day are put to the test
to make sure that they're safe,
environmentally friendly and that they don't fall apart.
Coming up on today's programme,
smoke alarms. A staggering one in four fail to go off in fires
attended by the fire brigade.
The smoke was coming out of the loft hatch quite severely,
and the smoke alarms didn't go off at all.
How do manufacturers try to ensure they're going to be fail-safe?
Fitness trackers - they claim to count calories and steps.
They're a fantastic motivation point. Track your calories
every day and try and improve on it day after day, run after run.
But how accurate are they? We put three products through their paces.
And exploding oven doors - no stranger to Watchdog...
..but aren't they supposed to be made of toughened glass?
Smoke alarms - the best way to protect you
and your home against the threat of fire.
The good news? About 85% of households have them installed.
The bad news? They don't always go off.
The upstairs of the house was full of smoke.
I couldn't see where I was going.
Darren King was spending a Sunday afternoon at home with his two
children when his house caught fire.
My two sons were out the front playing,
and my eldest son came running in, just to say, "Daddy, phone 999 -
"there's smoke coming out the roof."
When Darren went outside, he soon realised the fire was
actually inside the walls of the house.
The fire was actually started down in the bottom corner
of the threshold.
So I tried to put that out, and then when I came inside,
I put my hand on here and noticed there was a lot of heat
and I could hear crackling.
I realised at that point then that the fire was going all
the way up the cavity wall.
Before long, the flames had reached the roof.
That's how high the fire actually came up from the ground floor
and it's started burning through the wooden framework
of the actual house.
Darren called 999, and by the time the fire brigade had put
the flames out, the house was full of smoke,
but the smoke alarm was faulty and it never went off.
Luckily the fire was in the afternoon and not in the evening,
because if we'd been in bed, we wouldn't have noticed.
The smoke alarms wouldn't have gone off,
and we may not have got out of the house alive.
Fire brigade statistics show that in more than one in four
of the house fires they attend where a smoke alarm was present,
it failed to go off.
In Darren's case, the alarms were faulty,
but there can be plenty of other reasons too.
Smoke alarms have been going off, creating a false alarm,
and people have actually taken the batteries out
because of the nuisance factor, and forgot to put them back.
There are also issues with inadequate siting within the homes.
Smoke alarms must be sited where they can give adequate coverage
and protection, so they can be heard.
But of course that probably won't mean that it covers
every room within a house.
You also need to think about which type of alarm you buy.
Broadly, there are two types - ionisation and optical,
and fire and rescue services would recommend the fitting
of optical-type detectors in all but specific circumstances.
That's because optical detectors are better at detecting
a broader range of fires.
Either way, you're better off with an alarm than without,
as you're four times as likely to die in a house fire
if you don't have a smoke alarm.
And to make sure they operate effectively, every model
on the market has to be thoroughly tested before going on sale.
Later in the programme, we'll be at the British Standards Institution,
to find out how.
Now, fitness trackers.
They monitor your activity throughout the day
and tell you how many calories you burn.
They're marketed as the smartest way to stay active.
But just how accurate are they?
Well, Sophie, if you're anything like me and can't afford your own
personal trainer to keep on top of your exercise regime, you may
want to turn to one of these nifty little gadgets for help.
Just ask this lot.
I just tell it I'm going running,
and it'll basically keep track of how far I've gone,
the speed that I'm going,
and then it gives me a breakdown of my run at the end of the run.
If you're the sort of person that likes to keep on top
of things, see your progress and so on,
I think these devices are excellent.
They're a fantastic motivation point. Track your calories
every day and try and improve on it day after day, run after run.
But just how reliable are these devices?
To find out, we've come to Brunel University, where,
with the help of Richard Godfrey - a lecturer in sports physiology,
Tom - a triathlete...
Hi, Tom. Nice to see you. I'm Richard.
..and Chris - the lab technician,
we're going to put two of their many claims to the test -
calories burnt and the number of steps taken.
Tom will be wearing three products chosen from the UK's
leading on-line retailers -
the cheapest we could find, the V-fit WSG pedometer,
costing just over £6,
a mid-range product, the Fitbit Zip, costing £49.99,
and the top-of-the-range Nike+ FuelBand
which costs £129 -
all of which claim to monitor both calories and steps.
First up, calories.
Chris inputs Tom's vital statistics into all three devices
according to the manufacturers' instructions.
Then it's time for the work-out.
Three, two, one, go.
He's first of all going to do ten minutes of walking
at an easy pace, wearing the devices and having
oxygen consumption measured at the same time.
He'll then have a bit of a break before we get him to do
another ten minutes, this time at a higher intensity, just a jog,
slow jog, and then he'll have another break,
after which we'll do another ten minutes, this time at a faster jog.
As Tom exercises, his muscles use oxygen to burn fuel - or calories.
The more oxygen consumed, the more calories burned.
Therefore, by tracking Tom's oxygen consumption
throughout his work-out, Richard will have a scientifically accurate
measure of calorie expenditure.
He can then compare that to what our three devices say.
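The lab's method rests on a simple conversion, which can be sketched in a few lines of Python. The figure of roughly 5 kcal burned per litre of oxygen consumed is a standard physiological approximation, not one stated in the programme, so treat the numbers as illustrative.

```python
# Rough sketch: estimating calories burned from oxygen consumption.
# Assumes ~5 kcal per litre of O2 consumed - a standard physiological
# approximation, not a figure quoted in the programme.

KCAL_PER_LITRE_O2 = 5.0

def calories_from_oxygen(litres_o2: float) -> float:
    """Estimate energy expenditure from total oxygen consumed."""
    return litres_o2 * KCAL_PER_LITRE_O2

# Example: if Tom consumed roughly 50 litres of O2 over his work-out,
# the lab's estimate of his calorie-burn would be about 250 kcal.
print(calories_from_oxygen(50.0))  # 250.0
```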
Five, four, three, two, one, stop.
With the work-out complete
and the calorie data captured from all three devices,
it's on to the second part of the test...
There we go. That's got to be better.
For this, Tom moves onto another crucial bit of scientific kit -
the fire escape.
Government guidelines recommend that you take
around 10,000 steps in a normal day.
That's roughly the equivalent of a five-mile walk.
So having an accurate idea of how many you've taken is important.
We know that it takes exactly 152 steps to go to the bottom
and back to the top.
What we don't know is how many steps our devices will think he's taken.
So just to make sure we're accurate, we do the test twice.
With the tests now complete,
it's time for Richard to crunch the numbers.
And we'll be bringing you the results later in the programme.
Next, exploding oven doors -
a sudden bang, with glass spraying out across the kitchen floor.
It's enough to give anyone a fright.
But why does this happen when oven doors are made from toughened glass?
A question for Lynn Faulds Wood.
ARCHIVE: 'Welcome to Watchdog. In tonight's programme...
'All these people have written to us.'
In the 1980s, oven doors made of glass were still a bit of a novelty.
Now, you might not have noticed, but over the past few years,
there's been a revolution in the way our cookers are designed.
But by 1988, hundreds of Watchdog viewers were complaining
that so-called toughened glass in their oven doors
had shattered without warning.
There was a tremendous crashing noise. I turned round to find
that the front glass of the cooker had exploded out, and he was
actually standing in all these small pieces of very hot glass.
Imagine that happening in your kitchen.
And we discovered it was down to the way the glass had been
fitted into the oven door.
We go to the bottom oven, the glass is held in with these clips and
there's no seal underneath, so it's pressing metal on the glass on both
sides. Finally, if I shut the door, I think you'll see the problem.
That's not good enough -
it's stressing an already badly fixed door.
With criticism like that, no wonder manufacturers were persuaded
to spend millions improving door seals, hinges
and fittings to reduce the stress on the surface of the glass.
Philips even asked me to fly to their Italian factory
to see the changes they'd introduced.
The slot here that the glass fits into, before it was just two metal
pieces. Now it's a continuous metal piece all the way along. Much better.
But what is it about toughened glass that makes it prone to shattering
so spectacularly if fitted badly to the oven door?
Toughened glass is used in all sorts of modern inventions,
from patio doors to windows in your car,
but its origins date all the way back to the 17th century.
Prince Rupert of Bavaria came across the phenomenon by accident.
He found that dropping hot molten glass into cold water formed tadpole-shaped glass droplets.
These "Rupert drops", as they became known,
have remarkable properties.
CLINK The fat end is surprisingly strong.
As you can see, it has survived the impact.
The thin end, however, shatters
at the slightest touch.
The drop has disappeared. It has completely shattered.
Why? When the hot glass is plunged into the cold water,
the outside hardens really quickly.
The glass on the inside, though, cools much more slowly, and like
a coiled spring, it pushes against the already hardened outside layer.
It's this tension between the inner glass pushing out against
the already hardened outside surface that makes the droplet so strong.
But if you damage that outside surface,
the inner tension is released,
and the droplet explodes into tiny, harmless fragments.
And that's exactly the principle on which toughened glass is based.
The glass is taken up to about 700 degrees
and cooled very quickly with air jets.
This has the same effect, where you get compressive layers
under the surface and tension in the centre.
At any point, if any scratch goes into that compressive layer,
the glass will break.
But it was its ability to be both strong and also break into harmless
fragments that caught the attention of the car industry.
This is what can happen with a windscreen
made of toughened safety glass.
As more cars came onto the roads in the '20s and '30s,
the numbers of accidents increased,
and with many injuries being caused by the windows breaking
into razor sharp pieces, toughened glass seemed the perfect solution.
The first British Standard for this glass in cars
was introduced in 1939.
Demand increased in the 1960s, as regulations were introduced actually requiring it
to be used in cars and in certain windows in buildings.
Today, wherever it's used, to meet current European standards,
toughened glass needs to be strong enough
to withstand the force of anything that might be thrown at it,
and when I say anything...
The weight of it, being 50kg,
is very similar to a child running into a window, so it will
demonstrate that that window or that patio door can survive that impact.
But again, despite its vast strength,
compromise that outer layer with a couple of scratches
and it becomes extremely vulnerable
and collapses into those harmless little cubes or dice.
So looking back to those poorly designed oven doors,
those exposed metal rims and clips that were
pressing on the toughened glass were in fact compromising
the surface in the same way as that scratch.
And the result?
Although the design of oven doors has much improved,
toughened glass will always have the potential to become
compromised. But whether it's been scratched or chipped during
manufacture, or even as a result of aggressive cleaning...
At least you know with toughened glass
that if it does shatter, it won't go into long, sharp shards
but rather into something safer like this.
If you're thinking about buying a car, there's certainly
an awful lot to choose from,
so where do you start, and is it ever good value to buy a new one?
Well, Emma Butcher from What Car? is with us now. Is it?
Well, surprisingly, sometimes it can, yes.
Most people would think that buying new didn't represent good value
at all over buying used, because, of course,
when you drive away from the forecourt you get that massive
hit of depreciation that can be up to 20% of the cost of the car.
However, there are some amazing discounts out there at the moment,
because manufacturers and car dealers,
they really want you to buy a new car,
and they're incentivising that, so that can be anything from a finance
package with ultra-low interest rates, sometimes even 0% interest,
to throwing in some free insurance or free servicing.
So, potentially, you could be getting a better deal on a new car.
What about the claims that car manufacturers make?
Is it easy to compare them?
Well, it's a bit of a lottery. You would think that you could
benchmark them, but our research has found no,
so those MPG figures that you see
advertised basically come from Government-mandated laboratory tests
which every car-maker must put its car through before they go on sale,
but because they're conducted in a lab, that doesn't really reflect
what happens when you're out in real traffic conditions.
At What Car? we've tested about 400 different engines now,
and we take cars out on the road, in real traffic,
and work out their true MPG based on their tailpipe emissions,
and we found that on average there can be
a 17% shortfall from what is actually claimed that a car can do
to what it actually achieves in real life.
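The 17% average shortfall quoted above can be applied directly to an advertised figure. A minimal sketch, where the function name and the example MPG figure are illustrative rather than from the programme:

```python
# Sketch of the What Car? finding: real-world MPG falls short of the
# official lab figure by about 17% on average. The example figure of
# 60 MPG is illustrative, not from the programme.

AVERAGE_SHORTFALL = 0.17  # 17% average gap reported by What Car?

def expected_real_world_mpg(claimed_mpg: float,
                            shortfall: float = AVERAGE_SHORTFALL) -> float:
    """Estimate real-world MPG from the advertised lab-test figure."""
    return claimed_mpg * (1.0 - shortfall)

# A car advertised at 60 MPG might manage only around 49.8 MPG on the road.
print(round(expected_real_world_mpg(60.0), 1))  # 49.8
```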
So when you're buying a new car, what should you look out for?
First of all, make sure that you take the car out for a test drive.
Don't ever buy without doing that.
It's a really good idea to take someone with you who can
distract the sales person so you can concentrate on the drive,
and, similarly, you might want to leave the kids at home
if you think they'll distract you as well, although do take
along the buggies and the car seats to see what kind of a fit they make.
-Emma, good advice. Thank you very much.
Earlier, we put three fitness trackers through their paces.
From the cheapest to the most expensive, they all claimed
to count the number of calories burnt and the number of steps taken.
So how did they perform? Let's find out.
First up, counting steps.
We asked triathlete Tom to climb up and down
this 152-step fire escape twice.
The actual number of steps is 304,
but how many did our devices record?
Starting with our most expensive product, the Nike+ FuelBand.
It didn't do quite so well, counting 281 steps -
23 under the actual number.
Quite surprising for a device as expensive as it is.
The Fitbit Zip, our mid-range product, costing £49.99,
did much better - it was only out by two steps.
As for our cheapest product, at just over £6, the V-fit WSG pedometer...
In terms of counting steps, the pedometer did really well indeed.
304 steps were taken and it counted 305,
so that's really good for a relatively cheap device.
Yes, our cheapest device actually performed the best.
It miscounted by just one step over the course of the two tests.
The pedometer's quite a simple device
and it's therefore best at measuring simple linear movement,
so simply counting steps is what it does very well.
So if it's just counting steps you're looking for,
a cheaper product may well be perfectly adequate.
But how good were the devices at measuring calorie-burn?
Now, remember, during his work-out Richard asked Tom to walk,
jog and run. He monitored Tom's oxygen consumption
throughout as a measure of fuel consumption
and therefore a more accurate record of calorie-burn.
He then compared that to what the three devices said.
Three, two, one, stop.
In terms of estimated calorie expenditure,
the Nike+ FuelBand, the most expensive device, did the best.
With our laboratory-based kit, we measured 248 calories.
The Nike FuelBand measured 310, so 62 more, which is actually very good
for a device of that type.
Our cheapest product, the V-fit WSG pedometer, struggled.
Whilst our laboratory based kit measured 248 calories, the V-fit
only measured 125 - a difference of 123 calories.
But it was the mid-range product, the Fitbit Zip, that performed the worst.
It measured 432 - an overestimation of 184 calories.
None of the devices was completely accurate.
However, the Fitbit Zip was the worst one,
overestimating by almost half a chocolate bar.
So if you used that device, you'd actually think you'd expended
a lot more energy than you really had.
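The calorie errors quoted above, worked through explicitly. The figures are those reported in the programme; tabulating them this way is just an illustrative device.

```python
# Calorie-count errors for the three devices against the lab reference.
# All figures are those reported in the programme.

LAB_REFERENCE_KCAL = 248  # measured from Tom's oxygen consumption

device_readings = {
    "Nike+ FuelBand": 310,
    "V-fit WSG pedometer": 125,
    "Fitbit Zip": 432,
}

# Positive error = overestimate, negative = underestimate.
errors = {name: reading - LAB_REFERENCE_KCAL
          for name, reading in device_readings.items()}

print(errors)
# {'Nike+ FuelBand': 62, 'V-fit WSG pedometer': -123, 'Fitbit Zip': 184}
```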
But according to Richard, it might not be all bad news.
Well, measuring calories is very useful, but being consistent
is also important, and the device has to measure consistently.
So you have to know that it's you that's improved in any change
that you see, and not some quirk of the device.
So, time for one final test - consistency.
We had Tom repeat the work-out to see if the devices gave
the same measurements for calorie-burn the second time around.
First up, our cheapest product, the V-fit WSG pedometer.
Although again it wasn't very accurate,
it was at least consistent, as it gave an almost identical
incorrect calorie reading on our second test.
Next, our mid-range product, the Fitbit.
Not only was it the least accurate
when counting calories, it was also the least consistent.
For both those work-outs, you'd expect to see
the same results - you want to see reproducibility.
That didn't happen. There was a big, big difference,
which means it's not a particularly reliable device
and it doesn't assess performance particularly well.
As for the Nike+ FuelBand, our most expensive product, not only was it
the most accurate, it was also the most consistent -
at least in our test.
The Nike FuelBand did much better. It showed greater consistency,
and that's exactly what we want in devices of this type.
That's the name of the game.
Fitbit told us they are surprised
and disappointed with the results of our test, which they say
do not represent the normal experience of their customers.
They say the tracker has undergone thousands of scientific tests,
should be at least 95% accurate,
and they do everything to ensure customer experience is positive.
But whatever the device you choose to wear,
according to Richard, there is at least one benefit they all share.
With these devices, they're incredibly motivating.
You can upload information, you can share it with family and friends,
you can introduce elements of competition -
all of this means these devices are motivating.
So maybe these little devices CAN be your own personal trainer -
just as long as you're not always expecting a precision calorie-count
along the way.
Back to smoke alarms now.
You're four times more likely to die in a house fire
if there is no working smoke alarm.
So what do manufacturers do to try to ensure they're fail-safe?
Test them, of course.
This is the British Standards Institution in Hemel Hempstead.
Here they don't just test products - they help write the rule book,
known as the Standards.
BSI was established in 1901.
At the time, it was the world's first national standards body,
and remains today the UK national standards body.
For a smoke alarm to meet current safety standards,
it has to pass more than 40 tests.
It's critical that these products do actually
perform the function that they're designed for.
The last thing you want is to have a smoke alarm fitted
in your house which doesn't go off in the event of a fire.
Today we're going to be demonstrating some of those tests,
using a mid-range smoke alarm currently on the market
for about £20.
We're starting with the most important piece of machinery -
the smoke tunnel.
We used to actually create real smoke for the tunnel,
using this heated bar, here.
Now we use atomised liquid paraffin.
It's a much more controllable way of producing simulated smoke.
So, in goes the alarm,
then the smoke is blown gently around the tunnel,
not that you'll be able to see it.
It's very, very small quantities.
So we're talking literally parts per million, here.
So when the smoke alarms are actually triggered, you wouldn't
be able to see with the human eye the smoke in the atmosphere.
The thickness of the smoke is gradually increased in order
to measure exactly how much is needed to set the alarm off.
What we're looking for here is to make sure it doesn't go off
at too sensitive a level, otherwise, obviously, it would
be going off at all hours of the night, and giving false readings.
Effectively, it has to trigger at a higher point than 0.2 dB per metre,
and it triggered at 0.87, which is technically a pass.
What we have to do now is take five more measurements,
and then we'll assess the results to ensure
the ratio between the highest and lowest trigger point isn't too wide.
Effectively, what we're looking for here is to ensure that the product,
once it's operated once,
will operate again in the future at the same levels.
Checking the consistency of the alarm in this way
is called repeatability.
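The repeatability check described above boils down to comparing the spread of trigger thresholds. A minimal sketch: the 1.6 ratio limit and the sample readings are assumed placeholders, since the programme doesn't state the actual figures.

```python
# Sketch of the repeatability check: the alarm is triggered several
# times, and the ratio of the highest to the lowest trigger point must
# not be too wide. The 1.6 limit is an assumed placeholder - the
# programme doesn't state the actual permitted ratio.

def is_repeatable(trigger_points: list[float], max_ratio: float = 1.6) -> bool:
    """Pass if the spread between trigger thresholds is acceptably narrow."""
    return max(trigger_points) / min(trigger_points) <= max_ratio

# Six simulated trigger measurements in dB per metre (illustrative values):
readings = [0.87, 0.90, 0.85, 0.92, 0.88, 0.89]
print(is_repeatable(readings))  # True
```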
Next, they need to find out whether it has any weak spots.
This box is the ionisation chamber,
so this is the actual detector of the smoke.
We want to ensure that regardless of where the smoke's coming in,
it will reach that chamber and the alarm will trigger.
And you can see that if the smoke enters at this point, it has
a lot of electronic components that it has to negotiate before it
will actually reach that chamber and detect smoke.
We'll put the product back into the smoke tunnel,
and we'll turn it round at 45 degree angles.
Once we've identified the worst point of access for the smoke,
we'll carry out the rest of the tests in that particular orientation
to ensure it's the worst-case scenario.
Once you know that an individual alarm works,
and in all positions, you need to be sure each one manufactured
is roughly the same. The way to do that?
Test another 20. And if they're sufficiently consistent,
it's time for extreme conditions.
What we're doing now is adjusting the temperature of the smoke tunnel.
We're going to do two tests in here.
We'll do a cold operational test at nought degrees C,
and a dry heat test at 55 degrees C.
If you have a product which is installed in a house and it's
a particularly cold night, we want to make sure that it will activate
when it's supposed to as well as it would do on a very hot, sunny day.
So once you know the alarm can sniff out smoke in all manner
of conditions, attention turns to sound.
And for that you'll need some peace and quiet.
This is one of our special test chambers.
It's called an anechoic chamber.
We're using it to measure the volume of the smoke alarm in this case,
and what we don't want to do is to have the sound
bouncing off the walls back to our measuring equipment.
We only want to hear it once, as it comes out of the actual alarm itself.
You can see the construction of the room here.
It's covered with these special foam cones,
which are designed to trap and absorb the sound.
It's a very strange feeling with the door shut,
because you literally can't hear anything.
You can hear the pulse of your blood in your ears,
and when you talk it sounds as though you're actually talking in your head,
because there's literally no echo at all.
The perfect environment for accurately measuring noise.
What we're looking for is a sound level in excess of 85 decibels
when measured at three metres from the alarm point.
What we can't do is exceed 110 decibels.
Basically, those parameters are set so that the alarm is loud enough
to be heard when it needs to be, but not so loud
that it becomes debilitating.
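Those sound-level parameters make for a simple pass/fail band: louder than 85 decibels at three metres, but no more than 110. A minimal sketch of that check:

```python
# The sound-level pass band described above: the alarm must exceed
# 85 dB at three metres but must not exceed 110 dB.

MIN_DB = 85.0   # must be louder than this at 3 metres
MAX_DB = 110.0  # must not exceed this

def sound_level_passes(measured_db: float) -> bool:
    """True if the alarm's output at 3 m sits inside the allowed band."""
    return measured_db > MIN_DB and measured_db <= MAX_DB

print(sound_level_passes(95.0))  # True
print(sound_level_passes(80.0))  # False
```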
So, your smoke alarm detects smoke and makes the right amount of noise.
One last thing - is it robust?
Yes, that's our alarm being hit with a hammer.
What we're trying to do here is to replicate the kind of impact that
it might realistically experience during the course of its lifetime -
from the point of manufacture, transportation, installation.
You might drop it on the floor... All these things we take into
consideration, and we're trying to replicate the worst kind of impact
that it's likely to experience.
A single blow measuring 1.6 joules, to be precise.
And if it survives, you can always try shaking it to bits.
What we're doing is subjecting it to different
cycles of vibration in three different axes.
So forward and backwards, side to side, and up and down.
At the end of the vibration test, we'll do a visual inspection.
We need to make sure that the cover is still attached,
that the electronic components are still firmly fixed to the board.
And then what we'll do is put it back together,
put it into the smoke tunnel, and subject it to the same smoke tests
that we did previously, to make sure that it still activates
in accordance with the manufacturer's specification.
If you want more information on the safety of products in your home,
you can go to our website:
That's all for today. Thanks for watching.
Lynn Faulds Wood looks at the safety of products in the home and the Watchdog campaigns that have been saving lives for more than 30 years.