A study published in the Journal of the American Medical Association examined a number of fitness trackers and smartphones to see how accurately they count steps. Tests were carried out on treadmills, with participants walking 500 or 1,500 steps, and the devices' counts were compared with the actual number of steps the participants took.
The tests showed that many fitness trackers are inaccurate, something I discussed recently, but the results raise several questions.
First, why would different apps on the iPhone show different results? All apps that track steps do so using the built-in motion co-processor.
Second, the study gives a step count for the Nike Fuelband. But that device doesn’t track steps; it uses “Nike+ Fuel” as its metric.
Finally, why did they only test these devices on treadmills? That, in my tests, is where they are the most accurate. Fitness trackers need to be tested during everyday activities, because any device worn on the wrist registers steps when you make certain arm motions.
I note that the study confirms what I have seen (among the devices I’ve tested): that the FitBit One (Amazon.com, Amazon UK) is the most accurate at counting steps. It may not be the sexiest device, but it certainly does what it claims.
Some people claim that the accuracy of these devices isn’t that important; that it’s more useful to view them as recording trends. This is true if all you do is check your step count in order to be more active. But many people use these devices together with calorie counters, and the unreliability of the step count – and of other activity tracking – means that any calculation of calories burned is wrong, often by a very large percentage.
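To see why this matters, here is a minimal sketch of how a step-count error carries straight through to a calorie estimate. The calories-per-step figure and the 20% undercount are illustrative assumptions, not numbers from the study; real apps also factor in weight, pace, and stride.

```python
# Hypothetical illustration: a calorie estimate derived from a step count
# inherits the step counter's error in full.

CALORIES_PER_STEP = 0.04  # placeholder value; real apps adjust per user


def estimated_calories(steps: int) -> float:
    """Naive calorie estimate proportional to the reported step count."""
    return steps * CALORIES_PER_STEP


actual_steps = 10_000
reported_steps = 8_000  # a tracker undercounting by 20%

true_cal = estimated_calories(actual_steps)
reported_cal = estimated_calories(reported_steps)
error_pct = 100 * (true_cal - reported_cal) / true_cal

print(f"Calorie estimate off by {error_pct:.0f}%")  # off by 20%
```

Because the formula is proportional, whatever percentage the tracker miscounts steps by is exactly the percentage by which the calorie figure is wrong.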
I think it’s a shame that so many of these devices are sold when they are grossly inaccurate at the one metric they claim to measure. Consumers should demand more than just estimates.