Study Highlights Inaccuracy of Many Fitness Trackers

A study published in the Journal of the American Medical Association examined how accurately several fitness trackers and smartphone apps count steps. Participants walked 500 or 1,500 steps on treadmills, and each device's count was compared with the actual number of steps taken.

The study found that many fitness trackers are inaccurate, something I recently discussed, but its results raise a number of questions.

[Chart from the study: JAMA fitness tracker test results]

First, why would different apps on the iPhone show different results? All step-tracking apps rely on the same hardware: the iPhone's built-in motion co-processor.

Second, they give a step count for the Nike+ FuelBand. But that device doesn't track steps; it uses "NikeFuel" as its metric.

Finally, why did they only test these devices on treadmills? That, in my tests, is where they are the most accurate. Fitness trackers need to be tested during everyday activities, because any device worn on the wrist counts steps when you make certain arm motions.
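To see why wrist-worn devices miscount, it helps to look at how step detection typically works: the tracker watches accelerometer readings for rhythmic peaks and counts each one as a step. The sketch below is a deliberately naive illustration of that idea, not any vendor's actual algorithm; the threshold and minimum-gap values are invented for the example. Any motion that shakes the sensor hard enough, walking or waving an arm, produces the same peaks.

```python
import math

def count_steps(magnitudes, threshold=10.8, min_gap=5):
    """Count steps as upward threshold crossings in a series of
    accelerometer magnitude samples (m/s^2), requiring a minimum
    number of samples between consecutive steps.

    Illustrative only: real trackers use more sophisticated
    filtering, but the core peak-counting idea is the same, which
    is why arm gestures on a wrist-worn device register as steps.
    """
    steps = 0
    last_step = -min_gap  # allow a step at the very start
    for i in range(1, len(magnitudes)):
        crossed_up = magnitudes[i - 1] < threshold <= magnitudes[i]
        if crossed_up and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps

# Synthetic walking signal: 20 strides as sine bumps around
# gravity (9.8 m/s^2), 25 samples per stride.
signal = [9.8 + 2.0 * math.sin(2 * math.pi * i / 25) for i in range(20 * 25)]
print(count_steps(signal))  # prints 20
```

Because the counter only sees magnitude peaks, a rhythmic arm swing of similar amplitude would be indistinguishable from this walking signal, which is exactly why treadmill-only testing flatters these devices.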

I note that the study confirms what I have seen (among the devices I've tested): that the Fitbit One (Amazon.com, Amazon UK) is the most accurate at counting steps. It may not be the sexiest device, but it certainly does what it claims.

Some people claim that the accuracy of these devices isn't that important, and that it's more useful to view them as recording trends. That's true if all you do is check your step count in order to be more active. But many people use these devices together with calorie counters, and the unreliability of the step count, and of other activity tracking, means that any calculation of calories burned is wrong, often by a very large percentage.

I think it’s a shame that so many of these devices are sold that are grossly inaccurate at the one metric that they claim to measure. Consumers should demand more than just estimates.

4 thoughts on “Study Highlights Inaccuracy of Many Fitness Trackers”

  1. I also think it is curious that the chart shows only the standard deviation. This seems like a situation where a user would be interested in “how inaccurate could a particular model be”. In other words, the extremes are of equal or greater interest. I’d also like to know if different units of the same model would show different results. The brief introduction legible without buying the article seems to say that only one of each item was used for the research.

  2. My guess for why different smartphone apps would count steps differently, despite using the same hardware, is because the app software differs in what it counts as a definition of a “step”.
