How accurate is GPS for measuring speed?

28 November 2010


Hi Dr. Smith
Would you be so kind to clarify this relating to GPS speed monitor on next week's show with Redi?

My thoughts are as follows, but they are ever so slightly incoherent.

In South Africa a good signal will allow a WAAS-enabled GPS to be accurate to around 3 m on average in the horizontal plane, with variation of up to 15 m; the vertical error is usually about treble that, so 9 to 45 m.
Most outdoors-type GPS units, such as Garmins, "take a point" every second (which may or may not be saved as a track or waypoint). So if you use one to monitor your speed at around 120 km/h, where you travel roughly 33.3 m/s, the horizontal error could be up to 30 m, and mostly the distance measured each second could vary from -6 to +6 m around 33.3 m, which seems very inaccurate, contrary to the "evidence" on display.
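The arithmetic behind that worry can be sketched quickly (the 3 m per-fix error and 1 s fix interval are the letter's own figures, used here illustratively):

```python
# Rough worst-case arithmetic for the scenario above:
# ~3 m horizontal error per fix, one fix per second, 120 km/h.
speed_kmh = 120.0
dt = 1.0                    # seconds between fixes
speed_ms = speed_kmh / 3.6  # ~33.3 m travelled per fix
pos_error = 3.0             # metres, typical horizontal error per fix

# If two consecutive fixes each err by up to 3 m in opposite
# directions, the measured 1-second distance is off by up to 6 m.
worst_case = 2 * pos_error
print(f"true distance per fix: {speed_ms:.1f} m")
print(f"worst-case distance error: +/-{worst_case:.0f} m "
      f"(+/-{100 * worst_case / speed_ms:.0f}% of the speed)")
```

That is roughly an 18% speed error over a single one-second interval, which is the apparent contradiction the letter asks about.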

So what is the reason for the accuracy, pure averaging?

Or does the error follow a pattern? I.e., will two points taken directly after each other err in more or less the same direction and to the same extent? I ask this because there is apparently still some error introduced by governments, and this may cause a "pattern-following" error; or will the ionospheric, atmospheric and satellite delays for two consecutive measurements be roughly the same?
I'm not even considering the vertical plane, as my trig is non-existent, but you may.
BTW, car manufacturers err on the side of caution for speedometers but are spot-on for odometers, unless you change your tyre circumference or gearing.
Andre Grobler


Dave - The two devices are measuring speed in very different ways. The speedometer is measuring the number of times your wheels turn every second or every minute, and if you know the circumference of your wheels you can work out how fast you're going.

There can be errors in that because the tyres wear down, which changes the circumference of your wheels. Quite often, I think, they build speedometers to slightly over-read, which is quite good because that way you get fewer speeding tickets.
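The wheel-based calculation Dave describes can be sketched as follows; the circumference, wear and rotation-rate figures are illustrative, not real vehicle data:

```python
def speed_kmh(wheel_rpm, circumference_m):
    # distance per minute = revolutions * circumference; convert to km/h
    return wheel_rpm * circumference_m * 60 / 1000

new_tyre = 2.00    # metres, illustrative circumference when new
worn_tyre = 1.96   # the same tyre worn down by ~2%

rpm = 1000         # same wheel rotation rate in both cases
print(speed_kmh(rpm, new_tyre))   # 120.0 km/h actual with new tyres
print(speed_kmh(rpm, worn_tyre))  # 117.6 km/h actual with worn tyres
# A speedometer calibrated for the new tyres would still show 120,
# so tyre wear makes it over-read by roughly the wear percentage.
```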

Old-fashioned speedometers also weren't as accurate as modern ones. They weren't computerised and they just tended to be less sensitive at high speeds.

The GPS is basically measuring your position repeatedly and measuring how far you move over a certain period of time and then dividing that distance by the time it's averaged over, and then that will give your speed.
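The position-differencing method Dave describes amounts to the following sketch (positions in local metres, purely illustrative values):

```python
import math

def speed_from_fixes(p1, p2, dt):
    """Naive speed estimate: distance between two position fixes / time."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy) / dt

# Two fixes one second apart:
v = speed_from_fixes((0.0, 0.0), (33.3, 0.0), 1.0)
print(f"{v:.1f} m/s = {v * 3.6:.0f} km/h")  # 33.3 m/s = 120 km/h
```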

If you're stationary, the GPS will quite often still report a speed, because the errors in position can be a few metres; so a GPS is not at all accurate at giving a speed when you're going very, very slowly. The accuracy of the speed will also depend on how long it's averaging over to get the speed.

So if it's averaging over 10 minutes, the GPS will be more accurate; if it's averaging over 2 seconds, the speedometer certainly will be.
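The effect of the averaging window can be shown with a small simulation; the 3 m per-fix error is the figure from the letter, and the Gaussian error model is an assumption made for illustration:

```python
import random

random.seed(42)
true_speed = 33.3   # m/s, ~120 km/h
sigma = 3.0         # assumed per-fix position error, metres

def speed_error(window_s):
    # Distance/time only involves the first and last fixes, so the
    # same position error is spread over the whole averaging window.
    start = random.gauss(0.0, sigma)
    end = true_speed * window_s + random.gauss(0.0, sigma)
    return abs((end - start) / window_s - true_speed)

for window in (2, 60, 600):
    errs = [speed_error(window) for _ in range(10000)]
    print(f"{window:4d} s window: mean speed error "
          f"{sum(errs) / len(errs):.3f} m/s")
```

The error shrinks in proportion to the window length, which is why a long average is accurate but a 2-second one is not.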


GPS receivers do not measure speed by differencing positions over time. They could, but it would be grossly inaccurate. The speed is actually measured as a side effect of the solution to another problem: the satellites are moving, causing a Doppler shift in the received radio frequency. The receiver has to adjust for each satellite's motion - but any remaining discrepancy in the received frequency must be due to the movement of the GPS receiver itself. Thereby, the GPS ends up knowing exactly how fast it is moving, to an accuracy much greater than that attainable from position differences. Try searching for GPS Doppler.
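The Doppler relation the comment points at is just v = c * Δf / f. Using the real GPS L1 carrier frequency, and a residual shift value chosen purely to illustrate highway speed:

```python
C = 299_792_458.0    # speed of light, m/s
F_L1 = 1_575.42e6    # GPS L1 carrier frequency, Hz

def radial_speed(doppler_hz):
    """Receiver speed along the line of sight to one satellite, from
    the residual Doppler shift left over after the satellite's own
    motion has been accounted for."""
    return C * doppler_hz / F_L1

# A residual shift of ~175 Hz corresponds to highway speed:
v = radial_speed(175.0)
print(f"{v:.1f} m/s = {v * 3.6:.0f} km/h")  # 33.3 m/s = 120 km/h
```

Because the receiver can measure frequency to a small fraction of a hertz, this gives speed far more precisely than differencing metre-level position fixes.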
