
Precision vs. Accuracy: Which Is Better?

Devin Gates · Published in Intuition · 7 min read · Jul 29, 2022


You think about “accuracy” all the time in your everyday life. Whether you’re making an accurate observation or getting frustrated with how out of sync your watch and computer clock get, it’s all around us. But what about “precision”? Aren’t they just synonyms? Well…that’s precisely what we’ll discuss here…

Image by PIRO from Pixabay

As a brief review, synonyms are different words that share the same, or closely related, meaning. An example could be “cold” and “frigid”. While frigid may imply a different intensity than cold, both are typically used to convey the same underlying idea: “it’s at an uncomfortably low temperature outside”.

Synonyms differ from homonyms, which are words spelled exactly the same but with completely different meanings. For example, “duck” and “duck” are homonyms. You can go outside to feed a duck, or you can duck under a tree to avoid the sun. Spelled exactly the same, but they convey wildly different meanings.

Okay, this isn’t going to be an English lesson, I promise.

The point is, “accuracy” and “precision” are not synonyms. They are not antonyms either. While they are used interchangeably to convey a similar meaning, they actually mean two completely different things in regard to measurement!

For example, something can be accurate without being precise. At the same time, something can be really precise, while being completely inaccurate. And sometimes, something can be neither accurate nor precise!

Confused yet? Let’s take a deeper look-see!

Bullseye!

The easiest way to conceptualize the difference between accuracy and precision is by using the analogy of throwing darts at a dart board. It’s easy to visualize, and you’ll learn something pretty interesting!

Let’s pretend it’s the end of the game, and the last thing you need to hit is the bullseye. That is the only point on the board toward which you’re aiming.

The accuracy of your throw is going to depend on how close your dart lands to the center of the dart board. If you were to hit it right on the nose, you’d have 100% accuracy.

If you were to hit the number 12 near the outside of the board, your accuracy would be abysmal.

Image by Nicky ❤️🌿🐞🌿❤️ from Pixabay

But, now let’s say you actually have three shots to get the bullseye. If all three shots were to hit the bullseye, you’d again have 100% accuracy. But, if one of the shots were to miss, your accuracy would decrease in proportion to how far that dart landed from the bullseye.

Or, if only one of your three shots hit the bullseye, your accuracy would still not be 100%. In fact, since you took three shots to achieve that result, the best accuracy rating you could claim in that situation is about 33%.
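
To make that concrete, here’s a minimal Python sketch of one way you could score a turn. The scheme (100% at the bullseye, falling off linearly to 0% at the edge of the board) and names like turn_accuracy are my own invention for illustration, not an official darts metric:

```python
# A toy accuracy score for dart throws: 100% at the bullseye,
# falling linearly to 0% at the edge of the board.
# (This scoring scheme is an illustration, not an official metric.)

BOARD_RADIUS_CM = 22.5  # roughly the radius of a standard dartboard

def throw_accuracy(distance_cm: float) -> float:
    """Accuracy of a single throw, as a percentage."""
    miss = min(distance_cm, BOARD_RADIUS_CM) / BOARD_RADIUS_CM
    return 100.0 * (1.0 - miss)

def turn_accuracy(distances_cm: list[float]) -> float:
    """Average accuracy over all throws in a turn."""
    return sum(throw_accuracy(d) for d in distances_cm) / len(distances_cm)

# One bullseye plus two throws at the very edge: about 33% for the turn.
print(turn_accuracy([0.0, 22.5, 22.5]))  # -> 33.33...
```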

Another way to think of it is, accuracy is the measure of how close you are to some target. Your target is the bullseye, and if you hit it directly, then that’s an accurate shot.

Or, in more scientific terms, accuracy is how close the result of a measurement is to a specific standard, or target. For example, let’s say you want to use a scale to weigh a metal block, just to see how accurate the scale is.

You’re using a block that was calibrated, so you know it weighs exactly 1 kg.

That calibrated block is your “standard” (the formal definition of “standard” is a bit different, but that’s good enough for now) against which you’re going to compare your measurement.

If the scale displays 0.9 kg, then you know your scale is slightly inaccurate. The same can be said if the scale were to display something even slightly above 1 kg. We know this because the difference between the measured weight and our known standard is not zero.

How far the scale’s reading is from 1 kg is the measure of its accuracy, or in this case, its inaccuracy.
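
As a minimal sketch of that idea in Python (the function name and the example readings are mine, purely for illustration), the error of a reading is just its signed difference from the 1 kg standard:

```python
# Accuracy as closeness to a known standard: the error of a reading
# is its signed difference from the calibrated 1 kg block.
STANDARD_KG = 1.0  # the calibrated reference block

def measurement_error(measured_kg: float) -> float:
    """Signed difference between a scale reading and the standard."""
    return measured_kg - STANDARD_KG

print(f"{measurement_error(0.9):+.2f} kg")   # -0.10 kg: reads low
print(f"{measurement_error(1.05):+.2f} kg")  # +0.05 kg: reads slightly high
```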

So, since our scale is off by one tenth of a kg, does that mean it can’t be precise either?

When All Else Fails, Strive for Precision

Before deciding if our scale is precise or not, let’s take a quick look back at the dart board analogy from earlier.

This time, you’re not going to be taking single shots. Let’s say that on each of your turns, you’ll throw three darts. But, just like last time, you still want to hit that sweet, sweet bullseye.

So, what exactly is “precision”? Is it better than accuracy? Worse?

It’s neither better nor worse than accuracy, though which one you’d rather have can depend on the situation. You can actually use precision to compensate for inaccurate measurements, but that’s getting a little ahead of ourselves.

The point is, one isn’t better or worse than the other, they are two sides of the same coin in a lot of ways.

Let’s say you get really lucky on your first turn and you hit the bullseye with all three darts. Well, that turn was 100% precise and 100% accurate.

As luck would have it, your next turn doesn’t go so well and all three darts actually hit the number 12, completely missing the bullseye.

Image by Miranda Bleijenberg from Pixabay

Believe it or not, you shot those darts with 100% precision! But your accuracy? Not even close!

How come the shot was completely precise but also completely inaccurate at the same time?

Precision can be expressed as the repeatability of the shots taken. You took three shots, and although you completely missed your target, they all landed very close to one another. That means you were able to hit the number 12 with the first dart and repeat the same result on the following shots.

Or, another way to say it is that while accuracy is how close a measurement is to the standard, precision is how close repeated measurements are to one another.
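
In statistics, that “closeness of repeated measurements to one another” is usually summarized as a spread, such as the standard deviation. Here’s a quick Python sketch using made-up dart distances:

```python
import statistics

# Precision as repeatability: the spread (standard deviation) of
# repeated attempts, regardless of how close they land to the target.
# Distances (in cm) of each dart from the bullseye; values are made up.
darts_on_the_12 = [15.0, 15.2, 14.9]  # far off target, tightly grouped
darts_scattered = [2.0, 9.0, 16.0]    # all over the board

for throws in (darts_on_the_12, darts_scattered):
    print(f"mean miss = {statistics.mean(throws):.1f} cm, "
          f"spread = {statistics.stdev(throws):.1f} cm")
# The first turn is precise (tiny spread) but inaccurate (big mean miss);
# the second turn is neither precise nor accurate.
```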

Image Source: http://kaffee.50webs.com/Science/labs/Lab-Precision.vs.Accuracy.html

The image above illustrates this point extremely well. Using the definitions for accuracy and precision we just went over, you can now visually see the difference between the two!

But wait a second… what about the scale analogy? How does precision work in that scenario?

Well, as I said before, precision is a property of repeated measurements; you can’t judge it from a single reading.

Going back to the scale and the 1 kg block, we know from our previous attempts that the scale isn’t totally accurate. But, just for the sake of fun, let’s say the scale is now way more inaccurate!

Photo by mali maeder: https://www.pexels.com/photo/yellow-analog-meter-50634/

When you put the block on the scale, it now measures 0.6 kg. That’s 0.3 kg less than the scale read before!

But, let’s try it again just in case something went wrong with our measurement. You put the block on the scale again, after zeroing it, and it reads 0.6 kg again.

Just to be absolutely certain, let’s take a third measurement. If our scale reads the same 0.6 kg result again, then we know we have a precise measurement. It’s not exactly accurate, but it’s precise because each inaccurate measurement we made was off by the same amount.

On the flip side, if you were to get three different measurements after each attempt, then you can confidently say that your scale is neither accurate nor precise.

Remember when I said that precision can actually help us compensate for inaccurate measurements? Well, hopefully you can now see why that is the case, and why one isn’t better or worse than the other.

Since our scale is very precise, we can use its inaccurate readings to calculate what the measurement should actually be. We know the scale always reads 40% low (it shows 0.6 kg for a true 1 kg), so we can use that information to make accurate measurements in the future.
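
Here’s a minimal sketch of that compensation in Python. It assumes the scale’s error is proportional (it always shows the same fraction of the true weight) rather than a fixed offset; that assumption, and all the names, are mine for illustration:

```python
# Compensating for a precise-but-inaccurate scale: because every
# reading is low by the same fraction, one calibration factor
# corrects all future readings.
STANDARD_KG = 1.0        # the calibrated block's true weight
SCALE_READING_KG = 0.6   # what our scale shows for that block

# The scale shows 60% of the true weight, so scale readings up by 1/0.6.
calibration_factor = STANDARD_KG / SCALE_READING_KG  # about 1.667

def corrected_weight(raw_reading_kg: float) -> float:
    """Convert a raw reading into a calibrated estimate of the true weight."""
    return raw_reading_kg * calibration_factor

print(f"{corrected_weight(0.6):.2f} kg")  # 1.00 kg: the block's true weight
print(f"{corrected_weight(1.2):.2f} kg")  # 2.00 kg: some heavier object
```

This only works because the scale is precise: a consistent, systematic error can be measured once and corrected everywhere.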

However, that isn’t what actually happens in the real world. It’s not like folks go around using inaccurate tools just because they can compensate with their superior precision. No, we use that information to help us calibrate our measurement device to the known standard!

But, I’m sure there are some cases in which compensation was the only option. I’d imagine NASA has had to do something similar at one time or another.

So… Precision or Accuracy?

Well, ideally you want and need BOTH to make reliable measurements. As you now know, the two words mean different things, but they’re complementary: each describes the quality of a measurement in its own way.

Image by mohamed Hassan from Pixabay

As you can imagine, one of the worst types of measurement to have is one with high accuracy and low precision. While you may get the correct result some of the time, the device isn’t dependable, which largely negates the merits of its accuracy. Sorry, Weather Channel!

If there is no solid reliability in an instrument’s ability to measure, then for most purposes its results are practically useless.

Or, in the case of darts, if you aren’t a reliable shooter who can hit your target, you’re less likely to win any games or be asked to join the dart team.

Are “dart teams” even a thing?!

***

What do YOU think?

Had you ever wondered whether accuracy was “better” than precision? Do you find measurement as cool as I do? Let me know what you think in the comments!


Devin Gates · Intuition

Writer | Learner | Musician | Sales Professional — Come read some of my wild theories, thoughts, and explanations of the Universe!