We explain what precision is, why it matters in measurement, and how it differs from accuracy, with examples.
What is precision?
In general, when we talk about the precision of something or someone, we are referring to its ability to hit the target, that is, to obtain the expected results or results very close to them. Although in everyday speech it can be a synonym of accuracy, it is not advisable to confuse the two terms.
The word precision comes from the Latin praecisionis, derived from the verb praecidere, which can be translated as "to cut well", "to cut at both ends" or "to separate completely by cutting away what is left over". This verb was composed of the elements prae- ("before" or "in advance") and caedere ("to cut" or sometimes "to kill").
Originally the word was used to refer to that which had been cut or severed from the body (eunuchs, for example, were called praecisus, "cut off"); its current meaning comes from its figurative application in rhetoric, that is, in relation to oratory.
There, praecisus referred to what was "well cut", that is, well delimited, well focused, and which therefore sticks to the subject at hand in the best way: that which is pertinent, which keeps to what is necessary.
Thus, today we speak of precision as the ability to hit the target, or land near it, across different attempts. For example, a darts player has three chances to throw at the bullseye; once he has done so, he can judge how close to the center his throws landed and know how precise he was.
This kind of value can be of the utmost importance in scientific disciplines, engineering, and statistics.
See also: Scientific thinking
Precision in measuring instruments
Measuring instruments are the tools and devices that allow us to express a given magnitude in numerical values. These measurements can be more or less precise, that is, they carry a certain margin of error attributable to contextual and unpredictable factors. Thus, a set of measurements can vary from one another, even though it is the same magnitude being measured.
Let's imagine, as an example, that we take our body temperature with a thermometer, and that we do it several times to be sure there is no involuntary error. If we notice that the measurements all cluster around the same value, we will know that it is a precise thermometer, that is, that it records its values quite consistently.

In other words, an instrument that tends to measure the same way every time is precise. If, on the other hand, the temperature varies wildly between one measurement and the next, we must conclude that the thermometer has lost its precision, since some measurements will be close to the real value and others enormously far from it, with no way of knowing which is which.
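The thermometer scenario can be sketched numerically: the spread of repeated readings (their standard deviation) is a simple stand-in for precision. The readings below are illustrative numbers, not real data.

```python
import statistics

# Hypothetical repeated readings of a body temperature near 37.0 °C
# (illustrative numbers, not real data).
readings = [36.9, 37.1, 37.0, 36.8, 37.2]

# Precision: how tightly the readings cluster together.
# A smaller spread means a more precise thermometer.
spread = statistics.stdev(readings)

print(f"mean reading: {statistics.mean(readings):.2f} °C")
print(f"spread (std dev): {spread:.2f} °C")
```

Here the readings stay within ±0.2 °C of their mean, so the spread is small; a thermometer whose readings jumped by whole degrees would show a much larger spread and would be judged imprecise.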
Examples of precision
As an example, we can visualize some cases in which precision is a determining factor:
- Every batter in a professional baseball league has a batting average that summarizes his performance at the plate. This average is a numerical measure of his precision as a hitter, that is, of how many hits he gets out of all his at-bats in a game.
- A soldier training for war fires 100 rounds from his rifle at a target. He then goes and counts the impacts on the dummy, and can form an estimated idea of his precision, that is, of how many shots hit the target or came close to doing so, and how many missed.
- During a medieval siege, catapult operators try to hurl stones against the enemy walls. But the catapult is poorly calibrated, and each rock they launch follows a different trajectory: some hit the walls, others the nearby river, others the battlefield, where they crush allied troops. Logically, it is a very imprecise catapult, since its shots do not tend to land where it has been aimed.
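The first two examples above boil down to the same arithmetic: hits divided by attempts. A minimal sketch, with the figures (30 hits, 87 rounds on target) chosen purely for illustration:

```python
def hit_rate(hits: int, attempts: int) -> float:
    """Fraction of attempts that hit the target."""
    return hits / attempts

# A batter with 30 hits in 100 at-bats has a .300 average.
batting_average = hit_rate(30, 100)

# A soldier who lands 87 of his 100 rounds on the dummy.
soldier_rate = hit_rate(87, 100)

print(f"batting average: {batting_average:.3f}")
print(f"soldier hit rate: {soldier_rate:.2f}")
```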
Precision and accuracy
In science, engineering and statistics, it is important to distinguish the notion of precision from that of accuracy, even though in everyday speech they are often used as synonyms. This difference is particularly important when understanding or interpreting the results obtained during a measurement, and rests on the following:
- Precision, as we have seen, is determined by the ability of an instrument or measurement technique to record similar values over a number of successive measurements, given that these can vary from one another within a margin of error. The closer the measurements are to one another, the greater the precision of the device.
- Accuracy, instead, has to do with the proximity of the measurements to the expected or real value, that is, with how close a measurement comes to reality. The closer the measurements are to the expected or actual value, the more accurate the instrument.
This difference is easy to understand with an example: suppose a golfer tries to make a hole-in-one to break a record. Even if he is a good golfer, many variables influence his shots: the wind, the humidity, the condition of the golf ball, or the force he puts into the swing; so he will have to try many times until he finally achieves it.
If we judge how close the balls have landed to the hole, we obtain a measure of his accuracy, since the reference value is the hole itself. If, on the other hand, we look at how tightly his shots cluster together across the total number of attempts, we obtain a measure of his precision, that is, of the margin of error his shots carry in general.