The precision of a measuring tool tells you how many digits of a measurement you can accurately determine with it. The precision of any instrument is one tenth of its smallest calibration. So if you take a measurement to the correct precision of the instrument, you will be sure of every digit except the final one, which you estimate by eye.

For example, a meter stick calibrated to millimeters has a precision of 0.0001 meters, or 0.1 millimeters. A measurement of 67.3 mm would have the correct precision. A measurement of 67 mm would not be precise enough: it tells the person reading your measurement that you had to guess the 7, when in fact you were sure of it. On the other hand, a measurement of 67.35 mm would be too precise: it tells the person reading it that you were sure of the 3, when you were not.
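To make the rule concrete, here is a minimal Python sketch. The function names `correct_precision` and `has_correct_precision` are invented for this example; the idea is simply that the precision is one tenth of the smallest calibration, and a written measurement should be stated to exactly that place.

```python
def correct_precision(smallest_calibration: float) -> float:
    """Precision of an instrument: one tenth of its smallest calibration.
    Whatever unit the calibration is given in, the result is in that unit."""
    return smallest_calibration / 10


def has_correct_precision(measurement_text: str, smallest_calibration: float) -> bool:
    """Check that a written measurement is stated to exactly the instrument's
    precision, i.e. its last digit sits in the estimated place."""
    precision = correct_precision(smallest_calibration)
    # Count the decimal places actually written down.
    _, _, fraction = measurement_text.partition(".")
    stated_step = 10 ** -len(fraction)
    return abs(stated_step - precision) < 1e-12


# Meter stick calibrated to millimeters (smallest calibration = 1 mm):
print(correct_precision(1.0))               # 0.1 (mm)
print(has_correct_precision("67.3", 1.0))   # True  - correct precision
print(has_correct_precision("67", 1.0))     # False - not precise enough
print(has_correct_precision("67.35", 1.0))  # False - too precise
```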

If you took the same meter stick and painted it black so you couldn't read any of the centimeter or millimeter markings, it would still be of use as a measuring tool. But its precision would only be 0.1 meters, so a measurement of 0.364 meters made with that meter stick would be ridiculous: the last two digits would be pure guesswork.
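Reusing the same hypothetical helpers, the blacked-out meter stick has a smallest calibration of a full meter, so the precision drops to 0.1 m and only one estimated digit after the decimal point is justified:

```python
# Blacked-out meter stick: smallest calibration = 1 m, so precision = 0.1 m.
print(correct_precision(1.0))               # 0.1 (m)
print(has_correct_precision("0.4", 1.0))    # True  - one estimated digit
print(has_correct_precision("0.364", 1.0))  # False - far too precise
```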

Therefore, a measurement is considered precise if it includes the correct number of digits for the tool used to make it.

See also Significant Digits.