In keeping with my recent columns summarizing the calibration of various gages, I offer this one as a catchall for the many precision hand tools used for measurements.
I use the term ‘hand tools’ to categorize the popular instruments in common use, all of which are typically handheld. Micrometers, universal calipers, etc. are probably the most used of this type. Their readings may be displayed in analog or digital form, and their level of precision can be quite high due to the use of electronics in their design. In some cases, makers have gone off the deep end with resolutions that suggest great precision but that the instrument is mechanically incapable of delivering. Despite this, they offer a lot of precision for the money.
Calibration of these instruments can be simple, or complicated by those making a career out of the process. On the simple path, you only need to determine how precisely the device does what you want it to do. On the more complicated route, you start checking specific elements of the device against the standard they were made to rather than doing an overall performance or functional calibration. I recommend a functional calibration because it is simple to do and answers the question most people have, which is: How good is this thing? If you don’t like the answer, then you can start checking various elements to determine why the performance isn’t what you expect.
The process of a functional calibration is simple: You measure some masters covering the range of the device and compare your readings to the calibrated values of your masters. Any differences are due to inaccuracies in the instrument, the person using it, the environment in which the calibration was done, or a combination of all three.
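As a rough sketch of that comparison, the following shows the arithmetic involved; the master values, readings, and acceptance limit are all hypothetical placeholders, not values from any real report:

```python
# Functional calibration sketch: compare instrument readings against the
# calibrated values of the masters. All values here are hypothetical.

# Calibrated master values from the gage block report, in inches.
masters = [0.100000, 0.500005, 1.000010]

# Readings taken from the instrument at each master.
readings = [0.10005, 0.50010, 1.00004]

# Acceptance limit based on product tolerance (hypothetical), in inches.
acceptance = 0.0002

# Each error lumps together the instrument, the operator, and the environment.
errors = [round(r - m, 6) for m, r in zip(masters, readings)]

for master, error in zip(masters, errors):
    status = "OK" if abs(error) <= acceptance else "OUT OF TOLERANCE"
    print(f"master {master:.6f} in: error {error:+.6f} in -> {status}")
```

Note that the sketch cannot tell you which of the three error sources dominates; that is exactly why the column recommends mounting the instrument and controlling the environment before blaming the device itself.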
The Masters
Gage blocks are the most used master for this type of calibration and, it goes without saying but I will say it anyway, they must have a current calibration report showing their actual size. If you have a calibration report on the blocks, it doesn’t matter what grade they are because you will be using their calibrated values. On the other hand, if you are using blocks to a specific grade, you can skip the number crunching and fold the whole tolerance for that grade into your uncertainty budget.
For example, if you are going to use calibrated values, you only have to put the uncertainty from their calibration report, say five millionths of an inch, into your uncertainty budget. If you are going by the gage block grade and it’s plus/minus ten millionths of an inch, that’s the value required for your uncertainty budget.
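To see how that choice ripples through a budget, here is a toy sketch. It assumes a simple root-sum-square combination, which is a common way of combining budget components, and the two ‘other’ components (repeatability, temperature) are hypothetical placeholders:

```python
import math

# Toy uncertainty budget sketch. All values are in millionths of an inch
# and only the gage-block contribution changes between the two approaches.
block_report_uncertainty = 5   # using the blocks' calibrated values
block_grade_tolerance = 10     # using the grade's +/- tolerance instead
other_components = [8, 4]      # e.g. repeatability, temperature (hypothetical)

def rss(components):
    """Root-sum-square combination of uncertainty components."""
    return math.sqrt(sum(c ** 2 for c in components))

with_report = rss([block_report_uncertainty] + other_components)
with_grade = rss([block_grade_tolerance] + other_components)
print(f"budget using report values:   {with_report:.1f} millionths")
print(f"budget using grade tolerance: {with_grade:.1f} millionths")
```

The point of the sketch is simply that the cheaper shortcut of using the grade tolerance inflates the budget, which is the trade-off the column describes.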
Where micrometers and calipers are concerned, there are a number of standard master designs created for this purpose, but make sure that is their purpose. Too often people are using what are really zero setting masters instead of calibration masters. Those designed for calibration have intermediate values, while the setting masters have even sizes such as 1, 2, 3” etc. or 25, 50, 75 mm etc.
The Process
This type of calibration is comparative in nature – you’re comparing the readings from the instrument being calibrated against those from the calibration report on the master(s). Care must be taken to ensure you can duplicate the calibration. Where micrometers or calipers are involved, I recommend mounting the instrument in a stand or vise so manipulative variations and heat transfer don’t cause you problems. Yes, you can hold either in one hand and do the job, but the values you obtain won’t be as precise due to heat transfer and manipulative errors or bias.
Limited Calibration
Long range instruments such as calipers cover standardized ranges, part of which you may never use, or which calibration reveals to be less precise than the portion of their capacity you do use. You can calibrate the part you will use, indicate this on your documents, and label the instrument accordingly. Ideally, you should keep the un-calibrated portion from being used, either by mechanical means or by a prominently placed label on the instrument that prevents or limits its use.
Since you are calibrating used instruments, your acceptance levels should be based on how ‘good’ the instrument has to be with respect to your product tolerances.
Hill Cox is president of Frank Cox Metrology Ltd. (Brampton, Ontario, Canada).