In recent columns I’ve commented on information requests that accompany calibration orders. Some of these are common and effective, but some are not. Occasionally they arise because they appear in one standard or another, but they are misrepresented in the process. In some cases, the standard they come from relates to in-house measurement systems rather than calibration work done by outside parties.
One of my readers had such a dilemma on his hands and asked my advice on dealing with it. He calibrates gages and instruments for a number of groups within his organization and had to ensure his calibration uncertainty did not exceed a specified ratio of the tolerance for the item being calibrated. This would seem a straightforward matter to figure out, but on reflection it raised the following question: if the tolerance is plus/minus 1, is the ‘tolerance’ 1 or 2? After much qualifying, I told him that since the requirement offered no definition, I would take all the uncertainty allowance I could get and consider 2 as the tolerance.
I also cautioned him that he could run into trouble trying to follow this type of ratio rule when calibrating thread gages, because the published tolerances for many of them are roughly equal to the typical calibration uncertainty. His ratio would end up around 1:1, causing all sorts of mayhem.
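To see how much the missing definition matters, here is a minimal sketch in Python. Every tolerance and uncertainty in it is a hypothetical figure chosen only to illustrate the two readings, not a value from any standard.

```python
# Minimal sketch of the 'is +/-1 a tolerance of 1 or 2?' question.
# All numbers below are hypothetical, for illustration only.

def tolerance_ratio(tol_plus_minus, uncertainty, full_band=True):
    """Ratio of 'tolerance' to calibration uncertainty.

    full_band=True treats a +/-1 tolerance as 2 (the whole band);
    full_band=False treats it as 1 (one side only).
    """
    tolerance = 2 * tol_plus_minus if full_band else tol_plus_minus
    return tolerance / uncertainty

# A +/-0.0001" tolerance against a 0.00005" calibration uncertainty:
print(tolerance_ratio(0.0001, 0.00005, full_band=True))   # 4.0 -> passes a 4:1 rule
print(tolerance_ratio(0.0001, 0.00005, full_band=False))  # 2.0 -> fails the same rule

# Thread gage case: published tolerance roughly equals typical
# uncertainty, so the ratio hovers near 1:1 however you read it.
print(tolerance_ratio(0.0002, 0.0002, full_band=False))   # 1.0
```

The same numbers pass a 4:1 rule under the generous full-band reading and fail it under the one-sided reading, which is exactly why the lack of a definition causes grief.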
A similar ratio often requested of us requires that we ‘certify’ that our ‘masters’ are within ten percent of the tolerance on the item being calibrated. The problem with this request is that the term ‘master’ is not defined. It could mean our primary standards or masters, or any master used directly in the calibration of the item, and there can be a significant difference in the ratio between the two. In the end, I’m not sure what level of assurance is obtained from this information whichever way it is meant. In effect, it duplicates part of what measurement uncertainty includes, but nowhere near as reliably, as the following comments show.
For example, let’s say the ‘master’ involved is within twenty micro-inches and therefore meets the requirement. But that is only the starting point. The problems arise with everything that comes after it: the precision of the equipment the master is used with, the thread measuring wires if any, the skill level of the technician, and variations in the environment. In many gage calibrations, the combination of these factors can add up to 50% or more of the tolerance. Whatever assurance the ‘master’ rule was meant to provide is negated by the overall process.
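A rough uncertainty budget shows why. The sketch below is hypothetical from top to bottom: every figure in it is an assumption picked for illustration, and the root-sum-square combination is ordinary measurement uncertainty practice, not anything the ten-percent rule asks for.

```python
import math

# Hypothetical uncertainty budget, in micro-inches, for calibrating a
# gage with a 'master' that is within 20 uin, as in the example above.
# Every figure is an assumption chosen for illustration only.
budget = {
    "master":          20,  # the master itself: meets a 10% rule on its own
    "equipment":       60,  # precision of the comparator the master is used with
    "measuring_wires": 40,  # thread measuring wires, if used
    "technician":      40,  # operator influence
    "environment":     50,  # temperature and other variations
}

tolerance = 200  # uin, a hypothetical tolerance on the item calibrated

# Root-sum-square combination of the contributors
combined = math.sqrt(sum(u ** 2 for u in budget.values()))

print(f"master alone:         {budget['master'] / tolerance:.0%}")  # 10%
print(f"combined uncertainty: {combined:.0f} uin "
      f"({combined / tolerance:.0%} of tolerance)")                 # 98 uin (49%)
```

The master meets the ten-percent requirement, yet the overall process consumes nearly half the tolerance. The rule checks one contributor and says nothing about the rest.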
It’s my understanding that some of these rules originated before measurement uncertainty came into everyday use at the commercial level, which is understandable. But since they are really attempts to do what measurement uncertainty does better, they are not that important anymore.
The ratio of calibration uncertainty to product tolerances has problems of its own, as I noted earlier. On the surface it would seem a reasonable way to evaluate the process, but two parts of it reduce its usefulness. The first relates to the actual tolerance of the item being calibrated. As I’ve written many times before, too many people plug new-product tolerances into the ratio: the tolerances that exist for the maker of the gage or masters being calibrated. From the user’s point of view, acceptance tolerances should be based on the product tolerance the gage is being used to verify. The gage maker may have produced a gage that sits on the bottom limit but is within his tolerance; if his tolerances are then used as re-calibration acceptance limits, the gage could be rejected after very little use, as the sketch below illustrates.
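Here is that scenario in a short Python sketch. All of the numbers are assumptions for illustration: a maker’s tolerance of plus/minus 50 millionths, a user product tolerance of .001”, and a user-based acceptance limit taken as ten percent of that product tolerance.

```python
# Hypothetical numbers showing why a maker's new-gage tolerances make
# poor re-calibration acceptance limits from the user's point of view.

PRODUCT_TOL = 0.0010              # the user's product tolerance being verified
MAKER_TOL   = 0.00005             # the gage maker's tolerance (+/-), assumed
USER_LIMIT  = PRODUCT_TOL * 0.10  # a user-based acceptance limit, assumed

new_gage   = -0.000045            # made near the bottom limit, in spec when new
after_wear = new_gage - 0.000020  # slight wear after very little use

for label, deviation in (("new ", new_gage), ("worn", after_wear)):
    passes_maker = abs(deviation) <= MAKER_TOL
    passes_user  = abs(deviation) <= USER_LIMIT
    print(f"{label}: {deviation:+.6f} in  "
          f"maker limits: {'pass' if passes_maker else 'REJECT'}  "
          f"user-based limits: {'pass' if passes_user else 'REJECT'}")
```

The worn gage fails the maker’s limits almost immediately, yet against the product tolerance it actually has to guard, it is still perfectly serviceable.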
A 10:1 ratio is frequently used to establish re-calibration acceptance limits. Simply put, this means a product tolerance of .001” would require a gage with a .0001” acceptance limit. This works well until you start considering tighter tolerances. That shortcoming was discovered a while back, and the rule now allows a 4:1 ratio in some cases.
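The arithmetic below shows where the rule runs out of road. The ‘achievable’ floor of 20 millionths is an assumption of mine, standing in for whatever current gage making and calibration can realistically hold.

```python
# Hypothetical illustration of 10:1 and 4:1 re-calibration acceptance
# limits. Values in inches; the 'achievable' floor is an assumed figure
# standing in for practical gage-making and calibration limits.

ACHIEVABLE = 0.00002  # assumed floor: 20 millionths

def acceptance_limit(product_tolerance, ratio):
    """Acceptance limit required by a given tolerance-to-limit ratio."""
    return product_tolerance / ratio

for tol in (0.001, 0.0005, 0.0001, 0.00005):
    ten  = acceptance_limit(tol, 10)
    four = acceptance_limit(tol, 4)
    print(f"tolerance {tol:.5f} in: "
          f"10:1 -> {ten:.6f} ({'ok' if ten >= ACHIEVABLE else 'not achievable'}), "
          f"4:1 -> {four:.6f} ({'ok' if four >= ACHIEVABLE else 'not achievable'})")
```

At a .001” product tolerance the 10:1 limit is comfortably makeable; at .0001” it is already below the assumed floor, and by .00005” even the relaxed 4:1 ratio has run out of room.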
The problem with ratios is that it doesn’t take long before they demand tolerances that are not achievable with today’s technology, or that were set before manufacturing and calibration limitations became as well known as they are today. And unlike some instruments, you can’t adjust fixed limit gages to get the ratios you want.