This question would seem to be an easy one to answer, but, like too many things in life, nothing is simple anymore. That's because there are no standardized rules on which to base the decision, so the answer varies from one organization to another.
This subject was brought up by a reader recently and is one that keeps popping up. Some companies make up their own rules, but most simply apply the tolerance the gage was made to when new, which can be a waste. Companies that follow this path could receive a new gage that is on its bottom limit, meaning a few millionths of an inch of wear could render it a reject, while another gage on the top limit could take, say, .0003" of wear before it is rejected.
In a perfect company, the fixed limit gage record would show what the tolerance should be for a new gage and also indicate how much wear is acceptable before it is scrapped. This information was often put on drawings depicting the gages involved, drawings that came out of the same engineering department that designed the product they were to be used on. That connection was logical and engineering-based, as it should be. Companies that are essentially subcontractors one or more steps down the food chain may be using their own gages, or loaners from the customer, with unknown rules attached to them.
MIL Standard 120 from 1950 dealt with this situation, which tells you it's a problem that has been around for a while. The military decided, basically, that a Go thread plug gage could wear to 5% of the product tolerance below the gage maker's limit before being scrapped. For most popular threads this turns out to be an amount equal to the gage maker's tolerance. When plain plug gages were discussed, the standard simply allowed the gage to wear out of the gage maker's limit by an amount equal to the gage maker's tolerance before being scrapped.
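To put some arithmetic on that, here is a minimal sketch of the MIL-STD-120-style rule as described above. It assumes inch units, takes "the gage maker's limit" to mean the low limit of the new Go member's tolerance, and every number in the example is invented for illustration rather than taken from the standard.

```python
# A minimal sketch of the wear-limit rule described above, assuming inch
# units. "go_low_limit" is assumed to be the gage maker's low (minimum)
# limit for the new Go member; the wear limit sits 5% of the product
# tolerance below it.

def thread_go_wear_limit(go_low_limit, product_tolerance):
    """Size at which a worn Go thread plug gage would be scrapped."""
    return go_low_limit - 0.05 * product_tolerance

# Hypothetical example: a Go member with a low limit of 0.46460" on a
# product with 0.0060" of pitch diameter tolerance.
limit = thread_go_wear_limit(go_low_limit=0.46460, product_tolerance=0.0060)
print(f"Scrap the gage when it wears below {limit:.5f} in.")
```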
This method of dealing with gages will work well for most applications. One time it could get messy is when the product being checked is at or near its bottom limit for size, so as long as you shoot for mid-limit size it won't be a problem.
The subject of measurement uncertainty was not taken into account when this document was produced, but it is a reality recognized today that has to be considered. For example, if calibration indicates a gage is on its bottom limit, the uncertainty attached to the calibrated values means the gage could be just inside or just outside the gage maker's tolerance.
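To illustrate the point, here is a small sketch, not taken from the column or any standard, of how an expanded uncertainty can leave the call indeterminate when the calibrated size sits close to the wear limit. All of the numbers are made up.

```python
# A sketch of how measurement uncertainty blurs a pass/fail call at the
# wear (scrap) limit. Values are illustrative only.

def classify(measured, wear_limit, expanded_uncertainty):
    """Classify a calibrated Go gage size against its wear limit."""
    if measured - expanded_uncertainty >= wear_limit:
        return "in tolerance"    # even the worst case is above the limit
    if measured + expanded_uncertainty < wear_limit:
        return "scrap"           # even the best case is below the limit
    return "indeterminate"       # the uncertainty straddles the limit

print(classify(measured=0.46432, wear_limit=0.46430,
               expanded_uncertainty=0.00005))   # -> indeterminate
```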
If you're starting to suspect I'm giving a lot of reasons why there are no hard and fast rules for making this type of call, you are right. You could take the military approach that if the gage is within, say, half the gage maker's tolerance of its size limit, it's close enough for artillery, but it may not be close enough for other purposes.
You might consider allowing a gage to be undersize from the low limit by, say, .0001" and increasing that allowance with gage size.
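One simple way to implement that kind of size-dependent allowance is a small lookup table. The diameter breaks and allowance values below are invented for illustration and are not taken from any standard or from the column.

```python
# A sketch of an undersize allowance that steps up with gage size.
# (max gage diameter in inches, allowed undersize below the low limit)
WEAR_ALLOWANCE_BY_SIZE = [
    (0.500, 0.00010),
    (1.500, 0.00015),
    (4.000, 0.00020),
]

def undersize_allowance(gage_diameter):
    for max_dia, allowance in WEAR_ALLOWANCE_BY_SIZE:
        if gage_diameter <= max_dia:
            return allowance
    raise ValueError("gage diameter outside the table")

print(undersize_allowance(0.750))   # -> 0.00015
```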
When you use your own gages to check work for a number of different customers, you could come unstuck if their position on when a gage should be replaced is different from yours. This is something you should clarify with your customers to avoid problems.
Fixed limit gages wear mostly at the front end of the Go member, and while such a gage could be undersize by any standard, it could still be suitable for checking work with short through holes. You might consider a rule to keep this from going off the rails, such as allowing the front 25% of the gage length to wear to this condition before scrapping the gage. If blind holes are involved, little in the way of a tapered wear condition would be allowable.
If you are specifying special new gages, you could have the Go gage sizes arranged to provide a wear allowance, which most gage makers will do for no extra charge. On standard thread gages you could do the same thing, but that would turn a standard gage into a special one at higher cost.
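As a rough sketch of that idea, assuming the wear allowance is simply added above the product's low limit when the new Go size is specified, with illustrative numbers only:

```python
# A sketch of building a wear allowance into a special Go gage: the new
# member is specified above the product's low limit so it can wear back
# down to that limit before it must be scrapped. Numbers are hypothetical.

def special_go_size(product_low_limit, wear_allowance):
    """Size to specify for a new Go plug gage with built-in wear allowance."""
    return product_low_limit + wear_allowance

new_size = special_go_size(product_low_limit=0.7500, wear_allowance=0.0002)
print(f"Specify the Go member at {new_size:.4f} in.; scrap at 0.7500 in.")
```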
I know, I’ve left measurement uncertainty hanging out there but I’ll beg off for now since I’m running out of space. Besides, I’ve left you with enough variables already.