Automation relies on precise and accurate data at each step to function correctly. As manufacturing becomes more automated, tracking measurement uncertainty becomes more important.
This monitoring requires careful attention and ongoing effort on the part of metrologists, especially in longer processes with less human involvement. If uncertainties are not properly managed, small errors can accumulate and lead to significant issues in a manufacturer’s final product.
“With longer automated processes that require less human intervention, careful consideration must be taken to maintain traceability and appropriately propagate uncertainty through the manufacturing process,” said Chris Gordon, senior application scientist, Optronic Laboratories.
Complexities of the trade
While ISO’s Guide to the Expression of Uncertainty in Measurement and other texts offer detailed guidance on the topic, a full uncertainty analysis can be time consuming. Keeping thorough records through every step of a complex manufacturing process also requires consistent work and attention.
For example, baseline levels of consistency must be maintained, said Matt Noonan, quality manager, Pratt and Whitney Measurement Systems. Finding reference artifacts (standard tools with a precisely known measurement) with low enough uncertainty is already difficult. Because reference artifacts are the benchmarks for evaluating the performance of measurement instruments, imprecise ones can set off a domino effect of measurement uncertainty.
“If we check an instrument by comparing its measurement of an artifact to the certified size of that artifact, 3 micro-inches might be a significant deviation, considering the capabilities of the machine — but the uncertainty of the reference artifact’s certified value might be as great as 3 micro-inches,” Noonan explained. “There are certainly ways to make finer measurements, but we generally aim to test our instruments with measurements akin to those that the instruments will be used for, such as contact measurements of steel artifacts like gage blocks, pins, and ring gages.”
Fluctuating environmental conditions, such as temperature changes, also complicate matters, Noonan added — as do variations in user understanding and changes in suppliers’ reported uncertainties.
When a supplier's reported uncertainty for a reference artifact changes, it affects the estimated uncertainty of all measurements relying on that artifact. If the reported uncertainty increases, it might actually be beneficial because it provides a more accurate picture of the measurement process, Noonan explained. Previously, unexplained variations might now be correctly attributed to the reference artifact's uncertainty.
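A minimal sketch of how that propagation works in practice, assuming the instrument's repeatability and the artifact's certified uncertainty are independent standard uncertainties that combine in quadrature, as the GUM describes. All numbers here are invented for illustration.

```python
import math

def combined_standard_uncertainty(*components):
    """Root-sum-of-squares combination of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical values, in micro-inches, chosen only to illustrate the idea.
instrument_repeatability = 1.0   # standard uncertainty of the instrument itself
artifact_certificate_old = 1.5   # supplier's originally reported uncertainty
artifact_certificate_new = 3.0   # revised, larger uncertainty on the certificate

u_old = combined_standard_uncertainty(instrument_repeatability, artifact_certificate_old)
u_new = combined_standard_uncertainty(instrument_repeatability, artifact_certificate_new)

print(f"Combined uncertainty with the old certificate: {u_old:.2f} micro-inches")
print(f"Combined uncertainty with the new certificate: {u_new:.2f} micro-inches")
# The larger reported artifact uncertainty raises the combined estimate, but it can
# also explain variation that was previously blamed on the instrument.
```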
Other key contributors include the power supplies used to drive calibration standards, the transfer uncertainties of the calibration standard values, and the reproducibility of the measurement system itself, Gordon said.
Measurement uncertainty is inherent in any production process, explained Terry Stransky, senior geologist, Terracon. “There will always be allowable variation, because you cannot accurately produce the same thing over and over and over again, with a piece of machinery,” he said.
But as part of identifying and evaluating the uncertainty contributors of a particular measurement, a metrologist should also be able to recognize opportunities to reduce that uncertainty.
Once a metrologist pinpoints these areas, “the metrologist should be able to interpret the final estimate, and understand what conclusions can be drawn from the outcome,” Noonan said.
Precision vs. accuracy
Understanding and optimizing measurement uncertainty naturally leads to consistent product quality, experts say.
“In general, achieving lower uncertainty of measurement translates directly to improved quality and a higher rate of compliance,” Noonan said. “The lower the measurement uncertainty, the more reliably one can discern acceptable from unacceptable, and the more confident one can be in the measurement output.”
Furthermore, the more that quality teams understand measurement uncertainty, the more realistic their expectations of outcomes will be, he said. This helps people managing those processes to identify the best ways to improve.
Uncertainty is always present in measurements, Stransky said, and thus, metrologists must incorporate safety factors into their calculations.
Precision and accuracy are different, Stransky pointed out. Accuracy is closeness to the true value, while precision is consistency among repeated measurements. One can be precise but inaccurate.
“You can be very precise, meaning you can repeat a measurement over and over and over again, and the results that you get are very close to each other. But you can be entirely inaccurate,” he said. “For example, if you’re measuring the distance between two points, and the actual distance is 12 inches, and you keep measuring 11 inches over and over again, you are very precise, but you’re entirely inaccurate in terms of coming to the correct answer, and it’s just reinforcing the inaccuracy.”
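Stransky's distinction can be made concrete in a few lines of Python: precision shows up as a small spread across repeated readings, accuracy as a small offset from the true value. The readings below are invented to mirror his 12-inch example.

```python
import statistics

true_distance = 12.0  # inches: the actual distance between the two points
readings = [11.02, 10.98, 11.01, 10.99, 11.00]  # invented repeat measurements

spread = statistics.stdev(readings)                # precision: how tightly readings cluster
bias = statistics.mean(readings) - true_distance   # accuracy: offset from the true value

print(f"Spread (precision): {spread:.3f} in")   # small spread -> very precise
print(f"Bias (accuracy):   {bias:+.3f} in")     # about -1 in -> badly inaccurate
```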
If a particular method of measurement was used inappropriately, it could lead a metrologist to the wrong conclusions, Stransky explained. This is where regression analysis comes in — “where you take data and fit it to a curve, and then extrapolate conclusions from that curve,” he said.
Regression analysis is easy to misuse, “or to ignore certain features that don’t particularly sit on that curve,” resulting in incorrect answers, Stransky warned. This is a natural result of bias, which we all have, he said.
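A rough sketch of the hazard Stransky describes, using NumPy and made-up data: a straight-line fit can look convincing over the range of the data, yet extrapolating far beyond that range simply assumes the trend continues, which the data cannot confirm.

```python
import numpy as np

# Invented, roughly linear calibration-style data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit a first-order polynomial (straight line) to the data.
slope, intercept = np.polyfit(x, y, 1)

# Interpolation inside the data range is usually reasonable...
print("Predicted at x = 3.5:", slope * 3.5 + intercept)

# ...but extrapolating far outside it is where a fitted curve is easiest to misuse.
print("Extrapolated at x = 50:", slope * 50 + intercept)
```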
“Even with all the technology, sometimes you will interpret data the way you want to interpret it, because it supports what you’re trying to say, even though in fact it doesn’t. And unfortunately, sometimes it’s done intentionally, where people ignore the data that doesn’t fit their preconceived ideas,” he said.
Recent innovations
Recent innovations have reduced measurement uncertainty across various fields, experts say.
Predictive models and algorithms help minimize uncertainty in automated processes, and new instruments, accessories, and software improve measurement accuracy and repeatability. Positioning products aid in locating measurement artifacts precisely, reducing the uncertainty that comes from variations in contact position.
For example, positioning accessories, such as multi-point positioners and templates, allow users to quickly and consistently position different types of blocks, such as square, rectangular, and cylindrical blocks, at reference points that are typically measured and reported on gage block certificates. By using these positioning aids, operators can reduce the uncertainty caused by variations in contact position and gage block alignment. Additionally, these templates minimize the time needed to make measurements, which in turn reduces variation and uncertainty due to reference point drift.
Sensitive instruments like digital adjustable force systems and live anvil accessories further reduce uncertainty.
Environmental impact on instruments is minimized using tuned mass dampers, laser beam shrouds, and exhaust systems, experts say. Laser measurement, radar, and radio frequency techniques improve surveying accuracy and safety, while radio telescopes and magnetic techniques advance astronomical measurements.
“One of the greatest advances has been laser measurement when measuring the distance between two points,” Stransky said. “It’s much more accurate, for example, than the traditional engineering surveying that had been done in the past.”
Advanced ways of managing uncertainty
While automation and robotics have improved the speed and quantity of measurements, they have also introduced new challenges in terms of measurement uncertainty. In large, complex robotic processes, there are many potential sources of uncertainty, such as the accuracy of the robot’s movements, the calibration of the sensors, and the variability of the environment.
By the same token, automation has also necessitated the development of advanced methods for propagating and managing uncertainty in these complex systems. These include the following (minimal Python sketches of several of the methods appear after the list):
- Monte Carlo simulation: This technique involves running multiple simulations of the measurement process with randomly generated input parameters based on their probability distributions. By analyzing the distribution of the output results, operators can estimate the overall uncertainty of the measurement.
- Sensitivity analysis: This method involves evaluating how sensitive the measurement result is to changes in each input parameter. By pinpointing the most influential parameters, operators can focus on reducing the uncertainty associated with those specific factors.
- Bayesian inference: This approach combines prior knowledge about the measurement system with observed data to update the probability distributions of the input parameters. It allows for a more accurate estimation of the measurement uncertainty.
- Uncertainty budgeting: This is a systematic method for identifying, quantifying, and combining the individual sources of uncertainty in a measurement process. By creating an uncertainty budget, engineers can prioritize efforts to reduce the most significant contributors to the overall uncertainty.
- Machine learning techniques: Researchers are exploring the use of machine learning algorithms, such as neural networks, to model and predict the behavior of complex automated measurement systems. These models can help pinpoint sources of uncertainty and optimize the measurement process for better accuracy and reliability.
- Virtual metrology: This technique involves using sensor data and mathematical models to predict the outcome of a measurement process without physically performing the measurement. By comparing the predicted results with actual measurements, the uncertainty of the virtual metrology system can be assessed and improved over time.
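For readers who want to see the mechanics, a minimal Monte Carlo sketch of the first bullet above: input quantities are drawn from assumed distributions, a simple measurement model is evaluated many times, and the spread of the outputs estimates the combined uncertainty. The model here, a length corrected for thermal expansion, and every number in it are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000  # number of simulated measurements

# Assumed input distributions (values are illustrative only).
raw_length = rng.normal(100.000, 0.002, n)   # indicated length, mm
temp_dev   = rng.normal(0.0, 0.5, n)         # deviation from the 20 degC reference, degC
alpha      = 11.5e-6                         # thermal expansion coefficient, 1/degC

# Measurement model: correct the indicated length back to the reference temperature.
corrected = raw_length * (1 - alpha * temp_dev)

print(f"Mean result: {corrected.mean():.4f} mm")
print(f"Standard uncertainty (Monte Carlo): {corrected.std(ddof=1):.4f} mm")
```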
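A corresponding sensitivity-analysis sketch, using the same invented model: perturb each input by its standard uncertainty, one at a time, and see how much the result moves. The input that moves the result the most is the one worth attacking first.

```python
# Nominal inputs and their standard uncertainties (illustrative values only).
nominal = {"raw_length": 100.000, "temp_dev": 0.0}
u = {"raw_length": 0.002, "temp_dev": 0.5}
alpha = 11.5e-6  # thermal expansion coefficient, 1/degC

def model(raw_length, temp_dev):
    """Length corrected back to the 20 degC reference temperature."""
    return raw_length * (1 - alpha * temp_dev)

baseline = model(**nominal)
for name, uncertainty in u.items():
    perturbed = dict(nominal)
    perturbed[name] += uncertainty
    effect = model(**perturbed) - baseline
    print(f"{name}: a one-standard-uncertainty shift changes the result by {effect:+.6f} mm")
```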
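A small Bayesian-inference sketch, assuming a normal prior on the measurand and normally distributed observations of known standard deviation, so the conjugate update has a closed form. All values are hypothetical.

```python
import statistics

# Prior belief about the measurand (e.g. from a previous calibration); illustrative values, mm.
prior_mean, prior_sd = 100.000, 0.010
meas_sd = 0.004                              # known standard deviation of a single observation, mm
observations = [100.006, 100.004, 100.007]   # invented new readings, mm

# Conjugate normal-normal update with known observation variance.
n = len(observations)
obs_mean = statistics.mean(observations)
post_var = 1.0 / (1.0 / prior_sd**2 + n / meas_sd**2)
post_mean = post_var * (prior_mean / prior_sd**2 + n * obs_mean / meas_sd**2)

print(f"Posterior mean: {post_mean:.4f} mm")
print(f"Posterior standard uncertainty: {post_var**0.5:.4f} mm")
```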
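A toy uncertainty budget along the lines of the fourth bullet: each hypothetical contributor is listed with its standard uncertainty, the contributors are combined in quadrature, and they are ranked by their share of the combined variance so the largest target for improvement is obvious.

```python
import math

# Hypothetical contributors and their standard uncertainties (micro-inches).
budget = {
    "reference artifact certificate": 3.0,
    "instrument repeatability": 1.2,
    "thermal effects": 0.8,
    "operator / fixturing": 0.5,
}

combined = math.sqrt(sum(u**2 for u in budget.values()))
print(f"Combined standard uncertainty: {combined:.2f} micro-inches")
print(f"Expanded uncertainty (k=2):    {2 * combined:.2f} micro-inches")

# Rank contributors by their share of the combined variance.
for name, u in sorted(budget.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {100 * u**2 / combined**2:.1f}% of variance")
```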
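Finally, a bare-bones virtual-metrology sketch: a simple linear model is fitted to invented pairs of sensor readings and actual measurements, and the residuals between predicted and measured values characterize how far the virtual measurement can be trusted.

```python
import statistics

# Invented history of (sensor reading, actual measured value) pairs, arbitrary units.
history = [(0.50, 10.1), (0.62, 10.6), (0.71, 11.0), (0.80, 11.4), (0.93, 12.0)]
xs = [s for s, _ in history]
ys = [m for _, m in history]

# Fit a simple linear "virtual metrology" model: predicted value = a * sensor + b.
mean_x, mean_y = statistics.mean(xs), statistics.mean(ys)
a = sum((x - mean_x) * (y - mean_y) for x, y in history) / sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x

# Residuals between predicted and actual measurements characterize the model's uncertainty.
residuals = [y - (a * x + b) for x, y in history]
print(f"Model: value ~= {a:.3f} * sensor + {b:.3f}")
print(f"Residual standard deviation (virtual-metrology uncertainty): {statistics.stdev(residuals):.4f}")
```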
“These days, automation is taking most of the measurements, then it’s up to the human being to determine whether the results they’re getting actually make sense,” Stransky said.