Today there are over 3,500 different grades of steel, making it one of the world’s most innovative and essential materials for manufacturing, building and construction.
The COVID-19 pandemic has disrupted our daily lives on an unprecedented global scale. The need to alter our way of life to try to mitigate and contain the virus has made us press pause on everything we take for granted, from visiting family and friends to travelling to work and business continuity.
All objects—from toothbrushes to umbrellas to the components of a space shuttle—experience forces throughout their lifecycles. In performing everyday actions like tying a shoelace or ripping open a package, we all exert forces without even realizing it.
If you take time to understand these definitions, standards and testing methods, you’ll be able to determine the accuracy of CT in your specific application.
I often hear, “How accurately can this be measured using CT?” For CT, accuracy and precision should be considered together. To see the difference between accuracy and precision, picture a target.
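The target analogy maps directly onto repeated measurements: how far the group’s center sits from the bullseye is accuracy, and how tightly the shots cluster is precision. A minimal sketch of the two quantities, using hypothetical repeat measurements of a feature with a known calibrated value:

```python
import statistics

# Hypothetical repeated CT measurements (mm) of a feature whose
# calibrated reference value is 10.000 mm.
reference = 10.000
measurements = [10.012, 10.009, 10.011, 10.010, 10.013]

# Accuracy: how close the average measurement lies to the reference
# (the systematic offset, or bias).
bias = statistics.mean(measurements) - reference

# Precision: how tightly repeated measurements cluster around their
# own mean (the random scatter, here the sample standard deviation).
spread = statistics.stdev(measurements)

print(f"bias (accuracy error): {bias:+.3f} mm")   # +0.011 mm
print(f"spread (precision):    {spread:.3f} mm")  #  0.002 mm
```

A measurement process can be precise but inaccurate (tight cluster, wrong place) or accurate but imprecise (scattered shots centered on the bullseye), which is why the two must be assessed together.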
Conversations around quality in metal additive manufacturing often focus on the flashy application of high-frequency, in situ, real-time monitoring systems and the neural networks or machine learning required for map-reduction of the mountains of data generated. There is, however, an often-overlooked aspect of consistently making high-quality parts: calibration.
Almost every industry has seen explosive growth in additive manufacturing (AM, or 3D printing) of metal components, whether for prototyping or for low- to medium-volume manufacture of often high-value, safety-critical parts.
Helium is in short supply and its cost is rising. Global sources may even run dry by the end of the century. And yet, it remains the dominant choice for trace-gas-based leak-testing on the production line. How can you make the most of this increasingly precious commodity for your critical quality assurance needs?