This article is intended as a guideline for various machine vision implementations, but its main focus is applications relating to random surface defect detection, or Automated Visual Inspection (AVI). The goal of AVI is to identify every visually detectable anomaly that could have a functional or aesthetic impact on the integrity of a part:
- Process anomalies: grinding marks, machining steps, coating defects, etc.
- Poor handling: dents, nicks, scratches, chipped coating, etc.
Products typically subjected to surface quality inspection include aircraft components, automotive parts and other manufactured products with functional or aesthetic value. For parts with complex shapes, visual inspection is generally performed by human inspectors.
For end users interested in adding automated inspection technology to their process, we have identified three critical steps to follow. The objective is to provide a framework that reduces the time needed to implement AVI in a manufacturing environment and maximizes the chances of a successful and valuable implementation.
First – Clearly identify the gain
The first gain that comes up in automating human visual inspection is avoiding well-known human flaws. One example is the lack of consistency when sustained attention is needed, leading to poor detection repeatability and reproducibility. Another is subjectivity in decision making. Human inspectors fall into a pattern and become used to a “regular” ratio of anomalies in a specific manufacturing process. They tend to adjust their behavior when this ratio rises above or falls below a certain threshold, becoming more or less severe in their interpretation of theoretical quality specifications¹.
To determine precisely what is to be automated, you must act as a detective and uncover undocumented tasks performed by human inspectors. Official worksheets describe two major activities: identifying anomalies and quantifying their severity. Dig a bit deeper and you will find interesting information about tasks actually performed but not officially documented. You might find that some of these tasks are impossible to automate within an inspection system (e.g., removal of coating overspray with emery paper).
Document these tasks, and once you have a complete list of the actions carried out by inspectors, evaluate the gain of adding automation. Calculate the payback based not only on salary but also on the gain in inspection process stability. A successful approach accepts that an automated visual inspection system should not be expected to fully replace human inspection. It is a great tool to improve quality, limit escapes, add objectivity and provide new feedback on production. It will, however, inevitably generate a higher reject rate, for three key reasons:
The first factor stems from the probabilistic nature of human inspection: setting a machine vision system's performance against multiple inspectors will increase the reject rate even if the machine is “as good” as a human¹,². As an example, suppose two inspectors working in parallel each accept 80% of the inspected parts, but when one rejects a part, the other rejects it only 60% of the time. The two reject sets then overlap by 0.6 × 20% = 12% of parts, so an automated system that must catch everything either inspector would reject will reject 20% + 20% − 12% = 28% of the same parts, up from 20%.
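This arithmetic can be checked with a short sketch. The figures below are the illustrative values from the example above, not measured data:

```python
# Sketch of the reject-rate arithmetic above; figures are the article's
# illustrative example, not measured data.

reject_a = 0.20   # inspector A rejects 20% of parts
reject_b = 0.20   # inspector B rejects 20% of parts
overlap = 0.60    # when A rejects a part, B also rejects it 60% of the time

both_reject = reject_a * overlap                    # 12% of parts
either_rejects = reject_a + reject_b - both_reject  # union of the reject sets

print(f"Machine matching both inspectors rejects {either_rejects:.0%}")
# -> 28%, versus 20% for either inspector alone
```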
The second factor is that humans use contextual information about the production to make a decision. For example, a defect made by a specific grinder will be differentiated from an identical defect caused by handling and may be managed differently in the decision process by inspectors. A machine will not make this distinction.
The third factor relates to limitations of the automated process that depend on the technology implemented and can lead to a higher reject rate: algorithm performance, resolution, cleanliness of the parts and production variation.
Knowing this, you will need to manage the increased reject rate. Change the process? Change the visual acceptance specifications? Leave some specific tasks to human inspection?
Second – Be involved
For a successful implementation, you need to designate a champion who will fully understand the technology and be responsible for helping your facility absorb it. As the client, you need to be part of the adventure and manage this technology leap in-house.
Here are a few important aspects to look for in your team selection:
Competent. Select someone with a strong technical background and an interest in new technologies as project leader. Team this person up with an experienced project manager.
Motivated. It is essential not to give up before the implementation is complete. A good way to ensure this is to allocate time to your project leader and provide enough support that this person feels the project is important to the company.
Understand the end user. Throughout the project, the team must have a clear view of how the system will be used by operators. This will avoid a clash with the manufacturing team when the system is ready to be transferred to production.
Third – Compare performance to human
Once the AVI system is developed and configured, it's time to evaluate its performance. A good way to do this is to perform a probability of detection³ (PoD) study. This technique allows proper evaluation of the system's sensitivity, which is then compared with your golden reference: the human inspector. The same PoD study needs to be performed with your inspectors to determine the performance of the automation versus your current inspection process.
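As a hedged illustration only (not the mh1823 methodology cited below, which also provides confidence bounds), a hit/miss PoD curve is commonly fitted with logistic regression against log defect size. The trial data, variable names and the 90% target here are hypothetical:

```python
# Sketch: fit a hit/miss PoD curve and estimate a90, the defect size
# detected with 90% probability. Data is hypothetical, for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical trials: defect size in mm, 1 = detected, 0 = missed
size = np.array([0.1, 0.15, 0.2, 0.25, 0.3, 0.35, 0.4, 0.5, 0.6, 0.8])
hit  = np.array([0,   0,    0,   1,    0,   1,    1,   1,   1,   1  ])

# PoD is commonly modelled against log size
X = np.log(size).reshape(-1, 1)
model = LogisticRegression(C=1e6, max_iter=1000).fit(X, hit)  # large C ~ plain MLE

# Invert the logistic link to find the size where PoD = 90%
b0 = model.intercept_[0]
b1 = model.coef_[0][0]
a90 = np.exp((np.log(0.90 / 0.10) - b0) / b1)
print(f"a90 = {a90:.2f} mm")
```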
In addition, an important characteristic is generally present in visual inspection: the type of anomaly (classification). By default, a PoD analysis does not consider this classification. Strategies should be put in place to measure classification capability, since many false rejects may be concentrated in specific defect types when comparing with human inspection, and it is valuable to distinguish this.
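One possible strategy, sketched below with hypothetical labels and records, is to cross-tabulate the AVI decision against the human decision per defect type, so that false rejects can be broken down by type:

```python
# Sketch: find where the AVI system over-rejects relative to human
# inspectors, broken down by defect type. Records are hypothetical.
from collections import Counter

# (defect type, human decision, AVI decision) per inspected anomaly
records = [
    ("scratch", "accept", "reject"),
    ("scratch", "reject", "reject"),
    ("dent",    "accept", "accept"),
    ("dent",    "accept", "reject"),
    ("coating", "reject", "reject"),
    ("coating", "accept", "reject"),
]

false_rejects = Counter(d for d, human, avi in records
                        if human == "accept" and avi == "reject")
totals = Counter(d for d, _, _ in records)

for defect in sorted(totals):
    print(f"{defect:8s} false rejects: {false_rejects[defect]}/{totals[defect]}")
```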
References:
1. Barber, Thomas A., 1999, Control of Particulate Matter Contamination in Healthcare Manufacturing, CRC Press, p. 245.
2. Knapp, J.Z. and Abramson, L.R., 1996, "Evaluation and Validation of Nondestructive Particle Inspection Methods and Systems," in Liquid and Surface-Borne Particle Measurement Handbook, New York: Marcel Dekker, pp. 295-450.
3. A good reference for PoD, by Charles Annis, can be found at www.statisticalengineering.com/mh1823/index.html