Machine vision systems are used throughout industry for critical inspection and measurement, frequently as part of a quality control approach. It is essential that the measurements made are both accurate and repeatable, because any variable, either internal or external, that affects the performance of the imaging system could in turn affect the resulting measurements. This is just as important when inspecting low-cost, high-volume components as it is for high-cost, low-volume items.
Choosing the best lighting configuration
Machine vision systems evaluate the image of the object, not the object itself, so the first stage in the process is to get the correct lighting arrangement for the application, to ensure that the camera can capture an image containing all of the details needed to make the measurements. This is essential: if any information is missing as a result of incorrect illumination, subsequent measurements become much more difficult or impossible, and information that is completely missing from the image can never be recovered by analysis algorithms. The way the light strikes the target, the wavelength of the light and the nature of the sample surface all influence the image produced at the camera. The direction of light allows different features to be highlighted, while the use of light of a particular wavelength can have a major influence on the contrast in the image. In addition, the surface of the object affects the image, with shiny surfaces in particular giving rise to multiple reflections.
Understanding and solving these problems has led to a myriad of lighting configurations such as back lighting, bright field or dark field illumination and the use of diffuse and dome illumination, to name just a few. Selecting the correct lighting technique goes a long way toward addressing the special issues of a particular application. LED illumination has become the universal standard for machine vision. LEDs have excellent reliability and offer almost maintenance-free operation. They can be assembled in almost any shape and are available in a wide choice of colors (wavelengths), including red, white, blue, green, IR and UV, and they also offer excellent value for the money. However, getting the correct type of illumination is only the first part of the solution. Ensuring that the light is consistent for every single measurement is critical in ensuring repeatable measurements.
Factors affecting consistency of illumination
Figure 1 shows how lighting is the foundation of image creation: variations in lighting result in variations in the resulting image, and in many applications such variations are simply not acceptable. So what can affect LED light intensity? A surprisingly large number of factors can influence it:
Age of the light. Although LEDs are remarkably reliable, the light output does eventually deteriorate over time.
Temperature of the light. We have characterized a number of machine vision lights and found that as LEDs heat up from 25 °C to 90 °C, the brightness of the LED drops by around 40%. This is a significant variation that may not be seen during commissioning but can cause variability during normal running.
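The temperature effect described above can be modeled as a simple derating curve. The sketch below assumes a roughly linear drop of 40% between 25 °C and 90 °C, matching the figures quoted in the text; the function name and the linearity assumption are illustrative, and a real light would follow its own measured curve.

```python
def led_relative_brightness(temp_c, ref_temp_c=25.0,
                            derate_per_c=0.40 / (90.0 - 25.0)):
    """Relative LED output vs. junction temperature, assuming the
    roughly linear ~40% drop between 25 °C and 90 °C quoted above.
    Returns 1.0 at the reference temperature."""
    drop = max(0.0, temp_c - ref_temp_c) * derate_per_c
    return max(0.0, 1.0 - drop)
```

For example, a light that has warmed to 90 °C delivers only about 60% of its cold brightness, which is exactly the kind of drift that looks fine at commissioning but shifts measurements in normal running.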
Variations in drive to the light. This refers to the stability of the power supply. LED output is proportional to the current through the device, not the voltage, so LED device manufacturers specify that the devices should be driven with controlled current rather than controlled voltage.
Ambient light. This can cause problems because it may change when we don't expect it. Most machine vision systems try to exclude ambient light from the camera view to remove it as a variable, but in some cases this is simply not possible. Reducing the exposure time lessens the contribution of ambient light to the image.
Variations in lighting and camera exposure. In many applications, it is necessary to pulse the lighting. For example, simply running an LED continuously may not generate enough light intensity to produce sufficient detail in the image. In these cases image quality can usually be enhanced by overdriving the LED to increase the light output intensity. This can only be done in short pulses to prevent damage to the LED, so it requires precise control of pulse frequency, duration and amount of overdriving. Another reason for pulsing, or strobing, the light source is to freeze the image of moving objects, particularly at high speeds, where overdriving may also be essential to deliver sufficient light intensity for the short camera exposure times. This is perfect for automating vision inspection on busy production lines.
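The trade-off between overdrive level and pulse duration can be sketched as a duty-cycle budget: the harder the LED is driven, the shorter the pulses must be to keep the average current safe. The 25% duty figure below is a hypothetical placeholder, not a value from the article; real limits come from the LED datasheet and the lighting controller's protection logic.

```python
def max_overdrive_pulse_us(overdrive_factor, period_us,
                           max_duty_at_rated=0.25):
    """Upper bound on pulse width (µs) when overdriving an LED, assuming
    the average current must stay within a fixed duty budget at rated
    current (hypothetical 25% here; consult the LED datasheet).
    Average current scales with overdrive_factor * duty, so the allowed
    duty shrinks in proportion to the overdrive factor."""
    duty = max_duty_at_rated / overdrive_factor
    return duty * period_us
```

For instance, at 4x overdrive with a 10 ms (10,000 µs) strobe period, this budget allows pulses of at most 625 µs.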
When using pulsed lighting it is essential to get the timing of the lighting pulse and the camera exposure perfectly aligned in order to optimize image brightness. If they are not aligned, then the image will appear dark or in the worst case, no image will be seen at all! This is illustrated in Figure 2, which shows a situation where the camera exposure is starting before the light pulse, so the image is quite dark. By adjusting the lighting pulse delay, the two can be easily brought into alignment.
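The alignment problem in Figure 2 reduces to making the light pulse fall inside the camera's exposure window. The sketch below computes the overlap between the two windows and a pulse delay that centers the pulse in the exposure; all timing values and function names are illustrative, not taken from any particular controller.

```python
def exposure_light_overlap_us(exp_start, exp_len, pulse_delay, pulse_len):
    """Overlap (µs) between the camera exposure window and the light
    pulse, both measured from a common trigger. Zero overlap means a
    dark frame, the worst case described in the text."""
    start = max(exp_start, pulse_delay)
    end = min(exp_start + exp_len, pulse_delay + pulse_len)
    return max(0.0, end - start)

def align_pulse_delay(exp_start, exp_len, pulse_len):
    """Choose a pulse delay that centres the light pulse inside the
    exposure window, so the full pulse contributes to the image."""
    return exp_start + (exp_len - pulse_len) / 2.0
```

With an exposure starting 100 µs after the trigger and lasting 500 µs, a 100 µs pulse delayed by 300 µs sits centered in the window and overlaps it completely; a pulse fired before the exposure opens contributes nothing.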
Having discussed possible causes of light variations we now look at three options to compensate for these within applications.
Controlling illumination intensity
The first option is based on temperature feedback. One system is an intelligent lighting platform that allows fixed and variable data about the light to be stored and then read by the lighting controller and, if required, passed via GigE Vision to the wider machine vision system. A chip inside the light reports its real-time temperature to the lighting controller and also stores a temperature compensation profile. The output current is then automatically increased as the lighting temperature rises, according to this stored profile, with a limit applied to prevent thermal runaway.
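A stored compensation profile of this kind can be pictured as a lookup table of temperature/current-multiplier pairs that the controller interpolates, with a hard cap standing in for the thermal-runaway limit. The profile values and the 1.5x cap below are invented for illustration; a real profile is specific to the light and held in its on-board chip.

```python
# Hypothetical compensation profile: (temperature °C, current multiplier)
# pairs, as a controller might store. Values are illustrative only.
PROFILE = [(25.0, 1.00), (50.0, 1.15), (70.0, 1.30), (90.0, 1.55)]
CURRENT_LIMIT_MULT = 1.5  # hard cap to prevent thermal runaway

def compensated_current(base_ma, temp_c, profile=PROFILE):
    """Drive current (mA) after temperature compensation: linearly
    interpolate the stored profile, then clamp to the safety limit."""
    if temp_c <= profile[0][0]:
        mult = profile[0][1]
    elif temp_c >= profile[-1][0]:
        mult = profile[-1][1]
    else:
        for (t0, m0), (t1, m1) in zip(profile, profile[1:]):
            if t0 <= temp_c <= t1:
                mult = m0 + (m1 - m0) * (temp_c - t0) / (t1 - t0)
                break
    return base_ma * min(mult, CURRENT_LIMIT_MULT)
```

Note how the clamp takes over at the top of the range: the profile asks for 1.55x at 90 °C, but the output is held at the 1.5x limit.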
The second option uses an optical sensor feedback loop. The optical sensor is arranged to measure the brightness of the light and provide this measurement to the lighting controller. The lighting controller has a target brightness and adjusts the current to the light to achieve this target. Although this feedback loop accounts for all the variables in lighting intensity, it makes its adjustments based on the light falling on the optical sensor; depending on where the sensor is located, this could differ slightly from the light falling on the camera itself.
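A minimal sketch of such a loop is a proportional controller: on each cycle the controller compares the sensor reading against the brightness target and nudges the drive current accordingly, within current limits. The gain and limits are arbitrary assumptions here; a real controller's loop dynamics are tuned to the light and sensor.

```python
def sensor_feedback_step(current_ma, measured, target, gain=0.5,
                         min_ma=0.0, max_ma=1000.0):
    """One iteration of a simple proportional feedback loop: move the
    drive current toward the brightness target reported by the optical
    sensor, clamped to safe current limits. Gain and limits are
    illustrative, not from any specific controller."""
    error = target - measured
    new_current = current_ma + gain * error
    return min(max_ma, max(min_ma, new_current))
```

If the light's output happened to be, say, 0.1 brightness units per mA, repeatedly applying this step drives the current to the value that meets the target, regardless of why the output had drifted.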
The third and most comprehensive way of controlling image brightness is using a closed loop approach based on image illumination levels detected by the camera itself (Figure 5). This is a highly effective solution since it takes into account any changes in the illumination conditions whether as a result of the LED or any external factors. The closed loop system is structured as follows: first the camera determines optimum illumination levels for the application. In addition to imaging the target object, the camera also measures a defined area of the scene for illumination intensity—effectively a “test card” area of interest. The image processing software monitors acceptable operating illumination bandwidths and then the lighting controller automatically adjusts lighting intensity if necessary.
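The closed loop described above can be sketched in two pieces: measuring the mean intensity of the "test card" region of interest in each frame, and adjusting the lighting only when that measurement leaves the acceptable band. Frames are treated as plain 2-D lists of pixel intensities, and the band limits and adjustment step are illustrative assumptions, not values from any particular image processing package.

```python
def roi_mean(frame, top, left, height, width):
    """Mean pixel intensity of a rectangular region of interest in a
    frame given as a 2-D list of intensities (the 'test card' area)."""
    vals = [v for row in frame[top:top + height]
            for v in row[left:left + width]]
    return sum(vals) / len(vals)

def adjust_if_outside_band(intensity_pct, measured, low, high, step=2.0):
    """Raise or lower the lighting intensity setting (%) only when the
    ROI measurement drifts outside the acceptable operating band;
    otherwise leave the light alone."""
    if measured < low:
        return min(100.0, intensity_pct + step)
    if measured > high:
        return max(0.0, intensity_pct - step)
    return intensity_pct
```

Because the measurement comes from the camera's own image, this loop reacts to any disturbance, whether LED ageing, temperature drift or a change in ambient light.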
The effect is illustrated in Figure 6, where an object is illuminated by two LEDs. One of the lights is used to simulate ambient light changes, while the other illuminates the object for inspection. This control application is built using a major image processing software package linked to the LEDs via a lighting controller. The trace plots the target illumination (blue line), the measured illumination (red line) and the output of the object light (purple line). For the first half of the trace, active control is switched off, and it can be seen that target and actual illumination are not in alignment because the ambient light was low. When the system is made active, the output of the object illuminator rises to compensate, bringing target and actual illumination into alignment.
Summary
This seamless link between imaging software, cameras, lights and lighting controllers means that not only is the current illumination level within the system known at any time, but proactive steps can be taken to accurately control these lighting levels. This is just the latest step in further enhancing the powerful capabilities offered by machine vision as a quality inspection tool.