Published on Jan 08, 2016
Machine vision refers to applications in which a computer automatically makes a decision based on visual input from a camera. The term is typically used in industrial manufacturing, where applications range from culling blemished oranges from a conveyor belt to saving lives by inspecting that the correct drug capsule has been placed in the package before the product is shipped to the pharmacy.
Three dimensional vision based measurement systems have made their way into production metrology applications, notably in the electronics field. However, in the more traditional durable goods fields now dominated by hard gauges and CMMs, 3D optical systems have been hindered by both perceptions and real limitations. This paper reviews where 3D vision is today, and what advances have been made to enable more quantitative, shop floor metrology applications. The field of 3D machine vision is less established, but one that is actively growing today.
Three dimensional vision based measurements have come a long way in the past few years, moving from purely visualization tools that generate attractive color pictures to serious measurement tools. These 3D systems include laser scanning, structured light, stereo viewing, and laser radar, to name a few.
As has already been stated, the key operational parameters needed for production machine vision include speed, resolution, and robustness, especially to changing part surface conditions. Many systems that provide the best resolution are not the fastest, so a tradeoff must be made. Just as with touch probes, there are certain types of features or surfaces that optical 3D methods can be expected to work well on, and others where there may be problems. It has been pointed out that shiny, but not specular, surfaces have offered one of the biggest challenges.
In like manner, when a surface changes from a shiny area to a dull one, many sensors may generate a bias error. In the simple case of triangulation, the measurement is based upon finding the centroid of a light spot of some finite size. If half that spot is on an area that reflects back to the sensor well, and the other half is not, the center of brightness of the spot will not be the geometric center, but rather weighted toward the brighter region. Testing the sensor on edge and surface transition features is a valuable first test to consider.
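This centroid bias can be illustrated with a minimal one-dimensional sketch (an assumed toy model, not taken from the paper): when half of a sampled spot profile falls on a bright region and half on a dull one, the intensity-weighted centroid shifts away from the geometric center.

```python
def weighted_centroid(positions, intensities):
    """Intensity-weighted centroid of a sampled spot profile."""
    total = sum(intensities)
    return sum(p * i for p, i in zip(positions, intensities)) / total

# A 1-D spot sampled at 11 pixel positions; geometric center is 5.0.
positions = list(range(11))

# Uniform reflectivity: centroid equals the geometric center.
uniform = [1.0] * 11
# Left half bright (reflectance 1.0), right half dull (reflectance 0.2),
# with a transitional pixel in the middle (illustrative numbers).
split = [1.0] * 5 + [0.6] + [0.2] * 5

print(weighted_centroid(positions, uniform))  # 5.0 (no bias)
print(weighted_centroid(positions, split))    # pulled toward the bright side
```

The second result lands well below the geometric center of 5.0, which is exactly the bias a triangulation sensor would report at a shiny-to-dull transition.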
The next area of concern is the surface texture itself. A surface with machining marks has a texture that may scatter light into long lines, which can confuse the sensor. Similarly, if the surface is translucent, the spot of light may spread out in an unpredictable manner, again offsetting the spot center, and hence the measurement made. Therefore, testing the sensor on a surface texture that matches the one to be measured is important. A final important feature consideration for many optical gages is the slope of the surface.
A standard way to test such effects is to scan a diffuse sphere and fit the data to the known diameter (see Figure 7). At some angle around the sphere, the surface can be expected to lift off the true surface, before the signal is lost entirely. As with any gage, comparison of the optical gage against other measurement tools provides valuable information regarding confidence of the measurements. This is not always an easy comparison, as the repeatability of many traditional gages may not be as good as the optical gage.
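The sphere test above can be sketched in a few lines. This is a generic least-squares sphere fit (an assumed implementation, not the paper's software): the algebraic form x² + y² + z² = 2ax + 2by + 2cz + d is linear in the unknowns (a, b, c, d), and the radius follows as r = sqrt(d + a² + b² + c²). Slope-dependent bias then shows up as fit residuals that grow with surface angle.

```python
import numpy as np

def fit_sphere(pts):
    """Linear least-squares sphere fit; returns (center, radius)."""
    pts = np.asarray(pts, dtype=float)
    A = np.column_stack([2 * pts, np.ones(len(pts))])  # [2x, 2y, 2z, 1]
    b = (pts ** 2).sum(axis=1)                         # x^2 + y^2 + z^2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    radius = np.sqrt(d + center @ center)
    return center, radius

# Synthetic scan: 500 points on a sphere of known diameter 25.4 mm
# (illustrative values; a real test would use a calibrated artifact).
rng = np.random.default_rng(0)
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.array([10.0, -5.0, 3.0]) + 12.7 * dirs

center, radius = fit_sphere(pts)
print(2 * radius)  # fitted diameter, ~25.4 on this ideal data

# Residuals vs. the fitted sphere reveal angle-dependent lift-off:
residuals = np.linalg.norm(pts - center, axis=1) - radius
```

On real scan data, plotting these residuals against the surface normal angle exposes the point at which the measured surface "lifts off" the true sphere.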
Anytime one is trying to improve a process, one encounters the challenge of demonstrating a capability against a tool with less capability. The very high speeds and data densities available from optical gages offer some significant advances. Yet trying to compare those advantages against a gage which stakes everything on a half dozen points is a difficult road. However, the comparisons and qualification of the new process must be done. Comparisons against inadequate measurement tools can prove very frustrating, so at least one independent means of comparison is desirable. The use of reference artifacts that can be verified by independent means, preferably against a primary standard reference (e.g., the laser distance interferometer, the international standard of length measurement), is a valuable aid in these comparisons.
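A hypothetical sketch of such a qualification against a certified artifact (names and numbers are illustrative, not from the paper): from repeated readings, estimate each gage's bias (mean error versus the certified value) and repeatability (standard deviation), which makes the "better gage judged by a worse one" problem concrete.

```python
from statistics import mean, stdev

def qualify(readings, certified_value):
    """Return (bias, repeatability) of a gage on a reference artifact."""
    return mean(readings) - certified_value, stdev(readings)

# Artifact length verified independently by interferometry (mm).
certified = 25.400

# Illustrative repeated readings from two gages on the same artifact.
optical = [25.402, 25.401, 25.403, 25.402, 25.401]  # optical gage
contact = [25.398, 25.404, 25.395, 25.406, 25.401]  # traditional gage

opt_bias, opt_rep = qualify(optical, certified)
con_bias, con_rep = qualify(contact, certified)
```

In this (contrived) example the optical gage repeats better than the traditional gage it would otherwise be compared against, which is precisely why an independently verified artifact is needed as the arbiter.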
Machine vision systems can be used in various fields for numerous purposes as stated below:
Industrial inspection (inspecting machine parts, adaptive inspection systems)
Autonomous vehicles (detecting obstructions, exploring new surroundings, AV surveillance)
Transport (traffic monitoring, aerial navigation, transport safety)
Surveillance (intruder monitoring, number plate identification, tracking people)
Remote sensing (land management, crop classification, surveying by satellite)
Medical (planning radiation therapy, chromosome analysis, merging medical images)
As with any technology of this nature, the performance changes with the component technology. The primary advance that has made machine vision systems feasible for shop floor gauging applications has been the increase in computing power, which has brought processing times down from 15 or 20 minutes on an expensive workstation to seconds on a standard PC.
The other technologies influencing performance today include lower-cost digital cameras that provide greater dynamic range and pixel resolution with lower noise, and better light sources such as higher-power laser diodes as well as higher-brightness, higher-resolution LCD projectors. The consumer market largely drives all of these technologies, and is currently a much bigger driver than any manufacturing support system. However, as system prices decrease and performance improves, there is a wide range of new markets these systems will likely address, ranging from dentistry to home 3D pictures.