E-book: Handbook of Machine and Computer Vision: The Guide for Developers and Users, 2nd Edition [Wiley Online]

Edited by Alexander Hornberg (University of Applied Sciences of Esslingen, Germany)
  • Format: 860 pages
  • Publication date: 19-Apr-2017
  • Publisher: John Wiley & Sons Inc
  • ISBN-10: 3527413405
  • ISBN-13: 9783527413409
  • Price: 290,76 €*
  • * the price guarantees access for an unlimited number of simultaneous users for an unlimited period
The second edition of this established reference work has been updated to reflect the rapid developments in the field and now covers both 2D and 3D imaging.
Written by expert practitioners from leading companies operating in machine vision, this one-stop handbook guides readers through all aspects of image acquisition and image processing, including optics, electronics, and software. The authors approach the subject in terms of industrial applications, elucidating topics such as illumination and camera calibration. The initial chapters concentrate on the latest hardware, ranging from lenses and camera systems to camera-computer interfaces, while the necessary software is treated in equal depth in later sections, which cover digital image basics as well as image analysis and image processing. The book concludes with extended coverage of industrial applications in optics and electronics, backed by case studies and design strategies for the conception of complete machine vision systems. As a result, readers are able not only to understand the latest systems but also to plan and evaluate this technology.
More than 500 images and tables illustrate the relevant principles and steps.
Preface Second Edition
Preface First Edition
List of Contributors
1 Processing of Information in the Human Visual System
Frank Schaeffel
1.1 Preface
1.2 Design and Structure of the Eye
1.3 Optical Aberrations and Consequences for Visual Performance
1.4 Chromatic Aberration
1.5 Neural Adaptation to Monochromatic Aberrations
1.6 Optimizing Retinal Processing with Limited Cell Numbers, Space, and Energy
1.7 Adaptation to Different Light Levels
1.8 Rod and Cone Responses
1.9 Spiking and Coding
1.10 Temporal and Spatial Performance
1.11 ON/OFF Structure, Division of the Whole Illuminance Amplitude
1.12 Consequences of the Rod and Cone Diversity on Retinal Wiring
1.13 Motion Sensitivity in the Retina
1.14 Visual Information Processing in Higher Centers
1.14.1 Morphology
1.14.2 Functional Aspects - Receptive Field Structures and Cortical Modules
1.15 Effects of Attention
1.16 Color Vision, Color Constancy, and Color Contrast
1.17 Depth Perception
1.18 Adaptation in the Visual System to Color, Spatial, and Temporal Contrast
1.19 Conclusions
Acknowledgements
References
2 Introduction to Building a Machine Vision Inspection
Axel Telljohann
2.1 Preface
2.2 Specifying a Machine Vision System
2.2.1 Task and Benefit
2.2.2 Parts
2.2.2.1 Different Part Types
2.2.3 Part Presentation
2.2.4 Performance Requirements
2.2.4.1 Accuracy
2.2.4.2 Time Performance
2.2.5 Information Interfaces
2.2.6 Installation Space
2.2.7 Environment
2.2.8 Checklist
2.3 Designing a Machine Vision System
2.3.1 Camera Type
2.3.2 Field of View
2.3.3 Resolution
2.3.3.1 Camera Sensor Resolution
2.3.3.2 Spatial Resolution
2.3.3.3 Measurement Accuracy
2.3.3.4 Calculation of Resolution
2.3.3.5 Resolution for a Line Scan Camera
2.3.4 Choice of Camera, Frame Grabber, and Hardware Platform
2.3.4.1 Camera Model
2.3.4.2 Frame Grabber
2.3.4.3 Pixel Rate
2.3.4.4 Hardware Platform
2.3.5 Lens Design
2.3.5.1 Focal Length
2.3.5.2 Lens Flange Focal Distance
2.3.5.3 Extension Tubes
2.3.5.4 Lens Diameter and Sensor Size
2.3.5.5 Sensor Resolution and Lens Quality
2.3.6 Choice of Illumination
2.3.6.1 Concept: Maximize Contrast
2.3.6.2 Illumination Setups
2.3.6.3 Light Sources
2.3.6.4 Approach to the Optimum Setup
2.3.6.5 Interfering Lighting
2.3.7 Mechanical Design
2.3.8 Electrical Design
2.3.9 Software
2.3.9.1 Software Library
2.3.9.2 Software Structure
2.3.9.3 General Topics
2.4 Costs
2.5 Words on Project Realization
2.5.1 Development and Installation
2.5.2 Test Run and Acceptance Test
2.5.3 Training and Documentation
2.6 Examples
2.6.1 Diameter Inspection of Rivets
2.6.1.1 Task
2.6.1.2 Specification
2.6.1.3 Design
2.6.2 Tubing Inspection
2.6.2.1 Task
2.6.2.2 Specification
2.6.2.3 Design
3 Lighting in Machine Vision
Irmgard Jahr
3.1 Introduction
3.1.1 Prologue
3.1.2 The Involvement of Lighting in the Complex Machine Vision Solution
3.2 Demands on Machine Vision Lighting
3.3 Light Used in Machine Vision
3.3.1 What is Light? Axioms of Light
3.3.2 Light and Light Perception
3.3.3 Light Sources for Machine Vision
3.3.3.1 Incandescent Lamps/Halogen Lamps
3.3.3.2 Metal Vapor Lamps
3.3.3.3 Xenon Lamps
3.3.3.4 Fluorescent Lamps
3.3.3.5 LEDs (Light Emitting Diodes)
3.3.3.6 Lasers
3.3.4 The Light Sources in Comparison
3.3.5 Considerations for Light Sources: Lifetime, Aging, Drift
3.3.5.1 Lifetime
3.3.5.2 Aging and Drift
3.4 Interaction of Test Object and Light
3.4.1 Risk Factor Test Object
3.4.1.1 What Does the Test Object Do with the Incoming Light?
3.4.1.2 Reflection/Reflectance/Scattering
3.4.1.3 Total Reflection
3.4.1.4 Transmission/Transmittance
3.4.1.5 Absorption/Absorbance
3.4.1.6 Diffraction
3.4.1.7 Refraction
3.4.2 Light Color and Part Color
3.4.2.1 Visible Light (VIS) - Monochromatic Light
3.4.2.2 Visible Light (VIS) - White Light
3.4.2.3 Infrared Light (IR)
3.4.2.4 Ultraviolet (UV) Light
3.4.2.5 Polarized Light
3.5 Basic Rules and Laws of Light Distribution
3.5.1 Basic Physical Quantities of Light
3.5.2 The Photometric Inverse Square Law
3.5.3 The Constancy of Luminance
3.5.4 What Light Arrives at the Sensor - Light Transmission Through the Lens
3.5.5 Light Distribution of Lighting Components
3.5.6 Contrast
3.5.7 Exposure
3.6 Light Filters
3.6.1 Characteristic Values of Light Filters
3.6.2 Influences of Light Filters on the Optical Path
3.6.3 Types of Light Filters
3.6.4 Anti-Reflective Coatings (AR)
3.6.5 Light Filters for Machine Vision
3.6.5.1 UV Blocking Filter
3.6.5.2 Daylight Suppression Filter
3.6.5.3 IR Suppression Filter
3.6.5.4 Neutral Filter/Neutral Density Filter/Gray Filter
3.6.5.5 Polarization Filter
3.6.5.6 Color Filters
3.6.5.7 Filter Combinations
3.7 Lighting Techniques and Their Use
3.7.1 How to Find a Suitable Lighting?
3.7.2 Planning the Lighting Solution - Influence Factors
3.7.3 Lighting Systematics
3.7.3.1 Directional Properties of the Light
3.7.3.2 Arrangement of the Lighting
3.7.3.3 Properties of the Illuminated Field
3.7.4 The Lighting Techniques in Detail
3.7.4.1 Diffuse Bright Field Incident Light (No. 1, Table 3.14)
3.7.4.2 Directed Bright Field Incident Light (No. 2, Table 3.14)
3.7.4.3 Telecentric Bright Field Incident Light (No. 3, Table 3.14)
3.7.4.4 Structured Bright Field Incident Light (No. 4, Table 3.14)
3.7.4.5 Diffuse Directed Partial Bright Field Incident Light (Nos. 1 and 2, Table 3.14)
3.7.4.6 Diffuse/Directed Dark Field Incident Light (Nos. 5 and 6, Table 3.14)
3.7.4.7 The Limits of the Incident Lighting
3.7.4.8 Diffuse Bright Field Transmitted Lighting (No. 7, Table 3.14)
3.7.4.9 Directed Bright Field Transmitted Lighting (No. 8, Table 3.14)
3.7.4.10 Telecentric Bright Field Transmitted Lighting (No. 9, Table 3.14)
3.7.4.11 Diffuse/Directed Transmitted Dark Field Lighting (Nos. 10 and 11, Table 3.14)
3.7.5 Combined Lighting Techniques
3.8 Lighting Control
3.8.1 Reasons for Light Control - The Environmental Industrial Conditions
3.8.2 Electrical Control
3.8.2.1 Stable Operation
3.8.2.2 Brightness Control
3.8.2.3 Temporal Control: Static-Pulse-Flash
3.8.2.4 Some Considerations for the Use of Flash Light
3.8.2.5 Temporal and Local Control: Adaptive Lighting
3.8.3 Geometrical Control
3.8.3.1 Lighting from Large Distances
3.8.3.2 Light Deflection
3.8.4 Suppression of Ambient and Extraneous Light - Measures for a Stable Lighting
3.9 Lighting Perspectives for the Future
References
4 Optical Systems in Machine Vision
Karl Lenhardt
4.1 A Look at the Foundations of Geometrical Optics
4.1.1 From Electrodynamics to Light Rays
4.1.2 Basic Laws of Geometrical Optics
4.2 Gaussian Optics
4.2.1 Reflection and Refraction at the Boundary between Two Media
4.2.2 Linearizing the Law of Refraction - The Paraxial Approximation
4.2.3 Basic Optical Conventions
4.2.3.1 Definitions for Image Orientations
4.2.3.2 Definition of the Magnification Ratio β
4.2.3.3 Real and Virtual Objects and Images
4.2.3.4 Tilt Rule for the Evaluation of Image Orientations by Reflection
4.2.4 Cardinal Elements of a Lens in Gaussian Optics
4.2.4.1 Focal Lengths f and f'
4.2.4.2 Convention
4.2.5 Thin Lens Approximation
4.2.6 Beam-Converging and Beam-Diverging Lenses
4.2.7 Graphical Image Constructions
4.2.7.1 Beam-Converging Lenses
4.2.7.2 Beam-Diverging Lenses
4.2.8 Imaging Equations and Their Related Coordinate Systems
4.2.8.1 Reciprocity Equation
4.2.8.2 Newton's Equations
4.2.8.3 General Imaging Equation
4.2.8.4 Axial Magnification Ratio
4.2.9 Overlapping of Object and Image Space
4.2.10 Focal Length, Lateral Magnification, and the Field of View
4.2.11 Systems of Lenses
4.2.12 Consequences of the Finite Extension of Ray Pencils
4.2.12.1 Effects of Limitations of the Ray Pencils
4.2.12.2 Several Limiting Openings
4.2.12.3 Characterizing the Limits of Ray Pencils
4.2.12.4 Relation to the Linear Camera Model
4.2.13 Geometrical Depth of Field and Depth of Focus
4.2.13.1 Depth of Field as a Function of the Object Distance p
4.2.13.2 Depth of Field as a Function of β
4.2.13.3 Hyperfocal Distance
4.2.13.4 Permissible Size for the Circle of Confusion d'
4.2.14 Laws of Central Projection - Telecentric System
4.2.14.1 Introduction to the Laws of Perspective
4.2.14.2 Central Projection from Infinity - Telecentric Perspective
4.3 Wave Nature of Light
4.3.1 Introduction
4.3.2 Rayleigh-Sommerfeld Diffraction Integral
4.3.3 Further Approximations to the Huygens-Fresnel Principle
4.3.3.1 Fresnel's Approximation
4.3.4 Impulse Response of an Aberration-Free Optical System
4.3.4.1 Case of Circular Aperture, Object Point on the Optical Axis
4.3.5 Intensity Distribution in the Neighborhood of the Geometrical Focus
4.3.5.1 Special Cases
4.3.6 Extension of the Point Spread Function in a Defocused Image Plane
4.3.7 Consequences for the Depth of Field Considerations
4.3.7.1 Diffraction and Permissible Circle of Confusion
4.3.7.2 Extension of the Point Spread Function at the Limits of the Depth of Focus
4.3.7.3 Useful Effective f-Number
4.4 Information Theoretical Treatment of Image Transfer and Storage
4.4.1 Physical Systems as Linear Invariant Filters
4.4.1.1 Invariant Linear Systems
4.4.1.2 Note to the Representation of Harmonic Waves
4.4.2 Optical Transfer Function (OTF) and the Meaning of Spatial Frequency
4.4.2.1 Note on the Relation Between the Elementary Functions in the Two Representation Domains
4.4.3 Extension to the Two-Dimensional Case
4.4.3.1 Interpretation of Spatial Frequency Components (r, s)
4.4.3.2 Reduction to One-Dimensional Representations
4.4.4 Impulse Response and MTF for Semiconductor Imaging Devices
4.4.5 Transmission Chain
4.4.6 Aliasing Effect and the Space-Variant Nature of Aliasing
4.4.6.1 Space-Variant Nature of Aliasing
4.5 Criteria for Image Quality
4.5.1 Gaussian Data
4.5.2 Overview on Aberrations of the Third Order
4.5.2.1 Monochromatic Aberrations of the Third Order (Seidel Aberrations)
4.5.2.2 Chromatic Aberrations
4.5.3 Image Quality in the Space Domain: PSF, LSF, ESF, and Distortion
4.5.3.1 Distortion
4.5.4 Image Quality in the Spatial Frequency Domain: MTF
4.5.4.1 Parameters that Influence the Modulation Transfer Function
4.5.5 Other Image Quality Parameters
4.5.5.1 Relative Illumination (Relative Irradiance)
4.5.5.2 Deviation from Telecentricity (for Telecentric Lenses only)
4.5.6 Manufacturing Tolerances and Image Quality
4.5.6.1 Measurement Errors due to Mechanical Inaccuracies of the Camera System
4.6 Practical Aspects: How to Specify Optics According to the Application Requirements?
4.6.1 Example for the Calculation of an Imaging Constellation
References
5 Camera Calibration
Robert Godding
5.1 Introduction
5.2 Terminology
5.2.1 Camera, Camera System
5.2.2 Coordinate Systems
5.2.3 Interior Orientation and Calibration
5.2.4 Exterior and Relative Orientation
5.2.5 System Calibration
5.3 Physical Effects
5.3.1 Optical System
5.3.2 Camera and Sensor Stability
5.3.3 Signal Processing and Transfer
5.4 Mathematical Calibration Model
5.4.1 Central Projection
5.4.2 Camera Model
5.4.3 Focal Length and Principal Point
5.4.4 Distortion and Affinity
5.4.5 Radial Symmetrical Distortion
5.4.6 Radial Asymmetrical and Tangential Distortion
5.4.7 Affinity and Nonorthogonality
5.4.8 Variant Camera Parameters
5.4.9 Sensor Flatness
5.4.10 Other Parameters
5.5 Calibration and Orientation Techniques
5.5.1 In the Laboratory
5.5.2 Using Bundle Adjustment to Determine Camera Parameters
5.5.2.1 Calibration Based Exclusively on Image Information
5.5.2.2 Calibration and Orientation with Additional Object Information
5.5.2.3 Extended System Calibration
5.5.3 Other Techniques
5.6 Verification of Calibration Results
5.7 Applications
5.7.1 Applications with Simultaneous Calibration
5.7.2 Applications with Precalibrated Cameras
5.7.2.1 Tube Measurement within a Measurement Cell
5.7.2.2 Online Measurements in the Field of Car Safety
5.7.2.3 High Resolution 3D Scanning with White Light Scanners
5.7.2.4 Other Applications
References
6 Camera Systems in Machine Vision
Horst Mattfeldt
6.1 Camera Technology
6.1.1 History in Brief
6.1.2 Machine Vision versus Closed Circuit TeleVision (CCTV)
6.2 Sensor Technologies
6.2.1 Spatial Differentiation: 1D and 2D
6.2.2 CCD Technology
6.2.2.1 Interline Transfer
6.2.2.2 Progressive Scan Interline Transfer
6.2.2.3 Interlaced Scan Readout
6.2.2.4 Enhancing Frame Rate by Multitap Sensors
6.2.2.5 SONY HAD Technology
6.2.2.6 SONY SuperHAD (II) and ExViewHAD (II) Technology
6.2.2.7 CCD Image Artifacts
6.2.2.8 Blooming
6.2.2.9 Smear
6.2.3 CMOS Image Sensor
6.2.3.1 Advantages of CMOS Sensor
6.2.3.2 CMOS Sensor Shutter Concepts
6.2.3.3 Performance Comparison of CMOS versus CCD
6.2.3.4 Integration Complexity of CCD versus CMOS Camera Technology
6.2.3.5 CMOS Sensor Sensitivity Enhancements
6.2.4 MATRIX VISION Available Cameras
6.2.4.1 Why So Many Different Models? How to Choose Among These?
6.2.4.2 Resolution and Video Standards
6.2.4.3 Sensor Sizes and Dimensions
6.3 Block Diagrams and Their Description
6.3.1 Block Diagram of SONY Progressive Scan Analog Camera
6.3.1.1 CCD Read Out Clocks
6.3.1.2 CCD Binning Mode
6.3.1.3 Spectral Sensitivity
6.3.1.4 Analog Signal Processing
6.3.1.5 Camera and Frame Grabber
6.3.2 Block Diagram of Color Camera with Digital Image Processing
6.3.2.1 Bayer™ Complementary Color Filter Array
6.3.2.2 Complementary Color Filters Spectral Sensitivity
6.3.2.3 Generation of Color Signals
6.4 mvBlueCOUGAR-X Line of Cameras
6.4.1 Black and White Digital Camera mvBlueCOUGAR-X Camera Series
6.4.1.1 Gray Level Sensor and Processing
6.4.2 Color Camera mvBlueCOUGAR-X Family
6.4.2.1 Analog Processing
6.4.2.2 Analog Front End (AFE)
6.4.2.3 A/D Conversion
6.4.2.4 One-Chip Color Processing
6.4.2.5 Inputting Time Stamp Data into Data Stream
6.4.2.6 Statistics Engine for White Balance and Auto Features
6.4.2.7 Image Memory
6.4.2.8 Lookup Table (LUT) and Gamma Function
6.4.2.9 Shading Correction
6.4.2.10 Reducing Noise by Adaptive Recursive Frame Averaging
6.4.2.11 Color Interpolation
6.4.2.12 Color Correction
6.4.2.13 RGB → YUV Conversion
6.4.3 Controlling Image Capture
6.4.4 Acquisition and Trigger Modes
6.4.4.1 Sequencer
6.4.4.2 Latency and Jitter Aspects
6.4.4.3 Action Commands
6.4.4.4 Scheduled Action Command
6.4.5 Data Transmission
6.4.5.1 GigE Vision and GVSP
6.4.5.2 USB3 Vision
6.4.6 Pixel Data
6.4.7 Camera Connection
6.4.8 Operating the Camera
6.4.9 HiRose Jack Pin Assignment
6.4.10 Sensor Frame Rates and Bandwidth
6.5 Configuration of a GigE Vision Camera
6.6 Qualifying Cameras and Noise Measurement (Dr. Gert Ferrano, MV)
6.6.1 Explanation of the Most Important Measurements
6.6.1.1 Linearity Curve
6.6.1.2 Photon Transfer Curve
6.7 Camera Noise (by Henning Haider, AVT; Updated by the Author)
6.7.1 Photon Noise
6.7.2 Dark Current Noise
6.7.3 Fixed Pattern Noise (FPN)
6.7.4 Photo Response Non Uniformity (PRNU)
6.7.5 Reset Noise
6.7.6 1/f Noise (Amplifier Noise)
6.7.7 Quantization Noise
6.7.8 Noise Floor
6.7.9 Dynamic Range
6.7.10 Signal to Noise Ratio
6.7.11 Example 1: SONY IMX-174 Sensor (mvBlueFOX3-2024)
6.7.12 Example 2: CMOSIS CMV2000 (mvBlueCOUGAR-X104)
6.8 Useful Links and Literature
6.9 Digital Interfaces
7 Smart Camera and Vision Systems Design
Howard D. Gray
Nate Holmes
7.1 Introduction to Vision System Design
7.2 Definitions
7.3 Smart Cameras
7.3.1 Applications
7.3.2 Component Parts
7.3.2.1 Processors
7.3.2.2 FPGA Processing
7.3.2.3 Memory and Storage
7.3.2.4 Operating Systems
7.3.2.5 Image Sensors
7.3.2.6 Inputs and Outputs
7.3.2.7 Other Interfaces
7.3.2.8 Timers and Counters
7.3.3 Programming and Configuring
7.3.3.1 Scripting
7.3.3.2 High-Level Languages
7.3.3.3 Third-Party Tools
7.3.4 Environment
7.3.4.1 Power Dissipation
7.3.4.2 Ingress Protection
7.4 Vision Sensors
7.4.1 Applications
7.4.2 Component Parts
7.4.3 Programming and Configuring
7.4.4 Environment
7.5 Embedded Vision Systems
7.5.1 Applications
7.5.1.1 Multi-Camera Applications
7.5.1.2 Closed Loop Control Applications
7.5.2 Component Parts
7.5.3 Programming and Configuring
7.5.4 Environment
7.6 Conclusion
References
Further Reading
8 Camera Computer Interfaces
Nate Holmes
8.1 Overview
8.2 Camera Buses
8.2.1 Software Standards
8.2.1.1 GenICam
8.2.1.2 IIDC2
8.2.2 Analog Camera Buses (Legacy)
8.2.2.1 Analog Video Signal
8.2.2.2 Interlaced Video
8.2.2.3 Progressive Scan Video
8.2.2.4 Timing Signals
8.2.2.5 Analog Image Acquisition
8.2.2.6 S-Video
8.2.2.7 RGB
8.2.2.8 Analog Connectors
8.2.3 Parallel Digital Camera Buses (Legacy)
8.2.3.1 Digital Video Transmission
8.2.3.2 Taps
8.2.3.3 Differential Signaling
8.2.3.4 Line Scan
8.2.3.5 Parallel Digital Connectors
8.2.4 IEEE 1394 (FireWire) (Legacy)
8.2.4.1 IEEE 1394 for Machine Vision
8.2.5 Camera Link
8.2.5.1 Camera Link Signals
8.2.5.2 Camera Link Connectors
8.2.6 Camera Link HS
8.2.7 CoaXPress
8.2.8 USB (USB3 Vision)
8.2.8.1 USB for Machine Vision
8.2.9 Gigabit Ethernet (GigE Vision)
8.2.9.1 Gigabit Ethernet for Machine Vision
8.2.9.2 GigE Vision Device Discovery
8.2.9.3 GigE Vision Control Protocol (GVCP)
8.2.9.4 GenICam
8.2.9.5 GigE Vision Stream Protocol (GVSP)
8.2.9.6 Packet Loss and Resends
8.2.10 Future Standards Development
8.3 Choosing a Camera Bus
8.3.1 Bandwidth
8.3.2 Resolution
8.3.3 Frame Rate
8.3.4 Cables
8.3.5 Line Scan
8.3.6 Reliability
8.3.7 Summary of Camera Bus Specifications
8.3.8 Sample Use Cases
8.3.8.1 Manufacturing Inspection
8.3.8.2 LCD Inspection
8.3.8.3 Security
8.4 Computer Buses
8.4.1 ISA/EISA
8.4.2 PCI/CompactPCI/PXI
8.4.3 PCI-X
8.4.4 PCI Express/CompactPCI Express/PXI Express
8.4.5 Throughput
8.4.6 Prevalence and Lifetime
8.4.6.1 Cost
8.5 Choosing a Computer Bus
8.5.1 Determine Throughput Requirements
8.5.2 Applying the Throughput Requirements
8.6 Driver Software
8.6.1 Application Programming Interface
8.6.2 Supported Platforms
8.6.3 Performance
8.6.4 Utility Functions
8.6.5 Acquisition Mode
8.6.5.1 Snap
8.6.5.2 Grab
8.6.5.3 Sequence
8.6.5.4 Ring
8.6.6 Image Representation
8.6.6.1 Image Representation in Memory
8.6.7 Bayer Color Encoding
8.6.7.1 Image Representation on Disk
8.6.8 Image Display
8.6.8.1 Understanding Display Modes
8.6.8.2 Palettes
8.6.8.3 Nondestructive Overlays
8.7 Features of a Machine Vision System
8.7.1 Image Reconstruction
8.7.2 Timing and Triggering
8.7.3 Memory Handling
8.7.4 Additional Features
8.7.4.1 Look-Up Tables
8.7.4.2 Region of Interest
8.7.4.3 Color Space Conversion
8.7.4.4 Shading Correction
8.8 Summary
References
9 Machine Vision Algorithms
Carsten Steger
9.1 Fundamental Data Structures
9.1.1 Images
9.1.2 Regions
9.1.3 Subpixel-Precise Contours
9.2 Image Enhancement
9.2.1 Gray Value Transformations
9.2.2 Radiometric Calibration
9.2.3 Image Smoothing
9.2.4 Fourier Transform
9.3 Geometric Transformations
9.3.1 Affine Transformations
9.3.2 Projective Transformations
9.3.3 Image Transformations
9.3.4 Polar Transformations
9.4 Image Segmentation
9.4.1 Thresholding
9.4.2 Extraction of Connected Components
9.4.3 Subpixel-Precise Thresholding
9.5 Feature Extraction
9.5.1 Region Features
9.5.2 Gray Value Features
9.5.3 Contour Features
9.6 Morphology
9.6.1 Region Morphology
9.6.2 Gray Value Morphology
9.7 Edge Extraction
9.7.1 Definition of Edges in One and Two Dimensions
9.7.2 1D Edge Extraction
9.7.3 2D Edge Extraction
9.7.4 Accuracy of Edges
9.8 Segmentation and Fitting of Geometric Primitives
9.8.1 Fitting Lines
9.8.2 Fitting Circles
9.8.3 Fitting Ellipses
9.8.4 Segmentation of Contours into Lines, Circles, and Ellipses
9.9 Camera Calibration
9.9.1 Camera Models for Area Scan Cameras
9.9.2 Camera Model for Line Scan Cameras
9.9.3 Calibration Process
9.9.4 World Coordinates from Single Images
9.9.5 Accuracy of the Camera Parameters
9.10 Stereo Reconstruction
9.10.1 Stereo Geometry
9.10.2 Stereo Matching
9.11 Template Matching
9.11.1 Gray-Value-Based Template Matching
9.11.2 Matching Using Image Pyramids
9.11.3 Subpixel-Accurate Gray-Value-Based Matching
9.11.4 Template Matching with Rotations and Scalings
9.11.5 Robust Template Matching
9.12 Optical Character Recognition
9.12.1 Character Segmentation
9.12.2 Feature Extraction
9.12.3 Classification
References
10 Machine Vision in Manufacturing
Peter Waszkewitz
10.1 Introduction
10.1.1 The Machine Vision Market
10.2 Application Categories
10.2.1 Types of Tasks
10.2.2 Types of Production
10.2.2.1 Discrete Unit Production Versus Continuous Flow
10.2.2.2 Job-Shop Production Versus Mass Production
10.2.3 Types of Evaluations
10.2.4 Value-Adding Machine Vision
10.3 System Categories
10.3.1 Common Types of Systems
10.3.2 Sensors
10.3.3 Vision Sensors
10.3.4 Compact Systems
10.3.5 Vision Controllers
10.3.6 PC-Based Systems
10.3.6.1 Library-Based Systems
10.3.6.2 Application-Package-Based Systems
10.3.6.3 Library-Based Application Packages
10.3.7 Excursion: Embedded Image Processing
10.3.8 Summary
10.4 Integration and Interfaces
10.4.1 Standardization
10.4.2 Interfaces
10.5 Mechanical Interfaces
10.5.1 Dimensions and Fixation
10.5.2 Working Distances
10.5.3 Position Tolerances
10.5.4 Forced Constraints
10.5.5 Additional Sensor Requirements
10.5.6 Additional Motion Requirements
10.5.7 Environmental Conditions
10.5.8 Reproducibility
10.5.9 Gauge Capability
10.6 Electrical Interfaces
10.6.1 Wiring and Movement
10.6.2 Power Supply
10.6.3 Internal Data Connections
10.6.4 External Data Connections
10.7 Information Interfaces
10.7.1 Interfaces and Standardization
10.7.2 Traceability
10.7.3 Types of Data and Data Transport
10.7.4 Control Signals
10.7.5 Result and Parameter Data
10.7.6 Mass Data
10.7.7 Digital I/O
10.7.8 Field Bus
10.7.9 Serial Interfaces
10.7.10 Network
10.7.10.1 Standard Ethernet-TCP/IP
10.7.10.2 OPC UA and Industry 4.0
10.7.10.3 Ethernet-Based Field Bus/Real-Time Ethernet
10.7.11 Files
10.7.12 Time and Integrity Considerations
10.8 Temporal Interfaces
10.8.1 Discrete Motion Production
10.8.2 Continuous Motion Production
10.8.3 Line-Scan Processing
10.9 Human-Machine Interfaces
10.9.1 Interfaces for Engineering Vision Systems
10.9.2 Runtime Interface
10.9.2.1 Using the PLC HMI for Machine Vision
10.9.3 Remote Maintenance
10.9.3.1 Safety Precaution: No Movements
10.9.4 Offline Setup
10.10 3D Systems
10.10.1 Dimensionality and Representation
10.10.1.1 Dimensionality
10.10.1.2 2.5D and 3D
10.10.1.3 Point Clouds and Registration
10.10.1.4 Representation
10.10.2 3D Data Acquisition
10.10.2.1 Passive Methods
10.10.2.2 Active Methods
10.10.3 Applications
10.10.3.1 Identification
10.10.3.2 Completeness Check
10.10.3.3 Object and Pose Recognition
10.10.3.4 Shape and Dimension Applications
10.10.3.5 Surface Inspection
10.10.3.6 Robotics
10.10.4 Conclusion
10.11 Industrial Case Studies
10.11.1 Glue Check Under UV Light
10.11.1.1 Task
10.11.1.2 Solution
10.11.1.3 Equipment
10.11.1.4 Algorithms
10.11.1.5 Key Points
10.11.2 Completeness Check
10.11.2.1 Task
10.11.2.2 Solution
10.11.2.3 Key Point: Mechanical Setup
10.11.2.4 Equipment
10.11.2.5 Algorithms
10.11.3 Multiple Position and Completeness Check
10.11.3.1 Task
10.11.3.2 Solution
10.11.3.3 Key Point: Cycle Time
10.11.3.4 Equipment
10.11.3.5 Algorithms
10.11.4 Pin-Type Verification
10.11.4.1 Task
10.11.4.2 Solution
10.11.4.3 Key Point: Self-Test
10.11.4.4 Equipment
10.11.4.5 Algorithms
10.11.5 Robot Guidance
10.11.5.1 Task
10.11.5.2 Solution
10.11.5.3 Key Point: Calibration
10.11.5.4 Key Point: Communication
10.11.5.5 Equipment
10.11.5.6 Algorithms
10.11.6 Type and Result Data Management
10.11.6.1 Task
10.11.6.2 Solution
10.11.6.3 Key Point: Type Data
10.11.6.4 Key Point: Result Data
10.11.6.5 Equipment
10.11.7 Dimensional Check for Process Control
10.11.7.1 Task
10.11.7.2 Solution
10.11.7.3 Equipment
10.11.7.4 Algorithms
10.11.8 Ceramic Surface Check
10.11.8.1 Task
10.11.8.2 Solution
10.11.8.3 Equipment
10.12 Constraints and Conditions
10.12.1 Inspection Task Requirements
10.12.2 Circumstantial Requirements
10.12.2.1 Cost
10.12.2.2 Automation Environment
10.12.2.3 Organizational Environment
10.12.3 Refinements
10.12.4 Limits and Prospects
References
Appendix
Index
The editor, Alexander Hornberg, worked as a development and software engineer in industry. Since 1997 he has been working in the field of machine vision in an academic environment. He is Professor of Image Processing and Applied Optics at the University of Applied Sciences Esslingen, Germany. All contributions to this work are written by practitioners from leading companies operating in the field of computer vision.