
Probability, Statistical Optics, and Data Testing: A Problem Solving Approach, 3rd ed. 2001 [Paperback]

  • Format: Paperback / softback, XXIV + 496 pages, height x width: 235x155 mm, weight: 1600 g
  • Series: Springer Series in Information Sciences 10
  • Publication date: 17-Jul-2001
  • Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. K
  • ISBN-10: 3540417087
  • ISBN-13: 9783540417088
  • Paperback
  • Price: 141,35 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 166,29 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
  • Free shipping
  • Lead time: 2-4 weeks
Scientists in optics are increasingly confronted with problems that are of a random nature and that require a working knowledge of probability and statistics for their solution. This textbook develops these subjects within the context of optics, using a problem-solving approach. All methods are explicitly derived and can be traced back to three simple axioms given at the outset. Students with some previous exposure to Fourier optics or linear theory will find the material particularly absorbing and easy to understand. This third edition contains many new applications to optical and physical phenomena, including a method of estimating probability laws exactly by regarding them as laws of physics to be determined through a new variational principle.
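
For orientation, the "three simple axioms" mentioned above are presumably the standard axioms of probability theory; the table of contents below lists "Definition of Probability", "Additivity Property", and "Normalization Property", which suggests the usual Kolmogorov form. A minimal sketch under that assumption, for events A and B drawn from an event space S:

% Assumed (Kolmogorov-style) axioms; this is illustrative, not the book's own numbering or wording
\[
  P(A) \ge 0, \qquad P(S) = 1, \qquad
  P(A \cup B) = P(A) + P(B) \quad \text{for disjoint } A,\ B .
\]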

Reviews

From the reviews of the third edition:

"Scientists in optics are increasingly confronted with problems that are of a random nature and that require a working knowledge of probability and statistics for their solutions. This textbook develops these subjects within the context of optics using a problem-solving approach. The book is exclusively wealthy in contents. The author generously shares his reflections and assumptions with the reader and puts the unsolved problems yet, what makes this book an especially interesting one." (Dmitry Ostrouchov, Zentralblatt MATH, Vol. 978, 2002)

Other information

Springer Book Archives
Introduction
1(6)
What Is Chance, and Why Study It?
3(4)
Chance vs. Determinism
3(1)
Probability Problems in Optics
4(1)
Statistical Problems in Optics
5(1)
Historical Notes
5(2)
The Axiomatic Approach
7(32)
Notion of an Experiment; Events
7(2)
Event Space; The Space Event
8(1)
Disjoint Events
8(1)
The Certain Event
9(1)
Definition of Probability
9(1)
Relation to Frequency of Occurrence
10(1)
Some Elementary Consequences
10(2)
Additivity Property
11(1)
Normalization Property
11(1)
Marginal Probability
12(1)
The "Traditional" Definition of Probability
12(1)
Illustrative Problem: A Dice Game
13(1)
Illustrative Problem: Let's (Try to) Take a Trip
14(1)
Law of Large Numbers
15(1)
Optical Objects and Images as Probability Laws
15(2)
Conditional Probability
17(1)
The Quantity of Information
18(2)
Statistical Independence
20(2)
Illustrative Problem: Let's (Try to) Take a Trip (Continued)
21(1)
Informationless Messages
22(1)
A Definition of Noise
22(1)
"Additivity" Property of Information
23(1)
Partition Law
23(1)
Illustrative Problem: Transmittance Through a Film
24(1)
How to Correct a Success Rate for Guesses
25(1)
Bayes' Rule
26(1)
Some Optical Applications
27(1)
Information Theory Application
28(1)
Application to Markov Events
28(1)
Complex Number Events
29(1)
What is the Probability of Winning a Lottery Jackpot?
30(1)
What is the Probability of a Coincidence of Birthdays at a Party?
31(8)
Continuous Random Variables
39(40)
Definition of a Random Variable
39(1)
Probability Density Function, Basic Properties
39(2)
Information Theory Application: Continuous Limit
41(1)
Optical Application: Continuous Form of Imaging Law
41(1)
Expected Values, Moments
42(1)
Optical Application: Moments of the Slit Diffraction Pattern
43(1)
Information Theory Application
44(1)
Case of Statistical Independence
45(1)
Mean of a Sum
45(1)
Optical Application
46(1)
Deterministic Limit; Representations of the Dirac δ-Function
47(1)
Correspondence Between Discrete and Continuous Cases
48(1)
Cumulative Probability
48(1)
The Means of an Algebraic Expression: A Simplified Approach
49(1)
A Potpourri of Probability Laws
50(18)
Poisson
50(1)
Binomial
51(1)
Uniform
51(1)
Exponential
52(1)
Normal (One-Dimensional)
53(1)
Normal (Two-Dimensional)
53(2)
Normal (Multi-Dimensional)
55(1)
Skewed Gaussian Case; Gram-Charlier Expansion
56(1)
Optical Application
56(2)
Geometric Law
58(1)
Cauchy Law
58(1)
sinc² Law
58(10)
Derivation of Heisenberg Uncertainty Principle
68(2)
Schwarz Inequality for Complex Functions
68(1)
Fourier Relations
68(1)
Uncertainty Product
69(1)
Hirschman's Form of the Uncertainty Principle
70(1)
Measures of Information
70(3)
Kullback-Leibler Information
70(1)
Renyi Information
71(1)
Wootters Information
71(1)
Hellinger Information
72(1)
Tsallis Information
72(1)
Fisher Information
72(1)
Fisher Information Matrix
73(6)
Fourier Methods in Probability
79(28)
Characteristic Function Defined
79(1)
Use in Generating Moments
80(1)
An Alternative to Describing RV x
80(1)
On Optical Applications
80(1)
Shift Theorem
81(1)
Poisson Case
81(1)
Binomial Case
82(1)
Uniform Case
82(1)
Exponential Case
82(1)
Normal Case (One Dimension)
83(1)
Multidimensional Cases
83(1)
Normal Case (Two Dimensions)
83(1)
Convolution Theorem, Transfer Theorem
83(1)
Probability Law for the Sum of Two Independent RV's
84(1)
Optical Applications
85(2)
Imaging Equation as the Sum of Two Random Displacements
85(1)
Unsharp Masking
85(2)
Sum of n Independent RV's; The "Random Walk" Phenomenon
87(2)
Resulting Mean and Variance: Normal, Poisson, and General Cases
89(1)
Sum of n Dependent RV's
89(1)
Case of Two Gaussian Bivariate RV's
90(1)
Sampling Theorems for Probability
91(1)
Case of Limited Range of x, Derivation
91(1)
Discussion
92(1)
Optical Application
93(1)
Case of Limited Range of ω
94(1)
Central Limit Theorem
94(1)
Derivation
95(2)
How Large Does n Have To Be?
97(1)
Optical Applications
97(3)
Cascaded Electro-Optical Systems
97(1)
Laser Resonator
98(1)
Atmospheric Turbulence
99(1)
Generating Normally Distributed Numbers from Uniformly Random Numbers
100(2)
The Error Function
102(5)
Functions of Random Variables
107(40)
Case of a Single Random Variable
107(1)
Unique Root
108(1)
Application from Geometrical Optics
109(1)
Multiple Roots
110(1)
Illustrative Example
111(1)
Case of n Random Variables, r Roots
111(1)
Optical Applications
112(1)
Statistical Modeling
112(1)
Application of Transformation Theory to Laser Speckle
113(6)
Physical Layout
113(1)
Plan
114(1)
Statistical Model
114(1)
Marginal Probabilities for Light Amplitudes U_re, U_im
115(1)
Correlation Between U_re and U_im
116(1)
Joint Probability Law for U_re, U_im
117(1)
Probability Laws for Intensity and Phase; Transformation of the RV's
117(1)
Marginal Laws for Intensity and Phase
118(1)
Signal-to-Noise (S/N) Ratio in the Speckle Image
118(1)
Speckle Reduction by Use of a Scanning Aperture
119(4)
Statistical Model
119(1)
Probability Density for Output Intensity p₁(v)
120(1)
Moments and S/N Ratio
121(1)
Standard Form for the Chi-Square Distribution
122(1)
Calculation of Spot Intensity Profiles Using Transformation Theory
123(3)
Illustrative Example
124(1)
Implementation by Ray-Trace
125(1)
Application of Transformation Theory to a Satellite-Ground Communication Problem
126(14)
Unequal Numbers of Input and Output Variables: "Helper Variables"
140(2)
Probability Law for a Quotient of Random Variables
140(1)
Probability Law for a Product of Independent Random Variables
141(1)
More Complicated Transformation Problems
142(1)
Use of an Invariance Principle to Find a Probability Law
142(2)
Probability Law for Transformation of a Discrete Random Variable
144(3)
Bernoulli Trials and Limiting Cases
147(28)
Analysis
147(2)
Illustrative Problems
149(3)
Illustrative Problem: Let's (Try to) Take a Trip: The Last Word
149(1)
Illustrative Problem: Mental Telepathy as a Communication Link?
150(2)
Characteristic Function and Moments
152(1)
Optical Application: Checkerboard Model of Granularity
152(2)
The Poisson Limit
154(3)
Analysis
154(1)
Example of Degree of Approximation
155(1)
Normal Limit of Poisson Law
156(1)
Optical Application: The Shot Effect
157(1)
Optical Application: Combined Sources
158(1)
Poisson Joint Count for Two Detectors – Intensity Interferometry
158(4)
The Normal Limit (De Moivre-Laplace Law)
162(13)
Derivation
162(1)
Conditions of Use
163(1)
Use of the Error Function
164(11)
The Monte Carlo Calculation
175(16)
Producing Random Numbers That Obey a Prescribed Probability Law
176(2)
Illustrative Case
177(1)
Normal Case
177(1)
Analysis of the Photographic Emulsion by Monte Carlo Calculation
178(2)
Application of the Monte Carlo Calculation to Remote Sensing
180(1)
Monte Carlo Formation of Optical Images
181(2)
An Example
182(1)
Monte Carlo Simulation of Speckle Patterns
183(8)
Stochastic Processes
191(52)
Definition of a Stochastic Process
191(1)
Definition of Power Spectrum
192(2)
Some Examples of Power Spectra
194(1)
Definition of Autocorrelation Function; Kinds of Stationarity
194(1)
Fourier Transform Theorem
195(1)
Case of a "White" Power Spectrum
196(1)
Application: Average Transfer Function Through Atmospheric Turbulence
197(4)
Statistical Model for Phase Fluctuations
198(1)
A Transfer Function for Turbulence
199(2)
Transfer Theorems for Power Spectra
201(7)
Determining the MTF Using Random Objects
201(1)
Speckle Interferometry of Labeyrie
202(1)
Resolution Limits of Speckle Interferometry
203(5)
Transfer Theorem for Autocorrelation: The Knox-Thompson Method
208(3)
Additive Noise
211(1)
Random Noise
212(1)
Ergodic Property
213(4)
Optimum Restoring Filter
217(4)
Definition of Restoring Filter
217(1)
Model
218(1)
Solution
219(2)
Information Content in the Optical Image
221(5)
Statistical Model
222(1)
Analysis
223(1)
Noise Entropy
223(1)
Data Entropy
224(1)
The Answer
225(1)
Interpretation
225(1)
Data Information and Its Ability to be Restored
226(1)
Superposition Processes; the Shot Noise Process
227(16)
Probability Law for i
229(1)
Some Important Averages
229(1)
Mean Value ⟨i(x₀)⟩
230(1)
Shot Noise Case
231(1)
Second Moment ⟨i²(x₀)⟩
231(1)
Variance σ²
232(1)
Shot Noise Case
232(1)
Signal-to-Noise (S/N) Ratio
233(1)
Autocorrelation Function
234(1)
Shot Noise Case
235(1)
Application: An Overlapping Circular Grain Model for the Emulsion
236(1)
Application: Light Fluctuations due to Randomly Tilted Waves, the "Swimming Pool" Effect
237(6)
Introduction to Statistical Methods: Estimating the Mean, Median, Variance, S/N, and Simple Probability
243(34)
Estimating a Mean from a Finite Sample
244(1)
Statistical Model
244(1)
Analysis
245(1)
Discussion
246(1)
Error in a Discrete, Linear Processor: Why Linear Methods Often Fail
246(2)
Estimating a Probability: Derivation of the Law of Large Numbers
248(1)
Variance of Error
249(1)
Illustrative Uses of the Error Expression
250(2)
Estimating Probabilities from Empirical Rates
250(1)
Aperture Size for Required Accuracy in Transmittance Readings
251(1)
Probability Law for the Estimated Probability; Confidence Limits
252(1)
Calculation of the Sample Variance
253(5)
Unbiased Estimate of the Variance
253(2)
Expected Error in the Sample Variance
255(1)
Illustrative Problems
256(2)
Estimating the Signal-to-Noise Ratio; Student's Probability Law
258(3)
Probability Law for SNR
258(1)
Moments of SNR
259(2)
Limit c → 0; A Student Probability Law
261(1)
Properties of a Median Window
261(2)
Statistics of the Median
263(6)
Probability Law for the Median
264(1)
Laser Speckle Case: Exponential Probability Law
264(5)
Dominance of the Cauchy Law in Diffraction
269(8)
Estimating an Optical Slit Position: An Optical Central Limit Theorem
270(1)
Analysis by Characteristic Function
270(2)
Cauchy Limit, Showing Independence to Aberrations
272(1)
Widening the Scope of the Optical Central Limit Theorem
273(4)
Introduction to Estimating Probability Laws
277(30)
Estimating Probability Densities Using Orthogonal Expansions
278(3)
Karhunen-Loeve Expansion
281(1)
The Multinomial Probability Law
282(1)
Derivation
282(1)
Illustrative Example
283(1)
Estimating an Empirical Occurrence Law as the Maximum Probable Answer
283(24)
Principle of Maximum Probability (MP)
284(1)
Maximum Entropy Estimate
285(1)
The Search for "Maximum Prior Ignorance"
286(1)
Other Types of Estimates (Summary)
287(1)
Return to Maximum Entropy Estimation, Discrete Case
288(1)
Transition to a Continuous Random Variable
289(1)
Solution
290(1)
Maximized H
290(1)
Illustrative Example: Significance of the Normal Law
290(1)
The Smoothness Property; Least Biased Aspect
291(1)
A Well Known Distribution Derived
292(1)
When Does the Maximum Entropy Estimate Equal the True Law?
293(1)
Maximum Probable Estimates of Optical Objects
294(2)
Case of Nearly Featureless Objects
296(11)
The Chi-Square Test of Significance
307(14)
Forming the χ² Statistic
308(1)
Probability Law for χ² Statistic
309(2)
When is a Coin Fixed?
311(1)
Equivalence of Chi-Square to Other Statistics; Sufficient Statistics
312(1)
When Is a Vote Decisive?
313(1)
Generalization to N Voters
314(1)
Use as an Image Detector
315(6)
The Student t-Test on the Mean
321(12)
Cases Where Data Accuracy is Unknown
322(1)
Philosophy of the Approach: Statistical Inference
322(1)
Forming the Statistic
323(2)
Student's t-Distribution: Derivation
325(1)
Some Properties of Student's t-Distribution
326(1)
Application to the Problem: Student's t-Test
327(1)
Illustrative Example
327(2)
Other Applications
329(4)
The F-Test on Variance
333(8)
Snedecor's F-Distribution; Derivation
333(1)
Some Properties of Snedecor's F-Distribution
334(1)
The F-Test
335(1)
Illustrative Example
336(1)
Application to Image Detection
336(5)
Least-Squares Curve Fitting – Regression Analysis
341(22)
Summation Model for the Physical Effect
341(2)
Linear Regression Model for the Noise
343(2)
Equivalence of ML and Least-Squares Solutions
345(1)
Solution
346(1)
Return to Film Problem
347(1)
"Significant" Factors; The R-Statistic
347(2)
Example: Was T2 an Insignificant Factor?
349(1)
Accuracy of the Estimated Coefficients
350(13)
Absorptance of an Optical Fiber
350(1)
Variance of Error in the General Case
351(2)
Error in the Estimated Absorptance of an Optical Fiber
353(10)
Principal Components Analysis
363(12)
A Photographic Problem
363(1)
Equivalent Eigenvalue Problem
364(2)
The Eigenvalues as Sample Variances
366(1)
The Data in Terms of Principal Components
366(1)
Reduction in Data Dimensionality
367(1)
Return to the H-D Problem
368(1)
Application to Multispectral Imagery
368(3)
Error Analysis
371(4)
The Controversy Between Bayesians and Classicists
375(12)
Bayesian Approach to Confidence Limits for an Estimated Probability
376(4)
Probability Law for the Unknown Probability
377(1)
Assumption of a Uniform Prior
377
Irrelevance of Choice of Prior Statistic p₀(x) if N is Large
376(3)
Limiting Form for N Large
379(1)
Illustrative Problem
379(1)
Laplace's Rule of Succession
380(7)
Derivation
380(3)
Role of the Prior
383(1)
Bull Market, Bear Market
384(3)
Introduction to Estimation Methods
387(64)
Deterministic Parameters: Likelihood Theory
388(10)
Unbiased Estimators
388(1)
Maximum Likelihood Estimators
389(2)
Cramer-Rao Lower Bound on Error
391(2)
Achieving the Lower Bound
393(1)
Testing for Efficient Estimators
394(1)
Can a Bound to the Error be Known if an Efficient Estimator Does Not Exist?
395(2)
When can the Bhattacharyya Bound be Achieved?
397(1)
Random Parameters: Bayesian Estimation Theory
398(15)
Cost Functions
399(1)
Risk
400(5)
MAP Estimates
405(8)
Exact Estimates of Probability Laws: The Principle of Extreme Physical Information
413(38)
A Knowledge-Based View of Nature
415(1)
Fisher Information as a Bridge Between Noumenon and Phenomenon
416(3)
No Absolute Origins
419(1)
Invariance of the Fisher Information Length to Unitary Transformation
420(1)
Multidimensional Form of I
421(1)
Lorentz Transformation of Special Relativity
422(1)
What Constants Should be Regarded as Universal Physical Constants?
423(1)
Transition to Complex Probability Amplitudes
424(1)
Space-Time Measurement: Information Capacity in Fourier Space
424(1)
Relation Among Energy, Mass and Momentum
425(1)
Ultimate Resolution
426(1)
Bound Information J and Efficiency Constant k
426(1)
Perturbing Effect of the Probe Particle
427(1)
Equality of the Perturbed Informations
428(3)
EPI Variational Principle, and Framework
431(2)
The Measurement Process in Detail
433(2)
Euler-Lagrange Solutions
435(1)
Free-Field Klein-Gordon Equation
435(1)
Dirac Equations
436(1)
Schroedinger Wave Equation (SWE)
437(1)
Dimensionality, and Plato's Cave
438(1)
Wheeler's "Participatory Universe"
439(1)
Exhaustivity Property, and Future Research
439(1)
Can EPI be Used in a Design Mode?
440(1)
EPI as a Knowledge Acquisition Game
440(1)
Ultimate Uses of EPI
441(10)
Appendix 451(18)
Appendix A
451(2)
Appendix B
453(2)
Appendix C
455(1)
Appendix D
456(3)
Appendix E
459(1)
Appendix F
460(4)
Appendix G
464(5)
References 469(10)
Index 479