
E-book: Statistics in Engineering: With Examples in MATLAB and R, Second Edition

Andrew Metcalfe (University of Adelaide, Australia), David Green, Andrew Smith, Jonathan Tuke, Mahayaudin Mansor, Tony Greenfield (Greenfield Research, UK)
  • Format: PDF+DRM
  • Price: €59.79*
  • * The price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books are non-returnable.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You will also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on mobile devices (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (this is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Engineers are expected to design structures and machines that can operate in challenging and volatile environments, while allowing for variation in materials and noise in measurements and signals. Statistics in Engineering, Second Edition: With Examples in MATLAB and R covers the fundamentals of probability and statistics and explains how to use these basic techniques to estimate and model random variation in the context of engineering analysis and design in all types of environments. The first eight chapters cover probability and probability distributions, graphical displays of data and descriptive statistics, combinations of random variables and propagation of error, statistical inference, bivariate distributions and correlation, linear regression on a single predictor variable, and the measurement error model. This leads to chapters including multiple regression; comparisons of several means and split-plot designs together with analysis of variance; probability models; and sampling strategies.
Distinctive features include:
  • All examples based on work in industry, consulting to industry, and research for industry
  • Examples and case studies include all engineering disciplines
  • Emphasis on probabilistic modeling including decision trees, Markov chains and processes, and structure functions
  • Intuitive explanations are followed by succinct mathematical justifications
  • Emphasis on random number generation that is used for stochastic simulations of engineering systems, demonstration of key concepts, and implementation of bootstrap methods for inference
  • Use of MATLAB and the open source software R, both of which have an extensive range of statistical functions for standard analyses and also enable programming of specific applications
  • Use of multiple regression for time series models and analysis of factorial and central composite designs
  • Inclusion of topics such as Weibull analysis of failure times and split-plot designs that are commonly used in industry but are not usually included in introductory textbooks
  • Experiments designed to show fundamental concepts that have been tested with large classes working in small groups
  • Website with additional materials that is regularly updated

Andrew Metcalfe, David Green, Andrew Smith, and Jonathan Tuke have taught probability and statistics to students of engineering at the University of Adelaide for many years and have substantial industry experience. Their current research includes applications to water resources engineering, mining, and telecommunications. Mahayaudin Mansor worked in banking and insurance before teaching statistics and business mathematics at the Universiti Tun Abdul Razak Malaysia and is currently a researcher specializing in data analytics and quantitative research in the Health Economics and Social Policy Research Group at the Australian Centre for Precision Health, University of South Australia.
Tony Greenfield, formerly Head of Process Computing and Statistics at the British Iron and Steel Research Association, is a statistical consultant. He has been awarded the Chambers Medal for outstanding services to the Royal Statistical Society; the George Box Medal by the European Network for Business and Industrial Statistics for Outstanding Contributions to Industrial Statistics; and the William G. Hunter Award by the American Society for Quality.

Reviews

"Statistics in Engineering: With Examples in MATLAB and R is an ideal and unreservedly recommended textbook for college and university library collections." ~John Burroughs, Reviewer's Bookwatch

"Distinctive features of this new second edition of Statistics in Engineering include: All examples being based on work in industry, consulting to industry, and research for industry; Emphasis on probabilistic modeling including decision trees, Markov chains and processes, and structure functions; Intuitive explanations are followed by succinct mathematical justifications; Emphasis on random number generation that is used for stochastic simulations of engineering systems, demonstration of key concepts, and implementation of bootstrap methods for inference; Use of MATLAB and the open source software R, both of which have an extensive range of statistical functions for standard analyses and also enable programming of specific applications; Use of multiple regression for time series models and analysis of factorial and central composite designs; Inclusion of topics such as Weibull analysis of failure times and split-plot designs that are commonly used in industry but are not usually included in introductory textbooks; Experiments designed to show fundamental concepts that have been tested with large classes working in small groups." ~Midwest Book Review

Preface xvii
1 Why understand statistics?
1(2)
1.1 Introduction
1(1)
1.2 Using the book
2(1)
1.3 Software
2(1)
2 Probability and making decisions
3(52)
2.1 Introduction
3(1)
2.2 Random digits
4(3)
2.2.1 Concepts and uses
4(1)
2.2.2 Generating random digits
5(1)
2.2.3 Pseudo random digits
6(1)
2.3 Defining probabilities
7(8)
2.3.1 Defining probabilities -- Equally likely outcomes
8(3)
2.3.2 Defining probabilities -- Relative frequencies
11(2)
2.3.3 Defining probabilities -- Subjective probability and expected monetary value
13(2)
2.4 Axioms of probability
15(1)
2.5 The addition rule of probability
15(3)
2.5.1 Complement
16(2)
2.6 Conditional probability
18(7)
2.6.1 Conditioning on information
18(1)
2.6.2 Conditional probability and the multiplicative rule
18(2)
2.6.3 Independence
20(3)
2.6.4 Tree diagrams
23(2)
2.7 Bayes' theorem
25(4)
2.7.1 Law of total probability
26(1)
2.7.2 Bayes' theorem for two events
27(1)
2.7.3 Bayes' theorem for any number of events
28(1)
2.8 Decision trees
29(2)
2.9 Permutations and combinations
31(2)
2.10 Simple random sample
33(2)
2.11 Summary
35(2)
2.11.1 Notation
35(1)
2.11.2 Summary of main results
36(1)
2.11.3 MATLAB® and R commands
36(1)
2.12 Exercises
37(18)
3 Graphical displays of data and descriptive statistics
55(82)
3.1 Types of variables
55(3)
3.2 Samples and populations
58(3)
3.3 Displaying data
61(18)
3.3.1 Stem-and-leaf plot
61(1)
3.3.2 Time series plot
62(3)
3.3.3 Pictogram
65(3)
3.3.4 Pie chart
68(1)
3.3.5 Bar chart
68(2)
3.3.6 Rose plot
70(1)
3.3.7 Line chart for discrete variables
70(3)
3.3.8 Histogram and cumulative frequency polygon for continuous variables
73(4)
3.3.9 Pareto chart
77(2)
3.4 Numerical summaries of data
79(16)
3.4.1 Population and sample
79(2)
3.4.2 Measures of location
81(9)
3.4.3 Measures of spread
90(5)
3.5 Box-plots
95(2)
3.6 Outlying values and robust statistics
97(2)
3.6.1 Outlying values
97(1)
3.6.2 Robust statistics
98(1)
3.7 Grouped data
99(4)
3.7.1 Calculation of the mean and standard deviation for discrete data
99(1)
3.7.2 Grouped continuous data -- Mean and standard deviation
100(1)
3.7.3 Mean as center of gravity
101(2)
3.7.4 Case study of wave stress on offshore structure
103(1)
3.8 Shape of distributions
103(5)
3.8.1 Skewness
103(1)
3.8.2 Kurtosis
104(1)
3.8.3 Some contrasting histograms
105(3)
3.9 Multivariate data
108(5)
3.9.1 Scatter plot
108(2)
3.9.2 Histogram for bivariate data
110(1)
3.9.3 Parallel coordinates plot
111(2)
3.10 Descriptive time series
113(8)
3.10.1 Definition of time series
113(1)
3.10.2 Missing values in time series
114(1)
3.10.3 Decomposition of time series
114(1)
3.10.3.1 Trend -- Centered moving average
114(1)
3.10.3.2 Seasonal component -- Additive monthly model
115(1)
3.10.3.3 Seasonal component -- Multiplicative monthly model
115(1)
3.10.3.4 Seasonal adjustment
116(1)
3.10.3.5 Forecasting
116(3)
3.10.4 Index numbers
119(2)
3.11 Summary
121(2)
3.11.1 Notation
121(1)
3.11.2 Summary of main results
121(1)
3.11.3 MATLAB and R commands
122(1)
3.12 Exercises
123(14)
4 Discrete probability distributions
137(38)
4.1 Discrete random variables
137(3)
4.1.1 Definition of a discrete probability distribution
138(1)
4.1.2 Expected value
139(1)
4.2 Bernoulli trial
140(2)
4.2.1 Introduction
140(1)
4.2.2 Defining the Bernoulli distribution
141(1)
4.2.3 Mean and variance of the Bernoulli distribution
141(1)
4.3 Binomial distribution
142(8)
4.3.1 Introduction
142(1)
4.3.2 Defining the Binomial distribution
142(5)
4.3.3 A model for conductivity
147(1)
4.3.4 Mean and variance of the binomial distribution
148(1)
4.3.5 Random deviates from binomial distribution
149(1)
4.3.6 Fitting a binomial distribution
149(1)
4.4 Hypergeometric distribution
150(3)
4.4.1 Defining the hypergeometric distribution
151(1)
4.4.2 Random deviates from the hypergeometric distribution
152(1)
4.4.3 Fitting the hypergeometric distribution
152(1)
4.5 Negative binomial distribution
153(5)
4.5.1 The geometric distribution
153(1)
4.5.2 Defining the negative binomial distribution
154(1)
4.5.3 Applications of negative binomial distribution
155(2)
4.5.4 Fitting a negative binomial distribution
157(1)
4.5.5 Random numbers from a negative binomial distribution
157(1)
4.6 Poisson process
158(4)
4.6.1 Defining a Poisson process in time
158(1)
4.6.2 Superimposing Poisson processes
158(1)
4.6.3 Spatial Poisson process
158(1)
4.6.4 Modifications to Poisson processes
159(1)
4.6.5 Poisson distribution
159(1)
4.6.6 Fitting a Poisson distribution
160(1)
4.6.7 Times between events
161(1)
4.7 Summary
162(2)
4.7.1 Notation
162(1)
4.7.2 Summary of main results
162(1)
4.7.3 MATLAB and R commands
163(1)
4.8 Exercises
164(11)
5 Continuous probability distributions
175(58)
5.1 Continuous random variables
175(6)
5.1.1 Definition of a continuous random variable
175(1)
5.1.2 Definition of a continuous probability distribution
176(1)
5.1.3 Moments of a continuous probability distribution
177(4)
5.1.4 Median and mode of a continuous probability distribution
181(1)
5.1.5 Parameters of probability distributions
181(1)
5.2 Uniform distribution
181(3)
5.2.1 Definition of a uniform distribution
182(1)
5.2.2 Applications of the uniform distribution
183(1)
5.2.3 Random deviates from a uniform distribution
183(1)
5.2.4 Distribution of F(X) is uniform
183(1)
5.2.5 Fitting a uniform distribution
184(1)
5.3 Exponential distribution
184(10)
5.3.1 Definition of an exponential distribution
184(2)
5.3.2 Markov property
186(1)
5.3.2.1 Poisson process
186(1)
5.3.2.2 Lifetime distribution
186(1)
5.3.3 Applications of the exponential distribution
187(2)
5.3.4 Random deviates from an exponential distribution
189(1)
5.3.5 Fitting an exponential distribution
190(4)
5.4 Normal (Gaussian) distribution
194(9)
5.4.1 Definition of a normal distribution
194(1)
5.4.2 The standard normal distribution Z ~ N(0,1)
195(4)
5.4.3 Applications of the normal distribution
199(4)
5.4.4 Random numbers from a normal distribution
203(1)
5.4.5 Fitting a normal distribution
203(1)
5.5 Probability plots
203(2)
5.5.1 Quantile-quantile plots
204(1)
5.5.2 Probability plot
204(1)
5.6 Lognormal distribution
205(4)
5.6.1 Definition of a lognormal distribution
205(3)
5.6.2 Applications of the lognormal distribution
208(1)
5.6.3 Random numbers from lognormal distribution
209(1)
5.6.4 Fitting a lognormal distribution
209(1)
5.7 Gamma distribution
209(4)
5.7.1 Definition of a gamma distribution
210(2)
5.7.2 Applications of the gamma distribution
212(1)
5.7.3 Random deviates from gamma distribution
212(1)
5.7.4 Fitting a gamma distribution
212(1)
5.8 Gumbel distribution
213(5)
5.8.1 Definition of a Gumbel distribution
213(2)
5.8.2 Applications of the Gumbel distribution
215(1)
5.8.3 Random deviates from a Gumbel distribution
215(1)
5.8.4 Fitting a Gumbel distribution
216(2)
5.9 Summary
218(2)
5.9.1 Notation
218(1)
5.9.2 Summary of main results
218(1)
5.9.3 MATLAB and R commands
219(1)
5.10 Exercises
220(13)
6 Correlation and functions of random variables
233(46)
6.1 Introduction
233(3)
6.2 Sample covariance and correlation coefficient
236(8)
6.2.1 Defining sample covariance
236(8)
6.3 Bivariate distributions, population covariance and correlation coefficient
244(12)
6.3.1 Population covariance and correlation coefficient
245(1)
6.3.2 Bivariate distributions -- Discrete case
246(2)
6.3.3 Bivariate distributions -- Continuous case
248(1)
6.3.3.1 Marginal distributions
248(1)
6.3.3.2 Bivariate histogram
249(1)
6.3.3.3 Covariance and correlation
250(1)
6.3.3.4 Bivariate probability distributions
251(5)
6.3.4 Copulas
256(1)
6.4 Linear combination of random variables (propagation of error)
256(9)
6.4.1 Mean and variance of a linear combination of random variables
257(2)
6.4.1.1 Bounds for correlation coefficient
259(1)
6.4.2 Linear combination of normal random variables
260(2)
6.4.3 Central Limit Theorem and distribution of the sample mean
262(3)
6.5 Non-linear functions of random variables (propagation of error)
265(2)
6.6 Summary
267(1)
6.6.1 Notation
267(1)
6.6.2 Summary of main results
267(1)
6.6.3 MATLAB and R commands
268(1)
6.7 Exercises
268(11)
7 Estimation and inference
279(78)
7.1 Introduction
279(1)
7.2 Statistics as estimators
279(6)
7.2.1 Population parameters
280(1)
7.2.2 Sample statistics and sampling distributions
280(2)
7.2.3 Bias and MSE
282(3)
7.3 Accuracy and precision
285(1)
7.4 Precision of estimate of population mean
285(14)
7.4.1 Confidence interval for population mean when σ is known
285(3)
7.4.2 Confidence interval for mean when σ is unknown
288(1)
7.4.2.1 Construction of confidence interval and rationale for the t-distribution
288(1)
7.4.2.2 The t-distribution
289(2)
7.4.3 Robustness
291(1)
7.4.4 Bootstrap methods
292(1)
7.4.4.1 Bootstrap resampling
292(1)
7.4.4.2 Basic bootstrap confidence intervals
293(1)
7.4.4.3 Percentile bootstrap confidence intervals
293(3)
7.4.5 Parametric bootstrap
296(3)
7.5 Hypothesis testing
299(6)
7.5.1 Hypothesis test for population mean when σ is known
300(2)
7.5.2 Hypothesis test for population mean when σ is unknown
302(1)
7.5.3 Relation between a hypothesis test and the confidence interval
303(1)
7.5.4 P-value
304(1)
7.5.5 One-sided confidence intervals and one-sided tests
304(1)
7.6 Sample size
305(2)
7.7 Confidence interval for a population variance and standard deviation
307(2)
7.8 Comparison of means
309(8)
7.8.1 Independent samples
309(1)
7.8.1.1 Population standard deviations differ
309(3)
7.8.1.2 Population standard deviations assumed equal
312(3)
7.8.2 Matched pairs
315(2)
7.9 Comparing variances
317(1)
7.10 Inference about proportions
318(7)
7.10.1 Single sample
318(2)
7.10.2 Comparing two proportions
320(3)
7.10.3 McNemar's test
323(2)
7.11 Prediction intervals and statistical tolerance intervals
325(2)
7.11.1 Prediction interval
325(1)
7.11.2 Statistical tolerance interval
326(1)
7.12 Goodness of fit tests
327(5)
7.12.1 Chi-square test
328(2)
7.12.2 Empirical distribution function tests
330(2)
7.13 Summary
332(3)
7.13.1 Notation
332(1)
7.13.2 Summary of main results
333(2)
7.13.3 MATLAB and R commands
335(1)
7.14 Exercises
335(22)
8 Linear regression and linear relationships
357(46)
8.1 Linear regression
357(19)
8.1.1 Introduction
357(2)
8.1.2 The model
359(2)
8.1.3 Fitting the model
361(1)
8.1.3.1 Fitting the regression line
361(2)
8.1.3.2 Identical forms for the least squares estimate of the slope
363(1)
8.1.3.3 Relation to correlation
363(1)
8.1.3.4 Alternative form for the fitted regression line
364(1)
8.1.3.5 Residuals
365(1)
8.1.3.6 Identities satisfied by the residuals
366(1)
8.1.3.7 Estimating the standard deviation of the errors
367(1)
8.1.3.8 Checking assumptions A3, A4 and A5
368(1)
8.1.4 Properties of the estimators
368(1)
8.1.4.1 Estimator of the slope
369(2)
8.1.4.2 Estimator of the intercept
371(1)
8.1.5 Predictions
371(1)
8.1.5.1 Confidence interval for mean value of Y given x
371(2)
8.1.5.2 Limits of prediction
373(1)
8.1.5.3 Plotting confidence intervals and prediction limits
374(1)
8.1.6 Summarizing the algebra
375(1)
8.1.7 Coefficient of determination R2
376(1)
8.2 Regression for a bivariate normal distribution
376(2)
8.2.1 The bivariate normal distribution
377(1)
8.3 Regression towards the mean
378(2)
8.4 Relationship between correlation and regression
380(3)
8.4.1 Values of x are assumed to be measured without error and can be preselected
381(1)
8.4.2 The data pairs are assumed to be a random sample from a bivariate normal distribution
381(2)
8.5 Fitting a linear relationship when both variables are measured with error
383(3)
8.6 Calibration lines
386(3)
8.7 Intrinsically linear models
389(4)
8.8 Summary
393(2)
8.8.1 Notation
393(1)
8.8.2 Summary of main results
393(1)
8.8.3 MATLAB and R commands
394(1)
8.9 Exercises
395(8)
9 Multiple regression
403(88)
9.1 Introduction
403(1)
9.2 Multivariate data
404(1)
9.3 Multiple regression model
405(3)
9.3.1 The linear model
405(1)
9.3.2 Random vectors
406(1)
9.3.2.1 Linear transformations of a random vector
406(1)
9.3.2.2 Multivariate normal distribution
407(1)
9.3.3 Matrix formulation of the linear model
407(1)
9.3.4 Geometrical interpretation
407(1)
9.4 Fitting the model
408(10)
9.4.1 Principle of least squares
408(1)
9.4.2 Multivariate calculus -- Three basic results
409(1)
9.4.3 The least squares estimator of the coefficients
410(1)
9.4.4 Estimating the coefficients
411(5)
9.4.5 Estimating the standard deviation of the errors
416(1)
9.4.6 Standard errors of the estimators of the coefficients
417(1)
9.5 Assessing the fit
418(4)
9.5.1 The residuals
419(1)
9.5.2 R-squared
420(1)
9.5.3 F-statistic
421(1)
9.5.4 Cross validation
422(1)
9.6 Predictions
422(2)
9.7 Building multiple regression models
424(26)
9.7.1 Interactions
424(4)
9.7.2 Categorical variables
428(5)
9.7.3 F-test for an added set of variables
433(7)
9.7.4 Quadratic terms
440(7)
9.7.5 Guidelines for fitting regression models
447(3)
9.8 Time series
450(15)
9.8.1 Introduction
450(1)
9.8.2 Aliasing and sampling intervals
450(1)
9.8.3 Fitting a trend and seasonal variation with regression
451(5)
9.8.4 Auto-covariance and auto-correlation
456(1)
9.8.4.1 Defining auto-covariance for a stationary time series model
457(1)
9.8.4.2 Defining sample auto-covariance and the correlogram
458(1)
9.8.5 Auto-regressive models
459(1)
9.8.5.1 AR(1) and AR(2) models
460(5)
9.9 Non-linear least squares
465(3)
9.10 Generalized linear model
468(6)
9.10.1 Logistic regression
468(2)
9.10.2 Poisson regression
470(4)
9.11 Summary
474(2)
9.11.1 Notation
474(1)
9.11.2 Summary of main results
474(1)
9.11.3 MATLAB and R commands
475(1)
9.12 Exercises
476(15)
10 Statistical quality control
491(68)
10.1 Continuous improvement
491(5)
10.1.1 Defining quality
491(1)
10.1.2 Taking measurements
492(1)
10.1.3 Avoiding rework
493(1)
10.1.4 Strategies for quality improvement
494(1)
10.1.5 Quality management systems
494(1)
10.1.6 Implementing continuous improvement
495(1)
10.2 Process stability
496(14)
10.2.1 Runs chart
496(3)
10.2.2 Histograms and box plots
499(2)
10.2.3 Components of variance
501(9)
10.3 Capability
510(4)
10.3.1 Process capability index
510(1)
10.3.2 Process performance index
511(1)
10.3.3 One-sided process capability indices
512(2)
10.4 Reliability
514(16)
10.4.1 Introduction
514(1)
10.4.1.1 Reliability of components
514(1)
10.4.1.2 Reliability function and the failure rate
515(2)
10.4.2 Weibull analysis
517(1)
10.4.2.1 Definition of the Weibull distribution
517(1)
10.4.2.2 Weibull quantile plot
518(4)
10.4.2.3 Censored data
522(2)
10.4.3 Maximum likelihood
524(5)
10.4.4 Kaplan-Meier estimator of reliability
529(1)
10.5 Acceptance sampling
530(3)
10.6 Statistical quality control charts
533(15)
10.6.1 Shewhart mean and range chart for continuous variables
533(1)
10.6.1.1 Mean chart
533(2)
10.6.1.2 Range chart
535(3)
10.6.2 P-charts for proportions
538(1)
10.6.3 C-charts for counts
539(3)
10.6.4 Cumulative sum charts
542(2)
10.6.5 Multivariate control charts
544(4)
10.7 Summary
548(2)
10.7.1 Notation
548(1)
10.7.2 Summary of main results
548(2)
10.7.3 MATLAB and R commands
550(1)
10.8 Exercises
550(9)
11 Design of experiments with regression analysis
559(46)
11.1 Introduction
559(3)
11.2 Factorial designs with factors at two levels
562(18)
11.2.1 Full factorial designs
562(1)
11.2.1.1 Setting up a 2^k design
562(3)
11.2.1.2 Analysis of 2^k design
565(15)
11.3 Fractional factorial designs
580(5)
11.4 Central composite designs
585(8)
11.5 Evolutionary operation (EVOP)
593(4)
11.6 Summary
597(1)
11.6.1 Notation
597(1)
11.6.2 Summary of main results
597(1)
11.6.3 MATLAB and R commands
598(1)
11.7 Exercises
598(7)
12 Design of experiments and analysis of variance
605(44)
12.1 Introduction
605(1)
12.2 Comparison of several means with one-way ANOVA
605(8)
12.2.1 Defining the model
606(1)
12.2.2 Limitation of multiple t-tests
606(1)
12.2.3 One-way ANOVA
607(3)
12.2.4 Testing H0
610(1)
12.2.5 Follow up procedure
610(3)
12.3 Two factors at multiple levels
613(8)
12.3.1 Two factors without replication (two-way ANOVA)
614(4)
12.3.2 Two factors with replication (three-way ANOVA)
618(3)
12.4 Randomized block design
621(5)
12.5 Split plot design
626(10)
12.6 Summary
636(2)
12.6.1 Notation
636(1)
12.6.2 Summary of main results
637(1)
12.6.3 MATLAB and R commands
637(1)
12.7 Exercises
638(11)
13 Probability models
649(50)
13.1 System reliability
649(13)
13.1.1 Series system
649(1)
13.1.2 Parallel system
650(1)
13.1.3 K-out-of-n system
651(1)
13.1.4 Modules
652(1)
13.1.5 Duality
653(2)
13.1.6 Paths and cut sets
655(1)
13.1.7 Reliability function
656(2)
13.1.8 Redundancy
658(1)
13.1.9 Non-repairable systems
658(1)
13.1.10 Standby systems
659(2)
13.1.11 Common cause failures
661(1)
13.1.12 Reliability bounds
661(1)
13.2 Markov chains
662(22)
13.2.1 Discrete Markov chain
663(4)
13.2.2 Equilibrium behavior of irreducible Markov chains
667(3)
13.2.3 Methods for solving equilibrium equations
670(5)
13.2.4 Absorbing Markov chains
675(6)
13.2.5 Markov chains in continuous time
681(3)
13.3 Simulation of systems
684(10)
13.3.1 The simulation procedure
685(4)
13.3.2 Drawing inference from simulation outputs
689(3)
13.3.3 Variance reduction
692(2)
13.4 Summary
694(2)
13.4.1 Notation
694(1)
13.4.2 Summary of main results
694(2)
13.5 Exercises
696(3)
14 Sampling strategies
699(28)
14.1 Introduction
699(3)
14.2 Simple random sampling from a finite population
702(6)
14.2.1 Finite population correction
702(1)
14.2.2 Randomization theory
703(1)
14.2.2.1 Defining the simple random sample
703(1)
14.2.2.2 Mean and variance of sample mean
704(1)
14.2.2.3 Mean and variance of estimator of population total
705(2)
14.2.3 Model based analysis
707(1)
14.2.4 Sample size
708(1)
14.3 Stratified sampling
708(5)
14.3.1 Principle of stratified sampling
709(1)
14.3.2 Estimating the population mean and total
709(2)
14.3.3 Optimal allocation of the sample over strata
711(2)
14.4 Multi-stage sampling
713(3)
14.5 Quota sampling
716(1)
14.6 Ratio estimators and regression estimators
716(2)
14.6.1 Introduction
716(1)
14.6.2 Regression estimators
716(1)
14.6.3 Ratio estimator
716(2)
14.7 Calibration of the unit cost data base
718(3)
14.7.1 Sources of error in an AMP
718(1)
14.7.2 Calibration factor
719(2)
14.8 Summary
721(1)
14.8.1 Notation
721(1)
14.8.2 Summary of main results
721(1)
14.9 Exercises
722(5)
Appendix A -- Notation
727(4)
A.1 General
727(1)
A.2 Probability
727(1)
A.3 Statistics
728(1)
A.4 Probability distributions
729(2)
Appendix B Glossary
731(14)
Appendix C Getting started in R
745(10)
C.1 Installing R
745(1)
C.2 Using R as a calculator
745(2)
C.3 Setting the path
747(1)
C.4 R scripts
747(1)
C.5 Data entry
747(2)
C.5.1 From keyboard
747(1)
C.5.2 From a file
748(1)
C.5.2.1 Single variable
748(1)
C.5.2.2 Several variables
748(1)
C.6 R vectors
749(1)
C.7 User defined functions
750(1)
C.8 Matrices
750(1)
C.9 Loops and conditionals
751(1)
C.10 Basic plotting
752(1)
C.11 Installing packages
753(1)
C.12 Creating time series objects
753(2)
Appendix D Getting started in MATLAB
755(10)
D.1 Installing MATLAB
755(1)
D.2 Using MATLAB as a calculator
755(1)
D.3 Setting the path
756(1)
D.4 MATLAB scripts (m-files)
756(1)
D.5 Data entry
757(1)
D.5.1 From keyboard
757(1)
D.5.2 From a file
757(1)
D.5.2.1 Single variable
757(1)
D.5.2.2 Several variables
758(1)
D.6 MATLAB vectors
758(3)
D.7 User defined functions
761(1)
D.8 Matrices
761(1)
D.9 Loops and conditionals
761(2)
D.10 Basic plotting
763(1)
D.11 Creating time series objects
764(1)
Appendix E Experiments
765(18)
E.1 How good is your probability assessment?
765(2)
E.1.1 Objectives
765(1)
E.1.2 Experiment
765(1)
E.1.3 Question sets
765(2)
E.1.4 Discussion
767(1)
E.1.5 Follow up questions
767(1)
E.2 Buffon's needle
767(1)
E.2.1 Objectives
767(1)
E.2.2 Experiment
767(1)
E.2.3 Questions
768(1)
E.2.4 Computer simulation
768(1)
E.2.5 Historical note
768(1)
E.3 Robot rabbit
768(4)
E.3.1 Objectives
768(1)
E.3.2 Experiment
769(1)
E.3.3 Data
770(1)
E.3.4 Discussion
770(2)
E.3.5 Follow up question
772(1)
E.4 Use your braking brains
772(1)
E.4.1 Objectives
772(1)
E.4.2 Experiment
772(1)
E.4.3 Discussion
772(1)
E.5 Predicting descent time from payload
773(1)
E.5.1 Objectives
773(1)
E.5.2 Experiment
773(1)
E.5.3 Discussion
774(1)
E.5.4 Follow up question
774(1)
E.6 Company efficiency, resources and teamwork
774(2)
E.6.1 Objectives
774(1)
E.6.2 Experiment
774(2)
E.6.3 Discussion
776(1)
E.7 Factorial experiment -- reaction times by distraction, dexterity and distinctness
776(2)
E.7.1 Aim
776(1)
E.7.2 Experiment
776(1)
E.7.3 Analysis
776(1)
E.7.4 Discussion
777(1)
E.7.5 Follow up questions
777(1)
E.8 Weibull analysis of cycles to failure
778(1)
E.8.1 Aim
778(1)
E.8.2 Experiment
778(1)
E.8.3 Weibull plot
778(1)
E.8.4 Discussion
779(1)
E.9 Control or tamper?
779(2)
E.10 Where is the summit?
781(2)
References 783(6)
Index 789

Visit their website here:

http://www.maths.adelaide.edu.au/david.green/BookWebsite/