
E-book: Bayes Linear Statistics: Theory and Methods

Michael Goldstein (University of Durham, UK), David Wooff (University of Durham, UK)
  • Format: PDF+DRM
  • Price: €164.19*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not permitted

  • Printing:

    not permitted

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You will also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free application: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Bayesian methods combine data with any prior information available from expert knowledge, with the linear approach providing a quantitative structure for expressing beliefs and systematic methods for adjusting beliefs, given observational data. It has become essential to engineers, computer scientists, and a range of social science researchers as well as statisticians. Written for these professionals but accessible enough for graduate students, this book covers the foundations of Bayesian linear statistics, including theory, methodology, and practical applications. Goldstein and Wooff (both U. of Durham) explain the features of the approach and turn immediately to expectation, adjusting beliefs, observed adjustment, partial Bayes linear analysis, exchangeable and co-exchangeable beliefs, population variances, belief comparison, Bayes linear graphical models, applicable matrix algebra, and implementation. Annotation ©2007 Book News, Inc., Portland, OR (booknews.com)

Bayesian methods combine information available from data with any prior information available from expert knowledge. The Bayes linear approach follows this path, offering a quantitative structure for expressing beliefs, and systematic methods for adjusting these beliefs, given observational data. The methodology differs from the full Bayesian methodology in that it establishes simpler approaches to belief specification and analysis based around expectation judgements. Bayes Linear Statistics presents an authoritative account of this approach, explaining the foundations, theory, methodology, and practicalities of this important field.

The text provides a thorough coverage of Bayes linear analysis, from the development of the basic language to the collection of algebraic results needed for efficient implementation, with detailed practical examples.
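The basic machinery that language is built on is the adjusted expectation and adjusted variance. As a rough numerical sketch of that adjustment step (this is not code from the book; the function and variable names are invented, and the formulas are the standard Bayes linear adjustment equations):

```python
import numpy as np

def adjust(E_X, E_D, var_X, var_D, cov_XD, d):
    """Bayes linear adjustment of beliefs about X given observed data d.

    Adjusted expectation: E_d(X) = E(X) + Cov(X, D) Var(D)^+ (d - E(D))
    Adjusted variance:    Var_d(X) = Var(X) - Cov(X, D) Var(D)^+ Cov(D, X)
    """
    K = cov_XD @ np.linalg.pinv(var_D)  # pseudo-inverse copes with singular Var(D)
    return E_X + K @ (d - E_D), var_X - K @ cov_XD.T

# Scalar illustration: prior E(X) = 0, Var(X) = 1; D carries half the
# information about X (Cov(X, D) = 1, Var(D) = 2), and we observe d = 1.
e, v = adjust(np.array([0.0]), np.array([0.0]),
              np.array([[1.0]]), np.array([[2.0]]),
              np.array([[1.0]]), np.array([1.0]))
# e = [0.5], v = [[0.5]]: the observation shifts the expectation and halves the variance.
```

Only expectations, variances, and covariances are required as inputs, which is what distinguishes the approach from a full probabilistic prior specification.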

The book covers:

  • The importance of partial prior specifications for complex problems where it is difficult to supply a meaningful full prior probability specification.
  • Simple ways to use partial prior specifications to adjust beliefs, given observations.
  • Interpretative and diagnostic tools to display the implications of collections of belief statements, and to make stringent comparisons between expected and actual observations.
  • General approaches to statistical modelling based upon partial exchangeability judgements.
  • Bayes linear graphical models to represent and display partial belief specifications, organize computations, and display the results of analyses.
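The "stringent comparisons between expected and actual observations" mentioned above are typically driven by a standardized discrepancy. A minimal sketch, under invented names (the formula is the usual Mahalanobis-style discrepancy, whose prior expectation equals the rank of Var(D)):

```python
import numpy as np

def discrepancy(E_D, var_D, d):
    """Discrepancy of observed data d against prior beliefs about D.

    Dis(d) = (d - E(D))' Var(D)^+ (d - E(D)); its prior expectation is
    rank(Var(D)), so values far above the rank flag surprising data.
    """
    resid = d - E_D
    return float(resid @ np.linalg.pinv(var_D) @ resid)

# Two uncorrelated observables with unit prior variance: the first lands
# three standard deviations away from its prior expectation.
dis = discrepancy(np.zeros(2), np.eye(2), np.array([3.0, 0.0]))
# dis = 9.0, against a prior expectation of rank(Var(D)) = 2: cause for concern.
```

Diagnostics of this flavour are what allow the belief statements themselves, not just the data, to be put under scrutiny.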

Bayes Linear Statistics is essential reading for all statisticians concerned with the theory and practice of Bayesian methods. There is an accompanying website hosting free software and guides to the calculations within the book.

Reviews

"The book is an essential reading for all statisticians concerned with the theory and practice of Bayesian methods. There is an accompanying website hosting free software and guides to the calculations within the book." (Zentralblatt MATH, 2012)

"Summarizing, the book is an interesting compendium of methods of updating beliefs." (Stat Papers, 2010)

"The authors are to be congratulated for their pioneering effort in writing this book. Hopefully, more books and articles will follow, and the methodology will someday be part of mainstream statistics." (Technometrics, November 2008)

"The authors are to be congratulated for their pioneering effort in writing this book.  Hopefully, more books and articles will follow, and the methodology will someday be part of mainstream statistics." (Technometrics, November 2008)

"The book provides an extensive introduction and explanation of the subject and augments theory with numerous illustrative examples, including relevant considerations for specifying beliefs and diagnostics for assessing appropriateness." (Journal of the American Statistical Association, September 2008)

Preface.
1 The Bayes linear approach.
1.1 Combining beliefs with data
1.2 The Bayesian approach
1.3 Features of the Bayes linear approach.
1.4 Example.
1.4.1 Expectation, variance, and standardization.
1.4.2 Prior inputs.
1.4.3 Adjusted Expectations.
1.4.4 Adjusted versions.
1.4.5 Adjusted variances.
1.4.6 Checking data inputs.
1.4.7 Observed adjusted expectations.
1.4.8 Diagnostics for adjusted beliefs.
1.4.9 Further diagnostics for the adjusted versions.
1.4.10 Summary of basic adjustment.
1.4.11 Diagnostics for collections.
1.4.12 Exploring collections of beliefs
1.4.13 Modifying the original specifications.
1.4.14 Repeating the analysis for the revised model.
1.4.15 Global analysis of collections of observations.
1.4.16 Partial adjustments.
1.4.17 Partial diagnostics.
1.4.18 Summary.
1.5 Overview.
2 Expectation.
2.1 Expectation as a primitive.
2.2 Discussion: Expectation as a primitive.
2.3 Quantifying collections of uncertainties.
2.4 Specifying prior beliefs.
2.4.1 Example: oral glucose tolerance test.
2.5 Qualitative and quantitative prior specification.
2.6 Example: qualitative representation of uncertainty.
2.6.1 Identifying the quantities of interest.
2.6.2 Identifying relevant prior information.
2.6.3 Sources of variation.
2.6.4 Representing population variation.
2.6.5 The qualitative representation.
2.6.6 Graphical models.
2.7 Example: quantifying uncertainty.
2.7.1 Prior expectations.
2.7.2 Prior variances.
2.7.3 Prior covariances.
2.7.4 Summary of belief specifications.
2.8 Discussion: on the various methods for assigning expectations.
3 Adjusting beliefs.
3.1 Adjusted expectation
3.2 Properties of adjusted expectation.
3.3 Adjusted variance.
3.4 Interpretations of belief adjustment.
3.5 Foundational issues concerning belief adjustment.
3.6 Example: one-dimensional problem.
3.7 Collections of adjusted beliefs.
3.8 Examples.
3.8.1 Algebraic example.
3.8.2 Oral glucose tolerance test.
3.8.3 Many oral glucose tolerance tests.
3.9 Canonical analysis for a belief adjustment.
3.9.1 Canonical directions for the adjustment.
3.9.2 The resolution transform.
3.9.3 Partitioning the resolution.
3.9.4 The reverse adjustment.
3.9.5 Minimal linear sufficiency.
3.9.6 The adjusted belief transform matrix.
3.10 The geometric interpretation of belief adjustment.
3.11 Examples.
3.11.1 Simple one-dimensional problem.
3.11.2 Algebraic example.
3.11.3 Oral glucose tolerance test.
3.12 Further reading.
4 The observed adjustment.
4.1 Discrepancy.
4.1.1 Discrepancy for a collection.
4.1.2 Evaluating discrepancy over a basis.
4.1.3 Discrepancy for quantities with variance zero.
4.2 Properties of discrepancy measures.
4.2.1 Evaluating the discrepancy vector over a basis.
4.3 Examples.
4.3.1 Simple one-dimensional problem.
4.3.2 Detecting degeneracy.
4.3.3 Oral glucose tolerance test.
4.4 The observed adjustment.
4.4.1 Adjustment discrepancy.
4.4.2 Adjustment discrepancy for a collection.
4.4.3 Maximal discrepancy.
4.4.4 Construction over a basis.
4.4.5 Partitioning the discrepancy.
4.5 Examples.
4.5.1 Simple one-dimensional problem.
4.5.2 Oral glucose tolerance test.
4.6 The size of an adjustment.
4.6.1 The size of an adjustment for a collection.
4.7 The bearing for an adjustment.
4.7.1 Construction via a basis.
4.7.2 Representing discrepancy vectors as bearings.
4.8 Joint bearings.
4.9 Size diagnostics.
4.10 Geometric interpretation.
4.11 Linear likelihood.
4.12 Examples.
4.12.1 Algebraic example.
4.12.2 Oral glucose tolerance test.
5 Partial Bayes linear analysis.
5.1 Partial adjustment.
5.2 Partial variance.
5.3 Partial resolution transforms.
5.4 Relative belief adjustment.
5.5 Example - Oral glucose tolerance test.
5.5.1 Performing an initial adjustment.
5.5.2 Partial resolved variances.
5.5.3 Partial canonical directions.
5.5.4 Deducing changes for other linear combinations.
5.5.5 Relative belief adjustment.
5.5.6 Withdrawing quantities from the adjustment.
5.6 Partial bearings.
5.7 Partial data size.
5.8 Bearing and size for a relative adjustment.
5.9 Path correlation.
5.10 Example - Oral glucose tolerance test.
5.10.1 The initial observed adjustment.
5.10.2 Observed partial expectations.
5.10.3 The size of the partial adjustment.
5.10.4 The bearing for the partial adjustment.
5.10.5 The path correlation for the partial adjustment.
5.11 Sequential adjustment.
5.11.1 The data trajectory.
5.12 The canonical trajectory.
5.13 Detection of systematic bias.
5.14 Examples.
5.14.1 Example: Anscombe data sets.
5.14.2 Example - regression with correlated responses.
5.15 Bayes linear sufficiency and belief separation.
5.16 Properties of generalized conditional independence.
5.17 Properties of belief separation.
5.18 Example - regression with correlated responses.
5.18.1 Exploiting separation.
5.18.2 Heart of the transform.
5.19 Further reading.
6 Exchangeable beliefs.
6.1 Exchangeability.
6.2 Coin tossing.
6.3 Exchangeable belief structures.
6.4 The representation theorem.
6.5 Finite exchangeability.
6.6 Example: Oral glucose tolerance test.
6.7 Example: Analysing exchangeable regressions.
6.7.1 Introduction.
6.7.2 Error structure and specifications.
6.7.3 Regression coefficient specifications.
6.7.4 Structural implications.
6.8 Adjusting exchangeable beliefs.
6.9 Predictive sufficiency for exchangeable models.
6.10 Bayes linear sufficiency for sample means.
6.11 Belief adjustment for scalar exchangeable quantities.
6.12 Canonical structure for an exchangeable adjustment.
6.12.1 Standard form for the adjustment.
6.12.2 Further properties of exchangeable adjustments.
6.13 Algebraic example.
6.13.1 Representation.
6.13.2 Coherence.
6.13.3 Bayes linear sufficiency.
6.14 Example: Adjusting exchangeable regressions.
6.14.1 Bayes linear sufficiency.
6.14.2 Adjustment.
6.14.3 Resolution transforms.
6.14.4 Resolution partition for exchangeable cases.
6.14.5 Data diagnostics.
6.14.6 Sample size choice.
6.14.7 Adjustment for an equivalent linear space.
6.14.8 Data diagnostics for an equivalent linear space.
6.14.9 Compatibility of data sources.
6.15 Predictive adjustment.
6.16 Example: Oral glucose tolerance test.
6.16.1 Context of exchangeability.
6.16.2 Mean component adjustment.
6.16.3 Variance reduction for a predictive adjustment.
6.16.4 Observed exchangeable adjustments.
6.17 Example: predictive analysis for exchangeable regressions.
6.17.1 Choice of canonical directions.
6.18 Further reading.
7 Co-exchangeable beliefs.
7.1 Respecting exchangeability.
7.2 Adjustments respecting exchangeability.
7.3 Example: simple algebraic problem.
7.3.1 Coherence.
7.3.2 Resolution transform.
7.4 Co-exchangeable adjustments.
7.5 Example: analysing further exchangeable regressions.
7.5.1 The resolution envelope.
7.6 Example: exchangeability in a population dynamics experiment.
7.6.1 Model.
7.6.2 Specifications.
7.6.3 Issues.
7.6.4 Analysis.
8 Learning about population variances.
8.1 Assessing a population variance with known population mean.
8.2 Assessing a population variance with unknown population mean.
8.3 Choice of prior values.
8.4 Example: oral glucose tolerance test.
8.5 Adjusting the population residual variance in multiple linear regression: uncorrelated errors.
8.5.1 Sample information.
8.5.2 Choice of prior values.
8.6 Example: Anscombe data sets.
8.7 Adjusting the population residual variance in multiple linear regression: correlated errors.
8.8 Example - regression with correlated responses.
8.9 Example - analysing exchangeable regressions.
8.10 Adjusting a collection of population variances and covariances.
8.11 Direct adjustment for a population variance matrix.
8.12 Example - regression with correlated responses.
8.13 Separating direct adjustment for population variances and for correlation structure.
8.13.1 Assessing the equivalent sample size.
8.14 Example: oral glucose tolerance test.
8.15 Two stage Bayes linear analysis.
8.16 Example: oral glucose tolerance test.
8.17 Example - analysing exchangeable regressions.
8.18 Further reading.
9 Belief comparison.
9.1 Comparing variance specifications.
9.1.1 Rank degenerate case.
9.1.2 Comparison of orthogonal subspaces.
9.2 Example: variance comparison.
9.2.1 Canonical structure for the comparison.
9.2.2 Consistency checks.
9.2.3 Comparisons for further constructed quantities.
9.2.4 Construction of specifications.
9.3 Comparing many variance specifications.
9.4 Example: comparing some simple nested hypotheses.
9.5 General Belief Transforms.
9.5.1 General belief transforms.
9.5.2 Properties of general belief transforms.
9.5.3 Adjusted belief transforms as general belief transforms.
9.5.4 Example: adjustment of exchangeable structures.
9.5.5 Example - analysing exchangeable regressions.
9.6 Comparing expectations and variances.
9.7 Geometric interpretation.
9.8 Residual forms for mean and variance comparisons.
9.8.1 Rank degenerate case.
9.9 The observed comparison.
9.9.1 Combined directions.
9.10 Example: mean and variance comparison.
9.10.1 The observed comparison.
9.11 Graphical comparison of specifications.
9.11.1 Belief comparison diagram.
9.11.2 The observed comparison.
9.11.3 Combining information.
9.11.4 Residual belief comparison diagrams.
9.12 Example: exchangeable regressions.
9.12.1 Basic canonical analysis.
9.12.2 Mean and residual comparisons.
9.13 Comparisons for exchangeable structures.
9.13.1 The observed comparison.
9.13.2 Example: exchangeable regressions.
9.14 Example: fly population dynamics.
9.14.1 Differences for the mean part of the average.
9.14.2 Differences for the residual part of the average.
9.14.3 Differences for the residual part of the average.
9.15 Assessing robustness of specifications.
9.15.1 Sensitivity analyses for expectations.
9.15.2 Example: robustness analysis for exchangeable regressions.
9.15.3 Sensitivity analyses for variances.
9.15.4 Example: robustness analysis for variance specifications.
9.16 Further reading.
10 Bayes linear graphical models.
10.1 Directed graphical models.
10.1.1 Construction via statistical models.
10.2 Operations on directed graphs.
10.3 Quantifying a directed graphical model.
10.4 Undirected graphs.
10.4.1 Node removal via the moral graph.
10.5 Example.
10.5.1 Plates for duplicated structures.
10.5.2 Reading properties from the diagram.
10.5.3 Alternative diagrams.
10.5.4 Diagrams for inference and prediction.
10.6 Displaying the flow of information.
10.6.1 Node shading.
10.6.2 Arc labelling.
10.6.3 Tracking information as it is received.
10.6.4 Example.
10.7 Displaying diagnostic information.
10.7.1 Node diagnostics.
10.7.2 Arc diagnostics.
10.7.3 Showing implications across all nodes.
10.7.4 Interpreting diagnostic warnings.
10.7.5 Example: inference and prediction.
10.8 Local computation: directed trees.
10.8.1 Propagation.
10.8.2 Example.
10.9 Junction trees.
10.10 Sequential local computation on the junction tree.
10.11 Example: correlated regressions.
10.12 Example: problems of prediction in a large brewery.
10.12.1 Problem summary.
10.12.2 Identifying the quantities of interest.
10.12.3 Modelling.
10.12.4 Initialisation values and specifications.
10.12.5 Examining the generated model.
10.12.6 Basic adjustment.
10.12.7 Exploration via graphical models.
10.13 Local computation for global adjustment of the junction tree.
10.13.1 Merging separate adjustments.
10.13.2 The global adjustment algorithm.
10.13.3 Absorption of evidence.
10.14 Further reading.
11 Matrix algebra.
11.1 Basic definitions.
11.2 Covariance matrices and quadratic forms.
11.3 Generalized Inverses.
11.3.1 Basic properties.
11.3.2 Computing the Moore-Penrose inverse.
11.3.3 Other properties of generalized inverses.
11.4 Multiplication laws.
11.5 Range and null space of a matrix.
11.6 Rank conditions.
11.7 Partitioned matrices.
11.7.1 Definiteness for a partitioned real symmetric matrix.
11.7.2 Generalized inverses for partitioned non-negative definite matrices.
11.8 Solving linear equations.
11.9 Eigensolutions to related matrices.
11.10 Maximising a ratio of quadratic forms.
11.11 The generalized eigenvalue problem.
11.11.1 Introduction.
11.11.2 The QZ algorithm.
11.11.3 An alternative algorithm.
11.11.4 An algorithm for B − A non-negative definite.
11.12 Direct products of matrices.
11.12.1 The Helmert matrix.
11.12.2 Direct products.
12 Implementing Bayes linear statistics.
12.1 Introduction.
12.2 Coherence of belief specifications.
12.2.1 Coherence for a single collection.
12.2.2 Coherence for two collections.
12.2.3 Coherence for three collections.
12.3 Consistency of data with beliefs.
12.3.1 Consistency for a single collection.
12.3.2 Consistency for a partitioned collection.
12.4 Adjusted expectation.
12.5 Adjusted and resolved variance.
12.6 The resolved variance matrix.
12.7 Matrix representations of the resolution transform.
12.7.1 The symmetrized resolution transform matrix.
12.7.2 The transform for the reverse adjustment.
12.7.3 Inverses for the resolved variance matrix.
12.7.4 Canonical quantities.
12.7.5 Coherence via the resolution transform matrix.
12.8 Assessing discrepant data.
12.9 Consistency of observed adjustments.
12.9.1 Partitioning the discrepancy.
12.10 The bearing and size of adjustment.
12.11 Partial Adjustments.
12.11.1 Partial and relative adjustment transforms.
12.11.2 Calculating the partial bearing.
12.12 Exchangeable adjustments.
12.12.1 Notation.
12.12.2 Coherence requirements for exchangeable adjustments.
12.12.3 Data consistency.
12.12.4 Pure exchangeable adjustments.
12.12.5 General exchangeable adjustments.
12.13 Implementing comparisons of belief.
12.13.1 Expectation comparisons.
12.13.2 Comparison of exchangeable beliefs.
A Notation.
B Index of examples.
C Software for Bayes linear computation.
C.1 [B/D].
C.2 BAYES-LIN.
Bibliography.


Michael Goldstein, Professor of Statistics, Department of Mathematical Sciences, University of Durham. Michael Goldstein has worked on and researched the Bayes linear approach for around 30 years; his general interests lie in the foundations, methodology, and applications of Bayesian/subjectivist approaches to statistics. He has an outstanding reputation as one of the most original thinkers in the field, and was a contributing author to Wiley's Encyclopedia of Statistical Sciences.

David Wooff, Director of the Statistics & Mathematics Consultancy Unit and Senior Lecturer in Statistics, Department of Mathematical Sciences, University of Durham. David Wooff has collaborated with Michael Goldstein and others for over 20 years on developing Bayes linear methods; his primary research interest is the general development and application of Bayes linear methodology.