E-book: Mathematical Statistics: Essays on History and Methodology

  • Format: PDF+DRM
  • Series: Springer Series in Statistics
  • Publication date: 23-Oct-2017
  • Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. K
  • Language: English
  • ISBN-13: 9783642310843
  • Price: 160,54 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free application: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (this is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

In the middle of the last century, the development of mathematical statistics underwent an enduring change due to the use of more refined mathematical tools.

New concepts such as sufficiency, superefficiency, and adaptivity motivated scholars to reflect upon the interpretation of mathematical concepts in terms of their real-world relevance. Questions concerning the optimality of estimators, for instance, had remained unanswered for decades, because a meaningful concept of optimality (based on the regularity of estimators, the representation of their limit distribution, and assertions about their concentration by means of Anderson's Theorem) was not yet available.
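
The optimality concept alluded to here can be made concrete with a brief sketch (an illustration in standard textbook notation, not a quotation from the book; the symbols T_n, θ, I(θ), Z, W, and C are not taken from its text): the convolution theorem describes the limit law of a regular estimator sequence, and Anderson's Theorem turns that representation into an assertion about concentration.

% Illustrative sketch only; assumes a smooth parametric model with Fisher
% information I(\theta) and a regular estimator sequence (T_n).
%
% Convolution theorem: the limit law is the optimal normal law blurred by
% independent noise W:
\[
  \sqrt{n}\,\bigl(T_n - \theta\bigr) \xrightarrow{\,d\,} Z + W,
  \qquad Z \sim N\!\bigl(0,\, I(\theta)^{-1}\bigr), \quad W \text{ independent of } Z.
\]
% Anderson's Theorem: for every symmetric convex set C,
\[
  P\bigl(Z + W \in C\bigr) \;\le\; P\bigl(Z \in C\bigr),
\]
% so no regular estimator sequence can be asymptotically more concentrated
% around \theta than one whose limit distribution is N(0, I(\theta)^{-1}).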

The rapidly developing asymptotic theory provided approximate answers to questions for which non-asymptotic theory had found no satisfying solutions.

In four engaging essays, Pfanzagl’s book presents a detailed description of how the use of mathematical methods stimulated the development of a statistical theory.

A book on the history of mathematical statistics would offer a description of who did what and when. Pfanzagl’s book, centred on questions of methodology, points to missed opportunities, questionable proofs, neglected questions of priority, and to the presence of such deficiencies even in recent textbooks.



This detailed description of the fundamental developments in statistical theory around 1950 points out the centrally important interplay between increasingly refined mathematical techniques and the concomitant developments in methodological concepts.

Table of contents

1 Introduction  1(10)
References  9(2)
2 Sufficiency  11(32)
2.1 The Intuitive Idea  11(4)
2.2 Exhaustive Statistics  15(2)
2.3 Sufficient Statistics – Sufficient σ-Fields  17(2)
2.4 The Factorization Theorem  19(1)
2.5 Completeness  20(4)
2.6 Minimal Sufficiency  24(2)
2.7 Trivially Sufficient Statistics  26(1)
2.8 Sufficiency and Exponentiality  27(8)
2.9 Characterizations of Sufficiency  35(8)
References  38(5)
3 Descriptive Statistics  43(40)
3.1 Introduction  43(1)
3.2 Parameters and Functionals  44(1)
3.3 Estimands and Estimators  45(4)
3.4 Stochastic Order  49(4)
3.5 Spread  53(2)
3.6 Unimodality; Logconcave Distributions  55(1)
3.7 Concentration  56(3)
3.8 Anderson's Theorem  59(3)
3.9 The Spread of Convolution Products  62(2)
3.10 Interpretation of Convolution Products  64(2)
3.11 Loss Functions  66(8)
3.12 Pitman Closeness  74(9)
References  77(6)
4 Optimality of Unbiased Estimators: Nonasymptotic Theory  83(24)
4.1 Optimal Mean Unbiased Estimators  83(5)
4.2 Bahadur's Converse of the Rao–Blackwell–Lehmann–Scheffé Theorem  88(5)
4.3 Unbiased Estimation of Probabilities and Densities  93(2)
4.4 The Cramér–Rao Bound  95(2)
4.5 Optimal Median Unbiased Estimators  97(3)
4.6 Confidence Procedures  100(7)
References  102(5)
5 Asymptotic Optimality of Estimators  107(198)
5.1 Introduction  107(6)
5.2 Maximum Likelihood  113(16)
5.3 Convergence of Distributions  129(8)
5.4 Consistency and √n-consistency of Estimator Sequences  137(8)
5.5 Asymptotically Linear Estimator Sequences  145(10)
5.6 Functionals on General Families  155(11)
5.7 Adaptivity  166(8)
5.8 "Regular" Estimator Sequences  174(3)
5.9 Bounds for the Asymptotic Concentration of Estimator Sequences  177(10)
5.10 Regular Convergence and Continuity in the Limit  187(2)
5.11 The Neyman–Pearson Lemma and Applications  189(20)
5.12 Asymptotic Normality: Global and Local  209(5)
5.13 The Convolution Theorem  214(15)
5.14 The Extended Role of Intrinsic Bounds  229(8)
5.15 Local Asymptotic Minimaxity  237(10)
5.16 Superefficiency  247(9)
5.17 Rates of Convergence  256(16)
5.18 Second Order Optimality  272(21)
5.19 Asymptotic Confidence Procedures  293(12)
References  294(11)
Author Index  305(6)
Subject Index  311(4)
Symbol Index  315
Johann Pfanzagl completed his degree in Mathematics at the University of Vienna in 1951 as a student of Johann Radon and Edmund Hlawka. After working for several years as a statistician at the Chamber of Commerce in Austria, he became a Professor of Statistics at the University of Vienna in 1959 and at the University of Cologne in 1960. He is one of the few remaining scholars who, in the early days, contributed to the development of statistical theory using sophisticated mathematical tools.