
E-book: Matrix Analysis for Statistics

(University of Central Florida, Orlando, FL)
  • Format: PDF+DRM
  • Price: 130,85 €*
  • * The price is final, i.e. no further discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means you must install special software to read it. You also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on mobile devices (phone or tablet), install this free application: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (this is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

An up-to-date version of the complete, self-contained introduction to matrix analysis theory and practice

Providing accessible and in-depth coverage of the most common matrix methods now used in statistical applications, Matrix Analysis for Statistics, Third Edition features an easy-to-follow theorem/proof format. With smooth transitions between topics, the author carefully justifies each step of these methods, including eigenvalues and eigenvectors; the Moore-Penrose inverse; matrix differentiation; and the distribution of quadratic forms.

An ideal introduction to matrix analysis theory and practice, Matrix Analysis for Statistics, Third Edition features:

New chapter or section coverage on inequalities, oblique projections, and antieigenvalues and antieigenvectors

Additional problems and practice exercises at the end of each chapter

Extensive examples that are familiar and easy to understand

Self-contained chapters for flexibility in topic choice

Applications of matrix methods in least squares regression and the analyses of mean vectors and covariance matrices
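As a small illustration of the kind of application listed above (not taken from the book), the sketch below uses NumPy's `np.linalg.pinv`, which computes the Moore-Penrose inverse, to obtain the least squares solution of a simple linear regression; the data and coefficient values are invented for the example.

```python
import numpy as np

# Least squares via the Moore-Penrose inverse: for the model
# y = X b + e, the minimum-norm least squares solution is
# b_hat = X^+ y, where X^+ denotes the Moore-Penrose inverse of X.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])  # intercept + one predictor
beta_true = np.array([2.0, 0.5])                          # illustrative coefficients
y = X @ beta_true + 0.01 * rng.normal(size=20)            # small noise

beta_hat = np.linalg.pinv(X) @ y  # Moore-Penrose inverse applied to y
print(beta_hat)
```

When X has full column rank, this reproduces the familiar formula (X'X)^-1 X'y; the pseudoinverse form also covers the less-than-full-rank case treated in Chapter 6.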

Matrix Analysis for Statistics, Third Edition is an ideal textbook for upper-undergraduate and graduate-level courses on matrix methods, multivariate analysis, and linear models. The book is also an excellent reference for research professionals in applied statistics.

James R. Schott, PhD, is Professor in the Department of Statistics at the University of Central Florida. He has published numerous journal articles in the area of multivariate analysis. Dr. Schott's research interests include multivariate analysis, analysis of covariance and correlation matrices, and dimensionality reduction techniques.
Preface
About the Companion Website
1 A Review of Elementary Matrix Algebra
1.1 Introduction
1.2 Definitions and Notation
1.3 Matrix Addition and Multiplication
1.4 The Transpose
1.5 The Trace
1.6 The Determinant
1.7 The Inverse
1.8 Partitioned Matrices
1.9 The Rank of a Matrix
1.10 Orthogonal Matrices
1.11 Quadratic Forms
1.12 Complex Matrices
1.13 Random Vectors and Some Related Statistical Concepts
Problems
2 Vector Spaces
2.1 Introduction
2.2 Definitions
2.3 Linear Independence and Dependence
2.4 Matrix Rank and Linear Independence
2.5 Bases and Dimension
2.6 Orthonormal Bases and Projections
2.7 Projection Matrices
2.8 Linear Transformations and Systems of Linear Equations
2.9 The Intersection and Sum of Vector Spaces
2.10 Oblique Projections
2.11 Convex Sets
Problems
3 Eigenvalues and Eigenvectors
3.1 Introduction
3.2 Eigenvalues, Eigenvectors, and Eigenspaces
3.3 Some Basic Properties of Eigenvalues and Eigenvectors
3.4 Symmetric Matrices
3.5 Continuity of Eigenvalues and Eigenprojections
3.6 Extremal Properties of Eigenvalues
3.7 Additional Results Concerning Eigenvalues of Symmetric Matrices
3.8 Nonnegative Definite Matrices
3.9 Antieigenvalues and Antieigenvectors
Problems
4 Matrix Factorizations and Matrix Norms
4.1 Introduction
4.2 The Singular Value Decomposition
4.3 The Spectral Decomposition of a Symmetric Matrix
4.4 The Diagonalization of a Square Matrix
4.5 The Jordan Decomposition
4.6 The Schur Decomposition
4.7 The Simultaneous Diagonalization of Two Symmetric Matrices
4.8 Matrix Norms
Problems
5 Generalized Inverses
5.1 Introduction
5.2 The Moore-Penrose Generalized Inverse
5.3 Some Basic Properties of the Moore-Penrose Inverse
5.4 The Moore-Penrose Inverse of a Matrix Product
5.5 The Moore-Penrose Inverse of Partitioned Matrices
5.6 The Moore-Penrose Inverse of a Sum
5.7 The Continuity of the Moore-Penrose Inverse
5.8 Some Other Generalized Inverses
5.9 Computing Generalized Inverses
Problems
6 Systems of Linear Equations
6.1 Introduction
6.2 Consistency of a System of Equations
6.3 Solutions to a Consistent System of Equations
6.4 Homogeneous Systems of Equations
6.5 Least Squares Solutions to a System of Linear Equations
6.6 Least Squares Estimation for Less than Full Rank Models
6.7 Systems of Linear Equations and the Singular Value Decomposition
6.8 Sparse Linear Systems of Equations
Problems
7 Partitioned Matrices
7.1 Introduction
7.2 The Inverse
7.3 The Determinant
7.4 Rank
7.5 Generalized Inverses
7.6 Eigenvalues
Problems
8 Special Matrices and Matrix Operations
8.1 Introduction
8.2 The Kronecker Product
8.3 The Direct Sum
8.4 The Vec Operator
8.5 The Hadamard Product
8.6 The Commutation Matrix
8.7 Some Other Matrices Associated with the Vec Operator
8.8 Nonnegative Matrices
8.9 Circulant and Toeplitz Matrices
8.10 Hadamard and Vandermonde Matrices
Problems
9 Matrix Derivatives and Related Topics
9.1 Introduction
9.2 Multivariable Differential Calculus
9.3 Vector and Matrix Functions
9.4 Some Useful Matrix Derivatives
9.5 Derivatives of Functions of Patterned Matrices
9.6 The Perturbation Method
9.7 Maxima and Minima
9.8 Convex and Concave Functions
9.9 The Method of Lagrange Multipliers
Problems
10 Inequalities
10.1 Introduction
10.2 Majorization
10.3 Cauchy-Schwarz Inequalities
10.4 Hölder's Inequality
10.5 Minkowski's Inequality
10.6 The Arithmetic-Geometric Mean Inequality
Problems
11 Some Special Topics Related to Quadratic Forms
11.1 Introduction
11.2 Some Results on Idempotent Matrices
11.3 Cochran's Theorem
11.4 Distribution of Quadratic Forms in Normal Variates
11.5 Independence of Quadratic Forms
11.6 Expected Values of Quadratic Forms
11.7 The Wishart Distribution
Problems
References
Index