Multivariate Data Analysis on Matrix Manifolds (with Manopt), 2021 ed. [Hardback]

  • Format: Hardback, XX + 450 pages, height x width: 235x155 mm, weight: 955 g, 6 illustrations (5 in color, 1 black and white)
  • Series: Springer Series in the Data Sciences
  • Publication date: 16-Sep-2021
  • Publisher: Springer Nature Switzerland AG
  • ISBN-10: 3030769739
  • ISBN-13: 9783030769734
  • Hardback
  • Price: 48,70 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 57,29 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
  • Free shipping
  • Delivery time 2-4 weeks
This graduate-level textbook aims to give a unified presentation and solution of several commonly used techniques for multivariate data analysis (MDA). Unlike similar texts, it treats the MDA problems as optimization problems on matrix manifolds defined by the MDA model parameters, allowing them to be solved using the (free) optimization software Manopt. The book includes numerous in-text examples as well as Manopt codes and software guides, which can be applied directly or used as templates for solving similar and new problems. The first two chapters provide an overview and the essential background for studying MDA, giving basic information and notations. Next, the book considers several sets of matrices routinely used in MDA as parameter spaces, along with their basic topological properties. A brief introduction to matrix (Riemannian) manifolds and optimization methods on them with Manopt completes the MDA prerequisites. The remaining chapters study individual MDA techniques in depth. A number of exercises complement the main text with additional information and occasionally involve open and/or challenging research questions. Suitable fields include computational statistics, data analysis, data mining and data science, as well as theoretical computer science, machine learning and optimization. It is assumed that readers have some familiarity with MDA and some experience with matrix analysis, computing, and optimization.
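The Manopt codes printed in the book are MATLAB scripts; the short NumPy sketch below is not taken from the book and does not use the Manopt API, but it illustrates the kind of computation involved: Riemannian gradient ascent for the PCA subspace problem on the Stiefel manifold, with the tangent-space projection and retraction that such solvers rely on. The function name pca_on_stiefel, the step size and the iteration count are illustrative assumptions.

# Illustrative sketch only (not from the book): Riemannian gradient ascent for the
# PCA subspace problem on the Stiefel manifold St(n, p) = {X : X'X = I_p}.
import numpy as np

def pca_on_stiefel(C, p, steps=500, step_size=0.1, seed=0):
    """Maximize trace(X'CX) over X with orthonormal columns; the columns of the
    result approximately span the leading p-dimensional eigenspace of C."""
    n = C.shape[0]
    rng = np.random.default_rng(seed)
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))   # random feasible starting point
    for _ in range(steps):
        G = 2.0 * C @ X                                 # Euclidean gradient of trace(X'CX)
        rgrad = G - X @ ((X.T @ G + G.T @ X) / 2.0)     # project onto the tangent space at X
        X, _ = np.linalg.qr(X + step_size * rgrad)      # QR retraction back onto the manifold
    return X

# Toy usage: a 2-dimensional principal subspace of a 5x5 sample covariance matrix
data = np.random.default_rng(1).standard_normal((100, 5))
C = np.cov(data, rowvar=False)
X = pca_on_stiefel(C, p=2)
print(np.round(X.T @ X, 6))                             # identity: the columns stay orthonormal

In Manopt itself the same problem would be handed to a ready-made solver (e.g. trust regions) over the Stiefel manifold factory, with the user supplying only the cost function and its Euclidean gradient.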
Preface vii
1 Introduction 1(8)
2 Matrix analysis and differentiation 9(36)
2.1 Matrix algebra 10(8)
2.2 Vector spaces, bases, and linear maps 18(4)
2.3 Metric, normed and inner product spaces 22(5)
2.4 Euclidean spaces of matrices and their norms 27(5)
2.5 Matrix differentials and gradients 32(3)
2.6 Conclusion 35(1)
2.7 Exercises 35(10)
3 Matrix manifolds in MDA 45(44)
3.1 Several useful matrix sets 46(7)
3.1.1 GL(n) 46(1)
3.1.2 O(n) 46(2)
3.1.3 O(n, p) 48(1)
3.1.4 O0(n, p) 49(1)
3.1.5 OB(n) 49(3)
3.1.6 OB(n, p) 52(1)
3.1.7 G(n, p) 52(1)
3.2 Differentiable manifolds 53(4)
3.3 Examples of matrix manifolds in MDA 57(4)
3.3.1 GL(n) 57(1)
3.3.2 O(n) 57(2)
3.3.3 O(n, p) 59(1)
3.3.4 O0(n, p) 59(1)
3.3.5 OB(n) 59(1)
3.3.6 G(n, p) 60(1)
3.4 Tangent spaces 61(8)
3.4.1 GL(n) 63(1)
3.4.2 O(n) 63(1)
3.4.3 O(n, p) 64(2)
3.4.4 O0(n, p) 66(1)
3.4.5 OB(n) 66(1)
3.4.6 OB(n, p) 67(1)
3.4.7 G(n, p) 67(2)
3.5 Optimization on matrix manifolds 69(8)
3.5.1 Dynamical systems 69(3)
3.5.2 Iterative schemes 72(3)
3.5.3 Do it yourself! 75(2)
3.6 Optimization with Manopt 77(4)
3.6.1 Matrix manifolds in Manopt 78(1)
3.6.2 Solvers 79(2)
3.6.3 Portability 81(1)
3.7 Conclusion 81(1)
3.8 Exercises 82(7)
4 Principal component analysis (PCA) 89(52)
4.1 Introduction 90(1)
4.2 Definition and main properties 91(5)
4.3 Correspondence analysis (CA) 96(2)
4.4 PCA interpretation 98(2)
4.5 Simple structure rotation in PCA (and FA) 100(9)
4.5.1 Simple structure concept 100(2)
4.5.2 Simple structure criteria and rotation methods 102(3)
4.5.3 Case study: PCA interpretation via rotation methods 105(2)
4.5.4 Rotation to independent components 107(2)
4.6 True simple structure: sparse loadings 109(9)
4.6.1 SPARSIMAX: rotation-like sparse loadings 110(3)
4.6.2 Know-how for applying SPARSIMAX 113(2)
4.6.3 Manopt code for SPARSIMAX 115(3)
4.7 Sparse PCA 118(3)
4.7.1 Sparse components: genesis, history and present times 118(2)
4.7.2 Taxonomy of PCA subject to ℓ1 constraint (LASSO) 120(1)
4.8 Function-constrained sparse components 121(6)
4.8.1 Orthonormal sparse component loadings 121(2)
4.8.2 Uncorrelated sparse components 123(1)
4.8.3 Numerical example 124(1)
4.8.4 Manopt code for weakly correlated sparse components with nearly orthonormal loadings 124(3)
4.9 New generation dimension reduction 127(6)
4.9.1 Centroid method 127(3)
4.9.2 Randomized SVD 130(1)
4.9.3 CUR approximations 131(1)
4.9.4 The Nyström method 132(1)
4.10 Conclusion 133(1)
4.11 Exercises 134(7)
5 Factor analysis (FA) 141(46)
5.1 Introduction 142(1)
5.2 Fundamental equations of EFA 143(2)
5.2.1 Population EFA definition 143(1)
5.2.2 Sample EFA definition 144(1)
5.3 EFA parameters estimation 145(6)
5.3.1 Classical EFA estimation 146(2)
5.3.2 EFA estimation on manifolds 148(3)
5.4 ML exploratory factor analysis 151(2)
5.4.1 Gradients 151(1)
5.4.2 Optimality conditions 152(1)
5.5 LS and GLS exploratory factor analysis 153(3)
5.5.1 Gradients 153(1)
5.5.2 Optimality conditions 154(2)
5.6 Manopt codes for classical ML-, LS-, and GLS-EFA 156(8)
5.6.1 Standard case: Φ² ≥ Op 156(2)
5.6.2 Avoiding Heywood cases: Φ² > Op 158(6)
5.7 EFA as a low-rank-plus-sparse matrix decomposition 164(7)
5.8 Sparse EFA 171(6)
5.8.1 Introduction 171(1)
5.8.2 Sparse factor loadings with penalized EFA 172(1)
5.8.3 Implementing sparseness 173(1)
5.8.4 Numerical examples 174(3)
5.9 Comparison to other methods 177(4)
5.10 Conclusion 181(1)
5.11 Exercises 182(5)
6 Procrustes analysis (PA) 187(42)
6.1 Introduction 188(1)
6.2 Orthonormal PA 189(7)
6.2.1 Orthogonal Penrose regression (OPR) 189(1)
6.2.2 Projected Hessian 190(3)
6.2.3 Orthonormal Penrose regression (OnPR) 193(2)
6.2.4 Ordinary orthonormal PA 195(1)
6.3 Oblique PA 196(2)
6.3.1 Basic formulations and solutions 196(2)
6.4 Robust PA 198(6)
6.4.1 Some history remarks 198(1)
6.4.2 Robust OnPR 199(1)
6.4.3 Robust oblique PA 200(2)
6.4.4 PA with M-estimator 202(2)
6.5 Multi-mode PA 204(4)
6.5.1 PCA and one-mode PCA 204(1)
6.5.2 Multi-mode PCA and related PA problems 204(2)
6.5.3 Global minima on O(p) 206(2)
6.6 Some other PA problems 208(7)
6.6.1 Average of rotated matrices: generalized PA 208(1)
6.6.2 Mean rotation 209(6)
6.7 PA application to EFA of large data 215(10)
6.7.1 The classical case n > p 216(1)
6.7.2 The modern case p >> n 217(2)
6.7.3 EFA and RPCA when p >> n 219(1)
6.7.4 Semi-sparse PCA (well-defined EFA) 220(5)
6.8 Conclusion 225(1)
6.9 Exercises 225(4)
7 Linear discriminant analysis (LDA) 229(40)
7.1 Introduction 230(3)
7.2 LDA of vertical data (n > p) 233(8)
7.2.1 Standard canonical variates (CVs) 235(5)
7.2.2 Orthogonal canonical variates (OCVs) 240(1)
7.3 Sparse CVs and sparse OCVs 241(7)
7.4 LDA of horizontal data (p > n) 248(15)
7.4.1 LDA through GSVD 249(1)
7.4.2 LDA and pattern recognition 250(1)
7.4.3 Null space LDA (NLDA) 251(2)
7.4.4 LDA with CPC, PLS and MDS 253(1)
7.4.5 Sparse LDA with diagonal W 254(1)
7.4.6 Function-constrained sparse LDA 255(1)
7.4.7 Sparse LDA based on minimization of the classification error 256(2)
7.4.8 Sparse LDA through optimal scoring (SLDA) 258(1)
7.4.9 Multiclass sparse discriminant analysis 259(1)
7.4.10 Sparse LDA through GEVD 260(2)
7.4.11 Sparse LDA without sparse-inducing penalty 262(1)
7.5 Conclusion 263(1)
7.6 Exercises 264(5)
8 Canonical correlation analysis (CCA) 269(20)
8.1 Introduction 270(1)
8.2 Classical CCA formulation and solution 270(2)
8.3 Alternative CCA definitions 272(1)
8.4 Singular scatter matrices C11 and/or C22 273(1)
8.5 Sparse CCA 273(4)
8.5.1 Sparse CCA through sparse GEVD 274(2)
8.5.2 LS approach to sparse CCA 276(1)
8.6 CCA relation to LDA and PLS 277(2)
8.6.1 CCA and LDA 277(1)
8.6.2 CCA and PLS 277(2)
8.7 More than two groups of variables 279(6)
8.7.1 CCA generalizations 280(2)
8.7.2 CCA based on CPC 282(3)
8.8 Conclusion 285(1)
8.9 Exercises 286(3)
9 Common principal components (CPC) 289(36)
9.1 Introduction 290(3)
9.2 CPC estimation problems 293(1)
9.3 ML- and LS-CPC 294(5)
9.3.1 Gradients and optimality conditions 294(2)
9.3.2 Example: Fisher's Iris data 296(1)
9.3.3 Appendix: MATLAB code for FG algorithm 297(2)
9.4 New procedures for CPC estimation 299(4)
9.4.1 Classic numerical solutions of ML- and LS-CPC 299(1)
9.4.2 Direct calculation of individual eigenvalues/variances 300(2)
9.4.3 CPC for known individual variances 302(1)
9.5 CPC for dimension reduction 303(6)
9.6 Proportional covariance matrices 309(6)
9.6.1 ML and LS proportional principal components 309(4)
9.6.2 Dimension reduction with PPC 313(2)
9.7 Some relations between CPC and ICA 315(8)
9.7.1 ICA formulations 315(1)
9.7.2 ICA by contrast functions 316(4)
9.7.3 ICA methods based on diagonalization 320(3)
9.8 Conclusion 323(1)
9.9 Exercises 324(1)
10 Metric multidimensional scaling (MDS) and related methods 325(48)
10.1 Introduction 326(1)
10.2 Proximity measures 327(1)
10.3 Metric MDS 328(8)
10.3.1 Basic identities and classic solution 328(2)
10.3.2 MDS that fits distances directly 330(2)
10.3.3 Some related/adjacent MDS problems 332(4)
10.4 INDSCAL -- Individual Differences Scaling 336(4)
10.4.1 The classical INDSCAL solution and some problems 337(1)
10.4.2 Orthonormality-constrained INDSCAL 338(2)
10.5 DINDSCAL -- Direct INDSCAL 340(5)
10.5.1 DINDSCAL model 341(1)
10.5.2 DINDSCAL solution 342(1)
10.5.3 Manopt code for DINDSCAL 343(2)
10.6 DEDICOM 345(3)
10.6.1 Introduction 345(1)
10.6.2 Alternating DEDICOM 346(2)
10.6.3 Simultaneous DEDICOM 348(1)
10.7 GIPSCAL 348(5)
10.7.1 GIPSCAL model 348(3)
10.7.2 GIPSCAL solution 351(2)
10.7.3 Three-way GIPSCAL 353(1)
10.8 Tensor data analysis 353(13)
10.8.1 Basic notations and definitions 355(3)
10.8.2 CANDECOMP/PARAFAC (CP) 358(2)
10.8.3 Three-mode PCA (TUCKER3) 360(3)
10.8.4 Multi-mode PCA 363(1)
10.8.5 Higher order SVD (HOSVD) 364(2)
10.9 Conclusion 366(2)
10.10 Exercises 368(5)
11 Data analysis on simplexes 373(30)
11.1 Archetypal analysis (AA) 374(14)
11.1.1 Introduction 374(1)
11.1.2 Definition of the AA problem 375(1)
11.1.3 AA solution on multinomial manifold 376(2)
11.1.4 AA solution on oblique manifolds 378(3)
11.1.5 AA as interior point flows 381(3)
11.1.6 Conclusion 384(1)
11.1.7 Exercises 385(3)
11.2 Analysis of compositional data (CoDa) 388(15)
11.2.1 Introduction 388(1)
11.2.2 Definition and main properties 389(2)
11.2.3 Geometric clr structure of the data simplex 391(3)
11.2.4 PCA and sparse PCA for compositions 394(2)
11.2.5 Case study 396(1)
11.2.6 Manopt code for sparse PCA of CoDa 397(4)
11.2.7 Conclusion 401(1)
11.2.8 Exercises 401(2)
Bibliography 403(38)
Index 441
Nickolay T. Trendafilov is Reader in Computational Statistics in the School of Mathematics and Statistics, Open University, UK. He received his MSc and PhD from the Department of Mathematics and Informatics, University of Sofia St. Kl. Ohridski, and then joined the Laboratory of Computational Stochastics, Bulgarian Academy of Sciences. He has held research and visiting positions at a number of universities in Belgium, Italy, Japan and the USA. His interests are in the computational aspects of multivariate data analysis and interpretation. Other activities include elected memberships in the International Statistical Institute (ISI), the Royal Statistical Society's (RSS) Computing Section, and the Board of Directors of the European Regional Section of the International Association for Statistical Computing (IASC).

Michele Gallo is Professor in the Department of Human and Social Sciences at the University of Naples L'Orientale. He received his PhD degree in Total Quality Management from the University of Naples Federico II in 2000. His current research interests are in multivariate data analysis, compositional and ordinal data, and Rasch analysis. He has published more than 90 research articles. He is Associate Editor of the journal Computational Statistics.