
Matrix Methods in Data Mining and Pattern Recognition, illustrated edition [Paperback]

  • Format: Paperback, 235 pages, height x width x thickness: 229x152x16 mm, weight: 424 g, 36 worked examples, 54 figures
  • Series: Fundamentals of Algorithms No. 4
  • Publication date: 30-Apr-2007
  • Publisher: Society for Industrial & Applied Mathematics, U.S.
  • ISBN-10: 0898716268
  • ISBN-13: 9780898716269
Several very powerful numerical linear algebra techniques are available for solving problems in data mining and pattern recognition. This application-oriented book describes how modern matrix methods can be used to solve these problems, gives an introduction to matrix theory and decompositions, and provides students with a set of tools that can be modified for a particular application. Matrix Methods in Data Mining and Pattern Recognition is divided into three parts. Part I gives a short introduction to a few application areas before presenting linear algebra concepts and matrix decompositions that students can use in problem-solving environments such as MATLAB®. Some mathematical proofs that emphasize the existence and properties of the matrix decompositions are included. In Part II, linear algebra techniques are applied to data mining problems. Part III is a brief introduction to eigenvalue and singular value algorithms. The applications discussed by the author are: classification of handwritten digits, text mining, text summarization, pagerank computations related to the Google™ search engine, and face recognition. Exercises and computer assignments are available on a Web page that supplements the book.
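To give a flavor of the matrix methods the book covers, here is a minimal sketch (not taken from the book, and in NumPy rather than the MATLAB environment the book assumes) of one of its central tools: the truncated SVD, which by the Eckart–Young theorem yields the best rank-k approximation of a data matrix.

```python
# Sketch: best rank-k approximation via the truncated SVD (Eckart-Young).
# Illustrative only; the matrix A below is made-up example data.
import numpy as np

def truncated_svd_approx(A, k):
    """Return the best rank-k approximation of A in the spectral norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# A nearly rank-1 matrix: rows are multiples of [1, 2, 3], with one perturbed entry.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.1],
              [3.0, 6.0, 9.0],
              [1.0, 2.0, 3.0]])
A1 = truncated_svd_approx(A, 1)

# Eckart-Young: the spectral-norm error of the rank-k truncation is sigma_{k+1}.
sigma = np.linalg.svd(A, compute_uv=False)
err = np.linalg.norm(A - A1, 2)
print(abs(err - sigma[1]) < 1e-10)  # True
```

This low-rank compression step underlies several of the applications listed above, e.g. the SVD bases used for digit classification and latent semantic indexing in text mining.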
Preface ix
I. Linear Algebra Concepts and Matrix Decompositions
Vectors and Matrices in Data Mining and Pattern Recognition 3(10)
  Data Mining and Pattern Recognition 3(1)
  Vectors and Matrices 4(3)
  Purpose of the Book 7(1)
  Programming Environments 8(1)
  Floating Point Computations 8(3)
  Notation and Conventions 11(2)
Vectors and Matrices 13(10)
  Matrix-Vector Multiplication 13(2)
  Matrix-Matrix Multiplication 15(2)
  Inner Product and Vector Norms 17(1)
  Matrix Norms 18(2)
  Linear Independence: Bases 20(1)
  The Rank of a Matrix 21(2)
Linear Systems and Least Squares 23(14)
  LU Decomposition 23(2)
  Symmetric, Positive Definite Matrices 25(1)
  Perturbation Theory and Condition Number 26(1)
  Rounding Errors in Gaussian Elimination 27(2)
  Banded Matrices 29(2)
  The Least Squares Problem 31(6)
Orthogonality 37(10)
  Orthogonal Vectors and Matrices 38(2)
  Elementary Orthogonal Matrices 40(5)
  Number of Floating Point Operations 45(1)
  Orthogonal Transformations in Floating Point Arithmetic 46(1)
QR Decomposition 47(10)
  Orthogonal Transformation to Triangular Form 47(4)
  Solving the Least Squares Problem 51(1)
  Computing or Not Computing Q 52(1)
  Flop Count for QR Factorization 53(1)
  Error in the Solution of the Least Squares Problem 53(1)
  Updating the Solution of a Least Squares Problem 54(3)
Singular Value Decomposition 57(18)
  The Decomposition 57(4)
  Fundamental Subspaces 61(2)
  Matrix Approximation 63(3)
  Principal Component Analysis 66(1)
  Solving Least Squares Problems 66(3)
  Condition Number and Perturbation Theory for the Least Squares Problem 69(1)
  Rank-Deficient and Underdetermined Systems 70(2)
  Computing the SVD 72(1)
  Complete Orthogonal Decomposition 72(3)
Reduced-Rank Least Squares Models 75(16)
  Truncated SVD: Principal Component Regression 77(3)
  A Krylov Subspace Method 80(11)
Tensor Decomposition 91(10)
  Introduction 91(1)
  Basic Tensor Concepts 92(2)
  A Tensor SVD 94(2)
  Approximating a Tensor by HOSVD 96(5)
Clustering and Nonnegative Matrix Factorization 101(12)
  The k-Means Algorithm 102(4)
  Nonnegative Matrix Factorization 106(7)
II. Data Mining Applications
Classification of Handwritten Digits 113(16)
  Handwritten Digits and a Simple Algorithm 113(2)
  Classification Using SVD Bases 115(7)
  Tangent Distance 122(7)
Text Mining 129(18)
  Preprocessing the Documents and Queries 130(1)
  The Vector Space Model 131(4)
  Latent Semantic Indexing 135(4)
  Clustering 139(2)
  Nonnegative Matrix Factorization 141(1)
  LGK Bidiagonalization 142(3)
  Average Performance 145(2)
Page Ranking for a Web Search Engine 147(14)
  Pagerank 147(3)
  Random Walk and Markov Chains 150(4)
  The Power Method for Pagerank Computation 154(5)
  HITS 159(2)
Automatic Key Word and Key Sentence Extraction 161(8)
  Saliency Score 161(4)
  Key Sentence Extraction from a Rank-k Approximation 165(4)
Face Recognition Using Tensor SVD 169(10)
  Tensor Representation 169(3)
  Face Recognition 172(3)
  Face Recognition with HOSVD Compression 175(4)
III. Computing the Matrix Decompositions
Computing Eigenvalues and Singular Values 179(30)
  Perturbation Theory 180(5)
  The Power Method and Inverse Iteration 185(2)
  Similarity Reduction to Tridiagonal Form 187(2)
  The QR Algorithm for a Symmetric Tridiagonal Matrix 189(7)
  Computing the SVD 196(1)
  The Nonsymmetric Eigenvalue Problem 197(1)
  Sparse Matrices 198(2)
  The Arnoldi and Lanczos Methods 200(7)
  Software 207(2)
Bibliography 209(8)
Index 217
Lars Eldén is professor of numerical analysis at Linköping University in Sweden.