Matrix Methods in Data Analysis [Hardcover]

  • Format: Hardback, height x width: 235x155 mm, approx. 1025 pp.
  • Series: Texts in Applied Mathematics
  • Publication date: 12-Oct-2026
  • Publisher: Springer Nature Switzerland AG
  • ISBN-10: 303211313X
  • ISBN-13: 9783032113139
  • Hardcover
  • Price: 92,17 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 122,89 €
  • You save 25%
  • This book has not yet been published. Delivery takes approximately 3-4 weeks after publication.
  • Free shipping
  • Delivery time 2-4 weeks
This textbook offers a fresh and balanced approach to the study of Linear Algebra in the context of modern Data Science. Whereas many existing texts either emphasize theory with little connection to practice or jump straight to applications with minimal mathematical explanation, this book gives equal weight to foundations and applications.

Designed for undergraduates who have completed a proof-based Linear Algebra course, it introduces concepts and tools from Matrix Analysis that are essential for Data Science and Machine Learning. Topics include:

  • Vector norms and distances, orthogonality, and projections
  • Matrix factorizations such as LU, CR, QR, and SVD
  • Special matrix types: symmetric, positive definite, nonnegative, stochastic, and covariance matrices
  • Key numerical algorithms, including the QR algorithm and the Power Method
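To give a flavor of the numerical algorithms listed above, the Power Method can be sketched in a few lines of NumPy. This is an illustrative sketch only, not code from the book's labs; the function name and test matrix are invented here:

```python
import numpy as np

def power_method(A, num_iters=100, tol=1e-10):
    """Approximate the dominant eigenvalue/eigenvector of A by power iteration."""
    n = A.shape[0]
    x = np.ones(n) / np.sqrt(n)        # arbitrary nonzero starting vector
    lam = 0.0
    for _ in range(num_iters):
        y = A @ x
        x_new = y / np.linalg.norm(y)  # renormalize to avoid overflow
        lam_new = x_new @ A @ x_new    # Rayleigh quotient estimate
        if abs(lam_new - lam) < tol:
            lam, x = lam_new, x_new
            break
        lam, x = lam_new, x_new
    return lam, x

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, v = power_method(A)               # dominant eigenvalue is (5 + sqrt(5))/2
```

The iteration converges at a rate governed by the gap between the two largest eigenvalues, which is exactly the kind of analysis the book's core topics cover.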

Each chapter is enriched with real-world applications, from Google PageRank and Principal Component Analysis to clustering, dimensionality reduction, and linear regression, highlighting the role of matrix methods in Data Science.
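As one illustration of how these applications rest on the matrix tools above, Principal Component Analysis reduces to an SVD of the centered data matrix. The sketch below uses synthetic data and an invented function name; it is not an example taken from the book:

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto their top-k principal components via the SVD."""
    Xc = X - X.mean(axis=0)                           # center each feature column
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False) # singular values sorted descending
    return Xc @ Vt[:k].T                              # scores along the top-k directions

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 samples, 5 features of synthetic data
Z = pca_reduce(X, 2)            # reduced to 2 features; first column captures most variance
```

Because the singular values are returned in decreasing order, the first reduced coordinate always carries at least as much variance as the second.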

To further support hands-on learning, the book is accompanied by a GitHub repository with Python labs, allowing students to implement the techniques covered and bridge the gap between theory and computation.

With its clear explanations, practical insights, and balance of theory and application, Matrix Methods in Data Analysis is an invaluable resource for courses in applied Linear Algebra, Data Science, and introductory Machine Learning.
Part I: Linear Algebra and Machine Learning.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.-
Part II: Matrix Multiplication and Partitioned Matrices.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part III: Norms, Distances, and Similarities.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part IV: The Four Fundamental Subspaces of a Matrix, and Gram Matrices.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part V: The LU Factorization of a Matrix.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part VI: Orthogonality and the QR Factorization.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part VII: Orthogonal Projections and the Least Squares Problem.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part VIII: Eigenvalues, Eigenvectors, and Algorithms.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part IX: Symmetric and Positive Definite Matrices.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part X: Singular Value Decomposition.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Part XI: Nonnegative Matrices and Perron Theory.- Why Should We Care?.- What You May Have Learned Before.- Core Topics.- Supplementary Topics.- From the Classroom to Real Life.-
Index.
Maria Isabel Bueno is a Teaching Professor at the University of California, Santa Barbara, where she has served since 2006. She holds a Ph.D. from Universidad Carlos III de Madrid. Her research focuses on linear algebra and numerical linear algebra.

Javier Perez Alvaro is an Associate Professor at the University of Montana in Missoula, where he has served since 2017. He earned his Ph.D. from Universidad Carlos III de Madrid. His research focuses on numerical linear algebra and numerical analysis.