
E-book: Practical Linear Algebra for Data Science

  • Pages: 328
  • Published: 06-Sep-2022
  • Publisher: O'Reilly Media
  • Language: English
  • ISBN-13: 9781098120580
  • Format: PDF+DRM
  • Price: 56,15 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You also need to create an Adobe ID. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

If you want to work in any computational or technical field, you need to understand linear algebra. As the study of matrices and the operations acting upon them, linear algebra is the mathematical basis of nearly all algorithms and analyses implemented in computers. But the way it's presented in decades-old textbooks is very different from how professionals use linear algebra today to solve real-world problems.

This practical guide from Mike X Cohen teaches the core concepts of linear algebra as implemented in Python, including how they're used in data science, machine learning, deep learning, computational simulations, and biomedical data processing applications. Armed with knowledge from this book, you'll be able to understand, implement, and adapt myriad modern analysis methods and algorithms.

Ideal for practitioners and students using computer technology and algorithms, this book introduces you to:

  • The interpretations and applications of vectors and matrices
  • Matrix arithmetic (various multiplications and transformations)
  • Independence, rank, and inverses
  • Important decompositions used in applied linear algebra (including LU and QR)
  • Eigendecomposition and singular value decomposition
  • Applications including least-squares model fitting and principal components analysis
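The topics listed above can be previewed in a few lines of NumPy. The sketch below is illustrative only (it is not code from the book): it touches the dot product, standard versus Hadamard multiplication, rank and inverses, QR decomposition, and eigendecomposition/SVD.

```python
# Illustrative NumPy sketch of the topics above (not code from the book).
import numpy as np

rng = np.random.default_rng(0)

# Vectors and the dot product
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])
dot = v @ w  # 1*4 + 2*5 + 3*6 = 32

# Matrix arithmetic: standard and Hadamard (element-wise) multiplication
A = rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3))
standard = A @ B
hadamard = A * B

# Independence, rank, and inverses
r = np.linalg.matrix_rank(A)   # a random matrix is full rank (3) almost surely
Ainv = np.linalg.inv(A)        # exists because A is full rank
identity_check = A @ Ainv      # approximately the identity matrix

# QR decomposition: A = QR with Q orthogonal and R upper triangular
Q, R = np.linalg.qr(A)

# Eigendecomposition and singular value decomposition
evals, evecs = np.linalg.eig(A)
U, s, Vt = np.linalg.svd(A)
```

Each of these one-liners corresponds to a chapter-length treatment in the book, where the geometry and the applications behind the function calls are developed in detail.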

Table of Contents

Preface
1 Introduction
  • What Is Linear Algebra and Why Learn It?
  • About This Book
  • Prerequisites
  • Math
  • Attitude
  • Coding
  • Mathematical Proofs Versus Intuition from Coding
  • Code, Printed in the Book and Downloadable Online
  • Code Exercises
  • How to Use This Book (for Teachers and Self Learners)
2 Vectors, Part 1
  • Creating and Visualizing Vectors in NumPy
  • Geometry of Vectors
  • Operations on Vectors
  • Adding Two Vectors
  • Geometry of Vector Addition and Subtraction
  • Vector-Scalar Multiplication
  • Scalar-Vector Addition
  • Transpose
  • Vector Broadcasting in Python
  • Vector Magnitude and Unit Vectors
  • The Vector Dot Product
  • The Dot Product Is Distributive
  • Geometry of the Dot Product
  • Other Vector Multiplications
  • Hadamard Multiplication
  • Outer Product
  • Cross and Triple Products
  • Orthogonal Vector Decomposition
  • Summary
  • Code Exercises
3 Vectors, Part 2
  • Vector Sets
  • Linear Weighted Combination
  • Linear Independence
  • The Math of Linear Independence
  • Independence and the Zeros Vector
  • Subspace and Span
  • Basis
  • Definition of Basis
  • Summary
  • Code Exercises
4 Vector Applications
  • Correlation and Cosine Similarity
  • Time Series Filtering and Feature Detection
  • k-Means Clustering
  • Code Exercises
  • Correlation Exercises
  • Filtering and Feature Detection Exercises
  • k-Means Exercises
5 Matrices, Part 1
  • Creating and Visualizing Matrices in NumPy
  • Visualizing, Indexing, and Slicing Matrices
  • Special Matrices
  • Matrix Math: Addition, Scalar Multiplication, Hadamard Multiplication
  • Addition and Subtraction
  • "Shifting" a Matrix
  • Scalar and Hadamard Multiplications
  • Standard Matrix Multiplication
  • Rules for Matrix Multiplication Validity
  • Matrix Multiplication
  • Matrix-Vector Multiplication
  • Matrix Operations: Transpose
  • Dot and Outer Product Notation
  • Matrix Operations: LIVE EVIL (Order of Operations)
  • Symmetric Matrices
  • Creating Symmetric Matrices from Nonsymmetric Matrices
  • Summary
  • Code Exercises
6 Matrices, Part 2
  • Matrix Norms
  • Matrix Trace and Frobenius Norm
  • Matrix Spaces (Column, Row, Nulls)
  • Column Space
  • Row Space
  • Null Spaces
  • Rank
  • Ranks of Special Matrices
  • Rank of Added and Multiplied Matrices
  • Rank of Shifted Matrices
  • Theory and Practice
  • Rank Applications
  • In the Column Space?
  • Linear Independence of a Vector Set
  • Determinant
  • Computing the Determinant
  • Determinant with Linear Dependencies
  • The Characteristic Polynomial
  • Summary
  • Code Exercises
7 Matrix Applications
  • Multivariate Data Covariance Matrices
  • Geometric Transformations via Matrix-Vector Multiplication
  • Image Feature Detection
  • Summary
  • Code Exercises
  • Covariance and Correlation Matrices Exercises
  • Geometric Transformations Exercises
  • Image Feature Detection Exercises
8 Matrix Inverse
  • The Matrix Inverse
  • Types of Inverses and Conditions for Invertibility
  • Computing the Inverse
  • Inverse of a 2 × 2 Matrix
  • Inverse of a Diagonal Matrix
  • Inverting Any Square Full-Rank Matrix
  • One-Sided Inverses
  • The Inverse Is Unique
  • Moore-Penrose Pseudoinverse
  • Numerical Stability of the Inverse
  • Geometric Interpretation of the Inverse
  • Summary
  • Code Exercises
9 Orthogonal Matrices and QR Decomposition
  • Orthogonal Matrices
  • Gram-Schmidt
  • QR Decomposition
  • Sizes of Q and R
  • QR and Inverses
  • Summary
  • Code Exercises
10 Row Reduction and LU Decomposition
  • Systems of Equations
  • Converting Equations into Matrices
  • Working with Matrix Equations
  • Row Reduction
  • Gaussian Elimination
  • Gauss-Jordan Elimination
  • Matrix Inverse via Gauss-Jordan Elimination
  • LU Decomposition
  • Row Swaps via Permutation Matrices
  • Summary
  • Code Exercises
11 General Linear Models and Least Squares
  • General Linear Models
  • Terminology
  • Setting Up a General Linear Model
  • Solving GLMs
  • Is the Solution Exact?
  • A Geometric Perspective on Least Squares
  • Why Does Least Squares Work?
  • GLM in a Simple Example
  • Least Squares via QR
  • Summary
  • Code Exercises
12 Least Squares Applications
  • Predicting Bike Rentals Based on Weather
  • Regression Table Using statsmodels
  • Multicollinearity
  • Regularization
  • Polynomial Regression
  • Grid Search to Find Model Parameters
  • Summary
  • Code Exercises
  • Bike Rental Exercises
  • Multicollinearity Exercise
  • Regularization Exercise
  • Polynomial Regression Exercise
  • Grid Search Exercises
13 Eigendecomposition
  • Interpretations of Eigenvalues and Eigenvectors
  • Geometry
  • Statistics (Principal Components Analysis)
  • Noise Reduction
  • Dimension Reduction (Data Compression)
  • Finding Eigenvalues
  • Finding Eigenvectors
  • Sign and Scale Indeterminacy of Eigenvectors
  • Diagonalizing a Square Matrix
  • The Special Awesomeness of Symmetric Matrices
  • Orthogonal Eigenvectors
  • Real-Valued Eigenvalues
  • Eigendecomposition of Singular Matrices
  • Quadratic Form, Definiteness, and Eigenvalues
  • The Quadratic Form of a Matrix
  • Definiteness
  • AᵀA Is Positive (Semi)definite
  • Generalized Eigendecomposition
  • Summary
  • Code Exercises
14 Singular Value Decomposition
  • The Big Picture of the SVD
  • Singular Values and Matrix Rank
  • SVD in Python
  • SVD and Rank-1 "Layers" of a Matrix
  • SVD from EIG
  • SVD of AᵀA
  • Converting Singular Values to Variance, Explained
  • Condition Number
  • SVD and the MP Pseudoinverse
  • Summary
  • Code Exercises
15 Eigendecomposition and SVD Applications
  • PCA Using Eigendecomposition and SVD
  • The Math of PCA
  • The Steps to Perform a PCA
  • PCA via SVD
  • Linear Discriminant Analysis
  • Low-Rank Approximations via SVD
  • SVD for Denoising
  • Summary
  • Exercises
  • PCA
  • Linear Discriminant Analyses
  • SVD for Low-Rank Approximations
  • SVD for Image Denoising
16 Python Tutorial
  • Why Python, and What Are the Alternatives?
  • IDEs (Interactive Development Environments)
  • Using Python Locally and Online
  • Working with Code Files in Google Colab
  • Variables
  • Data Types
  • Indexing
  • Functions
  • Methods as Functions
  • Writing Your Own Functions
  • Libraries
  • NumPy
  • Indexing and Slicing in NumPy
  • Visualization
  • Translating Formulas to Code
  • Print Formatting and F-Strings
  • Control Flow
  • Comparators
  • If Statements
  • For Loops
  • Nested Control Statements
  • Measuring Computation Time
  • Getting Help and Learning More
  • What to Do When Things Go Awry
  • Summary
Index
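The applications chapters culminate in least-squares model fitting and principal components analysis. A minimal, illustrative NumPy sketch of both (again, not code from the book; the data here are synthetic):

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Least squares: fit y = b0 + b1*x on noisy synthetic data ---
x = np.linspace(0, 1, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=x.size)
X = np.column_stack([np.ones_like(x), x])     # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # beta is approximately [2, 3]

# --- PCA via SVD of the mean-centered data matrix ---
data = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 3))  # correlated data
centered = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
components = Vt                               # principal axes (rows)
explained_var = s**2 / (len(data) - 1)        # variance along each axis
scores = centered @ Vt.T                      # data projected onto the axes
```

Computing PCA from the SVD of the centered data matrix, rather than from the eigendecomposition of the covariance matrix, is one of the connections the book develops explicitly ("PCA via SVD").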
Mike X Cohen is an associate professor of neuroscience at the Donders Institute (Radboud University Medical Centre) in the Netherlands. He has over 20 years of experience teaching scientific coding, data analysis, statistics, and related topics, and has authored several online courses and textbooks. He has a suspiciously dry sense of humor and enjoys anything purple.