
E-book: Application-Inspired Linear Algebra

  • Format: EPUB+DRM
  • Price: 55.56 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You will also need to create an Adobe ID (more information here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), you need to install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, you need to install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

This textbook invites students to discover abstract ideas in linear algebra within the context of applications. Diffusion welding and radiography, the two central applications, are introduced early on and used throughout to frame the practical uses of important linear algebra concepts. Students learn these methods through explorations, which involve making conjectures and answering open-ended questions. Approaching the subject in this way opens new avenues for learning the material: for example, vector spaces are introduced early as the appropriate setting for the applied problems covered, and an alternative, determinant-free method for computing eigenvalues is also illustrated.

In addition to the two main applications, the authors describe possible pathways to other applications, which fall into three main areas: data and image analysis (including machine learning), dynamical modeling, and optimization and optimal design. Several appendices are included as well, one of which offers an insightful walkthrough of proof techniques. Instructors will also find an outline for how to use the book in a course. Additional resources, including code, data sets, and other helpful material, can be accessed on the authors' website.

Application-Inspired Linear Algebra will motivate and immerse undergraduate students taking a first course in linear algebra, and will provide instructors with an indispensable, application-first approach.

1 Introduction To Applications 1(8)
1.1 A Sample of Linear Algebra in Our World 1(2)
1.1.1 Modeling Dynamical Processes 1(1)
1.1.2 Signals and Data Analysis 2(1)
1.1.3 Optimal Design and Decision-Making 2(1)
1.2 Applications We Use to Build Linear Algebra Tools 3(2)
1.2.1 CAT Scans 3(1)
1.2.2 Diffusion Welding 4(1)
1.2.3 Image Warping 5(1)
1.3 Advice to Students 5(1)
1.4 The Language of Linear Algebra 6(1)
1.5 Rules of the Game 7(1)
1.6 Software Tools 7(1)
1.7 Exercises 7(2)
2 Vector Spaces 9(74)
2.1 Exploration: Digital Images 9(5)
2.1.1 Exercises 11(3)
2.2 Systems of Equations 14(24)
2.2.1 Systems of Equations 14(4)
2.2.2 Techniques for Solving Systems of Linear Equations 18(11)
2.2.3 Elementary Matrices 29(4)
2.2.4 The Geometry of Systems of Equations 33(2)
2.2.5 Exercises 35(3)
2.3 Vector Spaces 38(15)
2.3.1 Images and Image Arithmetic 38(3)
2.3.2 Vectors and Vector Spaces 41(7)
2.3.3 The Geometry of the Vector Space R3 48(2)
2.3.4 Properties of Vector Spaces 50(2)
2.3.5 Exercises 52(1)
2.4 Vector Space Examples 53(13)
2.4.1 Diffusion Welding and Heat States 54(1)
2.4.2 Function Spaces 55(3)
2.4.3 Matrix Spaces 58(1)
2.4.4 Solution Spaces 59(2)
2.4.5 Other Vector Spaces 61(2)
2.4.6 Is My Set a Vector Space? 63(1)
2.4.7 Exercises 64(2)
2.5 Subspaces 66(17)
2.5.1 Subsets and Subspaces 68(2)
2.5.2 Examples of Subspaces 70(3)
2.5.3 Subspaces of Rn 73(1)
2.5.4 Building New Subspaces 74(3)
2.5.5 Exercises 77(6)
3 Vector Space Arithmetic and Representations 83(92)
3.1 Linear Combinations 83(27)
3.1.1 Linear Combinations 84(4)
3.1.2 Matrix Products 88(4)
3.1.3 The Matrix Equation Ax = b 92(6)
3.1.4 The Matrix Equation Ax = 0 98(6)
3.1.5 The Principle of Superposition 104(2)
3.1.6 Exercises 106(4)
3.2 Span 110(14)
3.2.1 The Span of a Set of Vectors 111(3)
3.2.2 To Span a Set of Vectors 114(5)
3.2.3 Span X is a Vector Space 119(3)
3.2.4 Exercises 122(2)
3.3 Linear Dependence and Independence 124(12)
3.3.1 Linear Dependence and Independence 126(2)
3.3.2 Determining Linear (In)dependence 128(4)
3.3.3 Summary of Linear Dependence 132(1)
3.3.4 Exercises 133(3)
3.4 Basis and Dimension 136(24)
3.4.1 Efficient Heat State Descriptions 136(3)
3.4.2 Basis 139(3)
3.4.3 Constructing a Basis 142(4)
3.4.4 Dimension 146(6)
3.4.5 Properties of Bases 152(5)
3.4.6 Exercises 157(3)
3.5 Coordinate Spaces 160(15)
3.5.1 Cataloging Heat States 160(3)
3.5.2 Coordinates in Rn 163(2)
3.5.3 Example Coordinates of Abstract Vectors 165(5)
3.5.4 Brain Scan Images and Coordinates 170(1)
3.5.5 Exercises 171(4)
4 Linear Transformations 175(106)
4.1 Explorations: Computing Radiographs and the Radiographic Transformation 175(8)
4.1.1 Radiography on Slices 176(2)
4.1.2 Radiographic Scenarios and Notation 178(1)
4.1.3 A First Example 179(1)
4.1.4 Radiographic Setup Example 180(1)
4.1.5 Exercises 180(3)
4.2 Transformations 183(21)
4.2.1 Transformations are Functions 185(1)
4.2.2 Linear Transformations 185(7)
4.2.3 Properties of Linear Transformations 192(6)
4.2.4 Exercises 198(6)
4.3 Explorations: Heat Diffusion 204(8)
4.3.1 Heat States as Vectors 205(3)
4.3.2 Heat Evolution Equation 208(1)
4.3.3 Exercises 209(1)
4.3.4 Extending the Exploration: Application to Image Warping 209(3)
4.4 Matrix Representations of Linear Transformations 212(18)
4.4.1 Matrix Transformations between Euclidean Spaces 212(3)
4.4.2 Matrix Transformations 215(9)
4.4.3 Change of Basis Matrix 224(3)
4.4.4 Exercises 227(3)
4.5 The Determinants of a Matrix 230(17)
4.5.1 Determinant Calculations and Algebraic Properties 235(12)
4.6 Explorations: Re-Evaluating Our Tomographic Goal 247(3)
4.6.1 Seeking Tomographic Transformations 247(1)
4.6.2 Exercises 248(2)
4.7 Properties of Linear Transformations 250(31)
4.7.1 One-To-One Transformations 250(4)
4.7.2 Properties of One-To-One Linear Transformations 254(4)
4.7.3 Onto Linear Transformations 258(3)
4.7.4 Properties of Onto Linear Transformations 261(2)
4.7.5 Summary of Properties 263(1)
4.7.6 Bijections and Isomorphisms 263(1)
4.7.7 Properties of Isomorphic Vector Spaces 264(3)
4.7.8 Building and Recognizing Isomorphisms 267(2)
4.7.9 Inverse Transformations 269(4)
4.7.10 Left Inverse Transformations 273(2)
4.7.11 Exercises 275(6)
5 Invertibility 281(48)
5.1 Transformation Spaces 282(19)
5.1.1 The Nullspace 282(5)
5.1.2 Domain and Range Spaces 287(5)
5.1.3 One-to-One and Onto Revisited 292(3)
5.1.4 The Rank-Nullity Theorem 295(3)
5.1.5 Exercises 298(3)
5.2 Matrix Spaces and the Invertible Matrix Theorem 301(23)
5.2.1 Matrix Spaces 302(11)
5.2.2 The Invertible Matrix Theorem 313(9)
5.2.3 Exercises 322(2)
5.3 Exploration: Reconstruction Without an Inverse 324(5)
5.3.1 Transpose of a Matrix 324(1)
5.3.2 Invertible Transformation 325(1)
5.3.3 Application to a Small Example 325(1)
5.3.4 Application to Brain Reconstruction 326(3)
6 Diagonalization 329(50)
6.1 Exploration: Heat State Evolution 330(2)
6.2 Eigenspaces and Diagonalizable Transformations 332(27)
6.2.1 Eigenvectors and Eigenvalues 333(2)
6.2.2 Computing Eigenvalues and Finding Eigenvectors 335(9)
6.2.3 Using Determinants to Find Eigenvalues 344(4)
6.2.4 Eigenbases 348(2)
6.2.5 Diagonalizable Transformations 350(7)
6.2.6 Exercises 357(2)
6.3 Explorations: Long-Term Behavior and Diffusion Welding Process Termination Criterion 359(5)
6.3.1 Long-Term Behavior in Dynamical Systems 359(1)
6.3.2 Using Matlab/Octave to Calculate Eigenvalues and Eigenvectors 360(2)
6.3.3 Termination Criterion 362(2)
6.3.4 Reconstruct Heat State at Removal 364(1)
6.4 Markov Processes and Long-Term Behavior 364(15)
6.4.1 Matrix Convergence 365(4)
6.4.2 Long-Term Behavior 369(3)
6.4.3 Markov Processes 372(2)
6.4.4 Exercises 374(5)
7 Inner Product Spaces and Pseudo-Invertibility 379(100)
7.1 Inner Products, Norms, and Coordinates 379(23)
7.1.1 Inner Product 381(4)
7.1.2 Vector Norm 385(3)
7.1.3 Properties of Inner Product Spaces 388(2)
7.1.4 Orthogonality 390(5)
7.1.5 Inner Product and Coordinates 395(3)
7.1.6 Exercises 398(4)
7.2 Projections 402(30)
7.2.1 Coordinate Projection 404(5)
7.2.2 Orthogonal Projection 409(13)
7.2.3 Gram-Schmidt Process 422(7)
7.2.4 Exercises 429(3)
7.3 Orthogonal Transformations 432(15)
7.3.1 Orthogonal Matrices 433(6)
7.3.2 Orthogonal Diagonalization 439(3)
7.3.3 Completing the Invertible Matrix Theorem 442(1)
7.3.4 Symmetric Diffusion Transformation 443(2)
7.3.5 Exercises 445(2)
7.4 Exploration: Pseudo-Inverting the Non-invertible 447(4)
7.4.1 Maximal Isomorphism Theorem 447(2)
7.4.2 Exploring the Nature of the Data Compression Transformation 449(2)
7.4.3 Additional Exercises 451(1)
7.5 Singular Value Decomposition 451(20)
7.5.1 The Singular Value Decomposition 453(12)
7.5.2 Computing the Pseudo-Inverse 465(5)
7.5.3 Exercises 470(1)
7.6 Explorations: Pseudo-Inverse Tomographic Reconstruction 471(8)
7.6.1 The First Pseudo-Inverse Brain Reconstructions 471(2)
7.6.2 Understanding the Effects of Noise 473(1)
7.6.3 A Better Pseudo-Inverse Reconstruction 473(1)
7.6.4 Using Object-Prior Information 474(3)
7.6.5 Additional Exercises 477(2)
8 Conclusions 479(8)
8.1 Radiography and Tomography Example 479(1)
8.2 Diffusion 480(1)
8.3 Your Next Mathematical Steps 481(3)
8.3.1 Modeling Dynamical Processes 482(1)
8.3.2 Signals and Data Analysis 482(1)
8.3.3 Optimal Design and Decision Making 483(1)
8.4 How to Move Forward 484(1)
8.5 Final Words 485(2)
Appendix A Transmission Radiography and Tomography: A Simplified Overview 487(10)
Appendix B The Diffusion Equation 497(4)
Appendix C Proof Techniques 501(18)
Appendix D Fields 519(4)
Index 523
Heather A. Moon received her PhD in Mathematics from Washington State University. She spent 8 years as an Assistant/Associate Professor of Mathematics at liberal arts colleges, where she mentored numerous undergraduate research projects in data and image analysis. Currently, she is pursuing her scientific interests in the doctoral program in Physics at Washington State University, where she is excited to use her expertise in mathematics and data and image analysis to study the world around us.

Thomas J. Asaki received his Ph.D. in Physics from Washington State University. He spent 13 years as a postdoc and technical staff member at Los Alamos National Laboratory, exploring inverse problems, image and data analysis methods, nondestructive acoustic testing, and related optimization problems. He is currently an Associate Professor in the Department of Mathematics and Statistics at Washington State University where, since 2008, he has enjoyed continuing research and helping students discover richness in applied mathematics.

Marie Snipes received her Ph.D. in Mathematics from the University of Michigan. Prior to her doctoral studies, she spent four years in the US Air Force as a data analyst. She is currently an Associate Professor of Mathematics at Kenyon College, where she regularly teaches inquiry-based courses and mentors undergraduate research projects. Her research interests include geometric function theory, analysis in metric spaces, and applications of these areas to data problems.

The authors are co-creators of the IMAGEMath Project (www.imagemath.org), which brings image and data applications (of the calculus of variations, PDEs, and linear algebra) into the undergraduate curriculum.