Sebastopol, CA: O'Reilly Media, 2022. 326 p. ISBN 1098120612.
If you want to work in any computational or technical field, you need to understand linear algebra. As the study of matrices and the operations acting upon them, linear algebra is the mathematical basis of nearly all algorithms and analyses implemented in computers. But the way it's presented in decades-old textbooks is very different from how professionals use linear algebra today to solve modern, real-world problems. This practical guide from Mike X Cohen teaches the core concepts of linear algebra as implemented in Python, including how they're used in data science, machine learning, deep learning, computational simulations, and biomedical data processing. Armed with knowledge from this book, you'll be able to understand, implement, and adapt myriad modern analysis methods and algorithms.
Ideal for practitioners and students using computer technology and algorithms, this book introduces you to:
The interpretations and applications of vectors and matrices.
Matrix arithmetic (various multiplications and transformations).
Independence, rank, and inverses.
Important decompositions used in applied linear algebra (including LU and QR).
Eigendecomposition and singular value decomposition.
Applications including least-squares model fitting and principal components analysis.
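As a hint of the book's NumPy-centric approach, here is a minimal sketch (my own illustration, not code from the book) touching several of the topics above: matrix-vector multiplication, the dot product, rank, the QR and singular value decompositions, and least-squares fitting:

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.normal(size=(4, 3))   # a random 4x3 matrix
v = rng.normal(size=3)        # a random 3-vector

# Matrix-vector multiplication and the dot product
w = A @ v
d = v @ v                     # dot product of v with itself (its squared norm)

# Rank (a random Gaussian matrix has full column rank)
r = np.linalg.matrix_rank(A)

# QR decomposition: A = QR with Q having orthonormal columns
Q, R = np.linalg.qr(A)

# Singular value decomposition: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A)

# Least-squares model fitting: find x minimizing ||Ax - b||
b = rng.normal(size=4)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
```

All of these one-liners hide substantial theory; the chapters listed below unpack what each decomposition means geometrically and why the algorithms work.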
Conventions Used in This Book
Using Code Examples
O’Reilly Online Learning
How to Contact Us
What Is Linear Algebra and Why Learn It?
About This Book
Prerequisites
Math
Attitude
Coding
Mathematical Proofs Versus Intuition from Coding
Code, Printed in the Book and Downloadable Online
Code Exercises
How to Use This Book (for Teachers and Self Learners)
Vectors, Part 1
Creating and Visualizing Vectors in NumPy
Geometry of Vectors
Operations on Vectors
Adding Two Vectors
Geometry of Vector Addition and Subtraction
Vector-Scalar Multiplication
Scalar-Vector Addition
Transpose
Vector Broadcasting in Python
Vector Magnitude and Unit Vectors
The Vector Dot Product
The Dot Product Is Distributive
Geometry of the Dot Product
Other Vector Multiplications
Hadamard Multiplication
Outer Product
Cross and Triple Products
Orthogonal Vector Decomposition
Code Exercises
Vectors, Part 2
Vector Sets
Linear Weighted Combination
Linear Independence
The Math of Linear Independence
Independence and the Zeros Vector
Subspace and Span
Basis
Definition of Basis
Summary
Code Exercises
Vector Applications
Correlation and Cosine Similarity
Time Series Filtering and Feature Detection
k-Means Clustering
Code Exercises
Correlation Exercises
Filtering and Feature Detection Exercises
k-Means Exercises
Matrices, Part 1
Creating and Visualizing Matrices in NumPy
Visualizing, Indexing, and Slicing Matrices
Special Matrices
Matrix Math: Addition, Scalar Multiplication, Hadamard Multiplication
Addition and Subtraction
“Shifting” a Matrix
Scalar and Hadamard Multiplications
Standard Matrix Multiplication
Rules for Matrix Multiplication Validity
Matrix Multiplication
Matrix-Vector Multiplication
Matrix Operations: Transpose
Dot and Outer Product Notation
Matrix Operations: LIVE EVIL (Order of Operations)
Symmetric Matrices
Creating Symmetric Matrices from Nonsymmetric Matrices
Summary
Code Exercises
Matrices, Part 2
Matrix Norms
Matrix Trace and Frobenius Norm
Matrix Spaces (Column, Row, Nulls)
Column Space
Row Space
Null Spaces
Rank
Ranks of Special Matrices
Rank of Added and Multiplied Matrices
Rank of Shifted Matrices
Theory and Practice
Rank Applications
In the Column Space?
Linear Independence of a Vector Set
Determinant
Computing the Determinant
Determinant with Linear Dependencies
The Characteristic Polynomial
Summary
Code Exercises
Matrix Applications
Multivariate Data Covariance Matrices
Geometric Transformations via Matrix-Vector Multiplication
Image Feature Detection
Code Exercises
Covariance and Correlation Matrices Exercises
Geometric Transformations Exercises
Image Feature Detection Exercises
Matrix Inverse
The Matrix Inverse
Types of Inverses and Conditions for Invertibility
Computing the Inverse
Inverse of a 2 × 2 Matrix
Inverse of a Diagonal Matrix
Inverting Any Square Full-Rank Matrix
One-Sided Inverses
The Inverse Is Unique
Moore-Penrose Pseudoinverse
Numerical Stability of the Inverse
Geometric Interpretation of the Inverse
Code Exercises
Orthogonal Matrices and QR Decomposition
Orthogonal Matrices
Gram-Schmidt
QR Decomposition
Sizes of Q and R
QR and Inverses
Summary
Code Exercises
Row Reduction and LU Decomposition
Systems of Equations
Converting Equations into Matrices
Working with Matrix Equations
Row Reduction
Gaussian Elimination
Gauss-Jordan Elimination
Matrix Inverse via Gauss-Jordan Elimination
LU Decomposition
Row Swaps via Permutation Matrices
Summary
Code Exercises
General Linear Models and Least Squares
General Linear Models
Terminology
Setting Up a General Linear Model
Solving GLMs
Is the Solution Exact?
A Geometric Perspective on Least Squares
Why Does Least Squares Work?
GLM in a Simple Example
Least Squares via QR
Code Exercises
Least Squares Applications
Predicting Bike Rentals Based on Weather
Regression Table Using statsmodels
Multicollinearity
Regularization
Polynomial Regression
Grid Search to Find Model Parameters
Code Exercises
Bike Rental Exercises
Multicollinearity Exercise
Regularization Exercise
Polynomial Regression Exercise
Grid Search Exercises
Eigendecomposition
Interpretations of Eigenvalues and Eigenvectors
Geometry
Statistics (Principal Components Analysis)
Noise Reduction
Dimension Reduction (Data Compression)
Finding Eigenvalues
Finding Eigenvectors
Sign and Scale Indeterminacy of Eigenvectors
Diagonalizing a Square Matrix
The Special Awesomeness of Symmetric Matrices
Orthogonal Eigenvectors
Real-Valued Eigenvalues
Eigendecomposition of Singular Matrices
Quadratic Form, Definiteness, and Eigenvalues
The Quadratic Form of a Matrix
Definiteness
AᵀA Is Positive (Semi)definite
Generalized Eigendecomposition
Code Exercises
Singular Value Decomposition
The Big Picture of the SVD
Singular Values and Matrix Rank
SVD in Python
SVD and Rank-1 “Layers” of a Matrix
SVD from EIG
SVD of AᵀA
Converting Singular Values to Variance, Explained
Condition Number
SVD and the MP Pseudoinverse
Code Exercises
Eigendecomposition and SVD Applications
PCA Using Eigendecomposition and SVD
The Math of PCA
The Steps to Perform a PCA
PCA via SVD
Linear Discriminant Analysis
Low-Rank Approximations via SVD
SVD for Denoising
Summary
Exercises
PCA
Linear Discriminant Analyses
SVD for Low-Rank Approximations
SVD for Image Denoising
Python Tutorial
Why Python, and What Are the Alternatives?
IDEs (Integrated Development Environments)
Using Python Locally and Online
Working with Code Files in Google Colab
Variables
Data Types
Indexing
Functions
Methods as Functions
Writing Your Own Functions
Libraries
NumPy
Indexing and Slicing in NumPy
Visualization
Translating Formulas to Code
Print Formatting and F-Strings
Control Flow
Comparators
If Statements
For Loops
Nested Control Statements
Measuring Computation Time
Getting Help and Learning More
What to Do When Things Go Awry
Index
About the Author