
Linear Algebra

| Concept | Definition | Key Operations | Applications |
|---|---|---|---|
| Vectors and Scalars | Vectors are ordered arrays of numbers representing points in space; scalars are single numbers | Addition, scalar multiplication, dot product, cross product | Data points, feature vectors, gradients in optimization |
| Matrices | Rectangular arrays of numbers representing linear transformations | Addition, multiplication, transpose, inverse, determinant | Dataset representation, transformation matrices, covariance matrices |
| Systems of Linear Equations | Sets of equations with multiple variables and linear relationships | Gaussian elimination, LU decomposition, matrix inversion | Solving regression problems, network flow analysis, optimization |
| Vector Spaces and Subspaces | Collections of vectors closed under addition and scalar multiplication | Basis, dimension, span, linear independence, null space | Understanding data structure, dimensionality reduction, feature spaces |
| Eigenvalues and Eigenvectors | Eigenvectors are directions unchanged by a linear transformation; eigenvalues are the corresponding scaling factors | Eigen-decomposition, spectral theorem, power iteration | Principal Component Analysis (PCA), stability analysis, quantum mechanics |
| Determinants | Scalar value indicating matrix properties such as invertibility and volume scaling | Formula computation, properties, Cramer's rule | Testing matrix invertibility, computing areas and volumes |
| Norms | Measures of vector/matrix magnitude or distance | L1 norm (Manhattan), L2 norm (Euclidean), Frobenius norm | Regularization in ML, distance metrics, error measurement |
| Singular Value Decomposition (SVD) | Factorization of a matrix into $U\Sigma V^T$, where $U$ and $V$ are orthogonal and $\Sigma$ is diagonal | Full SVD, reduced SVD, applications in data analysis | Dimensionality reduction, recommender systems, image compression |
| Orthogonality | Vectors whose dot product is zero, representing perpendicular directions | Orthogonal vectors, orthogonal matrices, orthonormal bases | Coordinate system transformation, Gram-Schmidt process, QR decomposition |
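The vector operations in the first row can be sketched with NumPy; the vectors here are illustrative examples, not from the source:

```python
import numpy as np

# Two 3-dimensional vectors
v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

print(v + w)           # elementwise addition -> [5. 7. 9.]
print(2 * v)           # scalar multiplication -> [2. 4. 6.]
print(np.dot(v, w))    # dot product: 1*4 + 2*5 + 3*6 = 32.0
print(np.cross(v, w))  # cross product, perpendicular to both v and w
```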
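The core matrix operations (product, transpose, determinant, inverse) look like this in NumPy; the 2×2 matrices are made-up examples:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

print(A @ B)              # matrix product
print(A.T)                # transpose
print(np.linalg.det(A))   # determinant = 2*3 - 1*1 = 5

# The inverse exists because det(A) != 0
A_inv = np.linalg.inv(A)
print(A @ A_inv)          # ~ identity matrix
```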
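Solving a system of linear equations reduces to solving $Ax = b$; `np.linalg.solve` does this via an LU decomposition rather than explicitly inverting $A$. The system below is a toy example:

```python
import numpy as np

# Solve Ax = b for x:
#   2x + y  = 5
#    x + 3y = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)  # LU-based solve, more stable than inv(A) @ b
print(x)                   # [1. 3.]
```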
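Basis, rank, and null space can be probed numerically. In this illustrative matrix the third column is the sum of the first two, so the column space has dimension 2 and the null space dimension 1:

```python
import numpy as np

# Third column = first column + second column -> rank 2
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
print(np.linalg.matrix_rank(A))  # 2

# Null space basis from the SVD: the row of Vt belonging to
# the (near-)zero singular value spans the null space
U, s, Vt = np.linalg.svd(A)
null_vec = Vt[-1]
print(A @ null_vec)              # ~ [0. 0. 0.]
```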
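Eigen-decomposition and power iteration (both listed as key operations above) can be compared directly. The symmetric example matrix has eigenvalues 1 and 3; power iteration converges to the dominant eigenvector:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Exact eigen-decomposition (eigh is the routine for symmetric matrices)
vals, vecs = np.linalg.eigh(A)
print(vals)   # [1. 3.]

# Power iteration: repeatedly applying A and renormalizing
# converges to the eigenvector of the largest eigenvalue (3)
x = np.array([1.0, 0.0])
for _ in range(50):
    x = A @ x
    x /= np.linalg.norm(x)
print(x)      # ~ [0.707 0.707]
print(x @ A @ x)  # Rayleigh quotient, ~ 3.0
```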
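The three norms named in the table are all available through `np.linalg.norm`; the values are small worked examples:

```python
import numpy as np

v = np.array([3.0, -4.0])
print(np.linalg.norm(v, 1))  # L1 (Manhattan): |3| + |-4| = 7.0
print(np.linalg.norm(v))     # L2 (Euclidean): sqrt(9 + 16) = 5.0

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# Frobenius norm: sqrt of the sum of squared entries = sqrt(30)
print(np.linalg.norm(M, 'fro'))
```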
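A reduced SVD and the rank-1 truncation behind compression/recommendation use cases can be sketched as follows; the 2×3 matrix is arbitrary:

```python
import numpy as np

A = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# Reduced SVD: U is 2x2, s holds singular values (descending), Vt is 2x3
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The factors reconstruct A exactly
A_rec = U @ np.diag(s) @ Vt
print(np.allclose(A, A_rec))  # True

# Rank-1 approximation: keep only the largest singular value;
# truncating like this is the basis of SVD compression
A1 = s[0] * np.outer(U[:, 0], Vt[0])
```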
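Finally, the QR decomposition mentioned in the orthogonality row produces an orthonormal basis for the column space (the effect of Gram-Schmidt); the input matrix is illustrative:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])

# Q has orthonormal columns, R is upper triangular, and A = QR
Q, R = np.linalg.qr(A)
print(Q.T @ Q)                # ~ identity: columns are orthonormal
print(np.allclose(A, Q @ R))  # True
```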