linear-algebra · matrix-operations · eigenvalues · diagonalization · determinants · Sat Apr 25

Linear Algebra: Core Equations and Relations

Abstract

This article surveys foundational equations and structural relationships in linear algebra, focusing on matrix operations, determinants, eigenvalue problems, and diagonalization. These concepts form the backbone of computational linear algebra and its applications across engineering, data science, and physics. We present formal statements alongside intuitive interpretations and worked examples.

Background

Linear algebra provides a language for describing linear transformations and solving systems of equations. At its core are matrices—rectangular arrays of scalars—and the operations defined on them. Understanding how matrices interact, what properties they possess, and how to decompose them into simpler forms is essential for both theoretical understanding and practical computation.

The central question in much of linear algebra is: given a matrix A, what can we learn about its structure, invertibility, and behavior under repeated application? The answers involve determinants, eigenvalues, and eigenvectors.

Key Results

Matrix Multiplication

[matrix-multiplication] defines the fundamental operation of combining two matrices. For matrices A (size m × n) and B (size n × p), the product AB is computed entry-wise as:

(AB)_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}

This operation is not commutative: in general, AB ≠ BA. Matrix multiplication is essential for representing linear transformations and solving systems of equations. The non-commutativity reflects the fact that the order in which transformations are applied matters.
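Non-commutativity is easy to see numerically. The sketch below (with two arbitrarily chosen 2 × 2 matrices) computes both products with numpy and shows they differ:

```python
import numpy as np

# Two small matrices chosen to illustrate non-commutativity (example values).
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

AB = A @ B  # entry (i, j) is sum over k of A[i, k] * B[k, j]
BA = B @ A

print(AB)                          # [[2 1] [4 3]]
print(BA)                          # [[3 4] [1 2]]
print(np.array_equal(AB, BA))      # False: order matters
```

Here B swaps coordinates, so AB swaps the columns of A while BA swaps its rows, which are different operations.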

Determinants and Invertibility

The determinant is a scalar-valued function on square matrices that encodes critical information about invertibility and volume scaling. [determinant-of-a-matrix] establishes that for a 2 × 2 matrix,

\det(A) = ad - bc \quad \text{for} \quad A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}

More generally, determinants can be computed via row reduction or cofactor expansion for larger matrices.

[determinant-properties] enumerates key properties:

  1. Row swaps change the sign: \det(A') = -\det(A)
  2. Scaling a single row by a scalar c scales the determinant: \det(A') = c \cdot \det(A)
  3. The determinant of a triangular matrix equals the product of diagonal entries
  4. Transposition preserves the determinant: \det(A^T) = \det(A)
  5. Adding a multiple of one row to another preserves the determinant

A matrix is invertible if and only if det(A) ≠ 0. A zero determinant indicates the transformation collapses space into a lower dimension, implying linear dependence among rows or columns.
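These properties can be checked numerically. A minimal sketch with numpy, reusing the 2 × 2 matrix from the worked examples below plus one singular matrix chosen for illustration:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [1.0, 4.0]])

# ad - bc = (2)(4) - (3)(1) = 5; nonzero, so A is invertible
print(np.linalg.det(A))                    # ~5.0

# Property 1: swapping the two rows flips the sign
swapped = A[[1, 0], :]
print(np.linalg.det(swapped))              # ~-5.0

# Property 4: transposition preserves the determinant
print(np.linalg.det(A.T))                  # ~5.0

# Linearly dependent rows (row 2 = 2 * row 1) give determinant 0
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.isclose(np.linalg.det(S), 0.0))   # True: S is singular
```

Note that floating-point determinants are only approximately zero for singular matrices, hence the `np.isclose` comparison.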

Eigenvalues and Eigenvectors

[eigenvalues-of-a-matrix] defines eigenvalues as scalars λ satisfying the characteristic equation:

\det(A - \lambda I) = 0

Eigenvalues reveal how much a corresponding eigenvector is stretched or compressed under the transformation represented by A. They are crucial in stability analysis, principal component analysis, and understanding oscillatory behavior in dynamical systems.

Diagonalization

[diagonalizable-matrix] establishes that a matrix A is diagonalizable if there exist an invertible matrix P and a diagonal matrix D such that:

A = PDP^{-1}

Here, the columns of P are linearly independent eigenvectors of A, and D contains the corresponding eigenvalues on its diagonal. Diagonalization simplifies computing matrix powers and solving differential equations, since A^n = PD^nP^{-1} and D^n is trivial to compute.
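The power shortcut can be verified directly. A sketch that builds P and D from `np.linalg.eig` and checks both the decomposition and the identity A^n = PD^nP^{-1}:

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [1.0, 4.0]])

# Columns of P are eigenvectors; D holds the matching eigenvalues
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# The decomposition A = P D P^{-1}
print(np.allclose(A, P @ D @ np.linalg.inv(P)))          # True

# A^n = P D^n P^{-1}: D^n just raises each diagonal entry to the n-th power
n = 10
Dn = np.diag(eigenvalues ** n)
An = P @ Dn @ np.linalg.inv(P)
print(np.allclose(An, np.linalg.matrix_power(A, n)))     # True
```

Raising the diagonal entries to the n-th power costs O(size) work, versus repeated full matrix multiplications, which is the computational payoff of diagonalization.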

Column Space and Null Space

[basis-of-column-space] describes the column space Col(A) as the subspace spanned by the columns of A. A basis is given by the columns of A corresponding to pivot positions in its row echelon form, and its dimension equals the number of pivots (the rank of A).

[basis-of-null-space] defines the null space Null(A) as the solution set of Ax = 0. Its basis is found by solving the homogeneous system, and its dimension equals the number of free variables. Together, these spaces satisfy the rank-nullity relationship: rank(A) + nullity(A) = n, where n is the number of columns of A.
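A quick numerical sanity check of rank-nullity, using a 3 × 4 matrix chosen so its third row is the sum of the first two (example data, not from the notes):

```python
import numpy as np

# 3x4 matrix: rank + nullity must equal 4, the number of columns
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 2.0],
              [1.0, 2.0, 1.0, 3.0]])   # row 3 = row 1 + row 2, so rank is 2

rank = np.linalg.matrix_rank(A)
print(rank)                             # 2

# Nullity = number of columns minus rank (two free variables here)
n_cols = A.shape[1]
nullity = n_cols - rank
print(nullity)                          # 2
print(rank + nullity == n_cols)         # True: rank-nullity
```

With rank 2 and four columns, the homogeneous system Ax = 0 has two free variables, so the null space is two-dimensional.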

Matrix Equations

[matrix-equation-solution] and [matrix-inversion-formula] demonstrate how to solve matrix equations by algebraic manipulation. For example, given invertible matrices A and B, solving

B^{-1}(A - X) = AX

proceeds by left-multiplying both sides by B to get A - X = BAX, collecting the terms in X as A = (BA + I)X, and solving:

X = (BA + I)^{-1}A

provided BA + I is invertible. Such manipulations rely on properties of matrix multiplication and inversion, and the solution's existence depends on invertibility conditions.
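The closed form can be checked numerically. A sketch with random invertible matrices (seeded example data); note it uses `np.linalg.solve` rather than forming the inverse explicitly, which is the standard numerically preferred route:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 3x3 matrices; these are invertible with probability 1 (example data)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# From B^{-1}(A - X) = AX:  A - X = BAX,  so  A = (BA + I)X
# and X = (BA + I)^{-1} A, provided BA + I is invertible.
I = np.eye(3)
X = np.linalg.solve(B @ A + I, A)   # solves (BA + I) X = A

# Verify the original equation holds for this X
lhs = np.linalg.inv(B) @ (A - X)
rhs = A @ X
print(np.allclose(lhs, rhs))        # True
```

Solving the linear system (BA + I)X = A directly avoids the extra rounding error and cost of computing (BA + I)^{-1}.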

Worked Examples

Example 1: Computing a 2×2 Determinant

Let A = \begin{pmatrix} 2 & 3 \\ 1 & 4 \end{pmatrix}. By [determinant-of-a-matrix],

\det(A) = (2)(4) - (3)(1) = 8 - 3 = 5

Since det(A) ≠ 0, the matrix is invertible.

Example 2: Finding Eigenvalues

For the same matrix A, we solve det(A − λI) = 0 [eigenvalues-of-a-matrix]:

\det\begin{pmatrix} 2-\lambda & 3 \\ 1 & 4-\lambda \end{pmatrix} = (2-\lambda)(4-\lambda) - 3 = \lambda^2 - 6\lambda + 5 = 0

Factoring gives (λ − 1)(λ − 5) = 0, so λ₁ = 1 and λ₂ = 5.

Example 3: Diagonalization

With eigenvalues λ₁ = 1 and λ₂ = 5, we find corresponding eigenvectors (3, −1) and (1, 1) and form P and D. With P = \begin{pmatrix} 3 & 1 \\ -1 & 1 \end{pmatrix} and D = \begin{pmatrix} 1 & 0 \\ 0 & 5 \end{pmatrix}, [diagonalizable-matrix] gives A = PDP^{-1}. Computing A^{10} then reduces to A^{10} = PD^{10}P^{-1}, where D^{10} = \begin{pmatrix} 1 & 0 \\ 0 & 5^{10} \end{pmatrix} is trivial to evaluate.
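The full example can be verified end to end. The sketch below uses the eigenvectors (3, −1) for λ = 1 and (1, 1) for λ = 5 (any nonzero scaling of an eigenvector works equally well as a column of P):

```python
import numpy as np

A = np.array([[2.0, 3.0],
              [1.0, 4.0]])

# Columns are eigenvectors for lambda = 1 and lambda = 5 respectively
P = np.array([[3.0, 1.0],
              [-1.0, 1.0]])
D = np.diag([1.0, 5.0])

# Check the decomposition A = P D P^{-1}
print(np.allclose(A, P @ D @ np.linalg.inv(P)))            # True

# A^10 = P D^10 P^{-1}; D^10 only needs the diagonal entries raised to 10
A10 = P @ np.diag([1.0, 5.0 ** 10]) @ np.linalg.inv(P)
print(np.allclose(A10, np.linalg.matrix_power(A, 10)))     # True
```

Checking A·v = λ·v for each column of P first (e.g. A(3, −1)ᵀ = (3, −1)ᵀ) is a good habit before trusting a hand-built decomposition.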


AI Disclosure

This article was drafted with the assistance of an AI language model (claude-haiku-4-5-20251001) based on personal class notes in Zettelkasten format. All mathematical statements and definitions are sourced from the cited notes and reflect standard linear algebra pedagogy. The article has been reviewed for technical accuracy and clarity.
