The concept of the determinant arises naturally through the study of systems of linear equations. It serves as a fundamental algebraic tool that characterizes the existence and uniqueness of solutions. Specifically, the determinant acts as a "gatekeeper": its value dictates whether a system possesses a unique solution or behaves singularly.
The 2×2 Case
To understand the origin of this quantity, let us consider a system of two linear equations with two variables:
ax + by = p
cx + dy = q
Our objective is to isolate x by eliminating y. To achieve this, we multiply the first equation by d and the second by b:
adx + bdy = dp
bcx + bdy = bq
Subtracting the second equation from the first yields:

(ad − bc)x = dp − bq
This result reveals that a unique solution for x exists if and only if the coefficient (ad − bc) is non-zero. If ad − bc = 0, the left-hand sides are proportional, so the system has either no solution or infinitely many; in neither case does it yield a unique solution. Because this single quantity determines the fundamental nature of the system's solution, it is aptly named the determinant.
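The elimination above can be turned into a short computational check. The helper name solve_2x2 and the sample numbers below are illustrative, not from the text:

```python
def solve_2x2(a, b, c, d, p, q):
    """Solve ax + by = p, cx + dy = q by elimination.

    Returns None when ad - bc = 0, i.e. when no unique solution exists.
    """
    det = a * d - b * c
    if det == 0:
        return None  # dependent or inconsistent equations
    x = (d * p - b * q) / det  # from (ad - bc)x = dp - bq
    y = (a * q - c * p) / det  # eliminating x instead gives (ad - bc)y = aq - cp
    return x, y

# 2x + 3y = 8 and x + 2y = 5 have the unique solution x = 1, y = 2
print(solve_2x2(2, 3, 1, 2, 8, 5))  # (1.0, 2.0)
```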
The 3×3 Case
Generalizing this logic to a 3×3 system introduces greater algebraic complexity but follows the same principle of elimination. Consider the following system:

a_{11} x_1 + a_{12} x_2 + a_{13} x_3 = b_1    (1)
a_{21} x_1 + a_{22} x_2 + a_{23} x_3 = b_2    (2)
a_{31} x_1 + a_{32} x_2 + a_{33} x_3 = b_3    (3)

Assuming a_{33} ≠ 0, we can eliminate x_3 from Equations (1) and (2) by performing the row operations a_{33}×(1) − a_{13}×(3) and a_{33}×(2) − a_{23}×(3). This reduction results in a 2×2 system in x_1 and x_2:

(a_{33}a_{11} − a_{13}a_{31}) x_1 + (a_{33}a_{12} − a_{13}a_{32}) x_2 = a_{33}b_1 − a_{13}b_3
(a_{33}a_{21} − a_{23}a_{31}) x_1 + (a_{33}a_{22} − a_{23}a_{32}) x_2 = a_{33}b_2 − a_{23}b_3
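The row operations just described can be sketched in code. The function name eliminate_x3 and the example matrix are invented for illustration:

```python
def eliminate_x3(A, b):
    """Reduce the 3x3 system A x = b (with A[2][2] != 0) to a 2x2 system
    in x1, x2 via the row operations a33*(1) - a13*(3) and a33*(2) - a23*(3)."""
    a33 = A[2][2]
    reduced = []
    for i in (0, 1):
        ai3 = A[i][2]  # coefficient of x3 to be eliminated from row i
        coeffs = [a33 * A[i][j] - ai3 * A[2][j] for j in range(2)]
        rhs = a33 * b[i] - ai3 * b[2]
        reduced.append((coeffs, rhs))
    return reduced

print(eliminate_x3([[1, 2, 3], [4, 5, 6], [7, 8, 10]], [1, 2, 3]))
# [([-11, -4], 1), ([-2, 2], 2)]
```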
The determinant possesses several key properties that facilitate both its calculation and its theoretical application. These properties can be verified directly from the algebraic expansion:
Row/Column Interchanges: Interchanging any two rows or any two columns reverses the sign of the determinant.
Identical Rows/Columns: If a matrix contains two identical rows or two identical columns, its determinant is zero.
Scalar Multiplication: Multiplying a single row or column by a scalar c scales the entire determinant by c.
Row/Column Addition: Adding a multiple of one row (or column) to another row (or column) leaves the determinant unchanged.
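These four properties can be checked numerically on a concrete example. The sketch below assumes an explicit six-term expansion for the 3×3 determinant (det3 and the matrix M are illustrative, not from the text):

```python
def det3(M):
    """3x3 determinant via the explicit six-term expansion."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

M = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]

# Interchanging two rows reverses the sign:
assert det3([M[1], M[0], M[2]]) == -det3(M)
# Two identical rows give a zero determinant:
assert det3([M[0], M[0], M[2]]) == 0
# Scaling one row by c scales the determinant by c:
assert det3([[5 * x for x in M[0]], M[1], M[2]]) == 5 * det3(M)
# Adding a multiple of one row to another leaves the determinant unchanged:
assert det3([[x + 2 * y for x, y in zip(M[0], M[1])], M[1], M[2]]) == det3(M)
print("all four properties hold for M")
```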
General Form: The Leibniz Formula
If we examine the 3×3 expansion in Equation (4),

det(A) = a_{11}a_{22}a_{33} − a_{11}a_{23}a_{32} − a_{12}a_{21}a_{33} + a_{12}a_{23}a_{31} + a_{13}a_{21}a_{32} − a_{13}a_{22}a_{31},

a clear structural pattern emerges:
Each term is a product of exactly one element from each row and each column.
The column indices (j, k, l) constitute a permutation of the set {1, 2, 3}.
The expansion consists of 3! = 6 terms, representing every possible permutation.
This observation generalizes to any n×n matrix. The determinant of an n×n matrix A is defined by the Leibniz formula as a sum over all permutations:
det(A) = Σ_{σ ∈ S_n} sgn(σ) a_{1σ(1)} a_{2σ(2)} ⋯ a_{nσ(n)}
where:
S_n is the symmetric group (the set of all n! permutations of {1, 2, …, n}).
σ is a specific permutation mapping each row i to a column σ(i).
sgn(σ) is the sign (or signature) of the permutation.
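The Leibniz formula translates directly into code. This sketch computes the sign by counting inversions, which yields the same value as the cycle-based definition discussed next; det_leibniz is an illustrative name:

```python
from itertools import permutations

def sign(perm):
    """Sign of a permutation via counting inversions: (-1)^(#inversions)."""
    n = len(perm)
    inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                     if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def det_leibniz(A):
    """det(A) as the sum over all n! permutations of sgn(sigma) * prod a[i][sigma(i)]."""
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        term = sign(sigma)
        for i in range(n):
            term *= A[i][sigma[i]]
        total += term
    return total

print(det_leibniz([[1, 2], [3, 4]]))  # 1*4 - 2*3 = -2
```

Note that the sum has n! terms, so this definition is useful for theory and small matrices; in practice determinants are computed by elimination.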
The Sign of a Permutation
The sign of a permutation is determined by its decomposition into disjoint cycles.
Note: A cycle (i_1, i_2, …, i_k) describes a mapping where i_1 → i_2 → ⋯ → i_k → i_1. Elements not included in a cycle are considered fixed (cycles of length 1).
Let n be the number of elements on which the permutation acts and k the total number of disjoint cycles (counting fixed points as cycles of length 1). The sign is defined as:

sgn(σ) = (−1)^(n−k)
Even Permutations: If n − k is even, sgn(σ) = 1.
Odd Permutations: If n − k is odd, sgn(σ) = −1.
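The cycle-based sign sgn(σ) = (−1)^(n−k) can be computed by walking each cycle once. In this sketch, sign_by_cycles is an illustrative name and permutations are written 0-based as lists mapping i to perm[i]:

```python
def sign_by_cycles(perm):
    """sgn(sigma) = (-1)^(n - k), where k counts disjoint cycles
    (fixed points count as 1-cycles)."""
    n = len(perm)
    seen = [False] * n
    k = 0
    for start in range(n):
        if not seen[start]:
            k += 1  # found the start of a new cycle
            i = start
            while not seen[i]:
                seen[i] = True
                i = perm[i]  # follow the cycle until it closes
    return (-1) ** (n - k)

# The 3-cycle 0 -> 1 -> 2 -> 0 has n = 3, k = 1, so it is even:
print(sign_by_cycles([1, 2, 0]))  # 1
# A transposition of 0 and 1 with two fixed points has n = 4, k = 3:
print(sign_by_cycles([1, 0, 2, 3]))  # -1
```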
This formal definition provides a unified framework that consistently yields the 2×2 and 3×3 formulas derived earlier.
While the Leibniz formula generalizes the determinant to anyn×nmatrix, its formal justification requires demonstrating that it satisfies the essential property of the determinant: it is non-zero if and only if the matrix is invertible. This proof involves algebraic machinery that is beyond the scope of this course.
Historical Context
The concept of the determinant is unique in mathematical history because it predates the formal definition of a "matrix" by nearly two centuries. It was initially developed as a specialized algorithmic tool for solving linear systems.
Chronology of Development
Late 17th Century: The concept was discovered independently by Seki Takakazu in Japan (1683) and Gottfried Wilhelm Leibniz in Europe (1693).
18th Century: Gabriel Cramer published "Cramer's Rule" in 1750, providing a systematic method for solving systems ofnequations.
19th Century: Carl Friedrich Gauss introduced the term "determinant" in 1801, though in the context of quadratic forms. Augustin-Louis Cauchy (1812) established the modern definition, proving foundational theorems such as the multiplication theorem (det(AB)=det(A)det(B)).
Key Figure Contributions
Seki Takakazu (1683): Developed methods for calculating the "resultant" of polynomials. He successfully computed determinants for matrices up to 5×5, predating European developments.
Gottfried Wilhelm Leibniz (1693): Recognized that a system of three equations in two unknowns has a solution only if a specific combination of coefficients—the determinant—vanishes.
Gabriel Cramer (1750): Formulated the rule that bears his name, expressing the variables of a linear system as the ratio of two determinants.
Carl Friedrich Gauss (1801): While his term "determinant" originally referred to the discriminant of a quadratic form, his work on Gaussian elimination provided the numerical foundation for matrix theory.
Augustin-Louis Cauchy (1812): Provided the first modern and systematic treatment. Cauchy introduced the vertical bar notation, defined adjoint matrices, and proved the multiplicative property of determinants.