Square Matrix with Duplicate Rows has Zero Determinant
Theorem
If two rows of a square matrix over a commutative ring $\struct {R, +, \circ}$ are the same, then its determinant is zero.
Corollary
If a square matrix has a zero row, its determinant is zero.
Proof 1
The proof proceeds by induction on $n$, the order of the square matrix.
Basis for the Induction
Let $n = 2$, which is the smallest natural number for which a square matrix of order $n$ can have two identical rows.
Let $\mathbf A = \sqbrk a_2$ be a square matrix over $R$ with two identical rows.
Then, by definition of determinant, and since the rows are identical (so that $a_{2 1} = a_{1 1}$ and $a_{2 2} = a_{1 2}$):
- $\map \det {\mathbf A} = a_{1 1} a_{2 2} - a_{1 2} a_{2 1} = a_{1 1} a_{2 2} - a_{2 2} a_{1 1} = 0$
where the last equality holds by the commutativity of $\circ$.
$\Box$
This is the basis for the induction.
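As an informal numerical check of the basis case (a sketch outside the formal proof; the entries are arbitrary sample integers, not values from the source):

```python
# 2x2 matrix whose second row duplicates the first:
# det = a11*a22 - a12*a21 should come out to 0.
a11, a12 = 3, 7       # arbitrary sample first row
a21, a22 = a11, a12   # second row is a copy of the first

d = a11 * a22 - a12 * a21
print(d)  # 0
```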
Induction Hypothesis
Assume for some $n \in \N_{\ge 2}$ that every square matrix of order $n$ over $R$ with two identical rows has determinant equal to $0$.
Induction Step
Let $\mathbf A$ be a square matrix of order $n+1$ over $R$ with two identical rows.
Let $i_1, i_2 \in \set {1, \ldots, n + 1}$, with $i_1 < i_2$, be the indices of the identical rows.
Let $i \in \set {1, \ldots, n + 1}$.
Let $\map {\mathbf A} {i; 1}$ denote the submatrix obtained from $\mathbf A$ by removing row $i$ and column $1$.
If $i \ne i_1$ and $i \ne i_2$, then $\map {\mathbf A} {i; 1}$ still contains two identical rows.
By the induction hypothesis:
- $\map \det {\map {\mathbf A} {i; 1} } = 0$
Now consider the matrices $\map {\mathbf A} {i_1; 1}$ and $\map {\mathbf A} {i_2; 1}$.
Let $r_j$ denote row $j$ of $\map {\mathbf A} {i_1; 1}$.
If we perform the following sequence of $i_2 - i_1 - 1$ elementary row operations on $\map {\mathbf A} {i_1; 1}$, each transposing a pair of adjacent rows:
- $r_{i_2 - 1} \leftrightarrow r_{i_2 - 2} \ ; \ r_{i_2 - 2} \leftrightarrow r_{i_2 - 3} \ ; \ \ldots \ ; \ r_{i_1 + 1} \leftrightarrow r_{i_1}$
then row $r_{i_2 - 1}$ of $\map {\mathbf A} {i_1; 1}$, which is row $i_2$ of $\mathbf A$ and hence a copy of row $i_1$ of $\mathbf A$, moves up to position $i_1$, and $\map {\mathbf A} {i_1; 1}$ is transformed into $\map {\mathbf A} {i_2; 1}$.
From Determinant with Rows Transposed:
- $\map \det {\map {\mathbf A} {i_1; 1} } = \paren {-1}^{i_2 - i_1 - 1} \map \det {\map {\mathbf A} {i_2; 1} }$
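This sign relation can be checked numerically. The following is an illustrative sketch, not part of the proof: the `det` and `submatrix` helpers and the $4 \times 4$ sample matrix (with rows $2$ and $4$ identical, so $i_1 = 2$, $i_2 = 4$) are assumptions chosen for the example.

```python
def det(m):
    """Determinant by recursive cofactor expansion along the first column."""
    if len(m) == 1:
        return m[0][0]
    return sum(m[k][0] * (-1) ** k *
               det([row[1:] for i, row in enumerate(m) if i != k])
               for k in range(len(m)))

def submatrix(m, i):
    """A(i; 1) in 1-based terms: remove row i and column 1."""
    return [row[1:] for j, row in enumerate(m) if j != i - 1]

A = [[1, 2, 3, 4],
     [5, 6, 7, 8],   # row i1 = 2
     [9, 1, 2, 6],
     [5, 6, 7, 8]]   # row i2 = 4, identical to row i1

i1, i2 = 2, 4
d1 = det(submatrix(A, i1))
d2 = det(submatrix(A, i2))
# det A(i1; 1) = (-1)^(i2 - i1 - 1) * det A(i2; 1)
assert d1 == (-1) ** (i2 - i1 - 1) * d2
print(d1, d2)  # 12 -12
```

Here $i_2 - i_1 - 1 = 1$, so a single adjacent transposition relates the two submatrices and their determinants differ exactly by a sign.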
Then:
- $\ds \map \det {\mathbf A} = \sum_{k \mathop = 1}^{n + 1} a_{k 1} \paren {-1}^{k + 1} \map \det {\map {\mathbf A} {k; 1} }$
by expanding the determinant along column $1$.
By the induction hypothesis, all terms other than those for $k = i_1$ and $k = i_2$ vanish, so:
- $\ds \map \det {\mathbf A} = a_{i_1 1} \paren {-1}^{i_1 + 1} \map \det {\map {\mathbf A} {i_1; 1} } + a_{i_2 1} \paren {-1}^{i_2 + 1} \map \det {\map {\mathbf A} {i_2; 1} }$
Since rows $i_1$ and $i_2$ of $\mathbf A$ are identical, $a_{i_1 1} = a_{i_2 1}$, and substituting $\map \det {\map {\mathbf A} {i_1; 1} } = \paren {-1}^{i_2 - i_1 - 1} \map \det {\map {\mathbf A} {i_2; 1} }$ from above:
- $\ds \map \det {\mathbf A} = a_{i_2 1} \paren {-1}^{i_1 + 1 + i_2 - i_1 - 1} \map \det {\map {\mathbf A} {i_2; 1} } + a_{i_2 1} \paren {-1}^{i_2 + 1} \map \det {\map {\mathbf A} {i_2; 1} }$
As $\paren {-1}^{i_1 + 1 + i_2 - i_1 - 1} = \paren {-1}^{i_2} = -\paren {-1}^{i_2 + 1}$, the two terms cancel:
- $\ds \map \det {\mathbf A} = 0$
$\blacksquare$
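The conclusion of the induction step can be sanity-checked numerically. This is an illustrative sketch outside the formal argument; the matrix entries are arbitrary sample integers:

```python
def det(m):
    # determinant by recursive cofactor expansion along the first column
    if len(m) == 1:
        return m[0][0]
    return sum(m[k][0] * (-1) ** k *
               det([row[1:] for i, row in enumerate(m) if i != k])
               for k in range(len(m)))

# A sample 4x4 integer matrix whose rows 2 and 4 are identical:
A = [[4, 1, 0, 2],
     [3, 3, 5, 1],
     [9, 2, 8, 7],
     [3, 3, 5, 1]]

print(det(A))  # 0
```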
Proof 2
Suppose that $R$ has no element of additive order $2$, that is:
- $\forall x \in R: x + x = 0 \implies x = 0$
From Determinant with Rows Transposed, exchanging two rows of a square matrix changes the sign of its determinant.
Exchanging the two identical rows therefore changes the sign of the determinant from $D$, say, to $-D$.
But the matrix itself is unchanged by this exchange, and so its determinant is also unchanged.
Hence $D = -D$, that is, $D + D = 0$, and so by hypothesis $D = 0$.
$\blacksquare$
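Proof 2 can be illustrated over the integers, which have no element of additive order $2$. The following is an illustrative sketch assuming Python and an arbitrary sample $2 \times 2$ matrix, not part of the proof:

```python
def det2(m):
    # determinant of a 2x2 matrix
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[5, 7],
     [5, 7]]                  # two identical rows
A_swapped = [A[1], A[0]]      # exchange the two rows

D = det2(A)
assert A_swapped == A         # the matrix is unchanged by the exchange
assert det2(A_swapped) == -D  # yet a row exchange negates the determinant
# Hence D = -D, i.e. D + D = 0; over the integers this forces D = 0:
print(D)  # 0
```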
Sources
- 1994: Robert Messer: Linear Algebra: Gateway to Mathematics: $\S 7.2$
- 2014: Christopher Clapham and James Nicholson: The Concise Oxford Dictionary of Mathematics (5th ed.): determinant (i)