Square Matrix with Duplicate Rows has Zero Determinant


Theorem

If two rows of a square matrix over a commutative ring $\left({R, +, \circ}\right)$ are the same, then its determinant is zero.
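
For a concrete illustration (an example not part of the statement itself), take the following matrix over $\Z$, whose first and third rows are equal:

$\det \left({\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 1 & 2 & 3 \end{bmatrix} }\right) = 1 \left({5 \cdot 3 - 6 \cdot 2}\right) - 2 \left({4 \cdot 3 - 6 \cdot 1}\right) + 3 \left({4 \cdot 2 - 5 \cdot 1}\right) = 3 - 12 + 9 = 0$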


Corollary

If a square matrix has a zero row or zero column, its determinant is zero.
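
As an illustration of the corollary (again not part of the statement), for a square matrix of order $2$ with a zero row:

$\det \left({\begin{bmatrix} a & b \\ 0 & 0 \end{bmatrix} }\right) = a \cdot 0 - b \cdot 0 = 0$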


Proof 1

Proof by induction over $n$, the order of the square matrix.


Basis for the Induction

Let $n=2$, which is the smallest natural number for which a square matrix of order $n$ can have two identical rows.

Let $\mathbf A = \left[{a}\right]_2$ be a square matrix over $R$ with two identical rows.

Then, by definition of determinant:

$\det \left({\mathbf A}\right) = a_{11} a_{22} - a_{12} a_{21} = a_{11} a_{22} - a_{22} a_{11} = 0$

where the second equality holds because the two rows are identical, so that $a_{21} = a_{11}$ and $a_{22} = a_{12}$, and the third follows from the commutativity of $\circ$.

$\Box$


Induction Hypothesis

Assume that for some $n \in \N_{\ge 2}$, every square matrix of order $n$ over $R$ with two identical rows has determinant equal to $0$.


Induction Step

Let $\mathbf A$ be a square matrix of order $n+1$ over $R$ with two identical rows.

Let $i_1, i_2 \in \left\{ {1, \ldots, n+1}\right\}$, with $i_1 < i_2$, be the indices of the identical rows.

Let $i \in \left\{ {1, \ldots, n+1}\right\}$.

Let $\mathbf A \left({i ; 1}\right)$ denote the submatrix obtained from $\mathbf A$ by removing row $i$ and column $1$.

If $i \ne i_1$ and $i \ne i_2$, then $\mathbf A \left({i ; 1}\right)$ still contains two identical rows.

By the induction hypothesis, it follows that $\det \left({\mathbf A \left({i ; 1}\right) }\right) = 0$.
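
For example (with the entries named purely for illustration), take $n + 1 = 4$ with rows $1$ and $3$ of $\mathbf A$ identical, and let $i = 4$:

$\mathbf A = \begin{bmatrix} a & b & c & d \\ e & f & g & h \\ a & b & c & d \\ p & q & r & s \end{bmatrix} \implies \mathbf A \left({4 ; 1}\right) = \begin{bmatrix} b & c & d \\ f & g & h \\ b & c & d \end{bmatrix}$

which still has two identical rows.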


Now consider the matrices $\mathbf A \left({i_1 ; 1}\right)$ and $\mathbf A \left({i_2 ; 1}\right)$.

Let $r_j$ denote row $j$ of $\mathbf A \left({i_1 ; 1}\right)$.

In $\mathbf A \left({i_1 ; 1}\right)$, the duplicate row (row $i_2$ of $\mathbf A$) occupies row $i_2 - 1$, whereas in $\mathbf A \left({i_2 ; 1}\right)$ the duplicate row (row $i_1$ of $\mathbf A$) occupies row $i_1$; the other rows appear in the same relative order in both.

Hence, if we perform the following sequence of $i_2 - i_1 - 1$ elementary row operations on $\mathbf A \left({i_1 ; 1}\right)$:

$r_{i_2 - 2} \leftrightarrow r_{i_2 - 1} \ ; \ r_{i_2 - 3} \leftrightarrow r_{i_2 - 2} \ ; \ \ldots \ ; \ r_{i_1} \leftrightarrow r_{i_1 + 1}$

we transform $\mathbf A \left({i_1 ; 1}\right)$ into $\mathbf A \left({i_2 ; 1}\right)$.

From Determinant with Rows Transposed, each of these transpositions changes the sign of the determinant, so that:

$\det \left({\mathbf A \left({i_1 ; 1}\right) }\right) = \left({-1}\right)^{i_2 - i_1 - 1} \det \left({\mathbf A \left({i_2 ; 1}\right) }\right)$
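
Continuing the illustrative $4 \times 4$ matrix above, where $i_1 = 1$, $i_2 = 3$ and so $i_2 - i_1 - 1 = 1$:

$\mathbf A \left({1 ; 1}\right) = \begin{bmatrix} f & g & h \\ b & c & d \\ q & r & s \end{bmatrix}, \qquad \mathbf A \left({3 ; 1}\right) = \begin{bmatrix} b & c & d \\ f & g & h \\ q & r & s \end{bmatrix}$

and indeed the single transposition $r_1 \leftrightarrow r_2$ carries the first into the second, so $\det \left({\mathbf A \left({1 ; 1}\right) }\right) = \left({-1}\right)^1 \det \left({\mathbf A \left({3 ; 1}\right) }\right)$.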


Then:

$\displaystyle \begin{aligned}
\det \left({\mathbf A}\right) &= \sum_{k \mathop = 1}^{n + 1} a_{k 1} \left({-1}\right)^{k + 1} \det \left({\mathbf A \left({k ; 1}\right) }\right) && \text{expanding the determinant along column } 1 \\
&= a_{i_1 1} \left({-1}\right)^{i_1 + 1} \det \left({\mathbf A \left({i_1 ; 1}\right) }\right) + a_{i_2 1} \left({-1}\right)^{i_2 + 1} \det \left({\mathbf A \left({i_2 ; 1}\right) }\right) && \text{all other summands vanish by the induction hypothesis} \\
&= a_{i_2 1} \left({-1}\right)^{i_1 + 1 + i_2 - i_1 - 1} \det \left({\mathbf A \left({i_2 ; 1}\right) }\right) + a_{i_2 1} \left({-1}\right)^{i_2 + 1} \det \left({\mathbf A \left({i_2 ; 1}\right) }\right) && \text{as } a_{i_1 1} = a_{i_2 1} \text{, and by the relation above} \\
&= a_{i_2 1} \left({ \left({-1}\right)^{i_2} + \left({-1}\right)^{i_2 + 1} }\right) \det \left({\mathbf A \left({i_2 ; 1}\right) }\right) \\
&= 0
\end{aligned}$

$\blacksquare$


Proof 2

Suppose that $R$ has the property that:

$\forall x \in R: x + x = 0 \implies x = 0$

From Determinant with Rows Transposed, exchanging two rows of a square matrix changes the sign of its determinant.

Exchange the two identical rows of the matrix: the sign of its determinant changes from $D$, say, to $-D$.

But the matrix itself is unchanged.

So $D = -D$, that is, $D + D = 0$, and by the assumption on $R$ it follows that $D = 0$.

$\blacksquare$
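
Note that the assumption on $R$ cannot be dropped from this argument: in $\Z / 2 \Z$, for instance, $1 + 1 = 0$ but $1 \ne 0$, so $D = -D$ holds for every $D$ and gives no information. Proof 1 establishes the result without this restriction.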

