Vandermonde Matrix Identity for Cauchy Matrix

Theorem

Let the values $\set {x_1, \ldots, x_n, y_1, \ldots, y_n}$ be distinct, and let $C$ be the Cauchy matrix of order $n$:

$C = \begin {pmatrix}
\dfrac 1 {x_1 - y_1} & \dfrac 1 {x_1 - y_2} & \cdots & \dfrac 1 {x_1 - y_n} \\
\dfrac 1 {x_2 - y_1} & \dfrac 1 {x_2 - y_2} & \cdots & \dfrac 1 {x_2 - y_n} \\
\vdots               & \vdots               & \ddots & \vdots \\
\dfrac 1 {x_n - y_1} & \dfrac 1 {x_n - y_2} & \cdots & \dfrac 1 {x_n - y_n} \\
\end {pmatrix}$

Then:

\(\ds C\) \(=\) \(\ds -P V_x^{-1} V_y Q^{-1}\) Vandermonde matrix identity for a Cauchy matrix


Definitions of Vandermonde matrices $V_x$, $V_y$ and diagonal matrices $P$, $Q$:

$V_x = \begin {pmatrix}
1 & 1 & \cdots & 1 \\
x_1 & x_2 & \cdots & x_n \\
\vdots & \vdots & \ddots & \vdots \\
{x_1}^{n - 1} & {x_2}^{n - 1} & \cdots & {x_n}^{n - 1} \\
\end {pmatrix}, \quad
V_y = \begin {pmatrix}
1 & 1 & \cdots & 1 \\
y_1 & y_2 & \cdots & y_n \\
\vdots & \vdots & \ddots & \vdots \\
{y_1}^{n - 1} & {y_2}^{n - 1} & \cdots & {y_n}^{n - 1} \\
\end {pmatrix}$ Vandermonde matrices

$P = \begin {pmatrix}
\map {p_1} {x_1} & \cdots & 0 \\
\vdots & \ddots & \vdots \\
0 & \cdots & \map {p_n} {x_n} \\
\end {pmatrix}, \quad
Q = \begin {pmatrix}
\map p {y_1} & \cdots & 0 \\
\vdots & \ddots & \vdots \\
0 & \cdots & \map p {y_n} \\
\end {pmatrix}$ Diagonal matrices

Definitions of polynomials $p, p_1, \ldots, p_n$:

$\ds \map p x = \prod_{i \mathop = 1}^n \paren {x - x_i}$
$\ds \map {p_k} x = \dfrac {\map p x} {x - x_k} = \prod_{i \mathop = 1, i \mathop \ne k}^n \paren {x - x_i}$, $1 \mathop \le k \mathop \le n$
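
The identity can be checked numerically for any particular choice of distinct values. The following sketch (not part of the formal statement) does this for an arbitrary $3 \times 3$ instance; it assumes NumPy is available, and the values chosen are only illustrative.

import numpy as np

n = 3
x = np.array([1.0, 2.0, 4.0])   # arbitrary distinct values
y = np.array([0.5, 3.0, 5.0])   # arbitrary distinct values

# Cauchy matrix: C[i, j] = 1 / (x_i - y_j)
C = 1.0 / (x[:, None] - y[None, :])

# Vandermonde matrices: row k holds the k-th powers (k = 0, ..., n - 1)
Vx = np.vander(x, n, increasing=True).T
Vy = np.vander(y, n, increasing=True).T

# P = diag(p_k(x_k)) and Q = diag(p(y_j)), with p(t) = prod_i (t - x_i)
P = np.diag([np.prod(np.delete(x[k] - x, k)) for k in range(n)])
Q = np.diag([np.prod(t - x) for t in y])

print(np.allclose(C, -P @ np.linalg.inv(Vx) @ Vy @ np.linalg.inv(Q)))   # expected: True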


Proof

Matrices $P$ and $Q$ are invertible because all their diagonal elements are nonzero, which follows from the assumption that the values $x_1, \ldots, x_n, y_1, \ldots, y_n$ are distinct.

For $1 \le i \le n$, express the polynomial $p_i$ in the monomial basis as:

$\ds \map {p_i} x = \sum_{k \mathop = 1}^n a_{i k} x^{k - 1}$

Then:

\(\ds \paren {\map {p_i} {x_j} }\) \(=\) \(\ds \paren {a_{i j} } V_x\) Definition of Matrix Product (Conventional)
\(\ds P\) \(=\) \(\ds \paren {a_{i j} } V_x\) as $\map {p_i} {x_j} = 0$ for $i \ne j$
\(\ds \paren {a_{i j} }\) \(=\) \(\ds P V_x^{-1}\) solving for the matrix $\paren {a_{i j} }$
\(\ds \paren {\map {p_i} {y_j} }\) \(=\) \(\ds \paren {a_{i j} } V_y\) Definition of Matrix Product (Conventional)
\(\ds \paren {\map {p_i} {y_j} }\) \(=\) \(\ds P V_x^{-1} V_y\) substituting $\paren {a_{i j} } = P V_x^{-1}$
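
The step $\paren {a_{i j} } = P V_x^{-1}$ can be illustrated numerically by expanding each $p_i$ into its monomial coefficients. A minimal sketch, assuming NumPy and reusing the illustrative values from the earlier sketch:

import numpy as np
from numpy.polynomial import polynomial as Poly

n = 3
x = np.array([1.0, 2.0, 4.0])
y = np.array([0.5, 3.0, 5.0])

Vx = np.vander(x, n, increasing=True).T
Vy = np.vander(y, n, increasing=True).T
P = np.diag([np.prod(np.delete(x[k] - x, k)) for k in range(n)])

# a[i, k] = coefficient of t^k in p_i(t) = prod_{j != i} (t - x_j)
a = np.array([Poly.polyfromroots(np.delete(x, i)) for i in range(n)])

print(np.allclose(a @ Vx, P))                            # (p_i(x_j)) = (a_ij) V_x = P
print(np.allclose(a, P @ np.linalg.inv(Vx)))             # (a_ij) = P V_x^{-1}
print(np.allclose(a @ Vy, P @ np.linalg.inv(Vx) @ Vy))   # (p_i(y_j)) = P V_x^{-1} V_y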


Now use the second defining equation to evaluate $p_i$ at $y_j$, namely $\map {p_i} {y_j} = \dfrac {\map p {y_j} } {y_j - x_i} = -\dfrac {\map p {y_j} } {x_i - y_j}$:

\(\ds \paren {\map {p_i} {y_j} }\) \(=\) \(\ds -C Q\) Definition of Matrix Product (Conventional)
\(\ds -C Q\) \(=\) \(\ds P V_x^{-1} V_y\) equating the two expressions for $\paren {\map {p_i} {y_j} }$
\(\ds C\) \(=\) \(\ds -P V_x^{-1} V_y Q^{-1}\) solving for $C$

$\blacksquare$


Examples

$3 \times 3$ Matrix

This example illustrates the $3 \times 3$ case of Vandermonde Matrix Identity for Cauchy Matrix and of Value of Cauchy Determinant.

Let $C$ denote the Cauchy matrix of order $3$:

$C = \begin {pmatrix}
\dfrac 1 {x_1 - y_1} & \dfrac 1 {x_1 - y_2} & \dfrac 1 {x_1 - y_3} \\
\dfrac 1 {x_2 - y_1} & \dfrac 1 {x_2 - y_2} & \dfrac 1 {x_2 - y_3} \\
\dfrac 1 {x_3 - y_1} & \dfrac 1 {x_3 - y_2} & \dfrac 1 {x_3 - y_3} \\
\end {pmatrix}$

where the values in $\set {x_1, x_2, x_3, y_1, y_2, y_3}$ are assumed to be distinct.

Then:

\(\ds C\) \(=\) \(\ds -P V_x^{-1} V_y Q^{-1}\) Vandermonde Matrix Identity for Cauchy Matrix
\(\ds \map \det C\) \(=\) \(\ds \paren {-1}^3 \dfrac {\paren {x_3 - x_1} \paren {x_3 - x_2} \paren {x_2 - x_1} \paren {y_3 - y_1} \paren {y_3 - y_2} \paren {y_2 - y_1} } {\paren {x_1 - y_1} \paren {x_1 - y_2} \paren {x_1 - y_3} \paren {x_2 - y_1} \paren {x_2 - y_2} \paren {x_2 - y_3} \paren {x_3 - y_1} \paren {x_3 - y_2} \paren {x_3 - y_3} }\) Determinant of Matrix Product
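
As a quick numerical cross-check of the displayed determinant value (illustrative only, assuming NumPy; the values are arbitrary distinct reals):

import numpy as np

x = np.array([1.0, 2.0, 4.0])
y = np.array([0.5, 3.0, 5.0])

C = 1.0 / (x[:, None] - y[None, :])

numer = ((x[2] - x[0]) * (x[2] - x[1]) * (x[1] - x[0])
         * (y[2] - y[0]) * (y[2] - y[1]) * (y[1] - y[0]))
denom = np.prod(x[:, None] - y[None, :])

print(np.isclose(np.linalg.det(C), (-1) ** 3 * numer / denom))   # expected: True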


$n \times n$ Matrix

The method of the $3 \times 3$ example applies unchanged to the general $n \times n$ Cauchy matrix:

Assume values $\set {x_1, \ldots, x_n, y_1, \ldots, y_n}$ are distinct. Then:

$\map \det {\begin {smallmatrix}
\dfrac 1 {x_1 - y_1} & \dfrac 1 {x_1 - y_2} & \cdots & \dfrac 1 {x_1 - y_n} \\
\dfrac 1 {x_2 - y_1} & \dfrac 1 {x_2 - y_2} & \cdots & \dfrac 1 {x_2 - y_n} \\
\vdots & \vdots & \ddots & \vdots \\
\dfrac 1 {x_n - y_1} & \dfrac 1 {x_n - y_2} & \cdots & \dfrac 1 {x_n - y_n} \\
\end {smallmatrix} } = \paren {-1}^{n \paren {n - 1} / 2} \dfrac {\ds \prod_{1 \mathop \le j \mathop < i \mathop \le n} \paren {x_i - x_j} \quad \prod_{1 \mathop \le j \mathop < i \mathop \le n} \paren {y_i - y_j} } {\ds \prod_{i \mathop = 1}^n \prod_{j \mathop = 1}^n \paren {x_i - y_j} }$ Value of Cauchy Determinant
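
A general-$n$ sanity check of this closed form can be run numerically. A minimal sketch, assuming NumPy; the values below are arbitrary distinct reals and the case $n = 4$ is only illustrative:

import numpy as np

n = 4
x = np.array([1.0, 2.0, 4.0, 8.0])
y = np.array([0.5, 3.0, 5.0, 9.0])

C = 1.0 / (x[:, None] - y[None, :])

sign = (-1.0) ** (n * (n - 1) // 2)
numer = (np.prod([x[i] - x[j] for i in range(n) for j in range(i)])
         * np.prod([y[i] - y[j] for i in range(n) for j in range(i)]))
denom = np.prod(x[:, None] - y[None, :])

print(np.isclose(np.linalg.det(C), sign * numer / denom))   # expected: True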

Now assume the values $\set {x_1, \ldots, x_n, -y_1, \ldots, -y_n}$ are distinct, and replace $y_i$ by $-y_i$, for $1 \le i \le n$, in the preceding equation:

$\map \det {\begin {smallmatrix}
\dfrac 1 {x_1 + y_1} & \dfrac 1 {x_1 + y_2} & \cdots & \dfrac 1 {x_1 + y_n} \\
\dfrac 1 {x_2 + y_1} & \dfrac 1 {x_2 + y_2} & \cdots & \dfrac 1 {x_2 + y_n} \\
\vdots & \vdots & \ddots & \vdots \\
\dfrac 1 {x_n + y_1} & \dfrac 1 {x_n + y_2} & \cdots & \dfrac 1 {x_n + y_n} \\
\end {smallmatrix} } = \paren {-1}^{n \paren {n - 1} / 2} \dfrac {\ds \prod_{1 \mathop \le j \mathop < i \mathop \le n} \paren {x_i - x_j} \quad \prod_{1 \mathop \le j \mathop < i \mathop \le n} \paren {y_j - y_i} } {\ds \prod_{i \mathop = 1}^n \prod_{j \mathop = 1}^n \paren {x_i + y_j} }$ Value of Cauchy Determinant
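
The same check, with $y_j$ negated, confirms the $+$ form (again a sketch only, assuming NumPy):

import numpy as np

n = 4
x = np.array([1.0, 2.0, 4.0, 8.0])
y = np.array([0.5, 3.0, 5.0, 9.0])   # x_i and -y_j are all distinct here

C_plus = 1.0 / (x[:, None] + y[None, :])

sign = (-1.0) ** (n * (n - 1) // 2)
numer = (np.prod([x[i] - x[j] for i in range(n) for j in range(i)])
         * np.prod([y[j] - y[i] for i in range(n) for j in range(i)]))
denom = np.prod(x[:, None] + y[None, :])

print(np.isclose(np.linalg.det(C_plus), sign * numer / denom))   # expected: True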

$\blacksquare$


Also see


Historical Note

Roderick Gow established the Vandermonde Matrix Identity for Cauchy Matrix using interpolation polynomials and change-of-basis arguments.

