Matrix Product (Conventional)/Examples


Examples of Matrix Product

$2 \times 2$ Real Matrices

Let $\mathbf A = \begin {pmatrix} p & q \\ r & s \end {pmatrix}$ and $\mathbf B = \begin {pmatrix} w & x \\ y & z \end {pmatrix}$ be order $2$ square matrices over the real numbers.


Then the matrix product of $\mathbf A$ with $\mathbf B$ is given by:

$\mathbf A \mathbf B = \begin {pmatrix} p w + q y & p x + q z \\ r w + s y & r x + s z \end {pmatrix}$
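
As an informal check, the following SymPy sketch (an illustration assumed here, not part of the original statement) reproduces this formula symbolically:

```python
# Symbolic check of the 2 x 2 product formula with SymPy.
from sympy import Matrix, symbols

p, q, r, s, w, x, y, z = symbols('p q r s w x y z')

A = Matrix([[p, q], [r, s]])
B = Matrix([[w, x], [y, z]])

# Each entry of A*B is the dot product of a row of A with a column of B,
# matching the array displayed above.
print(A * B)
```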


$3 \times 3$ Matrix-Vector Multiplication Formula

The $3 \times 3$ matrix-vector multiplication formula is an instance of the matrix product operation:

$\mathbf A \mathbf v = \begin {bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end {bmatrix} \begin {bmatrix} x \\ y \\ z \end {bmatrix} = \begin {bmatrix} a_{11} x + a_{12} y + a_{13} z \\ a_{21} x + a_{22} y + a_{23} z \\ a_{31} x + a_{32} y + a_{33} z \end {bmatrix}$
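
As a sketch (assuming SymPy is available; not part of the original page), the same formula can be reproduced symbolically:

```python
# Symbolic check of the 3 x 3 matrix-vector multiplication formula.
from sympy import Matrix, symbols

a = symbols('a11 a12 a13 a21 a22 a23 a31 a32 a33')
x, y, z = symbols('x y z')

A = Matrix(3, 3, a)        # fill the 3 x 3 matrix row by row
v = Matrix([x, y, z])      # the column vector (x, y, z)

# A*v is the 3 x 1 column matrix of row-by-column dot products.
print(A * v)
```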


Cayley's Motivation

Let there be $3$ Cartesian coordinate systems:

$\tuple {x, y}$, $\tuple {x', y'}$, $\tuple {x'', y''}$


Let them be connected by:

$\begin {cases} x' = x + y \\ y' = x - y \end {cases}$

and:

$\begin {cases} x'' = -x' - y' \\ y'' = -x' + y' \end {cases}$


The relationship between $\tuple {x, y}$ and $\tuple {x'', y''}$ is given by:

$\begin {cases} x'' = -x' - y' = -\paren {x + y} - \paren {x - y} = -2 x \\ y'' = -x' + y' = -\paren {x + y} + \paren {x - y} = -2 y \end {cases}$


Arthur Cayley devised a compact notation that expressed such changes of coordinate systems by arranging the coefficients in arrays:

$\begin {pmatrix} 1 & 1 \\ 1 & -1 \end {pmatrix} \begin {pmatrix} -1 & -1 \\ -1 & 1 \end {pmatrix} = \begin {pmatrix} -2 & 0 \\ 0 & -2 \end {pmatrix}$

As such, he can be considered to have invented matrix multiplication.
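
The arithmetic can be replayed numerically; the following NumPy sketch (an illustration added here, not Cayley's own notation) confirms that the two arrays of coefficients multiply to the scaling by $-2$:

```python
# Numerical check of Cayley's computation.
import numpy as np

P = np.array([[1, 1], [1, -1]])     # coefficients of (x', y') in terms of (x, y)
Q = np.array([[-1, -1], [-1, 1]])   # coefficients of (x'', y'') in terms of (x', y')

print(P @ Q)   # [[-2  0], [ 0 -2]], the array shown above
print(Q @ P)   # in this particular example the reverse order gives the same result
```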


Change of Axes

Consider the Cartesian coordinate system:

$C := \tuple {x, y, z}$

Let $\mathbf A$ denote the square matrix:

$\mathbf A = \begin {pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end {pmatrix}$

Then $\mathbf A$ has the effect of exchanging the $x$ and $y$ axes of $C$.
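
A small NumPy sketch (with an arbitrarily chosen point, assumed for illustration) makes the effect concrete:

```python
# Applying A to a point (x, y, z) swaps its first two coordinates.
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])

v = np.array([2, 5, 7])   # the point (x, y, z) = (2, 5, 7)
print(A @ v)              # [5 2 7]: the x and y coordinates are exchanged
```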


Column Matrix All $0$ Except for One $1$

Let $\mathbf A$ be a matrix of order $m \times n$.

For $1 \le i \le n$, let $\mathbf e_i$ be the column matrix of order $n \times 1$ defined as:

$e_k = \delta_{k i}$

where:

$e_k$ is the element of $\mathbf e_i$ whose indices are $\tuple {k, 1}$
$\delta_{k i}$ denotes the Kronecker delta.

Then $\mathbf A \mathbf e_i$ is the column matrix which is equal to the $i$th column of $\mathbf A$.
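
The following sketch (with a concrete $2 \times 3$ matrix and $i = 2$, both chosen here for illustration) shows the statement in action:

```python
# A e_i picks out the i-th column of A.
import numpy as np

A = np.array([[2, 1, 0],
              [3, 0, 7]])      # order m x n with m = 2, n = 3

i = 2                          # 1-based index, as in the statement
e_i = np.zeros((3, 1))         # column matrix of order n x 1
e_i[i - 1, 0] = 1              # e_k = delta_{k i}

print(A @ e_i)                 # [[1.], [0.]]: the 2nd column of A
```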


Arbitrary Matrices $1$

$\begin {bmatrix} 2 & 1 & 0 \\ 3 & 0 & 7 \end {bmatrix} \begin {bmatrix} 2 & 3 & 5 & 8 \\ 4 & 8 & 6 & 1 \\ -1 & 7 & 0 & 7 \end {bmatrix} = \begin {bmatrix} 8 & 14 & 16 & 17 \\ -1 & 58 & 15 & 73 \end {bmatrix}$
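
A NumPy check of this product (added here as an illustration) also makes the shapes explicit: a $2 \times 3$ matrix times a $3 \times 4$ matrix gives a $2 \times 4$ matrix:

```python
import numpy as np

A = np.array([[2, 1, 0], [3, 0, 7]])
B = np.array([[2, 3, 5, 8], [4, 8, 6, 1], [-1, 7, 0, 7]])

C = A @ B
print(C.shape)   # (2, 4)
print(C)         # [[ 8 14 16 17], [-1 58 15 73]]
```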


Arbitrary Matrices $2$

$\begin {bmatrix} 5 & 3 & 1 \end {bmatrix} \begin {bmatrix} 2 \\ 0 \\ 6 \end {bmatrix} = \begin {bmatrix} 16 \end {bmatrix}$
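
In NumPy terms (an illustration added here), a $1 \times 3$ matrix times a $3 \times 1$ matrix yields a $1 \times 1$ matrix rather than a bare scalar:

```python
import numpy as np

A = np.array([[5, 3, 1]])       # 1 x 3
B = np.array([[2], [0], [6]])   # 3 x 1

print(A @ B)   # [[16]]
```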


Arbitrary Matrices $3$

$\begin {bmatrix} 2 \\ 0 \\ 6 \end {bmatrix} \begin {bmatrix} 5 & 3 & 1 \end {bmatrix} = \begin {bmatrix} 10 & 6 & 2 \\ 0 & 0 & 0 \\ 30 & 18 & 6 \end {bmatrix}$
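
Multiplying the same two matrices in the opposite order (checked below with NumPy, as an added illustration) gives a $3 \times 3$ matrix, underlining that the conventional matrix product is not commutative:

```python
import numpy as np

A = np.array([[2], [0], [6]])   # 3 x 1
B = np.array([[5, 3, 1]])       # 1 x 3

print(A @ B)
# [[10  6  2]
#  [ 0  0  0]
#  [30 18  6]]
```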


Arbitrary Matrices $4$

$\begin {bmatrix} 1 & 3 & -2 \\ -2 & -6 & 4 \\ 4 & 12 & -8 \end {bmatrix} \begin {bmatrix} 3 & -1 & 2 \\ 3 & 5 & -4 \\ 6 & 7 & -5 \end {bmatrix} = \begin {bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end {bmatrix}$
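
This last example gives two nonzero matrices whose product is the zero matrix, so matrix multiplication admits zero divisors; a NumPy check (added here for illustration):

```python
import numpy as np

A = np.array([[1, 3, -2], [-2, -6, 4], [4, 12, -8]])
B = np.array([[3, -1, 2], [3, 5, -4], [6, 7, -5]])

print(A @ B)   # the 3 x 3 zero matrix
```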