Solution to Simultaneous Linear Equations

From ProofWiki

Theorem

Let $\ds \forall i \in \closedint 1 m: \sum _{j \mathop = 1}^n {\alpha_{i j} x_j} = \beta_i$ be a system of simultaneous linear equations

where all of $\alpha_{1 1}, \ldots, \alpha_{m n}, x_1, \ldots, x_n, \beta_1, \ldots, \beta_m$ are elements of a field $K$.


Then $x = \tuple {x_1, x_2, \ldots, x_n}$ is a solution of this system if and only if:

$\sqbrk \alpha_{m n} \sqbrk x_{n 1} = \sqbrk \beta_{m 1}$

where $\sqbrk \alpha_{m n}$ is an $m \times n$ matrix, $\sqbrk x_{n 1}$ is an $n \times 1$ column matrix and $\sqbrk \beta_{m 1}$ is an $m \times 1$ column matrix.


Proof

We can see the truth of this by writing the equations out in full.

$\ds \sum_{j \mathop = 1}^n {\alpha_{i j} x_j} = \beta_i$

can be written as:

\(\ds \alpha_{1 1} x_1 + \alpha_{1 2} x_2 + \ldots + \alpha_{1 n} x_n\) \(=\) \(\ds \beta_1\)
\(\ds \alpha_{2 1} x_1 + \alpha_{2 2} x_2 + \ldots + \alpha_{2 n} x_n\) \(=\) \(\ds \beta_2\)
\(\ds \) \(\vdots\) \(\ds \)
\(\ds \alpha_{m 1} x_1 + \alpha_{m 2} x_2 + \ldots + \alpha_{m n} x_n\) \(=\) \(\ds \beta_m\)

while $\sqbrk \alpha_{m n} \sqbrk x_{n 1} = \sqbrk \beta_{m 1}$ can be written as:

$\begin {bmatrix}
\alpha_{1 1} & \alpha_{1 2} & \cdots & \alpha_{1 n} \\
\alpha_{2 1} & \alpha_{2 2} & \cdots & \alpha_{2 n} \\
\vdots & \vdots & \ddots & \vdots \\
\alpha_{m 1} & \alpha_{m 2} & \cdots & \alpha_{m n}
\end {bmatrix} \begin {bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end {bmatrix} = \begin {bmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_m \end {bmatrix}$
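As a concrete numerical check (a minimal sketch using NumPy, not part of the original proof; the particular matrix and vector are made up for illustration), the row-by-row sums $\sum_j \alpha_{i j} x_j$ coincide with the entries of the matrix product:

```python
import numpy as np

# Hypothetical 2 x 3 coefficient matrix [alpha]_{m n} and candidate x.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, -1.0, 2.0])

# Row by row: beta_i = sum_{j=1}^{n} alpha_{i j} x_j
beta_rowwise = np.array([sum(A[i, j] * x[j] for j in range(A.shape[1]))
                         for i in range(A.shape[0])])

# Matrix form: [alpha]_{m n} [x]_{n 1}
beta_matrix = A @ x

# The two formulations agree entry by entry.
assert np.allclose(beta_rowwise, beta_matrix)
```

Either computation yields the same column of values $\beta_1, \ldots, \beta_m$, which is exactly the equivalence the proof asserts.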


So the question:

Find a solution to the following system of $m$ simultaneous linear equations in $n$ variables

is equivalent to:

Given the elements $\mathbf A \in \map {\MM_K} {m, n}$ and $\mathbf b \in \map {\MM_K} {m, 1}$, find the set of all $\mathbf x \in \map {\MM_K} {n, 1}$ such that $\mathbf A \mathbf x = \mathbf b$

where $\map {\MM_K} {m, n}$ is the $m \times n$ matrix space over $K$.
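In computational terms (a hypothetical illustration assuming $K$ is the field of real numbers, with a made-up square system), this reformulation is precisely what a linear solver operates on:

```python
import numpy as np

# Hypothetical square system A x = b over the reals:
#   2 x_1 +   x_2 =  5
#     x_1 + 3 x_2 = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Unique solution when A is invertible.
x = np.linalg.solve(A, b)

# The solution satisfies every equation of the original system.
assert np.allclose(A @ x, b)
```

For a non-square or singular $\mathbf A$ the solution set may be empty or infinite, which is why the problem is phrased as finding the *set* of all such $\mathbf x$.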

$\blacksquare$

