From ProofWiki


The complexity of an algorithm is defined as the number of discrete steps needed to terminate as a function of the number of inputs.

An algorithm may be analysed in the context of its complexity.

This is typically done by measuring its running time using big-$\OO$, big-$\Omega$ or $\Theta$ notation.

Complexity can be investigated through $3$ main types of analysis:

  1. Average-Case Analysis, using a distribution of problem instances to find the average run-time of the algorithm
  2. Best-Case Analysis, using the best possible problem instance of the algorithm for analysis
  3. Worst-Case Analysis, using the worst possible problem instance of the algorithm for analysis
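These three analyses can be illustrated with a toy example not taken from this article: counting the comparisons made by a linear search over a list of $n$ items. The helper below is hypothetical, used only to make the three cases concrete.

```python
def linear_search_steps(xs, target):
    """Return the number of comparisons a linear search makes
    before finding target (or exhausting the list)."""
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

xs = list(range(1, 101))  # n = 100

best = linear_search_steps(xs, 1)     # best case: target is first, 1 comparison
worst = linear_search_steps(xs, 0)    # worst case: target absent, n comparisons
# average case over the uniform distribution of present targets: (n + 1) / 2
average = sum(linear_search_steps(xs, t) for t in xs) / len(xs)
```

Here the best case is $1$ step, the worst case is $n$ steps, and the average over a uniform distribution of targets is $\dfrac {n + 1} 2$ steps; all three are functions of the number of inputs $n$, as in the definition above.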


Multiplication of Square Matrices

Consider two square matrices $\mathbf A$ and $\mathbf B$ of order $n$.

Consider the matrix multiplication operation:

$\ds \mathbf A \times \mathbf B := \forall i, j \in \closedint 1 n: c_{i j} = \sum_{k \mathop = 1}^n a_{i k} \circ b_{k j}$

This takes $2 n^3$ steps when using the usual method.
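The $2 n^3$ count can be checked directly: each of the $n^2$ entries $c_{i j}$ requires $n$ multiplications and $n$ additions. A minimal sketch (counting one multiplication and one addition per inner-loop pass, including the addition into the running sum):

```python
def mat_mul_counted(A, B):
    """Usual method of matrix multiplication for two n x n matrices,
    returning the product together with the number of arithmetic steps."""
    n = len(A)
    ops = 0
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0
            for k in range(n):
                s += A[i][k] * B[k][j]
                ops += 2  # one multiplication and one addition
            C[i][j] = s
    return C, ops  # ops equals 2 * n**3
```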

However, using the Strassen algorithm, this takes a number of steps proportional to $n^{\lg 7}$, where $\lg$ denotes logarithm base $2$.

As $\lg 7 < 3$, it follows that the Strassen algorithm uses fewer steps than the usual method, for sufficiently large $n$.
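The saving comes from replacing the $8$ sub-matrix products of the naive block decomposition with $7$, giving the recurrence $\map T n = 7 \map T {n / 2} + \OO {n^2}$ and hence $\OO {n^{\lg 7}}$. A sketch of the recursion, assuming for simplicity that $n$ is a power of $2$:

```python
def strassen(A, B):
    """Strassen's algorithm for two n x n matrices, n a power of 2."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    m = n // 2

    def quad(M):
        # split M into its four m x m quadrants
        return ([row[:m] for row in M[:m]], [row[m:] for row in M[:m]],
                [row[:m] for row in M[m:]], [row[m:] for row in M[m:]])

    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def sub(X, Y):
        return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    A11, A12, A21, A22 = quad(A)
    B11, B12, B21, B22 = quad(B)

    # the 7 recursive products, in place of the naive method's 8
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))

    # recombine the products into the quadrants of C
    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)

    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bottom = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bottom
```

The additions and subtractions cost only $\OO {n^2}$ per level, so the $7$ recursive calls dominate, giving the $n^{\lg 7} \approx n^{2.807}$ growth. In practice the recursion is cut off below some threshold and the usual method used for small blocks.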


The increasing use of computers for all sorts of automation projects requires that the algorithms used be as efficient as possible, consuming as few computer resources as possible.

The study of algorithmic complexity as a way of comparing algorithms has come into prominence as a result.

Also see

  • Results about algorithmic complexity can be found here.