# Definition:Algorithm/Analysis/Complexity

## Definition

The **complexity** of an algorithm is defined as the number of discrete steps needed for it to terminate, as a function of the number of its inputs.


An algorithm may be analysed in the context of its **complexity**.

This is typically done by measuring run-times using big-$\OO$, big-$\Omega$ or big-$\Theta$ notation.

Complexity can be assessed through $3$ main types of analysis:

- **Average-case analysis**: using a distribution over problem instances to find the average run-time of the algorithm
- **Best-case analysis**: using the best possible problem instance of the algorithm
- **Worst-case analysis**: using the worst possible problem instance of the algorithm
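As an illustration of these three modes of analysis (a hypothetical example, not drawn from the source), consider linear search over a list of $n$ elements: it makes $1$ comparison in the best case, $n$ in the worst case, and $\paren {n + 1} / 2$ on average for a uniformly random target that is present. A minimal Python sketch:

```python
def linear_search(xs, target):
    """Return (index of target in xs, number of comparisons made).

    Returns index -1 if the target is absent.
    """
    comparisons = 0
    for i, x in enumerate(xs):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

xs = list(range(1, 11))          # n = 10
# Best case: target is the first element -- 1 comparison
assert linear_search(xs, 1) == (0, 1)
# Worst case: target is absent -- n comparisons
assert linear_search(xs, 0) == (-1, 10)
```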

## Examples

### Multiplication of Square Matrices

Consider two square matrices $\mathbf A$ and $\mathbf B$ of order $n$.

Consider the matrix multiplication operation $\mathbf C = \mathbf A \times \mathbf B$, defined by:

- $\ds \forall i, j \in \closedint 1 n: c_{i j} = \sum_{k \mathop = 1}^n a_{i k} \circ b_{k j}$

This takes $2 n^3$ steps when using the usual method: each of the $n^2$ entries of the product requires $n$ multiplications and $n$ additions.
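The $2 n^3$ count can be checked with a direct Python sketch of the usual method (function and variable names here are illustrative): the inner loop body, which performs one multiplication and one addition, runs exactly $n^3$ times.

```python
def matmul_count(A, B):
    """Multiply square matrices A and B by the usual method,
    also returning the number of arithmetic steps performed."""
    n = len(A)
    steps = 0
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
                steps += 2   # one multiplication and one addition
    return C, steps

n = 4
I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
C, steps = matmul_count(I, I)
assert C == I                # identity times identity
assert steps == 2 * n ** 3   # 128 steps for n = 4
```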

However, using the Strassen algorithm, this takes a number of steps proportional to $n^{\lg 7}$, where $\lg$ denotes logarithm base $2$.

As $\lg 7 < 3$, it follows that the Strassen algorithm uses fewer steps than the usual method, for sufficiently large $n$.
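As a quick numerical sanity check (Python, assuming nothing beyond the exponents above and ignoring constant factors): the ratio $n^{\lg 7} / n^3 = n^{\lg 7 - 3}$ tends to $0$ as $n$ grows, which is why the Strassen count is eventually smaller.

```python
import math

lg7 = math.log2(7)   # the Strassen exponent, approximately 2.807
assert lg7 < 3

# n**lg7 / n**3 = n**(lg7 - 3) shrinks as n grows:
ratios = [n ** (lg7 - 3) for n in (10, 100, 1000)]
assert ratios[0] > ratios[1] > ratios[2]
```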

## Motivation

The increase in use of computers for all sorts of automation projects requires that the algorithms used are as efficient as possible, using as few computer resources as possible.

The study of **algorithmic complexity** as a way of comparing algorithms has come into prominence as a result.

## Also see

- Results about **algorithmic complexity** can be found here.

## Sources

- 1998: David Nelson: *The Penguin Dictionary of Mathematics* (2nd ed.): **complexity** (of an algorithm)
- 2008: David Nelson: *The Penguin Dictionary of Mathematics* (4th ed.): **complexity** (of an algorithm)
- 2014: Christopher Clapham and James Nicholson: *The Concise Oxford Dictionary of Mathematics* (5th ed.): **algorithmic complexity**
- 2021: Richard Earl and James Nicholson: *The Concise Oxford Dictionary of Mathematics* (6th ed.): **algorithmic complexity**