Definition:O Notation


Definition

$\OO$ notation is a type of order notation, typically used in computer science for comparing the 'run-times' of algorithms, or in analysis for comparing the growth rates of two functions.


Big-$\OO$ Notation

Let $f$ and $g$ be real functions defined on a neighborhood of $+ \infty$ in $\R$.


The statement:

$\map f x = \map \OO {\map g x}$ as $x \to \infty$

is equivalent to:

$\exists c \in \R_{\ge 0}: \exists x_0 \in \R: \forall x \in \R: \paren {x \ge x_0 \implies \size {\map f x} \le c \cdot \size {\map g x} }$


That is:

$\size {\map f x} \le c \cdot \size {\map g x}$

for $x$ sufficiently large.


This statement is voiced $f$ is big-$\OO$ of $g$ or simply $f$ is big-$\OO$ $g$.
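
For example, $3 x^2 + 5 x = \map \OO {x^2}$ as $x \to \infty$: taking $c = 4$ and $x_0 = 5$, for all $x \ge 5$ we have $5 x \le x^2$ and hence:

$\size {3 x^2 + 5 x} \le 4 \cdot \size {x^2}$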


Little-$\oo$ Notation

Let $f$ and $g$ be real functions defined on a neighborhood of $+ \infty$ in $\R$, and let $\map g x \ne 0$ for $x$ sufficiently large.


$f$ is little-$\oo$ of $g$, written $\map f x = \map \oo {\map g x}$ as $x \to \infty$, if and only if:

$\ds \lim_{x \mathop \to \infty} \frac {\map f x} {\map g x} = 0$
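
For instance, $x = \map \oo {x^2}$ as $x \to \infty$, since:

$\ds \lim_{x \mathop \to \infty} \frac x {x^2} = \lim_{x \mathop \to \infty} \frac 1 x = 0$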


Also known as

The $\OO$ and $\oo$ used in $\OO$ notation can be referred to as the Landau symbols.

