- Asymptotic notation
- Standard notation and common functions

For a given function $g(n)$, we denote by $O(g(n))$ the set of functions $O(g(n)) = \{f(n)$: there exist positive constants $c$ and $n_0$ such that $0\le f(n)\le cg(n)$ for all $n\ge n_0\}$. Less formally, this means that for all sufficiently large $n$, the running time (space requirement, etc.) of the algorithm is at most $g(n)$ multiplied by some constant. Even less formally, if $f(n)=O(n^2)$ then we can say that $f(n)$ runs no slower than $n^2$.

- Given $f(n)=3n+5$, then $f(n)=O(n)$.
- Given $f(n)={n}^{3}$, then $f(n)\ne O({n}^{2})$.
- Given $f(n)={n}^{2}$, then $f(n)=O({n}^{3}-7{n}^{2}+3)$.
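The first example above can be sanity-checked numerically by exhibiting one valid pair of witnesses from the definition. The constants $c=4$ and $n_0=5$ below are an illustrative choice (one among many), since $3n+5\le 4n$ exactly when $n\ge 5$:

```python
# f(n) = 3n + 5 is O(n): exhibit witness constants c and n0.
def f(n):
    return 3 * n + 5

c, n0 = 4, 5  # one valid choice: 3n + 5 <= 4n holds iff n >= 5
assert all(0 <= f(n) <= c * n for n in range(n0, 10_000))
```

Note that the bound may fail below $n_0$ (here $f(4)=17>16=4c$), which is exactly why the definition only demands it for all $n\ge n_0$.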

For a given function $g(n)$, we denote by $\Omega (g(n))$ the set of functions $\Omega (g(n)) = \{f(n)$: there exist positive constants $c$ and $n_0$ such that $0\le cg(n)\le f(n)$ for all $n\ge n_0\}$. Less formally, this means that for all sufficiently large $n$, the running time (space requirement, etc.) of the algorithm is at least $g(n)$ multiplied by some constant. Even less formally, if $f(n)=\Omega (n^2)$ then we can say that $f(n)$ runs no faster than $n^2$.

- Given $f(n)=3n+5$, then $f(n)=\Omega (n)$.
- Given $f(n)={n}^{2}$, then $f(n)\ne \Omega ({n}^{3})$.
- Given $f(n)={n}^{4}$, then $f(n)=\Omega ({n}^{3}-7{n}^{2}+3)$.
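The same kind of witness check works for the first $\Omega$ example; a sketch, where $c=3$ and $n_0=1$ are one illustrative choice of constants:

```python
# f(n) = 3n + 5 is Omega(n): exhibit witness constants c and n0.
def f(n):
    return 3 * n + 5

c, n0 = 3, 1  # one valid choice: 3n + 5 >= 3n for all n >= 1
assert all(0 <= c * n <= f(n) for n in range(n0, 10_000))
```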

For a given function, $g(n)$, we denote by $\Theta (g(n))$ the set of functions $\Theta (g(n))$ $=$ { $f(n)$: there exist positive constants ${c}_{1}$, ${c}_{2}$, and ${n}_{0}$ such that $0\le {c}_{1}g(n)\le f(n)\le {c}_{2}g(n)$ for all $n\ge {n}_{0}$}.

- Given $f(n)=\frac{1}{2}{n}^{2}-3n+7$, then $f(n)=\Theta ({n}^{2})$.
- Given $f(n)=\sum _{i=1}^{n}i$, then $f(n)=\Theta ({n}^{2})$.
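The second example can be checked directly, since $\sum_{i=1}^{n} i = n(n+1)/2$. The witnesses $c_1=\frac{1}{2}$, $c_2=1$, $n_0=1$ below are one illustrative choice, using $n^2/2 \le n(n+1)/2 \le n^2$ for all $n\ge 1$:

```python
# f(n) = sum of 1..n is Theta(n^2): exhibit witnesses c1, c2, n0.
def f(n):
    return sum(range(1, n + 1))  # equals n(n+1)/2

c1, c2, n0 = 0.5, 1, 1  # n^2/2 <= n(n+1)/2 <= n^2 for n >= 1
assert all(0 <= c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 2000))
```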

For a given function $g(n)$, we denote by $o(g(n))$ the set of functions $o(g(n)) = \{f(n)$: for any positive constant $c>0$, there exists a constant $n_0>0$ such that $0\le f(n)\le cg(n)$ for all $n\ge n_0\}$. The implication of this is that $f(n)$ becomes insignificant relative to $g(n)$ as $n$ approaches infinity:

$\underset{n\to \infty}{\lim}\frac{f(n)}{g(n)}=0$

For example, $2n=o({n}^{2})$ but $2{n}^{2}\ne o({n}^{2})$.
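The limit characterization can be illustrated numerically for this example: the ratio $2n/n^2 = 2/n$ shrinks toward $0$ as $n$ grows (a sketch only; finitely many ratios cannot prove a limit):

```python
# Ratios f(n)/g(n) for f(n) = 2n, g(n) = n^2 shrink toward 0.
ratios = [2 * n / n**2 for n in (10, 100, 1000, 10000)]
print(ratios)  # → [0.2, 0.02, 0.002, 0.0002]
```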

For a given function $g(n)$, we denote by $\omega (g(n))$ the set of functions $\omega (g(n)) = \{f(n)$: for any positive constant $c>0$, there exists a constant $n_0>0$ such that $0\le cg(n)\le f(n)$ for all $n\ge n_0\}$. The implication of this is that $g(n)$ becomes insignificant relative to $f(n)$ as $n$ approaches infinity:

$\underset{n\to \infty}{\lim}\frac{f(n)}{g(n)}=\infty$

For example $\frac{{n}^{2}}{2}=\omega (n)$ but $\frac{{n}^{2}}{2}\ne \omega ({n}^{2})$.
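Dually, for this $\omega$ example the ratio $\frac{n^2/2}{n} = n/2$ grows without bound (again only a numerical sketch of the limit):

```python
# Ratios f(n)/g(n) for f(n) = n^2/2, g(n) = n grow without bound.
ratios = [(n**2 / 2) / n for n in (10, 100, 1000, 10000)]
print(ratios)  # → [5.0, 50.0, 500.0, 5000.0]
```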
