We wish to develop a convention for classifying the runtime (i.e. the asymptotic behavior) of a given algorithm that runs in *g*(*n*) time (effort). Below is a set of definitions for capturing the runtime of an algorithm in terms of a function to which we compare it.

Let *f*(*n*) and *g*(*n*) be functions that map positive integers to positive real numbers. Let *g*(*n*) be a function that expresses the runtime of some algorithm for an input size of *n*. Let *f*(*n*) be a function against which we wish to compare the runtime *g*(*n*).

Below is a table for classifying *g*(*n*) as compared to some function *f*(*n*). The graph is meant to give some intuition into how a certain *f*(*n*) relates to *g*(*n*) under these classifications. This is not a formal graph.

Sometimes we wish to say that an algorithm with a runtime of *g*(*n*) takes *no more time* than a certain amount. It could take less time, or the same time, but not more. A function *f*(*n*) can be selected so that it provides an *upper bound* on the algorithm's runtime *g*(*n*). We use Big-O for this.

Say we have an algorithm with a running time of *g*(*n*). If there is another function *f*(*n*) such that for some constant *c* ∈ ℝ⁺, and for all *n* > *N* ∈ ℤ⁺, *c*⋅*f*(*n*) > *g*(*n*), we say that *g*(*n*) = O(*f*(*n*)).

So we say the **run time** *g*(*n*) grows no faster than *f*(*n*), or *g*(*n*) = O(*f*(*n*)).

Formally: for functions *f*(*n*) and *g*(*n*), *g*(*n*) = O(*f*(*n*)) if ∃ *N*, *c* > 0 such that *g*(*n*) < *c*⋅*f*(*n*) ∀ *n* > *N*.

Alternately: O(*f*(*n*)) = {*g*(*n*) : ∃ *c* ∈ ℝ, *c* ≥ 0, such that lim_{n→∞} *g*(*n*)/*f*(*n*) = *c*}

So O(*f*(*n*)) is the set of all functions *g*(*n*) s.t. *g*(*n*) grows slower than, or at the same rate as, *f*(*n*) as *n* goes to ∞.

Thus we say that the running time is "big-O of *f*(*n*)" or just "O of *f*(*n*)."

We use big-O notation for **asymptotic upper bounds**, since it bounds the growth of the running time from above for large enough input sizes.
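The definition above can be checked numerically for a concrete pair of functions. As an illustrative sketch (a sanity check over a finite range, not a proof), suppose an algorithm's runtime is *g*(*n*) = 3*n*² + 10*n* and we claim *g*(*n*) = O(*n*²); the constants *c* = 4 and *N* = 10 are example choices that happen to work, since 10*n* ≤ *n*² once *n* ≥ 10:

```python
def g(n):          # the algorithm's runtime (illustrative choice)
    return 3 * n**2 + 10 * n

def f(n):          # the comparison function
    return n**2

# With c = 4 and N = 10, c*f(n) > g(n) for all n > N,
# so g(n) = O(f(n)) per the definition above.
c, N = 4, 10
print(all(c * f(n) > g(n) for n in range(N + 1, 5000)))  # True
```

Any larger *c* (or larger *N*) would also witness the bound; Big-O only requires that *some* such pair of constants exists.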

Sometimes we wish to say that an algorithm with a runtime of *g*(*n*) takes *no less time* than a certain amount. It could take more time, or the same time, but not less. A function *f*(*n*) can be selected so that it provides a *lower bound* on the algorithm's runtime *g*(*n*). We use Big-Ω for this.

Say we have an algorithm with a running time of *g*(*n*). If there is another function *f*(*n*) such that for some constant *c* ∈ ℝ⁺, and for all *n* > *N* ∈ ℤ⁺, *c*⋅*f*(*n*) < *g*(*n*), we say that *g*(*n*) = Ω(*f*(*n*)).

So we say the **run time** *g*(*n*) grows no slower than *f*(*n*), or *g*(*n*) = Ω(*f*(*n*)).

Formally: Ω(*f*(*n*)) = {*g*(*n*) : ∃ *N*, *c* > 0 such that *g*(*n*) > *c*⋅*f*(*n*) ∀ *n* > *N*}

Alternately: Ω(*f*(*n*)) = {*g*(*n*) : lim_{n→∞} *g*(*n*)/*f*(*n*) = *c* where *c* > 0, or the limit is ∞}

We use big-Ω notation for **asymptotic lower bounds**, since it bounds the growth of the running time from below for large enough input sizes.
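The lower bound can be sanity-checked the same way. Continuing the illustrative example *g*(*n*) = 3*n*² + 10*n*, the claim *g*(*n*) = Ω(*n*²) is witnessed by the example constants *c* = 3 and *N* = 1, since the extra 10*n* term keeps *g*(*n*) strictly above 3*n*²:

```python
def g(n):          # the algorithm's runtime (illustrative choice)
    return 3 * n**2 + 10 * n

def f(n):          # the comparison function
    return n**2

# With c = 3 and N = 1, c*f(n) < g(n) for all n > N,
# so g(n) = Ω(f(n)) per the definition above.
c, N = 3, 1
print(all(c * f(n) < g(n) for n in range(N + 1, 5000)))  # True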

Sometimes we wish to classify an algorithm's **running time** very closely. That is, we want to find a simplified function that can provide both an **upper bound** and a **lower bound** at the same time.

Another way of saying this: we want to find an *f*(*n*) for an algorithm with runtime *g*(*n*) such that *g*(*n*) = O(*f*(*n*)) and *g*(*n*) = Ω(*f*(*n*)) are **both** true.

Let's say that we have an algorithm with a runtime of *g*(*n*) and a function *f*(*n*) where, for sufficiently large *n*, there exist two constants *c*_{1} and *c*_{2} such that *g*(*n*) is between *c*_{1}⋅*f*(*n*) and *c*_{2}⋅*f*(*n*). Then *g*(*n*) = Θ(*f*(*n*)).

Formally: Θ(*f*(*n*)) = {*g*(*n*) : *g*(*n*) = O(*f*(*n*)) and *g*(*n*) = Ω(*f*(*n*))}

Alternately: Θ(*f*(*n*)) = {*g*(*n*) : ∃ *c* > 0 s.t. lim_{n→∞} *g*(*n*)/*f*(*n*) = *c*}
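Both bounds hold simultaneously for the running example: *g*(*n*) = 3*n*² + 10*n* is Θ(*n*²), sandwiched between *c*_{1}⋅*f*(*n*) and *c*_{2}⋅*f*(*n*) using the same illustrative constants as before (*c*_{1} = 3, *c*_{2} = 4, *N* = 10), and the ratio *g*(*n*)/*f*(*n*) settles toward the constant 3:

```python
def g(n):          # the algorithm's runtime (illustrative choice)
    return 3 * n**2 + 10 * n

def f(n):          # the comparison function
    return n**2

# Sandwich check: c1*f(n) <= g(n) <= c2*f(n) for all n > N.
c1, c2, N = 3, 4, 10
print(all(c1 * f(n) <= g(n) <= c2 * f(n) for n in range(N + 1, 5000)))  # True

# The limit form: g(n)/f(n) = 3 + 10/n approaches the constant c = 3.
print(g(10**6) / f(10**6))  # very close to 3
```

Note how dropping the lower-order term 10*n* and the leading coefficient 3 is exactly what Θ-classification justifies: 3*n*² + 10*n* is Θ(*n*²).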

Big-O is typically used as a tight upper bound on the growth of an algorithm's effort even though, as written, it can also be a loose upper bound. Little-o (o()) notation is used to describe an upper bound that cannot be tight: *f*(*n*) is a strictly faster-growing function than *g*(*n*).

**Definition:** Let *f*(*n*) and *g*(*n*) be functions that map positive integers to positive real numbers. We say that *g*(*n*) is o(*f*(*n*)) (or *g*(*n*) ∈ o(*f*(*n*))) if for **every real** constant *c* > 0, there exists an integer constant *n*_{0} ≥ 1 such that 0 ≤ *g*(*n*) < *c*⋅*f*(*n*) for all *n* ≥ *n*_{0}.

Alternately: o(*f*(*n*)) = {*g*(*n*) : lim_{n→∞} *g*(*n*)/*f*(*n*) = 0}

That is, little-o denotes a **strict (loose) upper bound**: *g*(*n*) grows strictly slower than *f*(*n*).
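The limit form of little-o can be observed numerically. As an illustrative example, *g*(*n*) = *n* log *n* is o(*n*²): the ratio *g*(*n*)/*f*(*n*) = (log *n*)/*n* shrinks toward 0 as *n* grows, so *no* constant multiple of *g*(*n*) ever catches up to *n*²:

```python
import math

def g(n):          # illustrative runtime: n * log(n)
    return n * math.log(n)

def f(n):          # the comparison function: n^2
    return n**2

# Sample the ratio g(n)/f(n) = log(n)/n at n = 10, 100, ..., 10^6.
ratios = [g(10**k) / f(10**k) for k in range(1, 7)]
print(ratios)  # strictly decreasing toward 0
```

Contrast this with the Θ example above, where the ratio settled at a positive constant; here it vanishes, which is what makes the upper bound strict.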

Big-Ω is used as a tight lower bound on the growth of an algorithm's effort (this effort is described by the function *g*(*n*)), even though, as written, it can also be a loose lower bound. Little-ω (ω()) notation is used to describe a lower bound that cannot be tight.

**Definition:** Let *f*(*n*) and *g*(*n*) be functions that map positive integers to positive real numbers. We say that *g*(*n*) is ω(*f*(*n*)) (or *g*(*n*) ∈ ω(*f*(*n*))) if for **every real** constant *c* > 0, there exists an integer constant *n*_{0} ≥ 1 such that *g*(*n*) > *c*⋅*f*(*n*) for all *n* > *n*_{0}.

Alternately: ω(*f*(*n*)) = {*g*(*n*) : lim_{n→∞} *g*(*n*)/*f*(*n*) = ∞}

That is, little-ω denotes a **strict (loose) lower bound**: *g*(*n*) grows strictly faster than *f*(*n*).
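Little-ω is the mirror image of the little-o example above. Reversing the same illustrative pair, *g*(*n*) = *n*² is ω(*n* log *n*): the ratio *g*(*n*)/*f*(*n*) = *n*/(log *n*) grows without bound, so *g*(*n*) eventually exceeds every constant multiple of *f*(*n*):

```python
import math

def g(n):          # illustrative runtime: n^2
    return n**2

def f(n):          # the comparison function: n * log(n)
    return n * math.log(n)

# Sample the ratio g(n)/f(n) = n/log(n) at n = 10, 100, ..., 10^6.
ratios = [g(10**k) / f(10**k) for k in range(1, 7)]
print(ratios)  # strictly increasing without bound
```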


Copyright © 2008-2019 by the contributing authors. All material on this collaboration platform is the property of the contributing authors.
