Asymptotic Notation: Big-O, little-o, θ, Ω, ω

We wish to develop a convention for classifying the runtime (i.e. the asymptotic behavior) of a given algorithm that runs in g(n) time (effort). Below is a set of definitions for capturing the runtime of an algorithm in terms of a function against which we will compare it.

Let ƒ(n) and g(n) be functions that map positive integers to positive real numbers. Let g(n) be a function that expresses the runtime of some algorithm for an input size of n. Let ƒ(n) be a function against which we wish to compare the runtime g(n).

Below is a table for classifying g(n) as compared to some function ƒ(n). The graph is meant to give some intuition into how a certain ƒ(n) relates to g(n) under these classifications. This is not a formal graph.

ComparAA.png

Big-O (asymptotic upper bound)

Sometimes we wish to say that an algorithm with a run time of f(n) takes no more than a certain amount of time. It could take less time, or the same time, but not more. A function g(n) can be selected so that it provides an upper bound on f(n). We use Big-O for this.

Say we have an algorithm with a running time of f(n). If there is another function g(n) such that for some constants c > 0 and N∈ℤ+, c⋅g(n) > f(n) for all n > N, we say that f(n) = O(g(n)):

BigOGraph.png

So we say the run time f(n) is at least as good as g(n); that is, f(n) = O(g(n)).

Formally: for functions ƒ(n) and g(n), ƒ(n) = O(g(n)) if ∃ N, c > 0 such that ƒ(n) < c⋅g(n) ∀ n > N

Alternately: O(ƒ(n)) = {g(n) : ∃ c∈ℝ s.t. limn→∞ g(n)/ƒ(n) = c}

So O(ƒ(n)) is the set of all functions g(n) s.t. g(n) grows slower than or at the same rate as ƒ(n) as n goes to ∞.

Thus we say that the running time is "big-O of ƒ(n)" or just "O of ƒ(n)."

We use big-O notation for asymptotic upper bounds, since it bounds the growth of the running time from above for large enough input sizes.
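As a concrete illustration (the functions and constants here are chosen for the example, not taken from any particular algorithm), take f(n) = 3n² + 5n and g(n) = n². With the witnesses c = 4 and N = 5 we have 3n² + 5n < 4n² exactly when 5n < n², i.e. when n > 5. The sketch below checks the inequality numerically over a finite range; this is a sanity check, not a proof:

```python
# Numeric sanity check (not a proof) that f(n) = 3n^2 + 5n is O(n^2).
# The witnesses c = 4 and N = 5 are chosen for illustration:
# 3n^2 + 5n < 4n^2 exactly when 5n < n^2, i.e. when n > 5.
def f(n):
    return 3 * n * n + 5 * n

def g(n):
    return n * n

c, N = 4, 5
# The definition requires c*g(n) > f(n) for ALL n > N; we can only
# sample finitely many n here.
assert all(c * g(n) > f(n) for n in range(N + 1, 10_000))
```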

Big-Ω (asymptotic lower bound)

Sometimes we wish to say that an algorithm with a run time of f(n) takes no less than a certain amount of time. It could take more time, or the same time, but not less. A function g(n) can be selected so that it provides a lower bound on f(n). We use Big-Ω for this.

Say we have an algorithm with a running time of f(n). If there is another function g(n) such that for some constants c > 0 and N∈ℤ+, c⋅g(n) < f(n) for all n > N, we say that f(n) = Ω(g(n)):

BigOmega.png

So we say the run time of f(n) is no better than g(n), or f(n) = Ω(g(n)).

Formally: Ω(ƒ(n)) = {g(n) : ∃N, c > 0 ∀n>N g(n) > cƒ(n)}

Also: Ω(ƒ(n)) = {g(n) : ∃ c > 0 s.t. limn→∞ g(n)/ƒ(n) = c or ∞}

We use big-Ω notation for asymptotic lower bounds, since it bounds the growth of the running time from below for large enough input sizes.
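Continuing the same illustrative example, f(n) = 3n² + 5n is also Ω(n²): with the witness c = 3 we have 3n² + 5n > 3n² for every n ≥ 1. Again, the check below only samples a finite range:

```python
# Numeric sanity check (not a proof) that f(n) = 3n^2 + 5n is Omega(n^2).
# The witnesses c = 3 and N = 0 are chosen for illustration:
# 3n^2 + 5n > 3n^2 holds for every n >= 1.
def f(n):
    return 3 * n * n + 5 * n

def g(n):
    return n * n

c, N = 3, 0
assert all(f(n) > c * g(n) for n in range(N + 1, 10_000))
```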

Big-θ (asymptotic tight bound)

Sometimes we wish to classify an algorithm's running time very closely. That is, we want to find a simplified function that provides both an upper bound and a lower bound at the same time.

Another way of saying this: we want to find an ƒ(n) for an algorithm with runtime g(n) such that g(n) = O(ƒ(n)) and g(n) = Ω(ƒ(n)) are both true.

Let's say that we have an algorithm with a runtime of g(n) and a function ƒ(n) where, for sufficiently large n, there exist two constants c1 and c2 such that g(n) is between c1⋅ƒ(n) and c2⋅ƒ(n). Then g(n) = Θ(ƒ(n)).

BigTheta.png

Formally: Θ(ƒ(n)) = {g(n): g(n)=O(ƒ(n)) and g(n)=Ω(ƒ(n))}

Alternative: Θ(ƒ(n)) = {g(n) : ∃ c > 0 s.t. limn→∞ g(n)/ƒ(n) = c}
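Putting the two previous illustrative bounds together: g(n) = 3n² + 5n is Θ(n²), since it is sandwiched between c1⋅n² and c2⋅n² (with the example constants c1 = 3 and c2 = 4) for all n > 5. A finite numeric check:

```python
# Sanity check (not a proof) that g(n) = 3n^2 + 5n is Theta(n^2):
# it is sandwiched between c1*f(n) and c2*f(n) for all n > 5,
# with the illustrative constants c1 = 3 and c2 = 4.
def g(n):
    return 3 * n * n + 5 * n

def f(n):
    return n * n

c1, c2, N = 3, 4, 5
assert all(c1 * f(n) < g(n) < c2 * f(n) for n in range(N + 1, 10_000))
```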

Little-o (non-tight upper bound)

Big-O is typically used as a tight upper bound on the growth of an algorithm's effort even though, as written, it can also be a loose upper bound. "Little-o" (o()) notation is used to describe an upper bound that cannot be tight: ƒ(n) grows strictly faster than g(n).

Definition: Let ƒ(n) and g(n) be functions that map positive integers to positive real numbers. We say that g(n) is o(ƒ(n)) (or g(n) ∈ o(ƒ(n))) if for any real constant c > 0, there exists an integer constant n0 ≥ 1 such that 0 ≤ g(n) < c⋅ƒ(n) for all n ≥ n0.

Formally: o(ƒ(n)) = {g(n) : limn→∞ g(n)/ƒ(n) = 0}

So little-o gives a strict (never tight) upper bound: g(n) = o(ƒ(n)) means g(n) grows strictly slower than ƒ(n).
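For an illustrative example, g(n) = n is o(n²): no matter how small we make c, the inequality g(n) < c⋅ƒ(n) eventually holds, since n < c⋅n² exactly when n > 1/c. The sketch below samples a few values of c and checks a finite range past the corresponding n0:

```python
# Sanity check (not a proof) that g(n) = n is o(n^2): for each sampled
# c > 0 there is an n0 (here n0 = floor(1/c) + 1) past which g(n) < c*f(n),
# because n < c*n^2 exactly when n > 1/c.
def g(n):
    return n

def f(n):
    return n * n

for c in (1.0, 0.1, 0.001):
    n0 = int(1 / c) + 1
    assert all(g(n) < c * f(n) for n in range(n0, n0 + 1000))
```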

Little-ω (non-tight lower bound)

Big-Ω is used as a tight lower bound on the growth of an algorithm's effort (this effort is described by the function ƒ(n)), even though, as written, it can also be a loose lower bound. "Little-ω" (ω()) notation is used to describe a lower bound that cannot be tight.

Definition: Let ƒ(n) and g(n) be functions that map positive integers to positive real numbers. We say that g(n) is ω(ƒ(n)) (or g(n) ∈ ω(ƒ(n))) if for any real constant c > 0, there exists an integer constant n0 ≥ 1 such that g(n) > c⋅ƒ(n) for all n > n0.

Formally: ω(ƒ(n)) = {g(n) : limn→∞ g(n)/ƒ(n) = ∞}

So little-ω gives a strict (never tight) lower bound: g(n) = ω(ƒ(n)) means g(n) grows strictly faster than ƒ(n).
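Mirroring the little-o example, g(n) = n² is ω(n): no matter how large we make c, the inequality g(n) > c⋅ƒ(n) eventually holds, since n² > c⋅n exactly when n > c. A finite numeric check over a few sampled values of c:

```python
# Sanity check (not a proof) that g(n) = n^2 is omega(n): for each sampled
# c > 0 there is an n0 (here n0 = floor(c) + 1) past which g(n) > c*f(n),
# because n^2 > c*n exactly when n > c.
def g(n):
    return n * n

def f(n):
    return n

for c in (1.0, 10.0, 1000.0):
    n0 = int(c) + 1
    assert all(g(n) > c * f(n) for n in range(n0, n0 + 1000))
```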

Topic revision: r9 - 2019-02-22 - JimSkon