Asymptotic Notation
CSE-250 Fall 2022 - Section B
Sept 7 and 12, 2022
Textbook: Ch. 7.3-7.4
When is an algorithm "fast"?
Growth Functions
f(n)
- n: The "size" of the input
- e.g., the number of users, rows of data, etc...
- f(n): The number of "steps" taken for an input of size n
- e.g., 20 steps per user gives f(n) = 20·n (with n = |Users|)
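A minimal Scala sketch of this idea (the function name and body are invented for illustration): if handling one user costs about 20 primitive steps, the total work grows as f(n) = 20·n.

// Hypothetical: each user costs ~20 primitive "steps",
// so the total is f(n) = 20·n where n = users.size
def notifyAll(users: Seq[String]): Unit = {
  for(user <- users) {
    println(s"notifying $user")  // stands in for ~20 steps of real work
  }
}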
Growth Function Assumptions
- Problem sizes are non-negative integers
- n ∈ Z⁺ ∪ {0}
- We can't reverse time
- f(n)≥0
- Smaller problems aren't harder than bigger problems
- For any n₁ < n₂, f(n₁) ≤ f(n₂)
To make the math simpler, we'll allow fractional steps.
... but f₁(n) = 20n ≢ f₂(n) = 19n
Idea: Organize growth functions into complexity classes.
Asymptotic Analysis @ 5000 feet
Case 1:
lim_{n→∞} f(n)/g(n) = ∞
(f(n) is "bigger"; g(n) is the better runtime on larger data)
Case 2:
lim_{n→∞} f(n)/g(n) = 0
(g(n) is "bigger"; f(n) is the better runtime on larger data)
Case 3:
lim_{n→∞} f(n)/g(n) = some (non-zero) constant
(f(n), g(n) "behave the same" on larger data)
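One informal way to guess which case applies: evaluate f(n)/g(n) at growing n and watch the trend. A small Scala sketch (an eyeball check, not a proof):

// Print f(n)/g(n) at growing n to eyeball its limiting behavior
def ratioTrend(f: Double => Double, g: Double => Double): Unit = {
  for(k <- 1 to 6) {
    val n = math.pow(10, k)
    println(s"n = $n: f(n)/g(n) = ${ f(n) / g(n) }")
  }
}
// ratioTrend(n => n * n, n => n)        grows without bound: Case 1
// ratioTrend(n => 20 * n, n => 19 * n)  settles near 20/19:  Case 3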
Big-Theta
The following are all saying the same thing
- lim_{n→∞} f(n)/g(n) = some non-zero constant.
- f(n) and g(n) have the same complexity.
- f(n) and g(n) are in the same complexity class.
- f(n)∈Θ(g(n))
Big-Theta (As a Limit)
f(n)∈Θ(g(n)) iff...
0 < lim_{n→∞} f(n)/g(n) < ∞
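For example, revisiting f₁(n) = 20n and f₂(n) = 19n from before:
lim_{n→∞} 20n / 19n = 20/19, a non-zero constant,
so 20n ∈ Θ(19n): both are in the class Θ(n).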
Big-Theta
Θ(g(n)) is the set of functions in the same complexity class as g(n)
People sometimes write f(n)=Θ(g(n)) when they mean f(n)∈Θ(g(n))
Symmetric: f(n)∈Θ(g(n)) is the same as g(n)∈Θ(f(n))
If you can shift/stretch g(n) to match f(n), they're in the same class.
... but don't think of it as an exact match. Instead, think of g(n) as a bound:
Can you bound f(n) by shifting/stretching g(n)?
Big-Theta
The following are all saying the same thing
- lim_{n→∞} f(n)/g(n) = some non-zero constant.
- f(n) and g(n) have the same complexity.
- f(n) and g(n) are in the same complexity class.
- f(n)∈Θ(g(n))
- f(n) is bounded from above and below by constant multiples of g(n)
Big-Theta (As a Bound)
f(n)∈Θ(g(n)) iff...
- ∃ c_low, n₀ s.t. ∀ n > n₀, f(n) ≥ c_low · g(n)
- There is some c_low that we can multiply g(n) by so that f(n) is always bigger than c_low · g(n) for values of n above some n₀
- ∃ c_high, n₀ s.t. ∀ n > n₀, f(n) ≤ c_high · g(n)
- There is some c_high that we can multiply g(n) by so that f(n) is always smaller than c_high · g(n) for values of n above some n₀
Proving Big-Theta (Without Limits)
- Assume f(n) ≥ c_low · g(n).
- Rewrite the above formula to find a c_low for which it holds (for big enough n).
- Assume f(n) ≤ c_high · g(n).
- Rewrite the above formula to find a c_high for which it holds (for big enough n).
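A worked instance (using the first of the Examples coming up): take f(n) = 2n² + 4n and g(n) = n². One possible choice of constants:
- Lower bound: 2n² + 4n ≥ 2·n² for all n ≥ 0, so c_low = 2 works (with n₀ = 0).
- Upper bound: 4n ≤ n² whenever n ≥ 4, so 2n² + 4n ≤ 3·n² for all n ≥ 4, and c_high = 3 works (with n₀ = 4).
Together: 2n² + 4n ∈ Θ(n²).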
Tricks
If f(n) ≥ g′(n) and g′(n) ≥ g(n) then f(n) ≥ g(n)
Lesson: To show f(n) ≥ c·g(n), you can instead show:
- f(n) ≥ c·g′(n)
- c·g′(n) ≥ c·g(n)
Tricks
If f(n) ≥ g(n) and f′(n) ≥ g′(n) then f(n) + f′(n) ≥ g(n) + g′(n)
Lesson: To show f(n) + f′(n) ≥ c·g(n) + c′·g′(n), you can instead show:
- f(n) ≥ c·g(n)
- f′(n) ≥ c′·g′(n)
Tricks
- log(n) ≥ c (for any n ≥ 2^c)
- n ≥ log(n) for any n ≥ 1
- n² ≥ n for any n ≥ 1
- 2^n ≥ n^c for sufficiently large n
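For instance, the first trick with c = 10 (taking log base 2): for any n ≥ 2^10 = 1024, we have log(n) ≥ 10.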
Examples
2n² + 4n ∈ Θ(n²)?
2n² + 4n ∈ Θ(n)?
1000·n·log(n) + 5n ∈ Θ(n·log(n))?
Shortcut: Find the dominant term being summed, and remove constants.
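Applying the shortcut to the last example: the dominant term of 1000·n·log(n) + 5n is 1000·n·log(n); dropping the constant 1000 leaves n·log(n), so yes, the sum is in Θ(n·log(n)).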
We write T(n) to mean a runtime growth function.
In data structures, n is usually the number of elements in a collection.
Examples
What is the asymptotic runtime of...
- ...counting the number of times x appears in a Linked List?
- ...using multiplication to compute Factorial?
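A minimal Scala sketch of the counting question (using the standard immutable List, which is a singly linked list):

// Count how many times x appears: visit every node exactly once
def countOccurrences[A](list: List[A], x: A): Int = {
  var count = 0
  var rest = list
  while(rest.nonEmpty) {
    if(rest.head == x) { count += 1 }
    rest = rest.tail
  }
  count
}

Each element is visited once at constant cost, so T(n) = c₁·n + c₀, which is Θ(n). The multiplication-based Factorial is analogous: n − 1 multiplications, so Θ(n) if each multiplication counts as one step.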
Common Runtimes
- Constant Time: Θ(1)
- e.g., T(n) = c (runtime is independent of n)
- Logarithmic Time: Θ(log(n))
- e.g., T(n) = c·log(n) (for some constant c)
- Linear Time: Θ(n)
- e.g., T(n) = c₁·n + c₀ (for some constants c₀, c₁)
- Quadratic Time: Θ(n²)
- e.g., T(n) = c₂·n² + c₁·n + c₀
- Polynomial Time: Θ(n^k) (for some k ∈ Z⁺)
- e.g., T(n) = c_k·n^k + … + c₁·n + c₀
- Exponential Time: Θ(c^n) (for some c > 1)
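Two illustrative Scala shapes (sketches with invented names, counting one unit of work per loop iteration):

// Linear time, Θ(n): one pass over the data
def sum(data: Seq[Int]): Int = {
  var total = 0
  for(x <- data) { total += x }
  total
}

// Quadratic time, Θ(n²): a pass over the data per element (all pairs)
def countEqualPairs(data: Seq[Int]): Int = {
  var pairs = 0
  for(i <- 0 until data.size; j <- 0 until data.size) {
    if(data(i) == data(j)) { pairs += 1 }
  }
  pairs
}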
What is the asymptotic runtime of...
- ...looking up an element in an Array?
The runtime depends on where the item is in the array.
val NOT_FOUND = -1
// Linear search: scan left-to-right until we find target (cost ℓ per iteration)
def find(data: Array[Int], target: Int): Int = {
  for(i <- 0 until data.size) {
    if( data(i) == target ){ return i }
  }
  return NOT_FOUND
}
What is the runtime growth function?
T(n) =
  ℓ     if data(0) == target
  2ℓ    if data(1) == target
  3ℓ    if data(2) == target
  …
  nℓ    if data(n−1) == target
  nℓ    otherwise
(where ℓ is the cost of one loop iteration)
Aside: There is no general, meaningful notion of a limit for a T(n) like this.
T(n)∈Θ(n)?
If we choose c = ℓ, we can show T(n) ≤ c·n (for any n)
... but there is no c s.t. T(n) ≥ c·n always!
... T(1000000) could be as small as ℓ,
so T(1000000) ≱ 1000000·ℓ
T(n)∈Θ(1)?
If we choose c = ℓ, we can show T(n) ≥ c·1 (for any n)
... but there is no c s.t. T(n) ≤ c·1 always!
... T(1000000) could be as big as 1000000·ℓ,
so T(1000000) ≰ ℓ
Problem: What if g(n) doesn't bound f(n)
from both above and below?
if input = 1:
    (do one constant-time step)
else:
    (do one step per element of the input, n in total)
Schrödinger's Code: Simultaneously behaves like
f₁(n) = 1 and f₂(n) = n (can't tell until runtime)
Upper, Lower Bounds
- "Worst-Case Complexity"
- O(g(n)) is the set of functions that are in g(n)'s complexity class, or a "smaller" class.
- "Best-Case Complexity"
- Ω(g(n)) is the set of functions that are in g(n)'s complexity class, or a "bigger" class.
Big-O
f(n)∈O(g(n)) iff...
∃ c_high, n₀ s.t. ∀ n > n₀, f(n) ≤ c_high · g(n)
There is some c_high that we can multiply g(n) by so that f(n) is always smaller than c_high · g(n) for values of n above some n₀
Examples
2n² + 4n ∈ O(n²)?
2n² + 4n ∈ O(n⁴ + 8n³)?
n·log(n) + 5n ∈ O(n² + 5n)?
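For the last one, the earlier tricks give an easy witness: log(n) ≤ n, so n·log(n) ≤ n², and therefore n·log(n) + 5n ≤ n² + 5n for all n ≥ 1. Choosing c_high = 1 and n₀ = 1 shows n·log(n) + 5n ∈ O(n² + 5n).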
Big-Ω
f(n)∈Ω(g(n)) iff...
∃ c_low, n₀ s.t. ∀ n > n₀, f(n) ≥ c_low · g(n)
There is some c_low that we can multiply g(n) by so that f(n) is always bigger than c_low · g(n) for values of n above some n₀
Examples
2n² + 4n ∈ Ω(n² + 5)?
2n² + 4n ∈ Ω(log(n))?
n·log(n) + 5n ∈ Ω(n·log(n))?
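For the last one: n·log(n) + 5n ≥ n·log(n) for all n ≥ 1 (since 5n ≥ 0), so c_low = 1 and n₀ = 1 witness n·log(n) + 5n ∈ Ω(n·log(n)).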
Recap
- Big-O: "Worst Case" bound
- O(g(n)) is the set of functions that g(n) bounds from above
- Big-Ω: "Best Case" bound
- Ω(g(n)) is the set of functions that g(n) bounds from below
- Big-Θ: "Tight" bound
- Θ(g(n)) is the set of functions that g(n) bounds from both above and below
f(n)∈Θ(g(n)) ↔ f(n)∈O(g(n)) and f(n)∈Ω(g(n))
Recap
- Big-O: "Worst Case" bound
- If T(n)∈O(g(n)), then the runtime is no worse than g(n)
- Big-Ω: "Best Case" bound
- If T(n)∈Ω(g(n)), then the runtime is no better than g(n)
- Big-Θ: "Tight" bound
- If T(n) ∈ Θ(g(n)), then the runtime always grows proportionally to g(n)
f(n)∈Θ(g(n)) ↔ f(n)∈O(g(n)) and f(n)∈Ω(g(n))