Asymptotic Order of Growth
Objectives
We would like to:
- Understand O, Ω, and Θ
Notes
- Last time we said: O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c×g(n) for all n ≥ n0}
- Note that this is a set of functions.
- To say linear search is O(n) implies that its performance is in this set.
- Big O says that the functions in the set eventually fall below some constant multiple of the given function.
- What are these constants c and n0?
- Let's look at Desmos again.
- Lock the viewport, set x between 0 and 100, set y between 0 and 10,000
- Plot 3x² + x + 100 and x²
- Note that the second function is below the first.
- But we want some constant multiple of it, c×x², to be above the first.
- Can we find values for c and n0? (One choice that works is c = 4 and n0 = 11; see the check below.)
- This definition allows for all of the variance from the previous set of notes.
- We don't really care about the constant multiple on the number of instructions.
- We don't really care about the startup cost.
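
As a concrete check (a minimal sketch; c = 4 and n0 = 11 are just one pair of constants that happens to work, not the only one), a short program can verify that 3n² + n + 100 ≤ 4n² for every n ≥ 11:

```c
#include <stdio.h>

/* Check that f(n) = 3n^2 + n + 100 stays below c*g(n) = 4n^2 for all
 * n >= n0 = 11 (up to some limit).  At n = 10 the bound still fails
 * (410 > 400), so 11 is the smallest n0 that works with c = 4. */
int main(void) {
    const long long c = 4, n0 = 11;
    for (long long n = n0; n <= 1000000; n++) {
        long long f = 3 * n * n + n + 100;  /* the function we are bounding */
        long long g = c * n * n;            /* the bounding function        */
        if (f > g) {
            printf("bound fails at n = %lld\n", n);
            return 1;
        }
    }
    printf("3n^2 + n + 100 <= 4n^2 for all checked n >= 11\n");
    return 0;
}
```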
- Let's look at the definition again.
- f(n) ≤ c×g(n) for all n ≥ n0
- What does the c term do?
- I suggest it makes up for the difference in the number of instructions in the leading term.
- Are 7n (Intel add array) and 8n (MIPS add array) in the same class? (From the last notes.)
- Sure, because 8n < 2×7n, so 8n is O(7n)
- And because 7n < 8n, so 7n is O(8n)
- What does the n ≥ n0 do?
- I suggest it makes up for the "non-expensive" part of the code (a small check of both constants at work appears below).
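
To see both constants doing their jobs at once, take the 8n "MIPS add array" cost from the last notes and add a hypothetical one-time startup cost of 1000 instructions (the 1000 is made up for illustration). The result, 8n + 1000, is still O(7n), and hence O(n): c = 2 absorbs the 8-versus-7 per-element difference, and n0 = 167 absorbs the startup, since 8n + 1000 ≤ 2×7n whenever n ≥ 167. A quick check:

```c
#include <stdio.h>

/* f(n) = 8n + 1000: 8 instructions per element plus a hypothetical
 * 1000-instruction startup.  g(n) = 7n: the Intel per-element count.
 * Claim: f(n) <= 2*g(n) for all n >= 167, so f(n) is O(g(n)).
 * c = 2 covers the constant-factor difference; n0 = 167 covers startup. */
int main(void) {
    const long long c = 2, n0 = 167;
    for (long long n = n0; n <= 10000000; n++) {
        long long f = 8 * n + 1000;
        long long g = 7 * n;
        if (f > c * g) {
            printf("bound fails at n = %lld\n", n);
            return 1;
        }
    }
    printf("8n + 1000 <= 2*(7n) for all checked n >= 167\n");
    return 0;
}
```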
- Think of the ssort program. (This is a program, not an algorithm, but the reasoning still works.)
- It initialized the array (O(n))
- It checked the final result (O(n))
- But these don't matter in the long run.
- The main loop is O(n²)
- So the program's overall performance characteristics do not really change whether we check the array or not. (A sketch of this structure follows.)
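
A sketch of the structure being described, assuming ssort is a plain selection sort (the function names and sizes here are made up for illustration, not the course's actual code):

```c
#include <stdio.h>
#include <stdlib.h>

/* Fill the array with arbitrary values: one pass, O(n). */
static void init(int *a, int n) {
    for (int i = 0; i < n; i++)
        a[i] = rand();
}

/* Selection sort: for each position, scan the rest of the array for the
 * minimum.  Roughly n + (n-1) + ... + 1 comparisons, so O(n^2). */
static void ssort(int *a, int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;
        for (int j = i + 1; j < n; j++)
            if (a[j] < a[min])
                min = j;
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;
    }
}

/* Verify the result is sorted: one pass, O(n). */
static int check(const int *a, int n) {
    for (int i = 1; i < n; i++)
        if (a[i - 1] > a[i])
            return 0;
    return 1;
}

int main(void) {
    int n = 10000;
    int *a = malloc(n * sizeof *a);
    init(a, n);                                           /* O(n)              */
    ssort(a, n);                                          /* O(n^2), dominates */
    printf("sorted: %s\n", check(a, n) ? "yes" : "no");   /* O(n)              */
    free(a);
    return 0;
}
```

The O(n) initialization and the O(n) check are dwarfed by the O(n²) main loop as n grows, which is why leaving the check in or taking it out does not change the growth rate.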
- In the end, big O notation gives us an estimate of the performance of an algorithm.
- Since 0.01×n² and 2,000,000,000×n² are in the same class, there is quite a lot of variance.
- So there are times when two algorithms of the same order perform radically differently.
- It is a measure of how performance changes as the size of the data increases.
- There are times when we will employ a less efficient algorithm because it has better performance for a lower number of elements.
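
One common illustration of that last point (a hedged sketch, not anything taken from the course's code): library sorts often hand small subarrays to insertion sort, which is O(n²) but has a very small constant, because below some cutoff it beats the asymptotically better algorithm. The cutoff of 16 below is a typical ballpark, not a tuned number.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define CUTOFF 16   /* below this size, the O(n^2) sort tends to win */

/* O(n^2) insertion sort: poor growth rate, but very low overhead,
 * so it is the faster choice for tiny arrays. */
static void insertion_sort(int *a, int lo, int hi) {
    for (int i = lo + 1; i <= hi; i++) {
        int key = a[i], j = i - 1;
        while (j >= lo && a[j] > key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;
    }
}

/* O(n log n) merge sort that hands small subarrays to insertion sort. */
static void merge_sort(int *a, int *tmp, int lo, int hi) {
    if (hi - lo + 1 <= CUTOFF) {    /* small range: use the "worse" algorithm */
        insertion_sort(a, lo, hi);
        return;
    }
    int mid = lo + (hi - lo) / 2;
    merge_sort(a, tmp, lo, mid);
    merge_sort(a, tmp, mid + 1, hi);
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo + 1) * sizeof *a);
}

int main(void) {
    int n = 1000;
    int *a = malloc(n * sizeof *a);
    int *tmp = malloc(n * sizeof *tmp);
    for (int i = 0; i < n; i++)
        a[i] = rand() % 10000;
    merge_sort(a, tmp, 0, n - 1);
    printf("first = %d, last = %d\n", a[0], a[n - 1]);
    free(a);
    free(tmp);
    return 0;
}
```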