Jonathan: My algorithm is faster!
Ruth: No, mine is!
Where will this lead? Will Ruth resort to silencing Jonathan? Will Jonathan just give up? Is there any way they can settle their dispute?
Ruth and Jonathan could program their algorithms and try them out on some sample inputs.
Jonathan: But my algorithm is too complicated to implement if we're just going to throw it away!
Ruth: Maybe we'll miss some inputs on which your algorithm is bad!
Additional Character: Arf! [translation: Maybe Ruth's algorithm will look better on small test inputs, but when we get to the really big, interesting problems, Jonathan's is better.]
Additional Character #2: Doesn't this depend on which machine you use?
Trying the algorithms out isn't really a bad idea, but the objections have a point.
Graph performance against problem size, and extrapolate.
[Graph: running time vs. problem size. Measured points are plotted up to the current computing-power limits, and the curve is extrapolated beyond them.]
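In code, "graph and extrapolate" might look like the following minimal Python sketch. The workload sum_of_squares is a made-up stand-in (not either character's algorithm), and fitting a power law from just two measurements is deliberately crude:

    import math
    import time

    def sum_of_squares(n):
        # Hypothetical stand-in workload, not either algorithm from the story.
        total = 0
        for i in range(n):
            total += i * i
        return total

    sizes = [10**5, 10**6, 10**7]
    times = []
    for n in sizes:
        start = time.perf_counter()
        sum_of_squares(n)
        times.append(time.perf_counter() - start)
        print(f"N = {n:>10,}: {times[-1]:.4f} s")

    # Fit T(N) = c * N^k through the first and last measurements, then extrapolate.
    k = math.log(times[-1] / times[0]) / math.log(sizes[-1] / sizes[0])
    c = times[-1] / sizes[-1] ** k
    big_n = 10**9
    print(f"fitted exponent k ~ {k:.2f}")
    print(f"extrapolated time at N = {big_n:,}: {c * big_n ** k:.1f} s")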
Jonathan: But how do I know if the extrapolation is correct?
It would be better if we didn't treat the algorithm as a black box.
Write down the algorithm and create a formula for the running time.
Prime-Test-All

        i <- 2
          |
          v                  no
    +-> i <= sqrt(N)? ------------> output IS_PRIME
    |     |
    |     | yes
    |     v                     yes
    |   does i divide N? -----------> output NOT_PRIME
    |     |
    |     | no
    |     v
    +--- i <- i + 1
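The flowchart translates almost line for line into Python. This sketch follows the chart literally (like the chart, it doesn't special-case inputs below 2); math.isqrt keeps the i <= sqrt(N) test exact for large N:

    import math

    def prime_test_all(n):
        # Literal translation of the Prime-Test-All flowchart.
        i = 2                              # i <- 2
        while i <= math.isqrt(n):          # i <= sqrt(N)?
            if n % i == 0:                 # does i divide N?
                return "NOT_PRIME"         # output NOT_PRIME
            i += 1                         # i <- i + 1
        return "IS_PRIME"                  # output IS_PRIME

    print(prime_test_all(91))   # NOT_PRIME (91 = 7 * 13)
    print(prime_test_all(97))   # IS_PRIME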
The time consumed by Prime-Test-All for input N is at most
T[PTA](N) <= T[i <- 2]
             + (sqrt(N) - 1) * (T[i <= sqrt(N)?] + T[does i divide N?] + T[i <- i + 1])
             + T[output]
Jonathan: No implementation!
Ruth: But what a mess!
For really big inputs, we can ignore everything but the fastest-growing term: T[PTA](N) <= c[PTA] sqrt(N) (where c[PTA] is a constant that will change between machines).
Say Jonathan's algorithm takes T[B](N) <= c[B] sqrt(N) log_2(N) time.
For what N is Jonathan's algorithm better?
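That depends on the constants, which we don't know until we pick a machine. A quick sketch with made-up constants (c[PTA] = 10 and c[B] = 1 are pure assumptions for illustration):

    import math

    c_pta = 10.0   # assumed constant for Prime-Test-All on some machine
    c_b = 1.0      # assumed constant for Jonathan's algorithm B

    # B wins while c_b * sqrt(N) * log2(N) < c_pta * sqrt(N),
    # i.e. while log2(N) < c_pta / c_b, i.e. N < 2 ** (c_pta / c_b).
    crossover = 2 ** (c_pta / c_b)
    print(f"with these constants, B is faster for N below about {crossover:.0f}")

    for n in (100, 1024, 10**6):
        t_pta = c_pta * math.sqrt(n)
        t_b = c_b * math.sqrt(n) * math.log2(n)
        print(f"N = {n:>7}: PTA bound {t_pta:.0f}, B bound {t_b:.0f}")

Change the constants and the crossover moves, which is exactly why the next step throws the constants away.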
To compare algorithms with very different bounds, we can just ignore the constants. We write T[PTA](N) = O(sqrt(N)) and T[B](N) = O(sqrt(N) log_2(N)).
This is called big-O notation.
Ruth: Aren't you simplifying a bit too much?
Yes, this is a gross simplification. But it often works!
50 x^2 + 25 x + 40 = O(x^2)
5,096 log_2 n + 0.02 n = O(n)
4,236,121 = O(1)
4 * 2^n log_2 n + n^2 = O(2^n log_2 n)
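A quick numerical check of the first identity (a throwaway sketch): as x grows, the lower-order terms 25 x + 40 stop mattering and the ratio to x^2 settles toward the constant 50.

    def f(x):
        # 50 x^2 + 25 x + 40 from the first example above
        return 50 * x**2 + 25 * x + 40

    for x in (10, 1_000, 100_000):
        print(f"x = {x:>7,}: f(x) / x^2 = {f(x) / x**2:.5f}")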
Algorithm Prime-Test-Odds(n)
    if n = 2 then return YES fi
    if n is even then return NO fi
    for each odd i such that 3 <= i <= sqrt(n) do
        if i divides n then return NO fi
    od
    return YES

Running time: O(sqrt(n))
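The same pseudocode in Python (a sketch; like the pseudocode, it doesn't special-case n < 2). It still makes O(sqrt(n)) trial divisions, just roughly half as many as Prime-Test-All:

    import math

    def prime_test_odds(n):
        # Prime-Test-Odds: handle 2 and the even numbers up front,
        # then try only odd divisors 3, 5, 7, ... up to sqrt(n).
        if n == 2:
            return "YES"
        if n % 2 == 0:
            return "NO"
        for i in range(3, math.isqrt(n) + 1, 2):
            if n % i == 0:
                return "NO"
        return "YES"

    print(prime_test_odds(2), prime_test_odds(91), prime_test_odds(97))   # YES NO YES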
Source: https://www.cs.cmu.edu/ (archive adaptation)