Big O Notation
Big O notation is the language we use to describe how long an algorithm takes to run. Instead of measuring an algorithm's exact runtime (which depends on the speed of the processor it runs on), Big O notation tells us how quickly the runtime grows relative to the size of the input.
O(1) time or constant time: the runtime stays the same no matter how large the input gets
O(n) time or linear time: the runtime grows in direct proportion to the size of the input
O(n^2) time or quadratic time: the runtime grows with the square of the input size (each of these is sketched below)
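A minimal Java sketch of all three classes. The method names are illustrative, not taken from the linked article:

```java
public class BigOExamples {

    // O(1): always one step, no matter how many items there are.
    // (Assumes a non-empty array.)
    public static void printFirstItem(int[] items) {
        System.out.println(items[0]);
    }

    // O(n): one step for each of the n items.
    public static void printAllItems(int[] items) {
        for (int item : items) {
            System.out.println(item);
        }
    }

    // O(n^2): n steps for each of the n items, n * n = n^2 steps in total.
    public static void printAllPairs(int[] items) {
        for (int first : items) {
            for (int second : items) {
                System.out.println(first + ", " + second);
            }
        }
    }
}
```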
Drop constants
Constants are thrown out when calculating Big O
As the input n gets arbitrarily large, the significance of any constant term decreases drastically
For example, a method that does 1 step, then n/2 steps, then 100 more steps is O(1 + n/2 + 100), which we just call O(n)
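A sketch of that 1 + n/2 + 100 pattern (the method name is made up for illustration):

```java
public class DropConstants {

    // Hypothetical method: 1 step, then n/2 steps, then 100 steps.
    // Total work is 1 + n/2 + 100; as n grows, only the n/2 term matters,
    // and the 1/2 constant factor is dropped too, so this is O(n).
    public static void printThenHalfThenGreet(int[] items) {
        System.out.println(items[0]);            // 1 step: constant

        int middleIndex = items.length / 2;
        for (int i = 0; i < middleIndex; i++) {  // n/2 steps: grows with n
            System.out.println(items[i]);
        }

        for (int i = 0; i < 100; i++) {          // 100 steps: another constant
            System.out.println("hi");
        }
    }

    public static void main(String[] args) {
        printThenHalfThenGreet(new int[] {1, 2, 3, 4, 5, 6});
    }
}
```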
Links: https://www.interviewcake.com/article/java/big-o-notation-time-and-space-complexity