ADS · complexity
Understanding how algorithms scale is the foundation of everything else in ADS.
Big-O notation describes the upper bound of an algorithm's growth rate — how its runtime or memory usage scales as input n grows toward infinity. We drop constants and lower-order terms because they become irrelevant at scale.
O(1) constant — hash table lookup
O(log n) logarithmic — binary search
O(n) linear — simple scan
O(n log n) linearithmic — merge sort
O(n²) quadratic — nested loops
O(2ⁿ) exponential — brute-force subsets
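Each class above corresponds to a familiar operation. A minimal Python sketch (Python is an illustration choice here, not from the original; the data sizes are arbitrary):

```python
from bisect import bisect_left
from itertools import chain, combinations

data = list(range(1_000_000))          # sorted input
table = {x: True for x in data}        # hash table built from it

# O(1): hash table lookup — cost independent of table size
assert table[999_999]

# O(log n): binary search on sorted data
i = bisect_left(data, 424_242)
assert data[i] == 424_242

# O(n): linear scan — must touch every element in the worst case
assert any(x == 424_242 for x in data)

# O(n log n): comparison sort
sorted_desc = sorted(data, reverse=True)

# O(n²): nested loops over all pairs (kept small on purpose)
small = data[:100]
pairs = sum(1 for a in small for b in small if a < b)

# O(2ⁿ): brute-force enumeration of all subsets of a small set
s = [1, 2, 3]
subsets = list(chain.from_iterable(
    combinations(s, r) for r in range(len(s) + 1)))
assert len(subsets) == 2 ** len(s)     # 2ⁿ subsets
```

Note how the quadratic and exponential examples only stay feasible because the inputs are tiny; that is the practical meaning of the growth rates.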
Dominant term wins. O(n² + n) is just O(n²). At large n, the lower term is noise.
Constants don't matter asymptotically. O(3n) = O(n). But in practice, constants matter — an O(n log n) algorithm with a huge constant can lose to O(n²) for small inputs.
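The crossover point can be made concrete. A small sketch with hypothetical constants (assume the n log n algorithm costs 100 units per n log₂ n step, while the quadratic algorithm's constant is 1 — both numbers are invented for illustration):

```python
import math

winner = {}
for n in [10, 100, 1_000, 10_000]:
    nlogn_cost = 100 * n * math.log2(n)   # big constant, good growth rate
    quadratic_cost = n * n                # constant 1, bad growth rate
    winner[n] = "n log n" if nlogn_cost < quadratic_cost else "n^2"

print(winner)  # the quadratic algorithm wins until n reaches the thousands
```

Asymptotics guarantee the n log n algorithm wins eventually; the constant only decides where "eventually" starts.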
Best vs average vs worst case. Big-O is usually worst case. QuickSort is O(n log n) average but O(n²) worst. Know which you're analysing.
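The QuickSort gap is easy to observe by counting comparisons. A minimal sketch of a naive quicksort that always picks the first element as pivot (this pivot rule, and the instrumentation, are choices made here for illustration — an already-sorted input is its worst case):

```python
import random

def quicksort(a):
    """Naive quicksort, first element as pivot.
    Returns (sorted_list, number_of_comparisons)."""
    if len(a) <= 1:
        return a, 0
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    ls, lc = quicksort(left)
    rs, rc = quicksort(right)
    return ls + [pivot] + rs, lc + rc + len(rest)

n = 300
random.seed(0)

_, avg_cmp = quicksort(random.sample(range(n), n))   # random input: ~n log n
_, worst_cmp = quicksort(list(range(n)))             # sorted input: n(n-1)/2

assert worst_cmp == n * (n - 1) // 2   # exactly quadratic on sorted input
assert avg_cmp < worst_cmp             # random input does far fewer
```

On the sorted input every partition is maximally unbalanced (empty left side), so each level removes only one element — the recursion becomes n levels of n-ish work.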
Same notation, different resource: Big-O describes space just as it describes time. Recursive algorithms often have a hidden space cost via the call stack.
Iterative sum: O(1) space
Recursive sum: O(n) space ← call stack depth
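The two sums above can be sketched as follows (a minimal Python illustration; the function names are invented here):

```python
def sum_iterative(xs):
    # O(1) extra space: one accumulator, regardless of input length
    total = 0
    for x in xs:
        total += x
    return total

def sum_recursive(xs):
    # O(n) extra space: one stack frame per element until the base case
    if not xs:
        return 0
    return xs[0] + sum_recursive(xs[1:])

nums = list(range(100))
assert sum_iterative(nums) == sum_recursive(nums) == 4950

# The hidden cost is real: on a large input the recursive version
# exhausts the stack long before memory for the list itself runs out.
# sum_recursive(list(range(100_000)))  # raises RecursionError in CPython
```

Both compute the same value in O(n) time; only the space differs. (The slice `xs[1:]` also copies the list each call, so the recursive version is additionally slower — the stack depth is the point here.)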