
Complexity & Big-O

Understanding how algorithms scale is the foundation of everything else in ADS.

What is Big-O?

Big-O notation describes the upper bound of an algorithm's growth rate — how its runtime or memory usage scales as input n grows toward infinity. We drop constants and lower-order terms because they become irrelevant at scale.

O(1)       constant     — hash table lookup
O(log n)   logarithmic  — binary search
O(n)       linear       — simple scan
O(n log n) linearithmic — merge sort
O(n²)      quadratic    — nested loops
O(2ⁿ)      exponential  — brute-force subsets
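
To make the table concrete, here is a small sketch (values are illustrative) printing how each growth class behaves as n doubles:

```python
import math

# Illustrative growth as n doubles: log2(n) barely moves,
# n*log2(n) a bit more than doubles, and n^2 quadruples.
for n in (1_000, 2_000, 4_000):
    print(f"n={n:>5}  log2(n)={math.log2(n):5.2f}  "
          f"n*log2(n)={n * math.log2(n):10.0f}  n^2={n * n:>10}")
```

Running it shows why the gap between classes dwarfs any constant factor once n is large.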

The key intuitions

Dominant term wins. O(n² + n) is just O(n²). At large n, the lower term is noise.
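
A quick numeric check of the dominant-term rule: (n² + n) / n² = 1 + 1/n, which tends to 1 as n grows.

```python
# Dominant-term check: the ratio of n^2 + n to n^2 approaches 1,
# so the lower-order n term is noise at scale.
for n in (10, 100, 10_000):
    ratio = (n * n + n) / (n * n)
    print(f"n={n:>6}  (n^2 + n)/n^2 = {ratio:.4f}")
```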

Constants don't matter asymptotically. O(3n) = O(n). But in practice they do: an O(n log n) algorithm with a huge constant can lose to O(n²) for small inputs.

Best vs average vs worst case. Big-O is usually worst case. QuickSort is O(n log n) average but O(n²) worst. Know which you're analysing.
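
A sketch of the QuickSort gap (first-element pivot, comparison counts as a simple cost model, all names illustrative): already-sorted input triggers the O(n²) worst case, while a shuffled input stays near n log n.

```python
import random

def quicksort(xs, counter):
    # First-element pivot: degenerates to O(n^2) on already-sorted input.
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    counter[0] += len(rest)  # model: one comparison per element partitioned
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

random.seed(0)  # deterministic shuffle for the demo
n = 200
sorted_input = list(range(n))
shuffled = random.sample(sorted_input, n)

worst, avg = [0], [0]
res_worst = quicksort(sorted_input, worst)
res_avg = quicksort(shuffled, avg)
print(worst[0], avg[0])  # worst = n(n-1)/2 = 19900; average is far smaller
```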

Space complexity

Same notation, different resource. Recursive algorithms often have hidden space cost via the call stack.

Iterative sum:   O(1) space
Recursive sum:   O(n) space  ← call stack depth
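
A minimal illustration of the two sums. Both are O(n) time; the difference is space: the recursive version builds one call-stack frame per element (and in Python will also hit the default recursion limit near 1000 elements).

```python
def sum_iterative(xs):
    # O(1) extra space: a single accumulator, regardless of input size.
    total = 0
    for x in xs:
        total += x
    return total

def sum_recursive(xs):
    # O(n) extra space: one call-stack frame per element.
    if not xs:
        return 0
    return xs[0] + sum_recursive(xs[1:])

data = list(range(100))
print(sum_iterative(data), sum_recursive(data))  # both 4950
```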

Common traps

  • Assuming a loop = O(n). Only true if the loop runs n times and each iteration is O(1).
  • Forgetting that string operations in many languages are O(k) where k is string length.
  • Binary search is O(log n) — but only on a sorted, random-access structure.
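
A binary search sketch to tie the last trap together: the O(log n) bound holds only because sorted order plus O(1) indexing let each iteration halve the remaining range.

```python
def binary_search(sorted_xs, target):
    # Assumes sorted_xs is sorted and supports O(1) random access;
    # each iteration halves the search range -> O(log n).
    lo, hi = 0, len(sorted_xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_xs[mid] == target:
            return mid
        if sorted_xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

print(binary_search(list(range(1_000)), 437))  # 437
```

On an unsorted list, or a structure without random access (e.g. a linked list), this guarantee disappears.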