Growth Rate Visualization
See how different complexities grow as the input size n increases. The table below shows the approximate number of operations each complexity class requires at several input sizes:
| Complexity | n = 10 | n = 100 | n = 1,000 | n = 10,000 |
|---|---|---|---|---|
| O(1) Constant | 1 | 1 | 1 | 1 |
| O(log n) Logarithmic | ~3.3 | ~6.6 | ~10 | ~13.3 |
| O(n) Linear | 10 | 100 | 1,000 | 10,000 |
| O(n log n) Linearithmic | ~33 | ~664 | ~10,000 | ~132,900 |
| O(n²) Quadratic | 100 | 10,000 | 1,000,000 | 100,000,000 |
| O(2^n) Exponential | 1,024 | ~1.3 × 10^30 | ~1.1 × 10^301 | ~2.0 × 10^3010 |
Notice how O(n²) and O(2^n) explode for large inputs. At n = 1,000, O(n²) already needs a million operations, while O(log n) needs only about 10!
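You can reproduce the table's numbers with a short script. Here is a minimal Python sketch (the `growth` dictionary and its lambdas are illustrative, not from any library; `Decimal` sidesteps float overflow for 2^10,000):

```python
import math
from decimal import Decimal

# Growth functions from the table above (log base 2, as is
# conventional in algorithm analysis).
growth = {
    "O(1)":       lambda n: Decimal(1),
    "O(log n)":   lambda n: Decimal(math.log2(n)),
    "O(n)":       lambda n: Decimal(n),
    "O(n log n)": lambda n: n * Decimal(math.log2(n)),
    "O(n^2)":     lambda n: Decimal(n) ** 2,
    "O(2^n)":     lambda n: Decimal(2) ** n,  # huge, but Decimal handles it
}

for name, f in growth.items():
    # Approximate operation counts at each input size, 3 significant digits.
    print(f"{name:<12}", [f"{f(n):.3g}" for n in (10, 100, 1_000, 10_000)])
```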
O(1) Constant
Operations take the same time regardless of input size (a sketch follows the examples below).
- Array access by index
- Arithmetic operations
- Variable assignment
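A small illustrative sketch (the function name is hypothetical): every statement touches a fixed amount of data, so the cost is the same for a list of four items or four million.

```python
def first_doubled(items):
    # Each step below is O(1): one index lookup, one multiply, one return.
    first = items[0]      # array access by index
    result = first * 2    # arithmetic operation
    return result         # variable assignment / return

print(first_doubled([7, 3, 9, 1]))  # 14, regardless of list length
```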
O(log n) Logarithmic
Operations grow slowly: doubling the input adds only about one more step (see the sketch after this list).
- Binary search
- Balanced tree operations
- Loops that halve input
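Binary search is the canonical case: each comparison discards half of the remaining range, so about log2(n) steps suffice. A minimal sketch (for real use, Python's standard `bisect` module covers this):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # inspect the middle element
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1                  # discard the lower half
        else:
            hi = mid - 1                  # discard the upper half
    return -1

# 1,000 sorted elements -> at most ~10 comparisons
print(binary_search(list(range(1000)), 731))  # 731
```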
O(n) Linear
Operations grow proportionally with input size (illustrated after the list).
- Single loop through array
- Linear search
- Finding max/min
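Finding the maximum is a representative sketch: one comparison per element, so the work grows in direct proportion to the input (in practice Python's built-in `max` does the same job):

```python
def find_max(items):
    # Single pass: n - 1 comparisons for n items, hence O(n).
    best = items[0]
    for x in items[1:]:
        if x > best:
            best = x
    return best

print(find_max([4, 17, 2, 9]))  # 17
```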
O(n log n) Linearithmic
Slightly worse than linear and common in efficient sorting (sketch below).
- Merge Sort
- Quick Sort (average)
- Heap Sort
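Merge sort shows where the n log n comes from: about log2(n) levels of recursive halving, with O(n) merge work at each level. A compact, illustrative sketch:

```python
def merge_sort(items):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # log n levels of halving...
    right = merge_sort(items[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):  # ...with O(n) merging per level
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]     # append whichever half remains

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```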
O(n²) Quadratic
Operations grow with the square of the input size (see the example after this list).
- Nested loops
- Bubble Sort
- Comparing all pairs
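Comparing all pairs makes the quadratic cost concrete: the nested loops below perform n(n − 1)/2 comparisons (`has_duplicate` is an illustrative name; a hash set would solve this particular task in O(n)):

```python
def has_duplicate(items):
    # Every pair is compared once: n * (n - 1) / 2 comparisons, i.e. O(n^2).
    for i in range(len(items)):               # outer loop: n iterations
        for j in range(i + 1, len(items)):    # inner loop: up to n - 1
            if items[i] == items[j]:
                return True
    return False

print(has_duplicate([3, 1, 4, 1, 5]))  # True
```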
O(2^n) Exponential
Operations double with each additional input element (illustrated below).
- Naive Fibonacci
- Subset generation
- Brute force combinations
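Naive Fibonacci is the classic sketch: each call spawns two more, so the number of calls grows roughly like 2^n (more precisely about 1.618^n, but exponential either way):

```python
def fib(n):
    # Two recursive calls per invocation -> the call tree roughly
    # doubles in size with each increment of n.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55 -- instant
# fib(50) would take on the order of 2^50 calls; memoization makes it O(n).
```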
Efficient: O(1), O(log n), O(n)
These scale well even for very large inputs. Binary search on 1 billion items takes only ~30 steps!
Moderate: O(n log n)
The sweet spot for sorting. Merge sort and quick sort use this complexity to efficiently sort large datasets.
Use with Caution: O(n²)
Nested loops are fine for small inputs (n < 1,000) but become slow for larger datasets; they can often be optimized away, for example with hashing or sorting.
Avoid: O(2^n), O(n!)
Exponential algorithms only work for tiny inputs (n < 20). Beyond that, they become impractical.