docs/concepts/algorithms-big-o.mdx
Why does one solution pass all tests instantly while another times out? Why do interviewers care so much about "time complexity"? Consider these two functions that both check whether an array contains duplicates:
```javascript
// Approach A: Nested loops
function hasDuplicatesA(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true
    }
  }
  return false
}

// Approach B: Using a Set
function hasDuplicatesB(arr) {
  return new Set(arr).size !== arr.length
}
```
Both work correctly. But with 100,000 elements, Approach A takes several seconds while Approach B finishes in milliseconds. The difference is captured by Big O notation, which describes how code performance scales with input size.
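You can see the gap yourself with a rough timing sketch (the array size here is illustrative, and exact timings vary by machine):

```javascript
// Both approaches, timed on the same worst-case input (no duplicates,
// so every element must be examined before returning false).
function hasDuplicatesA(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true
    }
  }
  return false
}

function hasDuplicatesB(arr) {
  return new Set(arr).size !== arr.length
}

const big = Array.from({ length: 10_000 }, (_, i) => i)

console.time('nested loops') // work grows quadratically with length
hasDuplicatesA(big)
console.timeEnd('nested loops')

console.time('Set') // work grows linearly with length
hasDuplicatesB(big)
console.timeEnd('Set')
```

Try increasing the length: the nested-loop timing grows roughly 100× for every 10× increase in input, while the Set version grows roughly 10×.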
<Info>
**What you'll learn in this guide:**

- What Big O notation actually measures
- The common complexities: O(1), O(log n), O(n), O(n log n), O(n²)
- How to analyze your own code's complexity
- JavaScript built-in methods and their complexity
- Implementing binary search and merge sort
- Common interview patterns: two pointers, sliding window, frequency counter
</Info>

<Warning>
**Prerequisite:** This guide assumes you're familiar with [data structures](/concepts/data-structures) like arrays, objects, Maps, and Sets. You should also be comfortable with [recursion](/concepts/recursion) for the sorting algorithms section.
</Warning>

Big O notation describes how an algorithm's runtime or space requirements grow as input size increases. First formalized by Paul Bachmann in 1894 and later popularized by Donald Knuth in The Art of Computer Programming, it provides an upper bound on growth rate and ignores constants, giving us a way to compare algorithms regardless of hardware.
Imagine you need to deliver packages to houses on a street: dropping one package at a known address is O(1), visiting every house is O(n), and comparing every house with every other house is O(n²). The chart below shows how quickly these growth rates diverge:
```
                        HOW ALGORITHMS SCALE

  Operations
      ▲
  1M  │                               •
      │                           •                 •  O(n²)
 100K │                       •               ═
      │                   •              ═          ═  O(n log n)
  10K │               •             ═         ─
      │            •           ═          ─         ─  O(n)
   1K │         •          ═         ─         .
      │      •         ═        ─         .         .  O(log n)
  100 │   •        ═       ─      .   ▁▁▁▁▁▁▁▁▁     ▁  O(1)
      └──────────────────────────────────────────────►
        100      1K       10K      100K      1M    Input (n)
```
Here are the most common complexities you'll encounter, from fastest to slowest:
| Complexity | Name | Example | 1,000 items | 1,000,000 items |
|---|---|---|---|---|
| O(1) | Constant | Array access | 1 op | 1 op |
| O(log n) | Logarithmic | Binary search | ~10 ops | ~20 ops |
| O(n) | Linear | Simple loop | 1,000 ops | 1,000,000 ops |
| O(n log n) | Linearithmic | Merge sort | ~10,000 ops | ~20,000,000 ops |
| O(n²) | Quadratic | Nested loops | 1,000,000 ops | 1,000,000,000,000 ops |
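The table's approximate counts follow directly from the formulas; a few lines of arithmetic reproduce them (the function name is illustrative, rounding is approximate):

```javascript
// Approximate operation counts for each complexity class at input size n.
function opCounts(n) {
  return {
    constant: 1,                               // O(1)
    logarithmic: Math.ceil(Math.log2(n)),      // O(log n)
    linear: n,                                 // O(n)
    linearithmic: Math.ceil(n * Math.log2(n)), // O(n log n)
    quadratic: n * n,                          // O(n²)
  }
}

opCounts(1_000)     // logarithmic: 10, linearithmic ≈ 10,000
opCounts(1_000_000) // logarithmic: 20, quadratic: 1e12
```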
**O(1) — Constant time.** The operation takes the same number of steps regardless of input size:

```javascript
// Array access by index
const arr = [1, 2, 3, 4, 5]
const element = arr[2] // O(1) - instant, no matter array size
// Object/Map lookup
const user = { name: 'Alice', age: 30 }
const name = user.name // O(1)
const map = new Map()
map.set('key', 'value')
map.get('key') // O(1)
// Array push and pop (end operations)
arr.push(6) // O(1)
arr.pop() // O(1)
```
**O(log n) — Logarithmic time.** The remaining work is cut in half at each step:

```javascript
// Binary search - covered in detail below
// With 1,000,000 elements, only ~20 comparisons needed!
// Think of it like a phone book:
// Open middle → wrong half eliminated → repeat
```
**O(n) — Linear time.** The work grows in direct proportion to input size:

```javascript
// Finding maximum value
function findMax(arr) {
  let max = arr[0]
  for (let i = 1; i < arr.length; i++) { // Visits each element once
    if (arr[i] > max) max = arr[i]
  }
  return max
}
// Most array methods are O(n)
arr.indexOf(5) // O(n) - may check every element
arr.includes(5) // O(n)
arr.find(x => x > 3) // O(n)
arr.filter(x => x > 3) // O(n)
arr.map(x => x * 2) // O(n)
```
**O(n log n) — Linearithmic time.** Typical of efficient comparison sorts:

```javascript
// JavaScript's built-in sort is O(n log n)
const sorted = [...arr].sort((a, b) => a - b)
// Merge sort and quick sort (average case) are also O(n log n)
```
**O(n²) — Quadratic time.** Nested loops over the same input:

```javascript
// Checking all pairs
function findPairs(arr) {
  const pairs = []
  for (let i = 0; i < arr.length; i++) { // O(n)
    for (let j = i + 1; j < arr.length; j++) { // O(n) for each i
      pairs.push([arr[i], arr[j]])
    }
  }
  return pairs // Total: O(n) × O(n) = O(n²)
}
// Bubble sort - O(n²), mostly used for teaching
```
Follow these steps to determine your code's complexity:
<Steps>
<Step title="Identify the input size">
What variable represents n? Usually it's array length or a number parameter.
</Step>
<Step title="Count the loops">
- One loop over n elements = O(n)
- Nested loops = multiply: O(n) × O(n) = O(n²)
- Loop that halves each time = O(log n)
</Step>
<Step title="Drop constants and lower terms">
- O(2n) → O(n)
- O(n² + n) → O(n²)
- O(500) → O(1)
</Step>
</Steps>

```javascript
// Example analysis
function example(arr) {
  console.log(arr[0]) // O(1)

  for (let i = 0; i < arr.length; i++) { // O(n)
    console.log(arr[i])
  }

  for (let i = 0; i < arr.length; i++) { // O(n)
    for (let j = 0; j < arr.length; j++) { // × O(n)
      console.log(arr[i], arr[j])
    }
  }
}

// Total: O(1) + O(n) + O(n²) = O(n²)
// The n² dominates, so we say this function is O(n²)
```
Knowing these helps you make better decisions:
| Method | Complexity | Why |
|---|---|---|
| `push()`, `pop()` | O(1) | End operations, no shifting |
| `shift()`, `unshift()` | O(n) | Must re-index all elements |
| `splice()` | O(n) | May shift elements |
| `slice()` | O(n) | Creates a copy of the portion |
| `indexOf()`, `includes()` | O(n) | Linear search |
| `find()`, `findIndex()` | O(n) | Linear search |
| `map()`, `filter()`, `forEach()` | O(n) | Visits each element |
| `sort()` | O(n log n) | V8 uses TimSort (since 2018) |
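For example, if you need elements in front-first order, repeatedly calling `unshift` is quadratic overall, while pushing and reversing once stays linear (a sketch; the function names are illustrative):

```javascript
// unshift is O(n) per call because every existing element is re-indexed,
// so building an n-element array this way is O(n²) overall.
function buildWithUnshift(n) {
  const out = []
  for (let i = 0; i < n; i++) out.unshift(i)
  return out
}

// push is O(1) per call, and one final reverse() is a single O(n) pass,
// so the total stays O(n). Both functions produce the same array.
function buildWithPushReverse(n) {
  const out = []
  for (let i = 0; i < n; i++) out.push(i)
  return out.reverse()
}
```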
| Operation | Object | Map | Set |
|---|---|---|---|
| Get/Set/Has | O(1) | O(1) | O(1) |
| Delete | O(1) | O(1) | O(1) |
| Keys/Values | O(n) | O(n) | O(n) |
Check each element one by one. Simple but slow for large arrays.
```javascript
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i
  }
  return -1
}

linearSearch([3, 7, 1, 9, 4], 9) // Returns 3
```
Divide and conquer on a sorted array. Eliminates half the remaining elements each step. As noted in Introduction to Algorithms (Cormen et al.), binary search is one of the most efficient search algorithms with guaranteed O(log n) worst-case performance.
```javascript
function binarySearch(arr, target) {
  let left = 0
  let right = arr.length - 1
  while (left <= right) {
    const mid = Math.floor((left + right) / 2)
    if (arr[mid] === target) return mid
    if (arr[mid] < target) left = mid + 1
    else right = mid - 1
  }
  return -1
}

binarySearch([1, 3, 5, 7, 9, 11, 13], 9) // Returns 4
```
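Binary search requires sorted input. If the data arrives unsorted but will be searched repeatedly, it's worth paying O(n log n) once to sort it (a usage sketch; `binarySearch` is repeated so the snippet is self-contained):

```javascript
function binarySearch(arr, target) {
  let left = 0
  let right = arr.length - 1
  while (left <= right) {
    const mid = Math.floor((left + right) / 2)
    if (arr[mid] === target) return mid
    if (arr[mid] < target) left = mid + 1
    else right = mid - 1
  }
  return -1
}

const unsorted = [9, 1, 13, 5, 7, 3, 11]
const sorted = [...unsorted].sort((a, b) => a - b) // O(n log n), paid once
// sorted is [1, 3, 5, 7, 9, 11, 13]

binarySearch(sorted, 9) // 4 - each search is O(log n) from here on
binarySearch(sorted, 2) // -1
```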
| Algorithm | Best | Average | Worst | Space | Use When |
|---|---|---|---|---|---|
| Bubble Sort | O(n)* | O(n²) | O(n²) | O(1) | Never in production |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Need guaranteed performance |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n)** | General purpose |
| JS sort() | O(n log n) | O(n log n) | O(n log n) | O(n) | Most cases |
\*Bubble sort achieves its O(n) best case only with the early-termination optimization (when no swaps are needed). \*\*Quick sort space is O(log n) average case, O(n) worst case due to recursion stack depth.
Repeatedly swaps adjacent elements if they're in wrong order. Educational, but too slow for real use.
```javascript
function bubbleSort(arr) {
  const result = [...arr]
  const n = result.length
  for (let i = 0; i < n; i++) {
    let swapped = false
    for (let j = 0; j < n - i - 1; j++) {
      if (result[j] > result[j + 1]) {
        [result[j], result[j + 1]] = [result[j + 1], result[j]]
        swapped = true
      }
    }
    // If no swaps occurred, array is sorted
    if (!swapped) break
  }
  return result
}
```
Divide array in half, sort each half, merge them back. Consistent performance with guaranteed O(n log n).
```javascript
function mergeSort(arr) {
  if (arr.length <= 1) return arr
  const mid = Math.floor(arr.length / 2)
  const left = mergeSort(arr.slice(0, mid))
  const right = mergeSort(arr.slice(mid))
  return merge(left, right)
}

function merge(left, right) {
  const result = []
  let i = 0
  let j = 0
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i])
      i++
    } else {
      result.push(right[j])
      j++
    }
  }
  return result.concat(left.slice(i)).concat(right.slice(j))
}

mergeSort([38, 27, 43, 3, 9, 82, 10]) // [3, 9, 10, 27, 38, 43, 82]
```
These patterns solve many algorithm problems efficiently.
Use two pointers moving toward each other or in the same direction. Great for sorted arrays and finding pairs.
```javascript
// Find pair that sums to target in sorted array
function twoSum(arr, target) {
  let left = 0
  let right = arr.length - 1
  while (left < right) {
    const sum = arr[left] + arr[right]
    if (sum === target) return [left, right]
    if (sum < target) left++
    else right--
  }
  return null
}

twoSum([1, 3, 5, 7, 9], 12) // [1, 4] (3 + 9 = 12)
```
Maintain a "window" that slides through the array. Perfect for subarray problems.
```javascript
// Maximum sum of k consecutive elements
function maxSumSubarray(arr, k) {
  if (arr.length < k) return null

  // Calculate first window
  let windowSum = 0
  for (let i = 0; i < k; i++) {
    windowSum += arr[i]
  }
  let maxSum = windowSum

  // Slide the window: remove left element, add right element
  for (let i = k; i < arr.length; i++) {
    windowSum = windowSum - arr[i - k] + arr[i]
    maxSum = Math.max(maxSum, windowSum)
  }
  return maxSum
}

maxSumSubarray([2, 1, 5, 1, 3, 2], 3) // 9 (5 + 1 + 3)
```
Use an object or Map to count occurrences. Avoids nested loops when comparing collections.
```javascript
// Check if two strings are anagrams
function isAnagram(str1, str2) {
  if (str1.length !== str2.length) return false
  const freq = {}

  // Count characters in first string
  for (const char of str1) {
    freq[char] = (freq[char] || 0) + 1
  }

  // Subtract counts for second string
  for (const char of str2) {
    if (!freq[char]) return false
    freq[char]--
  }
  return true
}

isAnagram('listen', 'silent') // true
isAnagram('hello', 'world') // false
```
<Info>
**Key takeaways:**

- Big O measures scalability, not absolute speed. It tells you how performance changes as input grows.
- O(1) and O(log n) are fast, O(n) is acceptable, and O(n²) gets slow quickly. Avoid O(2^n) for any significant input.
- Nested loops multiply complexity: two nested loops over n = O(n²), three = O(n³).
- Drop constants and lower terms: O(2n + 100) simplifies to O(n).
- Array end operations are O(1); beginning operations are O(n). Prefer push/pop over shift/unshift.
- Use a Set for O(1) lookups instead of Array.includes() for repeated membership checks.
- Binary search is O(log n) but requires sorted data. It's worth sorting first if you'll search multiple times.
- JavaScript's sort() is O(n log n) in all modern browsers. Use it unless you have specific requirements.
- Learn the patterns: two pointers, sliding window, and frequency counter solve most interview problems.
- Space complexity matters too: creating a new array of size n uses O(n) space.
</Info>

**What's the Big O of a loop containing an inner loop that runs a fixed number of times?**

**Answer:** O(n). The outer loop runs n times, but the inner loop always runs exactly 10 times (a constant), so it's O(n × 10) = O(10n) = O(n). The constant 10 is dropped.
**Why is binary search O(log n)?**

Each comparison eliminates half of the remaining elements. With 1,000 elements: 1000 → 500 → 250 → 125 → 62 → 31 → 15 → 7 → 3 → 1. That's about 10 steps, and log₂(1000) ≈ 10. With 1,000,000 elements, it only takes ~20 steps.
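You can count the halvings directly (a tiny sketch; the exact count is ⌊log₂ n⌋):

```javascript
// Counts how many times n can be halved before reaching 1 -
// roughly the number of comparisons binary search needs in the worst case.
function halvingSteps(n) {
  let steps = 0
  while (n > 1) {
    n = Math.floor(n / 2)
    steps++
  }
  return steps
}

halvingSteps(1_000)     // 9  (log₂ 1000 ≈ 10)
halvingSteps(1_000_000) // 19 (log₂ 1,000,000 ≈ 20)
```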
**When should I convert an array to a Set for lookups?**

- **One lookup**: `indexOf()` is fine. Both approaches do O(n) work overall, and building a Set adds overhead.
- **Many lookups**: Convert to a Set first. O(n) creation plus O(1) per lookup beats O(n) per lookup.

Rule of thumb: if you'll search more than once, use a Set.
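A sketch of the two strategies (the function names are illustrative):

```javascript
// Checking many values against the same array.
// includes() rescans the array for every query: O(q × n) total.
function countHitsWithIncludes(haystack, queries) {
  return queries.filter(q => haystack.includes(q)).length
}

// Building a Set costs O(n) once; each has() is then O(1): O(n + q) total.
function countHitsWithSet(haystack, queries) {
  const set = new Set(haystack)
  return queries.filter(q => set.has(q)).length
}
```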
**Why should I avoid bubble sort?**

With 10,000 elements, bubble sort performs ~100,000,000 operations, while JavaScript's built-in sort() at O(n log n) takes ~130,000 operations on the same data. Bubble sort is useful for learning but should never be used in production code.
**How do I check for duplicates in O(n)?**

Track values you've already seen in a Set:

```javascript
function hasDuplicates(arr) {
  const seen = new Set()
  for (const item of arr) {
    if (seen.has(item)) return true // O(1) lookup
    seen.add(item) // O(1) insert
  }
  return false
}
```
This is O(n) time and O(n) space. The naive nested loop approach would be O(n²) time but O(1) space.
**How do I find the longest substring without repeating characters?**

A sliding window backed by a Set handles this in a single pass:

```javascript
function longestUniqueSubstring(s) {
  const seen = new Set()
  let maxLen = 0
  let left = 0
  for (let right = 0; right < s.length; right++) {
    while (seen.has(s[right])) {
      seen.delete(s[left])
      left++
    }
    seen.add(s[right])
    maxLen = Math.max(maxLen, right - left + 1)
  }
  return maxLen
}
```
Time: O(n), Space: O(min(n, alphabet size))