Sorting Algorithm Visualizer: How It Works
Sorting is one of the most fundamental operations in computer science. Every language has a built-in sort, but understanding how sorting works and which algorithm to choose is what separates a developer who uses tools from one who understands them.
This visualizer lets you watch 8 different sorting algorithms operate on the same data. See why merge sort is O(n log n) while bubble sort is O(n²). Watch how quick sort partitions. Understand why counting sort can beat them all when the data is right.
Algorithm Comparison
| Algorithm | Best | Average | Worst | Space | Stable |
|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Randomized QS | O(n log n) | O(n log n) | O(n²)* | O(log n) | No |
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(k) | Yes |
| Radix Sort | O(d·n) | O(d·n) | O(d·n) | O(n + k) | Yes |
*Randomized quick sort makes the O(n²) worst case astronomically unlikely.
How Each Sorting Algorithm Works
Bubble Sort Visualizer
Repeatedly walk through the array, comparing adjacent elements and swapping them if they're in the wrong order. Each pass bubbles the largest unsorted element to its correct position. Simple to understand, terrible for large datasets. O(n) best case when the array is already sorted (with early termination).
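The pass-and-swap loop with early termination can be sketched in a few lines of Python (a minimal illustration, not the visualizer's own code):

```python
def bubble_sort(arr):
    """Bubble sort with early termination on an already-sorted pass."""
    a = list(arr)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in place.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps means the array is sorted: O(n) best case
            break
    return a
```

The `swapped` flag is what gives the O(n) best case: on sorted input the first pass makes no swaps and the loop exits immediately.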
Selection Sort Visualizer
Find the minimum element in the unsorted portion, swap it with the first unsorted element, and repeat. Always O(n²) regardless of input. Not stable. Its only advantage is minimal swaps (exactly n-1), which matters when writes are expensive.
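A minimal Python sketch of the find-minimum-and-swap loop:

```python
def selection_sort(arr):
    """Selection sort: at most n-1 swaps, always O(n^2) comparisons."""
    a = list(arr)
    n = len(a)
    for i in range(n - 1):
        # Scan the unsorted portion a[i:] for its minimum.
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        if min_idx != i:  # swap only when needed
            a[i], a[min_idx] = a[min_idx], a[i]
    return a
```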
Insertion Sort Visualizer
Build the sorted portion one element at a time. Take the next unsorted element and insert it into its correct position by shifting larger elements right. Excellent for small or nearly sorted arrays. O(n) best case. Used as the base case in hybrid sorts like Timsort.
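The shift-and-insert step looks like this in Python (an illustrative sketch):

```python
def insertion_sort(arr):
    """Insertion sort: grow a sorted prefix one element at a time."""
    a = list(arr)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key  # drop key into the gap
    return a
```

On nearly sorted input the inner `while` loop rarely runs, which is why the best case is O(n).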
Merge Sort Visualizer
Divide the array in half, recursively sort each half, then merge the two sorted halves. The merge step walks through both halves simultaneously, picking the smaller element each time. Guaranteed O(n log n) in all cases. Stable. Requires O(n) extra space for the merge.
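The divide-and-merge structure can be sketched as follows (simplified for clarity; the visualizer's implementation may differ):

```python
def merge_sort(arr):
    """Top-down merge sort: guaranteed O(n log n), stable, O(n) extra space."""
    if len(arr) <= 1:
        return list(arr)
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge: walk both sorted halves, taking the smaller element each time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stability)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Note the `<=` in the merge: taking from the left half on ties is exactly what makes the sort stable.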
Quick Sort Visualizer
Pick a pivot element, partition the array so all elements less than the pivot come before it and all greater elements come after. Recursively sort the two partitions. O(n log n) average, but O(n²) worst case when the pivot is consistently the min or max. In practice, the fastest general-purpose sort due to cache efficiency.
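A sketch of in-place quick sort with a randomized pivot and Lomuto partitioning (one of several common partition schemes):

```python
import random

def quick_sort(a, lo=0, hi=None):
    """In-place quick sort with a random pivot (Lomuto partition)."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        # Random pivot guards against the O(n^2) sorted-input worst case.
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]
        pivot = a[hi]
        i = lo
        # Partition: move everything smaller than the pivot to the left.
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]  # pivot lands in its final position
        quick_sort(a, lo, i - 1)
        quick_sort(a, i + 1, hi)
    return a
```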
Counting Sort Visualizer
Count the occurrences of each value, compute cumulative counts, then place elements directly into their sorted positions. O(n + k) where k is the range of values. Not comparison-based. Only works for discrete values with a bounded range. Stable when implemented correctly.
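The count, prefix-sum, and placement phases map directly to code. A stable sketch for integers in a known range [0, k):

```python
def counting_sort(arr, k):
    """Stable counting sort for integers in [0, k). O(n + k) time, O(k) counts."""
    count = [0] * k
    for v in arr:
        count[v] += 1
    # Prefix sums: count[v] becomes the index just past the last slot for v.
    for v in range(1, k):
        count[v] += count[v - 1]
    out = [0] * len(arr)
    # Walk the input backwards so equal values keep their relative order.
    for v in reversed(arr):
        count[v] -= 1
        out[count[v]] = v
    return out
```

The backwards placement pass is the detail that makes it stable, which matters when counting sort is used as the per-digit pass inside radix sort.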
Radix Sort Visualizer
Sort by each digit, starting from the least significant digit (LSD). Each digit pass uses a stable sort (typically counting sort). After processing all digits, the array is fully sorted. O(d·n) where d is the number of digits. Beats comparison sorts when d is small relative to log n.
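An LSD radix sort sketch for non-negative integers, using stable base-10 bucketing for each digit pass (a production version would typically use counting sort per digit):

```python
def radix_sort(arr):
    """LSD radix sort for non-negative integers, base 10. O(d * n) time."""
    a = list(arr)
    if not a:
        return a
    exp = 1
    while max(a) // exp > 0:
        # Stable per-digit pass: bucket by the current digit, then concatenate.
        buckets = [[] for _ in range(10)]
        for v in a:
            buckets[(v // exp) % 10].append(v)
        a = [v for b in buckets for v in b]
        exp *= 10
    return a
```

Because each pass is stable, the ordering established by earlier (less significant) digits survives later passes, which is why the array ends up fully sorted.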
When to Use Which Algorithm
- Small arrays (<20 elements) — Insertion sort. Low overhead, fast for tiny inputs.
- Nearly sorted data — Insertion sort. O(n) when the array is almost sorted.
- General purpose — Quick sort or merge sort. Most standard library sorts use a hybrid of these.
- Guaranteed O(n log n) — Merge sort. No worst-case degradation.
- Memory constrained — Quick sort (O(log n) stack space) or heap sort (O(1) extra space). Both sort in place.
- Integers in a small range — Counting sort. O(n + k), beats all comparison sorts.
- Large integers or strings — Radix sort. O(d·n) where d is digit/character count.
- Stability required — Merge sort, insertion sort, counting sort, or radix sort.
Sorting in Coding Interviews
Sorting comes up directly and indirectly in interviews:
- Implement merge sort / quick sort — The most common direct sorting question. You need to write it from scratch and explain the time/space complexity.
- Sort an array of 0s, 1s, and 2s — Dutch National Flag problem. Three-pointer approach, single pass.
- Find the kth largest element — Quick select (partition-based), or use a min-heap of size k.
- Merge k sorted arrays — Min-heap of size k, or divide-and-conquer merge.
- Sort a nearly sorted array — Insertion sort or min-heap of size k (where each element is at most k positions from its sorted position).
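As one worked example from the list above, the Dutch National Flag problem (sorting 0s, 1s, and 2s) in a single pass with three pointers:

```python
def sort_colors(a):
    """Dutch National Flag: single-pass, three-pointer partition of 0s/1s/2s."""
    lo, mid, hi = 0, 0, len(a) - 1
    while mid <= hi:
        if a[mid] == 0:          # 0s go to the front
            a[lo], a[mid] = a[mid], a[lo]
            lo += 1
            mid += 1
        elif a[mid] == 1:        # 1s stay in the middle
            mid += 1
        else:                    # 2s go to the back; re-examine the swapped-in value
            a[mid], a[hi] = a[hi], a[mid]
            hi -= 1
    return a
```

Note that `mid` does not advance after swapping with `hi`, because the element swapped in from the back has not been examined yet.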
Frequently Asked Questions
What is the fastest sorting algorithm?
It depends on the data. For general-purpose comparison-based sorting, merge sort and quick sort are both O(n log n) average case. Quick sort is typically faster in practice due to cache locality, but merge sort is stable and guarantees O(n log n) worst case. For integer data with a known range, counting sort and radix sort achieve O(n) time.
What is the difference between stable and unstable sorting?
A stable sort preserves the relative order of elements with equal keys. Merge sort, insertion sort, counting sort, and radix sort are stable. Quick sort, selection sort, and heap sort are unstable. Stability matters when sorting by multiple criteria or when the original order of equal elements carries meaning.
Why is quick sort faster than merge sort in practice?
Quick sort operates in-place (O(log n) extra space vs O(n) for merge sort) and has better cache locality since it accesses elements sequentially in the same array. Merge sort requires copying data to auxiliary arrays during merges, which causes more cache misses. However, quick sort's worst case is O(n²) when the pivot is consistently bad, which randomized quick sort mitigates.
When should I use counting sort or radix sort?
Use counting sort when sorting integers within a small, known range (e.g., ages 0-120, ASCII characters). Use radix sort when sorting integers or fixed-length strings where the number of digits/characters is small relative to n. Both achieve O(n) time but require non-comparison-based approaches and extra space.
Which sorting algorithms are asked about in coding interviews?
Merge sort and quick sort are the most common. Interviewers expect you to implement them from scratch, explain the partition/merge logic, and analyze time and space complexity. Understanding when to use counting sort (for bounded integers) or insertion sort (for nearly sorted data) also comes up. Knowing the stability and in-place properties of each algorithm is important.
How does bubble sort work?
Bubble sort repeatedly walks through the array comparing adjacent elements and swapping them if they are in the wrong order. Each pass bubbles the largest unsorted element to its final position at the end. It is O(n²) average and worst case, but O(n) best case on an already sorted array with early termination. It is stable and in-place.
How does merge sort work?
Merge sort divides the array in half recursively until each subarray has one element, then merges pairs of sorted subarrays back together. The merge step walks through both halves simultaneously, picking the smaller element each time. It guarantees O(n log n) in all cases, is stable, but requires O(n) extra space for the merge.
How does quick sort work?
Quick sort picks a pivot element, then partitions the array so all elements less than the pivot come before it and all greater elements come after. It recursively sorts the two partitions. Average case is O(n log n), worst case O(n²) when the pivot is consistently the smallest or largest element. Randomized pivot selection makes the worst case extremely unlikely.
How does insertion sort work?
Insertion sort builds the sorted portion one element at a time. It takes the next unsorted element and shifts larger elements right to make room, then inserts the element in its correct position. It is O(n²) average and worst case, but O(n) on nearly sorted data. It is stable, in-place, and used as the base case in hybrid sorts like Timsort.