What is a Recursion Tree?
A recursion tree diagrams every recursive call an algorithm makes. Each node represents one call, and edges connect a call to the sub-calls it spawns. By summing the work at each level of the tree, you can derive the total time complexity.
For divide-and-conquer algorithms like merge sort, the tree is balanced: each level does O(n) total work across all calls, and there are log n levels, giving O(n log n). For naive Fibonacci, the tree is exponential: the number of nodes roughly doubles at each level, giving O(2^n).
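To make the exponential blow-up concrete, here is a minimal sketch in Python that counts every node in the naive Fibonacci recursion tree (the `call_count` counter is added purely for instrumentation; it is not part of the algorithm):

```python
call_count = 0

def fib(n):
    """Naive Fibonacci: every call becomes one node in the recursion tree."""
    global call_count
    call_count += 1
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(10)
print(call_count)  # 177 nodes in the tree for fib(10)
```

Even at n = 10 the tree already has 177 nodes; the count roughly multiplies by the golden ratio with each increment of n.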
Recursion DAG and Memoization
A recursion DAG merges duplicate calls into shared nodes. This is exactly what memoization does: when you cache results, duplicate calls become O(1) lookups instead of full recomputations.
The difference is dramatic. Fibonacci's recursion tree has O(2^n) nodes, but the DAG has only O(n) unique calls. This is why memoized Fibonacci runs in O(n) time instead of O(2^n). Visualizing both side-by-side makes this intuition concrete.
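The same side-by-side comparison can be run in code. A sketch using Python's standard `functools.lru_cache`, whose `cache_info()` reports how many calls were actually computed (misses) versus served from the cache (hits):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Memoized Fibonacci: duplicate calls hit the cache,
    so each DAG node is computed exactly once."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(10)
print(fib.cache_info().misses)  # 11 unique computations: fib(0)..fib(10)
```

The 177-node tree collapses to 11 unique computations, one per DAG node.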
The Master Theorem
The Master Theorem provides a direct formula for recurrences of the form T(n) = a·T(n/b) + O(n^d):
| Condition | Result | Intuition |
|---|---|---|
| log_b(a) > d | O(n^(log_b(a))) | Recursion dominates: leaves do the most work |
| log_b(a) = d | O(n^d · log n) | Balanced: equal work at every level |
| log_b(a) < d | O(n^d) | Root dominates: top-level work dwarfs recursion |
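The three-way comparison above is mechanical enough to automate. A sketch of a classifier (the function name and epsilon tolerance are choices made here, not standard API):

```python
import math

def master_theorem(a, b, d, eps=1e-9):
    """Classify T(n) = a*T(n/b) + O(n^d) by comparing log_b(a) with d.
    Returns a Big-O string per the three-case table."""
    c = math.log(a, b)       # critical exponent log_b(a)
    if c > d + eps:          # Case 1: leaves dominate
        return f"O(n^{c:.3f})"
    if abs(c - d) <= eps:    # Case 2: balanced work at every level
        return f"O(n^{d} log n)"
    return f"O(n^{d})"       # Case 3: root dominates

print(master_theorem(2, 2, 1))  # merge sort -> O(n^1 log n)
print(master_theorem(3, 2, 1))  # Karatsuba  -> O(n^1.585)
```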
Common Recurrences
| Algorithm | Recurrence | Case | Result |
|---|---|---|---|
| Binary Search | T(n) = T(n/2) + O(1) | Case 2 | O(log n) |
| Merge Sort | T(n) = 2T(n/2) + O(n) | Case 2 | O(n log n) |
| Karatsuba | T(n) = 3T(n/2) + O(n) | Case 1 | O(n^1.585) |
| Strassen | T(n) = 7T(n/2) + O(n²) | Case 1 | O(n^2.807) |
| Naive Fibonacci | T(n) = T(n-1) + T(n-2) | N/A | O(2^n) |
| Memoized Fibonacci | Each call O(1) | N/A | O(n) |
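The merge sort row can be checked empirically. Below is a sketch of merge sort instrumented with a depth counter (the `max_depth` mutable-list parameter is an instrumentation hack added here, not part of the algorithm) to confirm the tree has about log₂ n levels:

```python
def merge_sort(arr, depth=0, max_depth=[0]):
    """Merge sort, recording the deepest recursion level reached."""
    max_depth[0] = max(max_depth[0], depth)
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid], depth + 1, max_depth)
    right = merge_sort(arr[mid:], depth + 1, max_depth)
    # Merge step: O(n) total work at each level of the tree.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

depth_seen = [0]
print(merge_sort([5, 3, 8, 1, 9, 2, 7, 4], max_depth=depth_seen))
print(depth_seen[0])  # 3 levels of splitting for n = 8, i.e. log2(8)
```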
Recursion in Coding Interviews
- Draw the recursion tree — Interviewers love when you sketch the tree to analyze complexity. It shows you understand the algorithm deeply, not just the code.
- Identify overlapping subproblems — If the recursion tree has duplicate nodes, memoization or DP will help. The DAG view makes this obvious.
- Apply the Master Theorem — For divide-and-conquer problems, state a, b, d and name the case. This is faster than drawing the full tree.
- Explain tree vs DAG tradeoff — Tree = naive recursion (exponential). DAG = memoized (polynomial). Being able to articulate this shows strong fundamentals.
- Optimize with bottom-up DP — Once you see the DAG, you can often replace recursion with iterative DP, eliminating call stack overhead entirely.
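The last tip can be sketched in a few lines: a bottom-up Fibonacci that walks the DAG from the base cases upward, keeping only the last two values:

```python
def fib_bottom_up(n):
    """Iterative DP over the Fibonacci DAG: O(n) time, O(1) space,
    and no recursion, so no call-stack depth limit."""
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr
    return curr

print(fib_bottom_up(10))  # 55
```

Compared to memoized recursion, this version also drops the cache: only the two most recent DAG nodes are ever needed.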
Frequently Asked Questions
What is a recursion tree?
A recursion tree is a visual representation of all recursive calls made by an algorithm. Each node represents a function call, and edges connect a call to its sub-calls. The tree shows the total work done at each level and helps analyze time complexity. For example, fib(5) creates a tree with nodes for fib(5), fib(4), fib(3), fib(2), fib(1), and fib(0), with many duplicate calls visible.
What is the difference between a recursion tree and a recursion DAG?
A recursion tree shows every call as a separate node, even duplicates. A recursion DAG (directed acyclic graph) merges duplicate calls into a single node with multiple parents. The DAG represents what happens with memoization: duplicate calls hit the cache instead of recomputing. For Fibonacci, the tree has O(2^n) nodes but the DAG has only O(n) nodes.
What is the Master Theorem?
The Master Theorem solves recurrences of the form T(n) = aT(n/b) + O(n^d), where a is the number of subproblems, b is the factor by which n shrinks, and d is the exponent of the non-recursive work. It has three cases: if log_b(a) > d, T(n) = O(n^log_b(a)); if log_b(a) = d, T(n) = O(n^d log n); if log_b(a) < d, T(n) = O(n^d).
How does the Master Theorem apply to merge sort?
Merge sort has the recurrence T(n) = 2T(n/2) + O(n), so a=2, b=2, d=1. log_2(2) = 1 = d, which is Case 2. Therefore T(n) = O(n log n). The recursion tree has log_2(n) levels, each doing O(n) total work, confirming the n log n result.
Why is the Fibonacci recursion tree exponential?
Each fib(n) call makes two recursive calls: fib(n-1) and fib(n-2). Without memoization, this creates a binary tree of depth n with roughly 2^n nodes. Many calls are duplicates (fib(3) is computed multiple times). With memoization (the DAG), each unique call is computed only once, reducing the time from O(2^n) to O(n).
How do you use recursion trees in coding interviews?
Draw the recursion tree to analyze time complexity of recursive algorithms. Count the nodes to find total work, or sum the work per level. For divide-and-conquer (merge sort, quick sort), the tree reveals the O(n log n) pattern. For dynamic programming (Fibonacci, coin change), the tree shows overlapping subproblems that memoization eliminates.