Recursion Tree, DAG & Master Theorem Visualizer

See recursion trees expand, watch memoization collapse them into DAGs, and calculate complexity with the Master Theorem.


What is a Recursion Tree?

A recursion tree diagrams every recursive call an algorithm makes. Each node represents one call, and edges connect a call to the sub-calls it spawns. By summing the work at each level of the tree, you can derive the total time complexity.

For divide-and-conquer algorithms like merge sort, the tree is balanced: each level does O(n) total work across all calls, and there are log n levels, giving O(n log n). For naive Fibonacci, the tree is exponential: the number of nodes roughly doubles at each level, giving O(2^n).
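The exponential blow-up is easy to verify directly. Here is a small sketch (not part of the visualizer; the helper name is illustrative) that counts how many calls naive Fibonacci makes:

```python
def fib_calls(n):
    """Naive Fibonacci, returning (fib(n), number of calls in the recursion tree)."""
    if n < 2:
        return n, 1                     # a leaf of the tree is one call
    a, ca = fib_calls(n - 1)
    b, cb = fib_calls(n - 2)
    return a + b, ca + cb + 1           # +1 counts this call itself

for n in (10, 20, 30):
    value, calls = fib_calls(n)
    print(f"fib({n}) = {value}, calls = {calls}")
```

Doubling n from 10 to 20 multiplies the call count by over a hundred, which is exactly the exponential tree growth the visualization shows.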

Recursion DAG and Memoization

A recursion DAG merges duplicate calls into shared nodes. This is exactly what memoization does: when you cache results, duplicate calls become O(1) lookups instead of full recomputations.

The difference is dramatic. Fibonacci's recursion tree has O(2^n) nodes, but the DAG has only O(n) unique calls. This is why memoized Fibonacci runs in O(n) time instead of O(2^n). Visualizing both side-by-side makes this intuition concrete.
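A minimal sketch of the collapse, using Python's `functools.lru_cache` as the memo table (the `computed` set is just instrumentation to count unique subproblems):

```python
from functools import lru_cache

computed = set()                 # records each unique subproblem actually evaluated

@lru_cache(maxsize=None)
def fib(n):
    computed.add(n)              # only runs on a cache miss
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))                   # 832040
print(len(computed))             # 31 unique calls (fib(0)..fib(30)), not ~2^30
```

Every duplicate call hits the cache, so the set of evaluated subproblems is exactly the O(n) node set of the DAG.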

The Master Theorem

The Master Theorem provides a direct formula for recurrences of the form T(n) = a·T(n/b) + O(n^d):

| Condition | Result | Intuition |
| --- | --- | --- |
| log_b(a) > d | O(n^(log_b(a))) | Recursion dominates: leaves do the most work |
| log_b(a) = d | O(n^d · log n) | Balanced: equal work at every level |
| log_b(a) < d | O(n^d) | Root dominates: top-level work dwarfs recursion |
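The case analysis is mechanical, so it can be sketched as a small helper. This function and its output format are illustrative, not part of the visualizer:

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) by the Master Theorem."""
    c = math.log(a, b)                  # critical exponent log_b(a)
    if math.isclose(c, d):
        return f"Case 2: O(n^{d} log n)" if d > 0 else "Case 2: O(log n)"
    if c > d:
        return f"Case 1: O(n^{c:.3f})"  # recursion dominates
    return f"Case 3: O(n^{d})"          # root dominates

print(master_theorem(2, 2, 1))          # merge sort    -> Case 2: O(n^1 log n)
print(master_theorem(3, 2, 1))          # Karatsuba     -> Case 1: O(n^1.585)
print(master_theorem(1, 2, 0))          # binary search -> Case 2: O(log n)
```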

Common Recurrences

| Algorithm | Recurrence | Case | Result |
| --- | --- | --- | --- |
| Binary Search | T(n) = T(n/2) + O(1) | Case 2 | O(log n) |
| Merge Sort | T(n) = 2T(n/2) + O(n) | Case 2 | O(n log n) |
| Karatsuba | T(n) = 3T(n/2) + O(n) | Case 1 | O(n^1.585) |
| Strassen | T(n) = 7T(n/2) + O(n²) | Case 1 | O(n^2.807) |
| Naive Fibonacci | T(n) = T(n-1) + T(n-2) | N/A | O(2^n) |
| Memoized Fibonacci | Each call O(1) | N/A | O(n) |

Recursion in Coding Interviews

  • Draw the recursion tree: Interviewers love when you sketch the tree to analyze complexity. It shows you understand the algorithm deeply, not just the code.
  • Identify overlapping subproblems: If the recursion tree has duplicate nodes, memoization or DP will help. The DAG view makes this obvious.
  • Apply the Master Theorem: For divide-and-conquer problems, state a, b, and d, then name the case. This is faster than drawing the full tree.
  • Explain the tree vs. DAG tradeoff: Tree = naive recursion (exponential). DAG = memoized (polynomial). Being able to articulate this shows strong fundamentals.
  • Optimize with bottom-up DP: Once you see the DAG, you can often replace recursion with iterative DP, eliminating call-stack overhead entirely.
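The last point can be sketched in a few lines. A hypothetical bottom-up Fibonacci walks the DAG in dependency order with a plain loop, so there is no recursion and no call stack to overflow:

```python
def fib_bottom_up(n):
    """Iterative DP: process DAG nodes fib(0), fib(1), ..., fib(n) in order."""
    if n < 2:
        return n
    prev, curr = 0, 1                # fib(0), fib(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(fib_bottom_up(50))             # 12586269025
```

Note it only keeps the last two DAG nodes, so space drops from O(n) (memo table) to O(1).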

Frequently Asked Questions

What is a recursion tree?

A recursion tree is a visual representation of all recursive calls made by an algorithm. Each node represents a function call, and edges connect a call to its sub-calls. The tree shows the total work done at each level and helps analyze time complexity. For example, fib(5) creates a tree with nodes for fib(5), fib(4), fib(3), fib(2), fib(1), and fib(0), with many duplicate calls visible.

What is the difference between a recursion tree and a recursion DAG?

A recursion tree shows every call as a separate node, even duplicates. A recursion DAG (directed acyclic graph) merges duplicate calls into a single node with multiple parents. The DAG represents what happens with memoization: duplicate calls hit the cache instead of recomputing. For Fibonacci, the tree has O(2^n) nodes but the DAG has only O(n) nodes.

What is the Master Theorem?

The Master Theorem solves recurrences of the form T(n) = aT(n/b) + O(n^d), where a is the number of subproblems, b is the factor by which n shrinks, and d is the exponent of the non-recursive work. It has three cases: if log_b(a) > d, T(n) = O(n^log_b(a)); if log_b(a) = d, T(n) = O(n^d log n); if log_b(a) < d, T(n) = O(n^d).

How does the Master Theorem apply to merge sort?

Merge sort has the recurrence T(n) = 2T(n/2) + O(n), so a=2, b=2, d=1. log_2(2) = 1 = d, which is Case 2. Therefore T(n) = O(n log n). The recursion tree has log_2(n) levels, each doing O(n) total work, confirming the n log n result.
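This level-by-level accounting can be checked by instrumenting a merge sort sketch to tally the elements merged at each recursion depth (the `work` dictionary is illustrative instrumentation, not part of the algorithm):

```python
from collections import defaultdict

work = defaultdict(int)              # recursion depth -> elements merged there

def merge_sort(xs, depth=0):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid], depth + 1)
    right = merge_sort(xs[mid:], depth + 1)
    work[depth] += len(xs)           # merging costs O(len(xs)) at this level
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

result = merge_sort(list(range(16, 0, -1)))
print(result == sorted(result))      # True
print(dict(work))                    # 16 elements merged at each of depths 0..3
```

For n = 16, every level merges all 16 elements and there are log_2(16) = 4 merging levels, matching the O(n log n) analysis.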

Why is the Fibonacci recursion tree exponential?

Each fib(n) call makes two recursive calls: fib(n-1) and fib(n-2). Without memoization, this creates a binary tree of depth n with roughly 2^n nodes. Many calls are duplicates (fib(3) is computed multiple times). With memoization (the DAG), each unique call is computed only once, reducing the time from O(2^n) to O(n).

How do you use recursion trees in coding interviews?

Draw the recursion tree to analyze time complexity of recursive algorithms. Count the nodes to find total work, or sum the work per level. For divide-and-conquer (merge sort, quick sort), the tree reveals the O(n log n) pattern. For dynamic programming (Fibonacci, coin change), the tree shows overlapping subproblems that memoization eliminates.


Company Interview Questions

See which problems Google, Meta, Amazon, and 200+ companies actually ask in real interviews.

Browse questions

Can you analyze a recursive algorithm's complexity in an interview?

Practice drawing the recursion tree, identifying overlapping subproblems, and applying the Master Theorem under pressure with an AI interviewer.

Try a free mock interview