The technique that solves problems by making them smaller — until they are trivially solved. Essential for trees, graphs, and divide-and-conquer. When Facebook ran a friend query without a depth limit, the database ran for 4 minutes.
Write recursive functions with correct base case and recursive case. Know that recursion uses stack space. Recognise when naive Fibonacci is O(2^n) and how to fix it.
Write recursive tree and graph traversals. Know when to add memoization. Know the depth limits of common runtimes and when to convert to iteration.
Design recursive algorithms with explicit depth limits for production use. Know tail-call optimisation. Recognise the memoization pattern and explain it as the foundation of dynamic programming.
"People You May Know" query runs for a user with 500 friends and a large social graph
Query has expanded recursively to millions of friend lookups — database CPU climbs to 100%
WARNING: Notifications service shows degraded health — writes are timing out behind the stalled database
CRITICAL: Engineer manually kills the query. Database CPU drops immediately.
WARNING: Fix deployed: recursion depth capped at 2 (friends-of-friends only). 125M lookups become at most 250,500.
The question this raises
If removing a depth limit from one recursive query can generate 125 million database calls — what is the rule that every recursive function must follow to terminate safely?
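The rule is: every recursive call must move toward a base case, and in production you often enforce an explicit one as well. Here is a minimal sketch of a depth-capped graph walk. The names (`collectFriends`, the in-memory `graph` object) are hypothetical; the real incident happened inside a database query, not application code.

```javascript
// Hypothetical sketch: walk a friend graph, but never deeper than maxDepth.
// `graph` maps each user id to an array of friend ids.
function collectFriends(graph, userId, maxDepth, depth = 0, seen = new Set()) {
  if (depth >= maxDepth) return seen; // explicit depth limit: the safety valve
  for (const friend of graph[userId] || []) {
    if (!seen.has(friend)) {
      seen.add(friend);
      collectFriends(graph, friend, maxDepth, depth + 1, seen);
    }
  }
  // Note: back-edges mean the starting user can appear in the result too.
  return seen;
}

const graph = { a: ["b", "c"], b: ["a", "d"], c: ["a"], d: ["b", "e"], e: ["d"] };
collectFriends(graph, "a", 2); // friends and friends-of-friends only — "e" is never reached
```

Without the `depth >= maxDepth` check, the `seen` set still terminates the walk on a finite graph, but the number of lookups can explode combinatorially on a large one; the cap makes the worst case predictable.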
A recursive function calls itself — each time with a slightly smaller input. When does the computer know to stop calling?
Lesson outline
The house-cleaning rule
Your cleaning rule: "Clean this room. If there are more rooms, apply the cleaning rule to the rest." You do not need to know how many rooms there are. You just clean one, then apply the same rule to whatever is left. When there are no rooms left (base case: "if no rooms, stop"), you are done. Recursion works identically: a function that handles one piece and delegates the rest to itself, until there is nothing left to delegate.
Recursive function structure
```javascript
function solve(n) {
  if (n <= 0) return base;        // base case
  return combine(solve(n - 1));   // recursive case
}
```

Use for: reach for this when the problem is naturally self-similar (tree traversal, divide-and-conquer, combinatorics) and the depth is bounded.
The key insight: trust the smaller problem. When writing a recursive function, you do not need to think about HOW the recursive call works. You only need to know WHAT it returns. If you are writing factorial(n), trust that factorial(n-1) returns the correct answer for (n-1). Your only job is: "given that factorial(n-1) is correct, how do I compute factorial(n)?" The answer is n * factorial(n-1).
How this concept changes your thinking
Computing factorial(4)
Before: “I would trace through all the calls: factorial(4) calls factorial(3) which calls factorial(2)... I get lost trying to track the whole chain at once.”
After: “I trust factorial(3) returns 6. My job is just: 4 * 6 = 24. I do not need to trace how factorial(3) works. That is the contract: factorial(n) = n * factorial(n-1).”
Traversing every node in a binary tree
Before: “I would write a complex loop keeping track of which nodes I have visited and which subtrees I still need to process.”
After: “A tree is a root plus two subtrees. Each subtree is also a tree. The recursive function: process(node) = process left subtree, print node, process right subtree. Stop when node is null. The recursion handles the traversal automatically.”
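The traversal described above can be sketched directly. This is a minimal in-order walk, with `null` as the base case; the node shape (`value`, `left`, `right`) is an assumption:

```javascript
// In-order traversal: left subtree, then node, then right subtree.
function inorder(node, out = []) {
  if (node === null) return out;  // base case: an empty subtree
  inorder(node.left, out);        // recursive case: the left subtree is a tree too
  out.push(node.value);
  inorder(node.right, out);
  return out;
}

// A small tree:    2
//                 / \
//                1   3
const tree = {
  value: 2,
  left:  { value: 1, left: null, right: null },
  right: { value: 3, left: null, right: null },
};
inorder(tree); // → [1, 2, 3]
```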
Recursive function with no base case
Before: “It looks like it should work — it calls itself with a smaller value each time.”
After: “Without a base case, it never stops. The call stack grows until Node.js throws "Maximum call stack size exceeded." The base case is not optional — it is what terminates the recursion.”
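A quick demonstration of the difference, runnable in Node.js (the countdown functions are illustrative, not from the lesson):

```javascript
// With a base case: terminates cleanly.
function countdownSafe(n) {
  if (n <= 0) return "done";     // base case stops the recursion
  return countdownSafe(n - 1);
}

// Without a base case: n just keeps going negative, forever.
function countdownBroken(n) {
  return countdownBroken(n - 1);
}

countdownSafe(1000);             // returns "done"
try {
  countdownBroken(1000);
} catch (e) {
  // In Node.js: RangeError: Maximum call stack size exceeded
}
```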
Computing factorial(4) — tracing the call stack
1. We call factorial(4). The function asks: "what is 4 × factorial(3)?" It does not know yet — it must wait for factorial(3) to return. A stack frame for factorial(4) is pushed.
2. factorial(3) is called. It asks: "what is 3 × factorial(2)?" It waits. Stack frame pushed.
3. factorial(2) asks: "what is 2 × factorial(1)?" It waits. Stack frame pushed.
4. factorial(1) asks: "what is 1 × factorial(0)?" It waits. Stack frame pushed.
5. Base case reached: factorial(0) returns 1 immediately — no recursive call. Stack frame popped. The call stack starts unwinding.
6. factorial(1) receives 1, computes 1 × 1 = 1, returns 1. Stack frame popped.
7. factorial(2) receives 1, computes 2 × 1 = 2, returns 2. Stack frame popped.
8. factorial(3) receives 2, computes 3 × 2 = 6, returns 6. Stack frame popped.
9. factorial(4) receives 6, computes 4 × 6 = 24, returns 24. Final answer. The call stack is empty again.
Factorial — the simplest recursive function that shows all three key parts:
```javascript
function factorial(n) {
  // Base case: trivially solved. This line is what stops the recursion —
  // without it, factorial calls itself forever until stack overflow.
  // n <= 0 (not just n === 0) handles negative inputs safely; they would
  // recurse forever otherwise.
  if (n <= 0) return 1;

  // Recursive case: current * smaller problem. Trust that factorial(n - 1)
  // returns the correct answer — your only job is n * that answer.
  // n - 1 makes the problem smaller each call, guaranteeing we eventually
  // hit the base case.
  return n * factorial(n - 1);
}

// factorial(4)
// = 4 * factorial(3)
// = 4 * 3 * factorial(2)
// = 4 * 3 * 2 * factorial(1)
// = 4 * 3 * 2 * 1 * factorial(0)
// = 4 * 3 * 2 * 1 * 1
// = 24
```
For Fibonacci, the recursive definition looks elegant. The naive implementation is O(2^n) — it recalculates the same values billions of times.
O(2^n) naive Fibonacci vs O(n) memoised

```javascript
// Naive — O(2^n), hopeless well before n = 50
function fib(n) {
  if (n <= 1) return n;
  return fib(n - 1) + fib(n - 2);
}
// fib(40) takes ~2 seconds
// fib(50) makes roughly 40 billion calls — it does not crash, it just
// never finishes in any reasonable time
```

```javascript
// Memoised — O(n) time and space
function fib(n, memo = {}) {
  if (n <= 1) return n;
  if (n in memo) return memo[n];
  memo[n] = fib(n - 1, memo) + fib(n - 2, memo);
  return memo[n];
}
// fib(1000) returns instantly
```

The naive version recalculates fib(n-2) when computing fib(n), then recalculates it again inside fib(n-1), and again, and again. The call count grows as O(2^n); at n = 50 it is already in the tens of billions. The memoised version stores each result the first time it is computed; every subsequent call for the same n returns instantly from the cache. O(n) calls instead of O(2^n). This pattern — cache recursive results — is the gateway to dynamic programming.
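The same caching idea can also be written bottom-up ("tabulation"), which is the usual next step into dynamic programming. A minimal sketch — no recursion, so no stack depth to worry about, still O(n) time:

```javascript
// Bottom-up Fibonacci: build from fib(0) and fib(1) upward,
// keeping only the last two values.
function fibTab(n) {
  if (n <= 1) return n;
  let prev = 0, curr = 1;        // fib(0), fib(1)
  for (let i = 2; i <= n; i++) {
    const next = prev + curr;
    prev = curr;
    curr = next;
  }
  return curr;
}

fibTab(10); // → 55
```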
Is recursion the right tool?
| Recursive function | What happens | Speed |
|---|---|---|
| Factorial of n | n recursive calls, each O(1) | Fast (O(n)) |
| Binary search on n items | Halves problem each call — log₂(n) calls | Very fast (O(log n)) |
| Tree traversal (n nodes) | Visits each node once | Fast (O(n)) |
| Naive Fibonacci of n | Recalculates the same subproblems exponentially | Catastrophic (O(2^n)) |
| Memoised Fibonacci of n | Each subproblem computed once and cached | Fast (O(n)) |
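The binary-search row from the table can be written recursively too. Each call halves the range, so the recursion depth stays at log₂(n) — safe even for huge arrays. A minimal sketch:

```javascript
// Recursive binary search over a sorted array.
function binarySearch(arr, target, lo = 0, hi = arr.length - 1) {
  if (lo > hi) return -1;                 // base case: empty range, not found
  const mid = Math.floor((lo + hi) / 2);
  if (arr[mid] === target) return mid;
  if (arr[mid] < target) {
    return binarySearch(arr, target, mid + 1, hi); // search upper half
  }
  return binarySearch(arr, target, lo, mid - 1);   // search lower half
}

binarySearch([1, 3, 5, 7, 9, 11], 7); // → 3
```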
Recursion vs iteration
📖 What the exam expects
Recursion uses the call stack implicitly. Iteration uses explicit loops. Recursion is natural for problems with recursive structure (trees, graphs). Both are equally powerful — any recursive function can be written iteratively and vice versa.
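As a concrete instance of that equivalence, an in-order tree traversal converts to iteration by managing the stack explicitly — you do by hand what the runtime does for the recursive version. A sketch with a hypothetical small tree:

```javascript
// Iterative in-order traversal using an explicit stack.
function inorderIterative(root) {
  const out = [];
  const stack = [];
  let node = root;
  while (node !== null || stack.length > 0) {
    while (node !== null) {   // push the entire left spine
      stack.push(node);
      node = node.left;
    }
    node = stack.pop();       // deepest unvisited node
    out.push(node.value);
    node = node.right;        // then traverse its right subtree
  }
  return out;
}

const smallTree = {
  value: 2,
  left:  { value: 1, left: null, right: null },
  right: { value: 3, left: null, right: null },
};
inorderIterative(smallTree); // → [1, 2, 3]
```

The explicit stack removes the depth limit of the call stack, which is why the iterative form is preferred when the tree can be very deep.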
Recursion appears in: tree traversal (always), DFS (always), divide-and-conquer (merge sort, binary search), generating permutations and combinations, and "flatten this nested structure" problems. The interviewer is checking: do you know the base case, the recursive case, and when NOT to use recursion.
Common questions:
Strong answer: states the base case before the recursive case / mentions stack depth in the complexity analysis / offers the iterative version when depth could be large / can draw the call stack for factorial(3) from memory
Red flags: writes recursion without a base case / cannot trace the call stack for a small example / uses recursion for a problem where depth could be 10,000+ (stack overflow risk) / cannot explain what happens to the call stack when the function returns
💡 Analogy
You are cleaning a messy house. Your rule: clean this room, then apply the same rule to the rest of the house. Each time you apply the rule, there is one fewer room. Eventually there are zero rooms left — that is the base case: "if no rooms remain, stop." Recursion is the same: a function that solves a smaller version of the same problem until the problem is so small the answer is obvious (the base case). The function does not need to know the total size — it only needs to know when it is done.
⚡ Core Idea
A recursive function is a function that calls itself with a simpler version of the problem. The simplification must guarantee convergence: each call must move closer to the base case. Factorial(n) = n * factorial(n-1) converges because n-1 is smaller than n, and factorial(0) = 1 is the base case. Without the base case or the simplification, the recursion is infinite.
🎯 Why It Matters
Most tree and graph algorithms are naturally recursive because trees are recursive structures: a tree is a root node plus subtrees, and each subtree is itself a tree. Writing these algorithms iteratively requires explicitly managing a stack — which is exactly what the runtime does for you with recursion. Understanding this lets you know WHEN recursion is safe and when to convert it.