Recursion
The call stack as a machine — push, pop, repeat.
A function that calls itself. That's it. The trick is that the language runtime bookkeeps the calls for you on a stack, so "solve this smaller version and I'll glue the answers together" becomes a one-line pattern. Master the pattern and an intimidating class of problems — tree traversal, parsing, filesystem walks, divide-and-conquer — collapses to a few lines of code.
Two pieces, always
Every recursive function has exactly two things:
- A base case — the small input where the answer is obvious and you return immediately, no further recursion.
- A recursive case — where you break the input into one-or-more smaller pieces, call yourself on each, and combine the results.
If you forget the base case, you recurse forever and the program explodes the moment the stack fills up. If you forget to shrink the input on the recursive step, same thing. The entire discipline of writing recursion is: make sure the recursive case reduces toward the base case, and make sure the base case actually handles what you reduce to.
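The two pieces are easiest to see in a tiny example. Here's a sketch of a recursive array sum (a hypothetical helper, chosen only because both pieces fit on one line each):

```javascript
// Sum an array recursively: one base case, one recursive case, nothing else.
function sum(xs) {
  if (xs.length === 0) return 0;     // base case: empty array sums to 0
  return xs[0] + sum(xs.slice(1));   // recursive case: head + sum of a smaller array
}
```

Note how the recursive case always shrinks the input (`xs.slice(1)` is one element shorter), so every call marches toward the base case.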
The call stack as a machine
When a function calls itself, the runtime pushes a stack frame — a little package holding the function's arguments, local variables, and the return address. When the function returns, the frame pops. Recursion is just "push, call, eventually pop, combine". Nothing mysterious; it's the same stack every non-recursive function call uses, it just happens to be the same function on every frame.
That's the entire intuition for the call-stack view in the playground on the next tab. Press Step once — one frame pushes. Step again — another frame pushes. The stack grows vertically until a base case fires; then each level returns its answer to the level below, and the stack drains.
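You can watch the same push/pop rhythm outside the playground by threading an explicit depth counter through a recursive call and recording each push and pop (a sketch; `trace` and `countdown` are illustrative names):

```javascript
// Record the push/pop order of frames, indented by stack depth.
const trace = [];
function countdown(n, depth = 0) {
  trace.push('  '.repeat(depth) + `push ${n}`);   // frame pushed
  if (n > 0) countdown(n - 1, depth + 1);         // go one frame deeper
  trace.push('  '.repeat(depth) + `pop ${n}`);    // frame popped
}
countdown(2);
// trace reads: push 2 / push 1 / push 0 / pop 0 / pop 1 / pop 2 —
// the stack grows to the base case, then drains in reverse order.
```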
Fibonacci as the canonical example
The naive recursive Fibonacci is the "Hello, World" of recursion:
function fib(n) {
if (n <= 0) return 0; // base case
if (n === 1) return 1; // base case
return fib(n - 1) + fib(n - 2); // recursive case
}
Read it as English: "fib(0) and fib(1) are zero and one. For anything else, compute the two smaller fibs and add them." It's almost verbatim the mathematical definition.
The call tree for fib(5) is where the beauty breaks. It branches — fib(5) calls fib(4) and fib(3), fib(4) calls fib(3) and fib(2), and so on. The subtrees overlap massively: fib(3) shows up twice, fib(2) three times, fib(1) five times. This is O(φⁿ) work — exponential — for a problem that has an O(n) iterative solution.
But the stack depth is not exponential. At any moment, the runtime only has one active path from root to leaf in memory: it fully evaluates the left child before starting the right. So the peak stack depth for fib(n) is exactly n — the depth of the longest left-spine chain. That's the number the daily challenge on this level asks you to compute.
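You can verify the linear-depth claim by instrumenting the naive fib with a depth counter (a sketch, not the playground's own code; `depth` and `peak` are illustrative names):

```javascript
// Naive fib, instrumented to record the peak stack depth.
let depth = 0, peak = 0;
function fib(n) {
  depth++;
  peak = Math.max(peak, depth);            // record the deepest point reached
  let result;
  if (n <= 0) result = 0;
  else if (n === 1) result = 1;
  else result = fib(n - 1) + fib(n - 2);
  depth--;                                 // this frame pops
  return result;
}
fib(10);
// peak === 10: exponential total work, but only one root-to-leaf path alive at a time
```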
Towers of Hanoi — move n-1, move 1, move n-1
Hanoi is the other classic recursion demo, and its recursive structure is so clean it reads like a proof:
To move n disks from peg A to peg C using peg B:
- move the top n-1 disks from A to B (recursively, using C as spare),
- move the single bottom disk from A to C,
- move the n-1 disks from B to C (recursively, using A as spare).
That's the whole algorithm. The number of single-disk moves is exactly 2ⁿ - 1, but — like fib — the stack depth is only n. One deep chain at a time.
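The three steps transcribe directly into code. A sketch, with pegs as plain strings and the move list returned so you can count it (`hanoi` and the `moves` array are illustrative, not part of the playground):

```javascript
// Towers of Hanoi: move n-1 aside, move 1, move n-1 back on top.
function hanoi(n, from, to, spare, moves = []) {
  if (n === 0) return moves;               // base case: nothing to move
  hanoi(n - 1, from, spare, to, moves);    // top n-1 disks out of the way
  moves.push([from, to]);                  // the single bottom disk
  hanoi(n - 1, spare, to, from, moves);    // n-1 disks back on top
  return moves;
}
hanoi(3, 'A', 'C', 'B').length; // 7 moves, i.e. 2^3 - 1
```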
When to recurse, when to iterate
Recursion shines when the problem itself is recursive — trees, divide-and-conquer, grammar parsing, filesystem walks. The code ends up mirroring the structure of the data. Iteration wins when the problem is sequential — summing an array, scanning a string, advancing a pointer. A loop is usually clearer and always cheaper (no frame-push overhead, no stack-depth risk). Ask "is the input tree-shaped or list-shaped?" — if tree-shaped, recurse; if list-shaped, loop.
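The shape test side by side, as a sketch (hypothetical node objects with a `children` array stand in for a real tree):

```javascript
// List-shaped: a loop is clearer and cheaper.
function sumList(xs) {
  let total = 0;
  for (const x of xs) total += x;
  return total;
}

// Tree-shaped: the recursion mirrors the data.
function countNodes(node) {
  let count = 1;                                           // this node
  for (const child of node.children) count += countNodes(child);
  return count;
}

const tree = { children: [{ children: [] }, { children: [{ children: [] }] }] };
```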
Tail recursion (and why JS won't save you)
A tail-recursive call is one where the recursive call is the very last thing the function does — nothing else to combine, no pending arithmetic. In languages that perform tail-call optimisation (TCO) — Scheme, Scala, OCaml, Erlang, Lua, a tiny corner of Kotlin — the runtime can reuse the current stack frame instead of pushing a new one. Tail-recursive code in those languages runs in O(1) stack space, equivalent to a loop.
JavaScript does not optimise tail calls in practice. The ES2015 spec mandates TCO in strict mode, but almost no engine implements it: V8 shipped it behind a flag for a while, then removed it. Safari's JavaScriptCore is the lone holdout, and even then only in strict mode. Python, C, Java, Go, Rust: no guaranteed TCO either (some C and Rust compilers apply it as an optimisation, but you cannot rely on it). For the vast majority of the languages you write, even a perfectly tail-recursive function still pushes a frame per call. Rewriting a deep recursion in tail form does not save you from stack overflow in JS, Python, or most other mainstream runtimes.
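For the record, here is what the tail form looks like, using factorial as a sketch: the accumulator carries the pending work down, so the recursive call really is the last act. In a TCO language this runs in O(1) stack; in JS it still pushes a frame per call.

```javascript
// Tail-recursive factorial: nothing left to do after the recursive call.
function fact(n, acc = 1) {
  if (n <= 1) return acc;        // base case: the accumulator holds the answer
  return fact(n - 1, n * acc);   // tail call — but JS still pushes a frame
}
```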
Stack overflow is real
The call stack is finite. Rough defaults: V8 (Node.js and Chrome) overflows somewhere around 10,000 frames, depending on frame size; Python's recursion limit defaults to 1,000. Recurse past the limit and you get RangeError: Maximum call stack size exceeded (JS) or RecursionError: maximum recursion depth exceeded (Python). On deep inputs — walking a million-node tree, or the naive fib of n = 50,000 — you will blow the stack. The fixes, in order of preference: (1) rewrite iteratively with an explicit stack/queue, (2) use a trampoline (return a continuation, loop outside), (3) bump the stack size (the --stack-size flag in Node, sys.setrecursionlimit in Python) — last resort, it just delays the inevitable.
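Fix (2), the trampoline, deserves a sketch: the function returns a thunk instead of recursing, and a plain loop does the bouncing, so the stack never grows (`sumTo` and `trampoline` are illustrative names):

```javascript
// Trampolined sum of 1..n: constant stack depth at any n.
function sumTo(n, acc = 0) {
  if (n === 0) return acc;                 // done: return a plain value
  return () => sumTo(n - 1, acc + n);      // not done: return a continuation
}

function trampoline(result) {
  while (typeof result === 'function') result = result();  // loop, don't recurse
  return result;
}

trampoline(sumTo(1000000)); // fine here; plain recursion at this depth would overflow
```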
Memoisation — the bridge to dynamic programming
The naive fib is exponential in time because the same subproblems get solved over and over. Memoisation caches the answer the first time each input is seen:
const cache = new Map();
function fib(n) {
if (n <= 0) return 0;
if (n === 1) return 1;
if (cache.has(n)) return cache.get(n);
const v = fib(n - 1) + fib(n - 2);
cache.set(n, v);
return v;
}
Three short lines added (the cache declaration, the lookup, the store), and the runtime drops from O(φⁿ) to O(n). Dynamic programming is, in one sentence, "recursion plus memoisation". Every DP problem starts as a naive recursion; the DP shape is what you get when you notice the overlap and cache it. Learn recursion cold, and DP follows naturally.
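Once you see the overlap, you can also flip the direction entirely: the bottom-up DP version fills in answers from the base cases upward with a loop, no recursion and no cache object at all (a sketch of the standard rewrite; `fibDP` is an illustrative name):

```javascript
// Bottom-up fib: same recurrence, computed from the base cases upward.
function fibDP(n) {
  if (n <= 0) return 0;
  let prev = 0, curr = 1;                  // fib(0) and fib(1)
  for (let i = 2; i <= n; i++) {
    [prev, curr] = [curr, prev + curr];    // slide the window up one step
  }
  return curr;
}
```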
Recursion in the wild
You'll see recursion everywhere once you start looking:
- Parsers and compilers — recursive-descent parsers mirror the grammar one-to-one; each rule is a function, each reference to another rule is a recursive call.
- Filesystem walks — "process this directory" is "process each file, and recurse into each subdirectory".
- DOM traversal — React's virtual-DOM diffing, browser layout passes, every "walk this tree" algorithm.
- Fractals and generative art — the Koch snowflake, the Mandelbrot set, L-systems. Self-similar structure begs for self-calling code.
- Divide-and-conquer sorts — mergesort and quicksort are recursion over the two halves of the array.
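The divide-and-conquer entry from that list fits in a dozen lines. A mergesort sketch: split in half, recurse on each half, merge the two sorted results.

```javascript
// Mergesort: recursion over the two halves, then a linear merge.
function mergeSort(xs) {
  if (xs.length <= 1) return xs;               // base case: already sorted
  const mid = Math.floor(xs.length / 2);
  const left = mergeSort(xs.slice(0, mid));    // sort each half...
  const right = mergeSort(xs.slice(mid));
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {  // ...then zip them together
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```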
Gotchas
A few things to keep in mind, gathered from sharp edges people hit:
- No base case = infinite recursion. Always write the base case first.
- Mutating a shared argument between recursive calls — common with arrays. Either pass copies or remember to undo mutations on the way back up.
- Combining returns incorrectly — a classic bug where the recursive function works but the caller forgets to use the returned value.
- Exponential blow-up without memoisation — any time you call self(x) twice with the same x in the same tree, you're recomputing. Memoise.
- Misreading the stack — the playground's stack view grows upward (top of stack on top). Some textbook diagrams flip this — same data either way.
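The shared-mutation gotcha in practice: backtracking over one shared array, with the undo on the way back up. A sketch using permutations (`permutations` and its parameters are illustrative names):

```javascript
// Permutations via backtracking: push before recursing, pop after — the undo.
function permutations(xs, current = [], out = []) {
  if (current.length === xs.length) {
    out.push([...current]);                // copy! 'current' keeps mutating
    return out;
  }
  for (const x of xs) {
    if (current.includes(x)) continue;     // already used on this path
    current.push(x);                       // mutate going down...
    permutations(xs, current, out);
    current.pop();                         // ...undo coming back up
  }
  return out;
}
```

Forget the `pop` (or the copy before `push`) and the results silently share, and corrupt, the same array.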
Playground
Pick fib or Hanoi, pick an n between 1 and 8, and press Play. The stack inflates as calls push, deflates as returns pop. Watch the peak depth climb to exactly n and then hold there.
Visualizer
The Recursion Tree panel draws the full call tree and animates it with the playhead. Nodes on the current call path glow; returned nodes fade and show their return value. Drag to the end and you can see the whole tree at once — for fib that's a stark picture of how much redundant work the naive version does.