
Chapter 8

How Node.js actually executes your code

Watch a mixed sync + async program move through the call stack, libuv, and the event loop, step by step.

The example we trace through every step:

app.js
const https = require("https");
const fs = require("fs");
 
let a = 10786;
let b = 20987;
 
// Async operation 1: API call (callback A)
https.get("https://api.fbi.com", (res) => { // A
  console.log(res.secret);
});
 
// Async operation 2: Timer (callback B)
setTimeout(() => { // B
  console.log("setTimeout");
}, 5000);
 
// Async operation 3: File read (callback C)
fs.readFile("./gossip.txt", "utf8", (err, data) => { // C
  console.log("File Data", data);
});
 
function multiplyFn(x, y) {
  const result = x * y;
  return result;
}
 
let c = multiplyFn(a, b);
console.log(c); // 226365782

Initialization — code loaded into V8

Node.js loads your file. The V8 engine receives the source code as a string. Nothing has executed yet.

  • Call stack is empty
  • Memory heap is clean
  • libuv’s event loop is initialized and idle
  • OS is standing by

Phase 1 — Memory Creation (GEC + hoisting)

V8 creates the Global Execution Context (GEC) and pushes it onto the call stack. Then it enters the memory creation phase:

  • Variables a, b, and c are allocated; declared with let, they stay uninitialized in the temporal dead zone until their declarations run (var variables would be set to undefined)
  • Function multiplyFn is stored with its entire definition (this is why you can call a function declaration before it appears in code — hoisting)
  • The async calls (https.get, setTimeout, fs.readFile) are NOT executed yet — at this stage they are just names that will resolve when execution reaches them
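The hoisting behavior is easy to observe directly. A minimal sketch, reusing multiplyFn from the traced program (the variable name a is illustrative):

```javascript
// The function declaration is stored in full during the memory creation
// phase, so it is callable before the line where it appears.
console.log(multiplyFn(2, 3)); // 6

function multiplyFn(x, y) {
  return x * y;
}

// A `let` binding, by contrast, is hoisted but left uninitialized:
// touching it before its declaration throws (the temporal dead zone).
try {
  console.log(a);
} catch (e) {
  console.log(e.name); // ReferenceError
}
let a = 10786;
```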

Phase 2 — Code Execution (sync assignments)

V8 enters the code execution phase. It processes the first assignment: a is assigned 10786 in the memory heap.

  • This is synchronous — it runs directly on the call stack, instantly
  • V8 moves to the next line immediately

b gets 20987. Still synchronous, still inside the GEC. V8 continues to the next executable line.

Async offload #1 — https.get() → OS kernel

V8 sees https.get() — a network I/O operation. V8 cannot do networking! It hands this off to libuv.

  • V8 calls the Node.js C++ binding for https.get
  • libuv receives the request and stores callback A
  • libuv tells the OS kernel to make the HTTP request using non-blocking sockets (epoll on Linux, kqueue on macOS, IOCP on Windows)
  • V8 does NOT wait — it immediately moves to the next line

Async offload #2 — setTimeout() → libuv timer

V8 encounters setTimeout(). Timers are managed by libuv internally — not by V8 and not by the OS.

  • libuv stores callback B with a 5000ms delay
  • libuv starts an internal timer (using its own “concept of now”)
  • V8 moves to the next line — no waiting
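A minimal sketch of the non-waiting behavior: even a 0 ms timer's callback can only run after all synchronous code has finished and the timers phase comes around.

```javascript
// The timer is handed to libuv; V8 keeps executing the current stack.
const order = [];
setTimeout(() => {
  order.push("timer");
  console.log(order.join(" -> ")); // sync -> timer
}, 0);
order.push("sync"); // runs first, despite the 0 ms delay above
```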

Async offload #3 — fs.readFile() → libuv thread pool

V8 hits fs.readFile() — file system I/O. Unlike networking, file operations use libuv’s thread pool.

  • libuv stores callback C
  • libuv assigns the file read to a worker thread from its pool (default: 4 threads, configurable via UV_THREADPOOL_SIZE)
  • The worker thread performs the blocking read() syscall
  • V8 continues — doesn’t wait

Sync execution — multiplyFn() creates a new FEC

V8 encounters multiplyFn(a, b) — a synchronous function call.

  • A new Function Execution Context (FEC) is created and pushed on top of GEC
  • Memory phase: result allocated as undefined
  • Execution phase: x = 10786, y = 20987, result = 10786 × 20987 = 226365782
  • return result → FEC is popped off the call stack
  • Garbage collector may clean up FEC’s memory
  • c = 226365782 is assigned in GEC
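The FEC sitting above the global context is visible in a stack trace taken inside the function; a small sketch:

```javascript
function multiplyFn(x, y) {
  // While this FEC is live, the top stack frame belongs to multiplyFn,
  // above the global (module) frame that called it.
  const topFrame = new Error().stack.split("\n")[1];
  console.log(topFrame.includes("multiplyFn")); // true
  return x * y;
}

console.log(multiplyFn(10786, 20987)); // 226365782
```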

Meanwhile, libuv is still working in the background on all three async operations. The main thread never stopped for them.

Sync execution — console.log(c), the turning point

console.log(c) prints 226365782. This is the last synchronous line.

  • GEC is popped off the call stack
  • The call stack is now EMPTY
  • V8 has finished all synchronous work
  • But the program doesn’t exit! libuv still has active handles (3 pending async operations)
  • Node.js stays alive as long as the event loop has work
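This keep-alive behavior can be inspected on any timer: Node's Timeout object exposes hasRef() and unref() to query and drop its hold on the loop.

```javascript
// An armed timer is an active libuv handle; the process will not exit
// until it fires (or is cleared or unref'd).
const t = setTimeout(() => console.log("done"), 50);
console.log(t.hasRef()); // true: this handle keeps the loop alive

// t.unref() would drop the reference, letting the loop exit
// without waiting for this timer.
```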

Event loop active — cycling through phases

The event loop begins cycling through its 6 phases. At each phase boundary, it first drains microtask queues.

  • Microtask check: nextTick queue? Empty. Promise queue? Empty.
  • Timers phase: Is callback B ready? No — the 5000 ms delay hasn’t elapsed yet
  • Pending callbacks: None
  • Poll phase: Any I/O ready? Waiting for network response and file read…
  • The event loop may block in the poll phase waiting for I/O events, since there’s nothing else to do
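The drain order at the top of the loop can be sketched in a few lines: nextTick beats Promise microtasks, and both beat timer callbacks.

```javascript
const order = [];
Promise.resolve().then(() => order.push("promise"));
process.nextTick(() => order.push("nextTick"));
setTimeout(() => {
  order.push("timer");
  console.log(order.join(" -> ")); // sync -> nextTick -> promise -> timer
}, 0);
order.push("sync"); // synchronous code always finishes first
```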

Poll phase — file read completes, callback C runs

The thread pool worker finishes reading gossip.txt. libuv places callback C into the poll queue.

  • Event loop reaches the poll phase and finds callback C ready
  • Before executing: drain nextTick queue (empty), drain Promise queue (empty)
  • console.log("File Data", data) is pushed onto the call stack, runs, and is popped
  • After executing: drain microtasks again (still empty)

Poll phase — API response arrives, callback A runs

The HTTP response arrives. The OS kernel (epoll/kqueue) notifies libuv, which places callback A into the poll queue.

  • Same process: drain microtasks → execute callback A → drain microtasks
  • console.log(res.secret) runs on the call stack
  • Call stack empties again, event loop continues cycling

Timers phase — 5 seconds elapsed, callback B runs

5 seconds have passed. libuv marks callback B as ready. The event loop’s timers phase picks it up.

  • Before executing: drain nextTick queue (empty), drain Promise queue (empty)
  • console.log("setTimeout") pushed onto call stack, runs, popped
  • After executing: drain microtasks (empty)
  • All three callbacks have now executed

Shutdown — loop exits cleanly

The event loop checks: any active handles? Pending timers? Pending I/O? No.

  • All 3 async operations completed
  • All callbacks executed
  • All queues empty
  • uv_run() returns, Node.js process exits with code 0

The complete reference — execution priority order

  1. Synchronous code · sync
  2. process.nextTick() · micro
  3. Promise .then/.catch · micro
  4. setTimeout / setInterval · macro
  5. I/O callbacks (fs, http, db) · macro
  6. setImmediate() · macro
  7. Close callbacks · macro

Event loop phase cycle

Between every phase the loop drains nextTick() then Promises.

  1. Timers: setTimeout, setInterval callbacks
  2. Pending callbacks: deferred I/O callbacks (TCP errors etc.)
  3. Idle / Prepare: internal Node.js housekeeping
  4. Poll: I/O events (fs, http, net, db); the loop may block here
  5. Check: setImmediate() callbacks
  6. Close callbacks: socket.on('close') etc.; then back to phase 1

Memory trick for priority order

S · T · P · T · I · I · C

Sync → nextTick → Promise · Timer → I/O → Immediate → Close

Between every phase: drain nextTick queue, then drain Promise queue. After every individual callback (Node 11+): drain microtasks again.
