Generators & Yield
Master the mechanics of suspendable functions. Learn how to use Generators for custom iteration, bidirectional data flow, and high-performance asynchronous streaming.
The Mechanics of Suspension
Unlike regular functions that follow a "Run-to-Completion" model, **Generator Functions** (`function*`) can pause their execution context and yield control back to the caller. When resumed, they maintain their entire internal state (variable values, program counter, lexical scope).
// Architectural Logic: Pause & Resume
// Generator functions do not run to completion immediately.
function* statefulProcess() {
  console.log('Phase 1: Initializing');
  yield 'READY'; // Suspension point 1
  console.log('Phase 2: Processing');
  yield 'IN_PROGRESS'; // Suspension point 2
  console.log('Phase 3: Finalizing');
  return 'COMPLETE';
}
const it = statefulProcess();
console.log(it.next()); // Logs Phase 1, { value: 'READY', done: false }
console.log(it.next()); // Logs Phase 2, { value: 'IN_PROGRESS', done: false }
console.log(it.next()); // Logs Phase 3, { value: 'COMPLETE', done: true }

Bidirectional Communication
Generators are not just for producing sequences of values; they are **state machines**. You can pass data *into* the generator when resuming it via `next(value)`. The value passed becomes the result of the paused `yield` expression inside the function body.
// Engineering Pattern: Bidirectional Communication
// We can send values BACK into the generator via next(arg)
function* energyGrid() {
  const consumption = yield 'Current Demand?';
  console.log(`Received Demand: ${consumption}MW`);
  if (consumption > 100) {
    yield 'Activating Backup Generators';
  } else {
    yield 'Grid Stable';
  }
}
const grid = energyGrid();
console.log(grid.next().value); // 'Current Demand?'
console.log(grid.next(150).value); // 'Activating Backup Generators'

Async Generators & Streaming
In modern engineering, **Async Generators** (`async function*`) are the standard tool for memory-efficient data streams. Instead of waiting for a 100 MB JSON response to fully download, you can `yield` items as they are parsed, significantly lowering the time to the first usable result.
// Production Pattern: Async Data Streaming
// Perfect for handling Paginated APIs or WebSockets
async function* liveLogStream(resourceId) {
  let cursor = null;
  while (true) {
    const response = await fetch(`/api/logs/${resourceId}?cursor=${cursor}`);
    const { logs, nextCursor } = await response.json();
    for (const log of logs) {
      yield log; // Stream individual logs as they arrive
    }
    if (!nextCursor) break;
    cursor = nextCursor;
  }
}
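The fetch-based stream above requires a live endpoint. As a self-contained sketch, the same pagination loop can be exercised against an in-memory page source (the `pages` array below is illustrative, not a real API):

```javascript
// Illustrative stand-in for a paginated backend response.
const pages = [
  { logs: ['boot', 'ready'], nextCursor: 'p2' },
  { logs: ['request'], nextCursor: null },
];

async function* pagedLogStream() {
  let index = 0;
  while (true) {
    const { logs, nextCursor } = pages[index];
    for (const log of logs) {
      yield log; // Each log is surfaced before the next "page" is read
    }
    if (!nextCursor) break;
    index++;
  }
}

(async () => {
  const collected = [];
  for await (const log of pagedLogStream()) {
    collected.push(log);
  }
  console.log(collected); // ['boot', 'ready', 'request']
})();
```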
// Consumption via for-await-of
for await (const log of liveLogStream('srv-01')) {
  console.log('New Log:', log.message);
}

Technical Insight: Performance and Memory
Generators are highly memory-efficient because they do not allocate entire arrays upfront; they hold only the single item currently being yielded. This is critical when working with infinite sequences or massive datasets that would exceed the available heap if materialized all at once.
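To make this concrete, here is a minimal sketch of an infinite sequence consumed lazily; `naturals` and `take` are illustrative helpers rather than built-ins:

```javascript
// Infinite sequence: no array is ever materialized.
function* naturals() {
  let n = 0;
  while (true) {
    yield n++; // Only the current value exists in memory
  }
}

// Illustrative helper: pull just the first `count` items, then stop.
function* take(iterable, count) {
  let taken = 0;
  for (const value of iterable) {
    if (taken++ >= count) return;
    yield value;
  }
}

const firstFive = [...take(naturals(), 5)];
console.log(firstFive); // [0, 1, 2, 3, 4]
```

Because `take` returns after five items, the infinite `naturals` generator is simply abandoned; it never runs ahead of its consumer.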
Generator Best Practices:
- ✅ **Stream Processing:** Use `async function*` for large API responses.
- ✅ **State Machines:** Use Generators to manage complex UI logic (e.g., multi-step forms).
- ✅ **Yield Delegation:** Use `yield* otherGen()` to compose complex sequences.
- ✅ **Resource Cleanup:** Wrap yields in `try...finally` to ensure resources (like DB connections) close.
- ✅ **Early Exit:** Use `gen.return()` to terminate a generator from the outside.
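Several of these practices can be sketched in one place: `yield*` delegation, a `try...finally` cleanup block, and external termination via `gen.return()`. The "resource" being closed here is simulated:

```javascript
function* inner() {
  yield 'a';
  yield 'b';
}

function* session() {
  let resourceOpen = true; // Simulated resource (e.g., a DB connection)
  try {
    yield* inner(); // Delegation: inner's values flow straight to our caller
    yield 'c';
  } finally {
    resourceOpen = false; // Runs even when terminated early from outside
    console.log('Resource closed');
  }
}

const session1 = session();
console.log(session1.next().value); // 'a' (delegated from inner)
console.log(session1.next().value); // 'b' (delegated from inner)
console.log(session1.return('STOP')); // { value: 'STOP', done: true }; finally ran
```

Calling `return()` while the generator is suspended forces it to complete, but the `finally` block still executes, which is exactly why it is the right place for cleanup.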