docs/concepts/generators-iterators.mdx
What if a function could pause mid-execution, return a value, and then resume right where it left off? What if you could create a sequence of values that are computed only when you ask for them — not all at once?
```javascript
// This function can PAUSE and RESUME
function* countToThree() {
  yield 1 // Pause here, return 1
  yield 2 // Resume, pause here, return 2
  yield 3 // Resume, pause here, return 3
}

const counter = countToThree()
console.log(counter.next().value) // 1
console.log(counter.next().value) // 2
console.log(counter.next().value) // 3
```
This is the power of **generators**. Introduced in ECMAScript 2015, generators are functions that can pause with `yield` and pick up where they left off. Combined with **iterators** (objects that define how to step through a sequence), they open up patterns like lazy evaluation, infinite sequences, and clean data pipelines.
Before getting into generators, we need to cover iterators, the foundation that makes generators work.
An iterator is an object that defines a sequence and provides a way to access values one at a time. It must have a .next() method that returns an object with two properties:
- `value` — the next value in the sequence
- `done` — `true` if the sequence is finished, `false` otherwise

```javascript
// Creating an iterator manually
function createCounterIterator(max) {
  let count = 0
  return {
    next() {
      if (count < max) {
        return { value: count++, done: false }
      } else {
        return { value: undefined, done: true }
      }
    }
  }
}

const counter = createCounterIterator(3)
console.log(counter.next()) // { value: 0, done: false }
console.log(counter.next()) // { value: 1, done: false }
console.log(counter.next()) // { value: 2, done: false }
console.log(counter.next()) // { value: undefined, done: true }
```
Why not just use an array? Two reasons:

1. **Memory efficiency.** Say you need to process a million records. With an array, you'd load all million into memory. With an iterator, you process one at a time. Memory stays flat.
2. **Lazy computation.** An iterator can compute each value on demand, or keep producing forever, which makes infinite sequences possible. An array has to exist in full before you can use it.
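To make the memory point concrete, here's a small sketch (the record shape and counts are invented for illustration). The loop walks a million-element sequence, but only one record exists at a time, and stopping early means the rest are never created:

```javascript
// A lazy source: each record is created only when requested
function* recordSource(total) {
  for (let i = 0; i < total; i++) {
    yield { id: i } // hypothetical record shape, one alive per step
  }
}

let processed = 0
for (const record of recordSource(1_000_000)) {
  processed++
  if (processed === 3) break // the remaining 999,997 records are never built
}

console.log(processed) // 3
```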
Many JavaScript built-ins are already iterable (they have iterators built in):
| Type | Example | What it iterates over |
|---|---|---|
| Array | [1, 2, 3] | Each element |
| String | "hello" | Each character |
| Map | new Map([['a', 1]]) | Each [key, value] pair |
| Set | new Set([1, 2, 3]) | Each unique value |
| arguments | arguments object | Each argument passed to a function |
| NodeList | document.querySelectorAll('div') | Each DOM node |
You can access their iterator using Symbol.iterator:
```javascript
const arr = [10, 20, 30]
const iterator = arr[Symbol.iterator]()

console.log(iterator.next()) // { value: 10, done: false }
console.log(iterator.next()) // { value: 20, done: false }
console.log(iterator.next()) // { value: 30, done: false }
console.log(iterator.next()) // { value: undefined, done: true }
```
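One detail worth knowing: the string iterator steps through Unicode code points, not UTF-16 code units. Characters outside the Basic Multilingual Plane (like most emoji) occupy two code units but come through the iterator as a single value:

```javascript
const text = 'a\u{1F44D}' // 'a' followed by the 👍 emoji (a surrogate pair)

console.log(text.length)      // 3, because .length counts UTF-16 code units
console.log([...text].length) // 2, because the iterator yields whole code points
console.log([...text])        // ['a', '👍']
```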
Generators click when you have the right mental picture. Think of them like a vending machine:
```
┌────────────────────────────────────────────────────────────────┐
│                 GENERATOR AS A VENDING MACHINE                 │
├────────────────────────────────────────────────────────────────┤
│                                                                │
│      YOU                                    VENDING MACHINE    │
│    (caller)                                   (generator)      │
│                                                                │
│  ┌─────────┐                              ┌─────────────────┐  │
│  │         │                              │  ┌───────────┐  │  │
│  │ "I'll   │ ──── Press button ─────────► │  │  Snack A  │  │  │
│  │  have   │      (call .next())          │  ├───────────┤  │  │
│  │  one"   │                              │  │  Snack B  │  │  │
│  │         │ ◄─── Dispense one item ───── │  ├───────────┤  │  │
│  │         │      (yield value)           │  │  Snack C  │  │  │
│  │         │                              │  └───────────┘  │  │
│  │         │  * Machine PAUSES *          │                 │  │
│  │         │  * Waits for next *          │   [ PAUSED ]    │  │
│  │         │  * button press   *          │                 │  │
│  └─────────┘                              └─────────────────┘  │
│                                                                │
│  KEY INSIGHT: The machine remembers where it stopped!          │
│  When you press the button again, it gives you the NEXT item,  │
│  not the first one again.                                      │
└────────────────────────────────────────────────────────────────┘
```
Here's how this maps to generator concepts:
| Vending Machine | Generator |
|---|---|
| Press the button | Call .next() |
| Machine dispenses one item | yield returns a value |
| Machine pauses, waits | Generator pauses at yield |
| Press button again | Call .next() again |
| Machine remembers position | Generator remembers its state |
| Machine is empty | done: true |
A generator works the same way: one value at a time, pausing between each.
A generator is a function that can stop mid-execution, hand you a value, and pick up where it left off later. You create one using function* (note the asterisk) and pause it with the yield keyword.
```javascript
// The asterisk (*) makes this a generator function
function* myGenerator() {
  console.log('Starting...')
  yield 'First value'
  console.log('Resuming...')
  yield 'Second value'
  console.log('Finishing...')
  return 'Done!'
}
```
When you call a generator function, the code inside doesn't run yet. You just get back a generator object (which is an iterator):
```javascript
const gen = myGenerator() // Nothing logs yet!
console.log(gen) // Object [Generator] {}
```
The code only runs when you call .next():
```javascript
const gen = myGenerator()

// First .next() — runs until first yield
console.log(gen.next())
// Logs: "Starting..."
// Returns: { value: 'First value', done: false }

// Second .next() — resumes and runs until second yield
console.log(gen.next())
// Logs: "Resuming..."
// Returns: { value: 'Second value', done: false }

// Third .next() — resumes and runs to the end
console.log(gen.next())
// Logs: "Finishing..."
// Returns: { value: 'Done!', done: true }

// Fourth .next() — generator is exhausted
console.log(gen.next())
// Returns: { value: undefined, done: true }
```
Because generator objects follow the iterator protocol, you can use them with for...of:
```javascript
function* colors() {
  yield 'red'
  yield 'green'
  yield 'blue'
}

for (const color of colors()) {
  console.log(color)
}
// Output:
// red
// green
// blue
```
You can also spread them into arrays:
```javascript
function* numbers() {
  yield 1
  yield 2
  yield 3
}

const arr = [...numbers()]
console.log(arr) // [1, 2, 3]
```
## The `yield` Keyword: Deep Dive

`yield` is what makes generators tick. It pauses the function and sends a value back to the caller. When you call `.next()` again, execution picks up right after the `yield`.

```javascript
function* countdown() {
  yield 3
  yield 2
  yield 1
  yield 'Liftoff!'
}

const rocket = countdown()
console.log(rocket.next().value) // 3
console.log(rocket.next().value) // 2
console.log(rocket.next().value) // 1
console.log(rocket.next().value) // "Liftoff!"
```
### `yield` vs `return`

Both `yield` and `return` can return values, but they behave very differently:

| `yield` | `return` |
|---|---|
| Pauses the generator | Ends the generator |
| `done: false` | `done: true` |
| Can have multiple | Only one matters |
| Value accessible in `for...of` | Value NOT accessible in `for...of` |
```javascript
function* example() {
  yield 'A'  // Pauses, done: false
  yield 'B'  // Pauses, done: false
  return 'C' // Ends, done: true
}

// With for...of — return value is ignored!
for (const val of example()) {
  console.log(val)
}
// Output: A, B (no C!)

// With .next() — you can see the return value
const gen = example()
console.log(gen.next()) // { value: 'A', done: false }
console.log(gen.next()) // { value: 'B', done: false }
console.log(gen.next()) // { value: 'C', done: true }
```
### `yield*` — Delegating to Other Iterables

When you want to pass through all values from another iterable, use `yield*`:

```javascript
function* inner() {
  yield 'a'
  yield 'b'
}

function* outer() {
  yield 1
  yield* inner() // Delegates to inner generator
  yield 2
}

console.log([...outer()]) // [1, 'a', 'b', 2]
```
yield* shines when flattening nested structures:
```javascript
function* flatten(arr) {
  for (const item of arr) {
    if (Array.isArray(item)) {
      yield* flatten(item) // Recursively delegate
    } else {
      yield item
    }
  }
}

const nested = [1, [2, 3, [4, 5]], 6]
console.log([...flatten(nested)]) // [1, 2, 3, 4, 5, 6]
```
You can also send values into a generator by passing them to .next(value). The value becomes the result of the yield expression inside the generator:
```javascript
function* conversation() {
  const name = yield 'What is your name?'
  const color = yield `Hello, ${name}! What's your favorite color?`
  yield `${color} is a great color, ${name}!`
}

const chat = conversation()

// First .next() — no value needed, just starts the generator
console.log(chat.next().value)
// "What is your name?"

// Second .next() — pass in the answer
console.log(chat.next('Alice').value)
// "Hello, Alice! What's your favorite color?"

// Third .next() — pass in another answer
console.log(chat.next('Blue').value)
// "Blue is a great color, Alice!"
```
```
┌────────────────────────────────────────────────────────────────┐
│                      DATA FLOW WITH yield                      │
├────────────────────────────────────────────────────────────────┤
│                                                                │
│   CALLER                                GENERATOR              │
│                                                                │
│   .next() ───────────────────────────►  starts execution       │
│           ◄───────────────────────────  yield 'question'       │
│                                                                │
│   .next('Alice') ────────────────────►  const name = 'Alice'   │
│           ◄───────────────────────────  yield 'Hello Alice'    │
│                                                                │
│   .next('Blue') ─────────────────────►  const color = 'Blue'   │
│           ◄───────────────────────────  yield 'Blue is great'  │
│                                                                │
│   The value passed to .next() becomes the RESULT of the yield  │
│   expression inside the generator.                             │
│                                                                │
└────────────────────────────────────────────────────────────────┘
```
## `.return()` and `.throw()`

Beyond `.next()`, generators have two more control methods that give you full control over execution.

### `.return()`

The `.return(value)` method ends the generator immediately and returns the specified value:
```javascript
function* countdown() {
  yield 3
  yield 2
  yield 1
  yield 'Liftoff!'
}

const rocket = countdown()
console.log(rocket.next())            // { value: 3, done: false }
console.log(rocket.return('Aborted')) // { value: 'Aborted', done: true }
console.log(rocket.next())            // { value: undefined, done: true }
// Generator is now closed — subsequent .next() calls return done: true
```
This is useful for cleanup or when you need to stop iteration early.
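`.return()` also cooperates with cleanup code: if the generator is paused inside a `try` block, its `finally` runs before the generator closes. (`for...of` relies on the same mechanism, calling `.return()` for you when you `break` out of a loop over a generator.) A small sketch:

```javascript
let cleaned = false

function* withCleanup() {
  try {
    yield 1
    yield 2
  } finally {
    cleaned = true // runs even when the generator is ended early
  }
}

const gen = withCleanup()
gen.next()                           // pause at the first yield
const result = gen.return('stopped') // the finally block runs first
console.log(cleaned) // true
console.log(result)  // { value: 'stopped', done: true }
```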
### `.throw()`

The `.throw(error)` method throws an exception at the current `yield` point. If the generator has a `try/catch`, it can handle the error:
```javascript
function* resilientGenerator() {
  try {
    yield 'A'
    yield 'B'
    yield 'C'
  } catch (e) {
    yield `Caught: ${e.message}`
  }
  yield 'Done'
}

const gen = resilientGenerator()
console.log(gen.next().value)                    // "A"
console.log(gen.throw(new Error('Oops!')).value) // "Caught: Oops!"
console.log(gen.next().value)                    // "Done"
```
If there's no try/catch, the error propagates out:
```javascript
function* fragileGenerator() {
  yield 'A' // Paused here — .throw() raises the error at this yield
  yield 'B'
}

const gen = fragileGenerator()
gen.next() // { value: 'A', done: false }

try {
  gen.throw(new Error('Boom!'))
} catch (e) {
  console.log(e.message) // "Boom!"
}
```
## Making Objects Iterable (`Symbol.iterator`)

Now for the fun part: making your own objects work with `for...of`. An object is **iterable** if it has a `[Symbol.iterator]` method that returns an iterator.
```javascript
const myCollection = {
  items: ['apple', 'banana', 'cherry'],

  // This makes the object iterable
  [Symbol.iterator]() {
    let index = 0
    const items = this.items
    return {
      next() {
        if (index < items.length) {
          return { value: items[index++], done: false }
        } else {
          return { value: undefined, done: true }
        }
      }
    }
  }
}

// Now we can use for...of!
for (const item of myCollection) {
  console.log(item)
}
// Output: apple, banana, cherry

// And spread syntax!
console.log([...myCollection]) // ['apple', 'banana', 'cherry']
```
All that manual iterator code? Generators cut it down to almost nothing:
```javascript
const myCollection = {
  items: ['apple', 'banana', 'cherry'],

  // Generator as the Symbol.iterator method
  *[Symbol.iterator]() {
    for (const item of this.items) {
      yield item
    }
  }
}

for (const item of myCollection) {
  console.log(item)
}
// Output: apple, banana, cherry
```
Here's a Range class you can loop over with for...of:
```javascript
class Range {
  constructor(start, end, step = 1) {
    this.start = start
    this.end = end
    this.step = step
  }

  // Generator makes this easy!
  *[Symbol.iterator]() {
    for (let i = this.start; i <= this.end; i += this.step) {
      yield i
    }
  }
}

const oneToFive = new Range(1, 5)
console.log([...oneToFive]) // [1, 2, 3, 4, 5]

const evens = new Range(0, 10, 2)
console.log([...evens]) // [0, 2, 4, 6, 8, 10]

// Works with for...of
for (const n of new Range(1, 3)) {
  console.log(n) // 1, 2, 3
}
```
## What `for...of` Really Does

When you write a `for...of` loop, JavaScript does this behind the scenes:

1. Calls the object's `[Symbol.iterator]()` method to get an iterator
2. Calls the iterator's `.next()` method repeatedly
3. Stops as soon as `.next()` returns `done: true`

Here's what that looks like in code:
```javascript
// This:
for (const item of iterable) {
  console.log(item)
}

// Is equivalent to this:
const iterator = iterable[Symbol.iterator]()
let result = iterator.next()
while (!result.done) {
  const item = result.value
  console.log(item)
  result = iterator.next()
}
```
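One subtlety the desugaring leaves out: if you exit the loop early (via `break`, `return`, or a thrown error), `for...of` also calls the iterator's optional `.return()` method so it can release resources. Here's a sketch using a hand-written iterator that records the call:

```javascript
let closed = false

const numbers = {
  [Symbol.iterator]() {
    let i = 0
    return {
      next: () => (i < 5 ? { value: i++, done: false } : { value: undefined, done: true }),
      return() {
        closed = true // for...of calls this on early exit
        return { value: undefined, done: true }
      }
    }
  }
}

for (const n of numbers) {
  if (n === 2) break // triggers iterator.return()
}

console.log(closed) // true
```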
The killer feature of generators is lazy evaluation. Values are computed only when you ask for them, not ahead of time.
Compare these two approaches for creating a range of numbers:
```javascript
// Eager evaluation — creates entire array in memory
function rangeArray(start, end) {
  const result = []
  for (let i = start; i <= end; i++) {
    result.push(i)
  }
  return result
}

// Lazy evaluation — computes values on demand
function* rangeGenerator(start, end) {
  for (let i = start; i <= end; i++) {
    yield i
  }
}

// For small ranges, both work fine
console.log(rangeArray(1, 5))          // [1, 2, 3, 4, 5]
console.log([...rangeGenerator(1, 5)]) // [1, 2, 3, 4, 5]

// For large ranges, generators shine
// rangeArray(1, 1000000) — Creates array of 1 million numbers!
// rangeGenerator(1, 1000000) — Creates nothing until you iterate
```
Because generators are lazy, you can create infinite sequences, something impossible with arrays:
```javascript
// Infinite sequence of natural numbers
function* naturalNumbers() {
  let n = 1
  while (true) { // Infinite loop!
    yield n++
  }
}

// This would crash with an array, but generators are lazy
const numbers = naturalNumbers()
console.log(numbers.next().value) // 1
console.log(numbers.next().value) // 2
console.log(numbers.next().value) // 3
// We can keep going forever...
```
A classic example: the infinite Fibonacci sequence:
```javascript
function* fibonacci() {
  let prev = 0
  let curr = 1
  while (true) {
    yield curr
    const next = prev + curr
    prev = curr
    curr = next
  }
}

const fib = fibonacci()
console.log(fib.next().value) // 1
console.log(fib.next().value) // 1
console.log(fib.next().value) // 2
console.log(fib.next().value) // 3
console.log(fib.next().value) // 5
console.log(fib.next().value) // 8
```
You'll often want to take a limited number of items from an infinite generator:
```javascript
// Helper function to take N items from any iterable
function* take(n, iterable) {
  let count = 0
  for (const item of iterable) {
    if (count >= n) return
    yield item
    count++
  }
}

// Get first 10 Fibonacci numbers
const firstTenFib = [...take(10, fibonacci())]
console.log(firstTenFib) // [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]

// Get first 5 natural numbers
const firstFive = [...take(5, naturalNumbers())]
console.log(firstFive) // [1, 2, 3, 4, 5]
```
```javascript
// ❌ DANGER — This will hang/crash!
const all = [...naturalNumbers()] // Trying to collect infinite items

// ✓ SAFE — Use take() or break early
const some = [...take(100, naturalNumbers())]
```
Here are some patterns that make generators worth knowing.
Generate unique IDs without tracking global state:
```javascript
function* createIdGenerator(prefix = 'id') {
  let id = 1
  while (true) {
    yield `${prefix}_${id++}`
  }
}

const userIds = createIdGenerator('user')
const orderIds = createIdGenerator('order')

console.log(userIds.next().value)  // "user_1"
console.log(userIds.next().value)  // "user_2"
console.log(orderIds.next().value) // "order_1"
console.log(userIds.next().value)  // "user_3"
console.log(orderIds.next().value) // "order_2"
```
Process large datasets in manageable chunks:
```javascript
function* chunk(array, size) {
  for (let i = 0; i < array.length; i += size) {
    yield array.slice(i, i + size)
  }
}

const data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
for (const batch of chunk(data, 3)) {
  console.log('Processing batch:', batch)
}
// Output:
// Processing batch: [1, 2, 3]
// Processing batch: [4, 5, 6]
// Processing batch: [7, 8, 9]
// Processing batch: [10]
```
This is great for batch processing, API rate limiting, or breaking up heavy computations:
```javascript
function* processInBatches(items, batchSize) {
  for (const batch of chunk(items, batchSize)) {
    // Process each batch
    const results = batch.map(item => heavyComputation(item))
    yield results
  }
}

// Process 1000 items in batches of 100
const allItems = new Array(1000).fill(null).map((_, i) => i)
for (const batchResults of processInBatches(allItems, 100)) {
  console.log(`Processed ${batchResults.length} items`)
  // Could add delay here to avoid blocking the main thread
}
```
Create composable data pipelines:
```javascript
function* filter(iterable, predicate) {
  for (const item of iterable) {
    if (predicate(item)) {
      yield item
    }
  }
}

function* map(iterable, transform) {
  for (const item of iterable) {
    yield transform(item)
  }
}

// Compose them together
function* range(start, end) {
  for (let i = start; i <= end; i++) {
    yield i
  }
}

// Pipeline: numbers 1-10 → filter evens → double them
const result = map(
  filter(range(1, 10), n => n % 2 === 0),
  n => n * 2
)

console.log([...result]) // [4, 8, 12, 16, 20]
```
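Deep nesting gets harder to read as pipelines grow. A tiny `pipe` helper (an illustrative utility, not a built-in) threads the iterable through each stage in reading order. The generator definitions are repeated here so the sketch stands alone:

```javascript
function* range(start, end) {
  for (let i = start; i <= end; i++) yield i
}
function* filter(iterable, predicate) {
  for (const item of iterable) if (predicate(item)) yield item
}
function* map(iterable, transform) {
  for (const item of iterable) yield transform(item)
}

// pipe(value, ...fns) applies each function to the previous result
const pipe = (iterable, ...fns) => fns.reduce((acc, fn) => fn(acc), iterable)

const doubledEvens = pipe(
  range(1, 10),
  it => filter(it, n => n % 2 === 0),
  it => map(it, n => n * 2)
)

console.log([...doubledEvens]) // [4, 8, 12, 16, 20]
```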
Generators naturally model state machines because they remember their position:
```javascript
function* trafficLight() {
  while (true) {
    yield 'green'
    yield 'yellow'
    yield 'red'
  }
}

const light = trafficLight()
console.log(light.next().value) // "green"
console.log(light.next().value) // "yellow"
console.log(light.next().value) // "red"
console.log(light.next().value) // "green" (cycles back)
console.log(light.next().value) // "yellow"
```
A more complex example with different wait times:
```javascript
function* trafficLightWithDurations() {
  while (true) {
    yield { color: 'green', duration: 30000 }  // 30 seconds
    yield { color: 'yellow', duration: 5000 }  // 5 seconds
    yield { color: 'red', duration: 25000 }    // 25 seconds
  }
}

const light = trafficLightWithDurations()

function changeLight() {
  const { color, duration } = light.next().value
  console.log(`Light is now ${color} for ${duration / 1000} seconds`)
  setTimeout(changeLight, duration)
}

// changeLight() // Uncomment to run
```
Generators work great for traversing trees:
```javascript
function* traverseTree(node) {
  yield node.value
  if (node.children) {
    for (const child of node.children) {
      yield* traverseTree(child) // Recursive delegation
    }
  }
}

const tree = {
  value: 'root',
  children: [
    {
      value: 'child1',
      children: [
        { value: 'grandchild1' },
        { value: 'grandchild2' }
      ]
    },
    {
      value: 'child2',
      children: [
        { value: 'grandchild3' }
      ]
    }
  ]
}

console.log([...traverseTree(tree)])
// ['root', 'child1', 'grandchild1', 'grandchild2', 'child2', 'grandchild3']
```
## Async Generators and `for await...of`

What about yielding values from async operations: API calls, file reads, that kind of thing? That's what **async generators** are for.
Regular generators are synchronous. If you try to yield a Promise, you get the Promise object itself, not its resolved value:
```javascript
function* fetchUsers() {
  yield fetch('/api/user/1').then(r => r.json())
  yield fetch('/api/user/2').then(r => r.json())
}

const gen = fetchUsers()
console.log(gen.next().value) // Promise { <pending> } — not the user!
```
An async generator combines async functions with generators. You can await inside them, and you iterate with for await...of:
```javascript
async function* fetchUsersAsync() {
  const user1 = await fetch('/api/user/1').then(r => r.json())
  yield user1
  const user2 = await fetch('/api/user/2').then(r => r.json())
  yield user2
}

// Use for await...of to consume
async function displayUsers() {
  for await (const user of fetchUsersAsync()) {
    console.log(user.name)
  }
}
```
Fetch all pages of data from a paginated API:
```javascript
async function* fetchAllPages(baseUrl) {
  let page = 1
  let hasMore = true
  while (hasMore) {
    const response = await fetch(`${baseUrl}?page=${page}`)
    const data = await response.json()
    yield data.items // Yield this page's items
    hasMore = data.hasNextPage
    page++
  }
}

// Process all pages
async function processAllUsers() {
  for await (const pageOfUsers of fetchAllPages('/api/users')) {
    console.log(`Processing ${pageOfUsers.length} users...`)
    for (const user of pageOfUsers) {
      // Process each user
      await saveToDatabase(user)
    }
  }
}
```
When do you reach for an async generator over Promise.all?
```javascript
// Promise.all — All requests in parallel, wait for ALL to complete
async function fetchAllAtOnce(userIds) {
  const users = await Promise.all(
    userIds.map(id => fetch(`/api/user/${id}`).then(r => r.json()))
  )
  return users // Returns all users at once
}

// Async generator — Process as each completes
async function* fetchOneByOne(userIds) {
  for (const id of userIds) {
    const user = await fetch(`/api/user/${id}`).then(r => r.json())
    yield user // Yield each user as it's fetched
  }
}
```
| Approach | Best for |
|---|---|
| `Promise.all` | When you need all results before proceeding |
| Async generator | When you want to process results as they arrive |
| Async generator | When fetching everything at once would be too memory-intensive |
| Async generator | When you might want to stop early |
Here's a real pattern for processing a stream line by line:
```javascript
async function* readLines(reader) {
  const decoder = new TextDecoder()
  let buffer = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) {
      if (buffer) yield buffer // Yield any remaining content
      return
    }
    buffer += decoder.decode(value, { stream: true })
    const lines = buffer.split('\n')
    buffer = lines.pop() // Keep incomplete line in buffer
    for (const line of lines) {
      yield line
    }
  }
}

// Usage with fetch
async function processLogFile(url) {
  const response = await fetch(url)
  const reader = response.body.getReader()
  for await (const line of readLines(reader)) {
    console.log('Log entry:', line)
  }
}
```
```javascript
// ❌ WRONG — Without the asterisk this is a regular function,
// and `yield` inside it is a syntax error
function myGenerator() {
  yield 1
}
```

```javascript
// ✓ CORRECT — Note the asterisk
function* myGenerator() {
  yield 1
}
```
The asterisk can go next to `function` or next to the name — both work:
```javascript
function* foo() {} // ✓
function *foo() {} // ✓
function * foo() {} // ✓
```
```javascript
function* greet() {
  console.log('Hello!')
  yield 'Hi!'
}

// ❌ MISTAKE — Expecting the body to run on call
greet() // Nothing logged! Returns generator object

// ✓ CORRECT — You must call .next() or iterate
const gen = greet()
gen.next() // NOW it logs "Hello!"

// Or use for...of
for (const val of greet()) {
  console.log(val)
}
```
```javascript
// ❌ MISTAKE — Using return for the last value
function* letters() {
  yield 'a'
  yield 'b'
  return 'c' // Lost in iteration!
}

console.log([...letters()]) // ['a', 'b'] — no 'c'!
```

```javascript
// ✓ CORRECT — Use yield for all iteration values
function* letters() {
  yield 'a'
  yield 'b'
  yield 'c'
}

console.log([...letters()]) // ['a', 'b', 'c']
```
```javascript
function* nums() {
  yield 1
  yield 2
}

// ❌ MISTAKE — Reusing an exhausted generator
const gen = nums()
console.log([...gen]) // [1, 2]
console.log([...gen]) // [] — generator is exhausted!

// ✓ CORRECT — Create a new generator each time
console.log([...nums()]) // [1, 2]
console.log([...nums()]) // [1, 2]
```
```javascript
function* forever() {
  let i = 0
  while (true) yield i++
}

// ❌ DANGER
const all = [...forever()] // Infinite loop trying to collect all values!

// ✓ SAFE — Use take() or break early
function* take(n, gen) {
  let count = 0
  for (const val of gen) {
    if (count++ >= n) return
    yield val
  }
}

const firstHundred = [...take(100, forever())] // Safe!
```
```javascript
// ❌ OVERKILL — A generator for a small, fixed list
function* daysOfWeek() {
  yield 'Monday'
  yield 'Tuesday'
  // ...and so on
}
```

```javascript
// ✓ SIMPLER — Just use an array
const daysOfWeek = [
  'Monday', 'Tuesday', 'Wednesday', 'Thursday',
  'Friday', 'Saturday', 'Sunday'
]
```
**Use generators when:**
- Values are computed on-demand (lazy)
- Sequence is infinite or very large
- You need to pause/resume execution
- Values come from async operations
**Use arrays when:**
- You have a fixed, known set of values
- Values are already computed
- You need random access (`array[5]`)
<Info>
- **Iterators** are objects with a `.next()` method that returns `{ value, done }`
- **Generators** are functions that pause at `yield` and resume at `.next()`
- Don't forget the asterisk — it's `function*`, not `function`
- `yield` pauses, `return` ends — and return values don't show up in `for...of`
- `yield*` passes through all values from another iterable
- Generators are **lazy** — nothing runs until you ask for it
- Infinite sequences work because generators compute on-demand
- `Symbol.iterator` is how you make objects work with `for...of`
- Async generators (`async function*`) let you `await` inside and iterate with `for await...of`
- Generators are **single-use** — once done, you need a fresh one
</Info>

- `yield` **pauses** the generator and returns `{ value, done: false }`. The generator can resume from where it paused.
- `return` **ends** the generator and returns `{ value, done: true }`. The generator cannot resume.
Important: Values from `return` are NOT included when using `for...of`, spread syntax, or `Array.from()`.
```javascript
function* example() {
yield 'A' // Included in iteration
yield 'B' // Included in iteration
return 'C' // NOT included in for...of!
}
console.log([...example()]) // ['A', 'B']
```
Add a `[Symbol.iterator]` method that returns an iterator (an object with a `.next()` method):
```javascript
const myObject = {
data: [1, 2, 3],
// Method 1: Return an iterator object
[Symbol.iterator]() {
let index = 0
const data = this.data
return {
next() {
if (index < data.length) {
return { value: data[index++], done: false }
}
return { done: true }
}
}
}
}
// Method 2: Use a generator (simpler!)
const myObject2 = {
data: [1, 2, 3],
*[Symbol.iterator]() {
yield* this.data
}
}
```
What will this code log, and in what order?

```javascript
function* gen() {
  console.log('A')
  yield 1
  console.log('B')
  yield 2
  console.log('C')
}

const g = gen()
console.log('Start')
console.log(g.next().value)
console.log('Middle')
console.log(g.next().value)
```
**Answer:**
```
Start
A
1
Middle
B
2
```
**Explanation:**
1. `gen()` creates the generator but doesn't run any code
2. `'Start'` logs
3. First `g.next()` runs until first `yield` — logs `'A'`, returns `{ value: 1, done: false }`
4. We log the value `1`
5. `'Middle'` logs
6. Second `g.next()` resumes and runs until second `yield` — logs `'B'`, returns `{ value: 2, done: false }`
7. We log the value `2`
8. `'C'` never logs because we didn't call `g.next()` a third time
Pass values as arguments to `.next(value)`. The value becomes the result of the `yield` expression:
```javascript
function* adder() {
const a = yield 'Enter first number'
const b = yield 'Enter second number'
yield `Sum: ${a + b}`
}
const gen = adder()
console.log(gen.next().value) // "Enter first number"
console.log(gen.next(10).value) // "Enter second number" (a = 10)
console.log(gen.next(5).value) // "Sum: 15" (b = 5)
```
Note: The first `.next()` starts the generator. Any value passed to it is ignored because there's no `yield` waiting to receive it yet.
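A tiny (hypothetical) `echo` generator makes the ignored first value easy to see:

```javascript
function* echo() {
  const received = yield 'ready'
  yield `got: ${received}`
}

const gen = echo()
console.log(gen.next('ignored').value) // "ready" ('ignored' has no yield to receive it)
console.log(gen.next('hello').value)   // "got: hello"
```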
Use async generators when you need to yield values from asynchronous operations:
- **Paginated APIs** — Fetch and yield page by page
- **Streaming data** — Process chunks as they arrive
- **Database cursors** — Iterate through large result sets
- **File processing** — Read and yield lines from large files
```javascript
async function* fetchPages(url) {
let page = 1
while (true) {
const response = await fetch(`${url}?page=${page}`)
const data = await response.json()
if (data.items.length === 0) return
yield data.items
page++
}
}
// Consume with for await...of
for await (const items of fetchPages('/api/products')) {
processItems(items)
}
```
Spread syntax (`...`) tries to collect ALL values into an array. With an infinite generator, this means infinite iteration. Your program will hang trying to collect infinite values.
```javascript
function* forever() {
let i = 0
while (true) yield i++
}
// ❌ DANGER — Hangs forever!
const all = [...forever()]
// ✓ SAFE — Limit how many you take
function* take(n, gen) {
let i = 0
for (const val of gen) {
if (i++ >= n) return
yield val
}
}
const first100 = [...take(100, forever())]
```
Always use a limiting function like `take()`, or manually call `.next()` a specific number of times.