00:01
<Justin Ridgewell>
Unfortunately that doesn't work, async generator functions will not reenter until all previous next calls settle
00:04
<Justin Ridgewell>
async function* foo() {
    yield new Promise(r => setTimeout(r, 1000));
    console.log('reenter');
}
const it = foo();
it.next();
it.next();
00:05
<bakkot>
the previous call settles in my example
00:05
<Justin Ridgewell>
The reenter won't log until the 1000ms is up
00:05
<bakkot>
yes but it doesn't need to
00:05
<bakkot>
my example still works
00:06
<bakkot>
async function* foo(){ yield 0; await sleep(1000); yield 1; }
let it = foo().map(x => fetch(x));
it.next();
// almost immediately after this point, the promise returned from the first call to the underlying foo()'s `.next` settles
// then the first call to `fetch` starts
it.next() // so this can almost immediately kick off another call to the underlying foo()'s `next`, and have that actually hit the `sleep`
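The snippets in this thread assume a `sleep` helper that is never defined; a minimal sketch:

```javascript
// Minimal `sleep` helper assumed by the snippets above (not defined in the
// thread): resolves after roughly `ms` milliseconds.
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));
```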
00:08
<Justin Ridgewell>
async function* foo(){
    yield 0;
    console.log('reenter');
    await sleep(1000);
    yield 1;
}
let it = foo();
it.next();
it.next();
console.log('end');

Logs end then reenter; the first yield doesn't settle the iterator until after a tick

00:09
<Justin Ridgewell>
I think you're incorrect, and even if you're correct, your example will start to fail again as soon as you chain your iterators with async functionality
00:09
<Justin Ridgewell>
let it = foo()
  .map(x => fetch(x))
  .map(r => r.json());
00:10
<bakkot>
not with the definition of map in the slides
00:10
<bakkot>
I am pretty sure
00:10
<bakkot>
but I will check after work and get back to you
00:10
<bakkot>
(couple hours probably)
00:12
<Justin Ridgewell>
It depends on how the source iterator performs parallelization
00:14
<bakkot>
pretty sure it does not
00:15
<Justin Ridgewell>
Oh you're right, the second .map will work in parallel with the first .map
00:16
<Justin Ridgewell>
I'm recalling when I was writing map as an async-gen function itself, which means I was still hitting backpressure
00:16
<Justin Ridgewell>
So only the first iterator chained off a back-pressure iterator will be serial
00:17
<Justin Ridgewell>
We still hit the reentrancy issue with an async-gen function, which would be neat to ~solve~change
03:04
<bakkot>
Justin Ridgewell: yeah, and that's the problem with the current spec too: the async iterator helpers are implemented as async generators
03:04
<bakkot>
here's code you can run today which demonstrates that the mapper from the slides can be parallel, both with itself and with the underlying generator https://gist.github.com/bakkot/a338838aee667517adb03edfa83aaed1
03:06
<bakkot>
and indeed the original generator can't be parallel with itself. I don't think there's anything we can do about that, though - the whole point of generators is that they're written as straight-line code, and you can't start executing the next line until the previous one finishes
03:09
<Kris Kowal>
I think interleaving of async map iterators is expected behavior.
03:09
<bakkot>
Kris Kowal: what do you mean by interleaving?
03:10
<bakkot>
just that you can be executing the mapper function in parallel with itself?
03:10
<bakkot>
(assuming the resulting mapped thing is pumped multiple times)
03:10
<Kris Kowal>
That effect for one, yes.
03:11
<Justin Ridgewell>
and indeed the original generator can't be parallel with itself. I don't think there's anything we can do about that, though - the whole point of generators is that they're written as straight-line code, and you can't start executing the next line until the previous one finishes
It's not just the running of the generator's code, but the settlement of the next() promise external to the generator that blocks
03:11
<Kris Kowal>
But also map(xf).map(yf) would interleave turns of xf and yf.
03:12
<Justin Ridgewell>
Eg, I want to be able to yield fetch() and reenter without that fetch having settled
03:12
<bakkot>
It's not just the running of the generator's code, but the settlement of the next() promise external to the generator that blocks
In the case that you specifically yield a promise, do you mean, or is there another case I'm not thinking of?
03:12
<bakkot>
ah yeah
03:12
<bakkot>
with this helper, you could do yield { v: fetch() } in your async generator and then .map(box => box.v) to get parallelism in the resulting thing (or rather the ability to be parallel assuming someone calls .next eagerly)
03:12
<bakkot>
a little silly but works fine
03:12
<bakkot>
... I think
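The boxing trick can be sketched today in userland, assuming a hypothetical `map` helper written as a plain async iterator (mirroring the proposed tweak, so the mapped value is never implicitly awaited) and a `slowTask` stand-in for `fetch`:

```javascript
// Hypothetical userland `map`: a plain async iterator, not an async generator,
// so the mapped value is handed to the consumer without being awaited.
function map(source, fn) {
  const inner = source[Symbol.asyncIterator]();
  return {
    [Symbol.asyncIterator]() { return this; },
    async next(v) {
      const { value, done } = await inner.next(v);
      // fn(value) may be a still-pending promise; wrap it without awaiting.
      return done ? { value: undefined, done: true } : { value: fn(value), done: false };
    },
  };
}

// Stand-in for a slow fetch (hypothetical).
const slowTask = () => new Promise(resolve => setTimeout(() => resolve('data'), 50));

// Box the promise so the generator's implicit `yield await` sees a plain
// object and the body can resume before the task settles.
async function* producer() {
  yield { v: slowTask() };
  console.log('reenter'); // reached without waiting for slowTask to settle
}

const mapped = map(producer(), box => box.v);
mapped.next();
mapped.next(); // 'reenter' logs almost immediately, not after 50ms
```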
03:13
<bakkot>
But also map(xf).map(yf) would interleave turns of xf and yf.
Right, yeah. You get that with my proposed tweak also (again assuming someone calls .next() eagerly)
03:14
<Kris Kowal>
Eg, I want to be able to return a yield fetch() and reenter without that fetch having settled
I don’t have spare attention to dig, but I would find it very surprising if yield fetch() were equivalent to yield await fetch().
03:15
<Justin Ridgewell>
They are, unfortunately
03:16
<Justin Ridgewell>
async function* foo() {
    yield new Promise(r => setTimeout(r, 1000));
    console.log('reenter');
}
const it = foo();
it.next();
it.next();
03:16
<Justin Ridgewell>
^ That waits a full second before logging reenter
03:17
<Justin Ridgewell>
My original map helper was something like for await (const x of source) yield mapper(x), and if mapper has any async waiting, it blocks.
03:18
<bakkot>
here's the slides where we decided that yield is yield await https://docs.google.com/presentation/d/1U6PivKbFO0YgoFlrYB82MtXf1ofCp1xSVOODOvranBM/edit#slide=id.g223fba4116_0_196
03:19
<bakkot>
and notes https://github.com/tc39/notes/blob/55af84ac0ed7a250206849dddd628b2c1db2c9b1/meetings/2017-05/may-25.md#15iva-revisiting-async-generator-yield-behavior
03:19
<Justin Ridgewell>
😦 Before my time
04:39
<Kris Kowal>
Ah, surprising but maybe for the best. In any case, “overdriving” (drawing concurrent promises from next()) a pipeline of async maps to get parallelism is useful, so glad that’ll work regardless.
04:40
<Kris Kowal>

In fact, the implicit await is doing me a big favor in:

function parallel(limit, process) {
  function *workers() {
    for (const worker of count(limit)) {
      yield process(worker);
    }
  }
  return Promise.all(workers());
}
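The `count` helper in the snippet isn't defined in the thread; a plausible sketch (yields the integers 0 through n-1):

```javascript
// Hypothetical `count` helper assumed by `parallel` above: yields 0..n-1.
function* count(n) {
  for (let i = 0; i < n; i++) yield i;
}
```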
04:44
<Kris Kowal>

Out of which you can build bounded concurrency pretty easily (I’m taking the liberty of assuming the existence of AsyncIterator.from having not followed along particularly closely)

const parallelForEach = async (limit, values, process) =>
  parallel(limit, () => AsyncIterator.from(values).forEach(process));
05:14
<bakkot>

In fact, the implicit await is doing me a big favor in:

function parallel(limit, process) {
  function *workers() {
    for (const worker of count(limit)) {
      yield process(worker);
    }
  }
  return Promise.all(workers());
}
There's no async iterator there and so no implicit await; did you mean a different thing?
05:14
<bakkot>

Out of which you can build bounded concurrency pretty easily (I’m taking the liberty of assuming the existence of AsyncIterator.from having not followed along particularly closely)

AsyncIterator.prototype.parallelForEach = async (limit, values, process) =>
  parallel(limit, () => AsyncIterator.from(values).forEach(process));
AsyncIterator.from works and does the thing you want, yup, though your specification of parallelForEach is possibly confused - prototype-placed methods generally want to refer to this
05:15
<Kris Kowal>
AsyncIterator.from works and does the thing you want, yup, though your specification of parallelForEach is possibly confused - prototype-placed methods generally want to refer to this
Indeed, lost in translation. Local copy wasn’t framed as a prototype method.
05:16
<Kris Kowal>
There's no async iterator there and so no implicit await; did you mean a different thing?
Oh, indeed.
05:38
<Justin Ridgewell>

If you want the flexibility of not blocking on the async operation before continuing the async generator body, then just don't yield the value yet.

🤔 That doesn't really make sense.

05:55
<Justin Ridgewell>
I wonder how difficult it would be to change to Option 2, which would allow easy parallel processing
05:58
<Justin Ridgewell>

The only way to consume async iterables:

  • Callers of .next() will await (they already had to, we're returning promises)
  • for-await-of can be updated to await the inner value
    • Also call .return() if rejection happens
05:59
<Justin Ridgewell>
Surprisingly, I don't think there'll be a web-compat risk.
06:02
<Justin Ridgewell>

The ways to consume an async iterator:

  • manual .next()
  • yield* (which only works inside another async iterable)
  • for-await-of
06:02
<Justin Ridgewell>
I don't imagine manual .next() iteration is super common?
06:03
<Justin Ridgewell>
So the main way to consume an async iterable is to use for-await-of, which we can update to await the inner promise (and clean up the iterator on rejections)
06:04
<Justin Ridgewell>
We can ignore yield*, because it's only available inside other async generators: the result is still an async iterator, and that still needs to be consumed with for-await-of.
06:06
<Justin Ridgewell>
.next() iteration could be updated to await the inner value, but not block on that settlement
06:08
<Justin Ridgewell>
Or maybe we need a .nextRaw() which won't unwrap the inner
06:09
<bakkot>
the main web-compat risk is that (async function*(){ try { yield Promise.reject(1) } catch (e) { console.log('caught'); } })().next() prints caught
06:09
<bakkot>
i.e. yielding a rejected promise triggers catch handlers around the yield
06:09
<bakkot>
changing that would be fraught at this point
06:10
<Justin Ridgewell>
A .nextRaw() then
06:11
<Justin Ridgewell>
the main web-compat risk is that (async function*(){ try { yield Promise.reject(1) } catch (e) { console.log('caught'); } })().next() prints caught
Actually, how common is manual iteration?
06:13
<bakkot>
sorry, it's not the .next that's relevant here
06:13
<bakkot>
you'd get the same caught in a for-await-of
06:13
<bakkot>
the relevant bit is that the catch handler inside the async generator gets triggered
06:14
<bakkot>
no idea how common manual iteration is though. my guess would be not especially but there's definitely times you want it, e.g. for queues and stuff
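The queue use case can be sketched as a push-based async queue consumed with manual `.next()` (hypothetical helper, all names assumed):

```javascript
// Minimal push-based async queue exposed as an async iterator (hypothetical).
// Manual .next() is natural here: consumers can request a value before one exists.
function asyncQueue() {
  const values = [];   // buffered results, when pushes outpace reads
  const waiters = [];  // pending .next() resolvers, when reads outpace pushes
  return {
    [Symbol.asyncIterator]() { return this; },
    push(value) {
      if (waiters.length) waiters.shift()({ value, done: false });
      else values.push(Promise.resolve({ value, done: false }));
    },
    next() {
      if (values.length) return values.shift();
      return new Promise(resolve => waiters.push(resolve));
    },
  };
}

const q = asyncQueue();
const pending = q.next();                         // stays pending until a push
q.push('hello');                                  // settles the outstanding .next()
pending.then(({ value }) => console.log(value));  // logs 'hello'
```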
06:37
<Justin Ridgewell>

you'd get the same caught in a for-await-of
the relevant bit is that the catch handler inside the async generator gets triggered

I would phrase this as ".next() will call .error() if promise rejects"

06:40
<Justin Ridgewell>

Eg, the slides say:

// Given:
AsyncIterator.prototype = {
  async next(v) {
    return { value: await resumeWithValue(v), done }
  },

  async error(e) {
    return { value: await resumeWithError(e), done }
  }
}

// Input
async function* foo() {
    try {
        yield Promise.reject(1);
    } finally {
        console.log(2);
    }
}

// Output
async function* foo() {
    try {
        yield await Promise.reject(1);
    } finally {
        console.log(2);
    }
}
06:41
<Justin Ridgewell>
I think we could unobservably change that to:
06:42
<Justin Ridgewell>
// Given:
AsyncIterator.prototype = {
  async next(v) {
    try {
      return { value: await resumeWithValue(v), done }
    } catch (e) {
      return { value: await resumeWithError(e), done }
    }
  },

  async error(e) {
    return { value: await resumeWithError(e), done }
  }
}

// Input
async function* foo() {
    try {
        yield Promise.reject(1);
    } finally {
        console.log(2);
    }
}

// Output has no change
06:43
<Justin Ridgewell>

And if we can do that, why can't we do:

// Given:
AsyncIterator.prototype = {
  async next(v) {
    try {
      return { value: await (await this.nextRaw(v)).value, done }
    } catch (e) {
      return { value: await resumeWithError(e), done }
    }
  },

  async nextRaw(v) {
    return { value: resumeWithValue(v), done }
  },

  async error(e) {
    return { value: await resumeWithError(e), done }
  }
}
15:01
<Mathieu Hofman>
the main web-compat risk is that (async function*(){ try { yield Promise.reject(1) } catch (e) { console.log('caught'); } })().next() prints caught
That implicit await is actually surprising to me.
15:05
<Mathieu Hofman>
no idea how common manual iteration is though. my guess would be not especially but there's definitely times you want it, e.g. for queues and stuff
We do quite a bit of manual iteration, and it's error prone. I would expect that calling .next() would unwrap / await the inner promise. I don't see a reason to need nextRaw()
15:09
<Mathieu Hofman>
And yes making the .next() implementation trigger the equivalent .throw(e) logic on rejection during unwrapping would preserve the consumer behaviors (without needing to update for-await-of)
15:11
<Mathieu Hofman>
tricky part is that on the producer side, the throw() continuation may come in when the generator has already moved on if the consumer called .next() without awaiting (manually iterated)
16:19
<bakkot>
Having .next call .throw on the generator would be kind of weird - that would inject the throw completion into the async generator at a point unrelated to the place that produced the rejected promise.
16:20
<bakkot>
tricky part is that on the producer side, the throw() continuation may come in when the generator has already moved on if the consumer called .next() without awaiting (manually iterated)
Which I guess is the thing you were saying here, yeah.
16:23
<Mathieu Hofman>
I was just trying to analyze Justin Ridgewell's suggestion above. In most cases it would behave the same, except for multiple un-awaited .next() calls
16:25
<Mathieu Hofman>
Regardless of the approach, if we want to make generators not implicitly await, it will be a contract change on the producer side in case the consumer doesn't await itself
16:27
<Mathieu Hofman>
And I don't think there's a way to make the producer opt-in somehow, besides foregoing generators of course
16:28
<bakkot>
There is the hacky way using my tweak to async iterator helpers: replace async function* f(){ yield p; } with let f = () => (async function* f(){ yield { v: p }; })().map(box => box.v)
16:32
<Mathieu Hofman>
yeah true
16:34
<Mathieu Hofman>

ok I think that means not changing the generator semantics, and leaving it to the producer opting in these semantics through helpers.

I assume the helper would call .throw() on the wrapped generator if it encounters a rejection ?

16:38
<bakkot>
No. Consider the case when you're not doing this box thing, just .map(f) - if f throws, that's not an error in the producer
16:39
<bakkot>
If the mapper function throws then .map will call .return on the producer, in the same way that for await (let item of iter) { throw 0 } would call .return on the producer, of course
16:39
<bakkot>
but right now nothing in the language calls .throw, and iterator helpers will not change that
16:46
<Mathieu Hofman>
Oh right I always forget that .throw() is only called for yield * but not in for await of
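That behavior is observable today: an abrupt exit from a for-await-of body closes the iterator via .return and never consults .throw. A minimal sketch with a hypothetical logging iterator:

```javascript
// Logging async iterator (hypothetical) to observe which cleanup method
// for-await-of invokes when the loop body exits abruptly.
async function observeAbruptExit() {
  const calls = [];
  const iter = {
    [Symbol.asyncIterator]() { return this; },
    async next() { return { value: 1, done: false }; },
    async return() { calls.push('return'); return { value: undefined, done: true }; },
    async throw(e) { calls.push('throw'); throw e; },
  };
  try {
    for await (const x of iter) throw new Error('bail'); // abrupt completion in the body
  } catch {}
  return calls; // only 'return' is recorded; .throw is never called
}
```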
16:50
<Mathieu Hofman>
Should f get access to the iterator as a 3rd argument like array.map() ?
16:52
<Mathieu Hofman>
so that we can do .map(async (box, i, iter) => { try { return await box.v } catch(e) { iter.throw(e); throw e; } }) ?
16:53
<Mathieu Hofman>
wouldn't be very clean tho as the spec would still call .return after .throw being called by the mapper function ...
16:54
<bakkot>
Yeah, I think if the producer wants to handle rejected promises itself, it needs to await them