05:01
<David Alsh>

Hi all, new guy here!

I'm entertaining the idea of opening a proposal but wanted to get a sense of how the community feels about the concept before I commit a weekend to writing it out - haha.

Generally, it's the concept of incorporating Rust-inspired static macros. Macros are functions that take tokens and return static ECMAScript - similar to a template literal + eval, but with LSP support and without the need for eval.

All macros would be defined by users, none defined in the language.

// Parse tokens and generate React.createElement() calls
const Foo = () => jsx!(<div>Hello World</div>)

// Create a getter/setter for a property
class Bar {
  #[observe]
  baz = undefined
}

The syntax itself can be changed; I am just defaulting to Rust syntax for familiarity. Something like jsx!() could be ~jsx() or anything else (to avoid conflicts) - I'm not opinionated here

While I know ECMAScript is not a compiled language, there are interesting use cases like:

  • Enabling support for jsx/other templating languages without an external preprocessor/eval
  • Macro annotations that facilitate mutation observability, leading to more ergonomic syntax for frameworks like Vue/Angular
  • A reduced need for custom compilers (ngc, .vue, .svelte)
  • More ergonomic usage for tools like protobuf
  • etc

This would also lead to support in preprocessors like TypeScript which could statically compile macros - offering an interesting/novel hybrid macro system.

Any thoughts on the concept?

05:12
<Ashley Claymore>
For the "create a getter/setter" example, that looks like it can be accomplished via a decorator https://github.com/tc39/proposal-decorators
05:16
<Ashley Claymore>
Hi and welcome btw 👋🏻 
05:22
<David Alsh>

Yes, I do agree that this idea shares utility with decorators; however, due to the static nature of macros (accept tokens, return ECMAScript), it expands the concept in various ways, like:

  • Opening a path for preprocessors (like TypeScript, swc, probably v8) to do AoT compilation, leading to applications being faster at runtime
  • Incorporating the annotatability of variable assignments as well as class properties
  • Runtime (and potentially AoT) transformation of tokens which allows parsing (and LSP support) for jsx, templating languages and even data formats like yaml, xml, protobuf, etc
  • Many more use cases

const vueComponent = {
  template: vue!(<div>Hello {{data.world}}</div>),
  data: () => ({ world: 'World' })
}

const contents = yaml!(foo: 'bar')
// Expands to
// const contents = { foo: 'bar' }

LSPs would see the expanded form of macros and the macro implementation would define where errors occur in a transformation.

In the Rust world, this has been used to great effect to create GUI frameworks with vastly different approaches without preprocessors, some resembling React, others Vue:

  • Web wasm frameworks [1] [2]
  • Native desktop frameworks [1] [2]
05:58
<Ashley Claymore>
I am curious how things like auto-complete and refactoring work within Rust macros. These both sound easier for tooling to implement when the language extension is concrete, instead of being defined as a macro.
06:21
<David Alsh>

For autocomplete, the LSP expands the macro and uses the output

const foo = add!(1, 1)

Would expand into

const foo = 2

So consumers see it as a number rather than a macro. It simply replaces itself with the transformed result.

As for refactoring: when someone implements a macro, they parse "tokens", which they rearrange into the equivalent valid language syntax.

The editor can unwind that to determine what to update - though there are some macros that are too complex and refactoring tools bail out.

Implementers can also throw during parsing, which bubbles up to the editor, showing errors in the correct place in the macro
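
To make that concrete, here's a rough sketch - purely hypothetical, since no macro API exists in the language - modeling a yaml!-style macro as a plain function from raw tokens to generated source, with a parse error thrown for the editor to surface:

```javascript
// Hypothetical sketch only - no macro API exists in ECMAScript today.
// A yaml!-style macro modeled as a function from raw tokens to expanded source.
function yamlMacro(tokens) {
  // Expect a single `key: 'value'` pair, e.g. "foo: 'bar'".
  const match = /^(\w+):\s*'([^']*)'$/.exec(tokens.trim());
  if (match === null) {
    // Throwing during parsing is how an error would bubble up to the editor.
    throw new SyntaxError(`yaml! could not parse: ${tokens}`);
  }
  const [, key, value] = match;
  // The macro replaces itself with this expanded ECMAScript source.
  return `{ ${key}: '${value}' }`;
}

yamlMacro("foo: 'bar'"); // expands to the source text "{ foo: 'bar' }"
```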

06:31
<Ashley Claymore>
For auto-complete, when the macro is expanded, how is the IDE cursor position preserved?
06:32
<Ashley Claymore>
the position could expand to a much larger block of code 
06:34
<Ashley Claymore>
the expanded code would also lack type annotations, meaning TypeScript would only be able to infer what might be a correct auto complete 
06:38
<David Alsh>

These both sound easier for tooling to implement when the language extension is concrete

Macros are more of a preprocessing step rather than dynamic functionality (like decorators) - but there's a chicken and egg problem.

The best-supported illustration of macros in tooling is the built-in transformation of jsx in TypeScript. There are many GitHub issues requesting that this be expanded to offer custom programmatic macros; however, it has been deemed out of scope.

Attempts by other toolmakers to implement the functionality that macros offer have resulted in fragmented, brittle implementations (.vue and .svelte files: custom tooling, editor plugins, and poor support for test runners, linters, formatters, etc).

Decorators also feel like an attempt at addressing the same problem albeit at runtime.

After spending some time in the Rust world, I personally feel the ES ecosystem shows evidence of a need for this type of functionality

06:40
<David Alsh>

the expanded code would also lack type annotations, meaning TypeScript would only be able to infer what might be a correct auto complete

It's likely that TypeScript would compile macros into TypeScript at compile time (and in the editor, like they do with jsx), where the emitted runtime code would be the static JavaScript equivalent

That said, if they simply strip the types and leave macros to be evaluated at runtime, then perhaps type information would be an issue

06:42
<Ashley Claymore>
I'm also wondering about tools like swc: it wouldn't be able to execute the macros natively in Rust. It would need to delegate to JS, adding overhead that negates the tool's desire for speed
06:43
<Ashley Claymore>
Unlike with Rust macros where the compiler naturally understands how to execute Rust
06:45
<David Alsh>

I'm also wondering about tools like swc: it wouldn't be able to execute the macros natively in Rust. It would need to delegate to JS, adding overhead that negates the tool's desire for speed

Calling into Node.js is not as fast as running native Rust code directly, and there is overhead in crossing from Rust to Node.js, but it wouldn't be a substantial performance hit (you can do hundreds of millions of Node.js calls per second).

There are also options like embedding Deno, QuickJS or llrt for macros that don't use niche Node.js functionality

If callable macros are defined using a templating syntax (like in Rust), then an interpreter could be written (or borrowed directly from an existing JS engine) and run entirely in Rust (example)

06:51
<David Alsh>
Also, I really appreciate you entertaining this idea and asking such great questions 🙂
06:53
<David Alsh>

For auto-complete, when the macro is expanded, how is the IDE cursor position preserved?

If I get to the point where a proposal is actually on the cards, I will certainly dig deeper into how Rust does this to provide a clearer answer

07:32
<ljharb>
rust macros have some really unfortunate limitations tho - in particular, it seems they can't handle anywhere close to the level of expressiveness and dynamism i'd expect coming from JS
16:58
<ptomato>
there are two kinds of rust macros; for the cases where the normal macros aren't expressive enough, you can write a procedural macro to literally bypass the parser and operate on a stream of tokens from the lexer
16:59
<ptomato>
(and the JS equivalent of that starts to sound like https://github.com/tc39/proposal-binary-ast)
17:02
<ljharb>
and even with that i still haven't been able to get what i want to do, done :-/ i'm sure it's a mix of lacking capability in both self and language
17:03
<snek>
what haven't you been able to do with proc macros?
17:05
<bakkot>
David Alsh: macros in the language mean moving complexity and build time from the developer's computer to the user's computer, which I think is very negative
17:05
<bakkot>
as a developer it's obviously appealing, but in terms of its effects on the world I think it would be very bad
17:06
<Luca Casonato>
Proc macros can generate arbitrary Rust code by executing arbitrary Rust code taking arbitrary lexer tokens as inputs. They are turing complete - there is nothing they can not do
17:07
<snek>
they can't properly attribute error locations :P
17:08
<Luca Casonato>
They can in nightly :D
17:08
<Michael Ficarra>
Proc macros can generate arbitrary Rust code by executing arbitrary Rust code taking arbitrary lexer tokens as inputs. They are turing complete - there is nothing they can not do
there's a ton of things that Turing machines cannot do
17:09
<Luca Casonato>
Sure. Then more accurately: they have all the same capabilities as any other Rust program (because they are a special kind of Rust program)
17:09
<ljharb>
then that probably means that rust's type system can't do what i want :-p
17:48
<sirisian>
I started looking at code generation (like C#'s) before to see how it could function in JS, and it has similar issues: it moves what might be build-time work to the user's computer, which as mentioned is contentious.
21:01
<James M Snell>
Hey all, just doing some sketching out on a proposal for a Promise.lazy(fn) API (https://github.com/jasnell/proposal-promise-lazy) ... the idea is to simplify a pattern that is currently possible to do with custom thenables with a fair amount of boilerplate. This is very early and not yet ready for in-committee discussion, but I'd like to invite folks to take a look and provide early feedback. I'll likely raise it for in-committee discussion at the next scheduled plenary
21:16
<bakkot>
Hey all, just doing some sketching out on a proposal for a Promise.lazy(fn) API (https://github.com/jasnell/proposal-promise-lazy) ... the idea is to simplify a pattern that is currently possible to do with custom thenables with a fair amount of boilerplate. This is very early and not yet ready for in-committee discussion, but I'd like to invite folks to take a look and provide early feedback. I'll likely raise it for in-committee discussion at the next scheduled plenary
neat
21:18
<bakkot>
one thing to consider: Promise.resolve (which is also used in the await syntax) has a special case for "real" promises (which works by checking the .constructor property), and you'd need to consider whether this would count
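
roughly, as a simplified model (not the exact spec steps - the spec brand-checks an internal slot, approximated here with instanceof):

```javascript
// Simplified model of the PromiseResolve fast path: a "real" promise whose
// .constructor matches is returned as-is, so its .then is never called -
// which is why a lazy thenable's laziness would not trigger on this path.
function promiseResolve(C, x) {
  if (x instanceof Promise && x.constructor === C) {
    return x; // fast path: returned unchanged, .then never invoked
  }
  return new C(resolve => resolve(x)); // slow path: x's .then will be called
}

const p = Promise.resolve(42);
console.log(promiseResolve(Promise, p) === p); // true
```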
21:18
<Mathieu Hofman>
I am pretty sure this is not going to be acceptable to some delegates, including us. Promises are one-way, and I don't want them to grow a back channel
21:19
<bakkot>
I am pretty sure this is not going to be acceptable to some delegates, including us. Promises are one-way, and I don't want them to grow a back channel
it's not adding them to actual promises
21:19
<bakkot>
it's making a new kind of thenable, which you could already do in userland
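
for example, something like this userland sketch (my approximation, not necessarily the proposal's exact semantics):

```javascript
// A userland lazy thenable: the work runs in a microtask after the first
// .then/await, and is memoized so repeated .then calls share one execution.
function lazy(fn) {
  let inner = null;
  return {
    then(onFulfilled, onRejected) {
      // Promise.resolve().then(fn) defers fn to a microtask on first access.
      inner ??= Promise.resolve().then(fn);
      return inner.then(onFulfilled, onRejected);
    },
  };
}

let started = false;
const task = lazy(() => { started = true; return 42; });
console.log(started); // false - nothing runs until the thenable is awaited
```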
21:19
<Mathieu Hofman>
We actually go through a great extent to protect against promises with a custom then or then getter in our environment.
21:20
<Mathieu Hofman>
Right, nothing prevents anyone from making a new thenable, but that's not a promise
21:21
<bakkot>
well, ok, call it LazyPromise and make it a subclass instead of Promise.lazy then
21:21
<bakkot>
that seems like a stage 2 concern
21:21
<Mathieu Hofman>
Promise.resolve(), which is extensively used to check if something is a promise, would trigger the laziness
21:22
<bakkot>
¯\_(ツ)_/¯ seems fine
21:22
<bakkot>
I very much contest "extensively" though
21:23
<bakkot>
I would buy "extensively in our extremely unusual codebase"
21:23
<Mathieu Hofman>
Right, I might have a myopic vision here.
21:23
<Ashley Claymore>
await does feel like the wrong place to start work
21:24
<bakkot>
fwiw that's how it works in Rust and it's fine
21:24
<James M Snell>
Well, to be clear, per the proposal, the work is not started on the await (or the .then call). The actual work is deferred as a microtask.
21:25
<Ashley Claymore>
oh I jumped to the wrong assumption
21:26
<James M Snell>
Mathieu Hofman: ... I get the objection... if these were brand-checkable, would that make a difference to you? e.g. something like const p = Promise.lazy(...); if (Promise.isLazy(p)) { ... }
21:26
<Mathieu Hofman>
FYI that class LazyPromise example doesn't work correctly when then is called multiple times (which reinforces that it's hard to actually get this right)
21:26
<James M Snell>
FYI that class LazyPromise example doesn't work correctly when then is called multiple times (which reinforces that it's hard to actually get this right)
yep, it was a quick example. didn't go through all the necessary boilerplate to make it complete
21:27
<Mathieu Hofman>
deferred to a microtask after the await or then call, right?
21:27
<James M Snell>
deferred to a microtask after the await or then call, right?
Yes. When the reaction is added, the callback is deferred to the microtask at that point
21:29
<snek>
do you have some example use cases for this?
21:31
<James M Snell>
do you have some example use cases for this?
I need to work up a few more of the examples, but the use case that prompted this for me was a work queue built around custom thenables that ended up having a fair amount of boilerplate to get right (for instance, Mathieu Hofman's comment about the LazyPromise example needing a lot more to be safe)
21:32
<bakkot>
do you have some example use cases for this?
(there is one example at the bottom of the readme, in case you didn't see that it down there)
21:33
<snek>
i feel like this example is a meta example
21:33
<snek>
now i'm curious what you would put into this queue in practice
21:33
<Ashley Claymore>
would a generator help there?
21:33
<James M Snell>
yeah that example is lacking. One of my key tasks before bringing this officially to the committee is to expand the use cases and examples
21:34
<Michael Ficarra>
I need to work up a few more of the examples, but the use case that prompted this for me was a work queue built around custom thenables that ended up having a fair amount of boilerplate to get right (for instance, Mathieu Hofman's comment about the LazyPromise example needing a lot more to be safe)
how about we just provide a built-in work queue instead?
21:34
<Michael Ficarra>
(I am already considering doing this btw)
21:35
<James M Snell>
would a generator help there?
This was about 8 months or so ago that I first started thinking about this, and at the time we did end up moving the code to a generator to improve things, but even then it was still a fair amount of boilerplate. It definitely helped that case tho
21:35
<snek>
i look forward to seeing some. i don't think i have ever (in js or rust) needed to take advantage of executor laziness in a way that semantically matters.
21:35
<James M Snell>
i look forward to seeing some. i don't think i have ever (in js or rust) needed to take advantage of executor laziness in a way that semantically matters.
Fair
21:35
<snek>
only for performance optimization
21:36
<James M Snell>
I'll have that (better documenting the use cases) at the top of the todo list then before moving this forward
21:37
<Ashley Claymore>
they feel similar to a memo'ed lazy 0-arg async function, where the trigger is an explicit call. The lazy promise looks useful when the code needs to be generic. Would be good to see an example where the code wouldn't have worked with a regular lazy function
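
something like this sketch, for comparison (lazyFn / getData are hypothetical names, not proposed API):

```javascript
// A memoized lazy 0-arg async function: same "runs once, on demand"
// behavior, but the trigger is an explicit call rather than .then/await.
function lazyFn(fn) {
  let p = null;
  return () => (p ??= fn()); // first call starts the work; later calls reuse it
}

const getData = lazyFn(async () => 42);
// Nothing has run yet; both awaits below would share one execution:
// await getData(); // 42
// await getData(); // 42 (memoized)
```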
21:38
<snek>
this sort of reminds me about the disagreement over iterator reuse, though maybe i'm over-generalizing.
21:40
<Ashley Claymore>
as long as we don't add a .then to async functions so that we have both Then-ables and Then-ors
21:46
<James M Snell>
as long as we don't add a .then to async functions so that we have both Then-ables and Then-ors
oh no no no... definitely not
21:47
<snek>
thenerators
21:47
<James M Snell>
the motivation for this is not to add anything that cannot already be done with custom thenables; it's just to make it easier to implement that pattern correctly and with less boilerplate
21:48
<snek>
i think you can trick v8 into emitting negative line numbers
21:48
<Ashley Claymore>
I'm trying to work out if it composes with https://github.com/tc39/proposal-concurrency-control, but it's been a long day
21:49
<snek>
ljharb: https://gc.gy/5787c068-7972-4c5b-bc38-dc39b7704308.png
22:01
<Michael Ficarra>
I'm trying to work out if it composes with https://github.com/tc39/proposal-concurrency-control, but it's been a long day
I have a secret prototype task queue that I will share some day
22:02
<bakkot>

I think so? Something like

let governor = new CountingGovernor(5);

let tasks = data.map(
  x => Promise.resolve(x)
    .lazyThen(async () => { using _ = await governor.acquire(); return await doWork(x); })
);

this gives you: tasks are not started until first polled, and no matter how many are being polled, no more than 5 can be doing work at a time

22:02
<bakkot>
but a real task queue would still be better I expect