For 'authors' and 'license', this is a bulk setting for basically every
crate in the repo. It's not really *shorter* to have them all take
their values from the root Cargo.toml, but it sets a precedent for
other values.
'version' is a little more interesting. The versions we use for tags
only really apply to the app libraries; it would be odd for Rust
clients to see the "version" of libsignal-protocol bump on every
release when most releases contain no changes to libsignal-protocol.
So for now, only the bridge crates take their version from the
workspace; if we ever, say, release libsignal-protocol on crates.io,
it will be important for it to have its own versioning scheme,
separate from libsignal as a whole.
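The mechanism is Cargo's workspace inheritance; a rough sketch (the
concrete values and crate name here are illustrative, not the repo's
actual metadata):

```toml
# Root Cargo.toml: shared values for the whole workspace.
[workspace.package]
authors = ["Example Authors"]
license = "AGPL-3.0-only"
version = "0.1.0"

# A crate's Cargo.toml (a separate file in practice, shown inline here):
# authors and license come from the workspace in every crate.
[package]
name = "example-bridge-crate"
authors.workspace = true
license.workspace = true
# Only the bridge crates also inherit the version; library crates like
# libsignal-protocol keep an explicit version of their own.
version.workspace = true
```
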
Use https://github.com/tamasfe/taplo to auto-format TOML files. Add a config
file to force reordering of dependencies in Cargo.toml files. Run taplo in CI
to check formatting.
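A taplo config along these lines could express the dependency
reordering; this is a sketch, so check taplo's documentation for the
exact schema and key names:

```toml
# .taplo.toml (name and schema per taplo's docs; sketch only)
[formatting]
# Don't reorder keys in general...
reorder_keys = false

# ...but do sort the dependency tables in Cargo.toml files.
[[rule]]
include = ["**/Cargo.toml"]
keys = ["dependencies", "dev-dependencies", "build-dependencies"]

[rule.formatting]
reorder_keys = true
```

In CI, something like `taplo format --check` would then fail the build
on unformatted files.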
- Feature flags removed for unconditionally-provided APIs.
- A function's this() is no longer guaranteed to be an object,
so we have to check and error out more often.
- Use of usize instead of i32 in a few places.
- Convenience for fetching globals.
We still do some extra work to ensure panics are caught and turned
into JavaScript errors, so our promise() function doesn't go away
altogether, but the implementation's much simpler and now more
strongly typed.
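The panic-catching half of promise() can be sketched in plain Rust,
independent of Neon; `run_catching` and its String error type are
illustrative stand-ins for turning a panic into a JavaScript rejection:

```rust
use std::panic::{self, AssertUnwindSafe};

/// Run a closure and convert any panic into an error value, which the
/// real code would then surface as a JavaScript error/rejection instead
/// of crashing the process.
fn run_catching<T>(f: impl FnOnce() -> T) -> Result<T, String> {
    panic::catch_unwind(AssertUnwindSafe(f)).map_err(|payload| {
        // Panic payloads are usually &str or String; anything else gets
        // a generic message.
        payload
            .downcast_ref::<&str>()
            .map(|s| s.to_string())
            .or_else(|| payload.downcast_ref::<String>().cloned())
            .unwrap_or_else(|| "panic with non-string payload".to_string())
    })
}
```
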
- Drop our fork of Neon now that our changes have been integrated
- Adopt rename of EventQueue to Channel
- Add a napi-6 feature to signal-neon-futures to make it easier to test
under the configuration we're actually shipping
Both futures::executor::block_on and our own expect_ready were being
used to resolve futures that were, in practice, known to be
non-blocking. FutureExt::now_or_never handles that case more lightly
than block_on and more uniformly than expect_ready.
This lets us shrink the dependency on the full 'futures' crate down to
just futures_util, which should help with compile times.
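For reference, what now_or_never does can be sketched with only the
standard library (this is an illustration of the idea, not
futures_util's actual implementation; it needs a Rust with
Waker::noop):

```rust
use std::future::Future;
use std::pin::pin;
use std::task::{Context, Poll, Waker};

/// Poll the future exactly once with a no-op waker and return its output
/// only if it was already ready. No executor, no blocking: exactly the
/// light touch we want for futures known to be non-blocking in practice.
fn now_or_never<F: Future>(fut: F) -> Option<F::Output> {
    let mut fut = pin!(fut);
    let mut cx = Context::from_waker(Waker::noop());
    match fut.as_mut().poll(&mut cx) {
        Poll::Ready(value) => Some(value),
        Poll::Pending => None,
    }
}
```
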
Node can more efficiently handle multiple tasks coming in on the same
queue, so remove the "convenience" APIs that derive a new queue from a
Context, and require an existing EventQueue instead. This cuts more
time off of our decryption benchmark (not checked in).
Additionally, run the first poll for the future synchronously, to
avoid having to wait for the event loop to pick up the task to start
the future.
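The eager first poll can be sketched like this; `start_eagerly` and
`schedule` are hypothetical stand-ins (the real code wakes the task via
the EventQueue/Channel rather than a no-op waker):

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll, Waker};

/// Poll the future once on the calling thread. If it's already done,
/// we never touch the event loop at all; only a still-Pending future
/// gets handed off (via `schedule`) to be driven by the event loop.
fn start_eagerly<F: Future>(
    fut: F,
    schedule: impl FnOnce(Pin<Box<F>>),
) -> Option<F::Output> {
    let mut task = Box::pin(fut);
    let mut cx = Context::from_waker(Waker::noop());
    match task.as_mut().poll(&mut cx) {
        Poll::Ready(value) => Some(value),
        Poll::Pending => {
            schedule(task); // event loop drives it to completion later
            None
        }
    }
}
```
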
It turns out this takes less than a millisecond, at least when the
event loop is empty, but the setup might still be useful in the
future. Run with `cargo bench -p signal-neon-futures`.
In particular, I got the terms "fulfill" and "resolve" mixed up. This
version should have the correct use of "settled", "fulfilled",
"rejected", and (in rare cases) "resolved".
Neon provides a way to expose *synchronous* JavaScript functions from
Rust. This means that if Rust wants to wait for the result of a
JavaScript promise, it can at best return a callback to continue its
work when the promise resolves. This does not naturally compose with
Rust's `async`, which works in terms of Futures.
This commit adds a new crate, signal-neon-futures, that provides
functionality for (1) wrapping JavaScript promises so they can be
awaited in Rust, and (2) producing a JavaScript promise that wraps
a Rust future. It does so by synchronously resuming execution of the
Rust future whenever an awaited JavaScript promise settles.
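The callback-to-Future bridge at the heart of (1) can be sketched in
plain Rust; all names here (Settle, PromiseFuture, promise_pair,
poll_once) are hypothetical illustrations of the pattern, not the
crate's API:

```rust
use std::future::Future;
use std::pin::Pin;
use std::sync::{Arc, Mutex};
use std::task::{Context, Poll, Waker};

/// Slot shared between the callback side and the awaiting side.
struct Shared<T> {
    value: Option<T>,
    waker: Option<Waker>,
}

/// Handle given to the callback world; in the real crate the analogous
/// step happens when the JavaScript promise settles.
struct Settle<T>(Arc<Mutex<Shared<T>>>);

impl<T> Settle<T> {
    fn fulfill(&self, value: T) {
        let mut shared = self.0.lock().unwrap();
        shared.value = Some(value);
        // Wake the Rust future so it gets polled again and sees the value.
        if let Some(waker) = shared.waker.take() {
            waker.wake();
        }
    }
}

/// The Rust-side future: Pending until fulfill() has been called.
struct PromiseFuture<T>(Arc<Mutex<Shared<T>>>);

impl<T> Future for PromiseFuture<T> {
    type Output = T;
    fn poll(self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<T> {
        let mut shared = self.0.lock().unwrap();
        match shared.value.take() {
            Some(value) => Poll::Ready(value),
            None => {
                shared.waker = Some(cx.waker().clone());
                Poll::Pending
            }
        }
    }
}

fn promise_pair<T>() -> (Settle<T>, PromiseFuture<T>) {
    let shared = Arc::new(Mutex::new(Shared { value: None, waker: None }));
    (Settle(shared.clone()), PromiseFuture(shared))
}

/// Poll once with a no-op waker; enough when the value is already there.
fn poll_once<F: Future>(fut: F) -> Option<F::Output> {
    let mut fut = std::pin::pin!(fut);
    let mut cx = Context::from_waker(Waker::noop());
    match fut.as_mut().poll(&mut cx) {
        Poll::Ready(value) => Some(value),
        Poll::Pending => None,
    }
}
```
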