Commit Graph

21 Commits

Andrew Kaster
645eac1734 CI: Use new name for macOS JS artifact in js-benchmarks workflow
This was missed in 1a0b83a59f
2025-05-30 14:20:44 -06:00
Jelle Raaijmakers
8e9bf01917 CI: Rename os input to runner
This more clearly describes what the value is being used for, and avoids
some confusion between `os` and `os_name` in `lagom-template.yml`.
2025-05-27 12:10:55 +02:00
Jelle Raaijmakers
47e1aa054b CI: Use explicit run IDs to download JS binary and call webhook
When we try to retrieve benchmark results in the webhook call, we cannot
use the `head_sha` parameter since the workflow run might have a
different `head_sha` associated with it than the upstream workflow run.
This can happen when the JS repl binary workflow runs, a new commit is
pushed to master, and the subsequent JS benchmarks workflow run then
gets associated with that newer commit ID.

This extends the webhook payload to include the current run ID, which
can eventually be used by the webhook script to specifically download
the benchmark results associated with the current run.

Additionally, this changes the JS artifact download to use the upstream
run ID which seems nicer to do anyway.
2025-05-26 17:18:40 +02:00
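The run-ID-based artifact download described above could look roughly like this; the artifact name and the use of the default token are illustrative assumptions, not taken from the actual workflow:

```yaml
# Sketch: fetch the JS repl artifact from the explicit upstream run ID
# instead of matching on head_sha, which may point at a newer commit.
- name: Download JS repl artifact
  uses: actions/download-artifact@v4
  with:
    name: js-repl                                  # hypothetical artifact name
    run-id: ${{ github.event.workflow_run.id }}    # upstream workflow run
    github-token: ${{ secrets.GITHUB_TOKEN }}      # required for cross-run downloads
```

`actions/download-artifact@v4` accepts `run-id` and `github-token` inputs for exactly this cross-run case.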
Timothy Flynn
47569c1714 CI: Do not install clang in the JS benchmarks workflow
It's not needed. This is primarily to reduce the number of places that
need to be updated on the next clang rollout.
2025-05-15 18:53:49 +02:00
Jelle Raaijmakers
9f044cb547 CI: Only add LLVM repository if it is missing
For our js-benchmarks and libjs-test262 workflow runs, we already know
that they're provisioned with these repositories and can skip adding the
key and repo altogether.
2025-05-15 15:13:55 +02:00
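A guard like the following is one way to skip provisioning on pre-provisioned runners; the keyring path, list-file name, and LLVM version pin are illustrative assumptions:

```yaml
- name: Add LLVM apt repository if missing
  run: |
    # Only add the key and repo when the runner isn't already provisioned.
    if [ ! -f /etc/apt/sources.list.d/llvm.list ]; then
      curl -fsSL https://apt.llvm.org/llvm-snapshot.gpg.key \
        | sudo gpg --dearmor -o /usr/share/keyrings/llvm.gpg
      echo "deb [signed-by=/usr/share/keyrings/llvm.gpg] http://apt.llvm.org/$(lsb_release -cs)/ llvm-toolchain-$(lsb_release -cs)-20 main" \
        | sudo tee /etc/apt/sources.list.d/llvm.list
      sudo apt-get update
    fi
```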
Jelle Raaijmakers
44db17f273 CI: Switch from wget to curl
We were using both wget and curl arbitrarily; use curl exclusively since
that is installed by default on our machines and containers. Fixes the
js-benchmarks workflow.
2025-05-15 14:22:38 +02:00
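The wget-to-curl mapping is mechanical; a sketch of the equivalents (the URL variable is hypothetical):

```yaml
- name: Fetch a file with curl instead of wget
  run: |
    # wget URL          → curl -fLO URL        (keep remote filename)
    # wget -O out URL   → curl -fL -o out URL  (explicit output name)
    curl -fsSL -o artifact.tar.gz "$ARTIFACT_URL"
```

`-f` makes curl fail on HTTP errors the way wget does by default, which matters for CI steps that should abort on a bad download.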
Andrew Kaster
09ed4bd265 CI: Manually add apt repo for llvm 20 to all relevant jobs 2025-05-14 19:43:52 -04:00
Timothy Flynn
70d2b0b6f3 CI: Update the Clang pipeline to Clang 20 2025-05-14 02:01:59 -06:00
Jelle Raaijmakers
edaac0f2ee CI: Add missing event key to JS benchmarks workflow 2025-04-15 14:23:27 +02:00
Jelle Raaijmakers
1b4a4b0225 CI: Use workflow run's event commit SHA for JS benchmarks
Chaining workflows does not cause the subsequently spawned workflow runs
to use the same event; instead, each spawned run uses the latest head SHA
of the branch it runs on. This could leave the JS benchmarks jobs unable
to find artifacts (if a new JS repl workflow was started before the
previous one could finish) and/or assign the wrong commit SHA to the
benchmark results.

Since `github.event` contains information about the original workflow
run that spawned the JS benchmarks jobs, we can take the commit SHA from
there and use it to download the correct artifact.
2025-04-15 13:53:07 +02:00
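For `workflow_run`-triggered jobs, the originating run's commit is available in the event payload, so the pinning described above can be done like this:

```yaml
# Sketch: pin to the commit that produced the upstream artifacts, rather
# than the branch head at the time this run starts.
- name: Check out the upstream run's commit
  uses: actions/checkout@v4
  with:
    ref: ${{ github.event.workflow_run.head_sha }}
```

`github.event.workflow_run.head_sha` is part of the standard `workflow_run` event payload.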
Jelle Raaijmakers
1f81f75add CI: Remove concurrency configuration for JS artifacts and benchmarks
We had concurrency set on the JS artifacts and JS benchmarks workflows
causing them to not run in parallel for the same combination of
(workflow, OS name). You'd expect this to create a FIFO queue in which
the jobs run sequentially, but in reality GitHub keeps only a single
pending run per concurrency group and cancels all others. We don't want
that for our artifacts and benchmarks: we want them to run on each push.

For example, a new push could have workflows getting cancelled because
someone restarted a previously failed workflow, resulting in the
following message:

  "Canceling since a higher priority waiting request for [..] exists"

By removing the concurrency setting from these workflows, we make use of
all available runners to execute the jobs and potentially run some of
them in parallel. For the benchmarks however, we currently only have one
matching self-hosted runner per job, and as such they are still not run
in parallel.
2025-04-15 12:47:45 +02:00
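The removed setting would have looked roughly like this (the group key is illustrative); note that even without `cancel-in-progress`, GitHub keeps only the newest pending run per group and cancels older pending requests:

```yaml
# Removed: limiting each (workflow, OS name) combination to one queued run.
concurrency:
  group: ${{ github.workflow }}-${{ inputs.os_name }}
  cancel-in-progress: false
```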
Jelle Raaijmakers
d616ab0d95 CI: Remove EOL escapes from single quoted string in js-benchmarks
We don't need these; they mess up the JSON payload.
2025-04-15 10:55:05 +02:00
Jelle Raaijmakers
612b0cdddd CI: Add architecture to js-benchmarks webhook callback 2025-04-15 10:55:05 +02:00
Jelle Raaijmakers
8eb16633fe CI: Fix SHA-256 signature in webhook callback for js-benchmarks workflow 2025-04-14 17:09:41 +02:00
Jelle Raaijmakers
e18e7d6019 CI: Simplify workflow name for js-benchmarks.yml
No need to state the 'Run the' part.
2025-04-14 14:15:49 +02:00
Jelle Raaijmakers
e6f674fb7f CI: Run benchmarks on macOS as well
This introduces a matrix for the js-benchmarks workflow and runs both
the Linux x86_64 and macOS arm64 JS repl builds against our benchmarks
repository.
2025-04-14 14:15:49 +02:00
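A matrix covering both builds might be sketched as follows; the runner labels and variable names are illustrative, not the workflow's actual values:

```yaml
# Sketch: one job per (runner, arch) combination for the benchmarks.
strategy:
  matrix:
    include:
      - runner: ubuntu-24.04     # hypothetical Linux x86_64 runner label
        arch: x86_64
      - runner: macos-15         # hypothetical macOS arm64 runner label
        arch: arm64
runs-on: ${{ matrix.runner }}
```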
Jelle Raaijmakers
4600f9a5b0 CI: Call benchmarks webhook with curl instead of a dedicated action
The workflow-webhook action that was being used didn't work on macOS or
machines without Docker, so let's create the payload ourselves, sign it
and send it over using plain old `curl`.
2025-04-14 14:15:49 +02:00
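Building, signing, and sending the payload with plain curl could look like the sketch below; the secret names and payload fields are illustrative assumptions. Using `jq` to construct the body avoids the JSON-quoting pitfalls of hand-built strings:

```yaml
- name: Call benchmarks webhook
  run: |
    # Build the JSON body ourselves; jq guarantees correct quoting.
    payload=$(jq -cn --arg sha "$GITHUB_SHA" '{commit: $sha}')
    # GitHub-style HMAC-SHA256 signature over the raw request body.
    signature=$(printf '%s' "$payload" \
      | openssl dgst -sha256 -hmac "$WEBHOOK_SECRET" | awk '{print $NF}')
    curl -fsS -X POST "$WEBHOOK_URL" \
      -H 'Content-Type: application/json' \
      -H "X-Hub-Signature-256: sha256=$signature" \
      -d "$payload"
  env:
    WEBHOOK_URL: ${{ secrets.WEBHOOK_URL }}        # hypothetical secret names
    WEBHOOK_SECRET: ${{ secrets.WEBHOOK_SECRET }}
```

Everything here is POSIX shell plus `jq`, `openssl`, and `curl`, so it runs identically on the Linux and macOS runners without Docker.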
Jelle Raaijmakers
a0f3099333 CI: Make JS Benchmarks use the JS repl from the job with the same commit
In practice this does not make a big difference, but technically it
could happen that a second JS Repl artifact is built before the first
JS Benchmarks job executes. So make sure to filter on commit ID.
2025-03-26 11:22:54 +00:00
Jelle Raaijmakers
582084e74e CI: Call webhook as soon as new JS benchmark results are in 2025-03-24 12:49:30 +01:00
Jelle Raaijmakers
b11064c0ae CI: Reduce js-benchmarks artifact retention to 90 days
This is GitHub's default maximum. Prevents generating a warning on each
workflow run.
2025-03-24 01:20:00 +01:00
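The change amounts to setting the retention input on the upload step; the artifact name and path are illustrative:

```yaml
- uses: actions/upload-artifact@v4
  with:
    name: js-benchmark-results   # hypothetical artifact name
    path: results/
    retention-days: 90           # GitHub's default maximum; avoids the warning
```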
Jelle Raaijmakers
2752e01fe7 CI: Add js-benchmarks workflow
This workflow starts after a successful js-artifacts workflow, picks up
the JS repl binary and runs our js-benchmarks tool. It does not yet
publish or otherwise store the benchmark results, but it's a start!
2025-03-22 13:05:43 +01:00
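The chaining described above is what a `workflow_run` trigger provides; a minimal sketch, assuming the upstream workflow is named "js-artifacts":

```yaml
# Sketch: start this workflow after the artifacts workflow completes
# successfully on master.
on:
  workflow_run:
    workflows: ["js-artifacts"]   # upstream workflow name is illustrative
    types: [completed]
    branches: [master]

jobs:
  benchmark:
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    runs-on: ubuntu-24.04         # hypothetical runner label
    steps:
      - run: echo "run js-benchmarks against the downloaded repl here"
```

The `if:` guard is needed because `types: [completed]` fires for failed upstream runs as well.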