Compare commits

...

180 Commits

Author SHA1 Message Date
Jonathan Schwender
54d21b4136 CI: Use github environment to protect release workflow
Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
2026-04-10 08:44:11 +02:00
Tim van der Lippe
4b02f87cce script: Implement preserving ranges for moves during commands (#44041)
This ensures that the selection after the move is as expected.
It mostly fixes tests that had a fully passing implementation but
an incorrect resulting fontsize, since the selection wasn't
properly updated.

Some new regressions were false positives.

Part of #25005

Testing: WPT

---------

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
Signed-off-by: Tim van der Lippe <TimvdLippe@users.noreply.github.com>
2026-04-10 04:35:08 +00:00
dependabot[bot]
eb9784c9fb build: bump libredox from 0.1.15 to 0.1.16 (#44077)
Bumps libredox from 0.1.15 to 0.1.16.


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=libredox&package-manager=cargo&previous-version=0.1.15&new-version=0.1.16)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-10 00:40:36 +00:00
Euclid Ye
560498923e script: Skip clearing subtree layout boxes of detached element when attaching shadow to it (#44052)
There are no layout boxes to clear in this case. For the example in
#43998,
this skips the traversal many times.

Testing: Should not change visible behaviour.
[Try](https://github.com/servo/servo/actions/runs/24131782865).

---------

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-10 00:12:07 +00:00
Euclid Ye
8b61ca94fc script: Make Node::clean_up_style_and_layout_data work as named (#44072)
Do not call `cancel_animations_for_node` in this function.
We can then reuse it more often.

Testing: Just a refactor.

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-10 00:11:19 +00:00
elomscansio
65b917bd4f devtools: Replace new with register for NodeActor (#44071)
Added a `register` method to `NodeActor` following the same pattern used
by other actors in the devtools codebase. Updated
`NodeInfoToProtocol::get_or_register_node_actor` to use it instead of
constructing `NodeActor` directly.
Testing: 
Ran `./mach try linux-unit-tests` and `./mach test-devtools`. No new
failures introduced.

Fixes: part of #43800

---------

Signed-off-by: Emmanuel Paul Elom <elomemmanuel007@gmail.com>
2026-04-10 00:06:39 +00:00
rtjkro
f56c108625 font: Refactor font list generation on OpenHarmony platform (#44061)
font: Use `read_fonts` module to generate font list on OpenHarmony
platform. Specifically:

- `Family name` is obtained from the font's `name` table.
- `width` & `weight` are obtained from the font's `os2` table.
- `style` is obtained from the font's `postscript` table.

Additionally, I'd like to mention that I plan to add a caching mechanism
to store the font list in the near future (as mentioned in the related
issue).

Reference: [TrueType reference
manual](https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6.html)
Testing: No behavior changes expected.
Fixes: Part of [#43596](https://github.com/servo/servo/issues/43596)

---------

Signed-off-by: Richard Tjokroutomo <richard.tjokro2@gmail.com>
2026-04-09 16:18:28 +00:00
elomscansio
11cea2f023 devtools: Replace new with register for PageStyleActor (#44068)
Added a `register` method to `PageStyleActor` following the same pattern
used by other actors in the devtools codebase. Updated
`InspectorActor::register` to use it instead of constructing
`PageStyleActor` directly.
Testing:
Ran `./mach try linux-unit-tests` and `./mach test-devtools`. No new
failures introduced.

Fixes: part of #43800

Signed-off-by: Emmanuel Paul Elom <elomemmanuel007@gmail.com>
Signed-off-by: eri <eri@igalia.com>
Co-authored-by: eri <eri@igalia.com>
2026-04-09 16:07:20 +00:00
atbrakhi
9e5e5603df devtools: Use DebuggerValue in console (#44064)
Use same `DebuggerValue` in console actor as well. This helps us have
single source of truth!

Testing: Added a new test
Fixes: part of https://github.com/servo/servo/issues/39858

Signed-off-by: atbrakhi <atbrakhi@igalia.com>
Co-authored-by: eri <eri@igalia.com>
2026-04-09 14:25:02 +00:00
Martin Robinson
f45223568e servoshell: Ignore requests to focus unknown WebViews via keyboard shortcuts (#44070)
Instead of panicking when pressing a keyboard shortcut for an unknown
WebView, just silently ignore the request. This code path is only
followed when using keyboard shortcuts, so this isn't going to hide any
unexpected behavior.

Testing: We do not have testing at this level of servoshell.
Fixes: #44056.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-09 14:09:07 +00:00
Messi II Innocent R.
4029f196e6 media: Clean up shadow root content when removing controls (#43983)
When media controls are removed (either by removing the controls
attribute or by removing the element from the DOM), the shadow root's
children are now cleared. This breaks the reference cycle between the JS
media controls instance and the media element; the event listeners and
the `this.media` reference in the controls script would otherwise keep
the element alive and prevent garbage collection.

I tested the controls: they render correctly, and deleting a video with
controls doesn't crash.

Fixes: #43828

Signed-off-by: Messi002 <rostandmessi2@gmail.com>
2026-04-09 13:21:03 +00:00
CynthiaOketch
c52726eda0 devtools: Replace new with register for HighlighterActor (#44067)
Added a `register` method to `HighlighterActor` following the same
pattern used by other actors in the devtools codebase. Updated
`InspectorActor::register` to use it instead of constructing
`HighlighterActor` directly.
Testing:
Ran `./mach try linux-unit-tests` and `./mach test-devtools`. No new
failures introduced.

Fixes: part of #43800

---------

Signed-off-by: CynthiaOketch <cynthiaoketch6@gmail.com>
2026-04-09 12:40:40 +00:00
atbrakhi
f32cc3dc51 devtools: Make isAsync and isGenerator optional (#44023)
Make `isAsync` and `isGenerator` optional

Testing: Current tests are passing
Fixes: part of #36027

Signed-off-by: atbrakhi <atbrakhi@igalia.com>
Signed-off-by: eri <eri@igalia.com>
Co-authored-by: eri <eri@igalia.com>
2026-04-09 11:52:09 +00:00
Oriol Brufau
e9fbed1d74 fonts: Use ICU's Language instead of Stylo's XLang (#44057)
Testing: Not needed, no behavior change

Signed-off-by: Oriol Brufau <obrufau@igalia.com>
2026-04-09 10:53:44 +00:00
Euclid Ye
4c6d13d11e servoshell (Windows): Add CJK fonts for egui (#44055)
We configure fonts for Windows following the [doc
example](https://docs.rs/egui/latest/egui/struct.FontDefinitions.html).
There is also a
[discussion](https://github.com/emilk/egui/discussions/1344) covering
various approaches.

Testing: Not possible to write automated test.
Before: 
<img width="232" height="109" alt="image"
src="https://github.com/user-attachments/assets/be9f3724-9ee5-4157-bd9d-313b519d1e57"
/>

After: 
<img width="243" height="67" alt="image"
src="https://github.com/user-attachments/assets/e748389f-48e3-48c1-bdaf-23c49837a1c6"
/>

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-09 10:43:41 +00:00
atbrakhi
3c4b8c61ea devtools: Handle different number values and their types (#44022)
Handle different number values and their types

Testing: Current tests are passing
Fixes: part of #36027

Signed-off-by: atbrakhi <atbrakhi@igalia.com>
Co-authored-by: eri <eri@igalia.com>
2026-04-09 10:42:05 +00:00
Martin Robinson
718c8913af script/constellation: Rename and consolidate cross-Document focus messaging (#44020)
There are two times that Servo needs to ask other `Document`s to either
focus or blur.

- During processing of the "focusing steps". When a new element gains
  focus this may cause focus to be lost or gained in parent `<iframe>`s.
- When calling `focus()` on a DOM Window from another origin.

In both of these cases we need to request that a `Document` gain or lose
focus via the Constellation, but in the second case we may have a
`BrowsingContextId` of the `<iframe>` gaining focus and a
`FocusSequence`. This change splits those cases into two kinds of
messages.

Finally, run the entire focusing steps when calling `window.focus()`
instead of going to the constellation immediately. This will be
important in followup changes where messaging order is made more
consistent.

Testing: This should not change behavior so is covered by existing
tests.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-09 08:59:47 +00:00
Taym Haddadi
553f125773 IndexedDB: Align IDBDatabase.transaction validation with the IndexedDB spec (#44059)
Testing: Existing WPT tests pass.
Part of https://github.com/servo/servo/issues/40983

Signed-off-by: Taym Haddadi <haddadi.taym@gmail.com>
2026-04-09 08:59:05 +00:00
elomscansio
ae5abfbdea mach: remove shorthand support for direct test file paths in mach test-unit (#43951)
This PR removes support for passing direct test file paths (e.g.,
`./mach test-unit <test_file.rs>`) to `mach test-unit`.

Previously, this shorthand was supported in `test-unit` but was
subsequently broken in #39897. Supporting this behavior now requires
extracting test names from a generic parameter list, which is
error-prone due to the presence of arbitrary arguments and inconsistent
argument ordering.

With this change, `mach test-unit` aligns more closely with standard
Cargo test patterns. Users are expected to specify test modules or
patterns instead (e.g., `./mach test-unit <module>::` or using
Cargo-compatible arguments).

This simplifies the implementation and avoids maintaining fragile logic
for shorthand support that was not widely used.

---

### Testing

* Ran `./mach test-unit` with various valid patterns and module-based
inputs
* Verified that tests execute correctly across supported use cases
* Confirmed that removal of direct file path support does not affect
standard workflows

---

Fixes: #41065

---------

Signed-off-by: Emmanuel Paul Elom <elomemmanuel007@gmail.com>
2026-04-09 07:41:31 +00:00
Gae24
a0d397bd1a script: Queue a networking task to proceed when a pending module fetch is terminated (#44042)
Our implementation of [fetch a single module
script](https://html.spec.whatwg.org/multipage/webappapis.html#fetch-a-single-module-script)
was missing the task queueing steps:
```
5. If moduleMap[(url, moduleType)] is "fetching", wait in parallel until that entry's value changes,
   then queue a task on the networking task source to proceed with running the following steps.

6. If moduleMap[(url, moduleType)] exists, run onComplete given moduleMap[(url, moduleType)], and return.
```
Instead we appended a `PromiseNativeHandler`, which would run
`on_complete` when resolved, on the promise of the pending fetch.
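The new behaviour, registering a waiter on a "fetching" entry and running its continuation as a queued task once the entry settles, can be sketched with a much-simplified module map. The types and names below are illustrative stand-ins, not Servo's actual implementation:

```rust
use std::collections::HashMap;

// Simplified model of the spec steps quoted above: a module map entry is
// either still fetching or settled, and callers waiting on a "fetching"
// entry are recorded so their continuation can be queued as a networking
// task once the entry's value changes.
#[derive(Clone, PartialEq, Debug)]
enum ModuleEntry {
    Fetching,
    Fetched(String), // stand-in for a real module script object
}

#[derive(Default)]
struct ModuleMap {
    entries: HashMap<String, ModuleEntry>,
    // Continuations to run once the keyed entry settles.
    waiters: HashMap<String, Vec<Box<dyn FnOnce(&ModuleEntry)>>>,
}

impl ModuleMap {
    // Step 5: if the entry is "fetching", register a waiter instead of
    // appending a promise handler to the pending fetch.
    fn fetch_or_wait(&mut self, url: &str, on_complete: Box<dyn FnOnce(&ModuleEntry)>) {
        match self.entries.get(url) {
            Some(ModuleEntry::Fetching) => {
                self.waiters.entry(url.to_string()).or_default().push(on_complete);
            }
            // Step 6: the entry exists, so run onComplete immediately.
            Some(entry) => on_complete(entry),
            None => { /* a new fetch would be started here */ }
        }
    }

    // Called when the pending fetch finishes (or is terminated): settle
    // the entry and run each waiter.
    fn settle(&mut self, url: &str, entry: ModuleEntry) {
        self.entries.insert(url.to_string(), entry.clone());
        for waiter in self.waiters.remove(url).unwrap_or_default() {
            waiter(&entry); // in Servo this would go through the task queue
        }
    }
}
```

In Servo the waiter would be wrapped in a task queued on the networking task source rather than being invoked directly.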

Testing: This change shouldn't be observable, since modules are
evaluated synchronously, but it's required for #39417.

Signed-off-by: Gae24 <96017547+Gae24@users.noreply.github.com>
2026-04-09 07:27:53 +00:00
dependabot[bot]
046ed0f236 build: bump tokio from 1.51.0 to 1.51.1 in the tokio-rs-related group (#44048)
Bumps the tokio-rs-related group with 1 update:
[tokio](https://github.com/tokio-rs/tokio).

Updates `tokio` from 1.51.0 to 1.51.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/tokio-rs/tokio/releases">tokio's
releases</a>.</em></p>
<blockquote>
<h2>Tokio v1.51.1</h2>
<h1>1.51.1 (April 8th, 2026)</h1>
<h3>Fixed</h3>
<ul>
<li>sync: fix semaphore reopens after forget (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8021">#8021</a>)</li>
<li>net: surface errors from <code>SO_ERROR</code> on <code>recv</code>
for UDP sockets on Linux (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8001">#8001</a>)</li>
</ul>
<h3>Fixed (unstable)</h3>
<ul>
<li>metrics: fix <code>worker_local_schedule_count</code> test (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8008">#8008</a>)</li>
<li>rt: do not leak fd when cancelling io_uring open operation (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7983">#7983</a>)</li>
</ul>
<p><a
href="https://redirect.github.com/tokio-rs/tokio/issues/7983">#7983</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7983">tokio-rs/tokio#7983</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8001">#8001</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/8001">tokio-rs/tokio#8001</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8008">#8008</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/8008">tokio-rs/tokio#8008</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8021">#8021</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/8021">tokio-rs/tokio#8021</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="98df02d7a4"><code>98df02d</code></a>
chore: prepare Tokio v1.51.1 (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8023">#8023</a>)</li>
<li><a
href="3ea11e2a5f"><code>3ea11e2</code></a>
sync: fix semaphore reopens after forget (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8021">#8021</a>)</li>
<li><a
href="c79121391d"><code>c791213</code></a>
rt: do not leak fd when cancelling io_uring open operation (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7983">#7983</a>)</li>
<li><a
href="ad8c59add6"><code>ad8c59a</code></a>
net: surface errors from <code>SO_ERROR</code> on <code>recv</code> for
UDP sockets on Linux (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8001">#8001</a>)</li>
<li><a
href="654d38b132"><code>654d38b</code></a>
metrics: fix <code>worker_local_schedule_count</code> test (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8008">#8008</a>)</li>
<li><a
href="857ba80933"><code>857ba80</code></a>
docs: improve contributing docs on how to specify crates dependency
versions ...</li>
<li><a
href="95b9342da7"><code>95b9342</code></a>
chore: remove path deps for tokio-macros 2.7.0 (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8007">#8007</a>)</li>
<li>See full diff in <a
href="https://github.com/tokio-rs/tokio/compare/tokio-1.51.0...tokio-1.51.1">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 06:50:01 +00:00
Tim van der Lippe
1c849a362d script: Fix computation of "fontsize" command value (#44039)
The initial interpretation of "convert the font size to the value in
pixels" was completely off. I thought it meant the existing font
elements in the DOM, but instead it implied that you would have to
convert these into pixels according to the HTML size table.

Therefore, use the implementation in Stylo to convert the HTML size to a
keyword and then compute the value for the keyword.

To make that work, we need to compute the pixel size as a fallback when
resolving the CSS value on the node. We check whether the value is in
pixels and then go through the conversion. If it isn't, we can skip all
that logic and directly convert, saving a few cycles.
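As a rough illustration of the conversion involved, here is a sketch of the legacy size-to-keyword mapping and a pixel table at the default 16px medium size. The pixel values are the table commonly used by browsers and, like the function names, are an assumption for illustration; this is not Servo's or Stylo's code:

```rust
// Legacy <font size="1..7"> values map to CSS absolute-size keywords.
fn legacy_size_to_keyword(size: u8) -> Option<&'static str> {
    match size {
        1 => Some("x-small"),
        2 => Some("small"),
        3 => Some("medium"),
        4 => Some("large"),
        5 => Some("x-large"),
        6 => Some("xx-large"),
        7 => Some("xxx-large"),
        _ => None,
    }
}

// Keywords compute to pixel values at the default medium size of 16px
// (values assumed from the table commonly used by browsers).
fn keyword_to_px(keyword: &str) -> Option<f32> {
    match keyword {
        "x-small" => Some(10.0),
        "small" => Some(13.0),
        "medium" => Some(16.0),
        "large" => Some(18.0),
        "x-large" => Some(24.0),
        "xx-large" => Some(32.0),
        "xxx-large" => Some(48.0),
        _ => None,
    }
}
```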

Part of #25005

Testing: WPT

---------

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
Signed-off-by: Tim van der Lippe <TimvdLippe@users.noreply.github.com>
Co-authored-by: Josh Matthews <josh@joshmatthews.net>
2026-04-09 06:38:42 +00:00
dependabot[bot]
5eb2ffc623 build: bump thin-vec from 0.2.14 to 0.2.15 (#44051)
Bumps [thin-vec](https://github.com/gankra/thin-vec) from 0.2.14 to
0.2.15.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/mozilla/thin-vec/blob/main/RELEASES.md">thin-vec's
changelog</a>.</em></p>
<blockquote>
<h1>Version 0.2.15 (2025-02-19)</h1>
<ul>
<li>Support AutoTArrays created from rust in Gecko FFI mode.</li>
<li>Add extract_if.</li>
<li>Add const new() support behind feature flag.</li>
<li>Fix <code>thin_vec</code> macro not being hygienic when
recursing</li>
<li>Improve extend() performance.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="70bcca0960"><code>70bcca0</code></a>
chore: Bump version to v0.2.15</li>
<li><a
href="322423b7a6"><code>322423b</code></a>
Fix miri error on extract_if().</li>
<li><a
href="eca5334c29"><code>eca5334</code></a>
Don't make push_unchecked public.</li>
<li><a
href="90e23c39cc"><code>90e23c3</code></a>
Minor nits, go back to call push_reserved push_unchecked.</li>
<li><a
href="ee9d6bb28f"><code>ee9d6bb</code></a>
Optimize extend() to avoid unnecessary capacity checks</li>
<li><a
href="7fd47080c0"><code>7fd4708</code></a>
feat: add const_new feature for const ThinVec::new()</li>
<li><a
href="beb652d66b"><code>beb652d</code></a>
Merge pull request <a
href="https://redirect.github.com/gankra/thin-vec/issues/75">#75</a>
from jtracey/patch-1</li>
<li><a
href="6f3da2525d"><code>6f3da25</code></a>
Merge pull request <a
href="https://redirect.github.com/gankra/thin-vec/issues/73">#73</a>
from emilio/auto-array-tweaks</li>
<li><a
href="d3d7475118"><code>d3d7475</code></a>
gecko: Keep the auto-bit across relocations.</li>
<li><a
href="faa01eb790"><code>faa01eb</code></a>
Merge pull request <a
href="https://redirect.github.com/gankra/thin-vec/issues/66">#66</a>
from GnomedDev/extract-if</li>
<li>Additional commits viewable in <a
href="https://github.com/gankra/thin-vec/compare/v0.2.14...v0.2.15">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 00:38:28 +00:00
dependabot[bot]
071c35e5a2 build: bump zerofrom-derive from 0.1.6 to 0.1.7 (#44049)
Bumps [zerofrom-derive](https://github.com/unicode-org/icu4x) from 0.1.6
to 0.1.7.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/unicode-org/icu4x/blob/main/CHANGELOG.md">zerofrom-derive's
changelog</a>.</em></p>
<blockquote>
<h1>Changelog</h1>
<h2>icu 2.2.x</h2>
<p>Several crates have had patch releases in the 2.2 stream:</p>
<ul>
<li>Components
<ul>
<li>(2.2.1) <code>icu_calendar</code>
<ul>
<li>Fix extended year calculations in Gregorian-like and Coptic-like
calendars (unicode-org#7849)</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2>icu4x 2.2</h2>
<ul>
<li>Components
<ul>
<li>General
<ul>
<li>Use HTTPS links in docs (unicode-org#7212)</li>
<li>Update MSRV to 1.86 (unicode-org#7576)</li>
<li>Updated to CLDR 48.2 (unicode-org#7792)</li>
<li>Replace <code>experimental</code> features with
<code>unstable</code> features (unicode-org#7566)</li>
<li>Add categories and keywords to Cargo.toml for all components
(unicode-org#7737)</li>
</ul>
</li>
<li><code>icu_calendar</code>
<ul>
<li>Add <code>Date::try_new</code>, which replaces
<code>Date::try_new_from_codes</code>, and takes typed year/month
values. (unicode-org#7773, unicode-org#7764)</li>
<li>New methods: <code>Date::try_new</code> (and primarily-internal
<code>Calendar::new_date</code>)</li>
<li>New types: <code>InputYear</code>, <code>DateNewError</code></li>
<li>Handle possible <code>Overflow</code> values on individual calendars
(unicode-org#7795)</li>
<li>New <code>Date::try_from_fields</code> API for fully general date
construction from various choices of year and month values
(unicode-org#7798)</li>
<li>New methods: <code>Date::try_from_fields()</code></li>
<li>New types: <code>DateFields</code>,
<code>DateFromFieldsOptions</code>, <code>Overflow</code>,
<code>MissingFieldsStrategy</code>,
<code>DateFromFieldsError</code></li>
<li>New associated method: <code>Calendar::from_fields()</code></li>
<li>New Date arithmetic APIs for adding and subtracting dates
(unicode-org#7798, unicode-org#7355, unicode-org#7257)</li>
<li>New methods: <code>Date::try_add_with_options</code>,
<code>Date::try_added_with_options</code>,
<code>Date::try_until_with_options</code></li>
<li>New types: <code>DateDuration</code>, <code>DateAddOptions</code>,
<code>DateDifferenceOptions</code>, <code>DateDurationUnit</code>,
<code>DateDurationParseError</code>, <code>DateAddError</code>,
<code>MismatchedCalendarError</code></li>
<li>New associated items: <code>Calendar::add</code>,
<code>Calendar::until</code>,
<code>Calendar::DateCompatibilityError</code></li>
<li>Introduce a new <code>Month</code> type, preferred over using month
codes (unicode-org#7147, unicode-org#7756)
<ul>
<li>New type: <code>Month</code></li>
<li>New method: <code>MonthInfo::to_input()</code></li>
</ul>
</li>
<li>Introduce year/date ranges to all APIs, documented on the APIs
themselves. <code>Date</code> now has a fundamental range (ISO years
between ±999,999), and most constructors enforce a stricter range of
±9999 years for input years. (unicode-org#7676, unicode-org#7062,
unicode-org#7629, unicode-org#7753, unicode-org#7219,
unicode-org#7227)</li>
<li>Add constructors with <code>Month</code> for lunisolar calendars
(unicode-org#7485)</li>
<li>New methods: <code>Date::try_new_korean_traditional()</code>,
<code>Date::try_new_chinese_traditional()</code>,
<code>Date::try_new_hebrew_v2()</code></li>
<li>Expose <code>LeapStatus</code> on <code>MonthInfo</code>
(unicode-org#7667)</li>
<li>New method: <code>MonthInfo::leap_status()</code></li>
<li>New enum: <code>LeapStatus</code></li>
<li>(Unstable) Integrate with <code>chrono</code>, <code>jiff</code>,
and <code>time</code> (unicode-org#7617, unicode-org#7711)</li>
<li>New impls: <code>From&lt;chrono::NaiveDate&gt;</code>,
<code>From&lt;jiff::civil::Date&gt;</code>,
<code>From&lt;time::Date&gt;</code> for
<code>Date&lt;Gregorian&gt;</code></li>
<li>Replace <code>Date::day_of_week</code> by <code>Date::weekday</code>
(unicode-org#7288)
<ul>
<li>New method: <code>Date::weekday()</code></li>
</ul>
</li>
<li>Deprecate <code>Date::new_from_iso</code>/<code>Date::to_iso</code>
(unicode-org#7287)</li>
<li>Deprecate <code>Date::extended_year()</code> (use
<code>Date::year().extended_year()</code>) (unicode-org#7289)</li>
<li>Remove <code>YearInfo: PartialEq</code> bound
(unicode-org#7743)</li>
<li>Start producing Meiji era only after Meiji 6 (unicode-org#7503)</li>
</ul>
</li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/unicode-org/icu4x/commits">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-09 00:37:01 +00:00
dependabot[bot]
ceb8cf07c1 build: bump cryptography from 46.0.6 to 46.0.7 (#44045)
Bumps [cryptography](https://github.com/pyca/cryptography) from 46.0.6
to 46.0.7.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst">cryptography's
changelog</a>.</em></p>
<blockquote>
<p>46.0.7 - 2026-04-07</p>
<pre><code>
* **SECURITY ISSUE**: Fixed an issue where non-contiguous buffers could be
  passed to APIs that accept Python buffers, which could lead to buffer
  overflow. **CVE-2026-39892**
* Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL
3.5.6.
<p>.. _v46-0-6:<br />
</code></pre></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="622d672e42"><code>622d672</code></a>
46.0.7 release (<a
href="https://redirect.github.com/pyca/cryptography/issues/14602">#14602</a>)</li>
<li>See full diff in <a
href="https://github.com/pyca/cryptography/compare/46.0.6...46.0.7">compare
view</a></li>
</ul>
</details>
<br />



Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-08 22:40:19 +00:00
Martin Robinson
b8ef264268 layout: Integrate more details into FontAndScriptInfo (#43974)
This will be needed for ensuring that shaped text is valid during
relayout. It duplicates some information, but the hope is that when all
of these changes are done, there will be many fewer individual shaped
segments of text.

Testing: This should not really change observable behavior so is covered
by existing tests.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-08 21:37:44 +00:00
Tim van der Lippe
eca6eb2b4e script: Further implement fontsize command (#44030)
While debugging test failures, I discovered that I had reversed the
`is_allowed_child` arguments. Fixing that made a bunch more tests pass,
but it also caused new failures, since we weren't clearing the previous
value or computing loose equivalence.

Therefore, this PR fixes the reversal and implements the relevant parts
of some algorithms so as not to regress too much. There are two new
failures related to how integers should be parsed, but those will be
tackled separately.

Part of #25005

Testing: WPT

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-08 18:51:48 +00:00
Jonathan Schwender
516dba791f allocator: Add libc Heap information on macos (#44037)
Despite using `jemalloc` by default on macOS as the Rust global
allocator, the default system allocator is still used by some C/C++
libraries.
Using `malloc_zone_statistics` on macOS allows us to get information
about the system allocator's heap usage.
Since the macOS statistics also provide information about reserved but
currently unused memory, we also expose that and attempt to calculate
the same metric on Linux, which should be arena (non-mmapped space
allocated, in bytes) + hblkhd (space allocated in mmapped regions, in
bytes). See https://man7.org/linux/man-pages/man3/mallinfo.3.html

Loading `servo.org` in a debug build on macOS and then navigating to
about:memory, I see 31 MB system-heap-allocated and 92 MB
system-heap-reserved.

Testing: Manually tested on macOS. Not tested on Linux.

---------

Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
2026-04-08 18:23:49 +00:00
CynthiaOketch
1174dfe3d2 script: Remove pointless import renames in components/script/dom/request.rs (#44025)
CacheMode, CredentialsMode, Destination, RedirectMode, and Referrer were
imported with NetTraitsRequest* aliases to avoid naming conflicts. These
conflicts no longer exist; their original names do not clash with any
other imports or types in this file, so the aliases have been removed.

Testing: Pure refactor with no behavior change. Ran `./mach try
linux-unit-tests` to verify existing tests continue to pass.
Fixes: #42981

Signed-off-by: CynthiaOketch <cynthiaoketch6@gmail.com>
2026-04-08 17:26:51 +00:00
Simon Wülker
695b8ee913 script: Claim blob before loading <video> poster frame (#44035)
Refer to https://github.com/servo/servo/pull/43746 for a description of
the problem. This change ensures that video posters can be loaded from
blob URLs, even if the URL is revoked right after.

Testing: This change adds a test

---------

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-08 17:03:56 +00:00
atbrakhi
e50cfa7af6 devtools: Fix recursion in debugger (#44024)
Fix recursion causing panic in debugger

Testing: Current tests are passing; also tested manually
Fixes: part of #36027

Signed-off-by: atbrakhi <atbrakhi@igalia.com>
Co-authored-by: eri <eri@igalia.com>
2026-04-08 15:29:20 +00:00
Taym Haddadi
78c9fe2a4c IndexedDB: Align IndexedDB binary key conversion with the spec (#44009)
IndexedDB: Align IndexedDB binary key conversion with the spec

Testing: covered by WPT test.

part of https://github.com/servo/servo/issues/40983

---------

Signed-off-by: Taym Haddadi <haddadi.taym@gmail.com>
2026-04-08 10:26:52 +00:00
Freya Arbjerg
adde320fb0 paint: Add minimum size checks for RenderingContext (#44011)
Returns an error on `RenderingContext` constructors when size is 0 in
either dimension. Also adds panics to resize functions in the same case.

Testing: Added a unit test for the new error on
`SoftwareRenderingContext::new()`.
Fixes: https://github.com/servo/servo/issues/36061

Signed-off-by: Freya Arbjerg <git@arbjerg.dev>
Co-authored-by: Martin Robinson <martin@abandonedwig.info>
2026-04-08 10:18:21 +00:00
TIN TUN AUNG
280d984d3b media: Implement Player for ohos backend (#43208)
Implement the Player trait in the ohos backend using HarmonyOS MediaKit's
[AVPlayer](https://developer.huawei.com/consumer/en/doc/harmonyos-references/capi-avplayer-h).

The modular design of `VideoSink`, `InnerPlayer`, and `MediaSource` is
taken from the GStreamer backend.
Only HarmonyOS SDK API 21 is supported, because
`OH_AVPlayer_SetDataSource` is only exposed starting with API 21.

Testing: N/A, as there are no platform-specific tasks.
Fixes: N/A, but we can now play video on HarmonyOS phones using `<video>`

---------

Signed-off-by: rayguo17 <rayguo17@gmail.com>
2026-04-08 08:48:10 +00:00
Simon Wülker
1b336760ae mozjs: Rebuild from source if jitspew feature is enabled (#44010)
The `IONFLAGS` environment variable configures logging for the JIT, but
the feature must be enabled in SpiderMonkey at compile time. The prebuilt mozjs
binaries don't enable it, so we must build from source.

Companion PR for https://github.com/servo/mozjs/pull/728

Testing: We don't have tests for the build process

---------

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-08 08:40:07 +00:00
Euclid Ye
712b4f9bc2 script: Reduce ShadowRoot::bind_to_tree complexity from O(2^N) to O(N) (#44016)
During traversal, exclude shadow roots.

Analysis:
Each shadow root is processed twice:
- via its host: `Element::bind_to_tree()`
- via the iterator

In total, this would be
```math
\sum_{i=0}^{N-1} 2^i = 2^N - 1
```
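The doubling can be sketched as a recurrence. The following is illustrative Python (not Servo code, and `bind_work` is a hypothetical name) modeling the pre-fix traversal cost for `N` nested shadow roots, matching the closed form above:

```python
def bind_work(depth: int) -> int:
    # Pre-fix behavior: binding a tree with `depth` nested shadow roots
    # visits each shadow root twice (once via its host's bind_to_tree,
    # once via the iterator), so the work doubles at every level:
    #   f(N) = 1 + 2 * f(N - 1), f(0) = 0  =>  f(N) = 2^N - 1
    if depth == 0:
        return 0
    return 1 + 2 * bind_work(depth - 1)

def bind_work_fixed(depth: int) -> int:
    # Post-fix behavior: shadow roots are excluded from the traversal,
    # so each one is bound exactly once.
    return depth
```

For `depth = 20` the old traversal performs 2^20 - 1 = 1,048,575 bind operations versus 20 after the fix.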

Testing: Added a test.
Fixes: https://github.com/servo/servo/issues/43998

---------

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
Co-authored-by: webbeef <me@webbeef.org>
2026-04-08 08:20:20 +00:00
Jonathan Schwender
0ea14d1b60 release: Fix result check for cancelled workflows (#44017)
If the entire workflow was cancelled, we also need to check for
`cancelled()`. Simply checking `needs.*.result` is not sufficient: it
was observed that the success branch was still entered when only
checking `needs`.

Testing: Tested manually, by cancelling [this
workflow](https://github.com/servo/servo/actions/runs/24119740924/job/70371050119)
which resulted in a draft release publish (failure branch)

---------

Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
Signed-off-by: Jonathan Schwender <55576758+jschwe@users.noreply.github.com>
Co-authored-by: Mukilan Thiyagarajan <mukilanthiagarajan@gmail.com>
2026-04-08 05:58:58 +00:00
Bennet Bleßmann
cff186777c devtools: remove impl JsonPacketStream for TcpStream (#44006)
Removes the `impl JsonPacketStream for TcpStream` that was supposed to
be removed as part of servo/servo#43472

> Testing: [..] Removing the JsonPacketStream implementation for
TcpStream should discourage regressions due to improper use of raw
streams.

Confirmed in
https://github.com/servo/servo/pull/43472#issuecomment-4201684071

Testing: no new test needed as this only removes code

Signed-off-by: Bennet Bleßmann <bennet.blessmann+github@googlemail.com>
2026-04-08 01:09:46 +00:00
rovertrack
ba4f031f86 script: Pass &CStr to get_callable_property (#44008)
Fixes: #43968
Made `get_callable_property` accept `&CStr` instead of `&str`.

Testing: no behaviour change expected, so existing WPT tests are
sufficient.

Signed-off-by: Rover track <rishan.pgowda@gmail.com>
2026-04-08 00:56:47 +00:00
dependabot[bot]
b561212988 build: bump async-signal from 0.2.13 to 0.2.14 (#44015)
Bumps [async-signal](https://github.com/smol-rs/async-signal) from
0.2.13 to 0.2.14.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/smol-rs/async-signal/releases">async-signal's
releases</a>.</em></p>
<blockquote>
<h2>v0.2.14</h2>
<ul>
<li>Fix build error on haiku. (<a
href="https://redirect.github.com/smol-rs/async-signal/issues/59">#59</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/smol-rs/async-signal/blob/master/CHANGELOG.md">async-signal's
changelog</a>.</em></p>
<blockquote>
<h1>Version 0.2.14</h1>
<ul>
<li>Fix build error on haiku. (<a
href="https://redirect.github.com/smol-rs/async-signal/issues/59">#59</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="d750e57adc"><code>d750e57</code></a>
Release 0.2.14</li>
<li><a
href="eb72cfd64a"><code>eb72cfd</code></a>
Fix build error on haiku</li>
<li><a
href="1ffadd3db8"><code>1ffadd3</code></a>
Update signal-hook requirement from 0.3.14 to 0.4.1 (<a
href="https://redirect.github.com/smol-rs/async-signal/issues/57">#57</a>)</li>
<li><a
href="a43dca7fd1"><code>a43dca7</code></a>
Fix clippy::io_other_error warning</li>
<li><a
href="c00258a4a6"><code>c00258a</code></a>
Bump MSRV to 1.85</li>
<li><a
href="3fdfdee177"><code>3fdfdee</code></a>
ci: Use taiki-e/checkout-action action</li>
<li><a
href="0f8053d9af"><code>0f8053d</code></a>
ci: Use cargo-hack's --rust-version flag for msrv check</li>
<li>See full diff in <a
href="https://github.com/smol-rs/async-signal/compare/v0.2.13...v0.2.14">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=async-signal&package-manager=cargo&previous-version=0.2.13&new-version=0.2.14)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-08 00:52:05 +00:00
dependabot[bot]
0941407c8f build: bump fastrand from 2.3.0 to 2.4.1 (#44013)
Bumps [fastrand](https://github.com/smol-rs/fastrand) from 2.3.0 to
2.4.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/smol-rs/fastrand/releases">fastrand's
releases</a>.</em></p>
<blockquote>
<h2>v2.4.1</h2>
<ul>
<li>Fix build failure with <code>js</code> feature. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/125">#125</a>)</li>
</ul>
<h2>v2.4.0</h2>
<ul>
<li>Bump MSRV to 1.63. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/104">#104</a>)</li>
<li>Improve quality of f32/f64 generation. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/103">#103</a>)</li>
<li>Add <code>f{32,64}_inclusive</code> and
<code>Rng::f{32,64}_inclusive</code>. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/103">#103</a>)</li>
<li>Make <code>Rng::with_seed</code> const. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/107">#107</a>)</li>
<li>Update <code>getrandom</code> to 0.3. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/104">#104</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/smol-rs/fastrand/blob/master/CHANGELOG.md">fastrand's
changelog</a>.</em></p>
<blockquote>
<h1>Version 2.4.1</h1>
<ul>
<li>Fix build failure with <code>js</code> feature. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/125">#125</a>)</li>
</ul>
<h1>Version 2.4.0</h1>
<ul>
<li>Bump MSRV to 1.63. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/104">#104</a>)</li>
<li>Improve quality of f32/f64 generation. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/103">#103</a>)</li>
<li>Add <code>f{32,64}_inclusive</code> and
<code>Rng::f{32,64}_inclusive</code>. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/103">#103</a>)</li>
<li>Make <code>Rng::with_seed</code> const. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/107">#107</a>)</li>
<li>Update <code>getrandom</code> to 0.3. (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/104">#104</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="728a5b503f"><code>728a5b5</code></a>
Release 2.4.1</li>
<li><a
href="0c619f6a39"><code>0c619f6</code></a>
Fix build failure with js feature</li>
<li><a
href="a4077e2373"><code>a4077e2</code></a>
ci: Add missing js feature test</li>
<li><a
href="1fd5bbb300"><code>1fd5bbb</code></a>
Release 2.4.0 (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/116">#116</a>)</li>
<li><a
href="074345b7e7"><code>074345b</code></a>
chore: make some documents clearer (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/115">#115</a>)</li>
<li><a
href="ce9a48c2ee"><code>ce9a48c</code></a>
chore: update dependencies to latest versions and bump MSRV to 1.63 (<a
href="https://redirect.github.com/smol-rs/fastrand/issues/104">#104</a>)</li>
<li><a
href="978dde1cad"><code>978dde1</code></a>
ci: Use reusable workflows for clippy</li>
<li><a
href="8561f13c21"><code>8561f13</code></a>
bench: Add benchmark of f32()</li>
<li><a
href="1def02cb23"><code>1def02c</code></a>
Fix rustdoc::broken_intra_doc_links warning</li>
<li><a
href="c2cbdd4965"><code>c2cbdd4</code></a>
Remove manual doc(cfg(..))</li>
<li>Additional commits viewable in <a
href="https://github.com/smol-rs/fastrand/compare/v2.3.0...v2.4.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=fastrand&package-manager=cargo&previous-version=2.3.0&new-version=2.4.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-08 00:41:56 +00:00
Taym Haddadi
ac4df79bd6 Make localStorage and sessionStorage throw on opaque origins (#44002)
Testing: covered by WPT test.
Fixes #43999
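Per the Web Storage section of the HTML spec, accessing `localStorage` or `sessionStorage` from an environment with an opaque origin (e.g. a sandboxed iframe) must throw a "SecurityError" DOMException. A minimal sketch of that check (hypothetical names, not Servo code):

```python
class SecurityError(Exception):
    """Stand-in for a "SecurityError" DOMException."""

def get_local_storage(origin_is_opaque: bool) -> dict:
    # Web Storage: the localStorage getter throws when the document's
    # origin is opaque, instead of returning a Storage object.
    if origin_is_opaque:
        raise SecurityError("localStorage is not available for opaque origins")
    return {}  # stand-in for the actual Storage object
```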

---------

Signed-off-by: Taym Haddadi <haddadi.taym@gmail.com>
2026-04-07 20:41:10 +00:00
Josh Matthews
9334d3094b script: Use the global's origin when claiming blob tokens. (#44004)
`global.api_base_url().origin()` returns a unique opaque origin when the
base URL is an opaque origin. When we use the global's origin instead,
requests for claimed blobs can now pass the same-origin check.

Testing: Newly passing tests.
Fixes: #43326
Fixes: #43973

---------

Signed-off-by: Josh Matthews <josh@joshmatthews.net>
2026-04-07 19:40:15 +00:00
Kelechi Ebiri
a7e4b80b31 script: Support sprintf-style substitutions in console methods (#43897)
Implement the Console Formatter operation for all console methods.
Fixes: #43827
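The console "Formatter" operation replaces specifiers such as `%s`/`%d`/`%i`/`%f` with successive arguments and passes leftovers through. A minimal sketch of the idea (illustrative Python, not Servo's Rust implementation; the full spec also covers `%o`, `%O`, and `%c`):

```python
def console_format(fmt: str, args: list) -> str:
    # Walk the format string, consuming one argument per specifier;
    # unmatched specifiers and leftover arguments are passed through.
    args = list(args)
    out = []
    i = 0
    while i < len(fmt):
        if fmt[i] == "%" and i + 1 < len(fmt):
            spec = fmt[i + 1]
            if spec == "%":          # literal percent sign
                out.append("%")
                i += 2
                continue
            if spec in "sdif" and args:
                a = args.pop(0)
                if spec == "s":
                    out.append(str(a))
                elif spec in "di":   # integer conversion
                    out.append(str(int(a)))
                else:                # %f: float conversion
                    out.append(str(float(a)))
                i += 2
                continue
        out.append(fmt[i])
        i += 1
    if args:                         # leftover arguments are appended
        out.append(" " + " ".join(str(a) for a in args))
    return "".join(out)
```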

---------

Signed-off-by: Kelechi Ebiri <ebiritg@gmail.com>
2026-04-07 18:55:36 +00:00
Abbas Olanrewaju Sarafa
66d232e1e7 Remove introduction_type_override field from HTMLScriptElement (#44003)
Removed the `introduction_type_override` field from
`HTMLScriptElement` and passed the new variable to
`fetch_inline_module_script`.

Testing: No testing required, compiles successfully.
Fixes: #43980

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-07 18:36:21 +00:00
Simon Wülker
37a1f93b91 url: Let origins of file:// URLs be potentially trustworthy (#43989)
The origin of a `file` URL is unspecified. Engines act like they're
opaque except in a few special cases - one of which is the "is
potentially trustworthy" algorithm. This change allows consumers of
`servo-url` to distinguish between regular opaque origins and file
origins. Then we use that to mark file origins as "potentially
trustworthy" which is what the spec wants.

For now we can get away without changes to the `url` crate (the one used
in the wider ecosystem, not just servo), but I'm unsure if that will be
the case in the future.

Testing: This change adds a test
Fixes: https://github.com/servo/servo/issues/42540

---------

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-07 18:29:30 +00:00
Alex Feyerke
f977c06f9d Modernize and improve the Android UI (#43795)
Modernize and improve the Android UI, and add browsing history panel.

---------

Signed-off-by: Alex Feyerke <alex@neighbourhood.ie>
2026-04-07 16:39:19 +00:00
Euclid Ye
8b1619ba1d script/net: Make URL List closer to spec (#43987)
[Redirected](https://fetch.spec.whatwg.org/#dom-response-redirected)
should be decided by the URL List.
Currently it is not. This adds some TODOs and fills in the URL in more
places according to the spec.

Testing: This should not change behaviour, as the URL list is mostly
used by
[Redirected](https://fetch.spec.whatwg.org/#dom-response-redirected).
Enforcing it now would cause test failures, as the URL list is not yet
fully set in all spec steps.

---------

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-07 14:55:40 +00:00
Oriol Brufau
0615c394b9 stylo: Enable multiple color arguments in color-mix() (#43890)
Bumps Stylo to https://github.com/servo/stylo/pull/348

Testing: 2 WPT tests improve

Signed-off-by: Oriol Brufau <obrufau@igalia.com>
2026-04-07 14:08:50 +00:00
Narfinger
3612ba9e5b OHOS CI: Fix parsefromstring (#42995)
ParseFromString currently complains that it matches multiple trace
lines. The reason for that is a bit unclear as it should only produce
one alert. Local testing shows that it produces multiple (even for one
run). This should at least give us the metric back.

Testing: This is currently untested. As it is a small CI change that
can't break an already-broken test, it should be fine.
Fixes: https://github.com/servo/servo/issues/42992

Signed-off-by: Narfinger <Narfinger@users.noreply.github.com>
2026-04-07 13:48:17 +00:00
Abubakar Abdulazeez Usman
750fb41bdb devtools: Include layer rules in CSS panel using rule tree (#43912)
DevTools was collecting CSS rules by walking stylesheets and matching
selector text. This ignored cascade order and did not correctly handle
rules inside layer blocks.

This change uses computed values (rule tree) to get the actual applied
rules in cascade order. It then maps those rules back to CSSStyleRule
using the declaration block identity, and walks the CSSOM to get
selector text and layer ancestry.

This fills `ancestor_data` with layer names and lets the inspector show
layered rules correctly.


Testing:
- Verified using the minimized testcase from the issue
- Verified on https://www.sharyap.com/
- Confirmed that rules inside layer blocks are now shown with correct
order and hierarchy.

Fixes: #43541

Signed-off-by: arabson99 <arabiusman99@gmail.com>
2026-04-07 11:26:12 +00:00
Simon Wülker
57adfc136f script: Remove FIXME about deprecated performance.timing (#43996)
`performance.timing` is not going anywhere anytime soon. Deprecating it
is none of servo's concern. The spec links are also outdated.

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-07 11:16:35 +00:00
eri
4f13fcc38d devtools: Pass steppingType to onPop hook (#43995)
Fixes crashes when stepping in certain situations.

Testing: Ran `mach test-devtools` and manual testing
Part of: #36027

Signed-off-by: eri <eri@igalia.com>
Co-authored-by: atbrakhi <atbrakhi@igalia.com>
2026-04-07 09:57:20 +00:00
Messi II Innocent R.
88a08d775c Don't crash if rustup is not installed (#43982)
Check if rustup is available before calling it. 

If it is not found, skip the step with a warning instead of crashing.
This allows users who installed Rust through their distribution's
package manager (without rustup) to build Servo without errors.

Three places were updated:

- In `command_base.py`, automatic target installation for
cross-compilation is skipped
- In `base.py`, toolchain installation is skipped
- In `bootstrap_commands.py`, clean-nightlies is skipped if rustup is
not found

Fixes: #43871

Signed-off-by: Messi002 <rostandmessi2@gmail.com>
2026-04-07 09:42:55 +00:00
Narfinger
8a38c5e217 servoshell: Port from sig to signal_hook_registry (#43891)
Testing: We do not currently have a way to test signal handling in the
servoshell binary, so this change does not include tests.
Fixes: #43836

---------

Signed-off-by: Narfinger <Narfinger@users.noreply.github.com>
2026-04-07 08:28:36 +00:00
atbrakhi
7a559ba459 devtools: Fix worker targets in debugger tab (#43981)
Firefox DevTools client determines the target type by checking whether
the actor name contains a specific substring. For workers it
[requires](https://searchfox.org/firefox-main/source/devtools/client/fronts/watcher.js#65)
`/workerTarget` in the actor name to create a `WorkerTargetFront`.

Testing: Manual testing
Fixes: #36727


<img width="1084" height="558" alt="Screenshot 2026-04-06 at 21 47 59"
src="https://github.com/user-attachments/assets/207a8368-0f8a-48a6-ab7e-a5ee3750381f"
/>

Signed-off-by: atbrakhi <atbrakhi@igalia.com>
2026-04-07 07:49:43 +00:00
dependabot[bot]
793f0c8ec8 build: bump zerofrom from 0.1.6 to 0.1.7 (#43986)
Bumps [zerofrom](https://github.com/unicode-org/icu4x) from 0.1.6 to
0.1.7.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/unicode-org/icu4x/blob/main/CHANGELOG.md">zerofrom's
changelog</a>.</em></p>
<blockquote>
<h1>Changelog</h1>
<h2>icu 2.2.x</h2>
<p>Several crates have had patch releases in the 2.2 stream:</p>
<ul>
<li>Components
<ul>
<li>(2.2.1) <code>icu_calendar</code>
<ul>
<li>Fix extended year calculations in Gregorian-like and Coptic-like
calendars (unicode-org#7849)</li>
</ul>
</li>
</ul>
</li>
</ul>
<h2>icu4x 2.2</h2>
<ul>
<li>Components
<ul>
<li>General
<ul>
<li>Use HTTPS links in docs (unicode-org#7212)</li>
<li>Update MSRV to 1.86 (unicode-org#7576)</li>
<li>Updated to CLDR 48.2 (unicode-org#7792)</li>
<li>Replace <code>experimental</code> features with
<code>unstable</code> features (unicode-org#7566)</li>
<li>Add categories and keywords to Cargo.toml for all components
(unicode-org#7737)</li>
</ul>
</li>
<li><code>icu_calendar</code>
<ul>
<li>Add <code>Date::try_new</code>, which replaces
<code>Date::try_new_from_codes</code>, and takes typed year/month
values. (unicode-org#7773, unicode-org#7764)</li>
<li>New methods: <code>Date::try_new</code> (and primarily-internal
<code>Calendar::new_date</code>)</li>
<li>New types: <code>InputYear</code>, <code>DateNewError</code></li>
<li>Handle possible <code>Overflow</code> values on individual calendars
(unicode-org#7795)</li>
<li>New <code>Date::try_from_fields</code> API for fully general date
construction from various choices of year and month values
(unicode-org#7798)</li>
<li>New methods: <code>Date::try_from_fields()</code></li>
<li>New types: <code>DateFields</code>,
<code>DateFromFieldsOptions</code>, <code>Overflow</code>,
<code>MissingFieldsStrategy</code>,
<code>DateFromFieldsError</code></li>
<li>New associated method: <code>Calendar::from_fields()</code></li>
<li>New Date arithmetic APIs for adding and subtracting dates
(unicode-org#7798, unicode-org#7355, unicode-org#7257)</li>
<li>New methods: <code>Date::try_add_with_options</code>,
<code>Date::try_added_with_options</code>,
<code>Date::try_until_with_options</code></li>
<li>New types: <code>DateDuration</code>, <code>DateAddOptions</code>,
<code>DateDifferenceOptions</code>, <code>DateDurationUnit</code>,
<code>DateDurationParseError</code>, <code>DateAddError</code>,
<code>MismatchedCalendarError</code></li>
<li>New associated items: <code>Calendar::add</code>,
<code>Calendar::until</code>,
<code>Calendar::DateCompatibilityError</code></li>
<li>Introduce a new <code>Month</code> type, preferred over using month
codes (unicode-org#7147, unicode-org#7756)
<ul>
<li>New type: <code>Month</code></li>
<li>New method: <code>MonthInfo::to_input()</code></li>
</ul>
</li>
<li>Introduce year/date ranges to all APIs, documented on the APIs
themselves. <code>Date</code> now has a fundamental range (ISO years
between ±999,999), and most constructors enforce a stricter range of
±9999 years for input years. (unicode-org#7676, unicode-org#7062,
unicode-org#7629, unicode-org#7753, unicode-org#7219,
unicode-org#7227)</li>
<li>Add constructors with <code>Month</code> for lunisolar calendars
(unicode-org#7485)</li>
<li>New methods: <code>Date::try_new_korean_traditional()</code>,
<code>Date::try_new_chinese_traditional()</code>,
<code>Date::try_new_hebrew_v2()</code></li>
<li>Expose <code>LeapStatus</code> on <code>MonthInfo</code>
(unicode-org#7667)</li>
<li>New method: <code>MonthInfo::leap_status()</code></li>
<li>New enum: <code>LeapStatus</code></li>
<li>(Unstable) Integrate with <code>chrono</code>, <code>jiff</code>,
and <code>time</code> (unicode-org#7617, unicode-org#7711)</li>
<li>New impls: <code>From&lt;chrono::NaiveDate&gt;</code>,
<code>From&lt;jiff::civil::Date&gt;</code>,
<code>From&lt;time::Date&gt;</code> for
<code>Date&lt;Gregorian&gt;</code></li>
<li>Replace <code>Date::day_of_week</code> by <code>Date::weekday</code>
(unicode-org#7288)
<ul>
<li>New method: <code>Date::weekday()</code></li>
</ul>
</li>
<li>Deprecate <code>Date::new_from_iso</code>/<code>Date::to_iso</code>
(unicode-org#7287)</li>
<li>Deprecate <code>Date::extended_year()</code> (use
<code>Date::year().extended_year()</code>) (unicode-org#7289)</li>
<li>Remove <code>YearInfo: PartialEq</code> bound
(unicode-org#7743)</li>
<li>Start producing Meiji era only after Meiji 6 (unicode-org#7503)</li>
</ul>
</li>
</ul>
</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/unicode-org/icu4x/commits">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=zerofrom&package-manager=cargo&previous-version=0.1.6&new-version=0.1.7)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-07 01:00:46 +00:00
dependabot[bot]
d596c5dc9e build: bump cc from 1.2.58 to 1.2.59 (#43985)
Bumps [cc](https://github.com/rust-lang/cc-rs) from 1.2.58 to 1.2.59.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/rust-lang/cc-rs/releases">cc's
releases</a>.</em></p>
<blockquote>
<h2>cc-v1.2.59</h2>
<h3>Fixed</h3>
<ul>
<li><em>(ar)</em> deterministic archives with <code>D</code> modifier
(<a
href="https://redirect.github.com/rust-lang/cc-rs/pull/1697">#1697</a>)</li>
</ul>
<h3>Other</h3>
<ul>
<li>Regenerate target info (<a
href="https://redirect.github.com/rust-lang/cc-rs/pull/1698">#1698</a>)</li>
<li>Fix target abi parsing for sanitiser targets (<a
href="https://redirect.github.com/rust-lang/cc-rs/pull/1695">#1695</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/rust-lang/cc-rs/blob/main/CHANGELOG.md">cc's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/rust-lang/cc-rs/compare/cc-v1.2.58...cc-v1.2.59">1.2.59</a>
- 2026-04-03</h2>
<h3>Fixed</h3>
<ul>
<li><em>(ar)</em> deterministic archives with <code>D</code> modifier
(<a
href="https://redirect.github.com/rust-lang/cc-rs/pull/1697">#1697</a>)</li>
</ul>
<h3>Other</h3>
<ul>
<li>Regenerate target info (<a
href="https://redirect.github.com/rust-lang/cc-rs/pull/1698">#1698</a>)</li>
<li>Fix target abi parsing for sanitiser targets (<a
href="https://redirect.github.com/rust-lang/cc-rs/pull/1695">#1695</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f4c5ac7a7e"><code>f4c5ac7</code></a>
chore(cc): release v1.2.59 (<a
href="https://redirect.github.com/rust-lang/cc-rs/issues/1699">#1699</a>)</li>
<li><a
href="9cfcecbb9d"><code>9cfcecb</code></a>
Regenerate target info (<a
href="https://redirect.github.com/rust-lang/cc-rs/issues/1698">#1698</a>)</li>
<li><a
href="025d046f99"><code>025d046</code></a>
fix(ar): deterministic archives with <code>D</code> modifier (<a
href="https://redirect.github.com/rust-lang/cc-rs/issues/1697">#1697</a>)</li>
<li><a
href="fe32d6834a"><code>fe32d68</code></a>
Fix target abi parsing dor sanitiser targets (<a
href="https://redirect.github.com/rust-lang/cc-rs/issues/1695">#1695</a>)</li>
<li>See full diff in <a
href="https://github.com/rust-lang/cc-rs/compare/cc-v1.2.58...cc-v1.2.59">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=cc&package-manager=cargo&previous-version=1.2.58&new-version=1.2.59)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-07 00:54:38 +00:00
Simon Wülker
0ddc7a08d0 script: Lock blob URL entry during XHR open() (#43977)
This is a followup to https://github.com/servo/servo/pull/43746 that
applies the same fix to XHR. Refer to that PR for a description of the
problem.

Testing: New tests start to pass

---------

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-06 19:46:52 +00:00
Babalola Taiwo J
6279f7bdef devtools: Rename ThreadActor variable names (#43955)
Renames local variables holding a `ThreadActor` instance to
`thread_actor`, following the `{}_actor` convention for actor variables
as described in #43606.

Changes:
- `actors/thread.rs`: `actor` → `thread_actor` in
`ThreadActor::register()`
- `lib.rs`: `thread` → `thread_actor` in
`handle_notifyscriptinterrupted` and `handle_create_frame_actor`

Part of #43606.

Signed-off-by: thebabalola <t.babalolajoseph@gmail.com>
Signed-off-by: eri <eri@igalia.com>
Co-authored-by: eri <eri@igalia.com>
2026-04-06 19:42:05 +00:00
Abbas Olanrewaju Sarafa
02a350d864 layout: Rename confusing ``SequentialLayoutState::collapse_margins`` (#43978)
Renamed ```SequentialLayoutState::collapse_margins``` to
```commit_margin``` in ```float.rs```, ```mod.rs``` &
```inline/mod.rs```

Testing: No testing required - just renaming. Compiles successfully.
Fixes: #43941

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-06 19:10:28 +00:00
Babalola Taiwo J
f38de75888 devtools: Rename SourceActor variable names (#43959)
Renames the local variable holding a `SourceActor` instance to
`source_actor`, following the `{}_actor` convention for actor variables
as described in #43606.

Changes:
- `actors/source.rs`: `actor` → `source_actor` in
`SourceActor::register()`

Part of #43606.

Signed-off-by: thebabalola <t.babalolajoseph@gmail.com>
2026-04-06 16:01:26 +00:00
Babalola Taiwo J
a7870df4c7 devtools: Rename WorkerActor variables and add register method (#43963)
Renames the local variable `worker` to `worker_actor` in `lib.rs` and
`root.rs`, following the `{}_actor` convention for actor struct
variables and `{}_name` for actor name string variables established in
#43606.

Also adds a `WorkerActor::register()` method (part of #43800), replacing
the inline struct construction in `lib.rs` with a consistent pattern
matching other actors like `ThreadActor` and `SourceActor`.

**Changes:**
- `actors/worker.rs`: Add `WorkerActor::register()` method
- `actors/root.rs`: Rename `worker` → `worker_actor` in
`listServiceWorkerRegistrations` handler
- `lib.rs`: Replace inline struct construction with
`WorkerActor::register()` call

**Testing:** No testing required, compiles successfully.

Fixes: Part of #43606
Fixes: Part of #43800

Signed-off-by: thebabalola <t.babalolajoseph@gmail.com>
2026-04-06 15:37:58 +00:00
Jonathan Schwender
d21fc8238a profile: Add debug_span and minor refactoring (#43971)
Add an internal macro to avoid duplication, and use that to implement
the existing two `trace` and `info` macros and add `debug_span`. We skip
`warn` and `error` (from the tracing-rs library), since those names
don't fit our profiling usage well, and three different levels should
also be enough. If we need more levels in the future, we could still add
more macros then (after deciding on better names than `warn` and `error`).
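
The shape of that internal-macro deduplication can be sketched as follows; this is a minimal standalone sketch with hypothetical names and a `format!` stand-in for real tracing spans, not Servo's actual macro:

```rust
// One internal macro carries the shared body; each level-specific macro
// just forwards to it with its level baked in.
macro_rules! span_label {
    ($level:expr, $name:expr) => {
        format!("[{}] {}", $level, $name)
    };
}

macro_rules! trace_span { ($n:expr) => { span_label!("trace", $n) }; }
macro_rules! info_span  { ($n:expr) => { span_label!("info", $n) }; }
macro_rules! debug_span { ($n:expr) => { span_label!("debug", $n) }; }

fn main() {
    // Each public macro produces the same shape, differing only in level.
    assert_eq!(trace_span!("layout"), "[trace] layout");
    assert_eq!(debug_span!("style"), "[debug] style");
    println!("{}", info_span!("fonts"));
}
```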

Testing: Servo is built with the tracing feature in CI (for HarmonyOS)

---------

Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
2026-04-06 15:22:49 +00:00
Simon Wülker
e73c010bb1 Force callers to claim blob url before making a fetch request (#43746)
`blob` URLs have an implicit blob URL entry attached, which stores the
data contained in the blob. The specification requires this entry to be
resolved as the URL is parsed. We only resolve it inside `net` when
loading the URL. That causes problems if the blob entry has been revoked
in the meantime - see https://github.com/servo/servo/issues/25226.

Ideally we would want to resolve blobs at parse-time as required. But
because `ServoUrl` is such a fundamental type, I've not managed to do
this change without having to touch hundreds of files at once.

Thus, we now require passing a `UrlWithBlobClaim` instead of a
`ServoUrl` when `fetch`-ing. This type proves that the caller has
acquired the blob beforehand.

As a temporary escape hatch, I've added
`UrlWithBlobClaim::from_url_without_having_claimed_blob`. That method
logs a warning if it's used unsafely. This method is currently used in
most places to keep this change small. Only workers now acquire the blob
beforehand.
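
The overall pattern is a "proof token" newtype: `fetch` only accepts the wrapper, so the type system forces callers through the claim step. A minimal standalone sketch (the struct bodies, warning text, and `fetch` stub are illustrative only, not Servo's real implementation):

```rust
struct ServoUrl(String);

/// Proof that the caller went through the claim step before fetching.
struct UrlWithBlobClaim {
    url: ServoUrl,
}

impl UrlWithBlobClaim {
    /// The intended path: claim the blob URL entry first, then fetch.
    fn claim(url: ServoUrl) -> Self {
        // ... resolving and pinning the blob URL entry would happen here ...
        UrlWithBlobClaim { url }
    }

    /// Temporary escape hatch: still compiles, but warns for blob URLs.
    fn from_url_without_having_claimed_blob(url: ServoUrl) -> Self {
        if url.0.starts_with("blob:") {
            eprintln!("warning: fetching blob URL without claiming its entry");
        }
        UrlWithBlobClaim { url }
    }
}

/// `fetch` demands the claim type, so a bare `ServoUrl` cannot be passed.
fn fetch(target: UrlWithBlobClaim) -> String {
    format!("fetching {}", target.url.0)
}

fn main() {
    let claimed = UrlWithBlobClaim::claim(ServoUrl("blob:abc".into()));
    assert_eq!(fetch(claimed), "fetching blob:abc");

    let url = ServoUrl("https://servo.org".into());
    let unclaimed = UrlWithBlobClaim::from_url_without_having_claimed_blob(url);
    assert_eq!(fetch(unclaimed), "fetching https://servo.org");
}
```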

Testing: A new test starts to pass
Part of https://github.com/servo/servo/issues/43326
Part of https://github.com/servo/servo/issues/25226

---------

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
Co-authored-by: Josh Matthews <josh@joshmatthews.net>
2026-04-06 14:21:55 +00:00
Tim van der Lippe
324fed274a script: Pass &mut JSContext to Clipboard API's (#43975)
Part of #40600
Follow-up to #43943

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-06 13:47:17 +00:00
Tim van der Lippe
cac1a7f0fc script: Add basic implementation of font-size command (#43287)
This makes the most basic tests pass for the font-size command. Future
PRs will continue the work,
but since this is already large enough, this is a good save point.

Part of #25005

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-06 13:23:36 +00:00
Euclid Ye
cb2dd62e62 net: Improve HTTP fetch compliance (#43970)
Most notably, implements step 5 and removes it from the incorrect place.
Part of follow up to #43798

Testing:
[Try](https://github.com/yezhizhen/servo/actions/runs/24021988217/job/70054752138)

---------

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-06 12:28:52 +00:00
Abbas Olanrewaju Sarafa
3d0cfe34bb storage: Make add_new_environment return a Result instead of panicking (#43949)
Removed the ```.unwrap()``` in ```add_new_environment``` that caused
the storage thread to panic on SQLite failures
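
The shape of the change can be sketched as follows (the types and error values are hypothetical, not the real storage-thread code):

```rust
#[derive(Debug)]
struct SqliteError(String);

// Stand-in for a fallible SQLite operation.
fn create_table() -> Result<(), SqliteError> {
    Err(SqliteError("disk I/O error".into()))
}

// Before: `create_table().unwrap()` would abort the storage thread.
// After: the error is propagated with `?` and the caller decides.
fn add_new_environment() -> Result<(), SqliteError> {
    create_table()?;
    Ok(())
}

fn main() {
    // The failure surfaces as an Err value; the thread keeps running.
    assert!(add_new_environment().is_err());
}
```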

Testing: Ran ```./mach test-wpt tests/wpt/tests/webstorage/```
Result:
```
▶ TIMEOUT [expected OK] /webstorage/storage_local_setitem_quotaexceedederr.window.html

Ran 53 tests finished in 63.9 seconds.
  • 52 ran as expected.
  • 1 tests timed out unexpectedly
```
Fixes: #43880

---------

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-06 12:25:52 +00:00
Martin Robinson
82c2f1434e fonts: Clean up application of word-spacing (#43899)
Let `word_spacing` be optional in `ShapingOptions` like
`letter_spacing`. In addition, send `None` for Canvas rendering instead
of the width of the space character. It is supposed to represent *extra*
space added between words. Finally, remove duplicated code that applied
word and letter spacing for the fast shaper. This can just reuse the
code from the Harfbuzz shaper.
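
The intended semantics can be sketched like this (hypothetical, simplified types; real shaping works on glyphs rather than chars):

```rust
// Both spacings are optional; `None` means "add no extra space", not
// "use the width of the space character".
struct ShapingOptions {
    word_spacing: Option<f32>,
    letter_spacing: Option<f32>,
}

fn advance_for(ch: char, base_advance: f32, options: &ShapingOptions) -> f32 {
    let mut advance = base_advance + options.letter_spacing.unwrap_or(0.0);
    if ch == ' ' {
        // word-spacing is *extra* space between words, on top of the
        // space glyph's own advance.
        advance += options.word_spacing.unwrap_or(0.0);
    }
    advance
}

fn main() {
    let opts = ShapingOptions { word_spacing: Some(2.0), letter_spacing: None };
    assert_eq!(advance_for(' ', 4.0, &opts), 6.0); // extra space applied
    assert_eq!(advance_for('a', 4.0, &opts), 4.0); // non-space unaffected
}
```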

Testing: This causes some canvas tests to start passing.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-06 09:20:34 +00:00
Taym Haddadi
12ab179b05 IndexedDB: Fix object-store key range handling (#43901)
Fix object-store key range handling

Testing: key range handling wpt test fixed.
Fixes: part of #40983

---------

Signed-off-by: Taym Haddadi <haddadi.taym@gmail.com>
2026-04-06 09:15:27 +00:00
Abbas Olanrewaju Sarafa
00aa8c0e85 devtools: Replace new with register for LayoutInspectorActor (#43954)
Replaced new with register for LayoutInspectorActor in walker.rs &
layout.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-06 09:13:33 +00:00
Abbas Olanrewaju Sarafa
c938e97b3c devtools: Replace new with register for TargetConfigurationActor (#43915)
Replaced new with register for TargetConfigurationActor in
target_configuration.rs & watcher.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-06 09:12:17 +00:00
elomscansio
a7bc4a6fa9 script: implement render-blocking option for script fetch and load events (#43741)
Implement render-blocking option for script fetch and load events

## Testing: 
### Test summary on main branch
```bash
  ▶ OK [expected CRASH] /html/semantics/scripting-1/the-script-element/moving-between-documents/move-back-createHTMLDocument-success-external-module.html

Ran 470 tests finished in 217.9 seconds.
  • 453 ran as expected.
  • 4 tests crashed unexpectedly
  • 1 tests timed out unexpectedly
  • 11 tests unexpectedly okay
  • 2 tests had unexpected subtest results

/home/elomscansio/.local/share/uv/python/cpython-3.11.15-linux-x86_64-gnu/lib/python3.11/multiprocessing/resource_tracker.py:254: UserWarning: resource_tracker: There appear to be 3 leaked semaphore objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '

```

### Test summary for this commit
```bash

Ran 470 tests finished in 488.0 seconds.
  • 397 ran as expected.
  • 15 tests crashed unexpectedly
  • 27 tests timed out unexpectedly
  • 11 tests unexpectedly okay
  • 28 tests had unexpected subtest results

```
Fixes: #43697
Fixes: #43354

---------

Signed-off-by: Emmanuel Paul Elom <elomemmanuel007@gmail.com>
Signed-off-by: elomscansio <163124154+elomscansio@users.noreply.github.com>
Signed-off-by: Gae24 <96017547+Gae24@users.noreply.github.com>
Co-authored-by: Gae24 <96017547+Gae24@users.noreply.github.com>
2026-04-06 08:08:43 +00:00
Abbas Olanrewaju Sarafa
414a97c2e3 devtools: Replace new with register for AccessibleWalkerActor (#43967)
Replaced new with register for AccessibleWalkerActor in accessibility.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-06 07:26:54 +00:00
dependabot[bot]
5184707255 build: bump tokio from 1.50.0 to 1.51.0 in the tokio-rs-related group across 1 directory (#43960)
Bumps the tokio-rs-related group with 1 update in the / directory:
[tokio](https://github.com/tokio-rs/tokio).

Updates `tokio` from 1.50.0 to 1.51.0
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/tokio-rs/tokio/releases">tokio's
releases</a>.</em></p>
<blockquote>
<h2>Tokio v1.51.0</h2>
<h1>1.51.0 (April 3rd, 2026)</h1>
<h3>Added</h3>
<ul>
<li>net: implement <code>get_peer_cred</code> on Hurd (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7989">#7989</a>)</li>
<li>runtime: add <code>tokio::runtime::worker_index()</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7921">#7921</a>)</li>
<li>runtime: add runtime name (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7924">#7924</a>)</li>
<li>runtime: stabilize <code>LocalRuntime</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7557">#7557</a>)</li>
<li>wasm: add wasm32-wasip2 networking support (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7933">#7933</a>)</li>
</ul>
<h3>Changed</h3>
<ul>
<li>runtime: steal tasks from the LIFO slot (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7431">#7431</a>)</li>
</ul>
<h3>Fixed</h3>
<ul>
<li>docs: do not show &quot;Available on non-loom only.&quot; doc label
(<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7977">#7977</a>)</li>
<li>macros: improve overall macro hygiene (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7997">#7997</a>)</li>
<li>sync: fix <code>notify_waiters</code> priority in
<code>Notify</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7996">#7996</a>)</li>
<li>sync: fix panic in <code>Chan::recv_many</code> when called with
non-empty vector on closed channel (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7991">#7991</a>)</li>
</ul>
<p><a
href="https://redirect.github.com/tokio-rs/tokio/issues/7431">#7431</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7431">tokio-rs/tokio#7431</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7557">#7557</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7557">tokio-rs/tokio#7557</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7921">#7921</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7921">tokio-rs/tokio#7921</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7924">#7924</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7924">tokio-rs/tokio#7924</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7933">#7933</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7933">tokio-rs/tokio#7933</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7977">#7977</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7977">tokio-rs/tokio#7977</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7989">#7989</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7989">tokio-rs/tokio#7989</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7991">#7991</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7991">tokio-rs/tokio#7991</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7996">#7996</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7996">tokio-rs/tokio#7996</a>
<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7997">#7997</a>:
<a
href="https://redirect.github.com/tokio-rs/tokio/pull/7997">tokio-rs/tokio#7997</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0af06b7bab"><code>0af06b7</code></a>
chore: prepare Tokio v1.51.0 (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8005">#8005</a>)</li>
<li><a
href="01a7f1dfab"><code>01a7f1d</code></a>
chore: prepare tokio-macros v2.7.0 (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8004">#8004</a>)</li>
<li><a
href="eeb55c733b"><code>eeb55c7</code></a>
runtime: steal tasks from the LIFO slot (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7431">#7431</a>)</li>
<li><a
href="1fc450aefb"><code>1fc450a</code></a>
runtime: stabilize <code>LocalRuntime</code> (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7557">#7557</a>)</li>
<li><a
href="324218f9bb"><code>324218f</code></a>
Merge tag 'tokio-1.47.4' (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8003">#8003</a>)</li>
<li><a
href="aa65d0d0b8"><code>aa65d0d</code></a>
chore: prepare Tokio v1.47.4 (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/8002">#8002</a>)</li>
<li><a
href="bf18ed452d"><code>bf18ed4</code></a>
sync: fix panic in <code>Chan::recv_many</code> when called with
non-empty vector on clo...</li>
<li><a
href="43134f1e57"><code>43134f1</code></a>
wasm: add wasm32-wasip2 networking support (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7933">#7933</a>)</li>
<li><a
href="b4c3246d33"><code>b4c3246</code></a>
macros: improve overall macro hygiene (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7997">#7997</a>)</li>
<li><a
href="7947fa4bd7"><code>7947fa4</code></a>
rt: add runtime name (<a
href="https://redirect.github.com/tokio-rs/tokio/issues/7924">#7924</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/tokio-rs/tokio/compare/tokio-1.50.0...tokio-1.51.0">compare
view</a></li>
</ul>
</details>
<br />

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-06 07:16:38 +00:00
niya
06921b0725 devtools: rename the remaining NodeActor variables (#43969)
- Renames variables containing the `NodeActor` name to `node_name`

Testing: Manually tested with `mach test-devtools` 
Fixes: #43606 as described
[here](https://github.com/servo/servo/issues/43606#issuecomment-4189623228)
for `NodeActor`

---------

Signed-off-by: Niya Gupta <niyabits@disroot.org>
2026-04-06 07:05:26 +00:00
Martin Robinson
73c64b6182 servoshell: Only dismiss the most-recently opened IME (#43932)
Servo may hide and show IME when handling `blur` and `focus` events, but
those events can be fired asynchronously when `<iframe>`s are involved.
This change ensures that we only dismiss the IME when it was the
most-recently opened one, so that an asynchronously fired `blur` event
for another control doesn't dismiss a newly opened one.
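
One way to picture the idea is a generation counter, where a stale hide request is ignored; this is an illustrative sketch, not servoshell's actual code:

```rust
struct ImeState {
    generation: u64,
    open: bool,
}

impl ImeState {
    /// Opening the IME bumps the generation and returns a token
    /// identifying this session.
    fn show(&mut self) -> u64 {
        self.generation += 1;
        self.open = true;
        self.generation
    }

    /// A hide request carrying a stale token (from an asynchronously
    /// fired blur for an older control) must not dismiss a newer IME.
    fn hide(&mut self, token: u64) {
        if token == self.generation {
            self.open = false;
        }
    }
}

fn main() {
    let mut ime = ImeState { generation: 0, open: false };
    let first = ime.show();   // focus lands on control A
    let _second = ime.show(); // focus moves to control B
    ime.hide(first);          // A's blur arrives late
    assert!(ime.open);        // B's IME stays open
}
```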

Testing: We don't really have testing for this level of servoshell.

Signed-off-by: Martin Robinson <mrobinson@fastmail.fm>
Co-authored-by: Martin Robinson <mrobinson@fastmail.fm>
2026-04-06 07:02:26 +00:00
Abbas Olanrewaju Sarafa
9799656127 devtools: Replace new with register for PauseActor (#43957)
Replaced new with register for PauseActor in pause.rs & lib.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-06 05:20:01 +00:00
Abbas Olanrewaju Sarafa
b8cff58b6c devtools: Replace new with register for WatcherActor (#43911)
Replaced new with register for WatcherActor in browsing_context &
watcher.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

---------

Signed-off-by: Sabb <sarafaabbas@gmail.com>
Signed-off-by: Mukilan Thiyagarajan <mukilan@igalia.com>
Co-authored-by: Mukilan Thiyagarajan <mukilan@igalia.com>
2026-04-06 05:14:13 +00:00
Tim Miller
3b128b37e3 script: Fix GC tracing of compiled JSScript pointers in ClassicScript (#43933)
Stores the compiled JSScript pointer in a rooted location that is
traceable by the GC. This fixes a crash observed in an experimental C#
Servo binding that could not be reproduced in servoshell.

Testing: No automated deterministic way to trigger this problem.

---------

Signed-off-by: Tim Miller <innerlogic4321@gmail.com>
2026-04-06 04:44:37 +00:00
Euclid Ye
0353f11ee2 script/mach: Increase stack size of ScriptThread/StyleThread to 8MiB to match recursion depth of other browsers (#43888)
TL;DR: We increase the stack size of `ScriptThread` to 8 MiB, and set
the Stylo stack size environment variable to 8 MiB for all builds. This
only reserves virtual memory space, which is basically unlimited on
64-bit machines, and matches the recursion depth of Chromium for the
example, which also uses 8 MiB.

The Stylo stack increase is necessary to prevent overflow when
refreshing or navigating to the example, probably because the initial
load restyles incrementally but a refresh does not.
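
The mechanism itself is just Rust's `std::thread::Builder::stack_size`; a minimal sketch (the recursive workload is illustrative, not Servo's code):

```rust
use std::thread;

// A deliberately non-tail-recursive function, standing in for deeply
// recursive work like restyling nested shadow roots.
fn depth(n: u64) -> u64 {
    if n == 0 { 0 } else { 1 + depth(n - 1) }
}

fn main() {
    let handle = thread::Builder::new()
        .name("script-thread-sketch".into())
        // 8 MiB: reserves virtual address space, not resident RAM.
        .stack_size(8 * 1024 * 1024)
        .spawn(|| depth(50_000))
        .expect("failed to spawn thread");
    assert_eq!(handle.join().unwrap(), 50_000);
}
```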

Testing: Added a Servo-specific test.

---
For the example in #43845, we got a stack overflow with more than 394
nested shadow roots.
For Chromium, it happens at more than 1631:
<img width="290" height="127" alt="image"
src="https://github.com/user-attachments/assets/b3d75627-4e80-4586-9b85-4b58d8a0cd33"
/>
For Firefox, it overflows at more than 1052.

Initially I thought we hadn't implemented this optimally and had
unnecessary recursion depth.
But the real reason is explained in the Rust std docs:
> The default stack size is platform-dependent and subject to change.
Currently, it is 2 MiB on all Tier-1 platforms.

For Chromium, the Visual Studio `dumpbin` tool shows the stack size:
```
Dump of file C:\Program Files (x86)\Microsoft\Edge\Application\msedge.exe
OPTIONAL HEADER VALUES
          800000 size of stack reserve
```
This is a hex value, $8*16^5$ bytes, exactly 8 MiB.

After making the same change in Servo, we are fine at 1601 and
overflow at 1602.
This matches Chromium's behaviour, beats Firefox, and should not
create much overhead, as this only reserves virtual memory space:
I tried increasing the value to 512 MiB, but Task Manager still reports
73 MB of RAM used, and we don't crash even with 10000 nested shadow
roots. That is just extra evidence, though, and not strictly needed.

Fixes: #43845

---------

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-06 03:10:11 +00:00
dependabot[bot]
da9b9d9aa0 build: bump semver from 1.0.27 to 1.0.28 (#43966)
[//]: # (dependabot-start)
⚠️  **Dependabot is rebasing this PR** ⚠️ 

Rebasing might not happen immediately, so don't worry if this takes some
time.

Note: if you make any changes to this PR yourself, they will take
precedence over the rebase.

---

[//]: # (dependabot-end)

Bumps [semver](https://github.com/dtolnay/semver) from 1.0.27 to 1.0.28.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/dtolnay/semver/releases">semver's
releases</a>.</em></p>
<blockquote>
<h2>1.0.28</h2>
<ul>
<li>Documentation improvements</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="7625c7aa3f"><code>7625c7a</code></a>
Release 1.0.28</li>
<li><a
href="fd404d082c"><code>fd404d0</code></a>
Merge pull request 351 from czy-29/master</li>
<li><a
href="f75f26e984"><code>f75f26e</code></a>
The <code>doc_auto_cfg</code> and <code>doc_cfg</code> features have
been merged</li>
<li><a
href="9e2bfa2ec8"><code>9e2bfa2</code></a>
Enable <code>serde</code> on <code>docs.rs</code> and automatically add
<code>serde</code> flag to the docs</li>
<li><a
href="8591f2344b"><code>8591f23</code></a>
Unpin CI miri toolchain</li>
<li><a
href="66bdd2ce5f"><code>66bdd2c</code></a>
Pin CI miri to nightly-2026-02-11</li>
<li><a
href="324ffce5d9"><code>324ffce</code></a>
Switch from cargo bench to criterion</li>
<li><a
href="34133a568a"><code>34133a5</code></a>
Update actions/upload-artifact@v5 -&gt; v6</li>
<li><a
href="7f935ffc72"><code>7f935ff</code></a>
Update actions/upload-artifact@v4 -&gt; v5</li>
<li><a
href="c07fb91353"><code>c07fb91</code></a>
Switch from test::black_box to std::hint::black_box</li>
<li>Additional commits viewable in <a
href="https://github.com/dtolnay/semver/compare/1.0.27...1.0.28">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=semver&package-manager=cargo&previous-version=1.0.27&new-version=1.0.28)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-06 00:56:15 +00:00
dependabot[bot]
2a47b3dcef build: bump libz-sys from 1.1.25 to 1.1.28 (#43962)
Bumps [libz-sys](https://github.com/rust-lang/libz-sys) from 1.1.25 to
1.1.28.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/rust-lang/libz-sys/releases">libz-sys's
releases</a>.</em></p>
<blockquote>
<h2>1.1.28</h2>
<p>This release is mainly for testing the new <code>maint</code> tool to
prevent wrong releases in future.</p>
<p>It also adds a macOS fix for when the <code>cc</code> based build
script is used.</p>
<h2>1.1.27</h2>
<h2>What's Changed</h2>
<ul>
<li>Bump actions/download-artifact from 8.0.0 to 8.0.1 in the
github-actions group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/rust-lang/libz-sys/pull/263">rust-lang/libz-sys#263</a></li>
<li>fix(zlib): remove unnecessary defines by <a
href="https://github.com/weihanglo"><code>@​weihanglo</code></a> in <a
href="https://redirect.github.com/rust-lang/libz-sys/pull/264">rust-lang/libz-sys#264</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/weihanglo"><code>@​weihanglo</code></a>
made their first contribution in <a
href="https://redirect.github.com/rust-lang/libz-sys/pull/264">rust-lang/libz-sys#264</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/rust-lang/libz-sys/compare/1.1.25...1.1.27">https://github.com/rust-lang/libz-sys/compare/1.1.25...1.1.27</a></p>
<h2>1.1.26 [YANKED]</h2>
<h2>YANKED</h2>
<p>These didn't contain the actual source code and thus wasn't
functional.</p>
<h2>What's Changed</h2>
<ul>
<li>Bump actions/download-artifact from 8.0.0 to 8.0.1 in the
github-actions group by <a
href="https://github.com/dependabot"><code>@​dependabot</code></a>[bot]
in <a
href="https://redirect.github.com/rust-lang/libz-sys/pull/263">rust-lang/libz-sys#263</a></li>
<li>fix(zlib): remove unnecessary defines by <a
href="https://github.com/weihanglo"><code>@​weihanglo</code></a> in <a
href="https://redirect.github.com/rust-lang/libz-sys/pull/264">rust-lang/libz-sys#264</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/weihanglo"><code>@​weihanglo</code></a>
made their first contribution in <a
href="https://redirect.github.com/rust-lang/libz-sys/pull/264">rust-lang/libz-sys#264</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/rust-lang/libz-sys/compare/1.1.25...1.1.26">https://github.com/rust-lang/libz-sys/compare/1.1.25...1.1.26</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="346d882bef"><code>346d882</code></a>
bump version to 1.1.28 for macOS arm build fix via CC</li>
<li><a
href="7b4f21928c"><code>7b4f219</code></a>
cargo fmt and clippy</li>
<li><a
href="613d426250"><code>613d426</code></a>
Create a new <code>maint</code> tool to prevent common publishing
mistakes (<a
href="https://redirect.github.com/rust-lang/libz-sys/issues/265">#265</a>)</li>
<li><a
href="e4f06f16f4"><code>e4f06f1</code></a>
fix(zng): use crc32_armv8 for ARM cc builds</li>
<li><a
href="847cabf870"><code>847cabf</code></a>
bump to 1.1.27 for re-release after 1.1.26 was yanked</li>
<li><a
href="613a5cbca2"><code>613a5cb</code></a>
adapt <code>cargo-zng</code> script to deal with Cargo.lock file (by
ignoring it)</li>
<li><a
href="a2f22b1b51"><code>a2f22b1</code></a>
bump patch level prior to release</li>
<li><a
href="580d147321"><code>580d147</code></a>
Merge pull request <a
href="https://redirect.github.com/rust-lang/libz-sys/issues/264">#264</a>
from weihanglo/freebsd</li>
<li><a
href="817bbc0c6d"><code>817bbc0</code></a>
fix(zlib): remove unnecessary defines</li>
<li><a
href="232b03a9f9"><code>232b03a</code></a>
Merge pull request <a
href="https://redirect.github.com/rust-lang/libz-sys/issues/263">#263</a>
from rust-lang/dependabot/github_actions/github-actio...</li>
<li>Additional commits viewable in <a
href="https://github.com/rust-lang/libz-sys/compare/1.1.25...1.1.28">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=libz-sys&package-manager=cargo&previous-version=1.1.25&new-version=1.1.28)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-06 00:41:52 +00:00
dependabot[bot]
56cc25aaf0 build: bump arc-swap from 1.9.0 to 1.9.1 (#43961)
Bumps [arc-swap](https://github.com/vorner/arc-swap) from 1.9.0 to
1.9.1.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/vorner/arc-swap/blob/master/CHANGELOG.md">arc-swap's
changelog</a>.</em></p>
<blockquote>
<h1>1.9.1</h1>
<ul>
<li>One more SeqCst :-| (<a
href="https://redirect.github.com/vorner/arc-swap/issues/204">#204</a>).</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f100e6c2ee"><code>f100e6c</code></a>
One more SeqCst</li>
<li>See full diff in <a
href="https://github.com/vorner/arc-swap/compare/v1.9.0...v1.9.1">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=arc-swap&package-manager=cargo&previous-version=1.9.0&new-version=1.9.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-06 00:37:16 +00:00
dependabot[bot]
ae1d4b56cf build: bump glib from 0.22.3 to 0.22.4 in the gstreamer-related group (#43958)
Bumps the gstreamer-related group with 1 update:
[glib](https://github.com/gtk-rs/gtk-rs-core).

Updates `glib` from 0.22.3 to 0.22.4
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/gtk-rs/gtk-rs-core/releases">glib's
releases</a>.</em></p>
<blockquote>
<h2>0.22.4</h2>
<pre><code>Bilal Elmoussaoui:
      glib: Allow setting FINAL/DEPRECATED flags
      ci: Only run compile tests on glib crate
      Fix typos job

Ignacio Casal Quinteiro:
      glib-win32: fix export of function

Sebastian Dröge:
      Update gir / gir-files
      Regenerate with latest gir / gir-files
      rustfmt: Update to 2024 edition
      glib: Make sure to acquire the main context and make it thread default in MainContext::block_on()
      glib: Add various #[allow(deprecated)] to glib::wrapper! for when the parent class/interface is deprecated
      Update versions to 0.22.4
</code></pre>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="48b79229f2"><code>48b7922</code></a>
Revert &quot;gio: Add &quot;D-Bus&quot; and &quot;DBus&quot; to
Cargo.toml keywords and description&quot;</li>
<li><a
href="42be84cb2e"><code>42be84c</code></a>
Update versions to 0.22.4</li>
<li><a
href="0835bbce70"><code>0835bbc</code></a>
glib: Add various <code>#[allow(deprecated)]</code> to
<code>glib::wrapper!</code> for when the par...</li>
<li><a
href="dc7961a302"><code>dc7961a</code></a>
gio: Add &quot;D-Bus&quot; and &quot;DBus&quot; to Cargo.toml keywords
and description</li>
<li><a
href="8377bd2cc7"><code>8377bd2</code></a>
glib-win32: fix export of function</li>
<li><a
href="d77af0b6c3"><code>d77af0b</code></a>
Fix typos job</li>
<li><a
href="9ea24d8403"><code>9ea24d8</code></a>
ci: Only run compile tests on glib crate</li>
<li><a
href="a9d5040cb4"><code>a9d5040</code></a>
glib: Allow setting FINAL/DEPRECATED flags</li>
<li><a
href="aae7fcb22c"><code>aae7fcb</code></a>
glib: Make sure to acquire the main context and make it thread default
in `Ma...</li>
<li><a
href="fac3d07f2a"><code>fac3d07</code></a>
rustfmt: Update to 2024 edition</li>
<li>Additional commits viewable in <a
href="https://github.com/gtk-rs/gtk-rs-core/compare/0.22.3...0.22.4">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=glib&package-manager=cargo&previous-version=0.22.3&new-version=0.22.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-06 00:17:52 +00:00
Sharan Poojari
57b86edb89 layout: Fix line breaking opportunities for Chinese and Japanese (#43744)
Enable CJK-aware line breaking in inline layout so Chinese/Japanese
text gets proper wrap opportunities instead of overflowing containers.
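
As a rough illustration of the idea only (not Servo's actual implementation, which relies on proper Unicode line-breaking data), a break opportunity can be offered between adjacent CJK characters:

```rust
// Sketch: treat CJK ideographs and kana as providing break opportunities
// between one another. The ranges below are illustrative, not the full
// Unicode line-breaking algorithm.
fn is_cjk(c: char) -> bool {
    matches!(c,
        '\u{3040}'..='\u{30FF}'   // Hiragana + Katakana
        | '\u{3400}'..='\u{4DBF}' // CJK Extension A
        | '\u{4E00}'..='\u{9FFF}' // CJK Unified Ideographs
    )
}

/// Returns byte offsets where a line may break.
fn break_opportunities(text: &str) -> Vec<usize> {
    let mut breaks = Vec::new();
    let mut prev: Option<char> = None;
    for (i, c) in text.char_indices() {
        if let Some(p) = prev {
            // Break between two non-whitespace chars when either is CJK.
            if (is_cjk(p) || is_cjk(c)) && !p.is_whitespace() && !c.is_whitespace() {
                breaks.push(i);
            }
        }
        prev = Some(c);
    }
    breaks
}

fn main() {
    // A break opportunity between each ideograph (3 bytes per char here).
    let b = break_opportunities("你好世界");
    assert_eq!(b, vec![3, 6, 9]);
    println!("{:?}", b);
}
```

Latin-only text falls through to the usual whitespace-based opportunities, which is why overflow only showed up for CJK runs.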

Testing: some WPT are now passing

---------

Signed-off-by: SharanRP <z8903830@gmail.com>
2026-04-05 23:15:54 +00:00
Abbas Olanrewaju Sarafa
674bd2567c devtools: Replace new with register for TabDescriptorActor (#43909)
Replaced new with register for TabDescriptorActor in browsing_context &
tab.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-05 21:47:40 +00:00
Tim van der Lippe
a1c8896eda script: Pass &mut JSContext to reflect_node_with_proto (#43952)
A lot (and I mean, really a lot) depends on these constructors.
Therefore, this is the one spaghetti ball that I could extract and
convert all `can_gc` to `cx`. There are some new introductions of
`temp_cx` in the callbacks of the servo parser, but we already had some
in other callbacks.

Part of #40600

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-05 18:07:30 +00:00
rovertrack
621f6b0cf7 servo: Use _ prefixed names for unused arguments for public API functions (#43947)
Updated all public API methods to use `_`-prefixed names for their unused arguments:

- `WebViewDelegate`
   - `notify_cursor_changed`: Added _cursor.
   - `notify_traversal_complete`: Added _traversal_id.
   - `notify_input_event_handled`: Added _event_id and _result.
   - `notify_fullscreen_state_changed`: Added _is_fullscreen.
   - `request_move_to`: Added _point.
   - `request_create_new`: Added _request.
   - `request_permission`: Added _request.
   - `show_bluetooth_device_dialog`: Added _request.
 - `WebXrRegistry`
    - `register`: Added _registry

Testing: No tests necessary as this is just renaming argument.
Fixes: #43894
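
For context, a toy example of why the `_` prefix is used on default trait methods (the trait here is a stand-in, not Servo's actual `WebViewDelegate`):

```rust
// Illustrative only: `_`-prefixed parameter names keep a descriptive name
// in the public API docs while telling the compiler that the default
// method body intentionally ignores the argument.
trait WebViewDelegate {
    // Default no-ops: embedders override only what they need.
    fn notify_fullscreen_state_changed(&self, _is_fullscreen: bool) {}
    fn notify_cursor_changed(&self, _cursor: &str) {}
}

struct NoopDelegate;
impl WebViewDelegate for NoopDelegate {}

fn main() {
    // Compiles without unused-variable warnings on the default bodies.
    let delegate = NoopDelegate;
    delegate.notify_fullscreen_state_changed(true);
    delegate.notify_cursor_changed("pointer");
    println!("ok");
}
```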

Signed-off-by: Rover track <rishan.pgowda@gmail.com>
2026-04-05 16:03:51 +00:00
Muhammad Mostafa
bb368dbb12 script: Add error messages in StaticRange (#43260)
Adds specific InvalidNodeType error messages in StaticRange for
Constructor().

Part of [#40756](https://github.com/servo/servo/issues/40756)

Signed-off-by: Mohamed Mostafa <mohamedmoustafaa1998@gmail.com>
2026-04-05 12:40:03 +00:00
Tim van der Lippe
46582ec41d script: Remove CanGc::note() from create.rs (#43946)
Part of #40600

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-05 12:28:12 +00:00
Tim van der Lippe
496d808ab1 script: Pass &mut JSContext to consume_stream (#43944)
Migrates all but one `CanGc::note()` to `cx`. The
last one requires `invoke_script_environment_preparer` to have a safe
hook in mozjs.

Part of #40600

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-05 11:31:21 +00:00
Tim van der Lippe
1fd77d022e script: Pass &mut JSContext to make_atomic_setter (#43942)
Part of #42812

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-05 11:26:55 +00:00
Tim van der Lippe
cf1b104b1a script: Pass &mut JSContext to RoutedPromiseListener (#43943)
Part of #40600

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-05 11:18:25 +00:00
Martin Robinson
9da7484061 script: Support deleting words with backspace in textual inputs (#43940)
This change makes it so that when you press alt or control (depending on
the platform) while backspacing, entire words are deleted. This matches
the behavior of the major desktop platforms.

Testing: This change adds a Servo-specific test for this behavior as
well
as for normal backspacing.
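
A minimal sketch of word-aware backspace, assuming simple whitespace word boundaries (hypothetical names; the real implementation handles richer boundary rules):

```rust
// Sketch: delete backwards from `cursor` to the previous word boundary,
// as a word-aware backspace might. Boundaries are whitespace-only here.
fn delete_word_backward(text: &str, cursor: usize) -> (String, usize) {
    let before = &text[..cursor];
    // Skip any trailing whitespace, then skip the word itself.
    let trimmed = before.trim_end_matches(char::is_whitespace);
    let new_cursor = trimmed
        .rfind(char::is_whitespace)
        .map(|i| i + 1)
        .unwrap_or(0);
    let mut out = String::with_capacity(new_cursor + text.len() - cursor);
    out.push_str(&text[..new_cursor]);
    out.push_str(&text[cursor..]);
    (out, new_cursor)
}

fn main() {
    let (s, c) = delete_word_backward("hello world", 11);
    assert_eq!(s, "hello ");
    assert_eq!(c, 6);
    let (s, c) = delete_word_backward("one two ", 8);
    assert_eq!(s, "one ");
    assert_eq!(c, 4);
    println!("ok");
}
```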

Signed-off-by: Martin Robinson <mrobinson@fastmail.fm>
Co-authored-by: Martin Robinson <mrobinson@fastmail.fm>
2026-04-05 10:28:31 +00:00
Taym Haddadi
6d6d1d371e IndexedDB: Fix fire-success/error event exception handling and explicit commit (#43914)
Fix fire-success/error event exception handling and explicit commit

Testing: fire-success/error event wpt test fixed.
Fixes: part of #40983

Signed-off-by: Taym Haddadi <haddadi.taym@gmail.com>
2026-04-05 07:26:34 +00:00
webbeef
0b5688fdfa parser: add a pretty printer for top-level json documents (#43702)
This adds a new resource implementing a simple pretty printer for json
documents.

Testing: build this branch and launch with `./mach run
https://httpbin.org/json`

<img width="1044" height="1064" alt="image"
src="https://github.com/user-attachments/assets/42680c4b-2971-482a-af2b-9017f0f81752"
/>
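
The idea can be illustrated with a toy re-indenting pass over a compact JSON string (a from-scratch sketch for illustration, not the resource this PR adds):

```rust
// Toy pretty printer: re-indents already-valid compact JSON by tracking
// nesting depth, while leaving string contents untouched.
fn pretty_print(json: &str) -> String {
    let mut out = String::new();
    let mut depth = 0usize;
    let mut in_string = false;
    let mut escaped = false;
    for c in json.chars() {
        if in_string {
            out.push(c);
            if escaped { escaped = false; }
            else if c == '\\' { escaped = true; }
            else if c == '"' { in_string = false; }
            continue;
        }
        match c {
            '"' => { in_string = true; out.push(c); }
            '{' | '[' => {
                depth += 1;
                out.push(c);
                out.push('\n');
                out.push_str(&"  ".repeat(depth));
            }
            '}' | ']' => {
                depth = depth.saturating_sub(1);
                out.push('\n');
                out.push_str(&"  ".repeat(depth));
                out.push(c);
            }
            ',' => {
                out.push(c);
                out.push('\n');
                out.push_str(&"  ".repeat(depth));
            }
            ':' => out.push_str(": "),
            c if c.is_whitespace() => {} // drop insignificant whitespace
            _ => out.push(c),
        }
    }
    out
}

fn main() {
    let pretty = pretty_print(r#"{"a":1,"b":[2,3]}"#);
    println!("{pretty}");
    assert!(pretty.contains("\"a\": 1"));
}
```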

---------

Signed-off-by: webbeef <me@webbeef.org>
Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
Co-authored-by: Tim van der Lippe <TimvdLippe@users.noreply.github.com>
2026-04-05 06:50:52 +00:00
Kingsley Yung
623ce59383 script: Move Web Crypto interfaces to script/dom/webcrypto (#43939)
Move interfaces defined by the Web Crypto specification to the
`script/dom/webcrypto/` from `script/dom/`. This includes `Crypto`,
`CryptoKey` and `SubtleCrypto` (with its submodules).

Testing: No behavior changes.
Fixes: Part of #38901

Signed-off-by: Kingsley Yung <kingsley@kkoyung.dev>
2026-04-05 06:42:00 +00:00
Jonathan Schwender
e6ed6954b4 bootstrap: Sync packages with book (#43931)
This commit removes some unneeded lines from README.md in the
linux_packages folder. This will trigger `book-export` to run, since it
creates a diff in that folder.
Since the export only happens if there is a diff, and there have been
no changes since the auto-sync PR landed, the formatting-related
changes never got upstreamed (sorting the packages).

This also doubles as a test of #43920 (merged), to be absolutely
sure that it works as intended.

Testing: not required / in a way this is a test of our book-export
action.

Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
2026-04-05 06:36:49 +00:00
Abbas Olanrewaju Sarafa
2476d98f34 devtools: Replace new with register for ThreadConfigurationActor (#43916)
Replaced new with register for ThreadConfigurationActor in
thread_configuration.rs & watcher.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-05 06:15:27 +00:00
Euclid Ye
bb94d98bb2 mach: Set up build mode (Side effect: Stylo thread stack size reduced to 512 KiB again for most build) (#43930)
Previously the mode was always "". Now we set it to the real profile.
Note that previously, all builds had a 2 MiB Stylo thread stack size,
which was unintended: we only wanted that for debug builds, with 8 MiB
for ASAN.

Now, builds other than debug/ASAN get the 512 KiB default stack size
again.
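
The resulting mapping can be sketched like this (sizes taken from the message above; the `mode` strings and function name are illustrative):

```rust
// Sketch of the build-mode to Stylo stack-size mapping described above.
use std::thread;

fn stylo_stack_size(mode: &str) -> usize {
    match mode {
        "debug" => 2 * 1024 * 1024, // 2 MiB for debug builds
        "asan" => 8 * 1024 * 1024,  // 8 MiB for ASAN builds
        _ => 512 * 1024,            // 512 KiB default for everything else
    }
}

fn main() {
    assert_eq!(stylo_stack_size("release"), 512 * 1024);
    assert_eq!(stylo_stack_size("asan"), 8 * 1024 * 1024);
    // A size like this would be applied when spawning the thread:
    let handle = thread::Builder::new()
        .stack_size(stylo_stack_size("release"))
        .spawn(|| 2 + 2)
        .unwrap();
    assert_eq!(handle.join().unwrap(), 4);
    println!("ok");
}
```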

Testing: Manually tested and printed the mode at
0b6b97384d/python/servo/command_base.py (L419-L420).
Fixes: #43927

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-05 01:51:58 +00:00
Servo WPT Sync
ff9e242de9 Sync WPT with upstream (05-04-2026) (#43938)
Automated downstream sync of changes from upstream as of 05-04-2026
[no-wpt-sync]

Signed-off-by: WPT Sync Bot <ghbot+wpt-sync@servo.org>
2026-04-05 01:39:41 +00:00
Jambong Ralpher
54987adba2 Automatically generate CONTENT_EVENT_HANDLER_NAMES from WebIDL (#43848)
Replaced the hand-written CONTENT_EVENT_HANDLER_NAMES list in
eventtarget.rs with an auto-generated version. A new
ContentEventHandlerNames method in codegen.py iterates through WebIDL
descriptors whose prototype chain includes Node, collects attribute
members with EventHandler callback types, and generates a sorted,
duplicate-free Rust array. The list is included from eventtarget.rs,
eliminating manual maintenance and ensuring it stays in sync
automatically.

Testing: Build passes with cargo build -j2 -p servo-script. The
generated ContentEventHandlerNames.rs produces the same set of event
handler names as the previous hand-written list. No new runtime tests
are needed since this is a compile time change.

Fixes: #43611

Signed-off-by: staysafe020 <jambongralpher@gmail.com>
2026-04-04 20:52:41 +00:00
Abbas Olanrewaju Sarafa
7c05e1556f devtools: Replace new with register for ProcessActor (#43923)
Replaced new with register for ProcessActor in process.rs & root.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 13:17:20 +00:00
Abbas Olanrewaju Sarafa
3af1abc013 devtools: Replace new with register for AccessibilityActor (#43907)
Replaced new with register for AccessibilityActor in browsing_context &
accessibility.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

---------

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 13:14:16 +00:00
Jonathan Schwender
06f6320609 ci: Secure book-sync secrets with environment (#43920)
We defined a github environment `book-sync` which contains the required
secrets.
After merging this PR we can remove the secrets from the per-repository
secrets, which reduces the scope the secrets are available in, and has
the added restriction of only being available on protected branches.

Testing: Prior to this PR, the functionality was tested on the
`environments` branch and discussed in the maintainers chat. After this
PR is merged, a manual check should be done to ensure the book-export
workflow still continues to work as expected.

Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
2026-04-04 11:34:27 +00:00
Tim van der Lippe
a683c03140 script: Pass &mut JSContext to make_int_setter (#43928)
Part of #42638

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-04 11:23:37 +00:00
Martin Robinson
953206e7c4 script: Add an initial implementation of relatedTarget forfocus/blur events (#43926)
This is an initial implementation for `relatedTarget` of `focus` and
`blur` events. As the code doesn't yet follow the specification this is
a bit tricky to associate with specification text, but this should be
correct. A later change will adapt the code to the "focusing steps" part
of the specification.

In addition, a duplicated call to `Element::set_focus_state` is removed.
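
The pairing can be sketched as follows, with illustrative types: blur's `relatedTarget` is the element gaining focus, and focus's `relatedTarget` is the element losing it:

```rust
// Sketch of relatedTarget pairing on a focus change. Types are toy
// stand-ins for Servo's event and element types.
#[derive(Debug, PartialEq)]
struct FocusEvent {
    kind: &'static str,
    related_target: Option<&'static str>,
}

fn focus_transition(
    old: Option<&'static str>,
    new: Option<&'static str>,
) -> Vec<FocusEvent> {
    let mut events = Vec::new();
    if old.is_some() {
        // The blurred element learns where focus is going.
        events.push(FocusEvent { kind: "blur", related_target: new });
    }
    if new.is_some() {
        // The focused element learns where focus came from.
        events.push(FocusEvent { kind: "focus", related_target: old });
    }
    events
}

fn main() {
    let events = focus_transition(Some("#input-a"), Some("#input-b"));
    assert_eq!(events[0], FocusEvent { kind: "blur", related_target: Some("#input-b") });
    assert_eq!(events[1], FocusEvent { kind: "focus", related_target: Some("#input-a") });
    println!("ok");
}
```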

Testing: This gets one more WPT test passing.

Signed-off-by: Martin Robinson <mrobinson@fastmail.fm>
Co-authored-by: Martin Robinson <mrobinson@fastmail.fm>
2026-04-04 10:32:22 +00:00
Ashwin Naren
3718abd626 Prevent panic on client storage directory conflict (#43918)
Instead of erroring out, it makes sense to open an in-memory database so
that the thread can keep functioning.
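
The fallback pattern reads roughly like this (types and error values here are hypothetical, not Servo's actual storage code):

```rust
// Sketch: if opening on-disk storage fails (e.g. a directory conflict),
// degrade gracefully to a volatile in-memory store instead of panicking.
use std::collections::HashMap;

enum Storage {
    Disk(String),                     // path-backed store
    Memory(HashMap<String, Vec<u8>>), // volatile fallback
}

// Stand-in for the real open call; an empty path simulates a conflict.
fn open_disk(path: &str) -> Result<Storage, String> {
    if path.is_empty() {
        Err("directory conflict".to_owned())
    } else {
        Ok(Storage::Disk(path.to_owned()))
    }
}

fn open_storage(path: &str) -> Storage {
    open_disk(path).unwrap_or_else(|_err| {
        // Would log the error here, then keep the thread functioning.
        Storage::Memory(HashMap::new())
    })
}

fn main() {
    assert!(matches!(open_storage("/tmp/db"), Storage::Disk(_)));
    assert!(matches!(open_storage(""), Storage::Memory(_)));
    println!("ok");
}
```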

Fixes:
https://servo.zulipchat.com/#narrow/channel/263398-general/topic/WPT.20regression

Signed-off-by: Ashwin Naren <arihant2math@gmail.com>
2026-04-04 10:24:24 +00:00
Abbas Olanrewaju Sarafa
571562c092 devtools: Replace new with register for NetworkEventActor (#43925)
Replaced new with register for NetworkEventActor in network_event.rs &
lib.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 09:24:45 +00:00
Abbas Olanrewaju Sarafa
86dca5f7dd devtools: Replace new with register for BrowsingContextActor (#43924)
Replaced new with register for BrowsingContextActor in
browsing_context.rs & lib.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

---------

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 09:23:36 +00:00
Jonathan Schwender
42ba33d7ef mach: Fix typo in allocator feature name (#43922)
This name was mistyped during the rename of the servo crates to be
`servo-` prefixed.

Testing: Manual testing with `./mach build --with-asan`. This is not run
in CI.

Signed-off-by: Jonathan Schwender <55576758+jschwe@users.noreply.github.com>
2026-04-04 09:15:58 +00:00
Abbas Olanrewaju Sarafa
0b6b97384d devtools: Replace new with register for PerformanceActor (#43921)
Replaced new with register for PerformanceActor in performance.rs &
root.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

---------

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 08:27:26 +00:00
Abbas Olanrewaju Sarafa
b886d57024 devtools: Replace new with register for ThreadActor (#43910)
Replaced new with register for ThreadActor in browsing_context,
thread.rs & lib.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

---------

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 07:47:23 +00:00
Abbas Olanrewaju Sarafa
f161acbdaf devtools: Replace new with register for BreakpointListActor (#43917)
Replaced new with register for BreakpointListActor in breakpoint.rs &
watcher.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 06:50:16 +00:00
Tim van der Lippe
20265ced1d script: Allow for implicit cx for setters (#43524)
Rather than specifying a cx for every single CSS property from Stylo, we
instead make it implicit. This allows us to remove the `CanGc::note()`
from the macro and pass in the `cx` as normal.

We can use a similar approach for setters in other elements where we use
the setter macros.

Part of #42812

Testing: It compiles

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-04 06:46:13 +00:00
Abbas Olanrewaju Sarafa
4876a49eba devtools: Replace new with register for CssPropertiesActor (#43908)
Replaced new with register for CssPropertiesActor in browsing_context &
css_properties.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
Signed-off-by: Mukilan Thiyagarajan <mukilan@igalia.com>
Co-authored-by: Mukilan Thiyagarajan <mukilan@igalia.com>
2026-04-04 05:57:41 +00:00
Abbas Olanrewaju Sarafa
2ce06e370d devtools: Replace new with register for NetworkParentActor (#43913)
Replaced new with register for NetworkParentActor in network_parent.rs &
watcher.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 04:59:45 +00:00
Abbas Olanrewaju Sarafa
410f4a4270 devtools: Replace new with register for ReflowActor (#43906)
Replaced new with register for ReflowActor in browsing_context &
reflow.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-04 04:56:07 +00:00
yvt
c6bb3cc363 Implement the IDL security check (#28583)
<https://heycam.github.io/webidl/#es-operations>
<https://html.spec.whatwg.org/multipage/#integration-with-idl>

This PR implements the IDL security check, which is an important (but
not only) mechanism to restrict cross-origin DOM accesses.

This implementation follows WebKit's behavior and not the specification
for the following reasons:

 - Neither Gecko nor WebKit implements the specification's behavior.
- This would require a relatively elaborate mechanism *just* to have
access to the target object's `CrossOriginProperties`.
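
A much-simplified sketch of such a check, with allowlists loosely modeled on the HTML spec's cross-origin properties (names and types are illustrative, not Servo's):

```rust
// Sketch: gate cross-origin property access on a per-interface allowlist,
// in the spirit of CrossOriginProperties. Lists here are abbreviated.
fn is_cross_origin_accessible(interface: &str, property: &str) -> bool {
    const WINDOW: &[&str] = &[
        "window", "self", "location", "close", "closed",
        "focus", "blur", "postMessage",
    ];
    const LOCATION: &[&str] = &["href", "replace"];
    match interface {
        "Window" => WINDOW.contains(&property),
        "Location" => LOCATION.contains(&property),
        _ => false, // everything else denies cross-origin access
    }
}

fn main() {
    assert!(is_cross_origin_accessible("Window", "postMessage"));
    assert!(!is_cross_origin_accessible("Window", "document"));
    assert!(!is_cross_origin_accessible("Location", "pathname"));
    println!("ok");
}
```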

---
- [x] `./mach build -d` does not report any errors
- [x] `./mach test-tidy` does not report any errors
- [ ] These changes fix #___ (GitHub issue number if applicable)

---
- [x] There are tests for these changes OR
- [ ] These changes do not require tests because ___

---------

Signed-off-by: Josh Matthews <josh@joshmatthews.net>
Co-authored-by: Josh Matthews <josh@joshmatthews.net>
2026-04-04 04:18:08 +00:00
Simon Sapin
894327e5d9 Don’t print passing unit tests (#43902)
This removes from the output of `./mach test-unit` hundreds of lines
like:

```
        PASS [   0.011s] style_tests str::test_str_join_empty
```

Signed-off-by: Simon Sapin <simon@igalia.com>
2026-04-04 01:59:05 +00:00
xtqqczze
c8c200cf96 build(deps): bump rustix from 1.1.2 to 1.1.4 (#43898)
Signed-off-by: xtqqczze <45661989+xtqqczze@users.noreply.github.com>
2026-04-04 00:55:03 +00:00
Taym Haddadi
2c5f981d46 indexeddb: abort pending worker upgrade and delete new db on rollback (#42998)
Abort pending upgrade requests when a worker closes, correctly roll back
newlyy created databases by deleting backend state on old_version == 0.

Testing: IndexedDB/worker-termination-aborts-upgrade.window.js.ini test
pass.
part of https://github.com/servo/servo/issues/40983

---------

Signed-off-by: Taym Haddadi <haddadi.taym@gmail.com>
2026-04-03 22:32:13 +00:00
Martijn Gribnau
9b69acb88b Replace normal struct instantiation with register for WalkerActor (#43881)
Part of a larger refactor to name all actors and variables consistently,
#43800.

Testing: not necessary beyond `mach test-devtools` per #43800 / #43606
Fixes: `#43800` ((not linking because it's one of many PR's).

Signed-off-by: Martijn Gribnau <garm@ilumeo.com>
2026-04-03 19:04:12 +00:00
Martin Robinson
c0d0a87b01 script: Modernize the HTMLElement WebIDL (#43904)
The WebIDL file for `HTMLElement` was quite out of date. This change
makes it match the current HTML specification and also moves `blur()` to
`HTMLOrSVGElement` as it is in the spec. The implementation is just a
copy of the one for `HTMLElement` as we do for `focus()`.

Testing: This should not change behavior (other than adding a `blur()`
method
for SVG -- which we don't support), so should be covered by existing
tests.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-03 18:25:33 +00:00
Martin Robinson
b9df62427b script: Expose legacy table cellSpacing, cellPadding, and align DOM properties (#43903)
We support these setting these as attributes on the `<table>` element
itself, so I suppose it also makes sense to support their DOM APIs as
well. This trades a teensy bit of code for compatability with a some old
web content.

Testing: This causes a handful of reflection subtests to start passing.
There
isn't much testing for the behavior of the attributes, but this is
legacy
behavior so it sort of makes sense.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-03 18:09:13 +00:00
Shubham Gupta
4e315eba20 paint: Use viewport meta tag's min-scale and max-scale to clamp pinch zoom (#40098)
**Note: Targeting Mobile devices only.**

Clamp Pinch Zoom factor using `viewport`' scale range parsed from
`<meta>` tag.

## **Behavior in Servo with this PR**

Zoom Type | Device Type | Meta Supported | Range
-- | -- | -- | --
Pinch Zoom | Mobile | Yes | Parsed from `<meta>`
Pinch Zoom | Desktop | No | Default

## **Observed behavior in Chrome**:

Device Type | Viewport Meta Support (pref) | Pinch Zoom (No Reflow) | Zoom (using keyboard)
-- | -- | -- | --
Mobile (Android) | Yes | Clamped within Viewport Meta Range | Applied to Pinch Zoom*
Desktop (Touch Screen) | No | Clamped within Default Range | Same as Zoom (using menu)

## References from Chromium:
- Definition of
[page_scale_delta](https://source.chromium.org/chromium/chromium/src/+/main:cc/trees/layer_tree_host_client.h;drc=56c66e417c83e2096a4e4e8a5c4ab7bbd525c9f3;bpv=1;bpt=1;l=39?gsn=page_scale_delta&gs=KYTHE%3A%2F%2Fkythe%3A%2F%2Fchromium.googlesource.com%2Fcodesearch%2Fchromium%2Fsrc%2F%2Fmain%3Flang%3Dc%252B%252B%3Fpath%3Dcc%2Ftrees%2Flayer_tree_host_client.h%23SKME3VsvfEmKrf_3d5aQckyeMEaDxgiGETVRCM1Haac);
-
[WebViewImpl::ApplyViewportChanges](https://source.chromium.org/chromium/chromium/src/+/main:third_party/blink/renderer/core/exported/web_view_impl.cc;drc=56c66e417c83e2096a4e4e8a5c4ab7bbd525c9f3;bpv=1;bpt=1;l=4108?gsn=ApplyViewportChanges&gs=KYTHE%3A%2F%2Fkythe%3A%2F%2Fchromium.googlesource.com%2Fcodesearch%2Fchromium%2Fsrc%2F%2Fmain%3Flang%3Dc%252B%252B%3Fpath%3Dthird_party%2Fblink%2Frenderer%2Fcore%2Fexported%2Fweb_view_impl.cc%23ll5snmeunTDzY4PwOwxyrNwNc4EI13SFfZnHIWuBsNw)
->
[SetPageScaleFactorAndLocation](https://source.chromium.org/chromium/chromium/src/+/main:third_party/blink/renderer/core/exported/web_view_impl.cc;drc=56c66e417c83e2096a4e4e8a5c4ab7bbd525c9f3;bpv=1;bpt=1;l=2430?gsn=SetPageScaleFactorAndLocation&gs=KYTHE%3A%2F%2Fkythe%3A%2F%2Fchromium.googlesource.com%2Fcodesearch%2Fchromium%2Fsrc%2F%2Fmain%3Flang%3Dc%252B%252B%3Fpath%3Dthird_party%2Fblink%2Frenderer%2Fcore%2Fexported%2Fweb_view_impl.cc%23wJPsxPYe-aA-buH8xy-IKLTHLqtZ_IgGsuARSYerlUE)
->
[SetScaleAndLocation](https://source.chromium.org/chromium/chromium/src/+/main:third_party/blink/renderer/core/frame/visual_viewport.cc;drc=56c66e417c83e2096a4e4e8a5c4ab7bbd525c9f3;bpv=1;bpt=1;l=513?gsn=SetScaleAndLocation&gs=KYTHE%3A%2F%2Fkythe%3A%2F%2Fchromium.googlesource.com%2Fcodesearch%2Fchromium%2Fsrc%2F%2Fmain%3Flang%3Dc%252B%252B%3Fpath%3Dthird_party%2Fblink%2Frenderer%2Fcore%2Fframe%2Fvisual_viewport.cc%23u-DPCmPBwcQcKi3oKJ1duPI83otfHDXzsQI8KMYusaA)
->
[DidSetScaleOrLocation](https://source.chromium.org/chromium/chromium/src/+/main:third_party/blink/renderer/core/frame/visual_viewport.cc;drc=56c66e417c83e2096a4e4e8a5c4ab7bbd525c9f3;bpv=1;bpt=1;l=543?gsn=DidSetScaleOrLocation&gs=KYTHE%3A%2F%2Fkythe%3A%2F%2Fchromium.googlesource.com%2Fcodesearch%2Fchromium%2Fsrc%2F%2Fmain%3Flang%3Dc%252B%252B%3Fpath%3Dthird_party%2Fblink%2Frenderer%2Fcore%2Fframe%2Fvisual_viewport.cc%23cUPTZHhvInyEDOapVOzXmCPKg9DO_tYGVZY7y0D9EBw)
->
[ClampToConstraints](https://source.chromium.org/chromium/chromium/src/+/main:third_party/blink/renderer/core/frame/page_scale_constraints.cc;drc=56c66e417c83e2096a4e4e8a5c4ab7bbd525c9f3;bpv=1;bpt=1;l=60?gsn=ClampToConstraints&gs=KYTHE%3A%2F%2Fkythe%3A%2F%2Fchromium.googlesource.com%2Fcodesearch%2Fchromium%2Fsrc%2F%2Fmain%3Flang%3Dc%252B%252B%3Fpath%3Dthird_party%2Fblink%2Frenderer%2Fcore%2Fframe%2Fpage_scale_constraints.cc%23hQCpRu6_p6TTaFLTpDMnO_d-g3SnRpG-p5UlazTZlK8)

Testing: New WPT Test Added:
`wpt/visual-viewport/viewport-scale-clamped.html`



Fixes: #40390 (Observed Inconsistent behavior with Chrome Android)
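
The clamping itself is simple; a sketch with illustrative names and default bounds (not Servo's actual types):

```rust
// Sketch: clamp a proposed pinch-zoom scale to the range parsed from the
// viewport <meta> tag, falling back to a default range when absent.
#[derive(Clone, Copy)]
struct ScaleRange { min: f32, max: f32 }

const DEFAULT_RANGE: ScaleRange = ScaleRange { min: 0.25, max: 5.0 };

fn clamp_pinch_zoom(proposed: f32, meta_range: Option<ScaleRange>) -> f32 {
    let range = meta_range.unwrap_or(DEFAULT_RANGE);
    proposed.clamp(range.min, range.max)
}

fn main() {
    let meta = Some(ScaleRange { min: 1.0, max: 3.0 });
    assert_eq!(clamp_pinch_zoom(0.5, meta), 1.0);  // below min-scale
    assert_eq!(clamp_pinch_zoom(10.0, meta), 3.0); // above max-scale
    assert_eq!(clamp_pinch_zoom(10.0, None), 5.0); // default range applies
    println!("ok");
}
```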

---------

Signed-off-by: Shubham Gupta <shubham.gupta@chromium.org>
2026-04-03 11:50:31 +00:00
Narfinger
6cece2de59 servo: Add WebView::load_request and UrlRequest to the API (#43338)
This is the continuation of https://github.com/servo/servo/pull/43310
with permission from https://github.com/longvatrong111 to continue the
work as they are currently busy.

This changes the previous PR by adding a new URLRequest object that
can have more parameters. Currently we only implement headers.


Testing: We only have unit tests for the embedder API and I am not sure
what they would look like in this case.

---------

Signed-off-by: Narfinger <Narfinger@users.noreply.github.com>
Co-authored-by: batu_hoang <longvatrong111@gmail.com>
Co-authored-by: Martin Robinson <mrobinson@igalia.com>
2026-04-03 11:09:30 +00:00
Abbas Olanrewaju Sarafa
b95dbb8721 devtools: Replace new with register for ConsoleActor (#43896)
Replace new with register for ConsoleActor in console & lib.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-03 10:50:21 +00:00
Martin Robinson
f5c23cd711 servoshell: Only send ImeEvent::Dismissed when the user dismisses the IME (#43872)
We receive the `Ime::Disabled` event both when the user dismisses the
IME and
when Servo itself dismisses it (for instance, when changing focus on the
page).
Servo will blur the current element upon receiving
`ImeEvent::Dismissed`,
leading to spurious focus behavior. Address this by only sending this
message
to Servo when it was triggered by the user (as best as we can tell).
This
problem was revealed by improvements in the internal focus APIs.

This is a bit of a bandaid until we have a more robust IME API. There
are
still missing pieces in both Servo and in winit.

Testing: This is a bit tricky to test as it depends a lot on when
messages are
sent to servoshell from winit / the windowing system and when Servo
processes
focus events.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-03 10:41:49 +00:00
Martin Robinson
e3c894b12e servo: Remove WebViewDelegate::play_gamepad_haptic_effect and WebViewDelegate::stop_gamepad_haptic_effect (#43895)
This functionality was moved to the `GamepadDelegate`, but the old
delegate methods were never removed. They are currently unused.

Testing: This change just removes dead code, so no tests are necessary.
Fixes: #43743

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-03 10:23:08 +00:00
Simon Wülker
a9dd2b3d15 servo: Fix rustdoc warnings (#43892)
One warning remains:
```
warning: public documentation for `intercept` links to private item `InterceptedWebResourceLoad`
   --> components/servo/webview_delegate.rs:240:11
    |
240 |     /// [`InterceptedWebResourceLoad`].
    |           ^^^^^^^^^^^^^^^^^^^^^^^^^^ this item is private
    |
    = note: this link resolves only because you passed `--document-private-items`, but will break without
    = note: `#[warn(rustdoc::private_intra_doc_links)]` on by default

warning: `servo` (lib doc) generated 1 warning
```

I think that's a false positive, because `InterceptedWebResourceLoad`
*is* public.

Testing: We don't run `./mach doc` in CI, so there's no way to test this

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-03 10:19:38 +00:00
Abbas Olanrewaju Sarafa
91ede77766 devtools: Replace new with register for DeviceActor (#43893)
Replace new with register for DeviceActor in device & root.rs

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-03 10:18:04 +00:00
Babalola Taiwo J
ac712151b5 script: Fix assertion failure when stringifying cross-origin location object (#43844)
This fixes the assertion failure (`!JS_IsExceptionPending(*cx)`) that
happens when `console.log()` is called on a cross-origin location object
(e.g. `console.log(frame.contentWindow.location)`).

The problem was that `maybe_stringify_dom_object` calls `ToString` on
cross-origin objects, which throws a JS exception via the
`DissimilarOriginLocation` stringifier. That exception was never
cleared, so subsequent JS API calls would hit the assertion.

The fix refactors `console_argument_from_handle_value` using an
inner/outer function pattern based on @jdm's suggestion:
- The inner function returns `Result<ConsoleArgument, ()>` and returns
`Err(())` when `console_object_from_handle_value` returns `None` for an
object, instead of falling through to `stringify_handle_value` which
could trigger the same crash
- The outer function catches the `Err`, reports any pending JS exception
via `report_pending_exception`, and returns a fallback `ConsoleArgument`
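
The inner/outer shape can be sketched like this (all names are hypothetical stand-ins for the Servo internals):

```rust
// Sketch of the inner/outer pattern: the inner function propagates
// failure instead of re-stringifying (which could re-trigger the crash);
// the outer function reports the pending exception and falls back.
#[derive(Debug, PartialEq)]
enum ConsoleArgument { Text(String) }

fn report_pending_exception() {
    // Would clear and report the pending JS exception here.
}

fn console_argument_inner(value: Option<&str>) -> Result<ConsoleArgument, ()> {
    match value {
        Some(s) => Ok(ConsoleArgument::Text(s.to_owned())),
        None => Err(()), // e.g. a cross-origin object we may not stringify
    }
}

fn console_argument(value: Option<&str>) -> ConsoleArgument {
    console_argument_inner(value).unwrap_or_else(|()| {
        report_pending_exception();
        ConsoleArgument::Text("[object]".to_owned())
    })
}

fn main() {
    assert_eq!(console_argument(Some("hi")), ConsoleArgument::Text("hi".into()));
    assert_eq!(console_argument(None), ConsoleArgument::Text("[object]".into()));
    println!("ok");
}
```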

Fixes #43530

---

- [x] `./mach build -d` does not report any errors
- [x] `./mach fmt` produces no changes
- [x] There are tests for these changes

---------

Signed-off-by: thebabalola <t.babalolajoseph@gmail.com>
2026-04-03 10:15:05 +00:00
Sam
e6782a5bda script: Support auto generating all types of cx (#43174)
This will be useful in the future but for now we just always emit old
cx.

Testing: None
Work for #40600

---------

Signed-off-by: sagudev <16504129+sagudev@users.noreply.github.com>
2026-04-03 08:43:58 +00:00
Abbas Olanrewaju Sarafa
c5c26230a2 devtools: Replace new with register for StyleSheetsActor (#43889)
Replaced new with register for StyleSheetsActor in browsing_context &
stylesheets

Testing: No testing required, compiles successfully.
Fixes: Part of #43800

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-03 07:00:54 +00:00
Kingsley Yung
c1e9e04868 script: Implement SubtleCrypto.supports (#43703)
Implement the `SubtleCrypto.supports` method in our WebCrypto API. The
*check support for an algorithm* algorithm is also implemented to
support the functionality of the `SubtleCrypto.supports` method.

The `SubtleCrypto.supports` method has two overloads. One of them has a
union containing a non-object value at the distinguishing index. Our
`codegen.py` currently does not support unions of non-object values at
the distinguishing index, so `codegen.py` is also patched to extend its
support to unions of objects, strings, numbers, and boolean values.

Specification of `SubtleCrypto.supports`:

https://wicg.github.io/webcrypto-modern-algos/#SubtleCrypto-method-supports
Specification of "check support for an algorithm":

https://wicg.github.io/webcrypto-modern-algos/#dfn-check-support-for-algorithm

Testing:
- Pass WPT tests related to `supports` method.
- Add new tests for IDL operation overloading with unions of various
types.

Fixes: Part of #40687

---------

Signed-off-by: Kingsley Yung <kingsley@kkoyung.dev>
2026-04-03 06:52:26 +00:00
Euclid Ye
b47ab9c500 net: Set "request-includes-credentials" and URL list of response for "HTTP-network-or-cache fetch" (#43798)
Part of #33616

Implement steps 11 and 13 of [HTTP-network-or-cache
fetch](https://fetch.spec.whatwg.org/#concept-http-network-or-cache-fetch).

Testing: The value should be used in
https://fetch.spec.whatwg.org/#cross-origin-resource-policy-internal-check,
which will be done in a follow-up. Right now it is not used.

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-03 06:49:10 +00:00
treetmitterglad
0054b71d38 Fix error message for illegal HTML element constructors (#41107) (#43882)
Changes the error message for illegal HTML element constructors like
`new HTMLElement()` from `new.target must not be the active function
object` to `Illegal constructor.`, matching Chrome, Firefox, and the
HTML spec.

Ran `./mach test-wpt tests/wpt/tests/custom-elements`. 180 tests passed
as expected, 1 pre-existing unrelated crash in `HTMLMediaElement`.

#41107

Signed-off-by: Eli Bowman <asdfup@protonmail.com>
2026-04-03 01:41:56 +00:00
Josh Matthews
b386708c93 script: Move eventsource out of event module. (#43884)
The EventSource interface is unrelated to the other code in the event
module.

Testing: Just moving code around; no runtime impact.

Signed-off-by: Josh Matthews <josh@joshmatthews.net>
2026-04-03 01:40:49 +00:00
CynthiaOketch
1385107eb9 wpt: Enable /infrastructure tests (#20889) (#43879)
Added the top-level `/infrastructure` WPT directory to `include.ini`,
which was previously excluded by the root `skip: true` default.

- Skipped `[reftest]` as I was unsure whether the failures indicate a
real bug in Servo's reftest engine
- Skipped `[testdriver][bidi]` as BiDi WebDriver is not supported
- Added metadata for known expected failures (service workers, shared
workers, expected-fail harness tests, etc.)
- Removed stale metadata for tests that now pass


Testing:
  Ran `./mach test-wpt infrastructure/` before and after the change.

  Before: 114 ran as expected, 69 unexpected results.
  After: 166 ran as expected. The remaining 15 unexpected results are
  all in /infrastructure/reftest/, which is skipped in include.ini.

Fixes: #20889

---------

Signed-off-by: CynthiaOketch <cynthiaoketch6@gmail.com>
2026-04-03 00:46:22 +00:00
Oriol Brufau
b0582911db Upgrade Stylo to 2026-04-01 (#43878)
This continues #43045.

Changelog:
- Upstream:
74ddab4091...6de1071549
- Servo fixups:
9f2f4f3f1b...6cfce6f329

Stylo tracking issue: https://github.com/servo/stylo/issues/347

Summary of improvements:
- Adding support for the CSS-wide `revert-rule` keyword
- `::details-content` becomes element-backed, thus accepting nested
pseudo-elements like `::details-content::before`
- Custom properties can now be registered using dashed idents in their
syntax
- Various `attr()` improvements

Testing: Various WPT improvements

---------

Signed-off-by: Oriol Brufau <obrufau@igalia.com>
2026-04-03 00:03:25 +00:00
niya
b1fc99bc6c refactor: rename PropertyIteratorActor related variable names (#43876)
Rename variables associated with `PropertyIteratorActor`

Testing: Tested locally with `mach test-devtools`

Fixes: A part of #43606

Signed-off-by: Niya Gupta <niyabits@disroot.org>
2026-04-02 19:06:09 +00:00
niya
ff4122bc51 devtools: rename ObjectActor and SymbolIteratorActor variables names (#43875)
Rename variables associated with `ObjectActor` and `SymbolIteratorActor`

Testing: Tested locally with `mach test-devtools`

Fixes: A part of https://github.com/servo/servo/issues/43606

Signed-off-by: Niya Gupta <niyabits@disroot.org>
2026-04-02 17:52:01 +00:00
Martin Robinson
87627c4c74 script: Also set the focus state on focused area's shadow hosts (#43873)
The specification says that `:focus` should be active on all shadow
hosts of ancestors of focused areas. This change does that by exposing a
new `DocumentFocusHandler::set_focused_element` method and also using it
during the "removing steps." This is important because unlike the
"focusing steps" `set_focused_element` does not cause focus and blur
events.

Testing: This leads to a progression in results on 8 WPT tests.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-02 17:21:16 +00:00
Kelechi Ebiri
e1f4c1f869 stylo: Update Stylo to remove PseudoElement::DetailsSummary. (#43849)
Companion PR for servo/stylo#345
Fixes: #43812

Signed-off-by: Kelechi Ebiri <ebiritg@gmail.com>
2026-04-02 17:14:52 +00:00
Martin Robinson
ed8576b163 script: Move all focus-related code to components/script/document/focus.rs and create DocumentFocusHandler (#43868)
This continues the work to split up the large DOM structs. In this case
we create a helper struct to store focus-related state much like
`DocumentEventHandler`.

Testing: This is mostly code motion, so should be covered by existing
tests.
Fixes: #43720

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-02 14:11:32 +00:00
Abbas Olanrewaju Sarafa
782dce68c4 Used encoding-parsing algorithm in follow_hyperlink (#43822)
Used encoding-parsing algorithm in follow_hyperlink

Testing: Ran `./mach test-wpt
tests/wpt/tests/html/semantics/links/links-created-by-a-and-area-elements`.
Result:
```
Running 11 tests in web-platform-tests

Ran 11 tests finished in 74.9 seconds.
  • 11 ran as expected.
```

Second test:
`./mach test-wpt tests/wpt/tests/html/semantics/links/links-created-by-a-and-area-elements/anchor-src-encoding.html`
Result:
```
web-platform-test
~~~~~~~~~~~~~~~~~
Ran 2 checks (1 subtests, 1 tests)
Expected results: 2
Unexpected results: 0
OK
```

Fixes: #43508

---------

Signed-off-by: Sabb <sarafaabbas@gmail.com>
2026-04-02 13:45:08 +00:00
Jonathan Schwender
78ffbe8caa servo-hyper-serde: Use the workspace version (#43866)
`hyper_serde` currently has no dependents on crates.io. We give up the
independent versioning and simplify our setup by using the same version
as our workspace. The new crate is servo-hyper-serde.

Testing: Not required, policy change.

Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
2026-04-02 12:25:12 +00:00
elomscansio
1e213e59b2 script: Move MutationObserver DOM interfaces to script/dom/mutationobserver (#43865)
Moves the MutationObserver interfaces from script/dom/ into the
script/dom/mutationobserver/ module.

Testing: Just a refactor; shouldn't need any testing.
Fixes: Part of https://github.com/servo/servo/issues/38901

Signed-off-by: Emmanuel Paul Elom <elomemmanuel007@gmail.com>
2026-04-02 11:52:56 +00:00
Martin Robinson
33f74feffd script: Fully implement DocumentOrShadowRoot#activeElement (#43861)
`DocumentOrShadowRoot#activeElement` should return retargeted results.
What that means is that if the DOM anchor of the `Document`'s focused
focusable area is within a shadow root, `Document#activeElement` should
return the shadow host. This change implements that behavior, properly
returning the `activeElement` from both `Document` and `ShadowRoot`.

Testing: This causes a decent number of WPT tests and subtests to start
passing. One subtest starts to fail because it uses the `autofocus`
attribute, which we do not yet support.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-02 11:12:30 +00:00
atbrakhi
9a43d43c32 devtools: Merge preview and scopes (#43860)
Before we had seperate parsing functions for preview and scopes. In this
change we merge them into one.

Testing: current tests are passing, also tested manually.
Fixes: #36027

Signed-off-by: atbrakhi <atbrakhi@igalia.com>
Co-authored-by: eri <eri@igalia.com>
2026-04-02 10:02:38 +00:00
Martin Robinson
5b743ef1f4 script: Run the focusing steps when navigating to a fragment (#43859)
When navigating to a fragment the specification says to run the focusing
steps. This is possible now that the focusing steps are properly
exposed. This change also adds support for the fallback option, which is
used when the focus target is not associated with a focuable area. In
this case the fallback is to focus the viewport.

Testing: This change adds a new WPT for this behavior, which was
seemingly not tested before.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-02 09:34:05 +00:00
Narfinger
f7d77754ff base: Implement MallocSizeOf for some more types (#43858)
This implements MallocSizeOf for a couple more types and removes some
"ignore_malloc_size_of" annotations throughout the codebase:
- `std::path::PathBuf`
- `tokio::sync::oneshot::Sender`
- `http::HeaderMap` (with a reasonable approximation that iterates over
all headers)
- `data_url::Mime`, by looking at the inner type
- `http::Method`, which is an enum internally
- `urlpattern::UrlPattern`, approximated by iterating over all public
fields that are strings
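
As a rough sketch of the approach (the trait below is a simplified stand-in for the real `MallocSizeOf` trait, which asks the allocator for the usable size of each heap block; `ApproxHeapSize` and `approx_heap_size` are hypothetical names):

```rust
use std::path::PathBuf;

// Simplified stand-in for MallocSizeOf: approximate heap usage with
// buffer capacities instead of querying the allocator.
trait ApproxHeapSize {
    fn approx_heap_size(&self) -> usize;
}

impl ApproxHeapSize for PathBuf {
    fn approx_heap_size(&self) -> usize {
        // A PathBuf owns one heap allocation for its OsString buffer;
        // capacity() gives the allocated length in bytes.
        self.capacity()
    }
}

impl<T: ApproxHeapSize> ApproxHeapSize for Vec<T> {
    fn approx_heap_size(&self) -> usize {
        // Count the element array itself plus whatever each element owns,
        // mirroring the "iterate over all entries" approximation used for
        // map-like types such as http::HeaderMap.
        self.capacity() * std::mem::size_of::<T>()
            + self.iter().map(|e| e.approx_heap_size()).sum::<usize>()
    }
}

fn main() {
    let p = PathBuf::from("/tmp/example");
    // The buffer must be at least as large as the stored path.
    assert!(p.approx_heap_size() >= "/tmp/example".len());
    println!("ok");
}
```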

Testing: We cannot test if MallocSizeOf is correct currently.

Signed-off-by: Narfinger <Narfinger@users.noreply.github.com>
2026-04-02 08:13:08 +00:00
Tim van der Lippe
5fdd425dd2 script: Align navigate more with spec steps (#43857)
This moves step 23 to the end of the algorithm and also updates the spec
steps related to JavaScript navigations.

Testing: WPT

Signed-off-by: Tim van der Lippe <tvanderlippe@gmail.com>
2026-04-02 08:11:07 +00:00
Euclid Ye
beed34612c devtools: Update outdated TODO (#43851)
These were here 10 years ago.

1. All Actor names are already immutable:
- All actor implementations store a `name: String` field set at
construction and never mutate it.
- The name is only used as a key by `ActorRegistry` for lookups.
2. There is no `register_later`.

---------

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-02 07:37:16 +00:00
dependabot[bot]
12fb7f415d build: bump hyper from 1.8.1 to 1.9.0 (#43856)
Bumps [hyper](https://github.com/hyperium/hyper) from 1.8.1 to 1.9.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/hyperium/hyper/releases">hyper's
releases</a>.</em></p>
<blockquote>
<h2>v1.9.0</h2>
<h2>Features</h2>
<ul>
<li><strong>client:</strong>
<ul>
<li>expose HTTP/2 current max stream count (<a
href="https://redirect.github.com/hyperium/hyper/issues/4026">#4026</a>)
(<a
href="d51cb71569">d51cb715</a>)</li>
<li>add HTTP/2 <code>max_local_error_reset_streams</code> option (<a
href="https://redirect.github.com/hyperium/hyper/issues/4021">#4021</a>)
(<a
href="577874591c">57787459</a>)</li>
</ul>
</li>
<li><strong>error:</strong> add 'Error::is_parse_version_h2' method (<a
href="393c77c711">393c77c7</a>)</li>
<li><strong>http1:</strong> add UpgradeableConnection::into_parts (<a
href="e21205cfe4">e21205cf</a>)</li>
</ul>
<h2>Bug Fixes</h2>
<ul>
<li><strong>ffi:</strong> validate null pointers before dereferencing in
request/response functions (<a
href="https://redirect.github.com/hyperium/hyper/issues/4038">#4038</a>
(<a
href="28e73ccd23">28e73ccd</a>)</li>
<li><strong>http1:</strong>
<ul>
<li>allow keep-alive for chunked requests with trailers (<a
href="https://redirect.github.com/hyperium/hyper/issues/4043">#4043</a>)
(<a
href="7211ec25ef">7211ec25</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/4044">#4044</a>)</li>
<li>use case-insensitive matching for trailer fields (<a
href="https://redirect.github.com/hyperium/hyper/issues/4011">#4011</a>)
(<a
href="3b344cac9f">3b344cac</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/4010">#4010</a>)</li>
<li>use httparse config for Servers (<a
href="https://redirect.github.com/hyperium/hyper/issues/4002">#4002</a>)
(<a
href="bcb8ec5766">bcb8ec57</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/3923">#3923</a>)</li>
</ul>
</li>
<li><strong>http2:</strong>
<ul>
<li>cancel sending client request body on response future drop (<a
href="https://redirect.github.com/hyperium/hyper/issues/4042">#4042</a>)
(<a
href="5b17a69ebc">5b17a69e</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/4040">#4040</a>)</li>
<li>non-utf8 char in Connection header may cause panic when calling
to_str (<a
href="https://redirect.github.com/hyperium/hyper/issues/4019">#4019</a>)
(<a
href="c36ca8a5c5">c36ca8a5</a>)</li>
</ul>
</li>
</ul>
<h2>Refactors and chores</h2>
<ul>
<li>docs(error): add more information about is_incomplete_message by <a
href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a> in
<a
href="https://redirect.github.com/hyperium/hyper/pull/3978">hyperium/hyper#3978</a></li>
<li>Run cargo-audit in CI to check for known vulnerabilities in
dependencies. by <a
href="https://github.com/f0rki"><code>@​f0rki</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/3246">hyperium/hyper#3246</a></li>
<li>refactor(http1): simplify match of Token parse error by <a
href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a> in
<a
href="https://redirect.github.com/hyperium/hyper/pull/3981">hyperium/hyper#3981</a></li>
<li>refactor(http1): use saturating_sub instead of manual impl by <a
href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a> in
<a
href="https://redirect.github.com/hyperium/hyper/pull/3983">hyperium/hyper#3983</a></li>
<li>refactor(http1): replace many args of Chunked::step with struct by
<a href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a>
in <a
href="https://redirect.github.com/hyperium/hyper/pull/3982">hyperium/hyper#3982</a></li>
<li>docs: fix comment in <code>put_slice()</code> by <a
href="https://github.com/coryan"><code>@​coryan</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/3986">hyperium/hyper#3986</a></li>
<li>test(lib): fix unused warnings due to feature gating test imports by
<a href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a>
in <a
href="https://redirect.github.com/hyperium/hyper/pull/3997">hyperium/hyper#3997</a></li>
<li>docs: improve Read trait and ReadBufCursor documentation by <a
href="https://github.com/majiayu000"><code>@​majiayu000</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4000">hyperium/hyper#4000</a></li>
<li>fix: use h1 parser config when parsing server req by <a
href="https://github.com/0xPoe"><code>@​0xPoe</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4002">hyperium/hyper#4002</a></li>
<li>test(server): fix flaky disable_keep_alive_mid_request by <a
href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a> in
<a
href="https://redirect.github.com/hyperium/hyper/pull/4009">hyperium/hyper#4009</a></li>
<li>chore(ci): update to actions/checkout@v6 by <a
href="https://github.com/tottoto"><code>@​tottoto</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4005">hyperium/hyper#4005</a></li>
<li>chore(ci): update to cargo-check-external-types 0.4.0 by <a
href="https://github.com/tottoto"><code>@​tottoto</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4006">hyperium/hyper#4006</a></li>
<li>update copyright year to 2026 by <a
href="https://github.com/jasmyhigh"><code>@​jasmyhigh</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4007">hyperium/hyper#4007</a></li>
<li>refactor: avoid unwrap examples by <a
href="https://github.com/0xPoe"><code>@​0xPoe</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4001">hyperium/hyper#4001</a></li>
<li>fix(http1): use case-insensitive matching for trailer fields by <a
href="https://github.com/HueCodes"><code>@​HueCodes</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4011">hyperium/hyper#4011</a></li>
<li>chore: convert bug report template to GitHub form by <a
href="https://github.com/njg7194"><code>@​njg7194</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4015">hyperium/hyper#4015</a></li>
<li>chore(ci): force toml mode in yq selecting msrv by <a
href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a> in
<a
href="https://redirect.github.com/hyperium/hyper/pull/4020">hyperium/hyper#4020</a></li>
<li>fix: non-utf8 char may cause panic when calling to_str by <a
href="https://github.com/cuiweixie"><code>@​cuiweixie</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4019">hyperium/hyper#4019</a></li>
<li>feat(http2/client): add <code>max_local_error_reset_streams</code>
option by <a
href="https://github.com/ffuugoo"><code>@​ffuugoo</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4021">hyperium/hyper#4021</a></li>
<li>chore: drop pin-utils dependency by <a
href="https://github.com/tottoto"><code>@​tottoto</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4023">hyperium/hyper#4023</a></li>
<li>[minor] doc: Fix HTTP/2 max concurrent stream link by <a
href="https://github.com/dentiny"><code>@​dentiny</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4037">hyperium/hyper#4037</a></li>
<li>fix(ffi): validate null pointers before dereferencing in
request/resp… by <a
href="https://github.com/DhruvaD1"><code>@​DhruvaD1</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4038">hyperium/hyper#4038</a></li>
<li>h2: expose current max stream count by <a
href="https://github.com/howardjohn"><code>@​howardjohn</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4026">hyperium/hyper#4026</a></li>
<li>fix(http1): allow keep-alive for chunked requests with trailers by
<a href="https://github.com/wi-adam"><code>@​wi-adam</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4043">hyperium/hyper#4043</a></li>
<li>fix(http2): cancel pipe_task and send RST_STREAM on response future
drop by <a
href="https://github.com/mmishra100"><code>@​mmishra100</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/4042">hyperium/hyper#4042</a></li>
<li>Add APIs to allow switching an HTTP1 connection to HTTP2 if H2
preface is seen by <a
href="https://github.com/pborzenkov"><code>@​pborzenkov</code></a> in <a
href="https://redirect.github.com/hyperium/hyper/pull/3996">hyperium/hyper#3996</a></li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/hyperium/hyper/blob/master/CHANGELOG.md">hyper's
changelog</a>.</em></p>
<blockquote>
<h2>v1.9.0 (2026-03-31)</h2>
<h4>Bug Fixes</h4>
<ul>
<li><strong>ffi:</strong> validate null pointers before dereferencing in
request/response functions (<a
href="https://redirect.github.com/hyperium/hyper/issues/4038">#4038</a>
(<a
href="28e73ccd23">28e73ccd</a>)</li>
<li><strong>http1:</strong>
<ul>
<li>allow keep-alive for chunked requests with trailers (<a
href="https://redirect.github.com/hyperium/hyper/issues/4043">#4043</a>)
(<a
href="7211ec25ef">7211ec25</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/4044">#4044</a>)</li>
<li>use case-insensitive matching for trailer fields (<a
href="https://redirect.github.com/hyperium/hyper/issues/4011">#4011</a>)
(<a
href="3b344cac9f">3b344cac</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/4010">#4010</a>)</li>
<li>use httparse config for Servers (<a
href="https://redirect.github.com/hyperium/hyper/issues/4002">#4002</a>)
(<a
href="bcb8ec5766">bcb8ec57</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/3923">#3923</a>)</li>
</ul>
</li>
<li><strong>http2:</strong>
<ul>
<li>cancel sending client request body on response future drop (<a
href="https://redirect.github.com/hyperium/hyper/issues/4042">#4042</a>)
(<a
href="5b17a69ebc">5b17a69e</a>,
closes <a
href="https://redirect.github.com/hyperium/hyper/issues/4040">#4040</a>)</li>
<li>non-utf8 char in Connection header may cause panic when calling
to_str (<a
href="https://redirect.github.com/hyperium/hyper/issues/4019">#4019</a>)
(<a
href="c36ca8a5c5">c36ca8a5</a>)</li>
</ul>
</li>
</ul>
<h4>Features</h4>
<ul>
<li><strong>client:</strong>
<ul>
<li>expose HTTP/2 current max stream count (<a
href="https://redirect.github.com/hyperium/hyper/issues/4026">#4026</a>)
(<a
href="d51cb71569">d51cb715</a>)</li>
<li>add HTTP/2 <code>max_local_error_reset_streams</code> option (<a
href="https://redirect.github.com/hyperium/hyper/issues/4021">#4021</a>)
(<a
href="577874591c">57787459</a>)</li>
</ul>
</li>
<li><strong>error:</strong> add 'Error::is_parse_version_h2' method (<a
href="393c77c711">393c77c7</a>)</li>
<li><strong>http1:</strong> add UpgradeableConnection::into_parts (<a
href="e21205cfe4">e21205cf</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="0d6c7d5469"><code>0d6c7d5</code></a>
v1.9.0</li>
<li><a
href="e21205cfe4"><code>e21205c</code></a>
feat(http1): add UpgradeableConnection::into_parts</li>
<li><a
href="393c77c711"><code>393c77c</code></a>
feat(error): add 'Error::is_parse_version_h2' method</li>
<li><a
href="5b17a69ebc"><code>5b17a69</code></a>
fix(http2): cancel sending client request body on response future drop
(<a
href="https://redirect.github.com/hyperium/hyper/issues/4042">#4042</a>)</li>
<li><a
href="7211ec25ef"><code>7211ec2</code></a>
fix(http1): allow keep-alive for chunked requests with trailers (<a
href="https://redirect.github.com/hyperium/hyper/issues/4043">#4043</a>)</li>
<li><a
href="d51cb71569"><code>d51cb71</code></a>
feat(client): expose HTTP/2 current max stream count (<a
href="https://redirect.github.com/hyperium/hyper/issues/4026">#4026</a>)</li>
<li><a
href="28e73ccd23"><code>28e73cc</code></a>
fix(ffi): validate null pointers before dereferencing in
request/response fun...</li>
<li><a
href="e13e783927"><code>e13e783</code></a>
docs(client): fix HTTP/2 max concurrent stream link to spec (<a
href="https://redirect.github.com/hyperium/hyper/issues/4037">#4037</a>)</li>
<li><a
href="8ba900853b"><code>8ba9008</code></a>
chore(dependencies): drop pin-utils dependency (<a
href="https://redirect.github.com/hyperium/hyper/issues/4023">#4023</a>)</li>
<li><a
href="577874591c"><code>5778745</code></a>
feat(client): add HTTP/2 <code>max_local_error_reset_streams</code>
option (<a
href="https://redirect.github.com/hyperium/hyper/issues/4021">#4021</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/hyperium/hyper/compare/v1.8.1...v1.9.0">compare
view</a></li>
</ul>
</details>
<br />

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-02 05:55:40 +00:00
Kelechi Ebiri
ed78dadacb doc: explain how to create a WebView (#43712) (#43787)
Added a "Creating a WebView" section to the `WebView` doc comment
explaining that `WebViewBuilder` is the correct way to create a
`WebView`, with a usage example. Also expanded the `WebViewBuilder` doc
comment to be more descriptive.
Fixes: #43712

Signed-off-by: Kelechi Ebiri <ebiritg@gmail.com>
2026-04-02 05:22:41 +00:00
dependabot[bot]
8bda09c65a build: bump libc from 0.2.183 to 0.2.184 (#43854)
Bumps [libc](https://github.com/rust-lang/libc) from 0.2.183 to 0.2.184.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/rust-lang/libc/releases">libc's
releases</a>.</em></p>
<blockquote>
<h2>0.2.184</h2>
<h3>MSRV</h3>
<p>This release increases the MSRV of <code>libc</code> to 1.65. With
this update, you can now always use the
<code>core::ffi::c_*</code> types with <code>libc</code> definitions,
since <code>libc</code> has been changed to reexport from
<code>core</code> rather than redefining them. (This <em>usually</em>
worked before but had edge cases.)
(<a
href="https://redirect.github.com/rust-lang/libc/pull/4972">#4972</a>)</p>
<h3>Added</h3>
<ul>
<li>BSD: Add <code>IP_MINTTL</code> to bsd (<a
href="https://redirect.github.com/rust-lang/libc/pull/5026">#5026</a>)</li>
<li>Cygwin: Add <code>TIOCM_DSR</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5031">#5031</a>)</li>
<li>FreeBSD: Added <code>xfile</code> struct and file descriptor types
(<a
href="https://redirect.github.com/rust-lang/libc/pull/5002">#5002</a>)</li>
<li>Linux: Add CAN netlink bindings (<a
href="https://redirect.github.com/rust-lang/libc/pull/5011">#5011</a>)</li>
<li>Linux: Add <code>struct ethhdr</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/4239">#4239</a>)</li>
<li>Linux: Add <code>struct ifinfomsg</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5012">#5012</a>)</li>
<li>Linux: Define <code>max_align_t</code> for riscv64 (<a
href="https://redirect.github.com/rust-lang/libc/pull/5029">#5029</a>)</li>
<li>NetBSD: Add missing <code>CLOCK_</code> constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5020">#5020</a>)</li>
<li>NuttX: Add <code>_SC_HOST_NAME_MAX</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5004">#5004</a>)</li>
<li>VxWorks: Add <code>flock</code> and <code>F_*LCK</code> constants
(<a
href="https://redirect.github.com/rust-lang/libc/pull/4043">#4043</a>)</li>
<li>WASI: Add all <code>_SC_*</code> sysconf constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5023">#5023</a>)</li>
</ul>
<h3>Deprecated</h3>
<p>The remaining fixed-width integer aliases, <code>__uint128_t</code>,
<code>__uint128</code>, <code>__int128_t</code>, and
<code>__int128</code>,
have been deprecated. Use <code>i128</code> and <code>u128</code>
instead. (<a
href="https://redirect.github.com/rust-lang/libc/pull/4343">#4343</a>)</p>
<h3>Fixed</h3>
<ul>
<li><strong>breaking</strong> Redox: Fix signal action constant types
(<a
href="https://redirect.github.com/rust-lang/libc/pull/5009">#5009</a>)</li>
<li>EspIDF: Correct the value of <code>DT_*</code> constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5034">#5034</a>)</li>
<li>Redox: Fix locale values and add <code>RTLD_NOLOAD</code>, some TCP
constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5025">#5025</a>)</li>
<li>Various: Use <code>Padding::new(&lt;zeroed&gt;)</code> rather than
<code>Padding::uninit()</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5036">#5036</a>)</li>
</ul>
<h3>Changed</h3>
<ul>
<li><strong>potentially breaking</strong> Linux: Add new fields to
<code>struct ptrace_syscall_info</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/4966">#4966</a>)</li>
<li>Re-export <code>core::ffi</code> integer types rather than
redefining (<a
href="https://redirect.github.com/rust-lang/libc/pull/5015">#5015</a>)</li>
<li>Redox: Update <code>F_DUPFD</code>, <code>IP</code>, and
<code>TCP</code> constants to match relibc (<a
href="https://redirect.github.com/rust-lang/libc/pull/4990">#4990</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/rust-lang/libc/blob/0.2.184/CHANGELOG.md">libc's
changelog</a>.</em></p>
<blockquote>
<h2><a
href="https://github.com/rust-lang/libc/compare/0.2.183...0.2.184">0.2.184</a>
- 2026-04-01</h2>
<h3>MSRV</h3>
<p>This release increases the MSRV of <code>libc</code> to 1.65. With
this update, you can now always use the
<code>core::ffi::c_*</code> types with <code>libc</code> definitions,
since <code>libc</code> has been changed to reexport from
<code>core</code> rather than redefining them. (This <em>usually</em>
worked before but had edge cases.)
(<a
href="https://redirect.github.com/rust-lang/libc/pull/4972">#4972</a>)</p>
<h3>Added</h3>
<ul>
<li>BSD: Add <code>IP_MINTTL</code> to bsd (<a
href="https://redirect.github.com/rust-lang/libc/pull/5026">#5026</a>)</li>
<li>Cygwin: Add <code>TIOCM_DSR</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5031">#5031</a>)</li>
<li>FreeBSD: Added <code>xfile</code> struct and file descriptor types
(<a
href="https://redirect.github.com/rust-lang/libc/pull/5002">#5002</a>)</li>
<li>Linux: Add CAN netlink bindings (<a
href="https://redirect.github.com/rust-lang/libc/pull/5011">#5011</a>)</li>
<li>Linux: Add <code>struct ethhdr</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/4239">#4239</a>)</li>
<li>Linux: Add <code>struct ifinfomsg</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5012">#5012</a>)</li>
<li>Linux: Define <code>max_align_t</code> for riscv64 (<a
href="https://redirect.github.com/rust-lang/libc/pull/5029">#5029</a>)</li>
<li>NetBSD: Add missing <code>CLOCK_</code> constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5020">#5020</a>)</li>
<li>NuttX: Add <code>_SC_HOST_NAME_MAX</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5004">#5004</a>)</li>
<li>VxWorks: Add <code>flock</code> and <code>F_*LCK</code> constants
(<a
href="https://redirect.github.com/rust-lang/libc/pull/4043">#4043</a>)</li>
<li>WASI: Add all <code>_SC_*</code> sysconf constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5023">#5023</a>)</li>
</ul>
<h3>Deprecated</h3>
<p>The remaining fixed-width integer aliases, <code>__uint128_t</code>,
<code>__uint128</code>, <code>__int128_t</code>, and
<code>__int128</code>,
have been deprecated. Use <code>i128</code> and <code>u128</code>
instead. (<a
href="https://redirect.github.com/rust-lang/libc/pull/4343">#4343</a>)</p>
<h3>Fixed</h3>
<ul>
<li><strong>breaking</strong> Redox: Fix signal action constant types
(<a
href="https://redirect.github.com/rust-lang/libc/pull/5009">#5009</a>)</li>
<li>EspIDF: Correct the value of <code>DT_*</code> constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5034">#5034</a>)</li>
<li>Redox: Fix locale values and add <code>RTLD_NOLOAD</code>, some TCP
constants (<a
href="https://redirect.github.com/rust-lang/libc/pull/5025">#5025</a>)</li>
<li>Various: Use <code>Padding::new(&lt;zeroed&gt;)</code> rather than
<code>Padding::uninit()</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/5036">#5036</a>)</li>
</ul>
<h3>Changed</h3>
<ul>
<li><strong>potentially breaking</strong> Linux: Add new fields to
<code>struct ptrace_syscall_info</code> (<a
href="https://redirect.github.com/rust-lang/libc/pull/4966">#4966</a>)</li>
<li>Re-export <code>core::ffi</code> integer types rather than
redefining (<a
href="https://redirect.github.com/rust-lang/libc/pull/5015">#5015</a>)</li>
<li>Redox: Update <code>F_DUPFD</code>, <code>IP</code>, and
<code>TCP</code> constants to match relibc (<a
href="https://redirect.github.com/rust-lang/libc/pull/4990">#4990</a>)</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b1fd610c7e"><code>b1fd610</code></a>
chore: Release libc 0.2.184</li>
<li><a
href="f596819d7c"><code>f596819</code></a>
ci: Don't enforce cargo-semver-checks</li>
<li><a
href="4645f60c3a"><code>4645f60</code></a>
linux: update ptrace_syscall_info struct</li>
<li><a
href="14cbbec353"><code>14cbbec</code></a>
types: Remove <code>Padding::uninit</code></li>
<li><a
href="b5dcda885f"><code>b5dcda8</code></a>
pthread: Use <code>Padding::new(\&lt;zeroed&gt;)</code> rather than
<code>Padding::uninit()</code></li>
<li><a
href="bbb1c5d350"><code>bbb1c5d</code></a>
types: Add a <code>new</code> function to <code>Padding</code></li>
<li><a
href="df06e43309"><code>df06e43</code></a>
Fix locale values and add RTLD_NOLOAD, some TCP constants</li>
<li><a
href="078f5c6b3c"><code>078f5c6</code></a>
newlib/espidf: Move DT_* to espidf/mod.rs</li>
<li><a
href="d32b83db3c"><code>d32b83d</code></a>
Add IP_MINTTL to bsd</li>
<li><a
href="939e0ec2a8"><code>939e0ec</code></a>
Define max_align_t for riscv64-linux</li>
<li>Additional commits viewable in <a
href="https://github.com/rust-lang/libc/compare/0.2.183...0.2.184">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=libc&package-manager=cargo&previous-version=0.2.183&new-version=0.2.184)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-02 04:51:16 +00:00
dependabot[bot]
eb296b77f6 build: bump the wayland-related group with 5 updates (#43853)
Bumps the wayland-related group with 5 updates:

| Package | From | To |
| --- | --- | --- |
| [wayland-client](https://github.com/smithay/wayland-rs) | `0.31.13` |
`0.31.14` |
| [wayland-cursor](https://github.com/smithay/wayland-rs) | `0.31.13` |
`0.31.14` |
| [wayland-protocols](https://github.com/smithay/wayland-rs) | `0.32.11`
| `0.32.12` |
| [wayland-protocols-plasma](https://github.com/smithay/wayland-rs) |
`0.3.11` | `0.3.12` |
| [wayland-protocols-wlr](https://github.com/smithay/wayland-rs) |
`0.3.11` | `0.3.12` |

Updates `wayland-client` from 0.31.13 to 0.31.14
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/smithay/wayland-rs/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `wayland-cursor` from 0.31.13 to 0.31.14
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/smithay/wayland-rs/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `wayland-protocols` from 0.32.11 to 0.32.12
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/smithay/wayland-rs/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `wayland-protocols-plasma` from 0.3.11 to 0.3.12
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/smithay/wayland-rs/commits">compare
view</a></li>
</ul>
</details>
<br />

Updates `wayland-protocols-wlr` from 0.3.11 to 0.3.12
<details>
<summary>Commits</summary>
<ul>
<li>See full diff in <a
href="https://github.com/smithay/wayland-rs/commits">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-04-02 04:44:54 +00:00
treetmitterglad
a81b15d5a5 devtools: Rename NodeActor variables in walker.rs (#43847)
Standardize variable naming in `WalkerActor` (`walker.rs`), renaming
`node` → `node_name` in `find_child` to follow the `{}_name` convention.

Testing: `./mach test-devtools`, all 60 tests pass.

Fixes: Part of #43606

Signed-off-by: Eli Bowman <asdfup@protonmail.com>
2026-04-01 23:27:27 +00:00
niya
6150b1b2c0 devtools: rename NodeActor related variables (#43841)
Rename variables associated with `NodeActor`

Testing: Tested locally with mach test-devtools

Fixes: A part of https://github.com/servo/servo/issues/43606

---------

Signed-off-by: Niya Gupta <niyabits@disroot.org>
2026-04-01 23:13:52 +00:00
Messi II Innocent R.
ce61f2eaa9 Fix child disappearing with mixed overflow clip/visible. (#43620)
Fix child element disappearing when parent has mixed overflow
clip/visible

When a parent element clipped overflow on one axis but not the other (e.g.
overflow-y: clip with overflow-x: visible), the non-clipped axis was
built from f32::MIN/f32::MAX. On screens with a device pixel ratio above 1,
this caused child elements to disappear.

The fix uses LayoutRect::max_rect() instead.
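The failure mode can be sketched in plain Rust (this is an illustration, not Servo's actual clipping code): scaling a rect coordinate built from f32 extremes by a device pixel ratio above 1 overflows to infinity, so downstream geometry tests on the "unclipped" axis stop behaving like an unbounded rect.

```rust
fn main() {
    // A clip bound on the non-clipped axis built from an f32 extreme.
    let max_x = f32::MAX;

    // Scaling by a device pixel ratio > 1 overflows to infinity.
    let scaled = max_x * 2.0;
    assert!(scaled.is_infinite());

    // A width computed from the scaled bounds is no longer finite, so an
    // intersection test against the child's rect can fail and the child
    // ends up culled instead of visible.
    let width = scaled - (f32::MIN * 2.0);
    assert!(!width.is_finite());
}
```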

Testing: Reproduced the bug using --device-pixel-ratio 2 on Linux and
the child disappears on main branch but renders correctly with this fix.


fixes: #43599

Signed-off-by: Messi002 <rostandmessi2@gmail.com>
2026-04-01 22:38:51 +00:00
Oriol Brufau
827129a1a9 layout: Update group index when reusing table-column box (#43846)
Even if we can reuse the box, its column group index may have changed,
e.g. due to the removal of a previous column group. So just update it.

Testing: Adding crash test
Fixes: #39142

Signed-off-by: Oriol Brufau <obrufau@igalia.com>
2026-04-01 21:44:26 +00:00
Martin Robinson
a486ed525e script: Move "scroll into view" call out of "focusing steps" (#43842)
`HTMLOrSVGElement#focus` specifies the "scroll into view" steps should
be run after focusing. This was happening implicitly as part of the
`Document`'s implementation of the "focusing steps." That behavior is
not in the specification, so this change moves the scrolling call to
where it is specified. Along with making the code match the
specification, this change simplifies it as well.

Testing: This should not modify behavior, so existing tests should
suffice.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-01 21:40:13 +00:00
Simon Wülker
155016354c script: Implement PerformanceMeasureOptions (#43753)
The relevant changes are in `Performance::Measure`, the rest are
comments I added.

Testing: New tests start to pass

---------

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-01 20:15:16 +00:00
Simon Wülker
8791236c7a script: Let wheel, keyboard- and pointerevents be composed when fired by the user agent (#43799)
Additionally, this updates many outdated spec links for mouse-related
things that have moved from the uievents spec to the pointerevents spec.

Testing: This change adds new tests

Part of https://github.com/servo/servo/issues/35997
Fixes https://github.com/servo/servo/issues/37772

---------

Signed-off-by: Simon Wülker <simon.wuelker@arcor.de>
2026-04-01 19:59:10 +00:00
niya
305276c55a devtools: rename LongStringActor name variables (#43838)
Changes `name` to `long_string_name` and
`long_string` to `long_string_actor`.

Testing: Tested locally with mach test-devtools

Fixes: A part of #43606

---------

Signed-off-by: Niya Gupta <niyabits@disroot.org>
Co-authored-by: Niya Gupta <niyabits@disroot.org>
2026-04-01 16:44:09 +00:00
Martin Robinson
ceac966c34 script: Remove focus transaction concept (#43834)
For years Servo has had the concept of a focus transaction which was
used only to allow falling back to focusing the viewport when focusing a
clicked element failed. As this concept isn't part of the specification,
this change removes it.

Instead, a `FocusableArea` (a specification concept) is passed to
the `Document` focusing code. A `FocusableArea` might also be the
`Document`'s viewport.

As part of this change, some focus-related methods are moved to `Node`
from `Element` as the `Document` is not an `Element`.  This brings the
code closer to implementing the "focusing steps" from the specification.

Testing: This should not change behavior and is thus covered by existing
tests.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-01 15:12:21 +00:00
Euclid Ye
5fb3877f9f layout/script: Free memory earlier by consuming instead of draining the temporary container if it is unused (#43826)
Follow up of https://github.com/servo/servo/pull/43226
This also allows us to remove some `mut` declarations, and fixes some
unintended Chinese quotation marks.
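A minimal sketch (not Servo's actual code) of why consuming a temporary container can free memory earlier than draining it:

```rust
fn main() {
    // Draining: requires a mutable binding, and the emptied Vec (with its
    // backing allocation) stays alive until it goes out of scope.
    let mut temporary = vec![1u32, 2, 3];
    let drained: Vec<u32> = temporary.drain(..).collect();
    assert!(temporary.is_empty());

    // Consuming: into_iter() moves the container, so no `mut` binding is
    // needed and the backing memory can be released as soon as iteration
    // finishes rather than at the end of the enclosing scope.
    let temporary = vec![1u32, 2, 3];
    let consumed: Vec<u32> = temporary.into_iter().collect();
    assert_eq!(consumed, drained);
}
```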

Testing: This is a micro-optimization which does not change visible
behaviour.

Signed-off-by: Euclid Ye <yezhizhenjiakang@gmail.com>
2026-04-01 13:46:53 +00:00
Jonathan Schwender
a463495c90 webgpu: Fix feature guard (#43831)
Remove the dependency on the `webgpu` crate if the feature is disabled.

Note: This doesn't seem to improve the binary size in production mode,
so presumably dead code elimination is already working well.
Nevertheless, it's preferable to correctly feature guard and it should
save a bit of compile-time (when not building with the webgpu feature).


Testing: `webgpu` feature is enabled by default in CI, and we test
`--no-default-features` too in the HOS build in CI. `cargo tree -p servo
--no-default-features` does not show webgpu anymore after this change.

Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
2026-04-01 13:41:12 +00:00
eri
187206dc5a devtools: Show variable values in scopes (#43792)
<img width="312" height="307" alt="image"
src="https://github.com/user-attachments/assets/30c8cd9a-9712-4b53-9487-37c289a14520"
/>

Testing: Ran mach test-devtools and manual testing
Part of: #36027
Depends on: #43791

Signed-off-by: eri <eri@igalia.com>
Co-authored-by: atbrakhi <atbrakhi@igalia.com>
2026-04-01 13:08:18 +00:00
Mukilan Thiyagarajan
069aa65d54 config: Add documentation for the public API. (#43802)
Testing: No code changes, so no testing is required.

Signed-off-by: Mukilan Thiyagarajan <mukilan@igalia.com>
2026-04-01 11:50:19 +00:00
Martin Robinson
9cb65e242e script: Use TextInputWidget to implement <textarea> shadow DOM (#43770)
In addition to making it so that textual `<input>` and `<textarea>`
share a common shadow DOM structure, this allows styling the
`<textarea>` placeholder. It is now properly shown in grey.

Testing: This fixes a few WPT tests.

Signed-off-by: Martin Robinson <mrobinson@igalia.com>
2026-04-01 11:41:28 +00:00
eri
c121a46c4f CODEOWNERS: Add atbrakhi and eerii as DevTools owners (#43832)
We have been working on the DevTools code and would like to have
notifications of new patches so we can review them.

Signed-off-by: eri <eri@igalia.com>
2026-04-01 10:10:54 +00:00
Shubham Gupta
ef3e8f7123 wpt: Add more tests to css-device-adapt to verify clamp behavior. (#43715)
Add more tests with values other than the defaults.
Earlier tests were curated around the default value, so they could not
reveal whether anything was wrong.

Testing: Add more tests and update test expectations. The tests fail right
now; they will be dealt with in subsequent patches, as a proper fix
requires two PRs.

---------

Signed-off-by: Shubham Gupta <shubham.gupta@chromium.org>
2026-04-01 10:08:16 +00:00
shuppy
4041256e24 a11y: Miscellaneous cleanups to our accessibility code (#43772)
this patch cleans up a few minor issues in our accessibility code:

- we remove the initial synchronous TreeUpdate in
[WebView::set_accessibility_active()](https://doc.servo.org/servo/struct.WebView.html#method.set_accessibility_active).
it’s not actually necessary, because AccessKit only requires that
subtree updates happen after the graft node is created in the parent,
but those subtree updates can be delayed indefinitely. removing it
simplifies our API a bit.

- we rename notify_accessibility_tree_id() to
notify_document_accessibility_tree_id(), and do the same to the
underlying ConstellationToEmbedderMsg variant. this helps clarify that
we’re referring to the (active top-level) document’s accessibility tree.

- we make that method pub(crate) too. it doesn’t need to be pub.

- we remove the label property from webview-to-pipeline graft nodes.
properties set on graft nodes are only visible in the accesskit_consumer
API, not in the platform accessibility API, and we don’t need it in our
tests, so there’s no longer any reason to keep setting it.

Testing: updated the relevant libservo accessibility tests
Fixes: part of #4344

Signed-off-by: delan azabani <dazabani@igalia.com>
Co-authored-by: Alice Boxhall <alice@igalia.com>
2026-04-01 09:58:05 +00:00
shuppy
805519deac servoshell: Activate accessibility in all webviews (#43558)
this patch plumbs the webview accessibility trees (#43029, #43556) into
servoshell. we add a global flag in servoshell, which is set when the
platform activates accessibility and cleared when the platform
deactivates accessibility. the flag in turn [activates
accessibility](https://doc.servo.org/servo/struct.WebView.html#method.set_accessibility_active)
in existing and new webviews.

Testing: none in this patch, but will be covered by end-to-end platform
a11y tests in WPT
Fixes: part of #4344, extracted from our work in #42338

Signed-off-by: delan azabani <dazabani@igalia.com>
Co-authored-by: Luke Warlow <lwarlow@igalia.com>
Co-authored-by: Alice Boxhall <alice@igalia.com>
2026-04-01 09:46:48 +00:00
eri
fdadd1d31d devtools: Improve encoding of ObjectActor (#43791)
Remove a duplicate `ObjectPreview` and use `ObjectActor::encode` for
serialization.

Testing: Ran mach test-devtools and manual tests.
Part of: #36027

Signed-off-by: eri <eri@igalia.com>
Co-authored-by: atbrakhi <atbrakhi@igalia.com>
2026-04-01 09:45:18 +00:00
1160 changed files with 27470 additions and 10959 deletions


@@ -1,6 +1,7 @@
[profile.default]
# Print a slow warning after period, terminate unit-test after 4x period.
slow-timeout = { period = "5s", terminate-after = 4 }
status-level = "leak"
[profile.ci]
fail-fast = false

.github/CODEOWNERS

@@ -39,6 +39,10 @@
# Reviewers for XPath related code
/components/xpath @simonwuelker
# Reviewers for DevTools
/components/devtools @atbrakhi @eerii
/components/shared/devtools @atbrakhi @eerii
# Reviewers for CI related code
/.github/workflows @sagudev
/.github/actions @sagudev


@@ -9,6 +9,9 @@ jobs:
# Run job only on servo/servo
if: github.repository == 'servo/servo'
runs-on: ubuntu-latest
environment:
name: book-sync
deployment: false
steps:
- name: Check out Servo
uses: actions/checkout@v6


@@ -28,6 +28,9 @@ jobs:
if: github.repository == 'servo/servo' || github.event_name == 'workflow_dispatch'
name: Create Draft GH Release
runs-on: ubuntu-latest
environment: &publish-environment
name: publish
deployment: false
steps:
- id: create-release
run: |
@@ -66,9 +69,10 @@ jobs:
&& (inputs.regular_release || false) == false
name: Publish GH Release
runs-on: ubuntu-latest
environment: *publish-environment
steps:
- name: Publish as latest (success)
if: ${{ !contains(needs.*.result, 'failure') && !contains(needs.*.result, 'cancelled') }}
if: ${{ !contains(needs.*.result, 'failure') && (!contains(needs.*.result, 'cancelled') && !cancelled()) }}
run: |
gh api \
--method PATCH \
@@ -77,7 +81,7 @@ jobs:
/repos/${RELEASE_REPO}/releases/${RELEASE_ID} \
-F draft=false
- name: Publish as latest (failure)
if: ${{ contains(needs.*.result, 'failure') || contains(needs.*.result, 'cancelled') }}
if: ${{ contains(needs.*.result, 'failure') || (contains(needs.*.result, 'cancelled') || cancelled()) }}
run: |
gh api \
--method PATCH \
@@ -117,6 +121,9 @@ jobs:
contents: write
id-token: write
attestations: write
environment:
name: publish
deployment: false
env:
ARTIFACT_BASENAME: "servo-${{ needs.create-draft-release.outputs.release-tag }}-src-vendored"
ARTIFACT_FILENAME: "servo-${{ needs.create-draft-release.outputs.release-tag }}-src-vendored.tar.gz"
@@ -146,40 +153,75 @@ jobs:
if: |
(github.repository == 'servo/servo' || github.event_name == 'workflow_dispatch')
&& (inputs.regular_release || false) == false
permissions:
runs-on: ubuntu-latest
permissions: &nightly-upload-permissions
id-token: write
attestations: write
environment: *publish-environment
needs:
- create-draft-release
- build-win
secrets:
github_upload_token: ${{ secrets.NIGHTLY_REPO_TOKEN }}
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-win.outputs.artifact_ids }}
artifact_platform: windows-msvc
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-win.outputs.artifact_ids }}
ARTIFACT_PLATFORM: windows-msvc
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo-nightly-builds
RELEASE_REPO_TOKEN: ${{ secrets.NIGHTLY_REPO_TOKEN }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: &upload-release-steps
- uses: actions/checkout@v6
with:
sparse-checkout: |
.github
etc/ci
fetch-depth: 1
- name: Setup Python
uses: ./.github/actions/setup-python
- name: Validate artifact IDs
run: |
if [[ -z "${ARTIFACT_IDS}" ]]; then
echo "Error: No artifact IDs provided."
echo "Help: Check the build job's outputs.artifact_ids value."
echo "If you recently renamed the build job without updating the corresponding output reference,"
echo "that is likely the cause of this error."
exit 1
fi
- uses: actions/download-artifact@v8
with:
artifact-ids: ${{ env.ARTIFACT_IDS }}
merge-multiple: true
path: release-artifacts
- name: Generate artifact attestation
uses: actions/attest-build-provenance@v3
with:
subject-path: release-artifacts/*
- name: Upload release artifacts
run: |
./etc/ci/upload_nightly.py "${ARTIFACT_PLATFORM}" \
--secret-from-environment \
--github-release-id "${GITHUB_RELEASE_ID}" \
release-artifacts/*
upload-win-release:
if: github.event_name == 'workflow_dispatch' && inputs.regular_release
permissions:
runs-on: ubuntu-latest
permissions: &release-upload-permissions
id-token: write
attestations: write
# Necessary for the github token to upload artifacts to the release.
contents: write
environment: *publish-environment
needs:
- create-draft-release
- build-win
secrets:
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-win.outputs.artifact_ids }}
artifact_platform: windows-msvc
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-win.outputs.artifact_ids }}
ARTIFACT_PLATFORM: windows-msvc
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo
RELEASE_REPO_TOKEN: ${{ github.token }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
build-mac:
@@ -196,40 +238,37 @@ jobs:
if: |
(github.repository == 'servo/servo' || github.event_name == 'workflow_dispatch')
&& (inputs.regular_release || false) == false
permissions:
id-token: write
attestations: write
runs-on: ubuntu-latest
permissions: *nightly-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-mac
secrets:
github_upload_token: ${{ secrets.NIGHTLY_REPO_TOKEN }}
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-mac.outputs.artifact_ids }}
artifact_platform: mac
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-mac.outputs.artifact_ids }}
ARTIFACT_PLATFORM: mac
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo-nightly-builds
RELEASE_REPO_TOKEN: ${{ secrets.NIGHTLY_REPO_TOKEN }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
upload-mac-release:
if: github.event_name == 'workflow_dispatch' && inputs.regular_release
permissions:
id-token: write
attestations: write
# Necessary for the github token to upload artifacts to the release.
contents: write
runs-on: ubuntu-latest
permissions: *release-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-mac
secrets:
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-mac.outputs.artifact_ids }}
artifact_platform: mac
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-mac.outputs.artifact_ids }}
ARTIFACT_PLATFORM: mac
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo
RELEASE_REPO_TOKEN: ${{ github.token }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
build-mac-arm64:
# This job is only useful when run on upstream servo.
@@ -245,40 +284,37 @@ jobs:
if: |
(github.repository == 'servo/servo' || github.event_name == 'workflow_dispatch')
&& (inputs.regular_release || false) == false
permissions:
id-token: write
attestations: write
runs-on: ubuntu-latest
permissions: *nightly-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-mac-arm64
secrets:
github_upload_token: ${{ secrets.NIGHTLY_REPO_TOKEN }}
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-mac-arm64.outputs.artifact_ids }}
artifact_platform: mac-arm64
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-mac-arm64.outputs.artifact_ids }}
ARTIFACT_PLATFORM: mac-arm64
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo-nightly-builds
RELEASE_REPO_TOKEN: ${{ secrets.NIGHTLY_REPO_TOKEN }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
upload-mac-arm64-release:
if: github.event_name == 'workflow_dispatch' && inputs.regular_release
permissions:
id-token: write
attestations: write
# Necessary for the github token to upload artifacts to the release.
contents: write
runs-on: ubuntu-latest
permissions: *release-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-mac-arm64
secrets:
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-mac-arm64.outputs.artifact_ids }}
artifact_platform: mac-arm64
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-mac-arm64.outputs.artifact_ids }}
ARTIFACT_PLATFORM: mac-arm64
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo
RELEASE_REPO_TOKEN: ${{ github.token }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
build-linux:
# This job is only useful when run on upstream servo.
@@ -294,40 +330,37 @@ jobs:
if: |
(github.repository == 'servo/servo' || github.event_name == 'workflow_dispatch')
&& (inputs.regular_release || false) == false
permissions:
id-token: write
attestations: write
runs-on: ubuntu-latest
permissions: *nightly-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-linux
secrets:
github_upload_token: ${{ secrets.NIGHTLY_REPO_TOKEN }}
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-linux.outputs.artifact_ids }}
artifact_platform: linux
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-linux.outputs.artifact_ids }}
ARTIFACT_PLATFORM: linux
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo-nightly-builds
RELEASE_REPO_TOKEN: ${{ secrets.NIGHTLY_REPO_TOKEN }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
upload-linux-release:
if: github.event_name == 'workflow_dispatch' && inputs.regular_release
permissions:
id-token: write
attestations: write
# Necessary for the github token to upload artifacts to the release.
contents: write
runs-on: ubuntu-latest
permissions: *release-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-linux
secrets:
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-linux.outputs.artifact_ids }}
artifact_platform: linux
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-linux.outputs.artifact_ids }}
ARTIFACT_PLATFORM: linux
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo
RELEASE_REPO_TOKEN: ${{ github.token }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
build-android:
# This job is only useful when run on upstream servo.
@@ -343,40 +376,37 @@ jobs:
if: |
(github.repository == 'servo/servo' || github.event_name == 'workflow_dispatch')
&& (inputs.regular_release || false) == false
permissions:
id-token: write
attestations: write
runs-on: ubuntu-latest
permissions: *nightly-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-android
secrets:
github_upload_token: ${{ secrets.NIGHTLY_REPO_TOKEN }}
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-android.outputs.artifact_ids }}
artifact_platform: android
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-android.outputs.artifact_ids }}
ARTIFACT_PLATFORM: android
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo-nightly-builds
RELEASE_REPO_TOKEN: ${{ secrets.NIGHTLY_REPO_TOKEN }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
upload-android-release:
if: github.event_name == 'workflow_dispatch' && inputs.regular_release
permissions:
id-token: write
attestations: write
# Necessary for the github token to upload artifacts to the release.
contents: write
runs-on: ubuntu-latest
permissions: *release-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-android
secrets:
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-android.outputs.artifact_ids }}
artifact_platform: android
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-android.outputs.artifact_ids }}
ARTIFACT_PLATFORM: android
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo
RELEASE_REPO_TOKEN: ${{ github.token }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
build-ohos:
@@ -394,37 +424,34 @@ jobs:
if: |
(github.repository == 'servo/servo' || github.event_name == 'workflow_dispatch')
&& (inputs.regular_release || false) == false
permissions:
id-token: write
attestations: write
runs-on: ubuntu-latest
permissions: *nightly-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-ohos
secrets:
github_upload_token: ${{ secrets.NIGHTLY_REPO_TOKEN }}
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-ohos.outputs.artifact_ids }}
artifact_platform: ohos
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-ohos.outputs.artifact_ids }}
ARTIFACT_PLATFORM: ohos
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo-nightly-builds
RELEASE_REPO_TOKEN: ${{ secrets.NIGHTLY_REPO_TOKEN }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps
upload-ohos-release:
if: github.event_name == 'workflow_dispatch' && inputs.regular_release
permissions:
id-token: write
attestations: write
# Necessary for the github token to upload artifacts to the release.
contents: write
runs-on: ubuntu-latest
permissions: *release-upload-permissions
environment: *publish-environment
needs:
- create-draft-release
- build-ohos
secrets:
s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
uses: ./.github/workflows/upload_release.yml
with:
artifact_ids: ${{ needs.build-ohos.outputs.artifact_ids }}
artifact_platform: ohos
github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
target_repo: ${{ github.repository_owner }}/${{ inputs.regular_release && 'servo' || 'servo-nightly-builds' }}
env:
ARTIFACT_IDS: ${{ needs.build-ohos.outputs.artifact_ids }}
ARTIFACT_PLATFORM: ohos
GITHUB_RELEASE_ID: ${{ needs.create-draft-release.outputs.release-id }}
RELEASE_REPO: ${{ github.repository_owner }}/servo
RELEASE_REPO_TOKEN: ${{ github.token }}
S3_UPLOAD_CREDENTIALS: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
steps: *upload-release-steps


@@ -1,69 +0,0 @@
name: Upload and Attest Release Assets
on:
workflow_call:
inputs:
artifact_platform:
type: string
required: true
description: "The platform of the release artifacts to upload."
target_repo:
type: string
required: true
description: "The target repository owner and name (e.g. `servo/servo`) where the release will be created."
github_release_id:
type: string
required: true
description: "The ID of the GitHub release to which assets will be added."
artifact_ids:
required: true
type: string
description: "A comma-separated list of artifact IDs to upload."
secrets:
github_upload_token:
required: false
description: "A GitHub token with permission to upload release assets. If omitted, `github.token` will be used instead."
s3_upload_token:
required: true
description: "A token with permission to upload release artifacts to our S3 bucket."
jobs:
upload-artifact:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
with:
sparse-checkout: |
.github
etc/ci
fetch-depth: '1'
- name: Setup Python
uses: ./.github/actions/setup-python
- name: Validate artifact IDs
run: |
if [[ -z "${{ inputs.artifact_ids }}" ]]; then
echo "Error: No artifact IDs provided."
echo "Help: Check the calling workflow's outputs.artifact_ids parameter, usually created by a build workflow."
echo "If you recently renamed the build job without updating its 'outputs.artifact_ids'"
echo "parameter, that might be the cause of this error."
exit 1
fi
- uses: actions/download-artifact@v8
with:
artifact-ids: ${{ inputs.artifact_ids }}
merge-multiple: true
path: release-artifacts
- name: Generate artifact attestation
uses: actions/attest-build-provenance@v3
with:
subject-path: 'release-artifacts/*'
- name: Upload release artifacts
run: |
./etc/ci/upload_nightly.py ${{ inputs.artifact_platform }} \
--secret-from-environment \
--github-release-id ${{ inputs.github_release_id }} \
release-artifacts/*
env:
S3_UPLOAD_CREDENTIALS: ${{ secrets.s3_upload_token }}
RELEASE_REPO_TOKEN: ${{ secrets.github_upload_token || github.token }}
RELEASE_REPO: ${{ inputs.target_repo }}
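For context, before this change the publishing jobs invoked the reusable workflow above via `workflow_call`. A minimal sketch of such a caller job, assuming the secret names used by the jobs shown earlier in this diff, would look like:

```yaml
# Hypothetical caller of the (now removed) reusable upload workflow.
# Job and secret names are assumptions based on the jobs shown above.
upload-nightly:
  needs:
    - create-draft-release
    - build-ohos
  uses: ./.github/workflows/upload_release.yml
  with:
    artifact_ids: ${{ needs.build-ohos.outputs.artifact_ids }}
    artifact_platform: ohos
    github_release_id: ${{ needs.create-draft-release.outputs.release-id }}
    target_repo: ${{ github.repository_owner }}/servo-nightly-builds
  secrets:
    github_upload_token: ${{ secrets.NIGHTLY_REPO_TOKEN }}
    s3_upload_token: ${{ secrets.S3_UPLOAD_CREDENTIALS }}
```

This commit replaces that call site with YAML anchors (`*upload-release-steps`, `*publish-environment`) so the steps run inline under a protected GitHub environment instead of in a separate reusable workflow.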

Cargo.lock generated

@@ -331,7 +331,7 @@ version = "1.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "40c48f72fd53cd289104fc64099abca73db4166ad86ea0b4341abe65af83dadc"
dependencies = [
"windows-sys 0.60.2",
"windows-sys 0.61.2",
]
[[package]]
@@ -342,7 +342,7 @@ checksum = "291e6a250ff86cd4a820112fb8898808a366d8f9f58ce16d1f538353ad55747d"
dependencies = [
"anstyle",
"once_cell_polyfill",
"windows-sys 0.60.2",
"windows-sys 0.61.2",
]
[[package]]
@@ -383,15 +383,15 @@ dependencies = [
"objc2-foundation 0.3.2",
"parking_lot",
"percent-encoding",
"windows-sys 0.60.2",
"windows-sys 0.59.0",
"x11rb",
]
[[package]]
name = "arc-swap"
version = "1.9.0"
version = "1.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a07d1f37ff60921c83bdfc7407723bdefe89b44b98a9b772f225c8f9d67141a6"
checksum = "6a3a1fd6f75306b68087b831f025c712524bcb19aad54e557b1129cfa0a2b207"
dependencies = [
"rustversion",
]
@@ -522,7 +522,7 @@ dependencies = [
"futures-lite",
"parking",
"polling",
"rustix 1.1.2",
"rustix 1.1.4",
"slab",
"windows-sys 0.61.2",
]
@@ -553,7 +553,7 @@ dependencies = [
"cfg-if",
"event-listener",
"futures-lite",
"rustix 1.1.2",
"rustix 1.1.4",
]
[[package]]
@@ -569,9 +569,9 @@ dependencies = [
[[package]]
name = "async-signal"
version = "0.2.13"
version = "0.2.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "43c070bbf59cd3570b6b2dd54cd772527c7c3620fce8be898406dd3ed6adc64c"
checksum = "52b5aaafa020cf5053a01f2a60e8ff5dccf550f0f77ec54a4e47285ac2bab485"
dependencies = [
"async-io",
"async-lock",
@@ -579,7 +579,7 @@ dependencies = [
"cfg-if",
"futures-core",
"futures-io",
"rustix 1.1.2",
"rustix 1.1.4",
"signal-hook-registry",
"slab",
"windows-sys 0.61.2",
@@ -1126,9 +1126,9 @@ dependencies = [
[[package]]
name = "cc"
version = "1.2.58"
version = "1.2.59"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e1e928d4b69e3077709075a938a05ffbedfa53a84c8f766efbf8220bb1ff60e1"
checksum = "b7a4d3ec6524d28a329fc53654bbadc9bdd7b0431f5d65f1a56ffb28a1ee5283"
dependencies = [
"find-msvc-tools",
"jobserver",
@@ -2418,9 +2418,9 @@ checksum = "7360491ce676a36bf9bb3c56c1aa791658183a54d2744120f27285738d90465a"
[[package]]
name = "fastrand"
version = "2.3.0"
version = "2.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "37909eebbb50d72f9059c3b6d82c0463f2ff062c9e95845c43a6c9c0355411be"
checksum = "9f1f227452a390804cdb637b74a86990f2a7d7ba4b7d5693aac9b4dd6defd8d6"
[[package]]
name = "fdeflate"
@@ -2787,7 +2787,7 @@ version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1bd49230192a3797a9a4d6abe9b3eed6f7fa4c8a8a4947977c6f80025f92cbd8"
dependencies = [
"rustix 1.1.2",
"rustix 1.1.4",
"windows-link 0.2.1",
]
@@ -2934,9 +2934,9 @@ dependencies = [
[[package]]
name = "glib"
version = "0.22.3"
version = "0.22.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "039f93465ac17e6cb02d16f16572cd3e43a77e736d5ecc461e71b9c9c5c0569c"
checksum = "02856b71413e175be50eff37fe0aefa53e227054ee3d96196faee8a0823cac80"
dependencies = [
"bitflags 2.11.0",
"futures-channel",
@@ -3521,6 +3521,8 @@ version = "0.16.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5419bdc4f6a9207fbeba6d11b604d481addf78ecd10c11ad51e76c2f6482748d"
dependencies = [
"allocator-api2",
"equivalent",
"foldhash 0.2.0",
]
@@ -3793,9 +3795,9 @@ dependencies = [
[[package]]
name = "hyper"
version = "1.8.1"
version = "1.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2ab2d4f250c3d7b1c9fcdff1cece94ea4e2dfbec68614f7b87cb205f24ca9d11"
checksum = "6299f016b246a94207e63da54dbe807655bf9e00044f73ded42c3ac5305fbcca"
dependencies = [
"atomic-waker",
"bytes",
@@ -3808,7 +3810,6 @@ dependencies = [
"httpdate",
"itoa",
"pin-project-lite",
"pin-utils",
"smallvec",
"tokio",
"want",
@@ -3821,7 +3822,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e3c93eb611681b207e1fe55d5a71ecf91572ec8a6705cdb6857f7d8d5242cf58"
dependencies = [
"http 1.4.0",
"hyper 1.8.1",
"hyper 1.9.0",
"hyper-util",
"log",
"rustls",
@@ -3844,12 +3845,12 @@ dependencies = [
"futures-util",
"http 1.4.0",
"http-body 1.0.1",
"hyper 1.8.1",
"hyper 1.9.0",
"ipnet",
"libc",
"percent-encoding",
"pin-project-lite",
"socket2 0.6.1",
"socket2 0.5.10",
"tokio",
"tower-service",
"tracing",
@@ -4726,9 +4727,9 @@ checksum = "09edd9e8b54e49e587e4f6295a7d29c3ea94d469cb40ab8ca70b288248a81db2"
[[package]]
name = "libc"
version = "0.2.183"
version = "0.2.184"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b5b646652bf6661599e1da8901b3b9522896f01e736bad5f723fe7a3a27f899d"
checksum = "48f5d2a454e16a5ea0f4ced81bd44e4cfc7bd3a507b61887c99fd3538b28e4af"
[[package]]
name = "libdbus-sys"
@@ -4777,9 +4778,9 @@ checksum = "b6d2cec3eae94f9f509c767b45932f1ada8350c4bdb85af2fcab4a3c14807981"
[[package]]
name = "libredox"
version = "0.1.15"
version = "0.1.16"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7ddbf48fd451246b1f8c2610bd3b4ac0cc6e149d89832867093ab69a17194f08"
checksum = "e02f3bb43d335493c96bf3fd3a321600bf6bd07ed34bc64118e9293bdffea46c"
dependencies = [
"bitflags 2.11.0",
"libc",
@@ -4810,9 +4811,9 @@ dependencies = [
[[package]]
name = "libz-sys"
version = "1.1.25"
version = "1.1.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d52f4c29e2a68ac30c9087e1b772dc9f44a2b66ed44edf2266cf2be9b03dafc1"
checksum = "fc3a226e576f50782b3305c5ccf458698f92798987f551c6a02efe8276721e22"
dependencies = [
"cc",
"libc",
@@ -4834,9 +4835,9 @@ checksum = "d26c52dbd32dccf2d10cac7725f8eae5296885fb5703b261f7d0a0739ec807ab"
[[package]]
name = "linux-raw-sys"
version = "0.11.0"
version = "0.12.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "df1d3c3b53da64cf5760482273a98e575c651a67eec7f77df96b5b642de8f039"
checksum = "32a66949e030da00e8c7d4434b251670a91556f4144941d37452769c25d58a53"
[[package]]
name = "litemap"
@@ -4875,6 +4876,15 @@ dependencies = [
"imgref",
]
[[package]]
name = "lru"
version = "0.16.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1dc47f592c06f33f8e3aea9591776ec7c9f9e4124778ff8a3c3b87159f7e593"
dependencies = [
"hashbrown 0.16.0",
]
[[package]]
name = "mach2"
version = "0.6.0"
@@ -5078,9 +5088,9 @@ dependencies = [
[[package]]
name = "mozjs"
version = "0.15.7"
version = "0.15.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "226edc79aa3f990a61330d4011471910273f6ee6bbfd0dcb93b18f5a03505f6b"
checksum = "d4e51874bd557fcc5809a3e133714998d2b856ab29aa59fba79f3398f0537e48"
dependencies = [
"bindgen",
"cc",
@@ -5093,9 +5103,9 @@ dependencies = [
[[package]]
name = "mozjs_sys"
version = "0.140.8-2"
version = "0.140.8-3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4b50224a02c96e1e703f7d055377431333dfc5cd3a09ccf03f8a2c8a156c7af8"
checksum = "f70e26e45204d1cbd73d2508007ab5cbb3411bddb057c47977bae6844861a8fa"
dependencies = [
"bindgen",
"cc",
@@ -5870,6 +5880,15 @@ version = "0.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "060ca76c500c8ffde25a89724d3476d4faca3b55fa3fe02cd8a3607e95b0861d"
[[package]]
name = "ohos-media-sys"
version = "0.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6384d6e3befdaaec0206772e05a6b9222188e8ce023f6a164ca96e1091ef2e87"
dependencies = [
"ohos-sys-opaque-types",
]
[[package]]
name = "ohos-sys-opaque-types"
version = "0.1.9"
@@ -5900,6 +5919,15 @@ dependencies = [
"ohos-sys-opaque-types",
]
[[package]]
name = "ohos-window-sys"
version = "0.1.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f801f5de727bd01cfdcd6e9b0de9d6d3e674e0ac73ea7ee202c0bcd75fc1daf7"
dependencies = [
"ohos-sys-opaque-types",
]
[[package]]
name = "once_cell"
version = "1.21.4"
@@ -6245,12 +6273,6 @@ version = "0.2.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a89322df9ebe1c1578d689c92318e070967d1042b512afbe49518723f4e6d5cd"
[[package]]
name = "pin-utils"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8b870d8c151b6f2fb93e84a13146138f05d02ed11c7e7c54f8826aaaf7c9f184"
[[package]]
name = "piper"
version = "0.2.5"
@@ -6358,7 +6380,7 @@ dependencies = [
"concurrent-queue",
"hermit-abi",
"pin-project-lite",
"rustix 1.1.2",
"rustix 1.1.4",
"windows-sys 0.61.2",
]
@@ -6719,6 +6741,12 @@ version = "0.1.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ca45419789ae5a7899559e9512e58ca889e41f04f1f2445e9f4b290ceccd1d08"
[[package]]
name = "rangemap"
version = "1.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "973443cf09a9c8656b574a866ab68dfa19f0867d0340648c7d2f6a71b8a8ea68"
[[package]]
name = "rav1e"
version = "0.7.1"
@@ -7034,14 +7062,14 @@ dependencies = [
[[package]]
name = "rustix"
version = "1.1.2"
version = "1.1.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd15f8a2c5551a84d56efdc1cd049089e409ac19a3072d5037a17fd70719ff3e"
checksum = "b6fe4565b9518b83ef4f91bb47ce29620ca828bd32cb7e408f0062e9930ba190"
dependencies = [
"bitflags 2.11.0",
"errno",
"libc",
"linux-raw-sys 0.11.0",
"linux-raw-sys 0.12.1",
"windows-sys 0.52.0",
]
@@ -7278,8 +7306,8 @@ dependencies = [
[[package]]
name = "selectors"
version = "0.36.1"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.37.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"bitflags 2.11.0",
"cssparser",
@@ -7298,9 +7326,9 @@ dependencies = [
[[package]]
name = "semver"
version = "1.0.27"
version = "1.0.28"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d767eb0aabc880b29956c35734170f26ed551a859dbd361d140cdbeca61ab1e2"
checksum = "8a7852d02fc848982e0c167ef163aaff9cd91dc640ba85e263cb1ce46fae51cd"
[[package]]
name = "serde"
@@ -7426,9 +7454,10 @@ dependencies = [
"gstreamer",
"http 1.4.0",
"http-body-util",
"hyper 1.8.1",
"hyper 1.9.0",
"image",
"ipc-channel",
"itertools 0.14.0",
"keyboard-types",
"log",
"mozangle",
@@ -7456,6 +7485,7 @@ dependencies = [
"servo-media",
"servo-media-dummy",
"servo-media-gstreamer",
"servo-media-ohos",
"servo-media-thread",
"servo-net",
"servo-net-traits",
@@ -7648,6 +7678,7 @@ version = "0.1.0"
dependencies = [
"accesskit",
"backtrace",
"base64 0.22.1",
"content-security-policy",
"crossbeam-channel",
"euclid",
@@ -7692,6 +7723,7 @@ dependencies = [
name = "servo-constellation-traits"
version = "0.1.0"
dependencies = [
"base64 0.22.1",
"content-security-policy",
"encoding_rs",
"euclid",
@@ -7914,12 +7946,12 @@ dependencies = [
[[package]]
name = "servo-hyper-serde"
version = "0.13.2"
version = "0.1.0"
dependencies = [
"cookie 0.18.1",
"headers 0.4.1",
"http 1.4.0",
"hyper 1.8.1",
"hyper 1.9.0",
"mime",
"serde",
"serde_bytes",
@@ -8031,9 +8063,11 @@ dependencies = [
"atomic_refcell",
"content-security-policy",
"crossbeam-channel",
"data-url",
"encoding_rs",
"euclid",
"http 1.4.0",
"icu_locid",
"indexmap",
"ipc-channel",
"keyboard-types",
@@ -8207,6 +8241,31 @@ dependencies = [
"servo-media-player",
]
[[package]]
name = "servo-media-ohos"
version = "0.1.0"
dependencies = [
"crossbeam-channel",
"ipc-channel",
"libc",
"log",
"lru",
"mime",
"ohos-media-sys",
"ohos-sys-opaque-types",
"ohos-window-sys",
"once_cell",
"rangemap",
"serde_json",
"servo-media",
"servo-media-audio",
"servo-media-player",
"servo-media-streams",
"servo-media-traits",
"servo-media-webrtc",
"yuv",
]
[[package]]
name = "servo-media-player"
version = "0.1.0"
@@ -8300,7 +8359,7 @@ dependencies = [
"headers 0.4.1",
"http 1.4.0",
"http-body-util",
"hyper 1.8.1",
"hyper 1.9.0",
"hyper-rustls",
"hyper-util",
"imsz",
@@ -8926,7 +8985,7 @@ dependencies = [
[[package]]
name = "servo_arc"
version = "0.4.3"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"serde",
"stable_deref_trait",
@@ -8981,7 +9040,7 @@ dependencies = [
"servo-allocator",
"servo-base",
"servo-webdriver-server",
"sig",
"signal-hook-registry",
"surfman",
"tokio",
"tracing",
@@ -9047,15 +9106,6 @@ version = "1.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0fda2ff0d084019ba4d7c6f371c95d8fd75ce3524c3cb8fb653a3023f6323e64"
[[package]]
name = "sig"
version = "1.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6567e29578f9bfade6a5d94a32b9a4256348358d2a3f448cab0021f9a02614a2"
dependencies = [
"libc",
]
[[package]]
name = "signal-hook-registry"
version = "1.4.8"
@@ -9203,12 +9253,12 @@ dependencies = [
[[package]]
name = "socket2"
version = "0.6.1"
version = "0.6.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "17129e116933cf371d018bb80ae557e889637989d8638274fb25622827b03881"
checksum = "3a766e1110788c36f4fa1c2b71b387a7815aa65f88ce0229841826633d93723e"
dependencies = [
"libc",
"windows-sys 0.60.2",
"windows-sys 0.61.2",
]
[[package]]
@@ -9345,8 +9395,8 @@ dependencies = [
[[package]]
name = "stylo"
version = "0.14.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.16.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"app_units",
"arrayvec",
@@ -9401,8 +9451,8 @@ dependencies = [
[[package]]
name = "stylo_atoms"
version = "0.14.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.16.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"string_cache",
"string_cache_codegen",
@@ -9410,8 +9460,8 @@ dependencies = [
[[package]]
name = "stylo_derive"
version = "0.14.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.16.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"darling",
"proc-macro2",
@@ -9422,8 +9472,8 @@ dependencies = [
[[package]]
name = "stylo_dom"
version = "0.14.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.16.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"bitflags 2.11.0",
"stylo_malloc_size_of",
@@ -9431,8 +9481,8 @@ dependencies = [
[[package]]
name = "stylo_malloc_size_of"
version = "0.14.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.16.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"app_units",
"cssparser",
@@ -9448,13 +9498,13 @@ dependencies = [
[[package]]
name = "stylo_static_prefs"
version = "0.14.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.16.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
[[package]]
name = "stylo_traits"
version = "0.14.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
version = "0.16.0"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"app_units",
"bitflags 2.11.0",
@@ -9649,7 +9699,7 @@ dependencies = [
"fastrand",
"getrandom 0.3.4",
"once_cell",
"rustix 1.1.2",
"rustix 1.1.4",
"windows-sys 0.52.0",
]
@@ -9685,9 +9735,9 @@ dependencies = [
[[package]]
name = "thin-vec"
version = "0.2.14"
version = "0.2.15"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "144f754d318415ac792f9d69fc87abbbfc043ce2ef041c60f16ad828f638717d"
checksum = "da322882471314edc77fa5232c587bcb87c9df52bfd0d7d4826f8868ead61899"
[[package]]
name = "thiserror"
@@ -9876,7 +9926,7 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "to_shmem"
version = "0.3.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"cssparser",
"servo_arc",
@@ -9889,7 +9939,7 @@ dependencies = [
[[package]]
name = "to_shmem_derive"
version = "0.1.0"
source = "git+https://github.com/servo/stylo?rev=8557228b96c0e343764953e72a62ea503baf01b3#8557228b96c0e343764953e72a62ea503baf01b3"
source = "git+https://github.com/servo/stylo?rev=ddf2109bdfff62c83a14e3a3c7dc1c6130653283#ddf2109bdfff62c83a14e3a3c7dc1c6130653283"
dependencies = [
"darling",
"proc-macro2",
@@ -9900,24 +9950,24 @@ dependencies = [
[[package]]
name = "tokio"
version = "1.50.0"
version = "1.51.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "27ad5e34374e03cfffefc301becb44e9dc3c17584f414349ebe29ed26661822d"
checksum = "f66bf9585cda4b724d3e78ab34b73fb2bbaba9011b9bfdf69dc836382ea13b8c"
dependencies = [
"bytes",
"libc",
"mio",
"pin-project-lite",
"socket2 0.6.1",
"socket2 0.6.3",
"tokio-macros",
"windows-sys 0.61.2",
]
[[package]]
name = "tokio-macros"
version = "2.6.1"
version = "2.7.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c55a2eff8b69ce66c84f85e1da1c233edc36ceb85a2058d11b0d6a3c7e7569c"
checksum = "385a6cb71ab9ab790c5fe8d67f1645e6c450a7ce006a33de03daa956cf70a496"
dependencies = [
"proc-macro2",
"quote",
@@ -10193,7 +10243,7 @@ checksum = "f2f6fb2847f6742cd76af783a2a2c49e9375d0a111c7bef6f71cd9e738c72d6e"
dependencies = [
"memoffset",
"tempfile",
"windows-sys 0.60.2",
"windows-sys 0.61.2",
]
[[package]]
@@ -10717,7 +10767,7 @@ checksum = "2857dd20b54e916ec7253b3d6b4d5c4d7d4ca2c33c2e11c6c76a99bd8744755d"
dependencies = [
"cc",
"downcast-rs",
"rustix 1.1.2",
"rustix 1.1.4",
"scoped-tls",
"smallvec",
"wayland-sys",
@@ -10725,12 +10775,12 @@ dependencies = [
[[package]]
name = "wayland-client"
version = "0.31.13"
version = "0.31.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ab51d9f7c071abeee76007e2b742499e535148035bb835f97aaed1338cf516c3"
checksum = "645c7c96bb74690c3189b5c9cb4ca1627062bb23693a4fad9d8c3de958260144"
dependencies = [
"bitflags 2.11.0",
"rustix 1.1.2",
"rustix 1.1.4",
"wayland-backend",
"wayland-scanner",
]
@@ -10748,20 +10798,20 @@ dependencies = [
[[package]]
name = "wayland-cursor"
version = "0.31.13"
version = "0.31.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4b3298683470fbdc6ca40151dfc48c8f2fd4c41a26e13042f801f85002384091"
checksum = "4a52d18780be9b1314328a3de5f930b73d2200112e3849ca6cb11822793fb34d"
dependencies = [
"rustix 1.1.2",
"rustix 1.1.4",
"wayland-client",
"xcursor",
]
[[package]]
name = "wayland-protocols"
version = "0.32.11"
version = "0.32.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b23b5df31ceff1328f06ac607591d5ba360cf58f90c8fad4ac8d3a55a3c4aec7"
checksum = "563a85523cade2429938e790815fd7319062103b9f4a2dc806e9b53b95982d8f"
dependencies = [
"bitflags 2.11.0",
"wayland-backend",
@@ -10771,9 +10821,9 @@ dependencies = [
[[package]]
name = "wayland-protocols-plasma"
version = "0.3.11"
version = "0.3.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d392fc283a87774afc9beefcd6f931582bb97fe0e6ced0b306a62cb1d026527c"
checksum = "2b6d8cf1eb2c1c31ed1f5643c88a6e53538129d4af80030c8cabd1f9fa884d91"
dependencies = [
"bitflags 2.11.0",
"wayland-backend",
@@ -10784,9 +10834,9 @@ dependencies = [
[[package]]
name = "wayland-protocols-wlr"
version = "0.3.11"
version = "0.3.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "78248e4cc0eff8163370ba5c158630dcae1f3497a586b826eca2ef5f348d6235"
checksum = "eb04e52f7836d7c7976c78ca0250d61e33873c34156a2a1fc9474828ec268234"
dependencies = [
"bitflags 2.11.0",
"wayland-backend",
@@ -11418,15 +11468,6 @@ dependencies = [
"windows-targets 0.52.6",
]
[[package]]
name = "windows-sys"
version = "0.60.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f2f500e4d28234f72040990ec9d39e3a6b950f9f22d3dba18416c35882612bcb"
dependencies = [
"windows-targets 0.53.5",
]
[[package]]
name = "windows-sys"
version = "0.61.2"
@@ -11460,30 +11501,13 @@ dependencies = [
"windows_aarch64_gnullvm 0.52.6",
"windows_aarch64_msvc 0.52.6",
"windows_i686_gnu 0.52.6",
"windows_i686_gnullvm 0.52.6",
"windows_i686_gnullvm",
"windows_i686_msvc 0.52.6",
"windows_x86_64_gnu 0.52.6",
"windows_x86_64_gnullvm 0.52.6",
"windows_x86_64_msvc 0.52.6",
]
[[package]]
name = "windows-targets"
version = "0.53.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4945f9f551b88e0d65f3db0bc25c33b8acea4d9e41163edf90dcd0b19f9069f3"
dependencies = [
"windows-link 0.2.1",
"windows_aarch64_gnullvm 0.53.1",
"windows_aarch64_msvc 0.53.1",
"windows_i686_gnu 0.53.1",
"windows_i686_gnullvm 0.53.1",
"windows_i686_msvc 0.53.1",
"windows_x86_64_gnu 0.53.1",
"windows_x86_64_gnullvm 0.53.1",
"windows_x86_64_msvc 0.53.1",
]
[[package]]
name = "windows-threading"
version = "0.1.0"
@@ -11514,12 +11538,6 @@ version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32a4622180e7a0ec044bb555404c800bc9fd9ec262ec147edd5989ccd0c02cd3"
[[package]]
name = "windows_aarch64_gnullvm"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9d8416fa8b42f5c947f8482c43e7d89e73a173cead56d044f6a56104a6d1b53"
[[package]]
name = "windows_aarch64_msvc"
version = "0.42.2"
@@ -11532,12 +11550,6 @@ version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09ec2a7bb152e2252b53fa7803150007879548bc709c039df7627cabbd05d469"
[[package]]
name = "windows_aarch64_msvc"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b9d782e804c2f632e395708e99a94275910eb9100b2114651e04744e9b125006"
[[package]]
name = "windows_i686_gnu"
version = "0.42.2"
@@ -11550,24 +11562,12 @@ version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8e9b5ad5ab802e97eb8e295ac6720e509ee4c243f69d781394014ebfe8bbfa0b"
[[package]]
name = "windows_i686_gnu"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "960e6da069d81e09becb0ca57a65220ddff016ff2d6af6a223cf372a506593a3"
[[package]]
name = "windows_i686_gnullvm"
version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0eee52d38c090b3caa76c563b86c3a4bd71ef1a819287c19d586d7334ae8ed66"
[[package]]
name = "windows_i686_gnullvm"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fa7359d10048f68ab8b09fa71c3daccfb0e9b559aed648a8f95469c27057180c"
[[package]]
name = "windows_i686_msvc"
version = "0.42.2"
@@ -11580,12 +11580,6 @@ version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "240948bc05c5e7c6dabba28bf89d89ffce3e303022809e73deaefe4f6ec56c66"
[[package]]
name = "windows_i686_msvc"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1e7ac75179f18232fe9c285163565a57ef8d3c89254a30685b57d83a38d326c2"
[[package]]
name = "windows_x86_64_gnu"
version = "0.42.2"
@@ -11598,12 +11592,6 @@ version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "147a5c80aabfbf0c7d901cb5895d1de30ef2907eb21fbbab29ca94c5b08b1a78"
[[package]]
name = "windows_x86_64_gnu"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9c3842cdd74a865a8066ab39c8a7a473c0778a3f29370b5fd6b4b9aa7df4a499"
[[package]]
name = "windows_x86_64_gnullvm"
version = "0.42.2"
@@ -11616,12 +11604,6 @@ version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "24d5b23dc417412679681396f2b49f3de8c1473deb516bd34410872eff51ed0d"
[[package]]
name = "windows_x86_64_gnullvm"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0ffa179e2d07eee8ad8f57493436566c7cc30ac536a3379fdf008f47f6bb7ae1"
[[package]]
name = "windows_x86_64_msvc"
version = "0.42.2"
@@ -11634,12 +11616,6 @@ version = "0.52.6"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "589f6da84c646204747d1270a2a5661ea66ed1cced2631d546fdfb155959f9ec"
[[package]]
name = "windows_x86_64_msvc"
version = "0.53.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d6bbff5f0aada427a1e5a6da5f1f98158182f26556f345ac9e04d36d0ebed650"
[[package]]
name = "winit"
version = "0.30.13"
@@ -11881,7 +11857,7 @@ dependencies = [
"libc",
"libloading 0.8.9",
"once_cell",
"rustix 1.1.2",
"rustix 1.1.4",
"x11rb-protocol",
]
@@ -11910,7 +11886,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "32e45ad4206f6d2479085147f02bc2ef834ac85886624a23575ae137c8aa8156"
dependencies = [
"libc",
"rustix 1.1.2",
"rustix 1.1.4",
]
[[package]]
@@ -12018,6 +11994,15 @@ dependencies = [
"synstructure",
]
[[package]]
name = "yuv"
version = "0.8.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "47d3a7e2cda3061858987ee2fb028f61695f5ee13f9490d75be6c3900df9a4ea"
dependencies = [
"num-traits",
]
[[package]]
name = "zbus"
version = "5.14.0"
@@ -12040,7 +12025,7 @@ dependencies = [
"hex 0.4.3",
"libc",
"ordered-stream",
"rustix 1.1.2",
"rustix 1.1.4",
"serde",
"serde_repr",
"tracing",
@@ -12149,18 +12134,18 @@ dependencies = [
[[package]]
name = "zerofrom"
version = "0.1.6"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "50cc42e0333e05660c3587f3bf9d0478688e15d870fab3346451ce7f8c9fbea5"
checksum = "69faa1f2a1ea75661980b013019ed6687ed0e83d069bc1114e2cc74c6c04c4df"
dependencies = [
"zerofrom-derive",
]
[[package]]
name = "zerofrom-derive"
version = "0.1.6"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d71e5d6e06ab090c67b5e44993ec16b72dcbaabc526db883a360057678b48502"
checksum = "11532158c46691caf0f2593ea8358fed6bbf68a0315e80aae9bd41fbade684a1"
dependencies = [
"proc-macro2",
"quote",


@@ -97,7 +97,7 @@ hkdf = "0.12"
html5ever = "0.39"
http = "1.4"
http-body-util = "0.1"
hyper = "1.8"
hyper = "1.9"
hyper-rustls = { version = "0.27", default-features = false, features = ["http1", "http2", "logging", "tls12", "webpki-tokio"] }
hyper-util = { version = "0.1", features = ["client-legacy", "http2", "tokio", "client-proxy"] }
icu_locid = "1.5.0"
@@ -109,7 +109,7 @@ indexmap = { version = "2.11.4", features = ["std"] }
inventory = { version = "0.3.24" }
ipc-channel = "0.21"
itertools = "0.14"
js = { package = "mozjs", version = "=0.15.7", default-features = false, features = ["libz-sys", "intl"] }
js = { package = "mozjs", version = "=0.15.8", default-features = false, features = ["libz-sys", "intl"] }
keyboard-types = { version = "0.8.3", features = ["serde", "webdriver"] }
kurbo = { version = "0.12", features = ["euclid"] }
libc = "0.2"
@@ -161,12 +161,12 @@ rustls-platform-verifier = "0.6.2"
sea-query = { version = "1.0.0-rc.31", default-features = false, features = ["backend-sqlite", "derive"] }
sea-query-rusqlite = { version = "0.8.0-rc.15" }
sec1 = "0.7"
selectors = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
selectors = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
serde = "1.0.228"
serde_bytes = "0.11"
serde_core = "1.0.226"
serde_json = "1.0"
servo_arc = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
servo_arc = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
sha1 = "0.10"
sha2 = "0.10"
sha3 = "0.10"
@@ -174,12 +174,12 @@ skrifa = "0.37.0"
smallvec = { version = "1.15", features = ["serde", "union"] }
string_cache = "0.9"
strum = { version = "0.28", features = ["derive"] }
stylo = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
stylo_atoms = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
stylo_dom = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
stylo_malloc_size_of = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
stylo_static_prefs = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
stylo_traits = { git = "https://github.com/servo/stylo", rev = "8557228b96c0e343764953e72a62ea503baf01b3" }
stylo = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
stylo_atoms = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
stylo_dom = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
stylo_malloc_size_of = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
stylo_static_prefs = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
stylo_traits = { git = "https://github.com/servo/stylo", rev = "ddf2109bdfff62c83a14e3a3c7dc1c6130653283" }
surfman = { version = "0.11.0", features = ["chains"] }
syn = { version = "2", default-features = false, features = ["clone-impls", "derive", "parsing"] }
synstructure = "0.13"
@@ -237,6 +237,7 @@ dom_struct = { package = "servo-dom-struct", version = "0.1.0", path = "componen
embedder_traits = { package = "servo-embedder-traits", version = "0.1.0", path = "components/shared/embedder" }
fonts = { package = "servo-fonts", version = "0.1.0", path = "components/fonts" }
fonts_traits = { package = "servo-fonts-traits", version = "0.1.0", path = "components/shared/fonts" }
hyper_serde = { package = "servo-hyper-serde", version = "0.1.0", path = "components/hyper_serde" }
jstraceable_derive = { package = "servo-jstraceable-derive", version = "0.1.0", path = "components/jstraceable_derive" }
layout = { package = "servo-layout", version = "0.1.0", path = "components/layout" }
layout_api = { package = "servo-layout-api", version = "0.1.0", path = "components/shared/layout" }
@@ -277,6 +278,7 @@ servo-media-gstreamer = { version = "0.1.0", path = "components/media/backends/g
servo-media-gstreamer-render = { version = "0.1.0", path = "components/media/backends/gstreamer/render" }
servo-media-gstreamer-render-android = { version = "0.1.0", path = "components/media/backends/gstreamer/render-android" }
servo-media-gstreamer-render-unix = { version = "0.1.0", path = "components/media/backends/gstreamer/render-unix" }
servo-media-ohos = { version = "0.1.0", path = "components/media/backends/ohos" }
servo-media-player = { version = "0.1.0", path = "components/media/player" }
servo-media-streams = { version = "0.1.0", path = "components/media/streams" }
servo-media-traits = { version = "0.1.0", path = "components/media/traits" }
@@ -295,9 +297,6 @@ webxr-api = { package = "servo-webxr-api", version = "0.1.0", path = "components
xpath = { package = "servo-xpath", version = "0.1.0", path = "components/xpath" }
# End workspace-version dependencies - Don't change this comment, we grep for it in scripts!
# Independently versioned workspace local crates. These crates will not take part in auto-bumps.
hyper_serde = { package = "servo-hyper-serde", version = "0.13.2", path = "components/hyper_serde" }
# RSA key generation could be very slow without compilation
# optimizations, in development mode. Without optimizations, WPT might
# consider RSA key generation tests fail due to timeout.

View File

@@ -318,7 +318,7 @@ impl<DrawTarget: GenericDrawTarget> CanvasData<DrawTarget> {
let (descriptor, data) = {
let _span =
profile_traits::trace_span!("image_descriptor_and_serializable_data",).entered();
profile_traits::trace_span!("image_descriptor_and_serializable_data").entered();
self.draw_target.image_descriptor_and_serializable_data()
};

View File

@@ -2,6 +2,12 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
//! This crate provides two mechanisms for configuring the behaviour of the Servo engine.
//! - The [`opts`] module exposes a set of global flags that are initialized once
//! and cannot be changed at runtime.
//! - The [`prefs`] module provides a mechanism to get and set global preference
//! values that can be changed at runtime.
#![deny(unsafe_code)]
pub mod opts;

View File

@@ -2,8 +2,8 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
//! Configuration options for a single run of the servo application. Created
//! from command line arguments.
//! Options are global configuration options that are initialized once and cannot be changed at
//! runtime.
use std::default::Default;
use std::path::PathBuf;
@@ -12,15 +12,14 @@ use std::sync::OnceLock;
use serde::{Deserialize, Serialize};
/// Global flags for Servo, currently set on the command line.
/// The set of global options supported by Servo. The values for these can be configured during
/// initialization of Servo and cannot be changed later at runtime.
#[derive(Clone, Debug, Deserialize, Serialize)]
pub struct Opts {
/// `None` to disable the time profiler or `Some` to enable it with:
/// `None` to disable the time profiler or `Some` to enable it with either:
///
/// - an interval in seconds to cause it to produce output on that interval.
/// (e.g. `-p 5`).
/// - a file path to write profiling info to a TSV file upon Servo's termination.
/// (e.g. `-p out.tsv`).
pub time_profiling: Option<OutputOptions>,
/// When the profiler is enabled, this is an optional path to dump a self-contained HTML file
@@ -79,10 +78,10 @@ pub struct Opts {
pub unminify_css: bool,
}
/// Debug options for Servo, currently set on the command line with -Z
/// Debug options for Servo.
#[derive(Clone, Debug, Default, Deserialize, Serialize)]
pub struct DiagnosticsLogging {
/// List all the debug options.
/// Print all the debug options supported by Servo to the standard output.
pub help: bool,
/// Print the DOM after each restyle.
@@ -202,9 +201,9 @@ impl DiagnosticsLogging {
}
}
/// The destination for the time profiler reports.
#[derive(Clone, Debug, Deserialize, PartialEq, Serialize)]
pub enum OutputOptions {
/// Database connection config (hostname, name, user, pass)
FileName(String),
Stdout(f64),
}
@@ -241,16 +240,16 @@ static OPTIONS: OnceLock<Opts> = OnceLock::new();
/// Initialize options.
///
/// Should only be called once at process startup.
/// Must be called before the first call to [get].
/// Must be called before the first call to [`get`].
pub fn initialize_options(opts: Opts) {
OPTIONS.set(opts).expect("Already initialized");
}
/// Get the servo options
///
/// If the servo options have not been initialized by calling [initialize_options], then the
/// options will be initialized to default values. Outside of tests the options should
/// be explicitly initialized.
/// If the servo options have not been initialized by calling [`initialize_options`], then the
/// options will be initialized to default values. Outside of tests the options should be
/// explicitly initialized.
#[inline]
pub fn get() -> &'static Opts {
// In unit-tests using default options reduces boilerplate.

View File

@@ -5,6 +5,7 @@
use serde::{Deserialize, Serialize};
use serde_json::Value;
/// The types of preference values in Servo.
#[derive(Clone, Debug, Deserialize, PartialEq, Serialize)]
pub enum PrefValue {
Float(f64),
@@ -16,6 +17,8 @@ pub enum PrefValue {
}
impl PrefValue {
/// Parse the `input` string as a preference value. Defaults to a `PrefValue::Str` if the input
/// cannot be parsed as valid value of one of the other types.
pub fn from_booleanish_str(input: &str) -> Self {
match input {
"false" => PrefValue::Bool(false),

View File
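The "booleanish" parsing documented above can be sketched as a standalone function. This is a hypothetical simplification, not Servo's actual `PrefValue` (which has additional variants); it only illustrates the documented fallback order of boolean, then numeric, then string:

```rust
// Hypothetical standalone sketch of the parsing described above;
// not Servo's actual PrefValue type.
#[derive(Debug, PartialEq)]
enum PrefValue {
    Bool(bool),
    Int(i64),
    Float(f64),
    Str(String),
}

fn from_booleanish_str(input: &str) -> PrefValue {
    match input {
        "false" => PrefValue::Bool(false),
        "true" => PrefValue::Bool(true),
        _ => input
            .parse::<i64>()
            .map(PrefValue::Int)
            .ok()
            .or_else(|| input.parse::<f64>().map(PrefValue::Float).ok())
            // Default to a string when no other type matches, as documented.
            .unwrap_or_else(|| PrefValue::Str(input.to_string())),
    }
}

fn main() {
    assert_eq!(from_booleanish_str("true"), PrefValue::Bool(true));
    assert_eq!(from_booleanish_str("42"), PrefValue::Int(42));
    assert_eq!(from_booleanish_str("1.5"), PrefValue::Float(1.5));
    assert_eq!(from_booleanish_str("serif"), PrefValue::Str("serif".into()));
}
```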

@@ -2,6 +2,8 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
//! Preferences are the global configuration options that can be changed at runtime.
use std::env::consts::ARCH;
use std::sync::{RwLock, RwLockReadGuard};
use std::time::Duration;
@@ -13,7 +15,12 @@ pub use crate::pref_util::PrefValue;
static PREFERENCES: RwLock<Preferences> = RwLock::new(Preferences::const_default());
/// A trait to be implemented by components that wish to be notified about runtime changes to the
/// global preferences for the current process.
pub trait PreferencesObserver: Send + Sync {
/// This method is called when the global preferences have been updated. The argument to the
/// method is an array of tuples where the first component is the name of the preference and
/// the second component is the new value of the preference.
fn prefs_changed(&self, _changes: &[(&'static str, PrefValue)]) {}
}
@@ -25,10 +32,13 @@ pub fn get() -> RwLockReadGuard<'static, Preferences> {
PREFERENCES.read().unwrap()
}
/// Subscribe to notifications about changes to the global preferences for the current process.
pub fn add_observer(observer: Box<dyn PreferencesObserver>) {
OBSERVERS.write().unwrap().push(observer);
}
/// Update the values of the global preferences for the current process. This also notifies the
/// observers previously added using [`add_observer`].
pub fn set(preferences: Preferences) {
// Map between Stylo preference names and Servo preference names. This should be
// kept in sync with components/script/dom/bindings/codegen/run.py, which generates the
@@ -72,6 +82,7 @@ macro_rules! pref {
};
}
/// The set of global preferences supported by Servo.
#[derive(Clone, Deserialize, Serialize, ServoPreferences)]
pub struct Preferences {
pub fonts_default: String,

View File
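The observer mechanism documented above — `add_observer` registering a `PreferencesObserver`, and `set` notifying each one with `(name, new value)` pairs — can be sketched with simplified types. This is an assumption-laden sketch (plain `String` values, no global `RwLock` state), not Servo's actual implementation:

```rust
// Minimal sketch of the preference-observer flow described above,
// with simplified types and no global state.
use std::sync::{Arc, Mutex};

trait PreferencesObserver: Send + Sync {
    // Called whenever preferences change, with (name, new value) pairs.
    fn prefs_changed(&self, changes: &[(&'static str, String)]);
}

struct LoggingObserver {
    seen: Mutex<Vec<String>>,
}

impl PreferencesObserver for LoggingObserver {
    fn prefs_changed(&self, changes: &[(&'static str, String)]) {
        let mut seen = self.seen.lock().unwrap();
        for (name, value) in changes {
            seen.push(format!("{name}={value}"));
        }
    }
}

struct Preferences {
    observers: Vec<Arc<dyn PreferencesObserver>>,
}

impl Preferences {
    // Mirrors prefs::set above: apply changes, then notify every observer.
    fn set(&self, changes: &[(&'static str, String)]) {
        for observer in &self.observers {
            observer.prefs_changed(changes);
        }
    }
}

fn main() {
    let observer = Arc::new(LoggingObserver { seen: Mutex::new(Vec::new()) });
    let prefs = Preferences { observers: vec![observer.clone()] };
    prefs.set(&[("fonts_default", "serif".to_string())]);
    assert_eq!(
        observer.seen.lock().unwrap().clone(),
        vec!["fonts_default=serif".to_string()]
    );
}
```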

@@ -19,7 +19,7 @@ bluetooth = ["servo-bluetooth-traits"]
gamepad = ["embedder_traits/gamepad"]
tracing = ["dep:tracing", "servo-canvas/tracing"]
vello = ["servo-canvas/vello"]
webgpu = ["script_traits/webgpu"]
webgpu = ["dep:webgpu", "dep:webgpu_traits", "script_traits/webgpu"]
[lints.clippy]
unwrap_used = "deny"
@@ -28,6 +28,7 @@ panic = "deny"
[dependencies]
accesskit = { workspace = true }
backtrace = { workspace = true }
base64 = { workspace = true }
content-security-policy = { workspace = true }
crossbeam-channel = { workspace = true }
devtools_traits = { workspace = true }
@@ -62,8 +63,8 @@ storage_traits = { workspace = true }
stylo = { workspace = true }
stylo_traits = { workspace = true }
tracing = { workspace = true, optional = true }
webgpu = { workspace = true }
webgpu_traits = { workspace = true }
webgpu = { workspace = true, optional = true }
webgpu_traits = { workspace = true, optional = true }
webxr-api = { workspace = true }
[target.'cfg(any(target_os="macos", all(not(target_os = "windows"), not(target_os = "ios"), not(target_os="android"), not(target_env="ohos"), not(target_arch="arm"), not(target_arch="aarch64"))))'.dependencies]

View File

@@ -1408,8 +1408,13 @@ where
// Load a new page from a typed url
// If there is already a pending page (self.pending_changes), it will not be overridden;
// However, if the id is not encompassed by another change, it will be.
EmbedderToConstellationMessage::LoadUrl(webview_id, url) => {
let load_data = LoadData::new_for_new_unrelated_webview(url);
EmbedderToConstellationMessage::LoadUrl(webview_id, url_request) => {
let mut load_data = LoadData::new_for_new_unrelated_webview(url_request.url);
if !url_request.headers.is_empty() {
load_data.headers.extend(url_request.headers);
}
let ctx_id = BrowsingContextId::from(webview_id);
let pipeline_id = match self.browsing_contexts.get(&ctx_id) {
Some(ctx) => ctx.pipeline_id,
@@ -1903,15 +1908,20 @@ where
data,
);
},
ScriptToConstellationMessage::Focus(focused_child_browsing_context_id, sequence) => {
self.handle_focus_msg(
ScriptToConstellationMessage::FocusAncestorBrowsingContextsForFocusingSteps(
focused_child_browsing_context_id,
sequence,
) => {
self.handle_focus_ancestor_browsing_contexts_for_focusing_steps(
source_pipeline_id,
focused_child_browsing_context_id,
sequence,
);
},
ScriptToConstellationMessage::FocusRemoteDocument(focused_browsing_context_id) => {
self.handle_focus_remote_document_msg(focused_browsing_context_id);
ScriptToConstellationMessage::FocusRemoteBrowsingContext(
focused_browsing_context_id,
) => {
self.handle_focus_remote_browsing_context(focused_browsing_context_id);
},
ScriptToConstellationMessage::SetThrottledComplete(throttled) => {
self.handle_set_throttled_complete(source_pipeline_id, throttled);
@@ -3151,7 +3161,7 @@ where
webview.accessibility_active = active;
self.constellation_to_embedder_proxy.send(
ConstellationToEmbedderMsg::AccessibilityTreeIdChanged(
ConstellationToEmbedderMsg::DocumentAccessibilityTreeIdChanged(
webview_id,
webview.active_top_level_pipeline_id.into(),
),
@@ -4448,7 +4458,7 @@ where
}
#[servo_tracing::instrument(skip_all)]
fn handle_focus_msg(
fn handle_focus_ancestor_browsing_contexts_for_focusing_steps(
&mut self,
pipeline_id: PipelineId,
focused_child_browsing_context_id: Option<BrowsingContextId>,
@@ -4488,23 +4498,20 @@ where
self.focus_browsing_context(Some(pipeline_id), focused_browsing_context_id);
}
fn handle_focus_remote_document_msg(&mut self, focused_browsing_context_id: BrowsingContextId) {
let pipeline_id = match self.browsing_contexts.get(&focused_browsing_context_id) {
Some(browsing_context) => browsing_context.pipeline_id,
None => return warn!("Browsing context {} not found", focused_browsing_context_id),
fn handle_focus_remote_browsing_context(&mut self, target: BrowsingContextId) {
let Some(browsing_context) = self.browsing_contexts.get(&target) else {
return warn!("{target:?} not found for focus message");
};
// Ignore if its active document isn't fully active.
if self.get_activity(pipeline_id) != DocumentActivity::FullyActive {
debug!(
"Ignoring the remote focus request because pipeline {} of \
browsing context {} is not fully active",
pipeline_id, focused_browsing_context_id,
);
return;
let pipeline_id = browsing_context.pipeline_id;
let Some(pipeline) = self.pipelines.get(&pipeline_id) else {
return warn!("{pipeline_id:?} not found for focus message");
};
if let Err(error) = pipeline
.event_loop
.send(ScriptThreadMessage::FocusDocument(pipeline_id))
{
self.handle_send_error(pipeline_id, error);
}
self.focus_browsing_context(None, focused_browsing_context_id);
}
/// Perform [the focusing steps][1] for the active document of
@@ -4605,7 +4612,10 @@ where
// > substeps: [...]
for &pipeline in old_focus_chain_pipelines.iter() {
if Some(pipeline.id) != initiator_pipeline_id {
let msg = ScriptThreadMessage::Unfocus(pipeline.id, pipeline.focus_sequence);
let msg = ScriptThreadMessage::UnfocusDocumentAsPartOfFocusingSteps(
pipeline.id,
pipeline.focus_sequence,
);
trace!("Sending {:?} to {}", msg, pipeline.id);
if let Err(e) = pipeline.event_loop.send(msg) {
send_errors.push((pipeline.id, e));
@@ -4625,49 +4635,35 @@ where
// Don't send a message to the browsing context that initiated this
// focus operation. It already knows that it has gotten focus.
if Some(pipeline.id) != initiator_pipeline_id {
let msg = if let Some(child_browsing_context_id) = child_browsing_context_id {
// Focus the container element of `child_browsing_context_id`.
ScriptThreadMessage::FocusIFrame(
if let Err(error) = pipeline.event_loop.send(
ScriptThreadMessage::FocusDocumentAsPartOfFocusingSteps(
pipeline.id,
child_browsing_context_id,
pipeline.focus_sequence,
)
} else {
// Focus the document.
ScriptThreadMessage::FocusDocument(pipeline.id, pipeline.focus_sequence)
};
trace!("Sending {:?} to {}", msg, pipeline.id);
if let Err(e) = pipeline.event_loop.send(msg) {
send_errors.push((pipeline.id, e));
child_browsing_context_id,
),
) {
send_errors.push((pipeline.id, error));
}
} else {
trace!(
"Not notifying {} - it's the initiator of this focus operation",
pipeline.id
);
}
child_browsing_context_id = Some(pipeline.browsing_context_id);
}
if let (Some(pipeline), Some(child_browsing_context_id)) =
(first_common_pipeline_in_chain, child_browsing_context_id)
{
if let Some(pipeline) = first_common_pipeline_in_chain {
if Some(pipeline.id) != initiator_pipeline_id {
// Focus the container element of `child_browsing_context_id`.
let msg = ScriptThreadMessage::FocusIFrame(
pipeline.id,
child_browsing_context_id,
pipeline.focus_sequence,
);
trace!("Sending {:?} to {}", msg, pipeline.id);
if let Err(e) = pipeline.event_loop.send(msg) {
send_errors.push((pipeline.id, e));
if let Err(error) = pipeline.event_loop.send(
ScriptThreadMessage::FocusDocumentAsPartOfFocusingSteps(
pipeline.id,
pipeline.focus_sequence,
child_browsing_context_id,
),
) {
send_errors.push((pipeline.id, error));
}
}
}
for (pipeline_id, e) in send_errors {
self.handle_send_error(pipeline_id, e);
for (pipeline_id, error) in send_errors {
self.handle_send_error(pipeline_id, error);
}
}
@@ -4748,7 +4744,7 @@ where
let _ = response_sender.send(is_open);
},
WebDriverCommandMsg::FocusBrowsingContext(browsing_context_id) => {
self.handle_focus_remote_document_msg(browsing_context_id);
self.handle_focus_remote_browsing_context(browsing_context_id);
},
// TODO: This should use the ScriptThreadMessage::EvaluateJavaScript command
WebDriverCommandMsg::ScriptCommand(browsing_context_id, cmd) => {
@@ -5089,9 +5085,16 @@ where
// If the browsing context is focused, focus the document
let msg = if is_focused {
ScriptThreadMessage::FocusDocument(pipeline_id, pipeline.focus_sequence)
ScriptThreadMessage::FocusDocumentAsPartOfFocusingSteps(
pipeline_id,
pipeline.focus_sequence,
None,
)
} else {
ScriptThreadMessage::Unfocus(pipeline_id, pipeline.focus_sequence)
ScriptThreadMessage::UnfocusDocumentAsPartOfFocusingSteps(
pipeline_id,
pipeline.focus_sequence,
)
};
if let Err(e) = pipeline.event_loop.send(msg) {
self.handle_send_error(pipeline_id, e);
@@ -5799,7 +5802,7 @@ where
if frame_tree.pipeline.id != webview.active_top_level_pipeline_id {
webview.active_top_level_pipeline_id = frame_tree.pipeline.id;
self.constellation_to_embedder_proxy.send(
ConstellationToEmbedderMsg::AccessibilityTreeIdChanged(
ConstellationToEmbedderMsg::DocumentAccessibilityTreeIdChanged(
webview_id,
webview.active_top_level_pipeline_id.into(),
),

View File

@@ -67,10 +67,12 @@ impl ConstellationWebView {
focused_browsing_context_id: BrowsingContextId,
user_content_manager_id: Option<UserContentManagerId>,
) -> Self {
embedder_proxy.send(ConstellationToEmbedderMsg::AccessibilityTreeIdChanged(
webview_id,
active_top_level_pipeline_id.into(),
));
embedder_proxy.send(
ConstellationToEmbedderMsg::DocumentAccessibilityTreeIdChanged(
webview_id,
active_top_level_pipeline_id.into(),
),
);
Self {
webview_id,

View File

@@ -50,5 +50,5 @@ pub enum ConstellationToEmbedderMsg {
HistoryChanged(WebViewId, Vec<ServoUrl>, usize),
/// Notifies the embedder that the AccessKit [`TreeId`] for the top-level document in this
/// WebView has been changed (or initially set).
AccessibilityTreeIdChanged(WebViewId, TreeId),
DocumentAccessibilityTreeIdChanged(WebViewId, TreeId),
}

View File

@@ -144,8 +144,10 @@ mod from_script {
Self::BroadcastStorageEvent(..) => target!("BroadcastStorageEvent"),
Self::ChangeRunningAnimationsState(..) => target!("ChangeRunningAnimationsState"),
Self::CreateCanvasPaintThread(..) => target!("CreateCanvasPaintThread"),
Self::Focus(..) => target!("Focus"),
Self::FocusRemoteDocument(..) => target!("FocusRemoteDocument"),
Self::FocusAncestorBrowsingContextsForFocusingSteps(..) => {
target!("FocusAncestorBrowsingContextsForFocusingSteps")
},
Self::FocusRemoteBrowsingContext(..) => target!("FocusRemoteBrowsingContext"),
Self::GetTopForBrowsingContext(..) => target!("GetTopForBrowsingContext"),
Self::GetBrowsingContextInfo(..) => target!("GetBrowsingContextInfo"),
Self::GetDocumentOrigin(..) => target!("GetDocumentOrigin"),

View File

@@ -30,6 +30,7 @@ impl ResourceReaderMethods for DefaultResourceReader {
},
Resource::AboutMemoryHTML => &include_bytes!("resources/about-memory.html")[..],
Resource::DebuggerJS => &include_bytes!("resources/debugger.js")[..],
Resource::JsonViewerHTML => &include_bytes!("resources/json-viewer.html")[..],
}
.to_owned()
}

View File

@@ -86,13 +86,23 @@ const previewers = {
// Convert debuggee value to property descriptor value
// <https://searchfox.org/firefox-main/source/devtools/server/actors/object/utils.js#116>
function createValueGrip(value) {
function createValueGrip(value, depth = 0) {
switch (typeof value) {
case "undefined":
return { valueType: "undefined" };
case "boolean":
return { valueType: "boolean", booleanValue: value };
case "number":
if (value === Infinity) {
return { valueType: "Infinity" };
} else if (value === -Infinity) {
return { valueType: "-Infinity" };
} else if (Number.isNaN(value)) {
return { valueType: "NaN" };
} else if (!value && 1 / value === -Infinity) {
return { valueType: "-0" };
}
return { valueType: "number", numberValue: value };
case "string":
return { valueType: "string", stringValue: value };
@@ -105,7 +115,7 @@ function createValueGrip(value) {
return {
valueType: "object",
objectClass: value.class,
preview: getPreview(value),
preview: getPreview(value, depth),
};
default:
return { valueType: "string", stringValue: String(value) };
@@ -114,7 +124,7 @@ function createValueGrip(value) {
// Extract own properties from a debuggee object
// <https://firefox-source-docs.mozilla.org/devtools-user/debugger-api/debugger.object/index.html#function-properties-of-the-debugger-object-prototype>
function extractOwnProperties(obj, maxItems = OBJECT_PREVIEW_MAX_ITEMS) {
function extractOwnProperties(obj, depth, maxItems = OBJECT_PREVIEW_MAX_ITEMS) {
const ownProperties = [];
let totalLength = 0;
@@ -138,16 +148,16 @@ function extractOwnProperties(obj, maxItems = OBJECT_PREVIEW_MAX_ITEMS) {
enumerable: desc.enumerable ?? false,
writable: desc.writable ?? false,
isAccessor: desc.get !== undefined || desc.set !== undefined,
value: createValueGrip(undefined),
value: createValueGrip(undefined, depth + 1),
};
if (desc.value !== undefined) {
prop.value = createValueGrip(desc.value);
prop.value = createValueGrip(desc.value, depth + 1);
} else if (desc.get) {
try {
const result = desc.get.call(obj);
if (result && "return" in result) {
prop.value = createValueGrip(result.return);
prop.value = createValueGrip(result.return, depth + 1);
}
} catch (e) { }
}
@@ -164,25 +174,31 @@ function extractOwnProperties(obj, maxItems = OBJECT_PREVIEW_MAX_ITEMS) {
}
// <https://searchfox.org/mozilla-central/source/devtools/server/actors/object/previewers.js#125>
previewers.Function.push(function FunctionPreviewer(obj) {
const { ownProperties, ownPropertiesLength } = extractOwnProperties(obj);
previewers.Function.push(function FunctionPreviewer(obj, depth) {
let function_details = {
name: obj.name,
displayName: obj.displayName,
parameterNames: obj.parameterNames ? obj.parameterNames : [],
isAsync: obj.isAsyncFunction,
isGenerator: obj.isGeneratorFunction,
}
if (depth > 1) {
return { kind: "Object", function: function_details };
}
const { ownProperties, ownPropertiesLength } = extractOwnProperties(obj, depth);
return {
kind: "Object",
ownProperties,
ownPropertiesLength,
function: {
name: obj.name,
displayName: obj.displayName,
parameterNames: obj.parameterNames,
isAsync: obj.isAsyncFunction,
isGenerator: obj.isGeneratorFunction,
}
function: function_details
};
});
// <https://searchfox.org/mozilla-central/source/devtools/server/actors/object/previewers.js#172>
// TODO: Add implementation for showing Array items
previewers.Array.push(function ArrayPreviewer(obj) {
previewers.Array.push(function ArrayPreviewer(obj, depth) {
const lengthDescriptor = obj.getOwnPropertyDescriptor("length");
const length = lengthDescriptor ? lengthDescriptor.value : 0;
@@ -194,8 +210,12 @@ previewers.Array.push(function ArrayPreviewer(obj) {
// Generic fallback for object previewer
// <https://searchfox.org/mozilla-central/source/devtools/server/actors/object/previewers.js#856>
previewers.Object.push(function ObjectPreviewer(obj) {
const { ownProperties, ownPropertiesLength } = extractOwnProperties(obj);
previewers.Object.push(function ObjectPreviewer(obj, depth) {
if (depth > 1) {
return { kind: "Object" };
}
const { ownProperties, ownPropertiesLength } = extractOwnProperties(obj, depth);
return {
kind: "Object",
ownProperties,
@@ -203,13 +223,13 @@ previewers.Object.push(function ObjectPreviewer(obj) {
};
});
function getPreview(obj) {
function getPreview(obj, depth) {
const className = obj.class;
// <https://searchfox.org/mozilla-central/source/devtools/server/actors/object.js#295>
const typePreviewers = previewers[className] || previewers.Object;
for (const previewer of typePreviewers) {
const result = previewer(obj);
const result = previewer(obj, depth);
if (result) return result;
}
@@ -410,7 +430,9 @@ function makeSteppingHooks(steppingType, startFrame) {
this.reportedPop = true;
suspendedFrame = startFrame;
if (steppingType !== "finish") {
return handlePauseAndRespond(startFrame, completion);
// TODO: completion contains the return value, we have to send it back
// <https://searchfox.org/firefox-main/source/devtools/server/actors/thread.js#1026>
return handlePauseAndRespond(startFrame, { type_: steppingType });
}
attachSteppingHooks("next", startFrame);
},
@@ -532,7 +554,6 @@ function createEnvironmentActor(environment) {
info.bindingVariables = buildBindings(environment)
}
// TODO: Update this instead of registering
actor = registerEnvironmentActor(info, parent);
environmentActorsToEnvironments.set(actor, environment);
}
@@ -541,15 +562,28 @@ function createEnvironmentActor(environment) {
}
function buildBindings(environment) {
let bindingVar = new Map();
let bindingVars = [];
for (const name of environment.names()) {
const value = environment.getVariable(name);
// <https://searchfox.org/firefox-main/source/devtools/server/actors/environment.js#87>
// We should not do this, it is more of a place holder for now.
// TODO: build and pass correct structure for this. This structure is very similar to "eval"
bindingVar[name] = JSON.stringify(value);
const property = {
name: name,
configurable: false,
enumerable: true,
writable: !value?.optimizedOut,
isAccessor: false,
value: createValueGrip(value),
};
// To avoid recursion errors in the WebIDL, preview needs to live outside of the property descriptor
let preview = undefined;
if (property.value.preview) {
preview = property.value.preview;
delete property.value.preview;
}
bindingVars.push({ property, preview });
}
return bindingVar;
return bindingVars;
}
// Get a `Debugger.Environment` instance within which evaluation is taking place.

View File
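The special-casing in `createValueGrip` above exists because JSON cannot represent `Infinity`, `-Infinity`, `NaN`, or negative zero, so the grip encodes each as a distinct `valueType` tag. The same classification can be sketched in Rust (an illustrative translation, not part of the debugger script), where `is_sign_negative` replaces the JavaScript `1 / value === -Infinity` trick:

```rust
// Sketch of the number classification performed by createValueGrip above,
// translated to Rust; f64 follows the same IEEE 754 semantics as JS numbers.
fn number_value_type(value: f64) -> &'static str {
    if value == f64::INFINITY {
        "Infinity"
    } else if value == f64::NEG_INFINITY {
        "-Infinity"
    } else if value.is_nan() {
        // NaN compares unequal to everything, so it needs an explicit check.
        "NaN"
    } else if value == 0.0 && value.is_sign_negative() {
        // The JS version tests `!value && 1 / value === -Infinity`;
        // is_sign_negative is the idiomatic equivalent here.
        "-0"
    } else {
        "number"
    }
}

fn main() {
    assert_eq!(number_value_type(f64::INFINITY), "Infinity");
    assert_eq!(number_value_type(f64::NEG_INFINITY), "-Infinity");
    assert_eq!(number_value_type(f64::NAN), "NaN");
    assert_eq!(number_value_type(-0.0), "-0");
    assert_eq!(number_value_type(1.5), "number");
}
```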

@@ -0,0 +1,269 @@
<html>
<head>
<style>
body {
font-family: monospace;
margin: 0;
padding: 0;
background: #fff;
color: #333;
}
#json-raw {
display: none;
}
#viewer {
display: none;
padding: 0.5em 1em;
line-height: 1.5;
&.active {
display: block;
}
}
#toolbar {
display: flex;
gap: 1em;
padding: 0.5em;
align-items: center;
background: #f5f5f5;
border-bottom: 1px solid #ddd;
& button {
min-width: 5em;
&.active {
background: #ddd;
font-weight: bold;
}
}
}
#raw-view {
display: none;
padding: 0.5em 1em;
white-space: pre-wrap;
word-break: break-all;
&.active {
display: block;
}
}
.json-error {
padding: 0.5em 1em;
color: #c00;
font-weight: bold;
}
/* Syntax highlighting for Json data types */
.json-key {
color: #881391;
}
.json-string {
color: #1a1aa6;
}
.json-number {
color: #1c00cf;
}
.json-boolean {
color: #0d22aa;
}
.json-null {
color: #808080;
}
/* Collapsible tree */
.toggle {
cursor: pointer;
user-select: none;
&::before {
content: "\25BC";
display: inline-block;
width: 1em;
transition: transform 0.1s;
}
&.collapsed::before {
transform: rotate(-90deg);
}
}
.collapsible {
margin-left: 1.5em;
&.hidden {
display: none;
}
}
.bracket {
color: #333;
}
.comma {
color: #333;
}
.line {
padding-left: 0;
}
</style>
<script>
// Shortcut to create an element with an optional class and text content.
function createElement(name, classes = null, textContent = null) {
let node = document.createElement(name);
if (classes) {
node.className = classes;
}
if (textContent) {
node.textContent = textContent;
}
return node;
}
function renderNode(value, container) {
if (value === null) {
let s = createElement("span", "json-null", "null");
container.append(s);
} else if (typeof value === "boolean") {
let s = createElement("span", "json-boolean", String(value));
container.append(s);
} else if (typeof value === "number") {
let s = createElement("span", "json-number", String(value));
container.append(s);
} else if (typeof value === "string") {
let s = createElement("span", "json-string", JSON.stringify(value));
container.append(s);
} else if (Array.isArray(value)) {
renderArray(value, container);
} else if (typeof value === "object") {
renderObject(value, container);
}
}
function renderObject(obj, container) {
let keys = Object.keys(obj);
if (keys.length === 0) {
container.append(createElement("span", "bracket", "{}"));
return;
}
let toggle = createElement("span", "toggle");
container.append(toggle);
container.append(createElement("span", "bracket", "{"));
let inner = createElement("div", "collapsible");
container.append(inner);
keys.forEach((key, i) => {
let line = createElement("div", "line");
line.append(createElement("span", "json-key", JSON.stringify(key)));
line.append(document.createTextNode(": "));
renderNode(obj[key], line);
if (i < keys.length - 1) {
line.append(createElement("span", "comma", ","));
}
inner.append(line);
});
container.append(createElement("span", "bracket", "}"));
toggle.onclick = function () {
toggle.classList.toggle("collapsed");
inner.classList.toggle("hidden");
};
}
function renderArray(arr, container) {
if (arr.length === 0) {
container.append(createElement("span", "bracket", "[]"));
return;
}
let toggle = createElement("span", "toggle");
container.append(toggle);
container.append(createElement("span", "bracket", "["));
let inner = createElement("div", "collapsible");
container.append(inner);
arr.forEach((item, i) => {
let line = createElement("div", "line");
renderNode(item, line);
if (i < arr.length - 1) {
line.append(createElement("span", "comma", ","));
}
inner.append(line);
});
container.append(createElement("span", "bracket", "]"));
toggle.onclick = function () {
toggle.classList.toggle("collapsed");
inner.classList.toggle("hidden");
};
}
document.addEventListener("DOMContentLoaded", function () {
const viewer = document.getElementById("viewer");
const prettyButton = document.getElementById("pretty-toggle");
const rawButton = document.getElementById("raw-toggle");
const rawView = document.getElementById("raw-view");
const rawText = rawView.innerText;
let data;
let parseError = null;
try {
data = JSON.parse(rawText);
} catch (e) {
parseError = e;
}
if (parseError) {
let errDiv = createElement(
"div",
"json-error",
"Invalid JSON: " + parseError.message,
);
viewer.append(errDiv);
viewer.append(createElement("pre", null, rawText));
} else {
renderNode(data, viewer);
rawView.textContent = JSON.stringify(data, null, 2);
}
// Toggle buttons
prettyButton.onclick = function () {
viewer.classList.add("active");
rawView.classList.remove("active");
prettyButton.classList.add("active");
rawButton.classList.remove("active");
};
rawButton.onclick = function () {
rawView.classList.add("active");
viewer.classList.remove("active");
rawButton.classList.add("active");
prettyButton.classList.remove("active");
};
});
</script>
</head>
<body>
<div id="toolbar">
<button id="pretty-toggle" class="active">Pretty</button>
<button id="raw-toggle">Raw</button>
</div>
<div id="viewer" class="active"></div>
<pre id="raw-view">

View File

@@ -49,7 +49,6 @@ impl ActorError {
/// A common trait for all devtools actors that encompasses an immutable name
/// and the ability to process messages that are directed to particular actors.
/// TODO: ensure the name is immutable
pub(crate) trait Actor: Any + ActorAsAny + Send + Sync + MallocSizeOf {
fn handle_message(
&self,
@@ -157,11 +156,19 @@ impl ActorRegistry {
}
/// Create a unique name based on a monotonically increasing suffix
/// TODO: Merge this with `register/register_later` and don't allow to
/// TODO: Merge this with `register` and don't allow to
/// create new names without registering an actor.
pub fn new_name<T: Actor>(&self) -> String {
let suffix = self.next.fetch_add(1, Ordering::Relaxed);
format!("{}{}", Self::base_name::<T>(), suffix)
let base = Self::base_name::<T>();
// Firefox DevTools client requires "/workerTarget" in actor name to recognize workers
// <https://searchfox.org/firefox-main/source/devtools/client/fronts/watcher.js#65>
if base.contains("WorkerTarget") {
format!("/workerTarget{}", suffix)
} else {
format!("{}{}", base, suffix)
}
}
/// Add an actor to the registry of known actors that can receive messages.
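The `new_name` hunk above keeps the monotonically increasing suffix from the original code. A minimal sketch of that allocator, under simplified assumptions (a plain string base name instead of the type-derived `base_name::<T>()`, and no worker-target special case):

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Hypothetical, stripped-down version of the registry's name allocator.
struct NameAllocator {
    next: AtomicUsize,
}

impl NameAllocator {
    fn new() -> Self {
        NameAllocator { next: AtomicUsize::new(0) }
    }

    // Relaxed ordering suffices here: each fetch_add only needs to yield
    // a unique value, not to synchronize any other memory accesses.
    fn new_name(&self, base: &str) -> String {
        let suffix = self.next.fetch_add(1, Ordering::Relaxed);
        format!("{}{}", base, suffix)
    }
}
```

Because the counter is shared across all actor types, suffixes are globally unique even when base names differ.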

View File

@@ -126,11 +126,14 @@ impl Actor for BreakpointListActor {
}
impl BreakpointListActor {
pub fn new(name: String, browsing_context_name: String) -> Self {
Self {
name,
pub fn register(registry: &ActorRegistry, browsing_context_name: String) -> String {
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
browsing_context_name,
}
};
registry.register::<Self>(actor);
name
}
}
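The `new` → `register` conversion above is the pattern repeated throughout this comparison: the constructor no longer hands the actor back to the caller; it allocates the name, constructs the actor, registers it, and returns only the name. A self-contained sketch under simplified assumptions (a single-threaded `RefCell`-based registry and a hard-coded base name — the real `ActorRegistry` is `Send + Sync` and derives names from the actor type):

```rust
use std::any::Any;
use std::cell::{Cell, RefCell};
use std::collections::HashMap;

// Hypothetical minimal stand-in for ActorRegistry.
#[derive(Default)]
struct Registry {
    next: Cell<usize>,
    actors: RefCell<HashMap<String, Box<dyn Any>>>,
}

impl Registry {
    fn new_name(&self, base: &str) -> String {
        let suffix = self.next.get();
        self.next.set(suffix + 1);
        format!("{}{}", base, suffix)
    }

    fn register(&self, name: String, actor: Box<dyn Any>) {
        self.actors.borrow_mut().insert(name, actor);
    }
}

struct BreakpointListActor {
    name: String,
    browsing_context_name: String,
}

impl BreakpointListActor {
    // The refactored shape: callers receive the name, never the actor,
    // so ownership stays with the registry from the moment of creation.
    fn register(registry: &Registry, browsing_context_name: String) -> String {
        let name = registry.new_name("breakpoint-list");
        let actor = BreakpointListActor {
            name: name.clone(),
            browsing_context_name,
        };
        registry.register(name.clone(), Box::new(actor));
        name
    }
}
```

Returning only the `String` also removes the window in which an actor exists but is not yet registered, which the old `new`-then-`register` call sites had to bridge manually.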

View File

@@ -139,7 +139,7 @@ pub(crate) struct BrowsingContextActor {
active_pipeline_id: AtomicRefCell<PipelineId>,
active_outer_window_id: AtomicRefCell<DevtoolsOuterWindowId>,
pub browsing_context_id: DevtoolsBrowsingContextId,
accessibility: String,
accessibility_name: String,
pub console_name: String,
css_properties_name: String,
pub(crate) inspector_name: String,
@@ -197,7 +197,8 @@ impl Actor for BrowsingContextActor {
impl BrowsingContextActor {
#[expect(clippy::too_many_arguments)]
pub(crate) fn new(
pub(crate) fn register(
registry: &ActorRegistry,
console_name: String,
browser_id: DevtoolsBrowserId,
browsing_context_id: DevtoolsBrowsingContextId,
@@ -205,8 +206,7 @@ impl BrowsingContextActor {
pipeline_id: PipelineId,
outer_window_id: DevtoolsOuterWindowId,
script_sender: GenericSender<DevtoolScriptControlMsg>,
registry: &ActorRegistry,
) -> BrowsingContextActor {
) -> String {
let name = registry.new_name::<BrowsingContextActor>();
let DevtoolsPageInfo {
title,
@@ -215,8 +215,7 @@ impl BrowsingContextActor {
..
} = page_info;
let accessibility_actor =
AccessibilityActor::new(registry.new_name::<AccessibilityActor>());
let accessibility_name = AccessibilityActor::register(registry);
let properties = (|| {
let (properties_sender, properties_receiver) = generic_channel::channel()?;
@@ -224,25 +223,21 @@ impl BrowsingContextActor {
properties_receiver.recv().ok()
})()
.unwrap_or_default();
let css_properties_actor =
CssPropertiesActor::new(registry.new_name::<CssPropertiesActor>(), properties);
let css_properties_name = CssPropertiesActor::register(registry, properties);
let inspector_name = InspectorActor::register(registry, name.clone());
let reflow_actor = ReflowActor::new(registry.new_name::<ReflowActor>());
let reflow_name = ReflowActor::register(registry);
let style_sheets_actor = StyleSheetsActor::new(registry.new_name::<StyleSheetsActor>());
let style_sheets_name = StyleSheetsActor::register(registry);
let tab_descriptor_actor =
TabDescriptorActor::new(registry, name.clone(), is_top_level_global);
let tab_descriptor_name =
TabDescriptorActor::register(registry, name.clone(), is_top_level_global);
let thread_actor = ThreadActor::new(
registry.new_name::<ThreadActor>(),
script_sender.clone(),
Some(name.clone()),
);
let thread_name =
ThreadActor::register(registry, script_sender.clone(), Some(name.clone()));
let watcher_actor = WatcherActor::new(
let watcher_name = WatcherActor::register(
registry,
name.clone(),
SessionContext::new(SessionContextType::BrowserElement),
@@ -251,8 +246,8 @@ impl BrowsingContextActor {
let mut script_chans = FxHashMap::default();
script_chans.insert(pipeline_id, script_sender);
let target = BrowsingContextActor {
name,
let actor = BrowsingContextActor {
name: name.clone(),
script_chans: AtomicRefCell::new(script_chans),
title: AtomicRefCell::new(title),
url: AtomicRefCell::new(url.into_string()),
@@ -260,26 +255,20 @@ impl BrowsingContextActor {
active_outer_window_id: AtomicRefCell::new(outer_window_id),
browser_id,
browsing_context_id,
accessibility: accessibility_actor.name(),
accessibility_name,
console_name,
css_properties_name: css_properties_actor.name(),
css_properties_name,
inspector_name,
reflow_name: reflow_actor.name(),
style_sheets_name: style_sheets_actor.name(),
_tab: tab_descriptor_actor.name(),
thread_name: thread_actor.name(),
watcher_name: watcher_actor.name(),
reflow_name,
style_sheets_name,
_tab: tab_descriptor_name,
thread_name,
watcher_name,
};
registry.register(accessibility_actor);
registry.register(css_properties_actor);
registry.register(reflow_actor);
registry.register(style_sheets_actor);
registry.register(tab_descriptor_actor);
registry.register(thread_actor);
registry.register(watcher_actor);
registry.register::<Self>(actor);
target
name
}
pub(crate) fn handle_new_global(
@@ -404,7 +393,7 @@ impl ActorEncode<BrowsingContextActorMsg> for BrowsingContextActor {
browsing_context_id: self.browsing_context_id.value(),
outer_window_id: self.outer_window_id().value(),
is_top_level_target: true,
accessibility_actor: self.accessibility.clone(),
accessibility_actor: self.accessibility_name.clone(),
console_actor: self.console_name.clone(),
css_properties_actor: self.css_properties_name.clone(),
inspector_actor: self.inspector_name.clone(),

View File

@@ -10,24 +10,21 @@ use std::collections::HashMap;
use std::sync::atomic::{AtomicBool, Ordering};
use atomic_refcell::AtomicRefCell;
use devtools_traits::DebuggerValue::{
self, BooleanValue, NullValue, NumberValue, ObjectValue, StringValue, VoidValue,
};
use devtools_traits::{
ConsoleArgument, ConsoleMessage, ConsoleMessageFields, DevtoolScriptControlMsg, PageError,
StackFrame, get_time_stamp,
ConsoleMessage, ConsoleMessageFields, DevtoolScriptControlMsg, PageError, StackFrame,
get_time_stamp,
};
use malloc_size_of_derive::MallocSizeOf;
use serde::Serialize;
use serde_json::{self, Map, Number, Value};
use serde_json::{self, Map, Value};
use servo_base::generic_channel::{self, GenericSender};
use servo_base::id::TEST_PIPELINE_ID;
use uuid::Uuid;
use crate::actor::{Actor, ActorError, ActorRegistry};
use crate::actors::browsing_context::BrowsingContextActor;
use crate::actors::object::{ObjectActor, ObjectPropertyDescriptor};
use crate::actors::worker::WorkerActor;
use crate::actors::object::debugger_value_to_json;
use crate::actors::worker::WorkerTargetActor;
use crate::protocol::{ClientRequest, DevtoolsConnection, JsonPacketStream};
use crate::resource::{ResourceArrayType, ResourceAvailable};
use crate::{EmptyReplyMsg, StreamId, UniqueId};
@@ -53,87 +50,13 @@ impl DevtoolsConsoleMessage {
arguments: message
.arguments
.into_iter()
.map(|argument| console_argument_to_value(argument, registry))
.map(|argument| debugger_value_to_json(registry, argument))
.collect(),
stacktrace: message.stacktrace,
}
}
}
fn console_argument_to_value(argument: ConsoleArgument, registry: &ActorRegistry) -> Value {
match argument {
ConsoleArgument::String(value) => Value::String(value),
ConsoleArgument::Integer(value) => Value::Number(value.into()),
ConsoleArgument::Number(value) => {
Number::from_f64(value).map(Value::from).unwrap_or_default()
},
ConsoleArgument::Boolean(value) => Value::Bool(value),
ConsoleArgument::Object(object) => {
// Create a new actor for the object.
// These are currently never cleaned up, and we make no attempt at re-using the same actor
// if the same object is logged repeatedly.
let actor = ObjectActor::register(registry, None, object.class.clone());
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct DevtoolsConsoleObjectArgument {
r#type: String,
actor: String,
class: String,
own_property_length: usize,
extensible: bool,
frozen: bool,
sealed: bool,
is_error: bool,
preview: DevtoolsConsoleObjectArgumentPreview,
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
struct DevtoolsConsoleObjectArgumentPreview {
kind: String,
own_properties: HashMap<String, ObjectPropertyDescriptor>,
own_properties_length: usize,
}
let own_properties: HashMap<String, ObjectPropertyDescriptor> = object
.own_properties
.into_iter()
.map(|property| {
let property_descriptor = ObjectPropertyDescriptor {
configurable: property.configurable,
enumerable: property.enumerable,
writable: property.writable,
value: console_argument_to_value(property.value, registry),
};
(property.key, property_descriptor)
})
.collect();
let argument = DevtoolsConsoleObjectArgument {
r#type: "object".to_owned(),
actor,
class: object.class,
own_property_length: own_properties.len(),
extensible: true,
frozen: false,
sealed: false,
is_error: false,
preview: DevtoolsConsoleObjectArgumentPreview {
kind: "Object".to_string(),
own_properties_length: own_properties.len(),
own_properties,
},
};
// to_value can fail if the implementation of Serialize fails or there are non-string map keys.
// Neither should be possible here
serde_json::to_value(argument).unwrap()
},
}
}
#[derive(Clone, Serialize, MallocSizeOf)]
#[serde(rename_all = "camelCase")]
struct DevtoolsPageError {
@@ -272,13 +195,15 @@ pub(crate) struct ConsoleActor {
}
impl ConsoleActor {
pub fn new(name: String, root: Root) -> Self {
Self {
name,
pub fn register(registry: &ActorRegistry, name: String, root: Root) -> String {
let actor = Self {
name: name.clone(),
root,
cached_events: Default::default(),
client_ready_to_receive_messages: false.into(),
}
};
registry.register(actor);
name
}
fn script_chan(&self, registry: &ActorRegistry) -> GenericSender<DevtoolScriptControlMsg> {
@@ -287,8 +212,8 @@ impl ConsoleActor {
.find::<BrowsingContextActor>(browsing_context_name)
.script_chan(),
Root::DedicatedWorker(worker_name) => registry
.find::<WorkerActor>(worker_name)
.script_chan
.find::<WorkerTargetActor>(worker_name)
.script_sender
.clone(),
}
}
@@ -301,124 +226,7 @@ impl ConsoleActor {
.pipeline_id(),
),
Root::DedicatedWorker(worker_name) => {
UniqueId::Worker(registry.find::<WorkerActor>(worker_name).worker_id)
},
}
}
// TODO: This should be handled with struct serialization instead of manually adding values to a map
fn value_to_json(value: DebuggerValue, registry: &ActorRegistry) -> Value {
let mut m = Map::new();
match value {
VoidValue => {
m.insert("type".to_owned(), Value::String("undefined".to_owned()));
Value::Object(m)
},
NullValue => {
m.insert("type".to_owned(), Value::String("null".to_owned()));
Value::Object(m)
},
BooleanValue(val) => Value::Bool(val),
NumberValue(val) => {
if val.is_nan() {
m.insert("type".to_owned(), Value::String("NaN".to_owned()));
Value::Object(m)
} else if val.is_infinite() {
if val < 0. {
m.insert("type".to_owned(), Value::String("-Infinity".to_owned()));
} else {
m.insert("type".to_owned(), Value::String("Infinity".to_owned()));
}
Value::Object(m)
} else if val == 0. && val.is_sign_negative() {
m.insert("type".to_owned(), Value::String("-0".to_owned()));
Value::Object(m)
} else {
Value::Number(Number::from_f64(val).unwrap())
}
},
StringValue(s) => Value::String(s),
ObjectValue {
uuid,
class,
preview,
} => {
let properties = preview
.clone()
.and_then(|preview| preview.own_properties)
.unwrap_or_default();
let actor = ObjectActor::register_with_properties(
registry,
Some(uuid),
class.clone(),
properties,
);
m.insert("type".to_owned(), Value::String("object".to_owned()));
m.insert("class".to_owned(), Value::String(class));
m.insert("actor".to_owned(), Value::String(actor));
m.insert("extensible".to_owned(), Value::Bool(true));
m.insert("frozen".to_owned(), Value::Bool(false));
m.insert("sealed".to_owned(), Value::Bool(false));
// Build preview
// <https://searchfox.org/firefox-main/source/devtools/server/actors/object/previewers.js#849>
let Some(preview) = preview else {
return Value::Object(m);
};
let mut preview_map = Map::new();
if preview.kind == "ArrayLike" {
if let Some(length) = preview.array_length {
preview_map.insert("length".to_owned(), Value::Number(length.into()));
}
} else {
if let Some(ref props) = preview.own_properties {
let mut own_props_map = Map::new();
for prop in props {
let descriptor =
serde_json::to_value(ObjectPropertyDescriptor::from(prop)).unwrap();
own_props_map.insert(prop.name.clone(), descriptor);
}
preview_map
.insert("ownProperties".to_owned(), Value::Object(own_props_map));
}
if let Some(length) = preview.own_properties_length {
preview_map.insert(
"ownPropertiesLength".to_owned(),
Value::Number(length.into()),
);
m.insert("ownPropertyLength".to_owned(), Value::Number(length.into()));
}
}
preview_map.insert("kind".to_owned(), Value::String(preview.kind));
// Function-specific metadata
if let Some(function) = preview.function {
if let Some(name) = function.name {
m.insert("name".to_owned(), Value::String(name));
}
if let Some(display_name) = function.display_name {
m.insert("displayName".to_owned(), Value::String(display_name));
}
m.insert(
"parameterNames".to_owned(),
Value::Array(
function
.parameter_names
.into_iter()
.map(Value::String)
.collect(),
),
);
m.insert("isAsync".to_owned(), Value::Bool(function.is_async));
m.insert("isGenerator".to_owned(), Value::Bool(function.is_generator));
}
m.insert("preview".to_owned(), Value::Object(preview_map));
Value::Object(m)
UniqueId::Worker(registry.find::<WorkerTargetActor>(worker_name).worker_id)
},
}
}
@@ -454,7 +262,7 @@ impl ConsoleActor {
let reply = EvaluateJSReply {
from: self.name(),
input,
result: Self::value_to_json(eval_result.value, registry),
result: debugger_value_to_json(registry, eval_result.value),
timestamp: get_time_stamp(),
exception: Value::Null,
exception_message: Value::Null,
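The deleted `value_to_json` above special-cases doubles that JSON cannot represent, and the replacement `debugger_value_to_json` presumably preserves that protocol encoding. A std-only sketch of just the classification step, returning the protocol's `type` tag rather than building a `serde_json` object:

```rust
// Sketch of the special-double handling from the removed value_to_json:
// JSON has no NaN, Infinity, or negative zero, so the devtools protocol
// encodes them as tagged objects such as {"type": "NaN"}. A None result
// here means the value is an ordinary finite number and passes through
// as a plain JSON number.
fn special_number_tag(val: f64) -> Option<&'static str> {
    if val.is_nan() {
        Some("NaN")
    } else if val.is_infinite() {
        if val < 0.0 {
            Some("-Infinity")
        } else {
            Some("Infinity")
        }
    } else if val == 0.0 && val.is_sign_negative() {
        Some("-0")
    } else {
        None
    }
}
```

Note the `-0` case must be checked with `is_sign_negative()`, since IEEE 754 compares `-0.0 == 0.0` as true.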

View File

@@ -74,8 +74,11 @@ impl Actor for DeviceActor {
}
impl DeviceActor {
pub fn new(name: String) -> DeviceActor {
DeviceActor { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = DeviceActor { name: name.clone() };
registry.register::<Self>(actor);
name
}
pub fn description() -> ActorDescription {

View File

@@ -10,12 +10,12 @@ use serde::Serialize;
use serde_json::Value;
use crate::actor::{Actor, ActorEncode, ActorRegistry};
use crate::actors::object::ObjectActorMsg;
use crate::actors::object::{ObjectActorMsg, ObjectPropertyDescriptor};
#[derive(Serialize)]
struct EnvironmentBindings {
arguments: Vec<Value>,
variables: HashMap<String, EnvironmentVariableDesc>,
variables: HashMap<String, ObjectPropertyDescriptor>,
}
#[derive(Serialize)]
@@ -24,14 +24,6 @@ struct EnvironmentFunction {
display_name: String,
}
#[derive(Serialize)]
struct EnvironmentVariableDesc {
value: String,
configurable: bool,
enumerable: bool,
writable: bool,
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
pub(crate) struct EnvironmentActorMsg {
@@ -110,15 +102,13 @@ impl ActorEncode<EnvironmentActorMsg> for EnvironmentActor {
.binding_variables
.clone()
.into_iter()
.map(|(key, value)| {
.map(|ref property_descriptor| {
(
key,
EnvironmentVariableDesc {
value,
configurable: false,
enumerable: true,
writable: false,
},
property_descriptor.name.clone(),
ObjectPropertyDescriptor::from_property_descriptor(
registry,
property_descriptor,
),
)
})
.collect(),

View File

@@ -12,7 +12,7 @@ use servo_base::generic_channel::channel;
use crate::StreamId;
use crate::actor::{Actor, ActorEncode, ActorError, ActorRegistry};
use crate::actors::environment::{EnvironmentActor, EnvironmentActorMsg};
use crate::actors::object::{ObjectActor, ObjectActorMsg};
use crate::actors::object::ObjectActor;
use crate::actors::source::SourceActor;
use crate::protocol::{ClientRequest, JsonPacketStream};
@@ -49,7 +49,7 @@ pub(crate) struct FrameActorMsg {
display_name: String,
oldest: bool,
state: FrameState,
this: ObjectActorMsg,
this: Value,
#[serde(rename = "where")]
where_: FrameWhere,
}
@@ -60,7 +60,7 @@ pub(crate) struct FrameActorMsg {
pub(crate) struct FrameActor {
name: String,
object_actor: String,
source_actor: String,
source_name: String,
frame_result: FrameInfo,
current_offset: AtomicRefCell<(u32, u32)>,
}
@@ -84,16 +84,16 @@ impl Actor for FrameActor {
let Some((tx, rx)) = channel() else {
return Err(ActorError::Internal);
};
let source = registry.find::<SourceActor>(&self.source_actor);
source
let source_actor = registry.find::<SourceActor>(&self.source_name);
source_actor
.script_sender
.send(DevtoolScriptControlMsg::GetEnvironment(self.name(), tx))
.map_err(|_| ActorError::Internal)?;
let environment = rx.recv().map_err(|_| ActorError::Internal)?;
let environment_name = rx.recv().map_err(|_| ActorError::Internal)?;
let msg = FrameEnvironmentReply {
from: self.name(),
environment: registry.encode::<EnvironmentActor, _>(&environment),
environment: registry.encode::<EnvironmentActor, _>(&environment_name),
};
// This reply has a `type` field but it doesn't need a followup,
// unlike most messages. We need to skip the validity check.
@@ -109,16 +109,16 @@ impl Actor for FrameActor {
impl FrameActor {
pub fn register(
registry: &ActorRegistry,
source_actor: String,
source_name: String,
frame_result: FrameInfo,
) -> String {
let object_actor = ObjectActor::register(registry, None, "Object".to_owned());
let object_name = ObjectActor::register(registry, None, "Object".to_owned(), None);
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
object_actor,
source_actor,
object_actor: object_name,
source_name,
frame_result,
current_offset: Default::default(),
};
@@ -153,7 +153,7 @@ impl ActorEncode<FrameActorMsg> for FrameActor {
oldest: self.frame_result.oldest,
state,
where_: FrameWhere {
actor: self.source_actor.clone(),
actor: self.source_name.clone(),
line,
column,
},

View File

@@ -4,7 +4,6 @@
//! Liberally derived from the [Firefox JS implementation](http://mxr.mozilla.org/mozilla-central/source/toolkit/devtools/server/actors/inspector.js).
use atomic_refcell::AtomicRefCell;
use malloc_size_of_derive::MallocSizeOf;
use serde::Serialize;
use serde_json::{self, Map, Value};
@@ -55,7 +54,7 @@ pub(crate) struct InspectorActor {
name: String,
highlighter_name: String,
page_style_name: String,
pub(crate) walker: String,
pub(crate) walker_name: String,
}
impl Actor for InspectorActor {
@@ -91,7 +90,7 @@ impl Actor for InspectorActor {
"getWalker" => {
let msg = GetWalkerReply {
from: self.name(),
walker: registry.encode::<WalkerActor, _>(&self.walker),
walker: registry.encode::<WalkerActor, _>(&self.walker_name),
};
request.reply_final(&msg)?
},
@@ -112,32 +111,20 @@ impl Actor for InspectorActor {
impl InspectorActor {
pub fn register(registry: &ActorRegistry, browsing_context_name: String) -> String {
let highlighter_actor = HighlighterActor {
name: registry.new_name::<HighlighterActor>(),
browsing_context_name: browsing_context_name.clone(),
};
let highlighter_name = HighlighterActor::register(registry, browsing_context_name.clone());
let page_style_actor = PageStyleActor {
name: registry.new_name::<PageStyleActor>(),
};
let page_style_name = PageStyleActor::register(registry);
let walker = WalkerActor {
name: registry.new_name::<WalkerActor>(),
mutations: AtomicRefCell::new(vec![]),
browsing_context_name,
};
let walker_name = WalkerActor::register(registry, browsing_context_name);
let inspector_actor = Self {
name: registry.new_name::<InspectorActor>(),
highlighter_name: highlighter_actor.name(),
page_style_name: page_style_actor.name(),
walker: walker.name(),
highlighter_name,
page_style_name,
walker_name,
};
let inspector_name = inspector_actor.name();
registry.register(highlighter_actor);
registry.register(page_style_actor);
registry.register(walker);
registry.register(inspector_actor);
inspector_name

View File

@@ -112,10 +112,7 @@ impl Actor for AccessibilityActor {
},
"getWalker" => {
// TODO: Create actual accessible walker
let actor = registry.new_name::<AccessibleWalkerActor>();
registry.register(AccessibleWalkerActor {
name: actor.clone(),
});
let actor = AccessibleWalkerActor::register(registry);
let msg = GetWalkerReply {
from: self.name(),
walker: ActorMsg { actor },
@@ -129,8 +126,11 @@ impl Actor for AccessibilityActor {
}
impl AccessibilityActor {
pub fn new(name: String) -> Self {
Self { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
}
@@ -150,6 +150,15 @@ pub(crate) struct AccessibleWalkerActor {
name: String,
}
impl AccessibleWalkerActor {
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
}
impl Actor for AccessibleWalkerActor {
fn name(&self) -> String {
self.name.clone()

View File

@@ -57,7 +57,16 @@ impl Actor for CssPropertiesActor {
}
impl CssPropertiesActor {
pub fn new(name: String, properties: HashMap<String, CssDatabaseProperty>) -> Self {
Self { name, properties }
pub fn register(
registry: &ActorRegistry,
properties: HashMap<String, CssDatabaseProperty>,
) -> String {
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
properties,
};
registry.register::<Self>(actor);
name
}
}

View File

@@ -97,12 +97,22 @@ impl Actor for HighlighterActor {
}
impl HighlighterActor {
pub fn register(registry: &ActorRegistry, browsing_context_name: String) -> String {
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
browsing_context_name,
};
registry.register::<Self>(actor);
name
}
fn instruct_script_thread_to_highlight_node(
&self,
node_actor: Option<String>,
node_name: Option<String>,
registry: &ActorRegistry,
) {
let node_id = node_actor.map(|node_actor| registry.actor_to_script(node_actor));
let node_id = node_name.map(|node_name| registry.actor_to_script(node_name));
let browsing_context_actor =
registry.find::<BrowsingContextActor>(&self.browsing_context_name);
browsing_context_actor

View File

@@ -74,8 +74,11 @@ impl Actor for LayoutInspectorActor {
}
impl LayoutInspectorActor {
pub fn new(name: String) -> Self {
Self { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
}

View File

@@ -9,7 +9,8 @@ use std::collections::HashMap;
use atomic_refcell::AtomicRefCell;
use devtools_traits::{
AttrModification, DevtoolScriptControlMsg, EventListenerInfo, NodeInfo, ShadowRootMode,
AttrModification, DevtoolScriptControlMsg, EventListenerInfo, MatchedRule, NodeInfo,
ShadowRootMode,
};
use malloc_size_of_derive::MallocSizeOf;
use serde::Serialize;
@@ -132,7 +133,7 @@ pub(crate) struct NodeActor {
pub script_chan: GenericSender<DevtoolScriptControlMsg>,
pub pipeline: PipelineId,
pub walker: String,
pub style_rules: AtomicRefCell<HashMap<(String, usize), String>>,
pub style_rules: AtomicRefCell<HashMap<MatchedRule, String>>,
}
impl Actor for NodeActor {
@@ -257,6 +258,30 @@ impl Actor for NodeActor {
}
}
impl NodeActor {
pub fn register(
registry: &ActorRegistry,
script_id: String,
script_chan: GenericSender<DevtoolScriptControlMsg>,
pipeline: PipelineId,
walker: String,
) -> String {
let name = registry.new_name::<Self>();
registry.register_script_actor(script_id, name.clone());
let actor = Self {
name: name.clone(),
script_chan,
pipeline,
walker,
style_rules: AtomicRefCell::new(HashMap::new()),
};
registry.register(actor);
name
}
}
pub trait NodeInfoToProtocol {
fn encode(
self,
@@ -277,18 +302,13 @@ impl NodeInfoToProtocol for NodeInfo {
) -> NodeActorMsg {
let get_or_register_node_actor = |id: &str| {
if !registry.script_actor_registered(id.to_string()) {
let name = registry.new_name::<NodeActor>();
registry.register_script_actor(id.to_string(), name.clone());
let node_actor = NodeActor {
name: name.clone(),
script_chan: script_chan.clone(),
NodeActor::register(
registry,
id.to_string(),
script_chan.clone(),
pipeline,
walker: walker.clone(),
style_rules: AtomicRefCell::new(HashMap::new()),
};
registry.register(node_actor);
name
walker.clone(),
)
} else {
registry.script_to_actor(id.to_string())
}

View File

@@ -6,11 +6,10 @@
//! properties applied, including the attributes and layout of each element.
use std::collections::HashMap;
use std::collections::hash_map::Entry;
use std::iter::once;
use devtools_traits::DevtoolScriptControlMsg::{GetLayout, GetSelectors};
use devtools_traits::{AutoMargins, ComputedNodeLayout};
use devtools_traits::{AutoMargins, ComputedNodeLayout, MatchedRule};
use malloc_size_of_derive::MallocSizeOf;
use serde::Serialize;
use serde_json::{self, Map, Value};
@@ -130,33 +129,40 @@ impl Actor for PageStyleActor {
}
impl PageStyleActor {
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
fn get_applied(
&self,
request: ClientRequest,
msg: &Map<String, Value>,
registry: &ActorRegistry,
) -> Result<(), ActorError> {
let target = msg
let node_name = msg
.get("node")
.ok_or(ActorError::MissingParameter)?
.as_str()
.ok_or(ActorError::BadParameterType)?;
let node = registry.find::<NodeActor>(target);
let walker = registry.find::<WalkerActor>(&node.walker);
let node_actor = registry.find::<NodeActor>(node_name);
let walker = registry.find::<WalkerActor>(&node_actor.walker);
let browsing_context_actor = walker.browsing_context_actor(registry);
let entries: Vec<_> = find_child(
&node.script_chan,
node.pipeline,
target,
&node_actor.script_chan,
node_actor.pipeline,
node_name,
registry,
&walker.root(registry)?.actor,
vec![],
|msg| msg.actor == target,
|msg| msg.actor == node_name,
)
.unwrap_or_default()
.into_iter()
.flat_map(|node| {
let inherited = (node.actor != target).then(|| node.actor.clone());
let inherited = (node.actor != node_name).then(|| node.actor.clone());
let node_actor = registry.find::<NodeActor>(&node.actor);
// Get the css selectors that match this node present in the currently active stylesheets.
@@ -177,28 +183,32 @@ impl PageStyleActor {
// For each selector (plus an empty one that represents the style attribute)
// get all of the rules associated with it.
once(("".into(), usize::MAX))
.chain(selectors)
.filter_map(move |selector| {
let rule = match node_actor.style_rules.borrow_mut().entry(selector) {
Entry::Vacant(e) => {
let name = registry.new_name::<StyleRuleActor>();
let actor = StyleRuleActor::new(
name.clone(),
node_actor.name(),
(!e.key().0.is_empty()).then_some(e.key().clone()),
);
let rule = actor.applied(registry)?;
let style_attribute_rule = MatchedRule {
selector: "".into(),
stylesheet_index: usize::MAX,
block_id: 0,
ancestor_data: vec![],
};
registry.register(actor);
e.insert(name);
rule
},
Entry::Occupied(e) => {
let actor = registry.find::<StyleRuleActor>(e.get());
actor.applied(registry)?
},
};
once(style_attribute_rule)
.chain(selectors)
.filter_map(move |matched_rule| {
let style_rule_name = node_actor
.style_rules
.borrow_mut()
.entry(matched_rule.clone())
.or_insert_with(|| {
StyleRuleActor::register(
registry,
node_actor.name(),
(matched_rule.stylesheet_index != usize::MAX)
.then_some(matched_rule.clone()),
)
})
.clone();
let actor = registry.find::<StyleRuleActor>(&style_rule_name);
let rule = actor.applied(registry)?;
if inherited.is_some() && rule.declarations.is_empty() {
return None;
}
@@ -226,33 +236,33 @@ impl PageStyleActor {
msg: &Map<String, Value>,
registry: &ActorRegistry,
) -> Result<(), ActorError> {
let target = msg
let node_name = msg
.get("node")
.ok_or(ActorError::MissingParameter)?
.as_str()
.ok_or(ActorError::BadParameterType)?;
let node_actor = registry.find::<NodeActor>(target);
let computed = (|| match node_actor
.style_rules
.borrow_mut()
.entry(("".into(), usize::MAX))
{
Entry::Vacant(e) => {
let name = registry.new_name::<StyleRuleActor>();
let actor = StyleRuleActor::new(name.clone(), target.into(), None);
let computed = actor.computed(registry)?;
registry.register(actor);
e.insert(name);
Some(computed)
},
Entry::Occupied(e) => {
let actor = registry.find::<StyleRuleActor>(e.get());
Some(actor.computed(registry)?)
},
})()
.unwrap_or_default();
let node_actor = registry.find::<NodeActor>(node_name);
let style_attribute_rule = devtools_traits::MatchedRule {
selector: "".into(),
stylesheet_index: usize::MAX,
block_id: 0,
ancestor_data: vec![],
};
let computed = {
let style_rule_name = node_actor
.style_rules
.borrow_mut()
.entry(style_attribute_rule)
.or_insert_with(|| StyleRuleActor::register(registry, node_name.into(), None))
.clone();
let actor = registry.find::<StyleRuleActor>(&style_rule_name);
actor.computed(registry)
};
let msg = GetComputedReply {
computed,
computed: computed.unwrap_or_default(),
from: self.name(),
};
request.reply_final(&msg)
@@ -264,20 +274,20 @@ impl PageStyleActor {
msg: &Map<String, Value>,
registry: &ActorRegistry,
) -> Result<(), ActorError> {
let target = msg
let node_name = msg
.get("node")
.ok_or(ActorError::MissingParameter)?
.as_str()
.ok_or(ActorError::BadParameterType)?;
let node = registry.find::<NodeActor>(target);
let walker = registry.find::<WalkerActor>(&node.walker);
let node_actor = registry.find::<NodeActor>(node_name);
let walker = registry.find::<WalkerActor>(&node_actor.walker);
let browsing_context_actor = walker.browsing_context_actor(registry);
let (tx, rx) = generic_channel::channel().ok_or(ActorError::Internal)?;
browsing_context_actor
.script_chan()
.send(GetLayout(
browsing_context_actor.pipeline_id(),
registry.actor_to_script(target.to_owned()),
registry.actor_to_script(node_name.to_owned()),
tx,
))
.map_err(|_| ActorError::Internal)?;
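The `get_applied` and `get_computed` hunks above replace the manual `Entry::Vacant`/`Entry::Occupied` match with `entry(...).or_insert_with(...)`, caching one `StyleRuleActor` name per `MatchedRule`. The caching shape in isolation, with hypothetical string keys standing in for `MatchedRule`:

```rust
use std::cell::RefCell;
use std::collections::HashMap;

// Lazily create-and-cache: `make` runs only on the first lookup of `key`,
// mirroring how StyleRuleActor::register is invoked at most once per rule.
fn get_or_register(
    cache: &RefCell<HashMap<String, String>>,
    key: &str,
    make: impl FnOnce() -> String,
) -> String {
    cache
        .borrow_mut()
        .entry(key.to_owned())
        .or_insert_with(make)
        .clone()
}
```

The `clone()` at the end releases the `borrow_mut()` before the name is used, which matters in the original code because `registry.find` re-borrows the registry.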

View File

@@ -11,6 +11,7 @@ use std::collections::HashMap;
use devtools_traits::DevtoolScriptControlMsg::{
GetAttributeStyle, GetComputedStyle, GetDocumentElement, GetStylesheetStyle, ModifyRule,
};
use devtools_traits::{AncestorData, MatchedRule};
use malloc_size_of_derive::MallocSizeOf;
use serde::Serialize;
use serde_json::{Map, Value};
@@ -28,7 +29,7 @@ const ELEMENT_STYLE_TYPE: u32 = 100;
#[serde(rename_all = "camelCase")]
pub(crate) struct AppliedRule {
actor: String,
ancestor_data: Vec<()>,
ancestor_data: Vec<AncestorData>,
authored_text: String,
css_text: String,
pub declarations: Vec<AppliedDeclaration>,
@@ -82,8 +83,8 @@ pub(crate) struct StyleRuleActorMsg {
#[derive(MallocSizeOf)]
pub(crate) struct StyleRuleActor {
name: String,
node: String,
selector: Option<(String, usize)>,
node_name: String,
selector: Option<MatchedRule>,
}
impl Actor for StyleRuleActor {
@@ -121,14 +122,14 @@ impl Actor for StyleRuleActor {
.collect();
// Query the rule modification
let node = registry.find::<NodeActor>(&self.node);
let walker = registry.find::<WalkerActor>(&node.walker);
let node_actor = registry.find::<NodeActor>(&self.node_name);
let walker = registry.find::<WalkerActor>(&node_actor.walker);
let browsing_context_actor = walker.browsing_context_actor(registry);
browsing_context_actor
.script_chan()
.send(ModifyRule(
browsing_context_actor.pipeline_id(),
registry.actor_to_script(self.node.clone()),
registry.actor_to_script(self.node_name.clone()),
modifications,
))
.map_err(|_| ActorError::Internal)?;
@@ -142,17 +143,24 @@ impl Actor for StyleRuleActor {
}
impl StyleRuleActor {
pub fn new(name: String, node: String, selector: Option<(String, usize)>) -> Self {
Self {
name,
node,
pub fn register(
registry: &ActorRegistry,
node: String,
selector: Option<MatchedRule>,
) -> String {
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
node_name: node,
selector,
}
};
registry.register::<Self>(actor);
name
}
pub fn applied(&self, registry: &ActorRegistry) -> Option<AppliedRule> {
let node = registry.find::<NodeActor>(&self.node);
let walker = registry.find::<WalkerActor>(&node.walker);
let node_actor = registry.find::<NodeActor>(&self.node_name);
let walker = registry.find::<WalkerActor>(&node_actor.walker);
let browsing_context_actor = walker.browsing_context_actor(registry);
let (document_sender, document_receiver) = generic_channel::channel()?;
@@ -169,19 +177,15 @@ impl StyleRuleActor {
// not, this represents the style attribute.
let (style_sender, style_receiver) = generic_channel::channel()?;
let req = match &self.selector {
Some(selector) => {
let (selector, stylesheet) = selector.clone();
GetStylesheetStyle(
browsing_context_actor.pipeline_id(),
registry.actor_to_script(self.node.clone()),
selector,
stylesheet,
style_sender,
)
},
Some(matched_rule) => GetStylesheetStyle(
browsing_context_actor.pipeline_id(),
registry.actor_to_script(self.node_name.clone()),
matched_rule.clone(),
style_sender,
),
None => GetAttributeStyle(
browsing_context_actor.pipeline_id(),
registry.actor_to_script(self.node.clone()),
registry.actor_to_script(self.node_name.clone()),
style_sender,
),
};
@@ -190,7 +194,11 @@ impl StyleRuleActor {
Some(AppliedRule {
actor: self.name(),
ancestor_data: vec![], // TODO: Fill with hierarchy
ancestor_data: self
.selector
.as_ref()
.map(|r| r.ancestor_data.clone())
.unwrap_or_default(),
authored_text: "".into(),
css_text: "".into(), // TODO: Specify the css text
declarations: style
@@ -210,7 +218,7 @@ impl StyleRuleActor {
})
.collect(),
href: node.base_uri,
selectors: self.selector.iter().map(|(s, _)| s).cloned().collect(),
selectors: self.selector.iter().map(|r| r.selector.clone()).collect(),
selectors_specificity: self.selector.iter().map(|_| 1).collect(),
type_: ELEMENT_STYLE_TYPE,
traits: StyleRuleActorTraits {
@@ -223,8 +231,8 @@ impl StyleRuleActor {
&self,
registry: &ActorRegistry,
) -> Option<HashMap<String, ComputedDeclaration>> {
let node = registry.find::<NodeActor>(&self.node);
let walker = registry.find::<WalkerActor>(&node.walker);
let node_actor = registry.find::<NodeActor>(&self.node_name);
let walker = registry.find::<WalkerActor>(&node_actor.walker);
let browsing_context_actor = walker.browsing_context_actor(registry);
let (style_sender, style_receiver) = generic_channel::channel()?;
@@ -232,7 +240,7 @@ impl StyleRuleActor {
.script_chan()
.send(GetComputedStyle(
browsing_context_actor.pipeline_id(),
registry.actor_to_script(self.node.clone()),
registry.actor_to_script(self.node_name.clone()),
style_sender,
))
.ok()?;


@@ -213,14 +213,14 @@ impl Actor for WalkerActor {
},
"getLayoutInspector" => {
// TODO: Create actual layout inspector actor
-                let layout_inspector_actor =
-                    LayoutInspectorActor::new(registry.new_name::<LayoutInspectorActor>());
+                let layout_inspector_name = LayoutInspectorActor::register(registry);
+                let layout_inspector_actor =
+                    registry.find::<LayoutInspectorActor>(&layout_inspector_name);
                 let msg = GetLayoutInspectorReply {
                     from: self.name(),
                     actor: layout_inspector_actor.encode(registry),
                 };
-                registry.register(layout_inspector_actor);
request.reply_final(&msg)?
},
"getMutations" => self.handle_get_mutations(request, registry)?,
@@ -237,7 +237,7 @@ impl Actor for WalkerActor {
.ok_or(ActorError::MissingParameter)?
.as_str()
.ok_or(ActorError::BadParameterType)?;
let node = msg
let node_name = msg
.get("node")
.ok_or(ActorError::MissingParameter)?
.as_str()
@@ -247,7 +247,7 @@ impl Actor for WalkerActor {
browsing_context_actor.pipeline_id(),
&self.name,
registry,
node,
node_name,
vec![],
|msg| msg.display_name == selector,
)
@@ -280,6 +280,17 @@ impl Actor for WalkerActor {
}
impl WalkerActor {
pub fn register(registry: &ActorRegistry, browsing_context_name: String) -> String {
let name = registry.new_name::<WalkerActor>();
let actor = WalkerActor {
name: name.clone(),
mutations: AtomicRefCell::new(vec![]),
browsing_context_name,
};
registry.register::<Self>(actor);
name
}
pub(crate) fn browsing_context_actor(
&self,
registry: &ActorRegistry,
@@ -378,7 +389,7 @@ pub fn find_child(
pipeline: PipelineId,
name: &str,
registry: &ActorRegistry,
node: &str,
node_name: &str,
mut hierarchy: Vec<NodeActorMsg>,
compare_fn: impl Fn(&NodeActorMsg) -> bool + Clone,
) -> Result<Vec<NodeActorMsg>, Vec<NodeActorMsg>> {
@@ -386,7 +397,7 @@ pub fn find_child(
script_chan
.send(GetChildren(
pipeline,
registry.actor_to_script(node.into()),
registry.actor_to_script(node_name.into()),
tx,
))
.unwrap();
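The `registry.actor_to_script(...)` calls threaded through these hunks translate devtools actor names into the ids the script thread understands. A hypothetical sketch of such a two-way mapping (the struct and method names below are illustrative assumptions, not Servo's actual `ActorRegistry` internals):

```rust
use std::collections::HashMap;

// Hypothetical two-way mapping between devtools actor names and
// script-side ids; the real ActorRegistry's internals differ.
#[derive(Default)]
struct ScriptActorMap {
    actor_to_script: HashMap<String, String>,
    script_to_actor: HashMap<String, String>,
}

impl ScriptActorMap {
    // Record both directions so either side can be resolved in O(1).
    fn register(&mut self, script_id: String, actor_name: String) {
        self.actor_to_script.insert(actor_name.clone(), script_id.clone());
        self.script_to_actor.insert(script_id, actor_name);
    }

    fn actor_to_script(&self, actor_name: &str) -> Option<&String> {
        self.actor_to_script.get(actor_name)
    }

    fn script_to_actor(&self, script_id: &str) -> Option<&String> {
        self.script_to_actor.get(script_id)
    }
}

fn main() {
    let mut map = ScriptActorMap::default();
    map.register("node-uuid-1".into(), "node12".into());
    assert_eq!(map.actor_to_script("node12"), Some(&"node-uuid-1".to_string()));
    assert_eq!(map.script_to_actor("node-uuid-1"), Some(&"node12".to_string()));
}
```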


@@ -73,9 +73,14 @@ impl Actor for LongStringActor {
}
impl LongStringActor {
pub fn new(registry: &ActorRegistry, full_string: String) -> Self {
pub fn register(registry: &ActorRegistry, full_string: String) -> String {
let name = registry.new_name::<Self>();
LongStringActor { name, full_string }
let actor = Self {
name: name.clone(),
full_string,
};
registry.register::<Self>(actor);
name
}
pub fn long_string_obj(&self) -> LongStringObj {


@@ -468,9 +468,10 @@ impl Actor for NetworkEventActor {
let (encoding, text) = if mime_type.is_some() {
// Queue a LongStringActor for this body
let body_string = String::from_utf8_lossy(body).to_string();
let long_string = LongStringActor::new(registry, body_string);
let value = long_string.long_string_obj();
registry.register(long_string);
let long_string_name = LongStringActor::register(registry, body_string);
let value = registry
.find::<LongStringActor>(&long_string_name)
.long_string_obj();
(None, serde_json::to_value(value).unwrap())
} else {
let b64 = STANDARD.encode(&body.0);
@@ -535,13 +536,16 @@ impl Actor for NetworkEventActor {
}
impl NetworkEventActor {
pub fn new(name: String, resource_id: u64, watcher_name: String) -> NetworkEventActor {
NetworkEventActor {
name,
pub fn register(registry: &ActorRegistry, resource_id: u64, watcher_name: String) -> String {
let name = registry.new_name::<Self>();
let actor = NetworkEventActor {
name: name.clone(),
resource_id,
watcher_name,
..Default::default()
}
};
registry.register::<Self>(actor);
name
}
pub fn add_request(&self, request: HttpRequest) {


@@ -2,7 +2,7 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use devtools_traits::{DebuggerValue, PropertyDescriptor};
use devtools_traits::{DebuggerValue, ObjectPreview, PropertyDescriptor};
use malloc_size_of_derive::MallocSizeOf;
use serde::Serialize;
use serde_json::{Map, Number, Value};
@@ -12,12 +12,6 @@ use crate::actor::{Actor, ActorEncode, ActorError, ActorRegistry};
use crate::actors::property_iterator::PropertyIteratorActor;
use crate::protocol::ClientRequest;
#[derive(Serialize)]
pub(crate) struct ObjectPreview {
kind: String,
url: String,
}
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
enum EnumIteratorType {
@@ -42,7 +36,7 @@ struct EnumReply {
#[derive(Serialize)]
struct PrototypeReply {
from: String,
prototype: ObjectActorMsg,
prototype: Value,
}
#[derive(Serialize)]
@@ -57,7 +51,8 @@ pub(crate) struct ObjectActorMsg {
frozen: bool,
sealed: bool,
is_error: bool,
preview: ObjectPreview,
#[serde(skip_serializing_if = "Option::is_none")]
preview: Option<ObjectPreview>,
}
#[derive(Serialize)]
@@ -68,52 +63,62 @@ pub(crate) struct ObjectPropertyDescriptor {
pub value: Value,
}
-impl From<&PropertyDescriptor> for ObjectPropertyDescriptor {
-    fn from(prop: &PropertyDescriptor) -> Self {
+impl ObjectPropertyDescriptor {
+    pub(crate) fn from_property_descriptor(
+        registry: &ActorRegistry,
+        prop: &PropertyDescriptor,
+    ) -> Self {
         Self {
             configurable: prop.configurable,
             enumerable: prop.enumerable,
             writable: prop.writable,
-            value: debugger_value_to_json(&prop.value, &prop.name),
+            value: debugger_value_to_json(registry, prop.value.clone()),
         }
     }
 }
 /// <https://searchfox.org/mozilla-central/source/devtools/server/actors/object/utils.js#148>
-fn debugger_value_to_json(value: &DebuggerValue, name: &str) -> Value {
+pub(crate) fn debugger_value_to_json(registry: &ActorRegistry, value: DebuggerValue) -> Value {
+    let mut v = Map::new();
     match value {
         DebuggerValue::VoidValue => {
-            let mut v = Map::new();
             v.insert("type".to_owned(), Value::String("undefined".to_owned()));
             Value::Object(v)
         },
-        DebuggerValue::NullValue => Value::Null,
-        DebuggerValue::BooleanValue(boolean) => Value::Bool(*boolean),
-        DebuggerValue::NumberValue(num) => {
-            if num.is_nan() {
-                let mut v = Map::new();
+        DebuggerValue::NullValue => {
+            v.insert("type".to_owned(), Value::String("null".to_owned()));
+            Value::Object(v)
+        },
+        DebuggerValue::BooleanValue(boolean) => Value::Bool(boolean),
+        DebuggerValue::NumberValue(val) => {
+            if val.is_nan() {
                 v.insert("type".to_owned(), Value::String("NaN".to_owned()));
                 Value::Object(v)
-            } else if num.is_infinite() {
-                let mut v = Map::new();
-                let type_str = if num.is_sign_positive() {
-                    "Infinity"
+            } else if val.is_infinite() {
+                if val < 0. {
+                    v.insert("type".to_owned(), Value::String("-Infinity".to_owned()));
                 } else {
-                    "-Infinity"
-                };
-                v.insert("type".to_owned(), Value::String(type_str.to_owned()));
+                    v.insert("type".to_owned(), Value::String("Infinity".to_owned()));
+                }
+                Value::Object(v)
+            } else if val == 0. && val.is_sign_negative() {
+                v.insert("type".to_owned(), Value::String("-0".to_owned()));
                 Value::Object(v)
             } else {
-                Value::Number(Number::from_f64(*num).unwrap_or(Number::from(0)))
+                Value::Number(Number::from_f64(val).unwrap())
             }
         },
-        DebuggerValue::StringValue(str) => Value::String(str.clone()),
-        DebuggerValue::ObjectValue { class, .. } => {
-            let mut v = Map::new();
-            v.insert("type".to_owned(), Value::String("object".to_owned()));
-            v.insert("class".to_owned(), Value::String(class.clone()));
-            v.insert("name".to_owned(), Value::String(name.into()));
-            Value::Object(v)
+        DebuggerValue::StringValue(str) => Value::String(str),
+        DebuggerValue::ObjectValue {
+            uuid,
+            class,
+            preview,
+            ..
+        } => {
+            let object_name = ObjectActor::register(registry, Some(uuid), class, preview);
+            let object_msg = registry.encode::<ObjectActor, _>(&object_name);
+            let value = serde_json::to_value(object_msg).unwrap_or_default();
+            Value::Object(value.as_object().cloned().unwrap_or_default())
         },
     }
 }
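The rewritten `NumberValue` arm encodes the JavaScript number values that plain JSON cannot carry (NaN, positive and negative Infinity, and negative zero) as `{ "type": "..." }` objects, which is why the final `Number::from_f64(val).unwrap()` can no longer fail. A stdlib-only sketch of that classification (the enum below is illustrative, not Servo's type):

```rust
/// Illustrative stand-in for the JSON shapes produced for JS numbers.
#[derive(Debug, PartialEq)]
enum NumberJson {
    /// Serialized as `{ "type": "<name>" }` in the protocol.
    Special(&'static str),
    /// Serialized as a plain JSON number.
    Finite(f64),
}

fn classify_number(val: f64) -> NumberJson {
    if val.is_nan() {
        NumberJson::Special("NaN")
    } else if val.is_infinite() {
        if val < 0.0 {
            NumberJson::Special("-Infinity")
        } else {
            NumberJson::Special("Infinity")
        }
    } else if val == 0.0 && val.is_sign_negative() {
        // `-0.0 == 0.0` in IEEE 754, so the sign bit must be checked explicitly.
        NumberJson::Special("-0")
    } else {
        // Every non-finite case was handled above, so a JSON number is always valid here.
        NumberJson::Finite(val)
    }
}

fn main() {
    assert_eq!(classify_number(f64::NAN), NumberJson::Special("NaN"));
    assert_eq!(classify_number(f64::NEG_INFINITY), NumberJson::Special("-Infinity"));
    assert_eq!(classify_number(-0.0), NumberJson::Special("-0"));
    assert_eq!(classify_number(2.5), NumberJson::Finite(2.5));
}
```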
@@ -123,7 +128,7 @@ pub(crate) struct ObjectActor {
name: String,
_uuid: Option<String>,
class: String,
properties: Vec<PropertyDescriptor>,
preview: Option<ObjectPreview>,
}
impl Actor for ObjectActor {
@@ -142,11 +147,15 @@ impl Actor for ObjectActor {
) -> Result<(), ActorError> {
match msg_type {
"enumProperties" => {
-                let property_iterator_name =
-                    PropertyIteratorActor::register(registry, self.properties.clone());
-                let property_iterator =
+                let properties = self
+                    .preview
+                    .as_ref()
+                    .and_then(|preview| preview.own_properties.clone())
+                    .unwrap_or_default();
+                let property_iterator_name = PropertyIteratorActor::register(registry, properties);
+                let property_iterator_actor =
                     registry.find::<PropertyIteratorActor>(&property_iterator_name);
-                let count = property_iterator.count();
+                let count = property_iterator_actor.count();
let msg = EnumReply {
from: self.name(),
iterator: EnumIterator {
@@ -160,18 +169,18 @@ impl Actor for ObjectActor {
},
"enumSymbols" => {
let symbol_iterator = SymbolIteratorActor {
let symbol_iterator_actor = SymbolIteratorActor {
name: registry.new_name::<SymbolIteratorActor>(),
};
let msg = EnumReply {
from: self.name(),
iterator: EnumIterator {
actor: symbol_iterator.name(),
actor: symbol_iterator_actor.name(),
type_: EnumIteratorType::SymbolIterator,
count: 0,
},
};
registry.register(symbol_iterator);
registry.register(symbol_iterator_actor);
request.reply_final(&msg)?
},
@@ -190,15 +199,11 @@ impl Actor for ObjectActor {
}
impl ObjectActor {
pub fn register(registry: &ActorRegistry, uuid: Option<String>, class: String) -> String {
Self::register_with_properties(registry, uuid, class, Vec::new())
}
pub fn register_with_properties(
pub fn register(
registry: &ActorRegistry,
uuid: Option<String>,
class: String,
properties: Vec<PropertyDescriptor>,
preview: Option<ObjectPreview>,
) -> String {
let Some(uuid) = uuid else {
let name = registry.new_name::<Self>();
@@ -206,7 +211,7 @@ impl ObjectActor {
name: name.clone(),
_uuid: None,
class,
properties,
preview,
};
registry.register(actor);
return name;
@@ -217,7 +222,7 @@ impl ObjectActor {
name: name.clone(),
_uuid: Some(uuid.clone()),
class,
properties,
preview,
};
registry.register_script_actor(uuid, name.clone());
@@ -230,22 +235,80 @@ impl ObjectActor {
}
}
impl ActorEncode<ObjectActorMsg> for ObjectActor {
fn encode(&self, _: &ActorRegistry) -> ObjectActorMsg {
ObjectActorMsg {
actor: self.name(),
type_: "object".into(),
class: self.class.clone(),
own_property_length: self.properties.len() as i32,
extensible: true,
frozen: false,
sealed: false,
is_error: false,
preview: ObjectPreview {
kind: "ObjectWithURL".into(),
url: "".into(), // TODO: Use the correct url
},
impl ActorEncode<Value> for ObjectActor {
fn encode(&self, registry: &ActorRegistry) -> Value {
// TODO: convert to a serialize struct instead
let mut m = Map::new();
m.insert("type".to_owned(), Value::String("object".to_owned()));
m.insert("class".to_owned(), Value::String(self.class.clone()));
m.insert("actor".to_owned(), Value::String(self.name()));
m.insert("extensible".to_owned(), Value::Bool(true));
m.insert("frozen".to_owned(), Value::Bool(false));
m.insert("sealed".to_owned(), Value::Bool(false));
// Build preview
// <https://searchfox.org/firefox-main/source/devtools/server/actors/object/previewers.js#849>
let Some(preview) = self.preview.clone() else {
return Value::Object(m);
};
let mut preview_map = Map::new();
if preview.kind == "ArrayLike" {
if let Some(length) = preview.array_length {
preview_map.insert("length".to_owned(), Value::Number(length.into()));
}
} else {
if let Some(ref props) = preview.own_properties {
let mut own_props_map = Map::new();
for prop in props {
let descriptor = serde_json::to_value(
ObjectPropertyDescriptor::from_property_descriptor(registry, prop),
)
.unwrap();
own_props_map.insert(prop.name.clone(), descriptor);
}
preview_map.insert("ownProperties".to_owned(), Value::Object(own_props_map));
}
if let Some(length) = preview.own_properties_length {
preview_map.insert(
"ownPropertiesLength".to_owned(),
Value::Number(length.into()),
);
m.insert("ownPropertyLength".to_owned(), Value::Number(length.into()));
}
}
preview_map.insert("kind".to_owned(), Value::String(preview.kind));
// Function-specific metadata
if let Some(function) = preview.function {
if let Some(name) = function.name {
m.insert("name".to_owned(), Value::String(name));
}
if let Some(display_name) = function.display_name {
m.insert("displayName".to_owned(), Value::String(display_name));
}
m.insert(
"parameterNames".to_owned(),
Value::Array(
function
.parameter_names
.into_iter()
.map(Value::String)
.collect(),
),
);
if let Some(is_async) = function.is_async {
m.insert("isAsync".to_owned(), Value::Bool(is_async));
}
if let Some(is_generator) = function.is_generator {
m.insert("isGenerator".to_owned(), Value::Bool(is_generator));
}
}
m.insert("preview".to_owned(), Value::Object(preview_map));
Value::Object(m)
}
}


@@ -4,13 +4,13 @@
use malloc_size_of_derive::MallocSizeOf;
use crate::actor::Actor;
use crate::actor::{Actor, ActorRegistry};
/// Referenced by `ThreadActor` when replying to `interrupt` messages.
/// <https://searchfox.org/firefox-main/source/devtools/server/actors/thread.js#1699>
#[derive(MallocSizeOf)]
pub(crate) struct PauseActor {
pub name: String,
name: String,
}
impl Actor for PauseActor {
@@ -18,3 +18,12 @@ impl Actor for PauseActor {
self.name.clone()
}
}
impl PauseActor {
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
}
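The hunks in this changeset repeat one mechanical refactor: `new(name)` constructors plus a separate `registry.register(actor)` call collapse into a single `register(registry) -> String` that allocates the name, builds the actor, registers it, and returns the name. A minimal, self-contained sketch of the shape (the toy `ActorRegistry` below, with its string `kind` parameter and interior mutability, is an assumption; Servo's real registry differs):

```rust
use std::cell::RefCell;
use std::collections::HashMap;

// Toy stand-ins for the real Actor trait and ActorRegistry. Interior
// mutability mirrors the diff's `register(registry: &ActorRegistry)` taking
// a shared reference.
trait Actor {
    fn name(&self) -> String;
}

#[derive(Default)]
struct ActorRegistry {
    next_id: RefCell<u64>,
    actors: RefCell<HashMap<String, Box<dyn Actor>>>,
}

impl ActorRegistry {
    fn new_name(&self, kind: &str) -> String {
        let mut id = self.next_id.borrow_mut();
        *id += 1;
        format!("{}{}", kind, *id)
    }

    fn register(&self, actor: Box<dyn Actor>) {
        self.actors.borrow_mut().insert(actor.name(), actor);
    }

    fn has(&self, name: &str) -> bool {
        self.actors.borrow().contains_key(name)
    }
}

struct PauseActor {
    name: String,
}

impl Actor for PauseActor {
    fn name(&self) -> String {
        self.name.clone()
    }
}

impl PauseActor {
    // The refactor's shape: allocate a name, build, register, return the name.
    fn register(registry: &ActorRegistry) -> String {
        let name = registry.new_name("pause");
        registry.register(Box::new(PauseActor { name: name.clone() }));
        name
    }
}

fn main() {
    let registry = ActorRegistry::default();
    let name = PauseActor::register(&registry);
    assert!(registry.has(&name));
}
```

Returning the name rather than the actor lets callers look the actor up later via `registry.find`, so no unregistered actor instance ever escapes.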


@@ -98,8 +98,11 @@ impl Actor for PerformanceActor {
}
impl PerformanceActor {
pub fn new(name: String) -> PerformanceActor {
PerformanceActor { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = PerformanceActor { name: name.clone() };
registry.register::<Self>(actor);
name
}
pub fn description() -> ActorDescription {


@@ -68,8 +68,11 @@ impl Actor for ProcessActor {
}
impl ProcessActor {
pub fn new(name: String) -> Self {
Self { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
}


@@ -51,7 +51,7 @@ impl Actor for PropertyIteratorActor {
fn handle_message(
&self,
request: ClientRequest,
_registry: &ActorRegistry,
registry: &ActorRegistry,
msg_type: &str,
msg: &Map<String, Value>,
_id: StreamId,
@@ -66,7 +66,10 @@ impl Actor for PropertyIteratorActor {
let mut own_properties = HashMap::new();
for prop in self.properties.iter().skip(start).take(count) {
own_properties.insert(prop.name.clone(), ObjectPropertyDescriptor::from(prop));
own_properties.insert(
prop.name.clone(),
ObjectPropertyDescriptor::from_property_descriptor(registry, prop),
);
}
let reply = SliceReply {


@@ -47,7 +47,10 @@ impl Actor for ReflowActor {
}
impl ReflowActor {
pub fn new(name: String) -> Self {
Self { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
}


@@ -22,7 +22,7 @@ use crate::actors::performance::PerformanceActor;
use crate::actors::preference::PreferenceActor;
use crate::actors::process::{ProcessActor, ProcessActorMsg};
use crate::actors::tab::{TabDescriptorActor, TabDescriptorActorMsg};
use crate::actors::worker::{WorkerActor, WorkerActorMsg};
use crate::actors::worker::{WorkerTargetActor, WorkerTargetActorMsg};
use crate::protocol::{ActorDescription, ClientRequest};
use crate::{EmptyReplyMsg, StreamId};
@@ -123,7 +123,7 @@ pub(crate) struct ProtocolDescriptionReply {
#[derive(Serialize)]
struct ListWorkersReply {
from: String,
workers: Vec<WorkerActorMsg>,
workers: Vec<WorkerTargetActorMsg>,
}
#[derive(Serialize)]
@@ -248,12 +248,12 @@ impl Actor for RootActor {
.borrow()
.iter()
.map(|worker_name| {
let worker = registry.find::<WorkerActor>(worker_name);
let url = worker.url.to_string();
let worker_actor = registry.find::<WorkerTargetActor>(worker_name);
let url = worker_actor.url.to_string();
// Find correct scope url in the service worker
let scope = url.clone();
ServiceWorkerRegistrationMsg {
actor: worker.name(),
actor: worker_actor.name(),
scope,
url: url.clone(),
registration_state: "".to_string(),
@@ -263,11 +263,11 @@ impl Actor for RootActor {
installing_worker: None,
waiting_worker: None,
active_worker: Some(ServiceWorkerInfo {
actor: worker.name(),
actor: worker_actor.name(),
url,
state: 4, // activated
state_text: "activated".to_string(),
id: worker.worker_id.to_string(),
id: worker_actor.worker_id.to_string(),
fetch: false,
traits: HashMap::new(),
}),
@@ -310,7 +310,7 @@ impl Actor for RootActor {
.workers
.borrow()
.iter()
.map(|worker_name| registry.encode::<WorkerActor, _>(worker_name))
.map(|worker_name| registry.encode::<WorkerTargetActor, _>(worker_name))
.collect(),
};
request.reply_final(&reply)?
@@ -347,27 +347,24 @@ impl RootActor {
/// Registers the root actor and its global actors (those not associated with a specific target).
pub fn register(registry: &mut ActorRegistry) {
// Global actors
let device_actor = DeviceActor::new(registry.new_name::<DeviceActor>());
let perf = PerformanceActor::new(registry.new_name::<PerformanceActor>());
let device_name = DeviceActor::register(registry);
let performance_name = PerformanceActor::register(registry);
let preference_name = PreferenceActor::register(registry);
// Process descriptor
let process_actor = ProcessActor::new(registry.new_name::<ProcessActor>());
let process_name = ProcessActor::register(registry);
// Root actor
let root_actor = Self {
global_actors: GlobalActors {
device_actor: device_actor.name(),
perf_actor: perf.name(),
device_actor: device_name,
perf_actor: performance_name,
preference_actor: preference_name,
},
process_name: process_actor.name(),
process_name,
..Default::default()
};
registry.register(perf);
registry.register(device_actor);
registry.register(process_actor);
registry.register(root_actor);
}


@@ -50,7 +50,10 @@ impl Actor for StyleSheetsActor {
}
impl StyleSheetsActor {
pub fn new(name: String) -> StyleSheetsActor {
StyleSheetsActor { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = StyleSheetsActor { name: name.clone() };
registry.register::<Self>(actor);
name
}
}


@@ -167,19 +167,21 @@ impl Actor for TabDescriptorActor {
}
impl TabDescriptorActor {
pub(crate) fn new(
pub(crate) fn register(
registry: &ActorRegistry,
browsing_context_name: String,
is_top_level_global: bool,
) -> TabDescriptorActor {
) -> String {
let name = registry.new_name::<Self>();
let root_actor = registry.find::<RootActor>("root");
root_actor.tabs.borrow_mut().push(name.clone());
TabDescriptorActor {
name,
let actor = TabDescriptorActor {
name: name.clone(),
browsing_context_name,
is_top_level_global,
}
};
registry.register::<Self>(actor);
name
}
pub(crate) fn is_top_level_global(&self) -> bool {


@@ -116,18 +116,21 @@ pub(crate) struct ThreadActor {
}
impl ThreadActor {
pub fn new(
name: String,
pub fn register(
registry: &ActorRegistry,
script_sender: GenericSender<DevtoolScriptControlMsg>,
browsing_context_name: Option<String>,
) -> ThreadActor {
ThreadActor {
name,
) -> String {
let name = registry.new_name::<Self>();
let actor = ThreadActor {
name: name.clone(),
source_manager: SourceManager::new(),
script_sender,
frames: Default::default(),
browsing_context_name,
}
};
registry.register::<Self>(actor);
name
}
}
@@ -146,10 +149,7 @@ impl Actor for ThreadActor {
) -> Result<(), ActorError> {
match msg_type {
"attach" => {
let pause_name = registry.new_name::<PauseActor>();
registry.register(PauseActor {
name: pause_name.clone(),
});
let pause_name = PauseActor::register(registry);
let msg = ThreadAttached {
from: self.name(),
type_: "paused".to_owned(),


@@ -23,7 +23,7 @@ use servo_url::ServoUrl;
use self::network_parent::NetworkParentActor;
use super::breakpoint::BreakpointListActor;
use super::thread::ThreadActor;
use super::worker::WorkerActorMsg;
use super::worker::WorkerTargetActorMsg;
use crate::actor::{Actor, ActorEncode, ActorError, ActorRegistry};
use crate::actors::browsing_context::{BrowsingContextActor, BrowsingContextActorMsg};
use crate::actors::console::ConsoleActor;
@@ -34,7 +34,7 @@ use crate::actors::watcher::target_configuration::{
use crate::actors::watcher::thread_configuration::ThreadConfigurationActor;
use crate::protocol::{ClientRequest, DevtoolsConnection, JsonPacketStream};
use crate::resource::{ResourceArrayType, ResourceAvailable};
use crate::{ActorMsg, EmptyReplyMsg, IdMap, StreamId, WorkerActor};
use crate::{ActorMsg, EmptyReplyMsg, IdMap, StreamId, WorkerTargetActor};
pub mod network_parent;
pub mod target_configuration;
@@ -110,7 +110,7 @@ pub enum SessionContextType {
#[serde(untagged)]
enum TargetActorMsg {
BrowsingContext(BrowsingContextActorMsg),
Worker(WorkerActorMsg),
Worker(WorkerTargetActorMsg),
}
#[derive(Serialize)]
@@ -184,7 +184,7 @@ pub(crate) struct WatcherActor {
name: String,
pub browsing_context_name: String,
network_parent_name: String,
target_configuration: String,
target_configuration_name: String,
thread_configuration_name: String,
breakpoint_list_name: String,
session_context: SessionContext,
@@ -266,7 +266,7 @@ impl Actor for WatcherActor {
from: self.name(),
type_: "target-available-form".into(),
target: TargetActorMsg::Worker(
registry.encode::<WorkerActor, _>(worker_name),
registry.encode::<WorkerTargetActor, _>(worker_name),
),
};
let _ = request.write_json_packet(&worker_msg);
@@ -277,7 +277,7 @@ impl Actor for WatcherActor {
from: self.name(),
type_: "target-available-form".into(),
target: TargetActorMsg::Worker(
registry.encode::<WorkerActor, _>(worker_name),
registry.encode::<WorkerTargetActor, _>(worker_name),
),
};
let _ = request.write_json_packet(&worker_msg);
@@ -341,7 +341,7 @@ impl Actor for WatcherActor {
);
for worker_name in &*root_actor.workers.borrow() {
let worker_actor = registry.find::<WorkerActor>(worker_name);
let worker_actor = registry.find::<WorkerTargetActor>(worker_name);
let thread_actor =
registry.find::<ThreadActor>(&worker_actor.thread_name);
@@ -365,7 +365,7 @@ impl Actor for WatcherActor {
);
for worker_name in &*root_actor.workers.borrow() {
let worker_actor = registry.find::<WorkerActor>(worker_name);
let worker_actor = registry.find::<WorkerTargetActor>(worker_name);
let console_actor =
registry.find::<ConsoleActor>(&worker_actor.console_name);
@@ -406,7 +406,7 @@ impl Actor for WatcherActor {
let msg = GetTargetConfigurationActorReply {
from: self.name(),
configuration: registry
.encode::<TargetConfigurationActor, _>(&self.target_configuration),
.encode::<TargetConfigurationActor, _>(&self.target_configuration_name),
};
request.reply_final(&msg)?
},
@@ -439,38 +439,31 @@ impl ResourceAvailable for WatcherActor {
}
impl WatcherActor {
pub fn new(
pub fn register(
registry: &ActorRegistry,
browsing_context_name: String,
session_context: SessionContext,
) -> Self {
let network_parent_actor =
NetworkParentActor::new(registry.new_name::<NetworkParentActor>());
let target_configuration =
TargetConfigurationActor::new(registry.new_name::<TargetConfigurationActor>());
let thread_configuration_actor =
ThreadConfigurationActor::new(registry.new_name::<ThreadConfigurationActor>());
let breakpoint_list_actor = BreakpointListActor::new(
registry.new_name::<BreakpointListActor>(),
browsing_context_name.clone(),
);
) -> String {
let network_parent_name = NetworkParentActor::register(registry);
let target_configuration_name = TargetConfigurationActor::register(registry);
let thread_configuration_name = ThreadConfigurationActor::register(registry);
let breakpoint_list_name =
BreakpointListActor::register(registry, browsing_context_name.clone());
let watcher_actor = Self {
name: registry.new_name::<WatcherActor>(),
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
browsing_context_name,
network_parent_name: network_parent_actor.name(),
target_configuration: target_configuration.name(),
thread_configuration_name: thread_configuration_actor.name(),
breakpoint_list_name: breakpoint_list_actor.name(),
network_parent_name,
target_configuration_name,
thread_configuration_name,
breakpoint_list_name,
session_context,
};
registry.register(network_parent_actor);
registry.register(target_configuration);
registry.register(thread_configuration_actor);
registry.register(breakpoint_list_actor);
registry.register::<Self>(actor);
watcher_actor
name
}
pub fn emit_will_navigate<'a>(


@@ -46,8 +46,11 @@ impl Actor for NetworkParentActor {
}
impl NetworkParentActor {
pub fn new(name: String) -> Self {
Self { name }
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self { name: name.clone() };
registry.register::<Self>(actor);
name
}
}


@@ -105,9 +105,10 @@ impl Actor for TargetConfigurationActor {
}
impl TargetConfigurationActor {
pub fn new(name: String) -> Self {
Self {
name,
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
configuration: HashMap::new(),
supported_options: HashMap::from([
("cacheDisabled", false),
@@ -128,7 +129,9 @@ impl TargetConfigurationActor {
("tracerOptions", false),
("useSimpleHighlightersForReducedMotion", false),
]),
}
};
registry.register::<Self>(actor);
name
}
}


@@ -49,11 +49,14 @@ impl Actor for ThreadConfigurationActor {
}
impl ThreadConfigurationActor {
pub fn new(name: String) -> Self {
Self {
name,
pub fn register(registry: &ActorRegistry) -> String {
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
_configuration: HashMap::new(),
}
};
registry.register::<Self>(actor);
name
}
}


@@ -27,24 +27,50 @@ pub enum WorkerType {
}
#[derive(MallocSizeOf)]
pub(crate) struct WorkerActor {
pub(crate) struct WorkerTargetActor {
pub name: String,
pub console_name: String,
pub thread_name: String,
pub worker_id: WorkerId,
pub url: ServoUrl,
pub type_: WorkerType,
pub script_chan: GenericSender<DevtoolScriptControlMsg>,
pub script_sender: GenericSender<DevtoolScriptControlMsg>,
pub streams: AtomicRefCell<FxHashSet<StreamId>>,
}
impl ResourceAvailable for WorkerActor {
impl ResourceAvailable for WorkerTargetActor {
fn actor_name(&self) -> String {
self.name.clone()
}
}
impl Actor for WorkerActor {
impl WorkerTargetActor {
pub fn register(
registry: &ActorRegistry,
console_name: String,
thread_name: String,
worker_id: WorkerId,
url: ServoUrl,
worker_type: WorkerType,
script_sender: GenericSender<DevtoolScriptControlMsg>,
) -> String {
let name = registry.new_name::<Self>();
let actor = Self {
name: name.clone(),
console_name,
thread_name,
worker_id,
url,
type_: worker_type,
script_sender,
streams: Default::default(),
};
registry.register::<Self>(actor);
name
}
}
impl Actor for WorkerTargetActor {
fn name(&self) -> String {
self.name.clone()
}
@@ -67,7 +93,7 @@ impl Actor for WorkerActor {
request.write_json_packet(&msg)?;
self.streams.borrow_mut().insert(stream_id);
// FIXME: fix messages to not require forging a pipeline for worker messages
self.script_chan
self.script_sender
.send(WantsLiveNotifications(TEST_PIPELINE_ID, true))
.unwrap();
},
@@ -109,7 +135,7 @@ impl Actor for WorkerActor {
fn cleanup(&self, stream_id: StreamId) {
self.streams.borrow_mut().remove(&stream_id);
if self.streams.borrow().is_empty() {
self.script_chan
self.script_sender
.send(WantsLiveNotifications(TEST_PIPELINE_ID, false))
.unwrap();
}
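The `attach`/`cleanup` pair above gates live notifications on the set of attached streams: notifications are switched on whenever a stream attaches and switched off only when the last stream goes away. A simplified model of that lifecycle (`StreamTracker`, the `u32` stream id, and the `live` flag are stand-ins for the actor's `streams` set and its `WantsLiveNotifications` messages):

```rust
use std::collections::HashSet;

// Simplified model of WorkerTargetActor's stream tracking: `live` mirrors
// the WantsLiveNotifications(..) messages sent to the script thread.
struct StreamTracker {
    streams: HashSet<u32>,
    live: bool,
}

impl StreamTracker {
    fn new() -> Self {
        Self { streams: HashSet::new(), live: false }
    }

    fn attach(&mut self, id: u32) {
        self.streams.insert(id);
        self.live = true; // WantsLiveNotifications(.., true)
    }

    fn cleanup(&mut self, id: u32) {
        self.streams.remove(&id);
        if self.streams.is_empty() {
            self.live = false; // WantsLiveNotifications(.., false)
        }
    }
}

fn main() {
    let mut tracker = StreamTracker::new();
    tracker.attach(1);
    tracker.attach(2);
    tracker.cleanup(1);
    assert!(tracker.live); // one stream still attached
    tracker.cleanup(2);
    assert!(!tracker.live); // last stream detached
}
```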
@@ -156,7 +182,7 @@ struct WorkerTraits {
#[derive(Serialize)]
#[serde(rename_all = "camelCase")]
pub(crate) struct WorkerActorMsg {
pub(crate) struct WorkerTargetActorMsg {
actor: String,
console_actor: String,
thread_actor: String,
@@ -169,9 +195,9 @@ pub(crate) struct WorkerActorMsg {
target_type: String,
}
impl ActorEncode<WorkerActorMsg> for WorkerActor {
fn encode(&self, _: &ActorRegistry) -> WorkerActorMsg {
WorkerActorMsg {
impl ActorEncode<WorkerTargetActorMsg> for WorkerTargetActor {
fn encode(&self, _: &ActorRegistry) -> WorkerTargetActorMsg {
WorkerTargetActorMsg {
actor: self.name(),
console_actor: self.console_name.clone(),
thread_actor: self.thread_name.clone(),


@@ -22,9 +22,9 @@ use std::thread;
use crossbeam_channel::{Receiver, Sender, unbounded};
use devtools_traits::{
ChromeToDevtoolsControlMsg, ConsoleLogLevel, ConsoleMessage, ConsoleMessageFields,
DevtoolScriptControlMsg, DevtoolsControlMsg, DevtoolsPageInfo, DomMutation, EnvironmentInfo,
FrameInfo, FrameOffset, NavigationState, NetworkEvent, PauseReason, ScriptToDevtoolsControlMsg,
SourceInfo, WorkerId, get_time_stamp,
DebuggerValue, DevtoolScriptControlMsg, DevtoolsControlMsg, DevtoolsPageInfo, DomMutation,
EnvironmentInfo, FrameInfo, FrameOffset, NavigationState, NetworkEvent, PauseReason,
ScriptToDevtoolsControlMsg, SourceInfo, WorkerId, get_time_stamp,
};
use embedder_traits::{AllowOrDeny, EmbedderMsg, EmbedderProxy};
use log::{trace, warn};
@@ -53,7 +53,7 @@ use crate::actors::root::RootActor;
use crate::actors::source::SourceActor;
use crate::actors::thread::{ThreadActor, ThreadInterruptedReply};
use crate::actors::watcher::WatcherActor;
use crate::actors::worker::{WorkerActor, WorkerType};
use crate::actors::worker::{WorkerTargetActor, WorkerType};
use crate::id::IdMap;
use crate::network_handler::handle_network_event;
use crate::protocol::{DevtoolsConnection, JsonPacketStream};
@@ -345,7 +345,7 @@ impl DevtoolsInstance {
column_number: css_error.column,
time_stamp: get_time_stamp(),
},
arguments: vec![css_error.msg.into()],
arguments: vec![DebuggerValue::StringValue(css_error.msg)],
stacktrace: None,
};
let console_message =
@@ -475,42 +475,33 @@ impl DevtoolsInstance {
let console_name = self.registry.new_name::<ConsoleActor>();
let parent_actor = if let Some(id) = worker_id {
let thread = ThreadActor::new(
self.registry.new_name::<ThreadActor>(),
script_sender.clone(),
None,
);
let thread_name = thread.name();
self.registry.register(thread);
let thread_name = ThreadActor::register(&self.registry, script_sender.clone(), None);
let worker_type = if page_info.is_service_worker {
WorkerType::Service
} else {
WorkerType::Dedicated
};
-            let worker_name = self.registry.new_name::<WorkerActor>();
-            let worker = WorkerActor {
-                name: worker_name.clone(),
-                console_name: console_name.clone(),
+            let worker_name = WorkerTargetActor::register(
+                &self.registry,
+                console_name.clone(),
                 thread_name,
-                worker_id: id,
-                url: page_info.url,
-                type_: worker_type,
-                script_chan: script_sender,
-                streams: Default::default(),
-            };
+                id,
+                page_info.url,
+                worker_type,
+                script_sender,
+            );
let root_actor = self.registry.find::<RootActor>("root");
if page_info.is_service_worker {
root_actor
.service_workers
.borrow_mut()
.push(worker.name.clone());
.push(worker_name.clone());
} else {
root_actor.workers.borrow_mut().push(worker.name.clone());
root_actor.workers.borrow_mut().push(worker_name.clone());
}
self.actor_workers.insert(id, worker_name.clone());
self.registry.register(worker);
Root::DedicatedWorker(worker_name)
} else {
@@ -519,7 +510,8 @@ impl DevtoolsInstance {
.browsing_contexts
.entry(browsing_context_id)
.or_insert_with(|| {
let browsing_context_actor = BrowsingContextActor::new(
BrowsingContextActor::register(
&self.registry,
console_name.clone(),
devtools_browser_id,
devtools_browsing_context_id,
@@ -527,11 +519,7 @@ impl DevtoolsInstance {
pipeline_id,
devtools_outer_window_id,
script_sender.clone(),
&self.registry,
);
let browsing_context_name = browsing_context_actor.name();
self.registry.register(browsing_context_actor);
browsing_context_name
)
});
let browsing_context_actor = self
.registry
@@ -540,9 +528,7 @@ impl DevtoolsInstance {
Root::BrowsingContext(browsing_context_name.clone())
};
let console_actor = ConsoleActor::new(console_name, parent_actor);
self.registry.register(console_actor);
ConsoleActor::register(&self.registry, console_name, parent_actor);
}
fn handle_title_changed(&self, pipeline_id: PipelineId, title: String) {
@@ -601,7 +587,9 @@ impl DevtoolsInstance {
let inspector_actor = self
.registry
.find::<InspectorActor>(&browsing_context_actor.inspector_name);
let walker_actor = self.registry.find::<WalkerActor>(&inspector_actor.walker);
let walker_actor = self
.registry
.find::<WalkerActor>(&inspector_actor.walker_name);
for connection in self.connections.lock().unwrap().values_mut() {
walker_actor.handle_dom_mutation(dom_mutation.clone(), connection)?;
@@ -632,7 +620,7 @@ impl DevtoolsInstance {
let worker_name = self.actor_workers.get(&worker_id)?;
Some(
self.registry
.find::<WorkerActor>(worker_name)
.find::<WorkerTargetActor>(worker_name)
.console_name
.clone(),
)
@@ -688,13 +676,11 @@ impl DevtoolsInstance {
let resource_id = self.next_resource_id;
self.next_resource_id += 1;
let network_event_name = self.registry.new_name::<NetworkEventActor>();
let network_event_actor =
NetworkEventActor::new(network_event_name.clone(), resource_id, watcher_name);
let network_event_name =
NetworkEventActor::register(&self.registry, resource_id, watcher_name);
self.actor_requests
.insert(request_id, network_event_name.clone());
self.registry.register(network_event_actor);
network_event_name
}
@@ -730,14 +716,14 @@ impl DevtoolsInstance {
let thread_actor_name = self
.registry
.find::<WorkerActor>(worker_name)
.find::<WorkerTargetActor>(worker_name)
.thread_name
.clone();
let thread_actor = self.registry.find::<ThreadActor>(&thread_actor_name);
thread_actor.source_manager.add_source(&source_actor);
let worker_actor = self.registry.find::<WorkerActor>(worker_name);
let worker_actor = self.registry.find::<WorkerTargetActor>(worker_name);
for stream in self.connections.lock().unwrap().values_mut() {
worker_actor.resource_array(
@@ -814,20 +800,17 @@ impl DevtoolsInstance {
let browsing_context_actor = self
.registry
.find::<BrowsingContextActor>(browsing_context_name);
let thread = self
let thread_actor = self
.registry
.find::<ThreadActor>(&browsing_context_actor.thread_name);
let pause_name = self.registry.new_name::<PauseActor>();
self.registry.register(PauseActor {
name: pause_name.clone(),
});
let pause_name = PauseActor::register(&self.registry);
let frame_actor = self.registry.find::<FrameActor>(&frame_offset.actor);
frame_actor.set_offset(frame_offset.column, frame_offset.line);
let msg = ThreadInterruptedReply {
from: thread.name(),
from: thread_actor.name(),
type_: "paused".to_owned(),
actor: pause_name,
frame: frame_actor.encode(&self.registry),
@@ -856,22 +839,22 @@ impl DevtoolsInstance {
let browsing_context_actor = self
.registry
.find::<BrowsingContextActor>(browsing_context_name);
let thread = self
let thread_actor = self
.registry
.find::<ThreadActor>(&browsing_context_actor.thread_name);
let source = match thread
let source_name = match thread_actor
.source_manager
.find_source(&self.registry, &frame.url)
{
Some(source) => source.name(),
Some(source_actor) => source_actor.name(),
None => {
warn!("No source actor found for URL: {}", frame.url);
return;
},
};
let frame_name = FrameActor::register(&self.registry, source, frame);
let frame_name = FrameActor::register(&self.registry, source_name, frame);
let _ = result_sender.send(frame_name);
}
@@ -879,11 +862,11 @@ impl DevtoolsInstance {
fn handle_create_environment_actor(
&mut self,
result_sender: GenericSender<String>,
environment: EnvironmentInfo,
environment_info: EnvironmentInfo,
parent: Option<String>,
) {
let frame = EnvironmentActor::register(&self.registry, environment, parent);
let _ = result_sender.send(frame);
let environment_name = EnvironmentActor::register(&self.registry, environment_info, parent);
let _ = result_sender.send(environment_name);
}
}


@@ -9,7 +9,6 @@ use std::io::{self, ErrorKind, Read, Write};
use std::net::{Shutdown, SocketAddr, TcpStream};
use std::sync::{Arc, Mutex};
use log::debug;
use malloc_size_of_derive::MallocSizeOf;
use serde::Serialize;
use serde_json::{self, Value, json};
@@ -36,53 +35,6 @@ pub trait JsonPacketStream {
fn read_json_packet(&mut self) -> Result<Option<Value>, String>;
}
impl JsonPacketStream for TcpStream {
fn write_json_packet<T: Serialize>(&mut self, message: &T) -> Result<(), ActorError> {
let s = serde_json::to_string(message).map_err(|_| ActorError::Internal)?;
debug!("<- {}", s);
write!(self, "{}:{}", s.len(), s).map_err(|_| ActorError::Internal)?;
Ok(())
}
fn read_json_packet(&mut self) -> Result<Option<Value>, String> {
// https://firefox-source-docs.mozilla.org/devtools/backend/protocol.html#stream-transport
// In short, each JSON packet is [ascii length]:[JSON data of given length]
let mut buffer = vec![];
loop {
let mut buf = [0];
let byte = match self.read(&mut buf) {
Ok(0) => return Ok(None), // EOF
Err(e) if e.kind() == ErrorKind::ConnectionReset => return Ok(None), // EOF
Ok(1) => buf[0],
Ok(_) => unreachable!(),
Err(e) => return Err(e.to_string()),
};
match byte {
b':' => {
let packet_len_str = match String::from_utf8(buffer) {
Ok(packet_len) => packet_len,
Err(_) => return Err("invalid UTF-8 in packet length".to_owned()),
};
let packet_len = match packet_len_str.parse::<u64>() {
Ok(packet_len) => packet_len,
Err(_) => return Err("packet length missing / not parsable".to_owned()),
};
let mut packet = String::new();
self.take(packet_len)
.read_to_string(&mut packet)
.map_err(|e| e.to_string())?;
debug!("{}", packet);
return match serde_json::from_str(&packet) {
Ok(json) => Ok(Some(json)),
Err(err) => Err(err.to_string()),
};
},
c => buffer.push(c),
}
}
}
}
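The `length:payload` framing that `read_json_packet` parses above can be sketched as a pair of standalone helpers. This is a minimal sketch, not Servo's API: `encode_packet`/`decode_packet` are hypothetical names, and error handling is reduced to `Option`.

```rust
// A minimal sketch of the DevTools stream-transport framing described in the
// comment above: each packet is `<ascii length>:<JSON payload>`.

fn encode_packet(json: &str) -> String {
    format!("{}:{}", json.len(), json)
}

fn decode_packet(input: &str) -> Option<&str> {
    // Split on the first ':' to recover the declared length, then take
    // exactly that many bytes of payload.
    let (len_str, rest) = input.split_once(':')?;
    let len: usize = len_str.parse().ok()?;
    rest.get(..len)
}

fn main() {
    let framed = encode_packet(r#"{"to":"root"}"#);
    assert_eq!(framed, r#"13:{"to":"root"}"#);
    assert_eq!(decode_packet(&framed), Some(r#"{"to":"root"}"#));
}
```

The real implementation reads byte-by-byte from a `TcpStream` instead of slicing a string, but the length-prefix logic is the same.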
/// Wraps a Remote Debugging Protocol TCP stream, guaranteeing that network
/// operations are synchronized when cloning across threads.
#[derive(Clone, MallocSizeOf)]


@@ -32,7 +32,7 @@ use style::properties::style_structs::Font as FontStyleStruct;
use style::values::computed::font::{
FamilyName, FontFamilyNameSyntax, GenericFontFamily, SingleFontFamily,
};
use style::values::computed::{FontStretch, FontStyle, FontSynthesis, FontWeight, XLang};
use style::values::computed::{FontStretch, FontStyle, FontSynthesis, FontWeight};
use unicode_script::Script;
use webrender_api::{FontInstanceFlags, FontInstanceKey, FontVariation};
@@ -389,7 +389,7 @@ pub struct ShapingOptions {
/// determine the amount of spacing to apply.
pub letter_spacing: Option<Au>,
/// Spacing to add between each word. Corresponds to the CSS 2.1 `word-spacing` property.
pub word_spacing: Au,
pub word_spacing: Option<Au>,
/// The Unicode script property of the characters in this run.
pub script: Script,
/// The preferred language, obtained from the `lang` attribute.
@@ -478,26 +478,22 @@ impl Font {
continue;
};
let mut advance = advance_for_shaped_glyph(
Au::from_f64_px(self.glyph_h_advance(glyph_id)),
character,
options,
);
let mut advance = Au::from_f64_px(self.glyph_h_advance(glyph_id));
let offset = prev_glyph_id.map(|prev| {
let h_kerning = Au::from_f64_px(self.glyph_h_kerning(prev, glyph_id));
advance += h_kerning;
Point2D::new(h_kerning, Au::zero())
});
glyph_store.add_glyph(
character,
&ShapedGlyph {
glyph_id,
string_byte_offset,
advance,
offset,
},
);
let mut glyph = ShapedGlyph {
glyph_id,
string_byte_offset,
advance,
offset,
};
glyph.adjust_for_character(character, options, self);
glyph_store.add_glyph(character, &glyph);
prev_glyph_id = Some(glyph_id);
}
glyph_store
@@ -600,7 +596,7 @@ impl Deref for FontRef {
pub struct FallbackKey {
script: Script,
unicode_block: Option<UnicodeBlock>,
lang: XLang,
language: Language,
}
impl FallbackKey {
@@ -608,7 +604,7 @@ impl FallbackKey {
Self {
script: Script::from(options.character),
unicode_block: options.character.block(),
lang: options.lang.clone(),
language: options.language,
}
}
}
@@ -656,7 +652,7 @@ impl FontGroup {
font_context: &FontContext,
codepoint: char,
next_codepoint: Option<char>,
lang: XLang,
language: Language,
) -> Option<FontRef> {
// Tab characters are converted into spaces when rendering.
// TODO: We should not render a tab character. Instead they should be converted into tab stops
@@ -666,7 +662,7 @@ impl FontGroup {
_ => codepoint,
};
let options = FallbackFontSelectionOptions::new(codepoint, next_codepoint, lang);
let options = FallbackFontSelectionOptions::new(codepoint, next_codepoint, language);
let should_look_for_small_caps = self.descriptor.variant == font_variant_caps::T::SmallCaps &&
options.character.is_ascii_lowercase();
@@ -982,25 +978,3 @@ pub(crate) fn map_platform_values_to_style_values(mapping: &[(f64, f64)], value:
mapping[mapping.len() - 1].1
}
/// Computes the total advance for a glyph, taking `letter-spacing` and `word-spacing` into account.
pub(super) fn advance_for_shaped_glyph(
mut advance: Au,
character: char,
options: &ShapingOptions,
) -> Au {
if let Some(letter_spacing) = options.letter_spacing_for_character(character) {
advance += letter_spacing;
};
// CSS 2.1 § 16.4 states that "word spacing affects each space (U+0020) and non-breaking
// space (U+00A0) left in the text after the white space processing rules have been
// applied. The effect of the property on other word-separator characters is undefined."
// We elect to only space the two required code points.
if character == ' ' || character == '\u{a0}' {
// https://drafts.csswg.org/css-text-3/#word-spacing-property
advance += options.word_spacing;
}
advance
}


@@ -16,6 +16,7 @@ use fonts_traits::{
};
use log::{debug, trace};
use malloc_size_of_derive::MallocSizeOf;
use net_traits::blob_url_store::UrlWithBlobClaim;
use net_traits::policy_container::PolicyContainer;
use net_traits::request::{
CredentialsMode, Destination, InsecureRequestsPolicy, Referrer, RequestBuilder, RequestClient,
@@ -646,7 +647,15 @@ impl FontContextWebFontMethods for Arc<FontContext> {
};
let rule: &FontFaceRule = lock.read_with(guard);
let Some(font_face) = rule.font_face() else {
// Per https://github.com/w3c/csswg-drafts/issues/1133 an @font-face rule
// is valid as far as the CSS parser is concerned even if it doesn't have
// a font-family or src declaration.
// However, both are required for the rule to represent an actual font face.
if rule.descriptors.font_family.is_none() {
continue;
}
let Some(ref sources) = rule.descriptors.src else {
continue;
};
@@ -661,7 +670,7 @@ impl FontContextWebFontMethods for Arc<FontContext> {
number_loading += 1;
self.start_loading_one_web_font(
Some(webview_id),
font_face.sources(),
sources,
css_font_face_descriptors,
WebFontLoadInitiator::Stylesheet(Box::new(initiator)),
document_context,
@@ -978,7 +987,7 @@ impl RemoteWebFontDownloader {
let request = RequestBuilder::new(
state.webview_id,
url.clone().into(),
UrlWithBlobClaim::from_url_without_having_claimed_blob(url.clone().into()),
Referrer::ReferrerUrl(document_context.document_url.clone()),
)
.destination(Destination::Font)


@@ -540,7 +540,7 @@ impl ShapedGlyph {
/// TODO: This should all likely move to layout. In particular, proper tab stops
/// are context sensitive and should be based on the size of the space character in the
/// inline formatting context.
fn adjust_for_character(
pub(crate) fn adjust_for_character(
&mut self,
character: char,
shaping_options: &ShapingOptions,
@@ -561,9 +561,11 @@ impl ShapedGlyph {
// space (U+00A0) left in the text after the white space processing rules have been
// applied. The effect of the property on other word-separator characters is undefined."
// We elect to only space the two required code points.
if character == ' ' || character == '\u{a0}' {
// https://drafts.csswg.org/css-text-3/#word-spacing-property
self.advance += shaping_options.word_spacing;
if let Some(word_spacing) = shaping_options.word_spacing {
if character == ' ' || character == '\u{a0}' {
// https://drafts.csswg.org/css-text-3/#word-spacing-property
self.advance += word_spacing;
}
}
}
}
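The rule quoted in the comment above (CSS 2.1 § 16.4: only U+0020 and U+00A0 receive word spacing) can be sketched in isolation. In this sketch, a plain `i32` stands in for app units (`Au`) and the helper name is hypothetical:

```rust
// Word-spacing sketch: only a regular space (U+0020) and a non-breaking
// space (U+00A0) receive the extra advance; all other word separators are
// left alone, matching the "undefined effect" carve-out in CSS 2.1 § 16.4.
// `i32` stands in for app units (Au) to keep the sketch self-contained.

fn word_spacing_delta(character: char, word_spacing: Option<i32>) -> i32 {
    match (character, word_spacing) {
        (' ' | '\u{a0}', Some(spacing)) => spacing,
        _ => 0,
    }
}

fn main() {
    assert_eq!(word_spacing_delta(' ', Some(60)), 60);
    assert_eq!(word_spacing_delta('\u{a0}', Some(60)), 60);
    assert_eq!(word_spacing_delta('\t', Some(60)), 0);
    assert_eq!(word_spacing_delta(' ', None), 0);
}
```

Making `word_spacing` an `Option` mirrors the change in this diff: `None` means the property does not contribute at all, rather than contributing a zero advance.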


@@ -28,9 +28,9 @@ pub use font_store::FontTemplates;
pub use fonts_traits::*;
pub(crate) use glyph::*;
pub use glyph::{GlyphInfo, GlyphStore};
use icu_locid::subtags::Language;
pub use platform::font_list::fallback_font_families;
pub(crate) use shapers::*;
use style::values::computed::XLang;
pub use system_font_service::SystemFontService;
use unicode_properties::{EmojiStatus, UnicodeEmoji, emoji};
@@ -47,7 +47,7 @@ pub(crate) enum EmojiPresentationPreference {
pub struct FallbackFontSelectionOptions {
pub(crate) character: char,
pub(crate) presentation_preference: EmojiPresentationPreference,
pub(crate) lang: XLang,
pub(crate) language: Language,
}
impl Default for FallbackFontSelectionOptions {
@@ -55,13 +55,13 @@ impl Default for FallbackFontSelectionOptions {
Self {
character: ' ',
presentation_preference: EmojiPresentationPreference::None,
lang: XLang::get_initial_value(),
language: Language::UND,
}
}
}
impl FallbackFontSelectionOptions {
pub(crate) fn new(character: char, next_character: Option<char>, lang: XLang) -> Self {
pub(crate) fn new(character: char, next_character: Option<char>, language: Language) -> Self {
let presentation_preference = match next_character {
Some(next_character) if emoji::is_emoji_presentation_selector(next_character) => {
EmojiPresentationPreference::Emoji
@@ -89,7 +89,7 @@ impl FallbackFontSelectionOptions {
Self {
character,
presentation_preference,
lang,
language,
}
}
}


@@ -19,12 +19,13 @@ use fontconfig_sys::{
FcPatternDestroy, FcPatternGetInteger, FcPatternGetString, FcResultMatch, FcSetSystem,
};
use fonts_traits::{FontTemplate, FontTemplateDescriptor, LocalFontIdentifier};
use icu_locid::subtags::language;
use libc::{c_char, c_int};
use log::debug;
use servo_base::text::{UnicodeBlock, UnicodeBlockMethod};
use style::Atom;
use style::values::computed::font::GenericFontFamily;
use style::values::computed::{FontStretch, FontStyle, FontWeight, XLang};
use style::values::computed::{FontStretch, FontStyle, FontWeight};
use unicode_script::Script;
use crate::font::map_platform_values_to_style_values;
@@ -188,13 +189,13 @@ pub fn fallback_font_families(options: FallbackFontSelectionOptions) -> Vec<&'st
// In Japanese typography, it is not common to use different fonts
// for Kanji(Han), Hiragana, and Katakana within the same document.
// We uniformly fall back to Japanese fonts when the document language is Japanese.
_ if options.lang == XLang(Atom::from("ja")) => {
_ if options.language == language!("ja") => {
families.push("TakaoPGothic");
},
_ if matches!(
Script::from(options.character),
Script::Bopomofo | Script::Han
) && options.lang != XLang(Atom::from("ja")) =>
) && options.language != language!("ja") =>
{
families.push("WenQuanYi Micro Hei");
},


@@ -3,12 +3,15 @@
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use std::collections::HashMap;
use std::ffi::OsStr;
use std::fs::File;
use std::os::unix::ffi::OsStrExt;
use std::path::{Path, PathBuf};
use std::sync::LazyLock;
use std::{fs, io};
use log::{debug, error, warn};
use read_fonts::FileRef::{Collection, Font as OHOS_Font};
use read_fonts::{FileRef, FontRef, TableProvider};
use servo_base::text::{UnicodeBlock, UnicodeBlockMethod};
use style::Atom;
use style::values::computed::font::GenericFontFamily;
@@ -95,105 +98,59 @@ fn enumerate_font_files() -> io::Result<Vec<PathBuf>> {
Ok(font_list)
}
fn detect_hos_font_style(font_modifiers: &[&str]) -> Option<String> {
if font_modifiers.contains(&"Italic") {
fn detect_hos_font_style(font: &FontRef, file_path: &str) -> Option<String> {
// This implementation uses the postscript (post) table, which is one of the mandatory tables
// according to TrueType's reference manual (https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6.html).
// Therefore, raise an error if Fontations fails to read this table for some reason.
// If the italic angle is 0, the font style is normal; otherwise, it is italic.
if font
.post()
.unwrap_or_else(|_| {
panic!("Failed to read {:?}'s postscript table!", file_path);
})
.italic_angle() !=
(0 as i32).into()
{
Some("italic".to_string())
} else {
None
}
}
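The `post`-table rule above (a zero `italicAngle` means upright, anything else means italic) can be sketched on its own. The 16.16 fixed-point encoding of the angle and the function name here are illustrative, not Fontations' API:

```rust
// The post table stores italicAngle as a 16.16 fixed-point number of
// degrees; zero means an upright face, any other value is treated as italic.

fn style_from_italic_angle(fixed_16_16_angle: i32) -> Option<&'static str> {
    if fixed_16_16_angle != 0 {
        Some("italic")
    } else {
        None
    }
}

fn main() {
    assert_eq!(style_from_italic_angle(0), None);
    // -12 degrees encoded in 16.16 fixed point.
    assert_eq!(style_from_italic_angle(-12 << 16), Some("italic"));
}
```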
// Note: The weights here are taken from the `alias` section of the fontconfig.json
fn detect_hos_font_weight_alias(font_modifiers: &[&str]) -> Option<i32> {
if font_modifiers.contains(&"Light") {
Some(100)
} else if font_modifiers.contains(&"Regular") {
Some(400)
} else if font_modifiers.contains(&"Medium") {
Some(700)
} else if font_modifiers.contains(&"Bold") {
Some(900)
} else {
None
fn detect_hos_font_weight_alias(font: &FontRef) -> Option<i32> {
// According to TrueType's reference manual (https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6.html),
// os2 is an optional table. Therefore, if Fontations fails to read this table, we don't treat this as an error
// and we simply return `None`.
match font.os2() {
Ok(result) => Some(result.us_weight_class() as i32),
Err(_) => None,
}
}
fn noto_weight_alias(alias: &str) -> Option<i32> {
match alias.to_ascii_lowercase().as_str() {
"thin" => Some(100),
"extralight" => Some(200),
"light" => Some(300),
"regular" => Some(400),
"medium" => Some(500),
"semibold" => Some(600),
"bold" => Some(700),
"extrabold" => Some(800),
"black" => Some(900),
_unknown_alias => {
warn!("Unknown weight alias `{alias}` encountered.");
None
},
}
}
fn detect_hos_font_width(font_modifiers: &[&str]) -> FontWidth {
if font_modifiers.contains(&"Condensed") {
FontWidth::Condensed
} else {
FontWidth::Normal
}
}
/// Split a Noto font filename into the family name with spaces
///
/// E.g. `NotoSansTeluguUI` -> `Noto Sans Telugu UI`
/// Or for older OH 4.1 fonts: `NotoSans_JP_Bold` -> `Noto Sans JP Bold`
fn split_noto_font_name(name: &str) -> Vec<String> {
let mut name_components = vec![];
let mut current_word = String::new();
let mut chars = name.chars();
// To not split acronyms like `UI` or `CJK`, we only start a new word if the previous
// char was not uppercase.
let mut previous_char_was_uppercase = true;
if let Some(first) = chars.next() {
current_word.push(first);
for c in chars {
if c.is_uppercase() {
if !previous_char_was_uppercase {
name_components.push(current_word.clone());
current_word = String::new();
}
previous_char_was_uppercase = true;
current_word.push(c)
} else if c == '_' {
name_components.push(current_word.clone());
current_word = String::new();
previous_char_was_uppercase = true;
// Skip the underscore itself
fn detect_hos_font_width(font: &FontRef) -> FontWidth {
// According to TrueType's reference manual (https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6.html),
// os2 is an optional table. Therefore, if Fontations fails to read this table, we don't treat this as an error
// and we simply return `FontWidth::Normal` as a default.
match font.os2() {
Ok(result) => {
let font_width = result.us_width_class().clone();
// According to https://learn.microsoft.com/en-us/typography/opentype/spec/os2#uswidthclass,
// values between 1 and 4 inclusive represent condensed widths.
if font_width >= 1 && font_width <= 4 {
FontWidth::Condensed
} else {
previous_char_was_uppercase = false;
current_word.push(c)
FontWidth::Normal
}
}
},
Err(_) => FontWidth::Normal,
}
if !current_word.is_empty() {
name_components.push(current_word);
}
name_components
}
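The `usWidthClass` bucketing used by `detect_hos_font_width` can be sketched standalone. The enum and helper below are stand-ins for the crate's own types; per the OpenType spec, classes 1-4 are condensed and 5-9 are normal to expanded:

```rust
#[derive(Debug, PartialEq)]
enum FontWidth {
    Condensed,
    Normal,
}

// Map an OS/2 usWidthClass to a coarse width bucket. A missing OS/2 table
// (None) falls back to Normal, mirroring the fallback in detect_hos_font_width.
fn width_from_us_width_class(us_width_class: Option<u16>) -> FontWidth {
    match us_width_class {
        Some(1..=4) => FontWidth::Condensed,
        _ => FontWidth::Normal,
    }
}

fn main() {
    assert_eq!(width_from_us_width_class(Some(3)), FontWidth::Condensed);
    assert_eq!(width_from_us_width_class(Some(5)), FontWidth::Normal);
    assert_eq!(width_from_us_width_class(None), FontWidth::Normal);
}
```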
/// Parse the font file names to determine the available FontFamilies
///
/// Note: For OH 5.0+ this function is intended to only be a fallback path, if parsing the
/// `fontconfig.json` fails for some reason. Beta 1 of OH 5.0 still has a bug in the fontconfig.json
/// though, so the "normal path" is currently unimplemented.
fn parse_font_filenames(font_files: Vec<PathBuf>) -> Vec<FontFamily> {
let harmonyos_prefix = "HarmonyOS_Sans";
let weight_aliases = ["Light", "Regular", "Medium", "Bold"];
let style_modifiers = ["Italic"];
let width_modifiers = ["Condensed"];
/// This function generates a list of `FontFamily` entries based on font files with the extension `.ttf`, `.ttc`, or `.otf`.
/// If a font file's extension is `.ttc`, all the fonts within it will be processed one by one.
#[servo_tracing::instrument(skip_all)]
fn get_system_font_families(font_files: Vec<PathBuf>) -> Vec<FontFamily> {
let mut families: HashMap<String, Vec<Font>> = HashMap::new();
let font_files: Vec<PathBuf> = font_files
@@ -211,77 +168,36 @@ fn parse_font_filenames(font_files: Vec<PathBuf>) -> Vec<FontFamily> {
})
.collect();
let harmony_os_fonts = font_files.iter().filter_map(|file_path| {
let stem = file_path.file_stem()?.to_str()?;
let stem_no_prefix = stem.strip_prefix(harmonyos_prefix)?;
let name_components: Vec<&str> = stem_no_prefix.split('_').collect();
let style = detect_hos_font_style(&name_components);
let weight = detect_hos_font_weight_alias(&name_components);
let width = detect_hos_font_width(&name_components);
let mut all_families = Vec::new();
let mut name_components = name_components;
// If we remove all the modifiers, we are left with the family name
name_components.retain(|component| {
!weight_aliases.contains(component) &&
!style_modifiers.contains(component) &&
!width_modifiers.contains(component) &&
!component.is_empty()
});
name_components.insert(0, "HarmonyOS Sans");
let family_name = name_components.join(" ");
let font = Font {
filepath: file_path.to_str()?.to_string(),
weight,
style,
width,
for font_file in font_files.iter() {
let Ok(font_bytes) =
File::open(font_file).and_then(|file| unsafe { memmap2::Mmap::map(&file) })
else {
continue;
};
let Ok(file_ref) = FileRef::new(&font_bytes) else {
continue;
};
Some((family_name, font))
});
let noto_fonts = font_files.iter().filter_map(|file_path| {
let stem = file_path.file_stem()?.to_str()?;
// Filter out non-noto fonts
if !stem.starts_with("Noto") {
return None;
match file_ref {
OHOS_Font(font) => {
if let Some(result) = get_family_name_and_generate_font_struct(&font, &font_file) {
all_families.push(result);
}
},
Collection(font_collection) => {
// Process all the font files within the collection one by one.
for f in font_collection.iter() {
if let Some(result) =
get_family_name_and_generate_font_struct(&(f.unwrap()), &font_file)
{
all_families.push(result);
};
}
},
}
// Strip the weight alias from the filename, e.g. `-Regular` or `_Regular`.
// We use `rsplit_once()`, since there is e.g. `NotoSansPhags-Pa-Regular.ttf`, where the
// Pa is part of the font family name and not a modifier.
// There seem to be no more than one modifier at once per font filename.
let (base, weight) = if let Some((stripped_base, weight_suffix)) =
stem.rsplit_once("-").or_else(|| stem.rsplit_once("_"))
{
(stripped_base, noto_weight_alias(weight_suffix))
} else {
(stem, None)
};
// Do some special post-processing for `NotoSansPhags-Pa-Regular.ttf` and any friends.
let base = if base.contains("-") {
if !base.ends_with("-Pa") {
warn!("Unknown `-` pattern in Noto font filename: {base}");
}
// Note: We assume here that the following character is uppercase, so that
// the word splitting later functions correctly.
base.replace("-", "")
} else {
base.to_string()
};
// Remove suffixes `[wght]` or `[wdth,wght]`. These suffixes seem to be mutually exclusive
// with the weight alias suffixes from before.
let base_name = base
.strip_suffix("[wght]")
.or_else(|| base.strip_suffix("[wdth,wght]"))
.unwrap_or(base.as_str());
let family_name = split_noto_font_name(base_name).join(" ");
let font = Font {
filepath: file_path.to_str()?.to_string(),
weight,
..Default::default()
};
Some((family_name, font))
});
let all_families = harmony_os_fonts.chain(noto_fonts);
}
for (family_name, font) in all_families {
if let Some(font_list) = families.get_mut(&family_name) {
@@ -297,6 +213,49 @@ fn parse_font_filenames(font_files: Vec<PathBuf>) -> Vec<FontFamily> {
.collect()
}
fn get_family_name_and_generate_font_struct(
font_ref: &FontRef,
file_path: &PathBuf,
) -> Option<(String, Font)> {
// Convert the file path to a string. If this fails, skip this font.
let Some(file_path_string_slice) = file_path.to_str() else {
return None;
};
let file_path_str = file_path_string_slice.to_string();
// Obtain the font's styling
let style = detect_hos_font_style(font_ref, file_path_string_slice);
let weight = detect_hos_font_weight_alias(font_ref);
let width = detect_hos_font_width(font_ref);
// Get the family name via the name table. According to TrueType's reference manual (https://developer.apple.com/fonts/TrueType-Reference-Manual/RM06/Chap6.html),
// the name table is a mandatory table. Therefore, if Fontations fails to read this table for whatever reason, return `None` to skip this font altogether.
let Ok(font_name_table) = font_ref.name() else {
return None;
};
let Some(family_name) = font_name_table
.name_record()
.iter()
.filter(|record| record.name_id().to_u16() == 1) // According to the reference manual, name identifier code (nameID) `1` is the font family name.
.find_map(|record| {
record
.string(font_name_table.string_data())
.ok()
.map(|s| s.to_string())
})
else {
return None;
};
let font = Font {
filepath: file_path_str,
weight,
style,
width,
};
Some((family_name, font))
}
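The family-name lookup in `get_family_name_and_generate_font_struct` boils down to scanning the name table's records for nameID 1 and taking the first decodable match. A self-contained sketch, using a stand-in record type rather than Fontations' actual `NameRecord`:

```rust
// nameID 1 in the OpenType/TrueType name table is the font family name.
// This stand-in record type replaces Fontations' NameRecord for the sketch.
struct NameRecord {
    name_id: u16,
    value: &'static str,
}

fn family_name(records: &[NameRecord]) -> Option<String> {
    records
        .iter()
        .find(|record| record.name_id == 1)
        .map(|record| record.value.to_string())
}

fn main() {
    let records = [
        NameRecord { name_id: 0, value: "Copyright 2020" },
        NameRecord { name_id: 1, value: "HarmonyOS Sans" },
        NameRecord { name_id: 2, value: "Regular" },
    ];
    assert_eq!(family_name(&records), Some("HarmonyOS Sans".to_string()));
    assert_eq!(family_name(&[]), None);
}
```

In the real code the record's string data must additionally be decoded from the name table's string storage, which is why the diff filters with `.ok()` before converting to `String`.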
impl FontList {
fn new() -> FontList {
FontList {
@@ -309,7 +268,7 @@ impl FontList {
fn detect_installed_font_families() -> Vec<FontFamily> {
let mut families = enumerate_font_files()
.inspect_err(|e| error!("Failed to enumerate font files due to `{e:?}`"))
.map(parse_font_filenames)
.map(get_system_font_families)
.unwrap_or_else(|_| FontList::fallback_font_families());
families.extend(Self::hardcoded_font_families());
families
@@ -543,20 +502,23 @@ pub fn fallback_font_families(options: FallbackFontSelectionOptions) -> Vec<&'st
UnicodeBlock::HangulJamoExtendedA |
UnicodeBlock::HangulJamoExtendedB |
UnicodeBlock::HangulSyllables => {
families.push("Noto Sans CJK");
families.push("Noto Serif CJK");
families.push("Noto Sans CJK KR");
families.push("Noto Sans Mono CJK KR");
families.push("Noto Serif CJK KR");
families.push("Noto Sans KR");
},
UnicodeBlock::Hiragana |
UnicodeBlock::Katakana |
UnicodeBlock::KatakanaPhoneticExtensions => {
families.push("Noto Sans CJK");
families.push("Noto Serif CJK");
families.push("Noto Sans CJK JP");
families.push("Noto Sans Mono CJK JP");
families.push("Noto Serif CJK JP");
families.push("Noto Sans JP");
},
UnicodeBlock::HalfwidthandFullwidthForms => {
families.push("HarmonyOS Sans SC");
families.push("Noto Sans CJK");
families.push("Noto Sans CJK SC");
families.push("Noto Sans Mono CJK SC");
},
_ => {},
}
@@ -590,31 +552,14 @@ mod test {
use std::path::PathBuf;
#[test]
fn split_noto_font_name_test() {
use super::split_noto_font_name;
assert_eq!(
split_noto_font_name("NotoSansSinhala"),
vec!["Noto", "Sans", "Sinhala"]
);
assert_eq!(
split_noto_font_name("NotoSansTamilUI"),
vec!["Noto", "Sans", "Tamil", "UI"]
);
assert_eq!(
split_noto_font_name("NotoSerifCJK"),
vec!["Noto", "Serif", "CJK"]
);
}
#[test]
fn test_parse_font_filenames() {
use super::parse_font_filenames;
let families = parse_font_filenames(vec![PathBuf::from("NotoSansCJK-Regular.ttc")]);
fn test_get_system_font_families() {
use super::get_system_font_families;
let families = get_system_font_families(vec![PathBuf::from("NotoSansCJK-Regular.ttc")]);
assert_eq!(families.len(), 1);
let family = families.first().unwrap();
assert_eq!(family.name, "Noto Sans CJK".to_string());
let families = parse_font_filenames(vec![
let families = get_system_font_families(vec![
PathBuf::from("NotoSerifGeorgian[wdth,wght].ttf"),
PathBuf::from("HarmonyOS_Sans_Naskh_Arabic_UI.ttf"),
PathBuf::from("HarmonyOS_Sans_Condensed.ttf"),
@@ -627,34 +572,6 @@ mod test {
assert_eq!(families.len(), 4);
}
#[test]
fn test_parse_noto_sans_phags_pa() {
use super::parse_font_filenames;
let families = parse_font_filenames(vec![PathBuf::from("NotoSansPhags-Pa-Regular.ttf")]);
let family = families.first().unwrap();
assert_eq!(family.name, "Noto Sans Phags Pa");
}
#[test]
fn test_old_noto_sans() {
use super::parse_font_filenames;
let families = parse_font_filenames(vec![
PathBuf::from("NotoSans_JP_Regular.otf"),
PathBuf::from("NotoSans_KR_Regular.otf"),
PathBuf::from("NotoSans_JP_Bold.otf"),
]);
assert_eq!(families.len(), 2, "actual families: {families:?}");
let first_family = families.first().unwrap();
let second_family = families.last().unwrap();
// We don't have a requirement on the order of the family names,
// we just want to test existence.
let names = [first_family.name.as_str(), second_family.name.as_str()];
assert!(names.contains(&"Noto Sans JP"));
assert!(names.contains(&"Noto Sans KR"));
}
#[test]
fn print_detected_families() {
let list = super::FontList::detect_installed_font_families();


@@ -472,17 +472,17 @@ impl Font {
// the value stored in the HTML lang attribute is a BCP 47 language tag. These two
// formats are generally compatible, but we may need to make refinements here in
// the future.
let language = CFString::from_str(&options.lang.0);
let language = if !options.language.is_empty() {
Some(&*CFString::from_str(options.language.as_str()))
} else {
None
};
let string = CFString::from_str(&options.character.to_string());
let font = unsafe {
self.handle.ctfont.for_string_with_language(
&string,
CFRange::new(0, string.length()),
if !options.lang.0.is_empty() {
Some(&*language)
} else {
None
},
language,
)
};


@@ -5,6 +5,7 @@
use std::ffi::c_void;
use fonts_traits::LocalFontIdentifier;
use icu_locid::subtags::language;
use log::debug;
use objc2_core_foundation::{CFDictionary, CFRetained, CFSet, CFString, CFType, CFURL};
use objc2_core_text::{
@@ -13,7 +14,6 @@ use objc2_core_text::{
};
use servo_base::text::{UnicodeBlock, UnicodeBlockMethod, unicode_plane};
use style::Atom;
use style::values::computed::XLang;
use style::values::computed::font::GenericFontFamily;
use unicode_script::Script;
@@ -132,7 +132,7 @@ pub fn fallback_font_families(options: FallbackFontSelectionOptions) -> Vec<&'st
// In Japanese typography, it is not common to use different fonts
// for Kanji(Han), Hiragana, and Katakana within the same document. Since Hiragino supports
// a comprehensive set of Japanese kanji, we uniformly fall back to Hiragino for all Japanese text.
_ if options.lang == XLang(Atom::from("ja")) => {
_ if options.language == language!("ja") => {
families.push("Hiragino Sans");
families.push("Hiragino Kaku Gothic ProN");
},
@@ -141,7 +141,7 @@ pub fn fallback_font_families(options: FallbackFontSelectionOptions) -> Vec<&'st
// language font to try for fallback is rather arbitrary. Usually, though,
// we hope that font prefs will have handled this earlier.
_ if matches!(script, Script::Bopomofo | Script::Han) &&
options.lang != XLang(Atom::from("ja")) =>
options.language != language!("ja") =>
{
// TODO: Need to differentiate between traditional and simplified Han here!
families.push("Songti SC");


@@ -77,7 +77,7 @@ fn test_font_can_do_fast_shaping() {
// Fast shaping requires a font with a kern table and no GPOS or GSUB tables.
let shaping_options = ShapingOptions {
letter_spacing: None,
word_spacing: Au::zero(),
word_spacing: None,
script: Script::Latin,
language: Language::UND,
flags: ShapingFlags::empty(),
@@ -88,7 +88,7 @@ fn test_font_can_do_fast_shaping() {
// Non-Latin script should never have fast shaping.
let shaping_options = ShapingOptions {
letter_spacing: None,
word_spacing: Au::zero(),
word_spacing: None,
script: Script::Cherokee,
language: Language::UND,
flags: ShapingFlags::empty(),
@@ -99,7 +99,7 @@ fn test_font_can_do_fast_shaping() {
// Right-to-left text should never use fast shaping.
let shaping_options = ShapingOptions {
letter_spacing: None,
word_spacing: Au::zero(),
word_spacing: None,
script: Script::Latin,
language: Language::UND,
flags: ShapingFlags::RTL_FLAG,


@@ -21,6 +21,7 @@ mod font_context {
PlatformFontMethods, SystemFontServiceMessage, SystemFontServiceProxy,
SystemFontServiceProxySender, fallback_font_families,
};
use icu_locid::subtags::Language;
use net_traits::{ResourceThreads, start_fetch_thread};
use paint_api::CrossProcessPaintApi;
use parking_lot::Mutex;
@@ -30,7 +31,6 @@ mod font_context {
use style::computed_values::font_optical_sizing::T as FontOpticalSizing;
use style::properties::longhands::font_variant_caps::computed_value::T as FontVariantCaps;
use style::properties::style_structs::Font as FontStyleStruct;
use style::values::computed::XLang;
use style::values::computed::font::{
FamilyName, FontFamily, FontFamilyList, FontFamilyNameSyntax, FontStretch, FontStyle,
FontSynthesis, FontWeight, SingleFontFamily,
@@ -264,7 +264,7 @@ mod font_context {
let group = context.context.font_group(ServoArc::new(style));
let font = group
.find_by_codepoint(&mut context.context, 'a', None, XLang::get_initial_value())
.find_by_codepoint(&mut context.context, 'a', None, Language::UND)
.unwrap();
assert_eq!(&font_face_name(&font.identifier()), "csstest-ascii");
assert_eq!(
@@ -277,7 +277,7 @@ mod font_context {
);
let font = group
.find_by_codepoint(&mut context.context, 'a', None, XLang::get_initial_value())
.find_by_codepoint(&mut context.context, 'a', None, Language::UND)
.unwrap();
assert_eq!(&font_face_name(&font.identifier()), "csstest-ascii");
assert_eq!(
@@ -290,7 +290,7 @@ mod font_context {
);
let font = group
.find_by_codepoint(&mut context.context, 'á', None, XLang::get_initial_value())
.find_by_codepoint(&mut context.context, 'á', None, Language::UND)
.unwrap();
assert_eq!(&font_face_name(&font.identifier()), "csstest-basic-regular");
assert_eq!(
@@ -313,7 +313,7 @@ mod font_context {
let group = context.context.font_group(ServoArc::new(style));
let font = group
.find_by_codepoint(&mut context.context, 'a', None, XLang::get_initial_value())
.find_by_codepoint(&mut context.context, 'a', None, Language::UND)
.unwrap();
assert_eq!(
&font_face_name(&font.identifier()),
@@ -322,7 +322,7 @@ mod font_context {
);
let font = group
.find_by_codepoint(&mut context.context, 'á', None, XLang::get_initial_value())
.find_by_codepoint(&mut context.context, 'á', None, Language::UND)
.unwrap();
assert_eq!(
&font_face_name(&font.identifier()),


@@ -1,6 +1,6 @@
[package]
name = "servo-hyper-serde"
version = "0.13.2"
version.workspace = true
edition.workspace = true
authors = ["The Servo Project Developers"]
description = "Serde support for hyper types."


@@ -2,7 +2,6 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use core::f32;
use std::cell::{Cell, RefCell};
use std::mem;
use std::sync::Arc;
@@ -19,6 +18,7 @@ use paint_api::display_list::{
use servo_base::id::ScrollTreeNodeId;
use servo_base::print_tree::PrintTree;
use servo_config::opts::DiagnosticsLogging;
use servo_geometry::MaxRect;
use style::Zero;
use style::color::{AbsoluteColor, ColorSpace};
use style::computed_values::float::T as ComputedFloat;
@@ -1538,12 +1538,14 @@ impl BoxFragment {
};
radii = offset_radii(builder.border_radius, offsets_from_border);
} else if overflow.x != ComputedOverflow::Clip {
overflow_clip_rect.min.x = f32::MIN;
overflow_clip_rect.max.x = f32::MAX;
let max = LayoutRect::max_rect();
overflow_clip_rect.min.x = max.min.x;
overflow_clip_rect.max.x = max.max.x;
radii = BorderRadius::zero();
} else {
overflow_clip_rect.min.y = f32::MIN;
overflow_clip_rect.max.y = f32::MAX;
let max = LayoutRect::max_rect();
overflow_clip_rect.min.y = max.min.y;
overflow_clip_rect.max.y = max.max.y;
radii = BorderRadius::zero();
}


@@ -1125,7 +1125,7 @@ fn item_with_auto_cross_size_stretches_to_line_size(
!margin.cross_end.is_auto()
}
/// Collect flex items into flex lines
/// "Collect flex items into flex lines"
/// <https://drafts.csswg.org/css-flexbox/#algo-line-break>
fn do_initial_flex_line_layout<'items>(
flex_context: &mut FlexContext,
@@ -1186,7 +1186,7 @@ fn do_initial_flex_line_layout<'items>(
if flex_context.layout_context.use_rayon {
lines.par_drain(..).map(construct_line).collect()
} else {
lines.drain(..).map(construct_line).collect()
lines.into_iter().map(construct_line).collect()
}
}


@@ -991,12 +991,12 @@ impl SequentialLayoutState {
self.bfc_relative_block_position + self.current_margin.solve()
}
/// Collapses margins, moving the block position down by the collapsed value of `current_margin`
/// Commits margins, moving the block position down by the collapsed value of `current_margin`
/// and resetting `current_margin` to zero.
///
/// Call this method before laying out children when it is known that the start margin of the
/// current fragment can't collapse with the margins of any of its children.
pub(crate) fn collapse_margins(&mut self) {
pub(crate) fn commit_margin(&mut self) {
self.advance_block_position(self.current_margin.solve());
self.current_margin = CollapsedMargin::zero();
}

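The rename above (`collapse_margins` to `commit_margin`) reflects what the method does: it advances the block position by the pending collapsed margin once and resets it. A simplified sketch under stated assumptions (integer positions, positive margins only, whereas the real `CollapsedMargin` also tracks negative margins):

```rust
// Minimal stand-in for SequentialLayoutState's margin handling: adjoining
// margins collapse to their maximum, and committing advances the block
// position once and clears the pending margin.
#[derive(Default)]
struct SequentialLayoutState {
    block_position: i32,
    current_margin: i32, // stand-in for CollapsedMargin::solve()
}

impl SequentialLayoutState {
    fn adjoin_assign(&mut self, margin: i32) {
        // Collapsed positive margins resolve to the maximum of the adjoining margins.
        self.current_margin = self.current_margin.max(margin);
    }

    fn commit_margin(&mut self) {
        self.block_position += self.current_margin;
        self.current_margin = 0;
    }
}

fn main() {
    let mut state = SequentialLayoutState::default();
    state.adjoin_assign(10);
    state.adjoin_assign(20); // collapses with the previous margin
    state.commit_margin();
    assert_eq!(state.block_position, 20);
    assert_eq!(state.current_margin, 0);
    println!("ok");
}
```

Calling `commit_margin` before laying out children matches the documented precondition: the start margin can no longer collapse with any child's margin.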

@@ -83,6 +83,8 @@ use app_units::{Au, MAX_AU};
use bitflags::bitflags;
use construct::InlineFormattingContextBuilder;
use fonts::{FontMetrics, GlyphStore};
use icu_locid::LanguageIdentifier;
use icu_locid::subtags::{Language, language};
use icu_segmenter::{LineBreakOptions, LineBreakStrictness, LineBreakWordOption};
use inline_box::{InlineBox, InlineBoxContainerState, InlineBoxIdentifier, InlineBoxes};
use layout_api::wrapper_traits::SharedSelection;
@@ -1085,7 +1087,7 @@ impl InlineFormattingContextLayout<'_> {
let mut block_end_position = block_start_position + resolved_block_advance;
if let Some(sequential_layout_state) = self.sequential_layout_state.as_mut() {
if !is_phantom_line {
sequential_layout_state.collapse_margins();
sequential_layout_state.commit_margin();
}
// This amount includes both the block size of the line and any extra space
@@ -1822,10 +1824,14 @@ impl InlineFormattingContext {
.last()
.expect("Should have at least one SharedInlineStyle for the root of an IFC")
.clone();
let (word_break, line_break) = {
let (word_break, line_break, lang) = {
let styles = shared_inline_styles.style.borrow();
let text_style = styles.get_inherited_text();
(text_style.word_break, text_style.line_break)
(
text_style.word_break,
text_style.line_break,
styles.get_font()._x_lang.clone(),
)
};
let mut options = LineBreakOptions::default();
@@ -1844,7 +1850,15 @@ impl InlineFormattingContext {
WordBreak::BreakAll => LineBreakWordOption::BreakAll,
WordBreak::KeepAll => LineBreakWordOption::KeepAll,
};
options.ja_zh = false; // TODO: This should be true if the writing system is Chinese or Japanese.
// Enable Chinese/Japanese line breaking behavior when this inline formatting context
// has a Japanese or Chinese language set.
options.ja_zh = {
lang.0.parse::<LanguageIdentifier>().is_ok_and(|lang_id| {
const JA: Language = language!("ja");
const ZH: Language = language!("zh");
matches!(lang_id.language, JA | ZH)
})
};
let mut new_linebreaker = LineBreaker::new(text_content.as_str(), options);
for item in &mut builder.inline_items {

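The `ja_zh` logic above parses the `-x-lang` value and enables CJK line breaking only for Japanese or Chinese. A sketch of the same decision without `icu_locid`, assuming the input is a raw tag string (the helper is illustrative, not Servo's implementation; BCP 47 subtag matching is case-insensitive):

```rust
// Enable Chinese/Japanese line-breaking behavior only when the primary
// language subtag is "ja" or "zh".
fn wants_cjk_line_breaking(x_lang: &str) -> bool {
    let primary = x_lang.split('-').next().unwrap_or("");
    primary.eq_ignore_ascii_case("ja") || primary.eq_ignore_ascii_case("zh")
}

fn main() {
    assert!(wants_cjk_line_breaking("ja"));
    assert!(wants_cjk_line_breaking("zh-Hans-CN"));
    assert!(!wants_cjk_line_breaking("ko"));
    assert!(!wants_cjk_line_breaking(""));
    println!("ok");
}
```

Using `LanguageIdentifier::parse` as the diff does additionally rejects malformed tags, so an unparseable `-x-lang` value falls back to `ja_zh = false`.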

@@ -54,13 +54,50 @@ enum SegmentStartSoftWrapPolicy {
FollowLinebreaker,
}
/// A data structure which contains font and language information about a run of text or
/// glyphs processed during inline layout.
/// A data structure which contains information used when shaping a [`TextRunSegment`].
#[derive(Clone, Debug, MallocSizeOf)]
pub(crate) struct FontAndScriptInfo {
/// The font used when shaping a [`TextRunSegment`].
pub font: FontRef,
/// The script used when shaping a [`TextRunSegment`].
pub script: Script,
/// The BiDi [`Level`] used when shaping a [`TextRunSegment`].
pub bidi_level: Level,
/// The [`Language`] used when shaping a [`TextRunSegment`].
pub language: Language,
/// Spacing to add between each letter. Corresponds to the CSS 2.1 `letter-spacing` property.
/// NB: You will probably want to set the `IGNORE_LIGATURES_SHAPING_FLAG` if this is not `None`.
///
/// Letter spacing is not applied to all characters. Use [Self::letter_spacing_for_character] to
/// determine the amount of spacing to apply.
pub letter_spacing: Option<Au>,
/// Spacing to add between each word. Corresponds to the CSS 2.1 `word-spacing` property.
pub word_spacing: Option<Au>,
/// The [`TextRendering`] value from the original style.
pub text_rendering: TextRendering,
}
impl From<&FontAndScriptInfo> for ShapingOptions {
fn from(info: &FontAndScriptInfo) -> Self {
let mut flags = ShapingFlags::empty();
if info.bidi_level.is_rtl() {
flags.insert(ShapingFlags::RTL_FLAG);
}
if info.letter_spacing.is_some() {
flags.insert(ShapingFlags::IGNORE_LIGATURES_SHAPING_FLAG);
};
if info.text_rendering == TextRendering::Optimizespeed {
flags.insert(ShapingFlags::IGNORE_LIGATURES_SHAPING_FLAG);
flags.insert(ShapingFlags::DISABLE_KERNING_SHAPING_FLAG)
}
Self {
letter_spacing: info.letter_spacing,
word_spacing: info.word_spacing,
script: info.script,
language: info.language,
flags,
}
}
}
#[derive(Debug, MallocSizeOf)]
@@ -103,18 +140,26 @@ impl TextRunSegment {
/// Update this segment if the Font and Script are compatible. The update will only
/// ever make the Script specific. Returns true if the new Font and Script are
/// compatible with this segment or false otherwise.
fn update_if_compatible(&mut self, info: &FontAndScriptInfo) -> bool {
if self.info.bidi_level != info.bidi_level || !Arc::ptr_eq(&self.info.font, &info.font) {
fn update_if_compatible(
&mut self,
new_font: &FontRef,
new_script: Script,
new_bidi_level: Level,
) -> bool {
if self.info.bidi_level != new_bidi_level || !Arc::ptr_eq(&self.info.font, new_font) {
return false;
}
fn is_specific(script: Script) -> bool {
script != Script::Common && script != Script::Inherited
}
if !is_specific(self.info.script) && is_specific(info.script) {
self.info = Arc::new(info.clone());
if !is_specific(self.info.script) && is_specific(new_script) {
self.info = Arc::new(FontAndScriptInfo {
script: new_script,
..(*self.info).clone()
});
}
info.script == self.info.script || !is_specific(info.script)
new_script == self.info.script || !is_specific(new_script)
}
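The script rule in `update_if_compatible` above: a segment may start with a non-specific script (`Common`/`Inherited`) and be narrowed once to a specific one; afterwards only the same script or non-specific characters merge in. A standalone sketch, with a simplified `Script` enum standing in for `unicode_script::Script`:

```rust
// Stand-in for unicode_script::Script with just the variants needed here.
#[derive(Clone, Copy, PartialEq, Debug)]
enum Script { Common, Inherited, Latin, Cherokee }

fn is_specific(script: Script) -> bool {
    script != Script::Common && script != Script::Inherited
}

/// Returns true when `new` is compatible with the segment's script,
/// narrowing the segment's script in place if it was still non-specific.
fn update_if_compatible(current: &mut Script, new: Script) -> bool {
    if !is_specific(*current) && is_specific(new) {
        *current = new;
    }
    new == *current || !is_specific(new)
}

fn main() {
    let mut script = Script::Common;
    assert!(update_if_compatible(&mut script, Script::Latin)); // narrows
    assert_eq!(script, Script::Latin);
    assert!(update_if_compatible(&mut script, Script::Common)); // still merges
    assert!(!update_if_compatible(&mut script, Script::Cherokee)); // incompatible
    println!("ok");
}
```

The real method also requires the font and BiDi level to match before any of this runs; those checks are omitted here.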
fn layout_into_line_items(
@@ -186,8 +231,9 @@ impl TextRunSegment {
parent_style: &ComputedValues,
formatting_context_text: &str,
linebreaker: &mut LineBreaker,
shaping_options: &ShapingOptions,
) {
let options: ShapingOptions = (&*self.info).into();
// Gather the linebreaks that apply to this segment from the inline formatting context's collection
// of line breaks. Also add a simulated break at the end of the segment in order to ensure the final
// piece of text is processed.
@@ -206,13 +252,12 @@ impl TextRunSegment {
let mut last_slice = self.range.start..self.range.start;
for break_index in linebreak_iter {
let mut options = options;
if *break_index == self.range.start {
self.break_at_start = true;
continue;
}
let mut options = *shaping_options;
// Extend the slice to the next UAX#14 line break opportunity.
let mut slice = last_slice.end..*break_index;
let word = &formatting_context_text[slice.clone()];
@@ -376,87 +421,15 @@ impl TextRun {
bidi_info: &BidiInfo,
) {
let parent_style = self.inline_styles.style.borrow().clone();
let inherited_text_style = parent_style.get_inherited_text().clone();
let letter_spacing = inherited_text_style
.letter_spacing
.0
.resolve(parent_style.clone_font().font_size.computed_size());
let letter_spacing = if letter_spacing.px() != 0. {
Some(app_units::Au::from(letter_spacing))
} else {
None
};
let language = parent_style
.get_font()
._x_lang
.0
.parse()
.unwrap_or(Language::UND);
let mut flags = ShapingFlags::empty();
if inherited_text_style.text_rendering == TextRendering::Optimizespeed {
flags.insert(ShapingFlags::IGNORE_LIGATURES_SHAPING_FLAG);
flags.insert(ShapingFlags::DISABLE_KERNING_SHAPING_FLAG)
let mut segments = self.segment_text_by_font(
layout_context,
formatting_context_text,
bidi_info,
&parent_style,
);
for segment in segments.iter_mut() {
segment.shape_text(&parent_style, formatting_context_text, linebreaker);
}
let specified_word_spacing = &inherited_text_style.word_spacing;
let style_word_spacing: Option<Au> = specified_word_spacing.to_length().map(|l| l.into());
let segments = self
.segment_text_by_font(
layout_context,
formatting_context_text,
bidi_info,
&parent_style,
)
.into_iter()
.map(|mut segment| {
let word_spacing = style_word_spacing.unwrap_or_else(|| {
let space_width = segment
.info
.font
.glyph_index(' ')
.map(|glyph_id| segment.info.font.glyph_h_advance(glyph_id))
.unwrap_or(LAST_RESORT_GLYPH_ADVANCE);
specified_word_spacing.to_used_value(Au::from_f64_px(space_width))
});
let mut flags = flags;
if segment.info.bidi_level.is_rtl() {
flags.insert(ShapingFlags::RTL_FLAG);
}
// From https://www.w3.org/TR/css-text-3/#cursive-script:
// Cursive scripts do not admit gaps between their letters for either
// justification or letter-spacing.
let letter_spacing = if is_cursive_script(segment.info.script) {
None
} else {
letter_spacing
};
if letter_spacing.is_some() {
flags.insert(ShapingFlags::IGNORE_LIGATURES_SHAPING_FLAG);
};
let shaping_options = ShapingOptions {
letter_spacing,
word_spacing,
script: segment.info.script,
language,
flags,
};
segment.shape_text(
&parent_style,
formatting_context_text,
linebreaker,
&shaping_options,
);
segment
})
.collect();
let _ = std::mem::replace(&mut self.shaped_text, segments);
}
@@ -476,15 +449,42 @@ impl TextRun {
let mut current: Option<TextRunSegment> = None;
let mut results = Vec::new();
let lang = parent_style.get_font()._x_lang.clone();
let x_lang = parent_style.get_font()._x_lang.clone();
let language = x_lang.0.parse().unwrap_or(Language::UND);
let text_run_text = &formatting_context_text[self.text_range.clone()];
let char_iterator = TwoCharsAtATimeIterator::new(text_run_text.chars());
let parent_style = self.inline_styles.style.borrow().clone();
let inherited_text_style = parent_style.get_inherited_text().clone();
let letter_spacing = inherited_text_style
.letter_spacing
.0
.resolve(parent_style.clone_font().font_size.computed_size());
let letter_spacing = if letter_spacing.px() != 0. {
Some(app_units::Au::from(letter_spacing))
} else {
None
};
let text_rendering = inherited_text_style.text_rendering;
let word_spacing = inherited_text_style.word_spacing.to_length().map(Au::from);
// The next current character index within the entire inline formatting context's text.
let mut next_character_index = self.character_range.start;
// The next byte index of the character within the entire inline formatting context's text.
let mut next_byte_index = self.text_range.start;
let resolve_word_spacing_for_font = |font: &FontRef| {
word_spacing.unwrap_or_else(|| {
let space_width = font
.glyph_index(' ')
.map(|glyph_id| font.glyph_h_advance(glyph_id))
.unwrap_or(LAST_RESORT_GLYPH_ADVANCE);
inherited_text_style
.word_spacing
.to_used_value(Au::from_f64_px(space_width))
})
};
for (character, next_character) in char_iterator {
let current_character_index = next_character_index;
next_character_index += 1;
@@ -500,24 +500,41 @@ impl TextRun {
&layout_context.font_context,
character,
next_character,
lang.clone(),
language,
) else {
continue;
};
let info = FontAndScriptInfo {
font,
script: Script::from(character),
bidi_level: bidi_info.levels[current_byte_index],
};
let script = Script::from(character);
let bidi_level = bidi_info.levels[current_byte_index];
// If the existing segment is compatible with the character, keep going.
if let Some(current) = current.as_mut() {
if current.update_if_compatible(&info) {
if current.update_if_compatible(&font, script, bidi_level) {
continue;
}
}
// From https://www.w3.org/TR/css-text-3/#cursive-script:
// Cursive scripts do not admit gaps between their letters for either
// justification or letter-spacing.
let letter_spacing = if is_cursive_script(script) {
None
} else {
letter_spacing
};
let word_spacing = Some(resolve_word_spacing_for_font(&font));
let info = FontAndScriptInfo {
font,
script,
bidi_level,
language,
word_spacing,
letter_spacing,
text_rendering,
};
// Add the new segment and finish the existing one, if we had one. If the first
// characters in the run were control characters we may be creating the first
// segment in the middle of the run (ie the start should be the start of this
@@ -539,11 +556,16 @@ impl TextRun {
// of those cases, just use the first font.
if current.is_none() {
current = font_group.first(&layout_context.font_context).map(|font| {
let word_spacing = Some(resolve_word_spacing_for_font(&font));
TextRunSegment::new(
Arc::new(FontAndScriptInfo {
font,
script: Script::Common,
language,
bidi_level: Level::ltr(),
letter_spacing,
word_spacing,
text_rendering,
}),
self.text_range.start,
self.character_range.start,

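The `resolve_word_spacing_for_font` closure above implements a per-font fallback: an explicit `word-spacing` length wins; otherwise the advance of the font's space glyph is used, with a last-resort constant when the font lacks a space glyph. A sketch with simplified stand-in types (the real code also resolves percentage values against the space advance via `to_used_value`):

```rust
// Stand-in for the LAST_RESORT_GLYPH_ADVANCE constant in the diff.
const LAST_RESORT_GLYPH_ADVANCE: f64 = 10.0;

struct Font {
    // Advance of the ' ' glyph, if the font has one.
    space_advance: Option<f64>,
}

fn resolve_word_spacing(style_word_spacing: Option<f64>, font: &Font) -> f64 {
    style_word_spacing
        .unwrap_or_else(|| font.space_advance.unwrap_or(LAST_RESORT_GLYPH_ADVANCE))
}

fn main() {
    let font = Font { space_advance: Some(4.5) };
    // An explicit style value wins.
    assert_eq!(resolve_word_spacing(Some(2.0), &font), 2.0);
    // Otherwise the space glyph's advance is used.
    assert_eq!(resolve_word_spacing(None, &font), 4.5);
    // Fonts without a space glyph fall back to a constant.
    let no_space = Font { space_advance: None };
    assert_eq!(resolve_word_spacing(None, &no_space), LAST_RESORT_GLYPH_ADVANCE);
    println!("ok");
}
```

Computing this per segment is what lets the refactor store a ready-to-use `word_spacing` in `FontAndScriptInfo` instead of re-deriving it during shaping.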

@@ -1052,11 +1052,11 @@ fn layout_in_flow_non_replaced_block_level_same_formatting_context(
// Introduce clearance if necessary.
clearance = sequential_layout_state.calculate_clearance(clear, &block_start_margin);
if clearance.is_some() {
sequential_layout_state.collapse_margins();
sequential_layout_state.commit_margin();
}
sequential_layout_state.adjoin_assign(&block_start_margin);
if !start_margin_can_collapse_with_children {
sequential_layout_state.collapse_margins();
sequential_layout_state.commit_margin();
}
// NB: This will be a no-op if we're collapsing margins with our children since that
@@ -1200,7 +1200,7 @@ fn layout_in_flow_non_replaced_block_level_same_formatting_context(
);
if !end_margin_can_collapse_with_children {
sequential_layout_state.collapse_margins();
sequential_layout_state.commit_margin();
}
sequential_layout_state.adjoin_assign(&CollapsedMargin::new(margin.block_end));
}
@@ -1666,12 +1666,12 @@ impl IndependentFormattingContext {
// Clearance prevents margin collapse between this block and previous ones,
// so in that case collapse margins before adjoining them below.
if clearance.is_some() {
sequential_layout_state.collapse_margins();
sequential_layout_state.commit_margin();
}
sequential_layout_state.adjoin_assign(&collapsed_margin_block_start);
// Margins can never collapse into independent formatting contexts.
sequential_layout_state.collapse_margins();
sequential_layout_state.commit_margin();
sequential_layout_state.advance_block_position(
pbm.padding_border_sums.block + content_size.block + clearance.unwrap_or_else(Au::zero),
);


@@ -17,6 +17,7 @@ use embedder_traits::{Theme, ViewportDetails};
use euclid::{Point2D, Rect, Scale, Size2D};
use fonts::{FontContext, FontContextWebFontMethods, WebFontDocumentContext};
use fonts_traits::StylesheetWebFontLoadFinishedCallback;
use icu_locid::subtags::Language;
use layout_api::wrapper_traits::LayoutNode;
use layout_api::{
AxesOverflow, BoxAreaType, CSSPixelRectIterator, IFrameSizes, Layout, LayoutConfig,
@@ -68,7 +69,7 @@ use style::stylist::Stylist;
use style::traversal::DomTraversal;
use style::traversal_flags::TraversalFlags;
use style::values::computed::font::GenericFontFamily;
use style::values::computed::{CSSPixelLength, FontSize, Length, NonNegativeLength, XLang};
use style::values::computed::{CSSPixelLength, FontSize, Length, NonNegativeLength};
use style::values::specified::font::{KeywordInfo, QueryFontMetricsFlags};
use style::{Zero, driver};
use style_traits::{CSSPixel, SpeculativePainter};
@@ -1514,7 +1515,7 @@ impl FontMetricsProvider for LayoutFontMetricsProvider {
.zero_horizontal_advance
.or_else(|| {
font_group
.find_by_codepoint(font_context, '0', None, XLang::get_initial_value())?
.find_by_codepoint(font_context, '0', None, Language::UND)?
.metrics
.zero_horizontal_advance
})
@@ -1524,7 +1525,7 @@ impl FontMetricsProvider for LayoutFontMetricsProvider {
.ic_horizontal_advance
.or_else(|| {
font_group
.find_by_codepoint(font_context, '\u{6C34}', None, XLang::get_initial_value())?
.find_by_codepoint(font_context, '\u{6C34}', None, Language::UND)?
.metrics
.ic_horizontal_advance
})


@@ -22,6 +22,7 @@ use style::computed_values::object_fit::T as ObjectFit;
use style::logical_geometry::{Direction, WritingMode};
use style::properties::{ComputedValues, StyleBuilder};
use style::rule_cache::RuleCacheConditions;
use style::rule_tree::RuleCascadeFlags;
use style::servo::url::ComputedUrl;
use style::stylesheets::container_rule::ContainerSizeQuery;
use style::values::CSSFloat;
@@ -244,6 +245,7 @@ impl ReplacedContents {
context.style_context.quirks_mode(),
rule_cache_conditions,
ContainerSizeQuery::none(),
RuleCascadeFlags::empty(),
);
let attr_to_computed = |attr_val: &AttrValue| {


@@ -42,11 +42,6 @@ textarea:disabled {
border-color: lightgrey;
}
input::-servo-text-control-inner-editor {
overflow-wrap: normal;
pointer-events: auto;
}
/* FIXME(#36982): Use `display: block; align-content: center` instead of flex. */
input::-servo-text-control-inner-container {
display: flex;
@@ -55,23 +50,46 @@ input::-servo-text-control-inner-container {
position: relative;
}
input:not(:placeholder-shown)::placeholder {
visibility: hidden !important;
}
input::-servo-text-control-inner-editor, input::placeholder {
input::-servo-text-control-inner-editor,
input::placeholder {
/* This limits the block size of the inner div to that of the text size so that
it can be centered in the container. This is only necessary for <input>. */
block-size: fit-content !important;
inset-block: 0 !important;
margin-block: auto !important;
min-width: stretch;
white-space: pre;
}
input::placeholder {
input::-servo-text-control-inner-editor,
input::placeholder,
textarea::-servo-text-control-inner-editor,
textarea::placeholder {
min-width: stretch;
overflow-wrap: normal;
}
input:not(:placeholder-shown)::placeholder,
textarea:not(:placeholder-shown)::placeholder {
display: none !important;
}
input:placeholder-shown::-servo-text-control-inner-editor,
textarea:placeholder-shown::-servo-text-control-inner-editor {
display: none !important;
}
input::placeholder,
textarea::placeholder {
color: grey;
overflow: hidden;
}
input::-servo-text-control-inner-editor {
pointer-events: auto !important;
}
input::placeholder {
pointer-events: none !important;
position: absolute !important;
}
input::selection,
@@ -372,16 +390,6 @@ details {
display: block;
}
details::-servo-details-summary {
margin-left: 40px;
display: list-item;
list-style: disclosure-closed;
}
details[open]::-servo-details-summary {
list-style: disclosure-open;
}
details::details-content {
/* TODO: This should be "display: block; content-visibility: hidden;",
but servo does not support content-visibility yet */


@@ -1169,14 +1169,18 @@ fn add_column(
// The HTML specification clamps value of `span` for `<col>` to [1, 1000].
assert!((1..=1000).contains(&span));
let column = old_column.unwrap_or_else(|| {
ArcRefCell::new(TableTrack {
let column = match old_column {
Some(column) => {
column.borrow_mut().group_index = group_index;
column
},
None => ArcRefCell::new(TableTrack {
base: LayoutBoxBase::new(column_info.into(), column_info.style.clone()),
group_index,
is_anonymous,
shared_background_style: SharedStyle::new(column_info.style.clone()),
})
});
}),
};
collection.extend(repeat_n(column.clone(), span as usize));
column
}


@@ -18,9 +18,11 @@ app_units = { workspace = true }
atomic_refcell = { workspace = true }
content-security-policy = { workspace = true }
crossbeam-channel = { workspace = true }
data-url = { workspace = true }
encoding_rs = '0.8'
euclid = { workspace = true }
http = { workspace = true }
icu_locid = { workspace = true }
indexmap = { workspace = true }
ipc-channel = { workspace = true }
keyboard-types = { workspace = true }


@@ -59,7 +59,6 @@ use resvg::usvg::{self, tiny_skia_path};
use style::properties::ComputedValues;
use style::values::generics::length::GenericLengthPercentageOrAuto;
pub use stylo_malloc_size_of::MallocSizeOfOps;
use uuid::Uuid;
/// Trait for measuring the "deep" heap usage of a data structure. This is the
/// most commonly-used of the traits.
@@ -392,6 +391,13 @@ impl<T: MallocSizeOf> MallocSizeOf for std::collections::VecDeque<T> {
}
}
impl MallocSizeOf for std::path::PathBuf {
fn size_of(&self, _ops: &mut MallocSizeOfOps) -> usize {
// This should be an approximation of the actual size
self.as_os_str().as_encoded_bytes().len()
}
}
impl<A: smallvec::Array> MallocShallowSizeOf for smallvec::SmallVec<A> {
fn shallow_size_of(&self, ops: &mut MallocSizeOfOps) -> usize {
if self.spilled() {
@@ -1073,6 +1079,12 @@ impl<T> MallocSizeOf for tokio::sync::mpsc::UnboundedSender<T> {
}
}
impl<T> MallocSizeOf for tokio::sync::oneshot::Sender<T> {
fn size_of(&self, _ops: &mut MallocSizeOfOps) -> usize {
0
}
}
impl<T> MallocSizeOf for ipc_channel::ipc::IpcSender<T> {
fn size_of(&self, _ops: &mut MallocSizeOfOps) -> usize {
0
@@ -1109,6 +1121,22 @@ impl MallocSizeOf for servo_arc::Arc<ComputedValues> {
}
}
impl MallocSizeOf for http::HeaderMap {
fn size_of(&self, ops: &mut MallocSizeOfOps) -> usize {
// The header map in http is more complicated than a simple hash map.
// However, this should give us a reasonable approximation.
self.iter()
.map(|entry| entry.0.size_of(ops) + entry.1.size_of(ops))
.sum()
}
}
impl MallocSizeOf for data_url::mime::Mime {
fn size_of(&self, ops: &mut MallocSizeOfOps) -> usize {
self.type_.size_of(ops) + self.parameters.size_of(ops) + self.subtype.size_of(ops)
}
}
malloc_size_of_hash_map!(indexmap::IndexMap<K, V, S>);
malloc_size_of_hash_set!(indexmap::IndexSet<T, S>);
@@ -1117,12 +1145,14 @@ malloc_size_of_is_0!(f32, f64);
malloc_size_of_is_0!(i8, i16, i32, i64, i128, isize);
malloc_size_of_is_0!(u8, u16, u32, u64, u128, usize);
malloc_size_of_is_0!(Uuid);
malloc_size_of_is_0!(uuid::Uuid);
malloc_size_of_is_0!(app_units::Au);
malloc_size_of_is_0!(content_security_policy::Destination);
malloc_size_of_is_0!(content_security_policy::sandboxing_directive::SandboxingFlagSet);
malloc_size_of_is_0!(encoding_rs::Decoder);
malloc_size_of_is_0!(http::StatusCode);
malloc_size_of_is_0!(http::Method);
malloc_size_of_is_0!(icu_locid::subtags::Language);
malloc_size_of_is_0!(keyboard_types::Code);
malloc_size_of_is_0!(keyboard_types::Modifiers);
malloc_size_of_is_0!(mime::Mime);
@@ -1154,9 +1184,22 @@ malloc_size_of_is_0!(taffy::Layout);
malloc_size_of_is_0!(time::Duration);
malloc_size_of_is_0!(unicode_bidi::Level);
malloc_size_of_is_0!(unicode_script::Script);
malloc_size_of_is_0!(urlpattern::UrlPattern);
malloc_size_of_is_0!(std::net::TcpStream);
impl MallocSizeOf for urlpattern::UrlPattern {
fn size_of(&self, _ops: &mut MallocSizeOfOps) -> usize {
// This is an approximation
self.protocol().len() +
self.username().len() +
self.password().len() +
self.hostname().len() +
self.port().len() +
self.pathname().len() +
self.search().len() +
self.hash().len()
}
}
impl<S: tendril::TendrilSink<tendril::fmt::UTF8, A>, A: tendril::Atomicity> MallocSizeOf
for tendril::stream::LossyDecoder<S, A>
{
@@ -1267,6 +1310,7 @@ malloc_size_of_is_stylo_malloc_size_of!(style::attr::AttrValue);
malloc_size_of_is_stylo_malloc_size_of!(style::color::AbsoluteColor);
malloc_size_of_is_stylo_malloc_size_of!(style::computed_values::font_variant_caps::T);
malloc_size_of_is_stylo_malloc_size_of!(style::computed_values::text_decoration_style::T);
malloc_size_of_is_stylo_malloc_size_of!(style::computed_values::text_rendering::T);
malloc_size_of_is_stylo_malloc_size_of!(style::dom::OpaqueNode);
malloc_size_of_is_stylo_malloc_size_of!(style::invalidation::element::restyle_hints::RestyleHint);
malloc_size_of_is_stylo_malloc_size_of!(style::logical_geometry::WritingMode);

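The new `MallocSizeOf` impls above share one strategy: when a type's internals are opaque (`PathBuf`, `HeaderMap`, `UrlPattern`), report the summed byte length of its externally visible parts as an approximation instead of walking private allocations. A self-contained sketch with a stand-in trait (`ApproxHeapSize` is ours, not the `malloc_size_of` API):

```rust
// Hypothetical trait standing in for MallocSizeOf's size_of method.
trait ApproxHeapSize {
    fn approx_heap_size(&self) -> usize;
}

impl ApproxHeapSize for std::path::PathBuf {
    fn approx_heap_size(&self) -> usize {
        // Approximates the allocation behind the path's byte buffer,
        // mirroring the PathBuf impl in the diff.
        self.as_os_str().as_encoded_bytes().len()
    }
}

impl ApproxHeapSize for Vec<(String, String)> {
    fn approx_heap_size(&self) -> usize {
        // A header-map-like structure: sum each key/value pair, as the
        // HeaderMap impl does.
        self.iter().map(|(k, v)| k.len() + v.len()).sum()
    }
}

fn main() {
    let path = std::path::PathBuf::from("/tmp/servo");
    assert_eq!(path.approx_heap_size(), "/tmp/servo".len());
    let headers = vec![("content-type".to_string(), "text/html".to_string())];
    assert_eq!(
        headers.approx_heap_size(),
        "content-type".len() + "text/html".len()
    );
    println!("ok");
}
```

The trade-off is accepted explicitly in the diff's comments: memory reports stay cheap and allocation-free, at the cost of undercounting container overhead.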

@@ -24,3 +24,14 @@ mime = "0.3.13"
once_cell = "1.18.0"
log = "0.4"
ohos-media-sys = { version = "0.0.5", features = ["api-21"] }
ohos-window-sys = { version = "0.1.3", features = ["api-13"] }
ohos-sys-opaque-types = { version = "0.1.7" }
ipc-channel = { workspace = true }
crossbeam-channel = { workspace = true }
lru = "0.16.3"
rangemap = "1.6.0"
libc = "0.2"
yuv = "0.8.11"
[build-dependencies]
serde_json.workspace = true


@@ -0,0 +1,41 @@
/* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use std::path;
// build.rs
fn main() {
println!("cargo:rustc-check-cfg=cfg(sdk_api_21)");
println!("cargo:rustc-check-cfg=cfg(sdk_api_22)");
println!("cargo:rustc-check-cfg=cfg(sdk_api_23)");
let target_env = std::env::var("CARGO_CFG_TARGET_ENV").unwrap();
if target_env != "ohos" {
return;
}
let sdk_path_name = std::env::var("OHOS_SDK_NATIVE").expect("OHOS_SDK_NATIVE must be set");
let sdk_path = path::PathBuf::from(sdk_path_name);
let meta_file_path = sdk_path.join("oh-uni-package.json");
let meta_info = serde_json::from_str::<serde_json::Value>(
&std::fs::read_to_string(&meta_file_path).expect("Failed to read oh-uni-package.json"),
)
.expect("Failed to parse oh-uni-package.json");
let api_version_str = meta_info
.get("apiVersion")
.expect("Unable to find apiVersion in oh-uni-package.json")
.as_str()
.expect("apiVersion should be a string");
let api_version = api_version_str
.parse::<u32>()
.expect("apiVersion should be a valid integer");
let low_api_version = 21;
if let 21.. = api_version {
for version in low_api_version..=api_version {
println!("cargo:rustc-cfg=sdk_api_{}", version);
}
}
println!("cargo:warning=Detected API version: {:?}", api_version);
println!("cargo:rerun-if-env-changed=OHOS_SDK_NATIVE");
}

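The build script above emits a cumulative set of cfg flags: for a detected SDK `apiVersion` N >= 21, every flag from `sdk_api_21` through `sdk_api_N` is printed, so downstream code can gate on "at least version X" with a single `#[cfg(sdk_api_X)]`. A sketch of just the flag computation (the function name is ours; the real script prints `cargo:rustc-cfg=` lines instead of returning a vector):

```rust
// Compute the cumulative SDK cfg flag names for a detected apiVersion.
fn cumulative_sdk_flags(api_version: u32) -> Vec<String> {
    const LOW_API_VERSION: u32 = 21;
    if api_version < LOW_API_VERSION {
        // Versions below the supported floor produce no flags.
        return Vec::new();
    }
    (LOW_API_VERSION..=api_version)
        .map(|v| format!("sdk_api_{}", v))
        .collect()
}

fn main() {
    assert_eq!(
        cumulative_sdk_flags(23),
        ["sdk_api_21", "sdk_api_22", "sdk_api_23"]
    );
    assert_eq!(cumulative_sdk_flags(21), ["sdk_api_21"]);
    assert!(cumulative_sdk_flags(20).is_empty());
    println!("ok");
}
```

Emitting every intermediate flag avoids version-range comparisons at each use site; each flag also needs a matching `cargo:rustc-check-cfg` declaration, as the script's first lines show.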

@@ -2,28 +2,30 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use std::{
collections::HashMap,
sync::{
atomic::AtomicUsize,
mpsc::{self, Sender},
Arc, Mutex, Weak,
},
thread,
};
use std::collections::HashMap;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::mpsc::{self, Sender};
use std::sync::{Arc, Mutex, Weak};
use std::thread;
use log::warn;
use log::{debug, warn};
use mime::Mime;
use servo_media::player::StreamType;
use servo_media::{
Backend, BackendInit, BackendMsg, ClientContextId, MediaInstance, SupportsMediaType,
Backend, BackendInit, BackendMsg, ClientContextId, MediaInstance, MediaInstanceError,
SupportsMediaType,
};
use crate::{player::OhosAVPlayer, registry_scanner::OHOS_REGISTRY_SCANNER};
use crate::player::OhosAvPlayer;
use crate::registry_scanner::OHOS_REGISTRY_SCANNER;
mod ohos_media;
mod player;
mod registry_scanner;
type MediaInstanceMap = HashMap<ClientContextId, Vec<(usize, Weak<Mutex<dyn MediaInstance>>)>>;
pub struct OhosBackend {
instances: Arc<Mutex<HashMap<ClientContextId, Vec<(usize, Weak<Mutex<dyn MediaInstance>>)>>>>,
instances: Arc<Mutex<MediaInstanceMap>>,
next_instance_id: AtomicUsize,
backend_chan: Arc<Mutex<Sender<BackendMsg>>>,
}
@@ -32,7 +34,7 @@ impl OhosBackend {
fn media_instance_action(
&self,
id: &ClientContextId,
cb: &dyn Fn(&dyn MediaInstance) -> Result<(), ()>,
cb: &dyn Fn(&dyn MediaInstance) -> Result<(), MediaInstanceError>,
) {
let mut instances = self.instances.lock().unwrap();
match instances.get_mut(id) {
@@ -48,16 +50,14 @@ impl OhosBackend {
}),
None => {
warn!("Trying to exec media action on an unknown client context");
}
},
}
}
}
impl BackendInit for OhosBackend {
fn init() -> Box<dyn Backend> {
let instances: Arc<
Mutex<HashMap<ClientContextId, Vec<(usize, Weak<Mutex<dyn MediaInstance>>)>>>,
> = Arc::new(Mutex::new(HashMap::new()));
let instances: Arc<Mutex<MediaInstanceMap>> = Arc::new(Mutex::new(HashMap::new()));
let instances_ = instances.clone();
let (backend_chan, recvr) = mpsc::channel();
@@ -65,7 +65,11 @@ impl BackendInit for OhosBackend {
.name("OhosBackend ShutdownThread".to_owned())
.spawn(move || {
match recvr.recv().unwrap() {
BackendMsg::Shutdown { context, id, tx_ack } => {
BackendMsg::Shutdown {
context,
id,
tx_ack,
} => {
let mut map = instances_.lock().unwrap();
if let Some(vec) = map.get_mut(&context) {
vec.retain(|m| m.0 != id);
@@ -74,15 +78,15 @@ impl BackendInit for OhosBackend {
}
}
let _ = tx_ack.send(());
}
},
};
})
.unwrap();
return Box::new(OhosBackend {
Box::new(OhosBackend {
next_instance_id: AtomicUsize::new(0),
instances,
backend_chan: Arc::new(Mutex::new(backend_chan)),
});
})
}
}
@@ -92,7 +96,7 @@ impl BackendInit for OhosBackend {
impl Backend for OhosBackend {
fn create_player(
&self,
id: &servo_media::ClientContextId,
context_id: &servo_media::ClientContextId,
stream_type: servo_media_player::StreamType,
sender: servo_media_player::ipc_channel::ipc::IpcSender<servo_media_player::PlayerEvent>,
video_renderer: Option<
@@ -101,9 +105,33 @@ impl Backend for OhosBackend {
audio_renderer: Option<
std::sync::Arc<std::sync::Mutex<dyn servo_media_player::audio::AudioRenderer>>,
>,
gl_context: Box<dyn servo_media_player::context::PlayerGLContext>,
_gl_context: Box<dyn servo_media_player::context::PlayerGLContext>,
) -> std::sync::Arc<std::sync::Mutex<dyn servo_media_player::Player>> {
Arc::new(Mutex::new(OhosAVPlayer::new()))
// TODO: Choose a different Player implementation depending on stream_type.
match stream_type {
StreamType::Stream => {
todo!("StreamType::Stream is currently not supported!")
},
StreamType::Seekable => (),
}
if let Some(_audio_renderer) = audio_renderer {
warn!("Audio Rendering Currently Not Supported!");
}
let player_id = self.next_instance_id.fetch_add(1, Ordering::Relaxed);
debug!("Creating Player in OhosBackend");
let mut player = OhosAvPlayer::new(
player_id,
*context_id,
sender,
video_renderer,
self.backend_chan.clone(),
);
player.setup_info_event();
player.setup_data_source();
Arc::new(Mutex::new(player))
}
fn create_audiostream(&self) -> servo_media_streams::MediaStreamId {
@@ -120,7 +148,7 @@ impl Backend for OhosBackend {
fn create_stream_and_socket(
&self,
ty: servo_media_streams::MediaStreamType,
_ty: servo_media_streams::MediaStreamType,
) -> (
Box<dyn servo_media_streams::MediaSocket>,
servo_media_streams::MediaStreamId,
@@ -130,22 +158,22 @@ impl Backend for OhosBackend {
fn create_audioinput_stream(
&self,
set: servo_media_streams::capture::MediaTrackConstraintSet,
_set: servo_media_streams::capture::MediaTrackConstraintSet,
) -> Option<servo_media_streams::MediaStreamId> {
todo!()
}
fn create_videoinput_stream(
&self,
set: servo_media_streams::capture::MediaTrackConstraintSet,
_set: servo_media_streams::capture::MediaTrackConstraintSet,
) -> Option<servo_media_streams::MediaStreamId> {
todo!()
}
fn create_audio_context(
&self,
id: &servo_media::ClientContextId,
options: servo_media_audio::context::AudioContextOptions,
_id: &servo_media::ClientContextId,
_options: servo_media_audio::context::AudioContextOptions,
) -> Result<
std::sync::Arc<std::sync::Mutex<servo_media_audio::context::AudioContext>>,
servo_media_audio::sink::AudioSinkError,
@@ -155,7 +183,7 @@ impl Backend for OhosBackend {
fn create_webrtc(
&self,
signaller: Box<dyn servo_media_webrtc::WebRtcSignaller>,
_signaller: Box<dyn servo_media_webrtc::WebRtcSignaller>,
) -> servo_media_webrtc::WebRtcController {
todo!()
}
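The `BackendMsg::Shutdown` handler above prunes the per-context instance list with `vec.retain(|m| m.0 != id)`. A toy model of that bookkeeping, using `u64` in place of `ClientContextId` and `String` in place of a real media instance (all names here are illustrative):

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex, Weak};

// `u64` stands in for ClientContextId in this sketch.
type ContextId = u64;

/// Per-context registry of media instances, keyed by (instance id, weak
/// handle), mirroring the MediaInstanceMap alias introduced in the diff.
struct Registry {
    map: HashMap<ContextId, Vec<(usize, Weak<Mutex<String>>)>>,
}

impl Registry {
    fn new() -> Self {
        Registry { map: HashMap::new() }
    }

    fn register(&mut self, context: ContextId, id: usize, instance: &Arc<Mutex<String>>) {
        self.map
            .entry(context)
            .or_default()
            .push((id, Arc::downgrade(instance)));
    }

    /// Handle a Shutdown message: drop the entry with the matching id,
    /// as the OhosBackend shutdown thread does with `retain`.
    fn shutdown(&mut self, context: ContextId, id: usize) {
        if let Some(vec) = self.map.get_mut(&context) {
            vec.retain(|m| m.0 != id);
        }
    }

    fn count(&self, context: ContextId) -> usize {
        self.map.get(&context).map_or(0, |v| v.len())
    }
}

fn main() {
    let mut registry = Registry::new();
    let player = Arc::new(Mutex::new(String::from("player-0")));
    registry.register(1, 0, &player);
    registry.register(1, 1, &player);
    assert_eq!(registry.count(1), 2);
    registry.shutdown(1, 0);
    assert_eq!(registry.count(1), 1);
}
```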

View File

@@ -0,0 +1,431 @@
/* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use std::ffi::c_void;
use libc::pollfd;
use log::{debug, warn};
use ohos_media_sys::avformat::OH_AVFormat;
use ohos_media_sys::avplayer::{
OH_AVPlayer_Create, OH_AVPlayer_Pause, OH_AVPlayer_Play, OH_AVPlayer_Prepare,
OH_AVPlayer_Release, OH_AVPlayer_Seek, OH_AVPlayer_SetOnInfoCallback,
OH_AVPlayer_SetPlaybackSpeed, OH_AVPlayer_SetVideoSurface, OH_AVPlayer_SetVolume,
OH_AVPlayer_Stop,
};
use ohos_media_sys::avplayer_base::{
AVPlaybackSpeed, AVPlayerOnInfoType, AVPlayerSeekMode, AVPlayerState, OH_AVPlayer,
};
use ohos_sys_opaque_types::{OH_NativeImage, OHNativeWindow, OHNativeWindowBuffer};
use ohos_window_sys::native_buffer::native_buffer::OH_NativeBuffer_Usage;
use ohos_window_sys::native_image::{
OH_ConsumerSurface_Create, OH_ConsumerSurface_SetDefaultUsage,
OH_NativeImage_AcquireNativeWindow, OH_NativeImage_AcquireNativeWindowBuffer,
OH_NativeImage_Destroy, OH_NativeImage_ReleaseNativeWindowBuffer,
OH_NativeImage_SetOnFrameAvailableListener, OH_OnFrameAvailableListener,
};
use ohos_window_sys::native_window::{
OH_NativeWindow_DestroyNativeWindow, OH_NativeWindow_GetBufferHandleFromNative,
OH_NativeWindow_NativeObjectReference, OH_NativeWindow_NativeObjectUnreference,
};
#[cfg(not(sdk_api_21))]
use crate::ohos_media::dummy_source::MediaSourceWrapper;
#[cfg(sdk_api_21)]
use crate::ohos_media::source::MediaSourceWrapper;
#[repr(C)]
#[derive(Debug)]
pub struct FrameInfo {
pub fd: i32,
pub width: i32,
pub height: i32,
pub stride: i32,
pub size: i32,
pub format: i32,
pub vir_addr: *mut u8,
native_window_buffer: *mut OHNativeWindowBuffer,
fence_fd: i32,
}
pub struct OhosPlayer {
native_image: Option<*mut OH_NativeImage>,
ohos_av_player: *mut OH_AVPlayer,
media_data_source: Option<MediaSourceWrapper>,
event_info_callback_closure: Option<*mut Box<dyn Fn(AVPlayerOnInfoType, *mut OH_AVFormat)>>,
frame_available_callback_closure: Option<*mut Box<dyn Fn()>>,
native_window: Option<*mut OHNativeWindow>,
has_set_source_size: bool,
has_set_window: bool,
volume: f64,
playback_rate: f64,
state: AVPlayerState,
}
impl OhosPlayer {
pub fn new() -> Self {
debug!("Creating OHOS Player!");
OhosPlayer {
native_image: None,
ohos_av_player: unsafe { OH_AVPlayer_Create() },
media_data_source: None,
event_info_callback_closure: None,
frame_available_callback_closure: None,
native_window: None,
has_set_source_size: false,
has_set_window: false,
volume: 1.0,
playback_rate: 1.0,
state: AVPlayerState::AV_IDLE,
}
}
// Initialization helpers.
pub fn set_state(&mut self, state: AVPlayerState) {
self.state = state;
self.initialize_check_state_action();
}
/// Tries to run initialization-phase actions; each action should only
/// run once. Re-check which actions can run whenever
/// 1. the state changes, or
/// 2. an external initialization step completes,
///    e.g. setup_window_buffer_listener.
pub fn initialize_check_state_action(&mut self) {
if self.state == AVPlayerState::AV_INITIALIZED &&
self.native_window.is_some() &&
!self.has_set_window
{
self.has_set_window = true;
self.set_window_to_player();
self.prepare(); // only prepare after setting window.
}
}
fn set_window_to_player(&mut self) {
let Some(native_window) = self.native_window else {
warn!("Setting window to player, but Native Window not initialized!");
return;
};
unsafe {
OH_AVPlayer_SetVideoSurface(self.ohos_av_player, native_window);
}
}
/// The first step of the initialization process; afterwards the AVPlayer
/// transitions into the initialized state.
fn setup_data_source(&mut self) {
let Some(ref mut source) = self.media_data_source else {
warn!("Error: source not initialized!");
return;
};
debug!("Setting up data source");
source.set_data_src(self.ohos_av_player);
}
pub fn set_volume(&mut self, volume: f64) {
unsafe {
OH_AVPlayer_SetVolume(self.ohos_av_player, volume as f32, volume as f32);
}
self.volume = volume;
}
pub fn volume(&self) -> f64 {
self.volume
}
pub fn set_source(&mut self, source: MediaSourceWrapper) {
// TODO: Find a better way to provide the data to the player.
self.media_data_source = Some(source);
}
pub fn end_of_stream(&self) {
if let Some(source) = &self.media_data_source {
source.end_of_stream();
}
}
pub fn push_data(&self, data: Vec<u8>) {
if let Some(inner_source) = &self.media_data_source {
inner_source.push_data(data);
}
}
pub fn play(&self) {
unsafe {
debug!("OH_AVPlayer_Play!");
OH_AVPlayer_Play(self.ohos_av_player);
}
}
pub fn set_mute(&mut self, mute: bool) {
debug!("OH_AVPlayer Set mute: {}", mute);
let volume = match mute {
true => 0.,
false => 1.,
};
unsafe {
OH_AVPlayer_SetVolume(self.ohos_av_player, volume, volume);
}
self.volume = volume as f64;
}
pub fn muted(&self) -> bool {
self.volume == 0.0
}
pub fn seek(&self, second: i32) {
unsafe {
log::info!("OH_AVPlayer_Seek! :{}", second);
OH_AVPlayer_Seek(
self.ohos_av_player,
second,
AVPlayerSeekMode::AV_SEEK_CLOSEST,
);
}
}
pub fn set_rate(&mut self, rate: f64) {
self.playback_rate = rate;
// Round toward 1x: for rates >= 1 round down, for rates < 1 round up.
let speed = if rate >= 1.0 {
match rate {
3.0.. => AVPlaybackSpeed::AV_SPEED_FORWARD_3_00_X,
2.0.. => AVPlaybackSpeed::AV_SPEED_FORWARD_2_00_X,
1.75.. => AVPlaybackSpeed::AV_SPEED_FORWARD_1_75_X,
1.5.. => AVPlaybackSpeed::AV_SPEED_FORWARD_1_50_X,
1.25.. => AVPlaybackSpeed::AV_SPEED_FORWARD_1_25_X,
_ => AVPlaybackSpeed::AV_SPEED_FORWARD_1_00_X,
}
} else {
match rate {
..=0.0 => AVPlaybackSpeed::AV_SPEED_FORWARD_1_00_X,
..=0.125 => AVPlaybackSpeed::AV_SPEED_FORWARD_0_125_X,
..=0.25 => AVPlaybackSpeed::AV_SPEED_FORWARD_0_25_X,
..=0.5 => AVPlaybackSpeed::AV_SPEED_FORWARD_0_50_X,
..=0.75 => AVPlaybackSpeed::AV_SPEED_FORWARD_0_75_X,
_ => AVPlaybackSpeed::AV_SPEED_FORWARD_1_00_X,
}
};
unsafe {
OH_AVPlayer_SetPlaybackSpeed(self.ohos_av_player, speed);
}
}
pub fn playback_rate(&self) -> f64 {
self.playback_rate
}
pub fn pause(&self) {
unsafe {
debug!("OH_AVPlayer_Pause!");
OH_AVPlayer_Pause(self.ohos_av_player);
}
}
pub fn stop(&self) {
unsafe {
debug!("OH_AVPlayer_Stop!");
OH_AVPlayer_Stop(self.ohos_av_player);
}
}
pub fn prepare(&mut self) {
unsafe {
debug!("OH_AVPlayer Prepare Called!");
OH_AVPlayer_Prepare(self.ohos_av_player);
}
}
// For AVPlayer, only call SetSource after SetInputSize; otherwise the source is treated as a live stream.
pub fn set_input_size(&mut self, size: u64) {
if let Some(inner_source) = &mut self.media_data_source {
debug!("Setting up data source size: {}", size);
inner_source.set_input_size(size as usize);
// Only set once, on first initialization.
if !self.has_set_source_size {
debug!("Setup data Source");
self.setup_data_source();
self.has_set_source_size = true;
}
}
}
pub fn connect_info_event_callback<F>(&mut self, f: F)
where
F: Fn(AVPlayerOnInfoType, *mut OH_AVFormat) + Send + 'static,
{
debug!("Trying to connect info event callback");
extern "C" fn on_info_event(
_player: *mut OH_AVPlayer,
into_type: AVPlayerOnInfoType,
info_body: *mut OH_AVFormat,
user_data: *mut c_void,
) {
assert!(
!user_data.is_null(),
"on_info_event: user_data must not be null"
);
let f = unsafe {
&*(user_data as *const Box<dyn Fn(AVPlayerOnInfoType, *mut OH_AVFormat)>)
};
f(into_type, info_body);
}
let f: Box<dyn Fn(AVPlayerOnInfoType, *mut OH_AVFormat)> = Box::new(f);
let f: Box<Box<dyn Fn(AVPlayerOnInfoType, *mut OH_AVFormat)>> = Box::new(f);
let raw_ptr_f = unsafe {
let raw_ptr_f = Box::into_raw(f);
let ret = OH_AVPlayer_SetOnInfoCallback(
self.ohos_av_player,
Some(on_info_event),
raw_ptr_f as *mut c_void,
);
debug!("OH AVPlayer Set INFO Callback: {:?}", ret);
raw_ptr_f
};
self.event_info_callback_closure = Some(raw_ptr_f);
}
/// External Initialization step.
pub fn setup_window_buffer_listener<F: Fn() + Send + 'static>(&mut self, f: F) {
let f: Box<dyn Fn()> = Box::new(f);
let f: Box<Box<dyn Fn()>> = Box::new(f);
(
self.native_image,
self.frame_available_callback_closure,
self.native_window,
) = unsafe {
let native_image = OH_ConsumerSurface_Create();
debug!("Native image created :{:p}", native_image);
let ret = OH_ConsumerSurface_SetDefaultUsage(
native_image,
OH_NativeBuffer_Usage::NATIVEBUFFER_USAGE_CPU_READ.0 as u64,
);
debug!("Set consumer surface default usage: {}", ret);
extern "C" fn frame_available_cb(context: *mut c_void) {
assert!(
!context.is_null(),
"frame_available_cb: context must not be null"
);
let f = unsafe { &*(context as *mut Box<dyn Fn()>) };
f();
}
let raw_ptr_f = Box::into_raw(f);
let listener = OH_OnFrameAvailableListener {
context: raw_ptr_f as *mut c_void,
onFrameAvailable: Some(frame_available_cb),
};
let res = OH_NativeImage_SetOnFrameAvailableListener(native_image, listener);
debug!("Native Image Set On Frame Available Listener done: {}", res);
let native_window = OH_NativeImage_AcquireNativeWindow(native_image);
debug!(
"Native window {:p} acquired from native image",
native_window
);
(Some(native_image), Some(raw_ptr_f), Some(native_window))
};
self.initialize_check_state_action();
}
/// Should pair with release_buffer.
pub fn acquire_buffer(&self) -> Option<FrameInfo> {
let native_image = self.native_image?;
let mut native_window_buffer = std::ptr::null_mut();
let mut fence_fd = 0;
let ret = unsafe {
OH_NativeImage_AcquireNativeWindowBuffer(
native_image,
&mut native_window_buffer,
&mut fence_fd,
)
};
if ret != 0 || native_window_buffer.is_null() {
warn!("Failed to acquire native window buffer: ret={}", ret);
return None;
}
debug!("Fence fd: {}", fence_fd);
if fence_fd != 0 && fence_fd != -1 {
let mut pollfds = pollfd {
fd: fence_fd,
events: libc::POLLIN,
revents: 0,
};
let ret = unsafe { libc::poll(&mut pollfds, 1, 3000) };
if ret <= 0 {
warn!("Polling on fence fd timed out or failed");
return None;
}
}
debug!("Taking object reference!");
let ret =
unsafe { OH_NativeWindow_NativeObjectReference(native_window_buffer as *mut c_void) };
if ret != 0 {
warn!("Native Window Buffer Reference Failed!");
}
let frame_info = unsafe {
let buffer_handle = OH_NativeWindow_GetBufferHandleFromNative(native_window_buffer);
FrameInfo {
fd: (*buffer_handle).fd,
width: (*buffer_handle).width,
height: (*buffer_handle).height,
stride: (*buffer_handle).stride,
size: (*buffer_handle).size,
format: (*buffer_handle).format,
vir_addr: (*buffer_handle).virAddr as *mut u8,
native_window_buffer,
fence_fd,
}
};
let ret =
unsafe { OH_NativeWindow_NativeObjectUnreference(native_window_buffer as *mut c_void) };
if ret != 0 {
warn!("Native Window Buffer Unreference failed!");
}
// FIXME(ray): Potential memory copying.
Some(frame_info)
}
/// Should pair with acquire_buffer.
pub fn release_buffer(&self, frame_info: FrameInfo) {
let native_image = self.native_image.expect("native image should not be empty");
unsafe {
let ret = OH_NativeImage_ReleaseNativeWindowBuffer(
native_image,
frame_info.native_window_buffer,
-1,
);
debug!("Release native window buffer ret: {}", ret);
}
}
}
impl Drop for OhosPlayer {
fn drop(&mut self) {
unsafe {
if let Some(closure) = self.frame_available_callback_closure {
let box_closure = Box::from_raw(closure);
drop(box_closure);
}
if let Some(closure) = self.event_info_callback_closure {
let box_closure = Box::from_raw(closure);
drop(box_closure);
}
debug!("Releasing AVPlayer because drop is called!");
OH_AVPlayer_Release(self.ohos_av_player);
if let Some(mut native_image) = self.native_image {
OH_NativeImage_Destroy(&mut native_image);
}
if let Some(native_window) = self.native_window {
OH_NativeWindow_DestroyNativeWindow(native_window);
}
}
}
}
unsafe impl Send for OhosPlayer {}
unsafe impl Sync for OhosPlayer {}

View File

@@ -0,0 +1,36 @@
/* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use crate::ohos_media::source_builder::MediaSourceBuilder;
pub struct MediaSourceWrapper {}
impl MediaSourceWrapper {
pub fn new() -> Self {
Self {}
}
}
impl MediaSourceWrapper {
pub fn builder() -> MediaSourceBuilder {
MediaSourceBuilder {
enough_data: None,
seek_data: None,
}
}
pub fn set_input_size(&self, _size: usize) {
// No-op for dummy source
}
pub fn end_of_stream(&self) {
// No-op for dummy source
}
pub fn push_data(&self, _data: Vec<u8>) {
// No-op for dummy source
}
pub fn set_data_src(&mut self, _av_player: *mut ohos_media_sys::avplayer_base::OH_AVPlayer) {
// No-op for dummy source.
}
}

View File

@@ -0,0 +1,9 @@
/* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
pub mod avplayer;
#[cfg(not(sdk_api_21))]
pub mod dummy_source;
#[cfg(sdk_api_21)]
pub mod source;
pub mod source_builder;

View File

@@ -0,0 +1,250 @@
/* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use core::slice;
use std::sync::{Arc, Mutex};
use log::debug;
use ohos_media_sys::avbuffer::OH_AVBuffer;
use ohos_media_sys::avcodec_base::OH_AVDataSourceExt;
use ohos_media_sys::avplayer::OH_AVPlayer_SetDataSource;
use ohos_media_sys::avplayer_base::OH_AVPlayer;
use crate::ohos_media::source_builder::MediaSourceBuilder;
const DEFAULT_CACHE_SIZE: usize = 8 * 1024 * 1024; // 8MB
pub struct MediaSourceWrapper {
pub(crate) data_src: ohos_media_sys::avcodec_base::OH_AVDataSourceExt,
total_media_source_size: usize,
playback_buffer: Arc<Mutex<PlaybackBuffer>>,
closure_handle: *mut Box<dyn Fn(*mut u8, u32, i64) -> i32>,
}
impl MediaSourceWrapper {
pub fn builder() -> MediaSourceBuilder {
MediaSourceBuilder {
enough_data: None,
seek_data: None,
}
}
}
// AVPlayer already keeps a short buffer internally, so we can simply
// schedule a fetch when we don't have data for a specific location.
impl MediaSourceWrapper {
pub fn new(source_cb: MediaSourceBuilder) -> Self {
let playback_buffer = Arc::new(Mutex::new(PlaybackBuffer::new(source_cb.enough_data)));
let playback_buffer_clone = playback_buffer.clone();
let read_at_closure = move |buffer: *mut u8, length: u32, pos: i64| -> i32 {
log::debug!(
"Inside Read At Closure: {:p}, length: {}, pos: {}",
buffer,
length,
pos
);
let buffer = unsafe { slice::from_raw_parts_mut(buffer, length as usize) };
let (read_bytes, seek_pos) = {
let mut playback_buffer_lock = playback_buffer_clone.lock().unwrap();
playback_buffer_lock.read_data(buffer, pos)
};
// The playback_buffer lock must be released before calling the seek
// closure, which blocks on IPC with the script thread. Holding the
// lock here would deadlock if the script thread is simultaneously
// trying to push_data (which also acquires this lock).
if let Some(seek_pos) = seek_pos {
if let Some(seek_closure) = &source_cb.seek_data {
seek_closure(seek_pos);
}
}
read_bytes
};
let box_closure: Box<dyn Fn(*mut u8, u32, i64) -> i32> = Box::new(read_at_closure);
// Double boxing is needed to pass the closure to C: a Box<dyn Fn> is a
// fat (two-word) pointer that cannot be cast to a raw *mut c_void, so we
// box it again and convert the outer, thin Box into the raw pointer.
let double_box_closure = Box::new(box_closure);
extern "C" fn oh_avdatasource_read_at_callback(
data: *mut OH_AVBuffer,
length: i32,
pos: i64,
user_data: *mut std::ffi::c_void,
) -> i32 {
assert!(
!user_data.is_null(),
"oh_avdatasource_read_at_callback: user_data must not be null"
);
let f = unsafe { &*(user_data as *mut Box<dyn Fn(*mut u8, u32, i64) -> i32>) };
let buffer_addr = unsafe { ohos_media_sys::avbuffer::OH_AVBuffer_GetAddr(data) };
f(buffer_addr, length as u32, pos)
}
let data_src = OH_AVDataSourceExt {
size: 0,
readAt: Some(oh_avdatasource_read_at_callback),
};
let raw_ptr_f = Box::into_raw(double_box_closure);
Self {
data_src,
total_media_source_size: 0,
playback_buffer,
closure_handle: raw_ptr_f,
}
}
pub fn set_input_size(&mut self, size: usize) {
log::debug!("Setting input size to {}", size);
if self.total_media_source_size == 0 {
self.total_media_source_size = size;
self.data_src.size = size as i64;
}
self.playback_buffer.lock().unwrap().notify_seek_done();
}
pub fn push_data(&self, data: Vec<u8>) -> bool {
let mut playback_buffer_lock = self.playback_buffer.lock().unwrap();
playback_buffer_lock.push_buffer(data)
}
pub fn end_of_stream(&self) {
self.playback_buffer.lock().unwrap().end_of_stream();
}
pub fn set_data_src(&mut self, av_player: *mut OH_AVPlayer) {
unsafe {
OH_AVPlayer_SetDataSource(
av_player,
&mut self.data_src as *mut OH_AVDataSourceExt,
self.closure_handle as *mut std::ffi::c_void,
);
}
}
}
impl Drop for MediaSourceWrapper {
fn drop(&mut self) {
unsafe {
let box_closure = Box::from_raw(self.closure_handle);
drop(box_closure);
}
}
}
/// Two threads interact with the PlaybackBuffer:
/// 1. the AVPlayer client thread, which calls read_data, and
/// 2. the script thread, which pushes data via push_buffer.
pub struct PlaybackBuffer {
enough_data_closure: Option<Box<dyn Fn() + Send + Sync>>,
buffer_data_head: i64,
has_active_request: bool,
last_read_end: i64,
is_seeking: bool,
buffer: Vec<u8>,
}
impl PlaybackBuffer {
pub fn new(enough_data_closure: Option<Box<dyn Fn() + Send + Sync>>) -> Self {
PlaybackBuffer {
enough_data_closure,
buffer_data_head: 0,
has_active_request: false,
is_seeking: false,
last_read_end: 0,
buffer: Vec::with_capacity(DEFAULT_CACHE_SIZE),
}
}
pub fn notify_seek_done(&mut self) {
self.is_seeking = false;
}
/// Returns (number of bytes read, Some(seek position) if there is no data at that position).
pub fn read_data(&mut self, dest_slice: &mut [u8], pos: i64) -> (i32, Option<u64>) {
if self.is_seeking {
debug!(
"Currently seeking, cannot read data at position {}, buffer head is at {}, buffer len is {} has_active_request: {}",
pos,
self.buffer_data_head,
self.buffer.len(),
self.has_active_request
);
return (0, None);
}
// First check whether we have enough data at that position.
let pos_offset = pos - self.buffer_data_head;
let available_data = self.buffer.len() as i64 - pos_offset;
let need_seek = pos_offset < 0 ||
(available_data <= 0 &&
(!self.has_active_request ||
pos >= self.buffer_data_head + self.buffer.capacity() as i64));
if need_seek {
debug!(
"We don't have data at position {}, buffer head is at {}, buffer len is {} has_active_request: {}",
pos,
self.buffer_data_head,
self.buffer.len(),
self.has_active_request
);
self.buffer.clear();
self.buffer_data_head = pos;
self.has_active_request = true;
self.is_seeking = true;
return (0, Some(pos as u64));
}
let read_len = available_data.clamp(0, dest_slice.len() as i64) as usize;
if read_len == 0 {
debug!(
"No available data to read at position {}, buffer head is at {}, buffer len is {} has_active_request: {}",
pos,
self.buffer_data_head,
self.buffer.len(),
self.has_active_request
);
return (0, None);
}
dest_slice[..read_len]
.copy_from_slice(&self.buffer[(pos_offset) as usize..(pos_offset as usize + read_len)]);
self.last_read_end = pos + read_len as i64;
(read_len as i32, None)
}
/// Returns false when the buffer is full, i.e. we have enough data.
pub fn push_buffer(&mut self, data: Vec<u8>) -> bool {
// Reject data while a seek is in progress. Between the buffer being
// cleared/reset for a new seek position and the old fetch being
// cancelled, stale data from the previous fetch could arrive and
// corrupt the buffer (it would be appended as if it started at the
// new seek position). Silently discard it.
if self.is_seeking {
return true;
}
if self.buffer.len() + data.len() > self.buffer.capacity() {
debug!(
"Buffer is full, cannot push more data, current head: {}, current len: {}, incoming data len: {}, capacity: {}",
self.buffer_data_head,
self.buffer.len(),
data.len(),
self.buffer.capacity()
);
self.has_active_request = false;
if let Some(enough_data_closure) = &self.enough_data_closure {
enough_data_closure();
}
return false;
}
self.buffer.extend_from_slice(&data);
true
}
pub fn end_of_stream(&mut self) {
self.has_active_request = false;
}
}
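The `read_data`/`push_buffer` seek protocol above boils down to a small model: a read outside the buffered window clears the buffer, marks a seek in flight, and returns the position to fetch, while data arriving before `notify_seek_done` is discarded. A simplified sketch (the type name and return conventions here are illustrative and differ slightly from the real `push_buffer`):

```rust
// Simplified model of PlaybackBuffer's seek protocol.
struct MiniBuffer {
    head: i64,       // absolute stream position of buffer[0]
    buffer: Vec<u8>, // contiguous window of downloaded data
    is_seeking: bool,
}

impl MiniBuffer {
    fn new() -> Self {
        MiniBuffer { head: 0, buffer: Vec::new(), is_seeking: false }
    }

    /// Returns (bytes read, Some(position) when the producer must seek).
    fn read_data(&mut self, dest: &mut [u8], pos: i64) -> (usize, Option<u64>) {
        if self.is_seeking {
            return (0, None);
        }
        let offset = pos - self.head;
        let available = self.buffer.len() as i64 - offset;
        if offset < 0 || available <= 0 {
            // No data at `pos`: reset the window and request a seek.
            self.buffer.clear();
            self.head = pos;
            self.is_seeking = true;
            return (0, Some(pos as u64));
        }
        let n = (available as usize).min(dest.len());
        dest[..n].copy_from_slice(&self.buffer[offset as usize..offset as usize + n]);
        (n, None)
    }

    /// Returns false when the data was discarded because a seek is pending.
    fn push_buffer(&mut self, data: &[u8]) -> bool {
        if self.is_seeking {
            // Stale data from the pre-seek fetch must not corrupt the window.
            return false;
        }
        self.buffer.extend_from_slice(data);
        true
    }

    fn notify_seek_done(&mut self) {
        self.is_seeking = false;
    }
}

fn main() {
    let mut b = MiniBuffer::new();
    assert!(b.push_buffer(b"hello"));
    let mut out = [0u8; 3];
    assert_eq!(b.read_data(&mut out, 0), (3, None));
    assert_eq!(&out, b"hel");
    // A read far outside the window requests a seek to that position...
    assert_eq!(b.read_data(&mut out, 100), (0, Some(100)));
    // ...and data arriving mid-seek is dropped.
    assert!(!b.push_buffer(b"stale"));
    b.notify_seek_done();
    assert!(b.push_buffer(b"fresh"));
    assert_eq!(b.read_data(&mut out, 100), (3, None));
}
```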

View File

@@ -0,0 +1,40 @@
/* This Source Code Form is subject to the terms of the Mozilla Public
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
#[cfg(not(sdk_api_21))]
use crate::ohos_media::dummy_source::MediaSourceWrapper;
#[cfg(sdk_api_21)]
use crate::ohos_media::source::MediaSourceWrapper;
type SeekDataClosure = Box<dyn Fn(u64) -> bool + Send + Sync>;
pub struct MediaSourceBuilder {
pub enough_data: Option<Box<dyn Fn() + Send + Sync>>,
pub seek_data: Option<SeekDataClosure>,
}
impl MediaSourceBuilder {
pub fn set_enough_data<F: Fn() + Send + Sync + Clone + 'static>(mut self, callback: F) -> Self {
self.enough_data = Some(Box::new(callback));
self
}
pub fn set_seek_data<F: Fn(u64) -> bool + Send + Sync + Clone + 'static>(
mut self,
callback: F,
) -> Self {
self.seek_data = Some(Box::new(callback));
self
}
pub fn build(self) -> MediaSourceWrapper {
#[cfg(not(sdk_api_21))]
{
MediaSourceWrapper::new()
}
#[cfg(sdk_api_21)]
{
MediaSourceWrapper::new(self)
}
}
}
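`MediaSourceBuilder` is a plain builder holding two optional boxed callbacks. A stripped-down sketch of how a caller would chain it (the `Builder` and `Source` names are illustrative; the real `build` dispatches on `sdk_api_21`):

```rust
type SeekDataClosure = Box<dyn Fn(u64) -> bool + Send + Sync>;

pub struct Source {
    enough_data: Option<Box<dyn Fn() + Send + Sync>>,
    seek_data: Option<SeekDataClosure>,
}

pub struct Builder {
    enough_data: Option<Box<dyn Fn() + Send + Sync>>,
    seek_data: Option<SeekDataClosure>,
}

impl Builder {
    pub fn new() -> Self {
        Builder { enough_data: None, seek_data: None }
    }

    // Each setter boxes the callback and returns self for chaining.
    pub fn set_enough_data<F: Fn() + Send + Sync + 'static>(mut self, f: F) -> Self {
        self.enough_data = Some(Box::new(f));
        self
    }

    pub fn set_seek_data<F: Fn(u64) -> bool + Send + Sync + 'static>(mut self, f: F) -> Self {
        self.seek_data = Some(Box::new(f));
        self
    }

    pub fn build(self) -> Source {
        Source { enough_data: self.enough_data, seek_data: self.seek_data }
    }
}

fn main() {
    let source = Builder::new()
        .set_seek_data(|pos| pos < 1_000)
        .build();
    // Callbacks left unset simply stay None.
    assert!(source.enough_data.is_none());
    assert!(source.seek_data.as_ref().unwrap()(42));
}
```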

View File

@@ -2,129 +2,729 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at https://mozilla.org/MPL/2.0/. */
use servo_media::MediaInstance;
use servo_media_player::Player;
use std::cell::Cell;
use std::ops::Range;
use std::sync::{Arc, Mutex, mpsc};
use std::time;
pub struct OhosAVPlayer {}
use crossbeam_channel::Sender;
use ipc_channel::ipc::{IpcReceiver, channel};
use log::{debug, error, warn};
use ohos_media_sys::avformat::{
OH_AVFormat, OH_AVFormat_GetFloatValue, OH_AVFormat_GetIntValue, OH_AVFormat_GetLongValue,
};
use ohos_media_sys::avplayer_base::{
AVPlayerOnInfoType, AVPlayerState, OH_PLAYER_BUFFERING_TYPE, OH_PLAYER_BUFFERING_VALUE,
OH_PLAYER_CURRENT_POSITION, OH_PLAYER_DURATION, OH_PLAYER_IS_LIVE_STREAM,
OH_PLAYER_SEEK_POSITION, OH_PLAYER_STATE, OH_PLAYER_STATE_CHANGE_REASON,
OH_PLAYER_VIDEO_HEIGHT, OH_PLAYER_VIDEO_WIDTH, OH_PLAYER_VOLUME,
};
use servo_media::{BackendMsg, ClientContextId, MediaInstance, MediaInstanceError};
use servo_media_player::metadata::Metadata;
use servo_media_player::video::{self, Buffer, VideoFrame, VideoFrameData};
use servo_media_player::{PlaybackState, Player, PlayerEvent, SeekLock, SeekLockMsg};
use yuv::yuv_nv12_to_bgra;
impl OhosAVPlayer {
pub fn new() -> OhosAVPlayer {
OhosAVPlayer {}
use crate::ohos_media::avplayer::OhosPlayer as OhosPlayerInner;
#[cfg(not(sdk_api_21))]
use crate::ohos_media::dummy_source::MediaSourceWrapper;
#[cfg(sdk_api_21)]
use crate::ohos_media::source::MediaSourceWrapper;
// Height of decoded video frame from AVPlayer is padded to multiples of this value by the codec.
// https://developer.huawei.com/consumer/cn/doc/harmonyos-guides/video-decoding
const FRAME_HEIGHT_MULTIPLE: i32 = 32;
/// Bridges the gap between the internal AVPlayer state and the Player state exposed to the media element.
pub struct StateManager {
pub internal_state: InternalState,
pub player_state: PlayerState,
}
pub struct InternalState {
pub state: AVPlayerState,
}
pub struct PlayerState {
pub paused: bool,
}
impl StateManager {
pub fn new() -> Self {
StateManager {
internal_state: InternalState {
state: AVPlayerState::AV_IDLE,
},
player_state: PlayerState { paused: true },
}
}
}
impl MediaInstance for OhosAVPlayer {
pub struct OhosAvPlayer {
id: usize,
context_id: ClientContextId,
player_inner: Arc<Mutex<OhosPlayerInner>>,
event_sender: Arc<Mutex<ipc_channel::ipc::IpcSender<servo_media::player::PlayerEvent>>>,
video_sink: Option<Arc<Mutex<VideoSink>>>,
backend_chan: Arc<Mutex<mpsc::Sender<BackendMsg>>>,
last_metadata: Arc<Mutex<Cell<Metadata>>>,
state_manager: Arc<Mutex<StateManager>>,
}
// Procedure for setting up the AVPlayer and its state transitions:
// 1. Create the AVPlayer.
// 2. Set up the AVPlayer InfoCallback (first, so we can listen for state changes).
// 3. Set up the AVPlayer media source.
// 4. Wait for the AVPlayer to enter the Initialized state, then set up the VideoSurface.
// 5. Call AVPlayer Prepare().
// 6. Wait for Prepare to complete; in the meantime, the AVPlayer reads data from the media source.
// 7. The player is ready to play.
impl OhosAvPlayer {
pub fn new(
id: usize,
context_id: ClientContextId,
sender: ipc_channel::ipc::IpcSender<servo_media::player::PlayerEvent>,
video_renderer: Option<
std::sync::Arc<std::sync::Mutex<dyn servo_media::player::video::VideoFrameRenderer>>,
>,
backend_chan: Arc<Mutex<mpsc::Sender<BackendMsg>>>,
) -> OhosAvPlayer {
let player_inner = Arc::new(Mutex::new(OhosPlayerInner::new()));
let event_sender = Arc::new(Mutex::new(sender));
let video_sink = video_renderer.clone().map(|v| {
Arc::new(Mutex::new(VideoSink::new(
v,
player_inner.clone(),
event_sender.clone(),
)))
});
OhosAvPlayer {
id,
context_id,
player_inner,
event_sender,
video_sink,
backend_chan,
last_metadata: Arc::new(Mutex::new(Cell::new(Metadata {
duration: None,
width: 0,
height: 0,
format: String::new(),
is_seekable: false,
is_live: false,
video_tracks: vec![],
audio_tracks: vec![],
title: None,
}))),
state_manager: Arc::new(Mutex::new(StateManager::new())),
}
}
pub fn setup_info_event(&mut self) {
let sender_clone = self.event_sender.clone();
let player_inner_clone = self.player_inner.clone();
let state_manager_clone = self.state_manager.clone();
let video_sink_clone = self.video_sink.clone();
let metadata_clone = self.last_metadata.clone();
let event_info_closure =
move |info_type: AVPlayerOnInfoType, info_body: *mut OH_AVFormat| {
debug!(
"Info Type received!:{:?}, address: {:p}",
info_type, info_body
);
match info_type {
AVPlayerOnInfoType::AV_INFO_TYPE_STATE_CHANGE => {
let mut state_change_reason = -1;
let mut state = -1;
unsafe {
OH_AVFormat_GetIntValue(info_body, OH_PLAYER_STATE, &mut state);
OH_AVFormat_GetIntValue(
info_body,
OH_PLAYER_STATE_CHANGE_REASON,
&mut state_change_reason,
);
}
let av_player_state = AVPlayerState(state as u32);
debug!(
"AV Player State Change: {:?}, state change reason: {}",
av_player_state, state_change_reason
);
state_manager_clone.lock().unwrap().internal_state.state = av_player_state;
match av_player_state {
AVPlayerState::AV_INITIALIZED => {
debug!("Setup Video Sink");
if let Some(ref video_sink_clone) = video_sink_clone {
video_sink_clone.lock().unwrap().setup(); // TODO: Hide internal state machine
}
},
AVPlayerState::AV_PREPARED => {
let _ = sender_clone
.lock()
.unwrap()
.send(PlayerEvent::StateChanged(PlaybackState::Paused));
},
AVPlayerState::AV_PLAYING => {
let sender_clone_guard = sender_clone.lock().unwrap();
let _ = sender_clone_guard
.send(PlayerEvent::StateChanged(PlaybackState::Playing));
},
AVPlayerState::AV_PAUSED => {
let _ = sender_clone
.lock()
.unwrap()
.send(PlayerEvent::StateChanged(PlaybackState::Paused));
},
AVPlayerState::AV_STOPPED => {
let _ = sender_clone
.lock()
.unwrap()
.send(PlayerEvent::StateChanged(PlaybackState::Stopped));
},
AVPlayerState::AV_COMPLETED => {
let _ = sender_clone.lock().unwrap().send(PlayerEvent::EndOfStream);
},
_ => {
warn!("Unhandled State: {:?}", av_player_state);
},
}
player_inner_clone
.lock()
.unwrap()
.set_state(av_player_state);
},
AVPlayerOnInfoType::AV_INFO_TYPE_RESOLUTION_CHANGE => {
let mut width = -1;
let mut height = -1;
unsafe {
OH_AVFormat_GetIntValue(info_body, OH_PLAYER_VIDEO_WIDTH, &mut width);
OH_AVFormat_GetIntValue(info_body, OH_PLAYER_VIDEO_HEIGHT, &mut height);
}
// TODO: Fix the metadata update logic; we should only report metadata once during initialization.
let mut last_metadata = metadata_clone.lock().unwrap();
last_metadata.get_mut().height = height as u32;
last_metadata.get_mut().width = width as u32;
let meta_data_clone_clone = last_metadata.get_mut().clone();
let _ = sender_clone
.lock()
.unwrap()
.send(PlayerEvent::MetadataUpdated(meta_data_clone_clone));
debug!("Resolution get: width: {}, height: {}", width, height);
},
AVPlayerOnInfoType::AV_INFO_TYPE_IS_LIVE_STREAM => {
let mut value = -1;
unsafe {
OH_AVFormat_GetIntValue(
info_body,
OH_PLAYER_IS_LIVE_STREAM,
&mut value,
);
}
let mut last_metadata = metadata_clone.lock().unwrap();
let last_metadata_mut = last_metadata.get_mut();
(last_metadata_mut.is_live, last_metadata_mut.is_seekable) = match value {
1 => (true, false),
_ => (false, true),
};
debug!("AVPlayer is live stream: {} (live streams are not supported)", value);
},
AVPlayerOnInfoType::AV_INFO_TYPE_DURATION_UPDATE => {
let mut duration: i64 = -1;
unsafe {
OH_AVFormat_GetLongValue(info_body, OH_PLAYER_DURATION, &mut duration);
}
let duration = time::Duration::from_millis(duration as u64);
let mut last_metadata = metadata_clone.lock().unwrap();
last_metadata.get_mut().duration = Some(duration);
let metadata_clone_clone = last_metadata.get_mut().clone();
let _ = sender_clone
.lock()
.unwrap()
.send(PlayerEvent::MetadataUpdated(metadata_clone_clone));
debug!("DURATION UPDATE: {:?}", duration);
},
AVPlayerOnInfoType::AV_INFO_TYPE_BUFFERING_UPDATE => {
let mut buffer_type = -1;
let mut buffer_value = -1;
unsafe {
OH_AVFormat_GetIntValue(
info_body,
OH_PLAYER_BUFFERING_TYPE,
&mut buffer_type,
);
OH_AVFormat_GetIntValue(
info_body,
OH_PLAYER_BUFFERING_VALUE,
&mut buffer_value,
);
}
debug!("Buffering update: {}, value: {}", buffer_type, buffer_value);
},
AVPlayerOnInfoType::AV_INFO_TYPE_VOLUME_CHANGE => {
let mut volume = 0.0;
unsafe {
OH_AVFormat_GetFloatValue(info_body, OH_PLAYER_VOLUME, &mut volume);
}
debug!("Player Volume Change: {}", volume);
},
AVPlayerOnInfoType::AV_INFO_TYPE_POSITION_UPDATE => {
let mut position = -1;
unsafe {
OH_AVFormat_GetIntValue(
info_body,
OH_PLAYER_CURRENT_POSITION,
&mut position,
);
}
let _ = sender_clone
.lock()
.unwrap()
.send(PlayerEvent::PositionChanged(position as f64 / 1000.0));
},
AVPlayerOnInfoType::AV_INFO_TYPE_SEEKDONE => {
let mut position = -1;
unsafe {
OH_AVFormat_GetIntValue(
info_body,
OH_PLAYER_SEEK_POSITION,
&mut position,
);
}
let _ = sender_clone
.lock()
.unwrap()
.send(PlayerEvent::SeekDone(position as f64 / 1000.0));
},
_ => {
warn!("Unhandled info type: {:?}", info_type);
},
}
};
self.player_inner
.lock()
.unwrap()
.connect_info_event_callback(event_info_closure);
}
pub fn setup_data_source(&mut self) {
let sender_clone = self.event_sender.clone();
let sender_clone_clone = self.event_sender.clone();
let seek_channel = Arc::new(Mutex::new(SeekChannel::new()));
let seekdata_send_closure = move |pos: u64| {
let _ = sender_clone.lock().unwrap().send(PlayerEvent::SeekData(
pos,
seek_channel.lock().unwrap().sender(),
));
let (ret, ack_channel) = seek_channel.lock().unwrap().wait();
let _ = ack_channel.send(());
debug!("Seek initiated: {}", pos);
let _ = sender_clone.lock().unwrap().send(PlayerEvent::NeedData);
ret
};
let source = MediaSourceWrapper::builder()
.set_enough_data(move || {
let _ = sender_clone_clone
.lock()
.unwrap()
.send(PlayerEvent::EnoughData);
})
.set_seek_data(seekdata_send_closure)
.build();
self.player_inner.lock().unwrap().set_source(source);
// To kickstart the first need data event.
let _ = self
.event_sender
.lock()
.unwrap()
.send(PlayerEvent::NeedData);
}
}
struct SeekChannel {
sender: SeekLock,
recv: IpcReceiver<SeekLockMsg>,
}
impl SeekChannel {
fn new() -> Self {
let (sender, recv) = channel::<SeekLockMsg>().expect("Couldn't create IPC channel");
Self {
sender: SeekLock {
lock_channel: sender,
},
recv,
}
}
fn sender(&self) -> SeekLock {
self.sender.clone()
}
fn wait(&self) -> SeekLockMsg {
self.recv.recv().unwrap()
}
}
impl Drop for OhosAvPlayer {
fn drop(&mut self) {
debug!("Ohos Dropping");
let (sender, _) = std::sync::mpsc::channel::<()>();
let _ = self
.backend_chan
.lock()
.unwrap()
.send(BackendMsg::Shutdown {
context: self.context_id,
id: self.id,
tx_ack: sender,
});
}
}
impl MediaInstance for OhosAvPlayer {
fn get_id(&self) -> usize {
self.id
}
fn mute(&self, val: bool) -> Result<(), MediaInstanceError> {
self.set_mute(val).map_err(|_| MediaInstanceError)
}
fn suspend(&self) -> Result<(), MediaInstanceError> {
self.pause().map_err(|_| MediaInstanceError)
}
fn resume(&self) -> Result<(), MediaInstanceError> {
self.play().map_err(|_| MediaInstanceError)
}
}
// TODO: Connect Error.
impl Player for OhosAvPlayer {
fn play(&self) -> Result<(), servo_media::player::PlayerError> {
debug!("Start playing ohos player");
self.state_manager.lock().unwrap().player_state.paused = false;
self.player_inner.lock().unwrap().play();
Ok(())
}
fn pause(&self) -> Result<(), servo_media::player::PlayerError> {
self.state_manager.lock().unwrap().player_state.paused = true;
self.player_inner.lock().unwrap().pause();
Ok(())
}
fn stop(&self) -> Result<(), servo_media::player::PlayerError> {
self.player_inner.lock().unwrap().stop();
Ok(())
}
fn seek(&self, time: f64) -> Result<(), servo_media::player::PlayerError> {
debug!("Seeking to {} seconds", time);
self.player_inner
.lock()
.unwrap()
.seek((time * 1000.0) as i32);
let state_manager_lock = self.state_manager.lock().unwrap();
if !state_manager_lock.player_state.paused &&
state_manager_lock.internal_state.state == AVPlayerState::AV_COMPLETED
{
self.player_inner.lock().unwrap().play();
}
Ok(())
}
fn seekable(&self) -> Vec<std::ops::Range<f64>> {
if let Some(duration) = self.last_metadata.lock().unwrap().get_mut().duration {
return vec![Range {
start: 0.0,
end: duration.as_secs_f64(),
}];
}
self.buffered()
}
fn set_mute(&self, val: bool) -> Result<(), servo_media::player::PlayerError> {
self.player_inner.lock().unwrap().set_mute(val);
Ok(())
}
fn set_volume(&self, value: f64) -> Result<(), servo_media::player::PlayerError> {
self.player_inner.lock().unwrap().set_volume(value);
Ok(())
}
fn set_input_size(&self, size: u64) -> Result<(), servo_media::player::PlayerError> {
debug!("SetInputSize: {}", size);
self.player_inner.lock().unwrap().set_input_size(size);
Ok(())
}
fn set_playback_rate(
&self,
playback_rate: f64,
) -> Result<(), servo_media::player::PlayerError> {
self.player_inner.lock().unwrap().set_rate(playback_rate);
Ok(())
}
fn push_data(&self, data: Vec<u8>) -> Result<(), servo_media::player::PlayerError> {
self.player_inner.lock().unwrap().push_data(data);
Ok(())
}
fn end_of_stream(&self) -> Result<(), servo_media::player::PlayerError> {
debug!("Player: Current Request End of Stream reached!");
self.player_inner.lock().unwrap().end_of_stream();
Ok(())
}
fn buffered(&self) -> Vec<std::ops::Range<f64>> {
vec![]
}
fn set_stream(
&self,
_stream: &servo_media::streams::MediaStreamId,
_only_stream: bool,
) -> Result<(), servo_media::player::PlayerError> {
Ok(())
}
fn render_use_gl(&self) -> bool {
warn!("Render use gl not supported!");
false
}
fn set_audio_track(
&self,
_stream_index: i32,
_enabled: bool,
) -> Result<(), servo_media::player::PlayerError> {
Ok(())
}
fn set_video_track(
&self,
_stream_index: i32,
_enabled: bool,
) -> Result<(), servo_media::player::PlayerError> {
Ok(())
}
fn can_resume(&self) -> bool {
todo!()
}
fn paused(&self) -> bool {
self.state_manager.lock().unwrap().player_state.paused
}
fn muted(&self) -> bool {
self.player_inner.lock().unwrap().muted()
}
fn volume(&self) -> f64 {
self.player_inner.lock().unwrap().volume()
}
fn playback_rate(&self) -> f64 {
self.player_inner.lock().unwrap().playback_rate()
}
}
/// Acquires the decoded video frame and uploads it to the
/// media frame renderer in the media element.
struct VideoSink {
video_render:
std::sync::Arc<std::sync::Mutex<dyn servo_media::player::video::VideoFrameRenderer>>,
player_inner: Arc<Mutex<OhosPlayerInner>>,
event_sender: Arc<Mutex<ipc_channel::ipc::IpcSender<servo_media::player::PlayerEvent>>>,
thread_send_chan: Cell<Option<Sender<RenderMsg>>>,
}
pub enum RenderMsg {
Terminate,
FrameAvailable,
}
impl VideoSink {
pub fn new(
video_render: std::sync::Arc<
std::sync::Mutex<dyn servo_media::player::video::VideoFrameRenderer>,
>,
player_inner: Arc<Mutex<OhosPlayerInner>>,
event_sender: Arc<Mutex<ipc_channel::ipc::IpcSender<servo_media::player::PlayerEvent>>>,
) -> Self {
VideoSink {
video_render,
player_inner,
event_sender,
thread_send_chan: Cell::new(None),
}
}
// TODO: Think of a better way for VideoSink to retrieve frame data.
pub fn setup(&self) {
let (sender, receiver) = crossbeam_channel::unbounded::<RenderMsg>();
let sender_clone = sender.clone();
self.thread_send_chan.set(Some(sender));
let event_sender_clone = self.event_sender.clone();
let player_inner_clone = self.player_inner.clone();
let renderer_clone = self.video_render.clone();
let frame_available_closure = move || {
let res = sender_clone.send(RenderMsg::FrameAvailable);
if res.is_err() {
debug!("Failed to send frame available: {:?}", res.err());
}
};
self.player_inner
.lock()
.unwrap()
.setup_window_buffer_listener(frame_available_closure);
std::thread::Builder::new()
.name("Media Worker Thread".to_owned())
.spawn(move || {
loop {
let Ok(msg) = receiver.recv() else {
debug!("error receiving message");
break;
};
match msg {
RenderMsg::Terminate => {
break;
},
RenderMsg::FrameAvailable => {
let frame_info = match player_inner_clone.lock().unwrap().acquire_buffer() {
Some(frame_info) => frame_info,
None => continue,
};
debug!(
"fd: {}, width: {}, height: {}, stride: {}, size: {}, format: {}, virt addr: {:p}",
frame_info.fd,
frame_info.width,
frame_info.height,
frame_info.stride,
frame_info.size,
frame_info.format,
frame_info.vir_addr
);
// Round the height up to a multiple of FRAME_HEIGHT_MULTIPLE (decoder alignment).
let coded_height = ((frame_info.height + FRAME_HEIGHT_MULTIPLE - 1) / FRAME_HEIGHT_MULTIPLE) *
FRAME_HEIGHT_MULTIPLE;
let y_plane_size = (frame_info.stride * coded_height) as usize;
let uv_plane_size = (frame_info.stride * coded_height / 2) as usize;
let total_needed = y_plane_size + uv_plane_size;
if total_needed > frame_info.size as usize || frame_info.vir_addr.is_null() {
error!(
"Buffer too small or null: needed {} bytes (y={}, uv={}), have {} bytes, vir_addr null={}",
total_needed, y_plane_size, uv_plane_size, frame_info.size, frame_info.vir_addr.is_null()
);
player_inner_clone
.lock()
.unwrap()
.release_buffer(frame_info);
continue;
}
let bi_planar_image = yuv::YuvBiPlanarImage {
y_plane: unsafe {
std::slice::from_raw_parts(
frame_info.vir_addr as *const u8,
y_plane_size,
)
},
uv_plane: unsafe {
std::slice::from_raw_parts(
(frame_info.vir_addr as usize + y_plane_size) as *const u8,
uv_plane_size,
)
},
width: frame_info.width as u32,
height: frame_info.height as u32,
y_stride: frame_info.stride as u32,
uv_stride: frame_info.stride as u32,
};
let mut bgra = vec![0u8; (frame_info.width * frame_info.height * 4) as usize];
// Conversion from yuv to bgra8
let Ok(_) = yuv_nv12_to_bgra(
&bi_planar_image,
&mut bgra,
frame_info.width as u32 * 4,
yuv::YuvRange::Full,
yuv::YuvStandardMatrix::Bt709,
yuv::YuvConversionMode::Balanced,
) else {
error!("Failed to convert YUV to BGRA");
player_inner_clone
.lock()
.unwrap()
.release_buffer(frame_info);
continue;
};
let Some(frame) = VideoFrame::new(
frame_info.width,
frame_info.height,
Arc::new(OhosBuffer::new(bgra)),
) else {
error!("Failed to create VideoFrame");
player_inner_clone
.lock()
.unwrap()
.release_buffer(frame_info);
continue;
};
renderer_clone.lock().expect(
"Failed to acquire video renderer lock"
).render(frame);
player_inner_clone
.lock()
.unwrap()
.release_buffer(frame_info);
match event_sender_clone
.lock()
.unwrap()
.send(PlayerEvent::VideoFrameUpdated)
{
Ok(()) => {},
Err(e) => {
warn!("Send PlayerEvent::VideoFrameUpdated Error: {}", e);
},
};
},
}
}
})
.unwrap();
}
}
impl Drop for VideoSink {
fn drop(&mut self) {
if let Some(sender) = self.thread_send_chan.get_mut() {
let _ = sender.send(RenderMsg::Terminate);
}
}
}
struct OhosBuffer {
data: Vec<u8>,
}
impl OhosBuffer {
pub fn new(data: Vec<u8>) -> OhosBuffer {
OhosBuffer { data }
}
}
impl Buffer for OhosBuffer {
fn to_vec(&self) -> Option<video::VideoFrameData> {
Some(VideoFrameData::Raw(Arc::new(self.data.to_owned())))
}
}


@@ -6,8 +6,7 @@ use std::collections::HashMap;
use once_cell::sync::Lazy;
pub static OHOS_REGISTRY_SCANNER: Lazy<OhosRegistryScanner> = Lazy::new(OhosRegistryScanner::new);
// Should be a combination of mime/codecs
// If the type we are matching only contains a mime type, then we only match the container.


@@ -42,7 +42,7 @@ use serde::{Deserialize, Serialize};
use servo_arc::Arc as ServoArc;
use servo_base::generic_channel::CallbackSetter;
use servo_base::id::PipelineId;
use servo_url::{Host, ServoUrl};
use tokio::sync::Mutex as TokioMutex;
use tokio::sync::mpsc::{UnboundedReceiver as TokioReceiver, UnboundedSender as TokioSender};
@@ -540,12 +540,12 @@ pub async fn main_fetch(
.await;
let mut response = match response {
Some(response) => response,
None => {
// Step 12. If response is null, then set response to the result
// of running the steps corresponding to the first matching statement:
let same_origin = if let Origin::Origin(ref origin) = request.origin {
*origin == request.current_url_with_blob_claim().origin()
} else {
false
};
@@ -652,9 +652,10 @@ pub async fn main_fetch(
// Step 14. If response is not a network error and response is not a filtered response, then:
let mut response = if !response.is_network_error() && response.internal_response.is_none() {
// Step 14.1 If request's response tainting is "cors", then:
if request.response_tainting == ResponseTainting::CorsTainting {
// Step 14.1.1 Let headerNames be the result of extracting header list values given
// `Access-Control-Expose-Headers` and response's header list.
let header_names: Option<Vec<HeaderName>> = response
.headers
.typed_get::<AccessControlExposeHeaders>()
@@ -680,7 +681,8 @@ pub async fn main_fetch(
}
}
// Step 14.2 Set response to the following filtered response with response as its internal response,
// depending on request's response tainting:
let response_type = match request.response_tainting {
ResponseTainting::Basic => ResponseType::Basic,
ResponseTainting::CorsTainting => ResponseType::Cors,
@@ -726,7 +728,11 @@ pub async fn main_fetch(
// Step 16. If internalResponse's URL list is empty, then set it to a clone of request's URL list.
if internal_response.url_list.is_empty() {
internal_response.url_list = request
.url_list
.iter()
.map(|locked_url| locked_url.url())
.collect();
}
// Step 17. Set internalResponse's redirect taint to request's redirect-taint.
@@ -1056,18 +1062,22 @@ async fn scheme_fetch(
// Step 2: Let request be fetchParams's request.
let request = &mut fetch_params.request;
let url_and_blob_lock = request.current_url_with_blob_claim();
let scheme = url_and_blob_lock.scheme();
match scheme {
"about" if url_and_blob_lock.path() == "blank" => {
create_blank_reply(url_and_blob_lock.url(), request.timing_type())
},
"about" if url_and_blob_lock.path() == "memory" => {
create_about_memory(url_and_blob_lock.url(), request.timing_type())
},
"chrome" if url_and_blob_lock.path() == "allowcert" => {
if let Err(error) = handle_allowcert_request(request, context) {
warn!("Could not handle allowcert request: {error}");
}
create_blank_reply(url_and_blob_lock.url(), request.timing_type())
},
"http" | "https" => {
@@ -1355,15 +1365,10 @@ pub enum MixedSecurityProhibited {
/// <https://w3c.github.io/webappsec-mixed-content/#categorize-settings-object>
fn do_settings_prohibit_mixed_security_contexts(request: &Request) -> MixedSecurityProhibited {
if let Origin::Origin(ref origin) = request.origin {
// Step 1. If settings' origin is a potentially trustworthy origin,
// then return "Prohibits Mixed Security Contexts".
// NOTE: Workers created from a data: url are secure if they were created from secure contexts
if origin.is_potentially_trustworthy() || origin.is_for_data_worker_from_secure_context() {
return MixedSecurityProhibited::Prohibited;
}
}


@@ -18,10 +18,10 @@ use http::header::{self, HeaderValue};
use ipc_channel::ipc::IpcSender;
use log::warn;
use mime::{self, Mime};
use net_traits::blob_url_store::{BlobBuf, BlobTokenCommunicator, BlobURLStoreError};
use net_traits::filemanager_thread::{
FileManagerResult, FileManagerThreadError, FileManagerThreadMsg, FileTokenCheck,
GetTokenForFileReply, ReadFileProgress, RelativePos,
};
use net_traits::http_percent_encode;
use net_traits::response::{Response, ResponseBody};
@@ -84,13 +84,18 @@ enum FileImpl {
pub struct FileManager {
embedder_proxy: GenericEmbedderProxy<NetToEmbedderMsg>,
store: Arc<FileManagerStore>,
blob_token_communicator: Arc<Mutex<BlobTokenCommunicator>>,
}
impl FileManager {
pub fn new(
embedder_proxy: GenericEmbedderProxy<NetToEmbedderMsg>,
blob_token_communicator: Arc<Mutex<BlobTokenCommunicator>>,
) -> FileManager {
FileManager {
embedder_proxy,
store: Arc::new(FileManagerStore::new()),
blob_token_communicator,
}
}
@@ -108,8 +113,8 @@ impl FileManager {
});
}
pub(crate) fn get_token_for_file(&self, file_id: &Uuid, allow_revoked: bool) -> FileTokenCheck {
self.store.get_token_for_file(file_id, allow_revoked)
}
pub(crate) fn invalidate_token(&self, token: &FileTokenCheck, file_id: &Uuid) {
@@ -182,6 +187,22 @@ impl FileManager {
FileManagerThreadMsg::ActivateBlobURL(id, sender, origin) => {
let _ = sender.send(self.store.set_blob_url_validity(true, &id, &origin));
},
FileManagerThreadMsg::GetTokenForFile(id, _origin, sender) => {
let token = match self.get_token_for_file(&id, false) {
FileTokenCheck::Required(token) => Some(token),
_ => None,
};
let communicator = self.blob_token_communicator.lock();
let _ = sender.send(GetTokenForFileReply {
token,
revoke_sender: communicator.revoke_sender.clone(),
refresh_sender: communicator.refresh_token_sender.clone(),
});
},
FileManagerThreadMsg::RevokeTokenForFile(token, id) => {
self.invalidate_token(&FileTokenCheck::Required(token), &id);
},
}
}
@@ -459,7 +480,7 @@ impl FileManagerStore {
}
}
pub(crate) fn get_token_for_file(&self, file_id: &Uuid, allow_revoked: bool) -> FileTokenCheck {
let mut entries = self.entries.write();
let parent_id = match entries.get(file_id) {
Some(entry) => {
@@ -471,12 +492,11 @@ impl FileManagerStore {
},
None => return FileTokenCheck::ShouldFail,
};
let file_id = parent_id.as_ref().unwrap_or(file_id);
if let Some(entry) = entries.get_mut(file_id) {
if !allow_revoked && !entry.is_valid_url.load(Ordering::Acquire) {
log::warn!("Refusing to grant token for revoked blob url: {file_id:?}");
return FileTokenCheck::ShouldFail;
}
let token = Uuid::new_v4();


@@ -17,7 +17,7 @@ use headers::{
use http::header::HeaderValue;
use http::{HeaderMap, Method, StatusCode, header};
use log::{debug, error};
use malloc_size_of::{MallocSizeOf, MallocSizeOfOps};
use malloc_size_of_derive::MallocSizeOf;
use net_traits::http_status::HttpStatus;
use net_traits::request::Request;
@@ -55,11 +55,15 @@ impl CacheKey {
}
/// A complete cached resource.
#[derive(Clone, MallocSizeOf)]
pub struct CachedResource {
#[conditional_malloc_size_of]
request_headers: Arc<ParkingLotMutex<HeaderMap>>,
#[conditional_malloc_size_of]
body: Arc<ParkingLotMutex<ResponseBody>>,
#[conditional_malloc_size_of]
aborted: Arc<AtomicBool>,
#[conditional_malloc_size_of]
awaiting_body: Arc<ParkingLotMutex<Vec<TokioSender<Data>>>>,
metadata: CachedMetadata,
location_url: Option<Result<ServoUrl, String>>,
@@ -70,27 +74,11 @@ pub struct CachedResource {
last_validated: Instant,
}
/// Metadata about a loaded resource, such as is obtained from HTTP headers.
#[derive(Clone, MallocSizeOf)]
struct CachedMetadata {
/// Headers
#[conditional_malloc_size_of]
pub headers: Arc<ParkingLotMutex<HeaderMap>>,
/// Final URL after redirects.
pub final_url: ServoUrl,


@@ -40,6 +40,7 @@ use ipc_channel::ipc::{self, IpcSender};
use ipc_channel::router::ROUTER;
use log::{debug, error, info, log_enabled, warn};
use malloc_size_of::{MallocSizeOf, MallocSizeOfOps};
use net_traits::blob_url_store::UrlWithBlobClaim;
use net_traits::fetch::headers::get_value_from_header_list;
use net_traits::http_status::HttpStatus;
use net_traits::policy_container::RequestPolicyContainer;
@@ -961,7 +962,7 @@ async fn obtain_response(
}
}
/// [HTTP fetch](https://fetch.spec.whatwg.org/#concept-http-fetch)
#[async_recursion]
#[allow(clippy::too_many_arguments)]
pub async fn http_fetch(
@@ -976,14 +977,13 @@ pub async fn http_fetch(
) -> Response {
// This is a new async fetch, reset the channel we are waiting on
*done_chan = None;
// Step 1. Let request be fetchParams's request.
let request = &mut fetch_params.request;
// Step 2. Let response and internalResponse be null.
let mut response: Option<Response> = None;
// Step 3. If request's service-workers mode is "all", then
if request.service_workers_mode == ServiceWorkersMode::All {
// TODO: Substep 1
// Set response to the result of invoking handle fetch for request.
@@ -1011,9 +1011,9 @@ pub async fn http_fetch(
}
}
// Step 4. If response is null, then:
if response.is_none() {
// Step 4.1. If makeCORSPreflight is true and one of these conditions is true:
if cors_preflight_flag {
let method_cache_match = cache.match_method(request, request.method.clone());
@@ -1024,17 +1024,19 @@ pub async fn http_fetch(
!is_cors_safelisted_request_header(&name, &value)
});
if method_mismatch || header_mismatch {
// Step 4.1.1. Let preflightResponse be the result of running
// CORS-preflight fetch given request.
let preflight_response = cors_preflight_fetch(request, cache, context).await;
// Step 4.1.2. If preflightResponse is a network error, then return preflightResponse.
if let Some(error) = preflight_response.get_network_error() {
return Response::network_error(error.clone());
}
}
}
// Step 4.2. If request's redirect mode is "follow",
// then set request's service-workers mode to "none".
if request.redirect_mode == RedirectMode::Follow {
request.service_workers_mode = ServiceWorkersMode::None;
}
@@ -1047,6 +1049,8 @@ pub async fn http_fetch(
.lock()
.set_attribute(ResourceAttribute::RequestStart);
// Step 4.3. Set response and internalResponse to the result of
// running HTTP-network-or-cache fetch given fetchParams.
let mut fetch_result = http_network_or_cache_fetch(
fetch_params,
authentication_fetch_flag,
@@ -1056,11 +1060,14 @@ pub async fn http_fetch(
)
.await;
// Step 4.4. If request's response tainting is "cors" and a CORS check for request
// and response returns failure, then return a network error.
if cors_flag && cors_check(&fetch_params.request, &fetch_result).is_err() {
return Response::network_error(NetworkError::CorsGeneral);
}
// TODO: Step 4.5. If the TAO check for request and response returns failure,
// then set request's timing allow failed flag.
fetch_result.return_internal = false;
response = Some(fetch_result);
}
@@ -1070,7 +1077,16 @@ pub async fn http_fetch(
// response is guaranteed to be something by now
let mut response = response.unwrap();
// Step 5: If either request's response tainting or response's type is "opaque",
// and the cross-origin resource policy check with request's origin, request's client,
// request's destination, and internalResponse returns blocked, then return a network error.
if (request.response_tainting == ResponseTainting::Opaque ||
response.response_type == ResponseType::Opaque) &&
cross_origin_resource_policy_check(request, &response) ==
CrossOriginResourcePolicy::Blocked
{
return Response::network_error(NetworkError::CrossOriginResponse);
}
// Step 6. If internalResponse's status is a redirect status:
if response
@@ -1353,7 +1369,11 @@ pub async fn http_redirect_fetch(
// Steps 15-17 relate to timing, which is not implemented 1:1 with the spec.
// Step 18: Append locationURL to request's URL list.
request
.url_list
.push(UrlWithBlobClaim::from_url_without_having_claimed_blob(
location_url,
));
// Step 19: Invoke set request's referrer policy on redirect on request and internalResponse.
set_requests_referrer_policy_on_redirect(request, response.actual_response());
@@ -1752,23 +1772,20 @@ async fn http_network_or_cache_fetch(
let http_request = &mut http_fetch_params.request;
let mut response = response.unwrap();
// FIXME: The spec doesn't tell us to do this *here*, but if we don't do it then
// tests fail. Where should we do it instead? See also #33615
if http_request.response_tainting != ResponseTainting::CorsTainting &&
cross_origin_resource_policy_check(http_request, &response) ==
CrossOriginResourcePolicy::Blocked
{
return Response::network_error(NetworkError::CorsGeneral);
}
// Step 11. Set response's URL list to a clone of httpRequest's URL list.
response.url_list = http_request
.url_list
.iter()
.map(|claimed_url| claimed_url.url())
.collect();
// Step 12. If httpRequest's header list contains `Range`, then set response's range-requested flag.
if http_request.headers.contains_key(RANGE) {
response.range_requested = true;
}
// Step 13. Set response's request-includes-credentials to includeCredentials.
response.request_includes_credentials = include_credentials;
// Step 14. If response's status is 401, httpRequest's response tainting is not "cors",
// includeCredentials is true, and request's window is an environment settings object, then:
@@ -2442,7 +2459,7 @@ async fn cors_preflight_fetch(
// referrer policy, mode is "cors", and response tainting is "cors".
let mut preflight = RequestBuilder::new(
request.target_webview_id,
request.current_url_with_blob_claim(),
request.referrer.clone(),
)
.method(Method::OPTIONS)
@@ -2464,6 +2481,13 @@ async fn cors_preflight_fetch(
},
RequestPolicyContainer::PolicyContainer(policy_container) => policy_container.clone(),
})
.url_list(
request
.url_list
.iter()
.map(|claimed_url| claimed_url.url())
.collect(),
)
.build();
// Step 2. Append (`Accept`, `*/*`) to preflight's header list.


@@ -28,8 +28,8 @@ impl ProtocolHandler for BlobProtocolHander {
done_chan: &mut DoneChannel,
context: &FetchContext,
) -> Pin<Box<dyn Future<Output = Response> + Send>> {
let url_and_blob_claim = request.current_url_with_blob_claim();
debug!("Loading blob {}", url_and_blob_claim.as_str());
// Step 2.
if request.method != Method::GET {
@@ -39,16 +39,22 @@ impl ProtocolHandler for BlobProtocolHander {
let range_header = request.headers.typed_get::<Range>();
let is_range_request = range_header.is_some();
let (file_id, origin) = if let Some(token) = url_and_blob_claim.token() {
(token.file_id, token.origin.clone())
} else {
// FIXME: This should never happen, we should have acquired a token beforehand
let Ok((id, _)) = parse_blob_url(&url_and_blob_claim.url()) else {
return Box::pin(ready(Response::network_error(
NetworkError::ResourceLoadError("Invalid blob URL".into()),
)));
};
(id, url_and_blob_claim.url().origin())
};
let mut response = Response::new(
url_and_blob_claim.url(),
ResourceFetchTiming::new(request.timing_type()),
);
response.status = HttpStatus::default();
if is_range_request {
@@ -63,7 +69,7 @@ impl ProtocolHandler for BlobProtocolHander {
if let Err(err) = context.filemanager.fetch_file(
&mut done_sender,
context.cancellation_listener.clone(),
file_id,
&context.file_token,
origin,
&mut response,


@@ -11,6 +11,7 @@ use std::pin::Pin;
use headers::Range;
use http::StatusCode;
use log::error;
use net_traits::blob_url_store::UrlWithBlobClaim;
use net_traits::filemanager_thread::RelativePos;
use net_traits::request::Request;
use net_traits::response::Response;
@@ -219,7 +220,9 @@ impl ProtocolHandler for WebPageContentProtocolHandler {
// Ensure we did a proper substitution with a HTTP result
assert!(matches!(result_url.scheme(), "http" | "https"));
// Step 9. Navigate an appropriate navigable to resultURL.
request
.url_list
.push(UrlWithBlobClaim::new(result_url, None));
let request2 = request.clone();
let context2 = context.clone();
Box::pin(async move { fetch(request2, &mut DiscardFetch, &context2).await })


@@ -19,7 +19,7 @@ use embedder_traits::GenericEmbedderProxy;
use hyper_serde::Serde;
use ipc_channel::ipc::IpcSender;
use log::{debug, trace, warn};
use net_traits::blob_url_store::parse_blob_url;
use net_traits::blob_url_store::{BlobTokenCommunicator, parse_blob_url};
use net_traits::filemanager_thread::FileTokenCheck;
use net_traits::pub_domains::public_suffix_list_size_of;
use net_traits::request::{Destination, PreloadEntry, PreloadId, RequestBuilder, RequestId};
@@ -140,7 +140,13 @@ pub fn new_core_resource_thread(
let (public_setup_chan, public_setup_port) = generic_channel::channel().unwrap();
let (private_setup_chan, private_setup_port) = generic_channel::channel().unwrap();
let (report_chan, report_port) = generic_channel::channel().unwrap();
let (revoke_sender, revoke_receiver) = generic_channel::channel().unwrap();
let (refresh_sender, refresh_receiver) = generic_channel::channel().unwrap();
let blob_token_communicator = Arc::new(Mutex::new(BlobTokenCommunicator {
revoke_sender,
refresh_token_sender: refresh_sender,
}));
thread::Builder::new()
.name("ResourceManager".to_owned())
.spawn(move || {
@@ -150,6 +156,7 @@ pub fn new_core_resource_thread(
embedder_proxy.clone(),
ca_certificates.clone(),
ignore_certificate_errors,
blob_token_communicator,
);
let mut channel_manager = ResourceChannelManager {
@@ -167,6 +174,8 @@ pub fn new_core_resource_thread(
public_setup_port,
private_setup_port,
report_port,
revoke_receiver,
refresh_receiver,
protocols,
embedder_proxy,
)
@@ -241,11 +250,14 @@ fn create_http_states(
}
impl ResourceChannelManager {
#[expect(clippy::too_many_arguments)]
fn start(
&mut self,
public_receiver: GenericReceiver<CoreResourceMsg>,
private_receiver: GenericReceiver<CoreResourceMsg>,
memory_reporter: GenericReceiver<CoreResourceMsg>,
revoke_receiver: GenericReceiver<CoreResourceMsg>,
refresh_receiver: GenericReceiver<CoreResourceMsg>,
protocols: Arc<ProtocolRegistry>,
embedder_proxy: GenericEmbedderProxy<NetToEmbedderMsg>,
) {
@@ -260,6 +272,8 @@ impl ResourceChannelManager {
let private_id = rx_set.add(private_receiver);
let public_id = rx_set.add(public_receiver);
let reporter_id = rx_set.add(memory_reporter);
let revoker_id = rx_set.add(revoke_receiver);
let refresh_id = rx_set.add(refresh_receiver);
loop {
for received in rx_set.select().into_iter() {
@@ -270,7 +284,31 @@ impl ResourceChannelManager {
log::error!("Found selection error: {error}")
},
GenericSelectionResult::MessageReceived(id, msg) => {
if id == reporter_id {
if id == revoker_id {
let CoreResourceMsg::RevokeTokenForFile(revocation_request) = msg
else {
log::error!("Blob revocation channel received unexpected message");
continue;
};
self.resource_manager.filemanager.invalidate_token(
&FileTokenCheck::Required(revocation_request.token),
&revocation_request.blob_id,
)
} else if id == refresh_id {
let CoreResourceMsg::RefreshTokenForFile(refresh_request) = msg else {
log::error!("Blob refresh channel received unexpected message");
continue;
};
let FileTokenCheck::Required(refreshed_token) = self
.resource_manager
.filemanager
.get_token_for_file(&refresh_request.blob_id, true)
else {
unreachable!();
};
let _ = refresh_request.new_token_sender.send(refreshed_token);
} else if id == reporter_id {
if let CoreResourceMsg::CollectMemoryReport(report_chan) = msg {
self.process_report(
report_chan,
@@ -607,8 +645,10 @@ impl ResourceChannelManager {
let _ = sender.send(());
return false;
},
// Ignore this message as we handle it only in the reporter chan
CoreResourceMsg::CollectMemoryReport(_) => {},
// Ignore these messages as they are only sent on very specific channels.
CoreResourceMsg::CollectMemoryReport(_) |
CoreResourceMsg::RevokeTokenForFile(..) |
CoreResourceMsg::RefreshTokenForFile(..) => {},
}
true
}
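
The revoke/refresh dispatch added to `ResourceChannelManager::start` above can be sketched as a plain message loop over dedicated channels. This is a minimal sketch, not Servo's actual API: `BlobMsg`, `fresh_token`, and the `u64` token representation are all illustrative stand-ins, using `std::sync::mpsc` in place of Servo's generic channels.

```rust
use std::sync::mpsc;

// Hypothetical stand-ins for the revoke/refresh messages in the diff;
// names and the token representation are illustrative, not Servo's types.
enum BlobMsg {
    Revoke { blob_id: u64, token: u64 },
    Refresh { blob_id: u64, reply: mpsc::Sender<u64> },
}

// Mint a fresh token for a blob entry (placeholder token scheme).
fn fresh_token(blob_id: u64) -> u64 {
    blob_id.wrapping_add(100)
}

// Mirrors the dispatch loop: revocations invalidate a (blob, token) pair,
// refresh requests answer with a newly minted token on their reply channel.
fn handle(rx: mpsc::Receiver<BlobMsg>, revoked: &mut Vec<(u64, u64)>) {
    for msg in rx {
        match msg {
            BlobMsg::Revoke { blob_id, token } => revoked.push((blob_id, token)),
            BlobMsg::Refresh { blob_id, reply } => {
                let _ = reply.send(fresh_token(blob_id));
            },
        }
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    let (reply_tx, reply_rx) = mpsc::channel();
    tx.send(BlobMsg::Revoke { blob_id: 1, token: 42 }).unwrap();
    tx.send(BlobMsg::Refresh { blob_id: 7, reply: reply_tx }).unwrap();
    drop(tx); // close the channel so the loop terminates
    let mut revoked = Vec::new();
    handle(rx, &mut revoked);
    assert_eq!(revoked, vec![(1, 42)]);
    assert_eq!(reply_rx.recv().unwrap(), 107);
    println!("revoked: {revoked:?}");
}
```

The real code multiplexes these receivers through one `select` set alongside the public/private/reporter channels; separate channels keep blob bookkeeping from blocking behind ordinary fetch traffic.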
@@ -654,11 +694,12 @@ impl CoreResourceManager {
embedder_proxy: GenericEmbedderProxy<NetToEmbedderMsg>,
ca_certificates: CACertificates<'static>,
ignore_certificate_errors: bool,
blob_token_communicator: Arc<Mutex<BlobTokenCommunicator>>,
) -> CoreResourceManager {
CoreResourceManager {
devtools_sender,
sw_managers: Default::default(),
filemanager: FileManager::new(embedder_proxy.clone()),
filemanager: FileManager::new(embedder_proxy.clone(), blob_token_communicator),
request_interceptor: RequestInterceptor::new(embedder_proxy),
ca_certificates,
ignore_certificate_errors,
@@ -717,17 +758,18 @@ impl CoreResourceManager {
// In the case of a valid blob URL, acquiring a token granting access to a file,
// regardless of whether the URL is revoked after token acquisition.
//
// TODO: to make more tests pass, acquire this token earlier,
// probably in a separate message flow.
//
// In such a setup, the token would not be acquired here,
// but could instead be contained in the actual CoreResourceMsg::Fetch message.
//
// See https://github.com/servo/servo/issues/25226
// Ideally all callers should have claimed the blob entry themselves, but we're not there
// yet.
let (file_token, blob_url_file_id) = match url.scheme() {
"blob" => {
if let Ok((id, _)) = parse_blob_url(&url) {
(self.filemanager.get_token_for_file(&id), Some(id))
if let Some(token) = request.current_url_with_blob_claim().token() {
(FileTokenCheck::Required(token.token), Some(token.file_id))
} else if let Ok((id, _)) = parse_blob_url(&url) {
// See https://github.com/servo/servo/issues/25226
log::warn!(
"Failed to claim blob URL entry of valid blob URL before passing it to `net`. This causes race conditions."
);
(self.filemanager.get_token_for_file(&id, false), Some(id))
} else {
(FileTokenCheck::ShouldFail, None)
}

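The claim-or-parse fallback in `CoreResourceManager` above can be condensed to three paths: a token already attached to the URL, a late (racy) token acquisition from a parsed blob id, or failure. The sketch below is simplified and hypothetical: only the `FileTokenCheck` shape follows the diff, while `token_for_blob`, the `u64` ids, and the `issue` closure (standing in for `FileManager::get_token_for_file`) are illustrative.

```rust
// Simplified mirror of the diff's FileTokenCheck variants used here.
#[derive(Debug, PartialEq)]
enum FileTokenCheck {
    Required(u64),
    ShouldFail,
}

// `claimed` is a token already travelling with the URL (the UrlWithBlobClaim
// case); `parsed_id` is the blob id recovered by parsing the URL; `issue`
// stands in for late token acquisition from the file manager.
fn token_for_blob(
    claimed: Option<u64>,
    parsed_id: Option<u64>,
    issue: impl Fn(u64) -> u64,
) -> (FileTokenCheck, Option<u64>) {
    if let Some(token) = claimed {
        // Fast path: the caller claimed the blob entry up front.
        (FileTokenCheck::Required(token), parsed_id)
    } else if let Some(id) = parsed_id {
        // Fallback: acquire the token late; this races with revocation,
        // which is what the warning in the diff points out.
        (FileTokenCheck::Required(issue(id)), Some(id))
    } else {
        (FileTokenCheck::ShouldFail, None)
    }
}

fn main() {
    let issue = |id: u64| id + 100; // placeholder token mint
    assert_eq!(
        token_for_blob(Some(5), Some(1), issue),
        (FileTokenCheck::Required(5), Some(1))
    );
    assert_eq!(
        token_for_blob(None, Some(1), issue),
        (FileTokenCheck::Required(101), Some(1))
    );
    assert_eq!(
        token_for_blob(None, None, issue),
        (FileTokenCheck::ShouldFail, None)
    );
    println!("all token paths behave as sketched");
}
```

Pushing callers onto the fast path is the point of the warning log: once every caller claims the blob entry before handing the URL to `net`, the racy fallback can be removed.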

@@ -20,6 +20,7 @@ use hyper::service::service_fn;
use hyper::{Request as HyperRequest, Response as HyperResponse};
use hyper_util::rt::tokio::TokioIo;
use net_traits::AsyncRuntime;
use net_traits::blob_url_store::UrlWithBlobClaim;
use rustls_pki_types::pem::PemObject;
use rustls_pki_types::{CertificateDer, PrivateKeyDer, PrivatePkcs8KeyDer};
use servo_default_resources as _;
@@ -79,7 +80,7 @@ impl Server {
}
}
pub fn make_server<H>(handler: H) -> (Server, ServoUrl)
pub fn make_server<H>(handler: H) -> (Server, UrlWithBlobClaim)
where
H: Fn(HyperRequest<Incoming>, &mut HyperResponse<BoxBody<Bytes, hyper::Error>>)
+ Send
@@ -99,7 +100,7 @@ where
);
let url_string = format!("http://localhost:{}", listener.local_addr().unwrap().port());
let url = ServoUrl::parse(&url_string).unwrap();
let url = UrlWithBlobClaim::new(ServoUrl::parse(&url_string).unwrap(), None);
let graceful = hyper_util::server::graceful::GracefulShutdown::new();
@@ -175,7 +176,7 @@ fn load_private_key_from_file(
}
}
pub fn make_ssl_server<H>(handler: H) -> (Server, ServoUrl)
pub fn make_ssl_server<H>(handler: H) -> (Server, UrlWithBlobClaim)
where
H: Fn(HyperRequest<Incoming>, &mut HyperResponse<BoxBody<Bytes, hyper::Error>>)
+ Send
@@ -194,7 +195,7 @@ where
);
let url_string = format!("http://localhost:{}", listener.local_addr().unwrap().port());
let url = ServoUrl::parse(&url_string).unwrap();
let url = UrlWithBlobClaim::new(ServoUrl::parse(&url_string).unwrap(), None);
let cert_path = Path::new("../../resources/self_signed_certificate_for_testing.crt")
.canonicalize()


@@ -7,6 +7,7 @@ use std::ops::Deref;
use headers::{ContentType, HeaderMapExt};
use hyper_serde::Serde;
use mime::{self, Mime};
use net_traits::blob_url_store::UrlWithBlobClaim;
use net_traits::request::Referrer;
use net_traits::response::ResponseBody;
use net_traits::{FetchMetadata, FilteredMetadata, NetworkError};
@@ -24,7 +25,7 @@ fn assert_parse(
) {
use net_traits::request::RequestBuilder;
let url = ServoUrl::parse(url).unwrap();
let url = UrlWithBlobClaim::new(ServoUrl::parse(url).unwrap(), None);
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
.pipeline_id(None)


@@ -32,6 +32,7 @@ use net::filemanager_thread::FileManager;
use net::hsts::HstsEntry;
use net::protocols::ProtocolRegistry;
use net::request_interceptor::RequestInterceptor;
use net_traits::blob_url_store::{BlobTokenCommunicator, UrlWithBlobClaim};
use net_traits::filemanager_thread::FileTokenCheck;
use net_traits::http_status::HttpStatus;
use net_traits::request::{
@@ -83,10 +84,14 @@ fn test_fetch_response_is_not_network_error() {
#[test]
fn test_fetch_on_bad_port_is_network_error() {
let url = ServoUrl::parse("http://www.example.org:6667").unwrap();
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
.policy_container(Default::default())
.build();
let request = RequestBuilder::new(
Some(TEST_WEBVIEW_ID),
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.origin(url.origin())
.policy_container(Default::default())
.build();
let fetch_response = fetch(request, None);
assert!(fetch_response.is_network_error());
let fetch_error = fetch_response.get_network_error().unwrap();
@@ -124,10 +129,14 @@ fn test_fetch_response_body_matches_const_message() {
#[test]
fn test_fetch_aboutblank() {
let url = ServoUrl::parse("about:blank").unwrap();
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
.policy_container(Default::default())
.build();
let request = RequestBuilder::new(
Some(TEST_WEBVIEW_ID),
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.origin(url.origin())
.policy_container(Default::default())
.build();
let fetch_response = fetch(request, None);
// We should see an opaque-filtered response.
@@ -187,10 +196,14 @@ fn test_fetch_blob() {
.promote_memory(id.clone(), blob_buf, true, origin.origin());
let url = ServoUrl::parse(&format!("blob:{}{}", origin.as_str(), id.simple())).unwrap();
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(origin.origin())
.policy_container(Default::default())
.build();
let request = RequestBuilder::new(
Some(TEST_WEBVIEW_ID),
UrlWithBlobClaim::from_url_without_having_claimed_blob(url.clone()),
Referrer::NoReferrer,
)
.origin(origin.origin())
.policy_container(Default::default())
.build();
let (sender, receiver) = unbounded();
@@ -227,10 +240,14 @@ fn test_file() {
.unwrap();
let url = ServoUrl::from_file_path(path.clone()).unwrap();
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
.policy_container(Default::default())
.build();
let request = RequestBuilder::new(
Some(TEST_WEBVIEW_ID),
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.origin(url.origin())
.policy_container(Default::default())
.build();
let mut context = new_fetch_context(None, None);
let fetch_response = fetch_with_context(request, &mut context);
@@ -269,10 +286,14 @@ fn test_file() {
#[test]
fn test_fetch_ftp() {
let url = ServoUrl::parse("ftp://not-supported").unwrap();
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
.policy_container(Default::default())
.build();
let request = RequestBuilder::new(
Some(TEST_WEBVIEW_ID),
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.origin(url.origin())
.policy_container(Default::default())
.build();
let fetch_response = fetch(request, None);
assert!(fetch_response.is_network_error());
}
@@ -280,10 +301,14 @@ fn test_fetch_ftp() {
#[test]
fn test_fetch_bogus_scheme() {
let url = ServoUrl::parse("bogus://whatever").unwrap();
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
.policy_container(Default::default())
.build();
let request = RequestBuilder::new(
Some(TEST_WEBVIEW_ID),
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.origin(url.origin())
.policy_container(Default::default())
.build();
let fetch_response = fetch(request, None);
assert!(fetch_response.is_network_error());
}
@@ -699,7 +724,7 @@ fn test_fetch_with_local_urls_only() {
};
let (server, server_url) = make_server(handler);
let do_fetch = |url: ServoUrl| {
let do_fetch = |url: UrlWithBlobClaim| {
let mut request =
RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
@@ -712,7 +737,7 @@ fn test_fetch_with_local_urls_only() {
fetch(request, None)
};
let local_url = ServoUrl::parse("about:blank").unwrap();
let local_url = UrlWithBlobClaim::new(ServoUrl::parse("about:blank").unwrap(), None);
let local_response = do_fetch(local_url);
let server_response = do_fetch(server_url);
@@ -745,7 +770,10 @@ fn test_fetch_with_hsts() {
state: Arc::new(create_http_state(None)),
user_agent: DEFAULT_USER_AGENT.into(),
devtools_chan: None,
filemanager: FileManager::new(embedder_proxy.clone()),
filemanager: FileManager::new(
embedder_proxy.clone(),
BlobTokenCommunicator::stub_for_testing(),
),
file_token: FileTokenCheck::NotRequired,
request_interceptor: Arc::new(TokioMutex::new(RequestInterceptor::new(embedder_proxy))),
cancellation_listener: Arc::new(Default::default()),
@@ -808,7 +836,10 @@ fn test_load_adds_host_to_hsts_list_when_url_is_https() {
state: Arc::new(create_http_state(None)),
user_agent: DEFAULT_USER_AGENT.into(),
devtools_chan: None,
filemanager: FileManager::new(embedder_proxy.clone()),
filemanager: FileManager::new(
embedder_proxy.clone(),
BlobTokenCommunicator::stub_for_testing(),
),
file_token: FileTokenCheck::NotRequired,
request_interceptor: Arc::new(TokioMutex::new(RequestInterceptor::new(embedder_proxy))),
cancellation_listener: Arc::new(Default::default()),
@@ -876,7 +907,10 @@ fn test_fetch_self_signed() {
state: Arc::new(create_http_state(None)),
user_agent: DEFAULT_USER_AGENT.into(),
devtools_chan: None,
filemanager: FileManager::new(embedder_proxy.clone()),
filemanager: FileManager::new(
embedder_proxy.clone(),
BlobTokenCommunicator::stub_for_testing(),
),
file_token: FileTokenCheck::NotRequired,
request_interceptor: Arc::new(TokioMutex::new(RequestInterceptor::new(embedder_proxy))),
cancellation_listener: Arc::new(Default::default()),
@@ -1439,7 +1473,7 @@ fn test_fetch_with_devtools() {
);
let httprequest = DevtoolsHttpRequest {
url: url,
url: url.url(),
method: Method::GET,
headers: headers,
body: Some(vec![].into()),
@@ -1522,7 +1556,10 @@ fn test_fetch_request_intercepted() {
state: Arc::new(create_http_state(None)),
user_agent: DEFAULT_USER_AGENT.into(),
devtools_chan: None,
filemanager: FileManager::new(embedder_proxy.clone()),
filemanager: FileManager::new(
embedder_proxy.clone(),
BlobTokenCommunicator::stub_for_testing(),
),
file_token: FileTokenCheck::NotRequired,
request_interceptor: Arc::new(TokioMutex::new(RequestInterceptor::new(embedder_proxy))),
cancellation_listener: Arc::new(Default::default()),
@@ -1538,10 +1575,14 @@ fn test_fetch_request_intercepted() {
};
let url = ServoUrl::parse("http://www.example.org").unwrap();
let request = RequestBuilder::new(Some(TEST_WEBVIEW_ID), url.clone(), Referrer::NoReferrer)
.origin(url.origin())
.policy_container(Default::default())
.build();
let request = RequestBuilder::new(
Some(TEST_WEBVIEW_ID),
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.origin(url.origin())
.policy_container(Default::default())
.build();
let response = fetch_with_context(request, &mut context);
assert!(


@@ -13,7 +13,7 @@ use ipc_channel::ipc;
use net::async_runtime::init_async_runtime;
use net::embedder::NetToEmbedderMsg;
use net::filemanager_thread::FileManager;
use net_traits::blob_url_store::BlobURLStoreError;
use net_traits::blob_url_store::{BlobTokenCommunicator, BlobURLStoreError};
use net_traits::filemanager_thread::{
FileManagerThreadError, FileManagerThreadMsg, ReadFileProgress,
};
@@ -32,7 +32,7 @@ fn test_filemanager() {
servo_config::prefs::set(preferences);
let (embedder_proxy, embedder_receiver) = create_generic_embedder_proxy_and_receiver();
let filemanager = FileManager::new(embedder_proxy);
let filemanager = FileManager::new(embedder_proxy, BlobTokenCommunicator::stub_for_testing());
// Try to open a dummy file "components/net/tests/test.jpeg" in tree
let mut handler = File::open("tests/test.jpeg").expect("test.jpeg is stolen");


@@ -5,6 +5,7 @@
use http::header::{CONTENT_LENGTH, CONTENT_RANGE, EXPIRES, HeaderValue, RANGE};
use http::{HeaderMap, StatusCode};
use net::http_cache::{CacheKey, HttpCache, refresh};
use net_traits::blob_url_store::UrlWithBlobClaim;
use net_traits::request::{Referrer, RequestBuilder};
use net_traits::response::{Response, ResponseBody};
use net_traits::{ResourceFetchTiming, ResourceTimingType};
@@ -20,10 +21,14 @@ async fn test_refreshing_resource_sets_done_chan_the_appropriate_value() {
ResponseBody::Done(vec![]),
];
let url = ServoUrl::parse("https://servo.org").unwrap();
let request = RequestBuilder::new(None, url.clone(), Referrer::NoReferrer)
.pipeline_id(Some(TEST_PIPELINE_ID))
.origin(url.origin())
.build();
let request = RequestBuilder::new(
None,
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.pipeline_id(Some(TEST_PIPELINE_ID))
.origin(url.origin())
.build();
let timing = ResourceFetchTiming::new(ResourceTimingType::Navigation);
let mut response = Response::new(url.clone(), timing);
// Expires header makes the response cacheable.
@@ -71,11 +76,15 @@ async fn test_skip_incomplete_cache_for_range_request_with_no_end_bound() {
RANGE,
HeaderValue::from_str(&format!("bytes={}-", 0)).unwrap(),
);
let request = RequestBuilder::new(None, url.clone(), Referrer::NoReferrer)
.pipeline_id(Some(TEST_PIPELINE_ID))
.origin(url.origin())
.headers(headers)
.build();
let request = RequestBuilder::new(
None,
UrlWithBlobClaim::new(url.clone(), None),
Referrer::NoReferrer,
)
.pipeline_id(Some(TEST_PIPELINE_ID))
.origin(url.origin())
.headers(headers)
.build();
// Store incomplete response to http_cache
let timing = ResourceFetchTiming::new(ResourceTimingType::Navigation);

Some files were not shown because too many files have changed in this diff.