Compare commits


24 Commits

Author SHA1 Message Date
Teffen Ellis
90f27f93e1 web/e2e: fix three regressions blocking the parallel suite
Locally and in CI the new `e2e (playwright)` job appeared to "hang"
under `fullyParallel: true` + `workers: "50%"`. The hang was actually
five tests sharing two unrelated bugs that all manifest as 30s test
timeouts; the cluster only *looks* like a parallelism issue because
multiple workers stall on the same wall-clock window. With these three
fixes the full suite is green in 1m48s on `--workers=2` (was: 5 failed
/ 17 passed in 5m30s).

1. `web/test/browser/600-providers.test.ts`
   PR #21647 dropped the `to:` argument on the `session.login()` call
   in this file's `beforeEach`. Without it, `SessionFixture.login()`
   waits for the auth-flow URL pattern to re-appear — which it does
   immediately, since we just navigated there — so the helper returns
   *before* the post-login redirect lands. The wizard buttons probed
   afterward live on `/if/admin/#/core/providers`, which the user never
   actually reaches; every test in the file then hits the 30s
   `beforeEach` timeout. Pin the destination explicitly, matching the
   shape of every other test file.

2. `web/src/admin/roles/ak-role-list.ts`
   The role-list row anchor had no aria-label, so its accessible name
   was the (random, generated) role name. `500-roles.test.ts` searches
   for that anchor with `getByRole("link", { name: "view details" })`
   — the same selector `400-groups.test.ts` uses against the group
   list, where `GroupListPage.row()` *does* set
   `aria-label="View details of group ..."`. Bring the role row to
   parity with groups: the test wasn't wrong; the UI was missing the
   accessibility hook.

3. `web/test/browser/500-roles.test.ts` ("Edit role from view page")
   The post-edit verification used `page.getByText(updatedName)`, but
   on the role view page the new name renders in two places (the
   "Role <name>" page-navbar heading and the description-list value),
   so the bare text match resolves to two elements and trips
   strict-mode. Add `{ exact: true }` so we assert the canonical value
   the edit wrote rather than the heading template.
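The strict-mode behavior behind this fix can be illustrated with a toy model (plain JavaScript, not the real Playwright API): default text matching is substring-based, so the updated name matches both the heading and the description-list value, while `exact: true` narrows it to one.

```javascript
// Toy model of Playwright's text matching. By default getByText
// substring-matches, so a name appearing in both the "Role <name>"
// heading and the description-list value resolves to two elements
// and trips strict mode; { exact: true } narrows it to one.
const texts = ["Role updated-name", "updated-name"];

function getByText(nodes, name, { exact = false } = {}) {
    return nodes.filter((t) => (exact ? t === name : t.includes(name)));
}

console.log(getByText(texts, "updated-name").length); // 2 -> strict-mode violation
console.log(getByText(texts, "updated-name", { exact: true }).length); // 1
```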

Co-Authored-By: Agent (authentik-i21996-internal-achievable-raisin) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 15:42:07 +00:00
Teffen Ellis
d92bfc473f ci/web: post Playwright result comment + gated S3 upload + !cancelled() guards
Three reviewer-facing improvements to the e2e job:

1. Idempotent PR comment summarising Playwright pass/fail/flaky/skipped
   counts. Marker `<!-- playwright-result -->` lets re-runs edit the
   same comment instead of piling up. Skipped on fork PRs where the
   default GITHUB_TOKEN is read-only.

2. Optional S3 publish of the HTML report to
   `s3://authentik-playwright-artifacts/pr-<n>/run-<id>/attempt-<n>/`,
   gated behind `vars.PLAYWRIGHT_S3_ENABLED == 'true'`. The bucket is
   pending infra provisioning; the public URL pattern is already wired
   into the comment so flipping the variable on later requires no
   workflow changes. Borrows the OIDC + IAM role plumbing from
   `.github/workflows/release-publish.yml`.

3. Switch the failure-guarded reporting/upload steps to `!cancelled()`
   so a superseded (cancelled) run no longer emits failure-shaped noise,
   and so successful runs still produce the artifact bundle reviewers
   expect.
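The marker-comment idempotency from item 1 reduces to a find-or-create over the PR's comment list; a minimal sketch (plain JavaScript, with an in-memory stand-in for the GitHub comments API):

```javascript
// Toy model of the idempotent-comment pattern: re-runs find the
// existing marker comment and edit it instead of posting a new one.
const MARKER = "<!-- playwright-result -->";

function upsertComment(comments, body) {
    const existing = comments.find((c) => c.body.includes(MARKER));
    if (existing) existing.body = `${MARKER}\n${body}`;
    else comments.push({ body: `${MARKER}\n${body}` });
    return comments;
}

let comments = [];
comments = upsertComment(comments, "run 1: 20 passed, 2 failed");
comments = upsertComment(comments, "run 2: 22 passed, 0 failed");
console.log(comments.length); // 1 — the re-run edited the same comment
```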

Adds the Playwright JSON reporter so the parse step can pull pass/fail
counts from `playwright-report/results.json` for the comment body.
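A minimal sketch of that parse step, assuming the `stats` block shape of Playwright's JSON reporter (the sample totals below are illustrative, not a real run):

```javascript
// In CI the input would come from:
//   JSON.parse(fs.readFileSync("playwright-report/results.json", "utf8"))
// Here a hard-coded sample stands in for the reporter output.
const sample = {
    stats: { expected: 22, unexpected: 0, flaky: 1, skipped: 2 },
};

function summarize({ stats }) {
    return `passed: ${stats.expected}, failed: ${stats.unexpected}, ` +
        `flaky: ${stats.flaky}, skipped: ${stats.skipped}`;
}

console.log(summarize(sample));
```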

Co-Authored-By: Agent (authentik-i21996-internal-achievable-raisin) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 14:57:35 +00:00
Teffen Ellis
4d3ac3f63a Flesh out types. 2026-05-01 16:05:16 +02:00
Teffen Ellis
e29d37bb8a Always parallel. 2026-05-01 15:48:29 +02:00
Teffen Ellis
6fee629926 Update expected path. 2026-05-01 04:04:57 +02:00
Teffen Ellis
cad8395dad Ignore playwright-traces. 2026-05-01 04:04:57 +02:00
Teffen Ellis
191ecc51bd Reorder tests. 2026-05-01 04:04:57 +02:00
Teffen Ellis
71e9092810 Remove guard. 2026-05-01 04:04:56 +02:00
Teffen Ellis
34dbb78df0 Use parallelism. 2026-05-01 04:04:56 +02:00
Teffen Ellis
e29a3eb35a ci/web: format test-admin-user.yaml with prettier
Pick up the 4-space indent that web/'s prettier config enforces. The
file was added under issue #21994 with 2-space indent and tripped the
ci-web format check on push.

Co-Authored-By: Agent (authentik-i21994-better-mobile-tangelo) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 04:04:56 +02:00
Teffen Ellis
fe8b3d1687 ci/web: make test-admin blueprint self-contained
The previous blueprint used `!Find` to look up the `authentik Admins` group,
which raced against `system/bootstrap.yaml`: when the explicit
`apply_blueprint` step ran before the worker had applied bootstrap, the
lookup resolved to `None` and the serializer rejected `groups: [None]` with
`Invalid pk "None"`.

Define the group in the same blueprint with `state: present` and reference
it via `!KeyOf`, so the test-admin setup does not depend on any pre-existing
data. If bootstrap has already created the group, `state: present` is a
no-op on the identifiers; otherwise the group is created here.
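The self-contained pattern reads roughly like this minimal blueprint sketch (model names follow authentik's blueprint conventions; identifiers are illustrative, not the actual file contents):

```yaml
version: 1
entries:
    - model: authentik_core.group
      id: admin-group
      state: present
      identifiers:
          name: authentik Admins
    - model: authentik_core.user
      state: present
      identifiers:
          username: test-admin
      attrs:
          groups:
              # !KeyOf resolves within this blueprint, so no pre-existing
              # group (and no race against bootstrap) is required.
              - !KeyOf admin-group
```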

Co-Authored-By: Agent (authentik-i21994-better-mobile-tangelo) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 04:04:56 +02:00
Teffen Ellis
0fd2a13b35 Bump package.json 2026-05-01 04:04:55 +02:00
Teffen Ellis
17d49aa99a ci/web: run Playwright e2e suite on every PR
Boots the full authentik stack (postgres + Go server + Rust worker)
inside the existing ci-web workflow, applies migrations and the
test-admin user blueprint, then runs `corepack npm run --prefix web
test:e2e` against http://localhost:9000. Uploads the HTML report,
traces/videos, and authentik logs as artifacts on failure so reviewers
can debug without rerunning locally.

Also enables the HTML reporter and screenshot/video capture on CI in
playwright.config.js, and updates the full dev-environment docs to
point at the same npm scripts CI uses so local and CI runs stay in
lockstep.

Closes #21994

Co-Authored-By: Agent (authentik-i21994-better-mobile-tangelo) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 04:04:55 +02:00
Teffen Ellis
b9389544be Flesh out github actions. 2026-05-01 01:23:13 +02:00
Teffen Ellis
ed82b3f623 Lint. 2026-05-01 01:21:05 +02:00
Teffen Ellis
b281bb819d Bump. 2026-05-01 01:21:05 +02:00
Teffen Ellis
3ca633c10a lint. 2026-05-01 01:21:05 +02:00
Teffen Ellis
ad1582c43f Clean up docs container. 2026-05-01 01:21:05 +02:00
Teffen Ellis
53f2826ea1 Flesh out github actions. 2026-05-01 01:21:03 +02:00
Teffen Ellis
ccf7225f03 Update makefile. 2026-05-01 01:20:18 +02:00
Teffen Ellis
85469c86d1 Prep containers. 2026-05-01 01:20:18 +02:00
Teffen Ellis
0bfce2d1f9 Bump engines. 2026-05-01 01:20:18 +02:00
Teffen Ellis
7b4e175d59 Flesh out node scripts. 2026-05-01 01:20:18 +02:00
Teffen Ellis
b47f6f8e56 Fix mounted references. 2026-05-01 01:20:17 +02:00
281 changed files with 6343 additions and 12043 deletions

.github/actions/setup-node/action.yml vendored Normal file

@@ -0,0 +1,81 @@
name: "Setup Node.js and NPM"
description: "Sets up Node.js with a specific NPM version via Corepack"
inputs:
working-directory:
description: "Path to the working directory containing the package.json file"
required: false
default: "."
dependencies:
required: false
description: "List of dependencies to setup"
default: "monorepo,working-directory"
node-version-file:
description: "Path to file containing the Node.js version"
required: false
default: "package.json"
cache-dependency-path:
description: "Path to dependency lock file for caching"
required: false
default: "package-lock.json"
cache:
description: "Package manager to cache"
default: "npm"
registry-url:
description: "npm registry URL"
default: "https://registry.npmjs.org"
runs:
using: "composite"
steps:
- name: Setup Node.js (Corepack bootstrap)
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: ${{ inputs.node-version-file }}
registry-url: ${{ inputs.registry-url }}
# The setup-node action will attempt to create a cache using a version of
# npm that may not be compatible with the range specified in package.json.
# This can be enabled **after** corepack is installed and the correct npm version is available.
package-manager-cache: false
- name: Install Corepack
working-directory: ${{ github.workspace}}
shell: bash
run: | #shell
node ./scripts/node/lint-runtime.mjs
node ./scripts/node/setup-corepack.mjs --force
corepack enable
- name: Lint Node.js and NPM versions
shell: bash
run: node ./scripts/node/lint-runtime.mjs
- name: Setup Node.js (Monorepo Root)
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: ${{ inputs.node-version-file }}
cache: ${{ inputs.cache }}
cache-dependency-path: ${{ inputs.cache-dependency-path }}
registry-url: ${{ inputs.registry-url }}
- name: Install monorepo dependencies
if: ${{ contains(inputs.dependencies, 'monorepo') }}
shell: bash
run: | #shell
node ./scripts/node/lint-lockfile.mjs
corepack npm ci
- name: Setup Node.js (Working Directory)
if: ${{ contains(inputs.dependencies, 'working-directory') }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: ${{ inputs.working-directory }}/${{ inputs.node-version-file }}
cache: ${{ inputs.cache }}
cache-dependency-path: ${{ inputs.working-directory }}/${{ inputs.cache-dependency-path }}
registry-url: ${{ inputs.registry-url }}
- name: Install working directory dependencies
if: ${{ contains(inputs.dependencies, 'working-directory') }}
shell: bash
run: | # shell
corepack install
echo "node version: $(node --version)"
echo "npm version: $(corepack npm --version)"
node ./scripts/node/lint-lockfile.mjs ${{ inputs.working-directory }}
corepack npm ci --prefix ${{ inputs.working-directory }}


@@ -18,19 +18,24 @@ runs:
using: "composite"
steps:
- name: Cleanup apt
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies, 'python') }}
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies,
'python') }}
shell: bash
run: sudo apt-get remove --purge man-db
- name: Install apt deps
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies, 'python') }}
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies,
'python') }}
uses: gerlero/apt-install@f4fa5265092af9e750549565d28c99aec7189639
with:
packages: libpq-dev openssl libxmlsec1-dev pkg-config gettext krb5-multidev libkrb5-dev heimdal-multidev libclang-dev krb5-kdc krb5-user krb5-admin-server
packages: libpq-dev openssl libxmlsec1-dev pkg-config gettext krb5-multidev
libkrb5-dev heimdal-multidev libclang-dev krb5-kdc krb5-user
krb5-admin-server
update: true
upgrade: false
install-recommends: false
- name: Make space on disk
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies, 'python') }}
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies,
'python') }}
shell: bash
run: |
sudo mkdir -p /tmp/empty/
@@ -49,9 +54,10 @@ runs:
if: ${{ contains(inputs.dependencies, 'python') }}
shell: bash
working-directory: ${{ inputs.working-directory }}
run: uv sync --all-extras --dev --locked
run: uv sync --all-extras --dev --frozen
- name: Setup rust (stable)
if: ${{ contains(inputs.dependencies, 'rust') && !contains(inputs.dependencies, 'rust-nightly') }}
if: ${{ contains(inputs.dependencies, 'rust') && !contains(inputs.dependencies,
'rust-nightly') }}
uses: actions-rust-lang/setup-rust-toolchain@2b1f5e9b395427c92ee4e3331786ca3c37afe2d7 # v1
with:
rustflags: ""
@@ -64,30 +70,14 @@ runs:
rustflags: ""
- name: Setup rust dependencies
if: ${{ contains(inputs.dependencies, 'rust') }}
uses: taiki-e/install-action@db5fb34fa772531a3ece57ca434f579eb334e0fb # v2
uses: taiki-e/install-action@481c34c1cf3a84c68b5e46f4eccfc82af798415a # v2
with:
tool: cargo-deny cargo-machete cargo-llvm-cov nextest
- name: Setup node (web)
- name: Setup node (root, web)
if: ${{ contains(inputs.dependencies, 'node') }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
uses: ./.github/actions/setup-node
with:
node-version-file: "${{ inputs.working-directory }}web/package.json"
cache: "npm"
cache-dependency-path: "${{ inputs.working-directory }}web/package-lock.json"
registry-url: "https://registry.npmjs.org"
- name: Setup node (root)
if: ${{ contains(inputs.dependencies, 'node') }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: "${{ inputs.working-directory }}package.json"
cache: "npm"
cache-dependency-path: "${{ inputs.working-directory }}package-lock.json"
registry-url: "https://registry.npmjs.org"
- name: Install Node deps
if: ${{ contains(inputs.dependencies, 'node') }}
shell: bash
working-directory: ${{ inputs.working-directory }}
run: npm ci
working-directory: web
- name: Setup go
if: ${{ contains(inputs.dependencies, 'go') }}
uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v5
@@ -97,7 +87,9 @@ runs:
if: ${{ contains(inputs.dependencies, 'runtime') }}
uses: AndreKurait/docker-cache@0fe76702a40db986d9663c24954fc14c6a6031b7
with:
key: docker-images-${{ runner.os }}-${{ hashFiles('.github/actions/setup/compose.yml', 'Makefile') }}-${{ inputs.postgresql_version }}
key: docker-images-${{ runner.os }}-${{
hashFiles('.github/actions/setup/compose.yml', 'Makefile') }}-${{
inputs.postgresql_version }}
- name: Setup dependencies
if: ${{ contains(inputs.dependencies, 'runtime') }}
shell: bash
@@ -105,7 +97,7 @@ runs:
run: |
export PSQL_TAG=${{ inputs.postgresql_version }}
docker compose -f .github/actions/setup/compose.yml up -d --wait
cd web && npm ci
corepack npm ci --prefix web
- name: Generate config
if: ${{ contains(inputs.dependencies, 'python') }}
shell: uv run python {0}


@@ -67,6 +67,16 @@ jobs:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: ./.github/actions/setup-node
with:
working-directory: web
dependencies: "monorepo"
- uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # v6
with:
go-version-file: "go.mod"
- name: Generate API Clients
run: |
make gen-client-ts
- name: Build Docker Image
uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7.1.0
id: push
@@ -81,7 +91,8 @@ jobs:
${{ steps.ev.outputs.imageBuildArgs }}
tags: ${{ steps.ev.outputs.imageTags }}
platforms: linux/${{ inputs.image_arch }}
cache-from: type=registry,ref=${{ steps.ev.outputs.attestImageNames }}:buildcache-${{ inputs.image_arch }}
cache-from: type=registry,ref=${{ steps.ev.outputs.attestImageNames
}}:buildcache-${{ inputs.image_arch }}
cache-to: ${{ steps.ev.outputs.cacheTo }}
- uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v3
id: attest


@@ -90,7 +90,7 @@ jobs:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: int128/docker-manifest-create-action@fa55f72001a6c74b0f4997dca65c70d334905180 # v2
- uses: int128/docker-manifest-create-action@7df7f9e221d927eaadf87db231ddf728047308a4 # v2
id: build
with:
tags: ${{ matrix.tag }}

.github/workflows/api-ts-publish.yml vendored Normal file

@@ -0,0 +1,65 @@
---
name: API - Publish Typescript client
on:
push:
branches: [main]
paths:
- "schema.yml"
workflow_dispatch:
permissions:
# Required for NPM OIDC trusted publisher
id-token: write
contents: read
jobs:
build:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: actions/create-github-app-token@f8d387b68d61c58ab83c6c016672934102569859 # v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIV_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
token: ${{ steps.generate_token.outputs.token }}
- uses: ./.github/actions/setup-node
with:
working-directory: web
- name: Generate API Client
run: make gen-client-ts
- name: Publish package
working-directory: gen-ts-api/
run: |
npm i
npm publish --tag generated
- name: Upgrade /web
working-directory: web
run: |
export VERSION=`node -e 'import mod from "./gen-ts-api/package.json" with { type: "json" };console.log(mod.version);'`
npm i @goauthentik/api@$VERSION
- name: Upgrade /web/packages/sfe
working-directory: web/packages/sfe
run: |
export VERSION=`node -e 'import mod from "./gen-ts-api/package.json" with { type: "json" };console.log(mod.version);'`
npm i @goauthentik/api@$VERSION
- uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
id: cpr
with:
token: ${{ steps.generate_token.outputs.token }}
branch: update-web-api-client
commit-message: "web: bump API Client version"
title: "web: bump API Client version"
body: "web: bump API Client version"
delete-branch: true
signoff: true
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
labels: dependencies
- uses: peter-evans/enable-pull-request-automerge@a660677d5469627102a1c1e11409dd063606628d # v3
with:
token: ${{ steps.generate_token.outputs.token }}
pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}
merge-method: squash


@@ -22,25 +22,19 @@ jobs:
- prettier-check
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Install Dependencies
working-directory: website/
run: npm ci
- uses: ./.github/actions/setup-node
with:
working-directory: website
- name: Lint
working-directory: website/
run: npm run ${{ matrix.command }}
run: corepack npm run ${{ matrix.command }} --prefix website
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/
name: Install Dependencies
run: npm ci
working-directory: website
- uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
with:
path: |
@@ -54,7 +48,7 @@ jobs:
working-directory: website
env:
NODE_ENV: production
run: npm run build -w api
run: corepack npm run build -w api
- uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v4
with:
name: api-docs
@@ -71,11 +65,9 @@ jobs:
with:
name: api-docs
path: website/api/build
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
working-directory: website
- name: Deploy Netlify (Production)
working-directory: website/api
if: github.event_name == 'push' && github.ref == 'refs/heads/main'


@@ -24,14 +24,9 @@ jobs:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
uses: ./.github/actions/setup
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: lifecycle/aws/package.json
cache: "npm"
cache-dependency-path: lifecycle/aws/package-lock.json
- working-directory: lifecycle/aws/
run: |
npm ci
working-directory: lifecycle/aws
- name: Check changes have been applied
run: |
uv run make aws-cfn


@@ -24,46 +24,34 @@ jobs:
- prettier-check
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Install dependencies
working-directory: website/
run: npm ci
- uses: ./.github/actions/setup-node
with:
working-directory: website
- name: Lint
working-directory: website/
run: npm run ${{ matrix.command }}
run: corepack npm run ${{ matrix.command }} --prefix website
build-docs:
runs-on: ubuntu-latest
env:
NODE_ENV: production
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
name: Setup Node.js
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/
name: Install Dependencies
run: npm ci
working-directory: website
- name: Build Documentation via Docusaurus
working-directory: website/
run: npm run build
run: corepack npm run build --prefix website
build-integrations:
runs-on: ubuntu-latest
env:
NODE_ENV: production
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/
name: Install Dependencies
run: npm ci
working-directory: website
- name: Build Integrations via Docusaurus
working-directory: website/
run: npm run build -w integrations
run: corepack npm run build -w integrations --prefix website
build-container:
runs-on: ubuntu-latest
permissions:
@@ -104,7 +92,9 @@ jobs:
platforms: linux/amd64,linux/arm64
context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache,mode=max' || '' }}
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' &&
'type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache,mode=max'
|| '' }}
- uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v3
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}


@@ -73,7 +73,8 @@ jobs:
- name: generate API clients
run: make gen-clients
- name: ensure schema is up-to-date
run: git diff --exit-code -- schema.yml blueprints/schema.json packages/client-go packages/client-rust packages/client-ts
run: git diff --exit-code -- schema.yml blueprints/schema.json
packages/client-go packages/client-rust packages/client-ts
test-migrations:
runs-on: ubuntu-latest
steps:
@@ -91,7 +92,8 @@ jobs:
outputs:
seed: ${{ steps.seed.outputs.seed }}
test-migrations-from-stable:
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }} - Run ${{
matrix.run_id }}/5
runs-on: ubuntu-latest
timeout-minutes: 30
needs: test-make-seed
@@ -101,7 +103,7 @@ jobs:
psql:
- 14-alpine
- 18-alpine
run_id: [1, 2, 3, 4, 5]
run_id: [ 1, 2, 3, 4, 5 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
@@ -109,8 +111,13 @@ jobs:
- name: checkout stable
run: |
set -e -o pipefail
cp -R .github ..
cp -R scripts ..
mkdir -p ../packages
cp -R packages/logger-js ../packages/logger-js
# Previous stable tag
prev_stable=$(git tag --sort=version:refname | grep '^version/' | grep -vE -- '-rc[0-9]+$' | tail -n1)
# Current version family based on
@@ -118,10 +125,13 @@ jobs:
if [[ -n $current_version_family ]]; then
prev_stable="version/${current_version_family}"
fi
echo "::notice::Checking out ${prev_stable} as stable version..."
git checkout ${prev_stable}
rm -rf .github/ scripts/
rm -rf .github/ scripts/ packages/logger-js/
mv ../.github ../scripts .
mv ../packages/logger-js ./packages/
- name: Setup authentik env (stable)
uses: ./.github/actions/setup
with:
@@ -169,7 +179,7 @@ jobs:
psql:
- 14-alpine
- 18-alpine
run_id: [1, 2, 3, 4, 5]
run_id: [ 1, 2, 3, 4, 5 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
@@ -252,19 +262,22 @@ jobs:
COMPOSE_PROFILES: ${{ matrix.job.profiles }}
run: |
docker compose -f tests/e2e/compose.yml up -d --quiet-pull
- uses: ./.github/actions/setup-node
- id: cache-web
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
if: contains(matrix.job.profiles, 'selenium')
with:
path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json',
'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
- name: prepare web ui
if: steps.cache-web.outputs.cache-hit != 'true' && contains(matrix.job.profiles, 'selenium')
if: steps.cache-web.outputs.cache-hit != 'true' && contains(matrix.job.profiles,
'selenium')
working-directory: web
run: |
npm ci
npm run build
npm run build:sfe
corepack npm ci
corepack npm run build
corepack npm run build:sfe
- name: run e2e
run: |
uv run coverage run manage.py test ${{ matrix.job.glob }}
@@ -282,18 +295,10 @@ jobs:
fail-fast: false
matrix:
job:
- name: oidc_basic
glob: tests/openid_conformance/test_oidc_basic.py
- name: oidc_implicit
glob: tests/openid_conformance/test_oidc_implicit.py
- name: oidc_rp-initiated
glob: tests/openid_conformance/test_oidc_rp_initiated.py
- name: oidc_frontchannel
glob: tests/openid_conformance/test_oidc_frontchannel.py
- name: oidc_backchannel
glob: tests/openid_conformance/test_oidc_backchannel.py
- name: ssf_transmitter
glob: tests/openid_conformance/test_ssf_transmitter.py
- name: basic
glob: tests/openid_conformance/test_basic.py
- name: implicit
glob: tests/openid_conformance/test_implicit.py
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
@@ -310,14 +315,14 @@ jobs:
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
with:
path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**',
'web/packages/sfe/src/**') }}-b
- name: prepare web ui
if: steps.cache-web.outputs.cache-hit != 'true'
working-directory: web
run: |
npm ci
npm run build
npm run build:sfe
corepack npm ci --prefix web
corepack npm run build --prefix web
corepack npm run build:sfe --prefix web
- name: run conformance
run: |
uv run coverage run manage.py test ${{ matrix.job.glob }}
@@ -383,7 +388,9 @@ jobs:
uses: ./.github/workflows/_reusable-docker-build.yml
secrets: inherit
with:
image_name: ${{ github.repository == 'goauthentik/authentik-internal' && 'ghcr.io/goauthentik/internal-server' || 'ghcr.io/goauthentik/dev-server' }}
image_name: ${{ github.repository == 'goauthentik/authentik-internal' &&
'ghcr.io/goauthentik/internal-server' ||
'ghcr.io/goauthentik/dev-server' }}
release: false
pr-comment:
needs:


@@ -114,8 +114,11 @@ jobs:
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
platforms: linux/amd64,linux/arm64
context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-${{ matrix.type }}:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && format('type=registry,ref=ghcr.io/goauthentik/dev-{0}:buildcache,mode=max', matrix.type) || '' }}
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-${{ matrix.type
}}:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' &&
format('type=registry,ref=ghcr.io/goauthentik/dev-{0}:buildcache,mode=max',
matrix.type) || '' }}
- uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v3
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
@@ -136,8 +139,8 @@ jobs:
- ldap
- radius
- rac
goos: [linux]
goarch: [amd64, arm64]
goos: [ linux ]
goarch: [ amd64, arm64 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
@@ -145,16 +148,11 @@ jobs:
- uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6
with:
go-version-file: "go.mod"
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
working-directory: web
- name: Build web
working-directory: web/
run: |
npm ci
npm run build-proxy
run: corepack npm run build-proxy --prefix web
- name: Build outpost
run: |
set -x


@@ -12,51 +12,280 @@ on:
- main
- version-*
env:
POSTGRES_DB: authentik
POSTGRES_USER: authentik
POSTGRES_PASSWORD: "EK-5jnKfjrGRm<77"
AUTHENTIK_BLUEPRINTS_DIR: "./blueprints"
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST: "true"
# Drives the system/bootstrap.yaml blueprint at startup: creates akadmin with
# these credentials and flips the Setup flag (Setup.set(True)) so the SPA's
# post-login redirect to "/" doesn't bounce through /setup, which would 500
# because the OOBE policy refuses to run once akadmin already has a usable
# password. See authentik/core/setup/signals.py and blueprints/default/flow-oobe.yaml.
AUTHENTIK_BOOTSTRAP_EMAIL: "test-admin@goauthentik.io"
AUTHENTIK_BOOTSTRAP_PASSWORD: "test-runner"
jobs:
lint:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
command:
- lint
- lint:lockfile
- tsc
- prettier-check
project:
- web
include:
- command: tsc
project: web
- command: lit-analyse
project: web
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: ${{ matrix.project }}/package.json
cache: "npm"
cache-dependency-path: ${{ matrix.project }}/package-lock.json
- working-directory: ${{ matrix.project }}/
run: |
npm ci
working-directory: web
- name: Lint
working-directory: ${{ matrix.project }}/
run: npm run ${{ matrix.command }}
run: corepack npm run lint --prefix web
- name: Check types
run: corepack npm run tsc --prefix web
- name: Check formatting
run: corepack npm run prettier-check --prefix web
- name: Lit analyse
run: corepack npm run lit-analyse --prefix web
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
working-directory: web
- name: build
working-directory: web/
run: npm run build
run: corepack npm run build
e2e:
name: e2e (playwright)
runs-on: ubuntu-latest
timeout-minutes: 60
permissions:
contents: read
# Required so the "Comment Playwright result on PR" step can update its
# marker comment via the gh CLI / REST API.
pull-requests: write
# Required so the optional "Upload HTML report to S3" step can mint OIDC
# credentials with aws-actions/configure-aws-credentials. Harmless when
# the upload is gated off.
id-token: write
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
uses: ./.github/actions/setup
with:
dependencies: system,python,node,go,rust,runtime
- name: Build web UI
run: corepack npm run --prefix web build
- name: Build authentik server (Go)
run: | # shell
go build -o ./bin/authentik-server ./cmd/server
sudo install -m 0755 ./bin/authentik-server /usr/local/bin/authentik-server
- name: Build authentik worker (Rust)
run: | # shell
cargo build --release --bin authentik
sudo install -m 0755 ./target/release/authentik /usr/local/bin/authentik
- name: Apply migrations
run: uv run python -m lifecycle.migrate
- name: Resolve Playwright version
id: playwright-version
working-directory: web
run: | # shell
version=$(node -p "require('@playwright/test/package.json').version")
if [ -z "$version" ]; then
echo "Failed to resolve @playwright/test version" >&2
exit 1
fi
echo "version=${version}" >> "$GITHUB_OUTPUT"
- name: Cache Playwright browsers
id: playwright-cache
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
with:
path: ~/.cache/ms-playwright
key: ${{ runner.os }}-playwright-${{ steps.playwright-version.outputs.version }}
- name: Install Playwright browsers
working-directory: web
run: | # shell
if [ "${{ steps.playwright-cache.outputs.cache-hit }}" = "true" ]; then
corepack npm exec -- playwright install-deps chromium
else
corepack npm exec -- playwright install --with-deps chromium
fi
- name: Start authentik server and worker
run: | # shell
set -euo pipefail
mkdir -p /tmp/ak-logs
# The Go server (authentik-server) spawns gunicorn as a child via PATH lookup
# and inherits the env that `uv run` set up for it. Verify gunicorn resolves
# under the same launcher so we fail fast here instead of waiting for an
# empty 200 from the proxy fallback later.
uv run --frozen sh -c 'command -v gunicorn' \
|| { echo "gunicorn not resolvable from uv run"; exit 1; }
uv run ak server > /tmp/ak-logs/server.log 2>&1 &
echo $! > /tmp/ak-logs/server.pid
# The Rust worker also opens an HTTP/metrics server on listen.http /
# listen.metrics (default :9000 / :9300). On a single CI host that races the
# Go server's binds and silently steals :9000, leaving Playwright talking to
# a healthcheck-only axum router that returns 200/empty for /if/* paths.
# Pin the worker to disjoint ports so the Go server keeps the public 9000.
AUTHENTIK_LISTEN__HTTP="[::]:9001" \
AUTHENTIK_LISTEN__METRICS="[::]:9301" \
uv run ak worker > /tmp/ak-logs/worker.log 2>&1 &
echo $! > /tmp/ak-logs/worker.pid
- name: Wait for authentik to be ready
run: | # shell
set -euo pipefail
# Readiness probes must verify the Go server is actually serving the request,
# not just that *something* on :9000 returned 200. The Go proxy stamps
# `X-authentik-version` on its static responses and the rendered flow page
# contains the <ak-flow-executor> custom element — both are absent from the
# worker's axum healthcheck router, so checking either rules out the
# port-collision failure mode.
timeout 240 bash -c '
until curl -fsS -o /dev/null http://localhost:9000/-/health/ready/; do
sleep 2
done'
timeout 300 bash -c '
until curl -fsS http://localhost:9000/if/flow/default-authentication-flow/ \
| grep -q "ak-flow-executor"; do
sleep 3
done'
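The two probes above share one idea: don't accept just any 200 from :9000, poll until the response body carries a marker that only the intended backend emits. A minimal Python sketch of that retry loop, with a fake `fetch` standing in for the `curl` calls (all names here are hypothetical, not part of the workflow):

```python
import time

def wait_for_marker(fetch, marker, timeout=30.0, interval=0.01):
    """Poll `fetch()` until its body contains `marker` or `timeout` elapses.

    `fetch` is any zero-argument callable returning a response body as a
    string (or raising on connection errors) -- a stand-in for curl.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            if marker in fetch():
                return True
        except ConnectionError:
            pass  # server not up yet; keep polling
        time.sleep(interval)
    return False

# Simulate a server that only serves the marker on the third poll.
calls = {"n": 0}

def fake_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("not ready")
    return "<html><ak-flow-executor></ak-flow-executor></html>"

assert wait_for_marker(fake_fetch, "ak-flow-executor")
```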
- name: Run Playwright tests
working-directory: web
env:
AK_TEST_RUNNER_PAGE_URL: http://localhost:9000
run: corepack npm run test:e2e
# Reporting / upload steps below intentionally use `!cancelled()` rather
# than `failure()`: a cancelled run (e.g. superseded by a newer push) is
# not a real result and shouldn't produce reviewer-facing artifacts or
# comments. `if-no-files-found: ignore` keeps the "passed" case quiet.
- name: Upload Playwright HTML report
if: ${{ !cancelled() }}
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: playwright-report
path: web/playwright-report/
retention-days: 14
if-no-files-found: ignore
- name: Upload Playwright traces and videos
if: ${{ !cancelled() }}
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: playwright-traces
path: web/test-results/
retention-days: 14
if-no-files-found: ignore
- name: Upload authentik server and worker logs
if: ${{ !cancelled() }}
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: authentik-logs
path: /tmp/ak-logs/
retention-days: 14
if-no-files-found: ignore
- name: Parse Playwright results
id: playwright-results
if: ${{ !cancelled() }}
run: | # shell
set -euo pipefail
report=web/playwright-report/results.json
if [ ! -f "$report" ]; then
{
echo "available=false"
echo "passed=0"
echo "failed=0"
echo "flaky=0"
echo "skipped=0"
} >> "$GITHUB_OUTPUT"
exit 0
fi
{
echo "available=true"
echo "passed=$(jq -r '.stats.expected // 0' "$report")"
echo "failed=$(jq -r '.stats.unexpected // 0' "$report")"
echo "flaky=$(jq -r '.stats.flaky // 0' "$report")"
echo "skipped=$(jq -r '.stats.skipped // 0' "$report")"
} >> "$GITHUB_OUTPUT"
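The `// 0` fallbacks in the jq filters above are load-bearing: Playwright's JSON reporter omits stats buckets that never fired, so a missing key must read as zero. The same defaulting, sketched in Python against a hand-written sample report (the sample shape is an assumption mirroring `results.json`, not taken from a real run):

```python
import json

sample = json.loads('{"stats": {"expected": 17, "flaky": 1}}')

def stat(report, key):
    # Mirrors jq's `.stats.<key> // 0`: a missing key counts as zero.
    return report.get("stats", {}).get(key, 0)

outputs = {k: stat(sample, k) for k in ("expected", "unexpected", "flaky", "skipped")}
assert outputs == {"expected": 17, "unexpected": 0, "flaky": 1, "skipped": 0}
```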
# The S3 publishing pair below is intentionally gated off until the
# `authentik-playwright-artifacts` bucket is provisioned by infra. Flip
# the repo variable `PLAYWRIGHT_S3_ENABLED=true` to turn it on; the URL
# baked into the PR comment below already points at the eventual key.
- name: Configure AWS credentials for HTML report upload
if: ${{ !cancelled() && github.event_name == 'pull_request' && github.event.pull_request.head.repo.full_name == github.repository && vars.PLAYWRIGHT_S3_ENABLED == 'true' }}
uses: aws-actions/configure-aws-credentials@ec61189d14ec14c8efccab744f656cffd0e33f37 # v6.1.0
with:
role-to-assume: "arn:aws:iam::016170277896:role/github_goauthentik_authentik"
aws-region: eu-central-1
- name: Upload HTML report to S3
if: ${{ !cancelled() && github.event_name == 'pull_request' && github.event.pull_request.head.repo.full_name == github.repository && vars.PLAYWRIGHT_S3_ENABLED == 'true' }}
env:
S3_BUCKET: authentik-playwright-artifacts
S3_KEY_PREFIX: pr-${{ github.event.pull_request.number }}/run-${{ github.run_id }}/attempt-${{ github.run_attempt }}
run: | # shell
set -euo pipefail
if [ ! -d web/playwright-report ]; then
echo "No playwright-report/ produced; skipping S3 upload"
exit 0
fi
aws s3 cp \
--recursive \
--acl=public-read \
--cache-control "public, max-age=600" \
web/playwright-report/ \
"s3://${S3_BUCKET}/${S3_KEY_PREFIX}/"
# Same-repo guard: fork PRs run with a read-only GITHUB_TOKEN (even after
# maintainer approval), so the comment + edit calls would 403. Skip cleanly
# rather than failing the job on every fork PR run.
- name: Comment Playwright result on PR
if: ${{ !cancelled() && github.event_name == 'pull_request' && github.event.pull_request.head.repo.full_name == github.repository }}
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_NUMBER: ${{ github.event.pull_request.number }}
AVAILABLE: ${{ steps.playwright-results.outputs.available }}
PASSED: ${{ steps.playwright-results.outputs.passed }}
FAILED: ${{ steps.playwright-results.outputs.failed }}
FLAKY: ${{ steps.playwright-results.outputs.flaky }}
SKIPPED: ${{ steps.playwright-results.outputs.skipped }}
S3_ENABLED: ${{ vars.PLAYWRIGHT_S3_ENABLED }}
REPORT_URL: https://authentik-playwright-artifacts.s3.amazonaws.com/pr-${{ github.event.pull_request.number }}/run-${{ github.run_id }}/attempt-${{ github.run_attempt }}/index.html
RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/attempts/${{ github.run_attempt }}
run: | # shell
set -euo pipefail
marker='<!-- playwright-result -->'
if [ "$AVAILABLE" = "true" ]; then
if [ "$FAILED" -gt 0 ]; then
status='❌ Failed'
elif [ "$FLAKY" -gt 0 ]; then
status='⚠️ Passed with flakes'
else
status='✅ Passed'
fi
stats=$(printf '| Result | Count |\n|---|---|\n| ✅ Passed | %s |\n| ❌ Failed | %s |\n| ⚠️ Flaky | %s |\n| ⏭️ Skipped | %s |\n' "$PASSED" "$FAILED" "$FLAKY" "$SKIPPED")
else
status='⚠️ No results produced'
stats='The job did not produce `playwright-report/results.json`. The suite likely crashed before the JSON reporter wrote its output — see the workflow run for setup-step failures.'
fi
if [ "$S3_ENABLED" = "true" ]; then
report_line=$(printf '[HTML report](%s) · [Workflow run](%s)' "$REPORT_URL" "$RUN_URL")
else
report_line=$(printf '[Workflow run](%s) · _HTML report hosting is gated off until the `authentik-playwright-artifacts` S3 bucket is provisioned (`vars.PLAYWRIGHT_S3_ENABLED`). Until then, download the `playwright-report` artifact from the run page._' "$RUN_URL")
fi
body=$(printf '%s\n## Playwright e2e — %s\n\n%s\n\n%s\n' "$marker" "$status" "$stats" "$report_line")
existing=$(gh api "repos/${GITHUB_REPOSITORY}/issues/${PR_NUMBER}/comments" \
--paginate \
--jq "[.[] | select(.body != null and (.body | startswith(\"$marker\")))] | .[0].id // empty")
if [ -n "$existing" ]; then
gh api -X PATCH "repos/${GITHUB_REPOSITORY}/issues/comments/${existing}" -f body="$body" > /dev/null
echo "Updated existing comment ${existing}"
else
gh api -X POST "repos/${GITHUB_REPOSITORY}/issues/${PR_NUMBER}/comments" -f body="$body" > /dev/null
echo "Created new playwright-result comment"
fi
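The comment step above is a find-or-create keyed on the hidden `<!-- playwright-result -->` marker: list the PR's comments, PATCH the first one whose body starts with the marker, otherwise POST a new one. The core of that upsert, sketched in Python over plain dicts with no GitHub API involved (the dict shape is illustrative only):

```python
MARKER = "<!-- playwright-result -->"

def upsert_comment(comments, body):
    """Update the first marker comment in place, or append a new one.

    `comments` is a list of {"id", "body"} dicts standing in for the
    issue-comment collection the gh api calls page through.
    """
    for comment in comments:
        if comment["body"].startswith(MARKER):
            comment["body"] = body  # the PATCH path
            return comment["id"]
    comments.append({"id": len(comments) + 1, "body": body})  # the POST path
    return comments[-1]["id"]

thread = [{"id": 1, "body": "unrelated review comment"}]
first = upsert_comment(thread, f"{MARKER}\nrun 1: passed")
second = upsert_comment(thread, f"{MARKER}\nrun 2: failed")
# Both runs land on the same comment; the thread never grows past two.
assert first == 2 and second == 2 and len(thread) == 2
```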
ci-web-mark:
if: always()
needs:
@@ -73,13 +302,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
working-directory: web
- name: test
working-directory: web/
run: npm run test || exit 0
run: corepack npm run test || exit 0


@@ -3,7 +3,7 @@ name: Packages - Publish NPM packages
on:
push:
branches: [main]
branches: [ main ]
paths:
- packages/tsconfig/**
- packages/eslint-config/**
@@ -35,22 +35,19 @@ jobs:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
fetch-depth: 2
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: ${{ matrix.package }}/package.json
registry-url: "https://registry.npmjs.org"
working-directory: ${{ matrix.package }}
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@9426d40962ed5378910ee2e21d5f8c6fcbf2dd96 # 24d32ffd492484c1d75e0c0b894501ddb9d30d62
with:
files: |
${{ matrix.package }}/package.json
- name: Install Dependencies
run: npm ci
- name: Publish package
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ${{ matrix.package }}
run: |
npm ci
npm run build
npm publish
corepack npm ci
corepack npm run build
corepack npm publish


@@ -3,7 +3,7 @@ name: Release - On publish
on:
release:
types: [published, created]
types: [ published, created ]
jobs:
build-server:
@@ -87,11 +87,9 @@ jobs:
- uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6
with:
go-version-file: "go.mod"
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
working-directory: web
- name: Set up QEMU
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0
- name: Set up Docker Buildx
@@ -144,22 +142,16 @@ jobs:
- proxy
- ldap
- radius
goos: [linux, darwin]
goarch: [amd64, arm64]
goos: [ linux, darwin ]
goarch: [ amd64, arm64 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6
with:
go-version-file: "go.mod"
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- name: Install web dependencies
working-directory: web/
run: |
npm ci
working-directory: web
- name: Build web
working-directory: web/
run: |
@@ -175,8 +167,10 @@ jobs:
uses: svenstaro/upload-release-action@29e53e917877a24fad85510ded594ab3c9ca12de # v2
with:
repo_token: ${{ secrets.GITHUB_TOKEN }}
file: ./authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{ matrix.goarch }}
asset_name: authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{ matrix.goarch }}
file: ./authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{
matrix.goarch }}
asset_name: authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{
matrix.goarch }}
tag: ${{ github.ref }}
upload-aws-cfn-template:
permissions:

.gitignore

@@ -14,6 +14,8 @@ media
# Node
node_modules
corepack.tgz
.corepack
.cspellcache
cspell-report.*
@@ -229,11 +231,6 @@ source_docs/
### Golang ###
/vendor/
server
proxy
ldap
rac
radius
### Docker ###
tests/openid_conformance/exports/*.zip

Cargo.lock

@@ -17,6 +17,18 @@ version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "320119579fcad9c21884f5c4861d16174d0e06250625266f50fe6898340abefa"
[[package]]
name = "ahash"
version = "0.8.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a15f179cd60c4584b8a8c596927aadc462e27f2ca70c04e0071964a73ba7a75"
dependencies = [
"cfg-if",
"once_cell",
"version_check",
"zerocopy",
]
[[package]]
name = "aho-corasick"
version = "1.1.4"
@@ -191,7 +203,6 @@ dependencies = [
"tokio",
"tracing",
"uuid",
"which",
]
[[package]]
@@ -1003,17 +1014,6 @@ dependencies = [
"pin-project-lite",
]
[[package]]
name = "evmap"
version = "11.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1b8874945f036109c72242964c1174cf99434e30cfa45bf45fedc983f50046f8"
dependencies = [
"hashbag",
"left-right",
"smallvec",
]
[[package]]
name = "eyre"
version = "0.6.12"
@@ -1230,21 +1230,6 @@ dependencies = [
"slab",
]
[[package]]
name = "generator"
version = "0.8.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "52f04ae4152da20c76fe800fa48659201d5cf627c5149ca0b707b69d7eef6cf9"
dependencies = [
"cc",
"cfg-if",
"libc",
"log",
"rustversion",
"windows-link",
"windows-result",
]
[[package]]
name = "generic-array"
version = "0.14.7"
@@ -1326,12 +1311,6 @@ dependencies = [
"tracing",
]
[[package]]
name = "hashbag"
version = "0.1.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7040a10f52cba493ddb09926e15d10a9d8a28043708a405931fe4c6f19fac064"
[[package]]
name = "hashbrown"
version = "0.15.5"
@@ -1889,17 +1868,6 @@ version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09edd9e8b54e49e587e4f6295a7d29c3ea94d469cb40ab8ca70b288248a81db2"
[[package]]
name = "left-right"
version = "0.11.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0f0c21e4c8ff95f487fb34e6f9182875f42c84cef966d29216bf115d9bba835a"
dependencies = [
"crossbeam-utils",
"loom",
"slab",
]
[[package]]
name = "libc"
version = "0.2.183"
@@ -1971,19 +1939,6 @@ version = "0.4.29"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5e5032e24019045c762d3c0f28f5b6b8bbf38563a65908389bf7978758920897"
[[package]]
name = "loom"
version = "0.7.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "419e0dc8046cb947daa77eb95ae174acfbddb7673b4151f56d1eed8e93fbfaca"
dependencies = [
"cfg-if",
"generator",
"scoped-tls",
"tracing",
"tracing-subscriber",
]
[[package]]
name = "lru-slab"
version = "0.1.2"
@@ -2023,22 +1978,21 @@ checksum = "f8ca58f447f06ed17d5fc4043ce1b10dd205e060fb3ce5b979b8ed8e59ff3f79"
[[package]]
name = "metrics"
version = "0.24.5"
version = "0.24.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ff56c2e7dce6bd462e3b8919986a617027481b1dcc703175b58cf9dd98a2f071"
checksum = "5d5312e9ba3771cfa961b585728215e3d972c950a3eed9252aa093d6301277e8"
dependencies = [
"ahash",
"portable-atomic",
"rapidhash",
]
[[package]]
name = "metrics-exporter-prometheus"
version = "0.18.3"
version = "0.18.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1db0d8f1fc9e62caebd0319e11eaec5822b0186c171568f0480b46a0137f9108"
checksum = "3589659543c04c7dc5526ec858591015b87cd8746583b51b48ef4353f99dbcda"
dependencies = [
"base64 0.22.1",
"evmap",
"indexmap",
"metrics",
"metrics-util",
@@ -2057,7 +2011,7 @@ dependencies = [
"hashbrown 0.16.1",
"metrics",
"quanta",
"rand 0.9.4",
"rand 0.9.2",
"rand_xoshiro",
"sketches-ddsketch",
]
@@ -2744,7 +2698,7 @@ dependencies = [
"bytes",
"getrandom 0.3.4",
"lru-slab",
"rand 0.9.4",
"rand 0.9.2",
"ring",
"rustc-hash",
"rustls",
@@ -2804,9 +2758,9 @@ dependencies = [
[[package]]
name = "rand"
version = "0.9.4"
version = "0.9.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "44c5af06bb1b7d3216d91932aed5265164bf384dc89cd6ba05cf59a35f5f76ea"
checksum = "6db2770f06117d490610c7488547d543617b21bfa07796d7a12f6f1bd53850d1"
dependencies = [
"rand_chacha 0.9.0",
"rand_core 0.9.5",
@@ -2859,15 +2813,6 @@ dependencies = [
"rand_core 0.9.5",
]
[[package]]
name = "rapidhash"
version = "4.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b5e48930979c155e2f33aa36ab3119b5ee81332beb6482199a8ecd6029b80b59"
dependencies = [
"rustversion",
]
[[package]]
name = "raw-cpuid"
version = "11.6.0"
@@ -2926,9 +2871,9 @@ checksum = "dc897dd8d9e8bd1ed8cdad82b5966c3e0ecae09fb1907d58efaa013543185d0a"
[[package]]
name = "reqwest"
version = "0.13.3"
version = "0.13.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "62e0021ea2c22aed41653bc7e1419abb2c97e038ff2c33d0e1309e49a97deec0"
checksum = "ab3f43e3283ab1488b624b44b0e988d0acea0b3214e694730a055cb6b2efa801"
dependencies = [
"base64 0.22.1",
"bytes",
@@ -3160,12 +3105,6 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "scoped-tls"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e1cf6437eb19a8f4a6cc0f7dca544973b0b78843adbfeb3683d1a94a0024a294"
[[package]]
name = "scopeguard"
version = "1.2.0"
@@ -3203,9 +3142,9 @@ checksum = "d767eb0aabc880b29956c35734170f26ed551a859dbd361d140cdbeca61ab1e2"
[[package]]
name = "sentry"
version = "0.48.0"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e8ac94aab850a23d7507307cc505332ed2bafd36c65930dfc5c43610f9e9b477"
checksum = "eb25f439f97d26fea01d717fa626167ceffcd981addaa670001e70505b72acbb"
dependencies = [
"cfg_aliases",
"httpdate",
@@ -3224,9 +3163,9 @@ dependencies = [
[[package]]
name = "sentry-backtrace"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc84c325ace9ca2388e510fe7d6672b5d60cd8b3bd0eb4bb4ee8314c323cd686"
checksum = "46a8c2c1bd5c1f735e84f28b48e7d72efcaafc362b7541bc8253e60e8fcdffc6"
dependencies = [
"backtrace",
"regex",
@@ -3235,9 +3174,9 @@ dependencies = [
[[package]]
name = "sentry-contexts"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "896c1ab62dbfe1746fb262bbf72e6feb2fb9dfb2c14709077bf71beb532e44b2"
checksum = "9b88a90baa654d7f0e1f4b667f6b434293d9f72c71bef16b197c76af5b7d5803"
dependencies = [
"hostname",
"libc",
@@ -3249,11 +3188,11 @@ dependencies = [
[[package]]
name = "sentry-core"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5f5abf20c42cb1593ec1638976e2647da55f79bccac956444c1707b6cce259a"
checksum = "0ac170a5bba8bec6e3339c90432569d89641fa7a3d3e4f44987d24f0762e6adf"
dependencies = [
"rand 0.9.4",
"rand 0.9.2",
"sentry-types",
"serde",
"serde_json",
@@ -3262,9 +3201,9 @@ dependencies = [
[[package]]
name = "sentry-debug-images"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4b88bbe6a760d5724bb40689827e82e8db1e275947df2c59abe171bfc30bb671"
checksum = "dd9646a972b57896d4a92ed200cf76139f8e30b3cfd03b6662ae59926d26633c"
dependencies = [
"findshlibs",
"sentry-core",
@@ -3272,9 +3211,9 @@ dependencies = [
[[package]]
name = "sentry-panic"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0260dcb52562b6a79ae7702312a26dba94b79fb5baee7301087529e5ca4e872e"
checksum = "6127d3d304ba5ce0409401e85aae538e303a569f8dbb031bf64f9ba0f7174346"
dependencies = [
"sentry-backtrace",
"sentry-core",
@@ -3282,9 +3221,9 @@ dependencies = [
[[package]]
name = "sentry-tower"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d669616d5d5279b5712febfc80c343acc3695e499de0d101ed70fceacadf37f2"
checksum = "61c5253dc4ad89863a866b93aeaaac1c9d60f2f774663b5024afe2d57e0a101c"
dependencies = [
"sentry-core",
"tower-layer",
@@ -3293,9 +3232,9 @@ dependencies = [
[[package]]
name = "sentry-tracing"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1c035f3a0a8671ae1a231c5b457abb68b71acba2bf3054dab2a09a9d4ea487e"
checksum = "27701acc51e68db5281802b709010395bfcbcb128b1d0a4e5873680d3b47ff0c"
dependencies = [
"bitflags 2.11.0",
"sentry-backtrace",
@@ -3306,13 +3245,13 @@ dependencies = [
[[package]]
name = "sentry-types"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82d8e81058ec155992191f61c7b29bfa7b2cf12012131e7cdc0678020898a7c9"
checksum = "56780cb5597d676bf22e6c11d1f062eb4def46390ea3bfb047bcbcf7dfd19bdb"
dependencies = [
"debugid",
"hex",
"rand 0.9.4",
"rand 0.9.2",
"serde",
"serde_json",
"thiserror 2.0.18",
@@ -4214,7 +4153,7 @@ dependencies = [
"http",
"httparse",
"log",
"rand 0.9.4",
"rand 0.9.2",
"sha1",
"thiserror 2.0.18",
]
@@ -4576,15 +4515,6 @@ dependencies = [
"rustls-pki-types",
]
[[package]]
name = "which"
version = "8.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "81995fafaaaf6ae47a7d0cc83c67caf92aeb7e5331650ae6ff856f7c0c60c459"
dependencies = [
"libc",
]
[[package]]
name = "whoami"
version = "1.6.1"


@@ -43,15 +43,15 @@ hyper-unix-socket = "= 0.6.1"
hyper-util = "= 0.1.20"
ipnet = { version = "= 2.12.0", features = ["serde"] }
json-subscriber = "= 0.2.8"
metrics = "= 0.24.5"
metrics-exporter-prometheus = { version = "= 0.18.3", default-features = false }
metrics = "= 0.24.3"
metrics-exporter-prometheus = { version = "= 0.18.1", default-features = false }
nix = { version = "= 0.31.2", features = ["hostname", "signal"] }
notify = "= 8.2.0"
pin-project-lite = "= 0.2.17"
pyo3 = "= 0.28.3"
pyo3-build-config = "= 0.28.3"
regex = "= 1.12.3"
reqwest = { version = "= 0.13.3", features = [
reqwest = { version = "= 0.13.2", features = [
"form",
"json",
"multipart",
@@ -67,7 +67,7 @@ reqwest-middleware = { version = "= 0.5.1", features = [
"rustls",
] }
rustls = { version = "= 0.23.40", features = ["fips"] }
sentry = { version = "= 0.48.0", default-features = false, features = [
sentry = { version = "= 0.47.0", default-features = false, features = [
"backtrace",
"contexts",
"debug-images",
@@ -113,7 +113,6 @@ tracing-subscriber = { version = "= 0.3.23", features = [
] }
url = "= 2.5.8"
uuid = { version = "= 1.23.1", features = ["serde", "v4"] }
which = "= 8.0.2"
ak-axum = { package = "authentik-axum", version = "2026.5.0-rc1", path = "./packages/ak-axum" }
ak-client = { package = "authentik-client", version = "2026.5.0-rc1", path = "./packages/client-rust" }
@@ -283,7 +282,6 @@ sqlx = { workspace = true, optional = true }
tokio.workspace = true
tracing.workspace = true
uuid.workspace = true
which.workspace = true
[lints]
workspace = true


@@ -106,14 +106,18 @@ migrate: ## Run the Authentik Django server's migrations
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service
aws-cfn:
cd lifecycle/aws && npm i && $(UV) run npm run aws-cfn
aws-cfn: node-install
corepack npm install --prefix lifecycle/aws
$(UV) run corepack npm run aws-cfn --prefix lifecycle/aws
run: ## Run the main authentik server and worker processes
$(UV) run ak allinone
run-server: ## Run the main authentik server process
$(UV) run ak server
run-watch: ## Run the authentik server and worker, with auto reloading
watchexec --on-busy-update=restart --stop-signal=SIGINT --exts py,rs,go --no-meta --notify -- $(UV) run ak allinone
run-worker: ## Run the main authentik worker process
$(UV) run ak worker
run-worker-watch: ## Run the authentik worker, with auto reloading
watchexec --on-busy-update=restart --stop-signal=SIGINT --exts py,rs --no-meta --notify -- $(UV) run ak worker
core-i18n-extract:
$(UV) run ak makemessages \
@@ -125,7 +129,7 @@ core-i18n-extract:
--ignore website \
-l en
install: node-install docs-install core-install ## Install all required dependencies for `node`, `docs` and `core`
install: node-install web-install core-install ## Install all required dependencies for `node`, `web` and `core`
dev-drop-db:
$(eval pg_user := $(shell $(UV) run python -m authentik.lib.config postgresql.user 2>/dev/null))
@@ -229,38 +233,46 @@ gen-dev-config: ## Generate a local development config file
#########################
node-install: ## Install the necessary libraries to build Node.js packages
npm ci
npm ci --prefix web
node ./scripts/node/setup-corepack.mjs
node ./scripts/node/lint-runtime.mjs
#########################
## Web
#########################
web-build: node-install ## Build the Authentik UI
npm run --prefix web build
web-install: ## Install the necessary libraries to build the Authentik UI
node ./scripts/node/lint-runtime.mjs web
corepack npm ci
corepack npm ci --prefix web
web-build: ## Build the Authentik UI
corepack npm run --prefix web build
web: web-lint-fix web-lint web-check-compile ## Automatically fix formatting issues in the Authentik UI source code, lint the code, and compile it
web-test: ## Run tests for the Authentik UI
npm run --prefix web test
corepack npm run --prefix web test
web-watch: ## Build and watch the Authentik UI for changes, updating automatically
npm run --prefix web watch
corepack npm run --prefix web watch
web-storybook-watch: ## Build and run the storybook documentation server
npm run --prefix web storybook
corepack npm run --prefix web storybook
web-lint-fix:
npm run --prefix web prettier
corepack npm run --prefix web prettier
web-lint:
npm run --prefix web lint
npm run --prefix web lit-analyse
corepack npm run --prefix web lint
corepack npm run --prefix web lit-analyse
web-check-compile:
npm run --prefix web tsc
corepack npm run --prefix web tsc
web-i18n-extract:
npm run --prefix web extract-locales
corepack npm run --prefix web extract-locales
#########################
## Docs
@@ -268,35 +280,40 @@ web-i18n-extract:
docs: docs-lint-fix docs-build ## Automatically fix formatting issues in the Authentik docs source code, lint the code, and compile it
docs-install:
npm ci --prefix website
docs-install: node-install ## Install the necessary libraries to build the Authentik documentation
node ./scripts/node/lint-runtime.mjs
corepack npm ci
corepack npm ci --prefix website
docs-lint-fix: lint-spellcheck
npm run --prefix website prettier
corepack npm run --prefix website prettier
docs-build:
npm run --prefix website build
node ./scripts/node/lint-runtime.mjs website
corepack npm run --prefix website build
docs-watch: ## Build and watch the topics documentation
npm run --prefix website start
corepack npm run --prefix website start
integrations: docs-lint-fix integrations-build ## Fix formatting issues in the integrations source code, lint the code, and compile it
integrations-build:
npm run --prefix website -w integrations build
corepack npm run --prefix website -w integrations build
integrations-watch: ## Build and watch the Integrations documentation
npm run --prefix website -w integrations start
corepack npm run --prefix website -w integrations start
docs-api-build:
npm run --prefix website -w api build
corepack npm run --prefix website -w api build
docs-api-watch: ## Build and watch the API documentation
npm run --prefix website -w api generate
npm run --prefix website -w api start
corepack npm run --prefix website -w api generate
corepack npm run --prefix website -w api start
docs-api-clean: ## Clean generated API documentation
npm run --prefix website -w api build:api:clean
corepack npm run --prefix website -w api build:api:clean
#########################
## Docker


@@ -1,36 +0,0 @@
from django.db.models import F, QuerySet
from rest_framework.filters import OrderingFilter
from rest_framework.request import Request
from rest_framework.views import APIView
class NullsAwareOrderingFilter(OrderingFilter):
"""OrderingFilter that sorts NULL values consistently.
For any nullable field, NULLs are treated as the smallest possible value:
- ascending → NULLs appear first (nulls_first=True)
- descending → NULLs appear last (nulls_last=True)
"""
def _nullable_field_names(self, queryset: QuerySet) -> set[str]:
return {f.name for f in queryset.model._meta.get_fields() if hasattr(f, "null") and f.null}
def filter_queryset(self, request: Request, queryset: QuerySet, view: APIView):
queryset = super().filter_queryset(request, queryset, view)
ordering = queryset.query.order_by
if not ordering:
return queryset
nullable = self._nullable_field_names(queryset)
new_ordering = []
changed = False
for term in ordering:
name = term.lstrip("-")
if name in nullable:
changed = True
if term.startswith("-"):
new_ordering.append(F(name).desc(nulls_last=True))
else:
new_ordering.append(F(name).asc(nulls_first=True))
else:
new_ordering.append(term)
return queryset.order_by(*new_ordering) if changed else queryset
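Stripped of Django, the invariant `NullsAwareOrderingFilter` encodes is simply 'NULL is the smallest value', which yields nulls-first ascending and nulls-last descending from an ordinary sort. A plain-Python sketch of that comparison rule (the `rows` data is made up for illustration):

```python
def null_as_smallest(value):
    # NULL sorts below every real value: (0, ...) < (1, anything).
    return (0, 0) if value is None else (1, value)

rows = [{"last_login": 5}, {"last_login": None}, {"last_login": 2}]

asc = sorted(rows, key=lambda r: null_as_smallest(r["last_login"]))
desc = sorted(rows, key=lambda r: null_as_smallest(r["last_login"]), reverse=True)
assert [r["last_login"] for r in asc] == [None, 2, 5]   # nulls first
assert [r["last_login"] for r in desc] == [5, 2, None]  # nulls last
```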


@@ -1,59 +0,0 @@
from django.db.models import OrderBy
from django.test import TestCase
from rest_framework.request import Request
from rest_framework.test import APIRequestFactory
from authentik.api.ordering import NullsAwareOrderingFilter
from authentik.core.models import Token, User
class MockView:
ordering_fields = "__all__"
ordering = None
class TestNullsAwareOrderingFilter(TestCase):
def setUp(self):
self.filter = NullsAwareOrderingFilter()
self.view = MockView()
factory = APIRequestFactory()
self._req = lambda ordering: Request(factory.get("/", {"ordering": ordering}))
def _order_by(self, model, ordering):
qs = model.objects.all()
return self.filter.filter_queryset(self._req(ordering), qs, self.view).query.order_by
def test_nullable_asc_nulls_first(self):
"""Ascending sort on a nullable field rewrites to nulls_first=True."""
(expr,) = self._order_by(User, "last_login")
self.assertIsInstance(expr, OrderBy)
self.assertFalse(expr.descending)
self.assertTrue(expr.nulls_first)
def test_nullable_desc_nulls_last(self):
"""Descending sort on a nullable field rewrites to nulls_last=True."""
(expr,) = self._order_by(User, "-last_login")
self.assertIsInstance(expr, OrderBy)
self.assertTrue(expr.descending)
self.assertTrue(expr.nulls_last)
def test_non_nullable_passes_through(self):
"""Non-nullable fields are left as plain string terms."""
(expr,) = self._order_by(User, "username")
self.assertEqual(expr, "username")
def test_mixed_ordering(self):
"""Only nullable terms are rewritten; non-nullable terms pass through unchanged."""
first, second = self._order_by(User, "username,-last_login")
self.assertEqual(first, "username")
self.assertIsInstance(second, OrderBy)
self.assertTrue(second.descending)
self.assertTrue(second.nulls_last)
def test_expires_nullable(self):
"""expires on ExpiringModel is nullable and is rewritten correctly."""
(expr,) = self._order_by(Token, "-expires")
self.assertIsInstance(expr, OrderBy)
self.assertTrue(expr.descending)
self.assertTrue(expr.nulls_last)


@@ -1,6 +1,5 @@
"""Serializer mixin for managed models"""
from json import JSONDecodeError, loads
from typing import cast
from django.conf import settings
@@ -45,7 +44,6 @@ class BlueprintUploadSerializer(PassiveSerializer):
file = FileField(required=False)
path = CharField(required=False)
context = CharField(required=False, allow_blank=True)
def validate_path(self, path: str) -> str:
"""Ensure the path (if set) specified is retrievable"""
@@ -56,18 +54,6 @@ class BlueprintUploadSerializer(PassiveSerializer):
raise ValidationError(_("Blueprint file does not exist"))
return path
def validate_context(self, context: str) -> dict:
"""Parse context as a JSON object"""
if not context:
return {}
try:
parsed = loads(context)
except JSONDecodeError as exc:
raise ValidationError(_("Context must be valid JSON")) from exc
if not isinstance(parsed, dict):
raise ValidationError(_("Context must be a JSON object"))
return parsed
class ManagedSerializer:
"""Managed Serializer"""
@@ -238,8 +224,7 @@ class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet):
).retrieve_file()
else:
raise ValidationError("Either path or file must be set")
context = body.validated_data.get("context") or {}
importer = Importer.from_string(string_contents, context)
importer = Importer.from_string(string_contents)
check_blueprint_perms(importer.blueprint, request.user)


@@ -1,6 +1,6 @@
"""Test blueprints v1 api"""
from json import dumps, loads
from json import loads
from tempfile import NamedTemporaryFile, mkdtemp
from django.urls import reverse
@@ -8,11 +8,7 @@ from rest_framework.test import APITestCase
from yaml import dump
from authentik.core.tests.utils import create_test_admin_user
from authentik.flows.models import Flow
from authentik.lib.config import CONFIG
from authentik.lib.generators import generate_id
from authentik.stages.invitation.models import InvitationStage
from authentik.stages.user_write.models import UserWriteStage
TMP = mkdtemp("authentik-blueprints")
@@ -84,107 +80,3 @@ class TestBlueprintsV1API(APITestCase):
res.content.decode(),
{"content": ["Failed to validate blueprint", "- Invalid blueprint version"]},
)
def test_api_import_with_context(self):
"""Test that the import endpoint applies the supplied context to the real blueprint"""
slug = f"invitation-enrollment-{generate_id()}"
flow_name = f"Invitation Enrollment {generate_id()}"
stage_name = f"invitation-stage-{generate_id()}"
user_type = "internal"
continue_without_invitation = True
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": dumps(
{
"flow_slug": slug,
"flow_name": flow_name,
"stage_name": stage_name,
"continue_flow_without_invitation": continue_without_invitation,
"user_type": user_type,
}
),
},
format="multipart",
)
self.assertEqual(res.status_code, 200)
self.assertTrue(res.json()["success"])
flow = Flow.objects.get(slug=slug)
self.assertEqual(flow.name, flow_name)
self.assertEqual(flow.title, flow_name)
invitation_stage = InvitationStage.objects.get(name=stage_name)
self.assertEqual(
invitation_stage.continue_flow_without_invitation,
continue_without_invitation,
)
user_write_stage = UserWriteStage.objects.get(
name=f"invitation-enrollment-user-write-{slug}"
)
self.assertEqual(user_write_stage.user_type, user_type)
self.assertEqual(user_write_stage.user_path_template, f"users/{user_type}")
def test_api_import_blank_path(self):
"""Validator returns empty path unchanged (covers api.py:53)."""
with NamedTemporaryFile(mode="w+", suffix=".yaml") as file:
file.write(dump({"version": 1, "entries": []}))
file.flush()
file.seek(0)
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={"path": "", "file": file},
format="multipart",
)
self.assertEqual(res.status_code, 200)
def test_api_import_unknown_path(self):
"""Path not in available blueprints is rejected (covers api.py:56)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={"path": "does/not/exist.yaml"},
format="multipart",
)
self.assertEqual(res.status_code, 400)
self.assertIn("Blueprint file does not exist", res.content.decode())
def test_api_import_blank_context(self):
"""Blank context is normalized to empty dict (covers api.py:62)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": "",
},
format="multipart",
)
self.assertEqual(res.status_code, 200)
def test_api_import_invalid_json_context(self):
"""Malformed JSON context raises ValidationError (covers api.py:65-66)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": "{not json",
},
format="multipart",
)
self.assertEqual(res.status_code, 400)
self.assertIn("Context must be valid JSON", res.content.decode())
def test_api_import_non_object_context(self):
"""JSON context that isn't an object is rejected (covers api.py:68)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": "[1, 2, 3]",
},
format="multipart",
)
self.assertEqual(res.status_code, 400)
self.assertIn("Context must be a JSON object", res.content.decode())

View File

@@ -32,19 +32,19 @@ from authentik.rbac.decorators import permission_required
class UserAgentDeviceDict(TypedDict):
"""User agent device"""
brand: str | None = None
brand: str
family: str
model: str | None = None
model: str
class UserAgentOSDict(TypedDict):
"""User agent os"""
family: str
major: str | None = None
minor: str | None = None
patch: str | None = None
patch_minor: str | None = None
major: str
minor: str
patch: str
patch_minor: str
class UserAgentBrowserDict(TypedDict):

View File

@@ -1,5 +1,6 @@
"""authentik core signals"""
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.contrib.auth.signals import user_logged_in
from django.core.cache import cache
@@ -58,7 +59,7 @@ def user_logged_in_session(sender, request: HttpRequest, user: User, **_):
layer = get_channel_layer()
device_cookie = request.COOKIES.get("authentik_device")
if device_cookie:
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_device_group(device_cookie),
{"type": "event.session.authenticated"},
)

View File

@@ -1,7 +1,6 @@
# Generated by Django 5.2.12 on 2026-04-04 16:58
from django.db import migrations, models
import django.contrib.postgres.fields
class Migration(migrations.Migration):
@@ -41,109 +40,4 @@ class Migration(migrations.Migration):
]
),
),
migrations.AlterField(
model_name="stream",
name="events_requested",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.TextField(
choices=[
(
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
"Caep Session Revoked",
),
(
"https://schemas.openid.net/secevent/caep/event-type/token-claims-change",
"Caep Token Claims Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"Caep Credential Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/assurance-level-change",
"Caep Assurance Level Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/device-compliance-change",
"Caep Device Compliance Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-established",
"Caep Session Established",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-presented",
"Caep Session Presented",
),
(
"https://schemas.openid.net/secevent/caep/event-type/risk-level-change",
"Caep Risk Level Change",
),
(
"https://schemas.openid.net/secevent/ssf/event-type/verification",
"Set Verification",
),
]
),
default=list,
size=None,
),
),
migrations.AlterField(
model_name="stream",
name="status",
field=models.TextField(
choices=[
("enabled", "Enabled"),
("paused", "Paused"),
("disabled", "Disabled"),
("disabled_deleted", "Disabled Deleted"),
],
default="enabled",
),
),
migrations.AlterField(
model_name="streamevent",
name="type",
field=models.TextField(
choices=[
(
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
"Caep Session Revoked",
),
(
"https://schemas.openid.net/secevent/caep/event-type/token-claims-change",
"Caep Token Claims Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"Caep Credential Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/assurance-level-change",
"Caep Assurance Level Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/device-compliance-change",
"Caep Device Compliance Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-established",
"Caep Session Established",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-presented",
"Caep Session Presented",
),
(
"https://schemas.openid.net/secevent/caep/event-type/risk-level-change",
"Caep Risk Level Change",
),
(
"https://schemas.openid.net/secevent/ssf/event-type/verification",
"Set Verification",
),
]
),
),
]

View File

@@ -24,31 +24,8 @@ class EventTypes(models.TextChoices):
"""SSF Event types supported by authentik"""
CAEP_SESSION_REVOKED = "https://schemas.openid.net/secevent/caep/event-type/session-revoked"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.1"""
CAEP_TOKEN_CLAIMS_CHANGE = (
"https://schemas.openid.net/secevent/caep/event-type/token-claims-change"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.2"""
CAEP_CREDENTIAL_CHANGE = "https://schemas.openid.net/secevent/caep/event-type/credential-change"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.3"""
CAEP_ASSURANCE_LEVEL_CHANGE = (
"https://schemas.openid.net/secevent/caep/event-type/assurance-level-change"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.4"""
CAEP_DEVICE_COMPLIANCE_CHANGE = (
"https://schemas.openid.net/secevent/caep/event-type/device-compliance-change"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.5"""
CAEP_SESSION_ESTABLISHED = (
"https://schemas.openid.net/secevent/caep/event-type/session-established"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.6"""
CAEP_SESSION_PRESENTED = "https://schemas.openid.net/secevent/caep/event-type/session-presented"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.7"""
CAEP_RISK_LEVEL_CHANGE = "https://schemas.openid.net/secevent/caep/event-type/risk-level-change"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.8"""
SET_VERIFICATION = "https://schemas.openid.net/secevent/ssf/event-type/verification"
"""https://openid.net/specs/openid-sharedsignals-framework-1_0.html#section-8.1.4.1"""
class DeliveryMethods(models.TextChoices):
@@ -69,12 +46,10 @@ class SSFEventStatus(models.TextChoices):
class StreamStatus(models.TextChoices):
"""SSF Stream status"""
ENABLED = "enabled"
PAUSED = "paused"
DISABLED = "disabled"
DISABLED_DELETED = "disabled_deleted"
class SSFProvider(TasksModel, BackchannelProvider):

View File

@@ -108,13 +108,13 @@ def send_ssf_event(stream_uuid: UUID, event_data: dict[str, Any]):
event.save()
self.info("Event successfully sent", status=response.status_code)
# Cleanup: if this was the last pending message for this stream and it has been deleted
# (status=StreamStatus.DISABLED_DELETED), then we can delete the stream
# (status=StreamStatus.DISABLED), then we can delete the stream
if (
not StreamEvent.objects.filter(
stream=stream,
status__in=[SSFEventStatus.PENDING_FAILED, SSFEventStatus.PENDING_NEW],
).exists()
and stream.status == StreamStatus.DISABLED_DELETED
and stream.status == StreamStatus.DISABLED
):
LOGGER.info(
"Deleting inactive stream as all pending messages were sent.", stream=stream

View File

@@ -62,7 +62,7 @@ class TestSSFAuth(APITestCase):
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {}},
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_stream_add_oidc(self):
@@ -115,7 +115,7 @@ class TestSSFAuth(APITestCase):
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {}},
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_token_invalid(self):

View File

@@ -54,7 +54,7 @@ class TestStream(APITestCase):
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {}},
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_stream_add_poll(self):
@@ -96,7 +96,7 @@ class TestStream(APITestCase):
)
self.assertEqual(res.status_code, 204)
stream.refresh_from_db()
self.assertEqual(stream.status, StreamStatus.DISABLED_DELETED)
self.assertEqual(stream.status, StreamStatus.DISABLED)
def test_stream_get(self):
"""get stream"""
@@ -225,26 +225,3 @@ class TestStream(APITestCase):
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 404)
def test_stream_status_update(self):
stream = Stream.objects.create(provider=self.provider)
res = self.client.post(
reverse(
"authentik_providers_ssf:stream-status",
kwargs={"application_slug": self.application.slug},
),
data={
"stream_id": str(stream.pk),
"status": StreamStatus.DISABLED,
},
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 200)
stream.refresh_from_db()
self.assertJSONEqual(
res.content,
{
"stream_id": str(stream.pk),
"status": str(stream.status),
},
)

View File

@@ -33,7 +33,7 @@ class TestTasks(APITestCase):
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:
@@ -46,7 +46,7 @@ class TestTasks(APITestCase):
)
jwt = decode_complete(mocker.request_history[0].body, options={"verify_signature": False})
self.assertEqual(jwt["header"]["typ"], "secevent+jwt")
self.assertEqual(jwt["payload"]["events"][EventTypes.SET_VERIFICATION], {})
self.assertIsNone(jwt["payload"]["events"][EventTypes.SET_VERIFICATION]["state"])
def test_push_auth(self):
auth = generate_id()
@@ -58,7 +58,7 @@ class TestTasks(APITestCase):
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:
@@ -72,7 +72,7 @@ class TestTasks(APITestCase):
)
jwt = decode_complete(mocker.request_history[0].body, options={"verify_signature": False})
self.assertEqual(jwt["header"]["typ"], "secevent+jwt")
self.assertEqual(jwt["payload"]["events"][EventTypes.SET_VERIFICATION], {})
self.assertIsNone(jwt["payload"]["events"][EventTypes.SET_VERIFICATION]["state"])
def test_push_stream_disable(self):
auth = generate_id()
@@ -81,11 +81,11 @@ class TestTasks(APITestCase):
delivery_method=DeliveryMethods.RFC_PUSH,
endpoint_url="http://localhost/ssf-push",
authorization_header=auth,
status=StreamStatus.DISABLED_DELETED,
status=StreamStatus.DISABLED,
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:
@@ -95,7 +95,7 @@ class TestTasks(APITestCase):
).get_result(block=True, timeout=1)
jwt = decode_complete(mocker.request_history[0].body, options={"verify_signature": False})
self.assertEqual(jwt["header"]["typ"], "secevent+jwt")
self.assertEqual(jwt["payload"]["events"][EventTypes.SET_VERIFICATION], {})
self.assertIsNone(jwt["payload"]["events"][EventTypes.SET_VERIFICATION]["state"])
self.assertFalse(Stream.objects.filter(pk=stream.pk).exists())
def test_push_error(self):
@@ -106,7 +106,7 @@ class TestTasks(APITestCase):
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:

View File

@@ -24,10 +24,10 @@ class SSFView(APIView):
class SSFStreamView(SSFView):
def get_object(self) -> Stream:
streams = Stream.objects.filter(provider=self.provider).exclude(
status=StreamStatus.DISABLED_DELETED
)
def get_object(self, any_status=False) -> Stream:
streams = Stream.objects.filter(provider=self.provider)
if not any_status:
streams = streams.filter(status__in=[StreamStatus.ENABLED, StreamStatus.PAUSED])
if "stream_id" in self.request.query_params:
streams = streams.filter(pk=self.request.query_params["stream_id"])
if "stream_id" in self.request.data:

View File

@@ -1,6 +1,6 @@
from uuid import uuid4
from django.http import Http404, HttpRequest
from django.http import HttpRequest
from django.urls import reverse
from rest_framework.exceptions import PermissionDenied, ValidationError
from rest_framework.fields import CharField, ChoiceField, ListField, SerializerMethodField
@@ -106,11 +106,7 @@ class StreamResponseSerializer(PassiveSerializer):
}
def get_events_supported(self, instance: Stream) -> list[str]:
return [
EventTypes.CAEP_SESSION_REVOKED,
EventTypes.CAEP_CREDENTIAL_CHANGE,
EventTypes.SET_VERIFICATION,
]
return [x.value for x in EventTypes]
class StreamView(SSFStreamView):
@@ -132,9 +128,10 @@ class StreamView(SSFStreamView):
LOGGER.info("Sending verification event", stream=instance)
send_ssf_events(
EventTypes.SET_VERIFICATION,
{},
{
"state": None,
},
stream_filter={"pk": instance.uuid},
request=request,
sub_id={"format": "opaque", "id": str(instance.uuid)},
)
response = StreamResponseSerializer(instance=instance, context={"request": request}).data
@@ -162,9 +159,7 @@ class StreamView(SSFStreamView):
def delete(self, request: Request, *args, **kwargs) -> Response:
stream = self.get_object()
if stream.status == StreamStatus.DISABLED_DELETED:
raise Http404
stream.status = StreamStatus.DISABLED_DELETED
stream.status = StreamStatus.DISABLED
stream.save()
return Response(status=204)
@@ -180,7 +175,6 @@ class StreamVerifyView(SSFStreamView):
"state": state,
},
stream_filter={"pk": stream.uuid},
request=request,
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
return Response(status=204)
@@ -188,25 +182,8 @@ class StreamVerifyView(SSFStreamView):
class StreamStatusView(SSFStreamView):
class StreamStatusSerializer(PassiveSerializer):
stream_id = CharField()
status = ChoiceField(choices=StreamStatus.choices)
def get(self, request: Request, *args, **kwargs):
stream = self.get_object()
return Response(
{
"stream_id": str(stream.pk),
"status": str(stream.status),
}
)
def post(self, request: Request, *args, **kwargs):
stream = self.get_object()
serializer = self.StreamStatusSerializer(stream, data=request.data)
serializer.is_valid(raise_exception=True)
stream.status = serializer.validated_data["status"]
stream.save()
stream = self.get_object(any_status=True)
return Response(
{
"stream_id": str(stream.pk),

View File

@@ -8,6 +8,7 @@ from inspect import currentframe
from typing import Any
from uuid import uuid4
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.apps import apps
from django.db import models
@@ -409,7 +410,7 @@ class NotificationTransport(TasksModel, SerializerModel):
)
notification.save()
layer = get_channel_layer()
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_user_group(notification.user),
{
"type": "event.notification",

View File

@@ -29,7 +29,6 @@ class RefreshOtherFlowsAfterAuthentication(Flag[bool], key="flows_refresh_others
default = False
visibility = "public"
description = _("Refresh other tabs after successful authentication.")
deprecated = True
class ContinuousLogin(Flag[bool], key="flows_continuous_login"):

View File

@@ -53,16 +53,6 @@ class TestEndSessionView(OAuthTestCase):
self.brand.flow_invalidation = self.invalidation_flow
self.brand.save()
def _id_token_hint(self, host: str) -> str:
"""Issue a valid id_token_hint for the test provider under the given host."""
return self.provider.encode(
{
"iss": f"http://{host}/application/o/{self.app.slug}/",
"aud": self.provider.client_id,
"sub": str(self.user.pk),
}
)
def test_post_logout_redirect_uri_strict_match(self):
"""Test strict URI matching redirects to flow"""
self.client.force_login(self.user)
@@ -71,10 +61,7 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": "http://testserver/logout",
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": "http://testserver/logout"},
HTTP_HOST=self.brand.domain,
)
# Should redirect to the invalidation flow
@@ -82,12 +69,7 @@ class TestEndSessionView(OAuthTestCase):
self.assertIn(self.invalidation_flow.slug, response.url)
def test_post_logout_redirect_uri_strict_no_match(self):
"""Test strict URI not matching returns an error and does not start logout flow.
Required by OIDC RP-Initiated Logout 1.0: on an unregistered
post_logout_redirect_uri, the OP MUST NOT redirect and MUST NOT proceed with
logout that targets the RP.
"""
"""Test strict URI not matching still proceeds with flow (no redirect URI in context)"""
self.client.force_login(self.user)
invalid_uri = "http://testserver/other"
response = self.client.get(
@@ -95,14 +77,12 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": invalid_uri,
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": invalid_uri},
HTTP_HOST=self.brand.domain,
)
self.assertEqual(response.status_code, 400)
self.assertNotIn(invalid_uri, response.content.decode())
# Should still redirect to flow, but invalid URI should not be in response
self.assertEqual(response.status_code, 302)
self.assertNotIn(invalid_uri, response.url)
def test_post_logout_redirect_uri_regex_match(self):
"""Test regex URI matching redirects to flow"""
@@ -112,10 +92,7 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": "https://app.example.com/logout",
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": "https://app.example.com/logout"},
HTTP_HOST=self.brand.domain,
)
# Should redirect to the invalidation flow
@@ -123,7 +100,7 @@ class TestEndSessionView(OAuthTestCase):
self.assertIn(self.invalidation_flow.slug, response.url)
def test_post_logout_redirect_uri_regex_no_match(self):
"""Test regex URI not matching returns an error and does not start logout flow."""
"""Test regex URI not matching"""
self.client.force_login(self.user)
invalid_uri = "https://malicious.com/logout"
response = self.client.get(
@@ -131,14 +108,12 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": invalid_uri,
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": invalid_uri},
HTTP_HOST=self.brand.domain,
)
self.assertEqual(response.status_code, 400)
self.assertNotIn(invalid_uri, response.content.decode())
# Should still proceed to flow, but invalid URI should not be in response
self.assertEqual(response.status_code, 302)
self.assertNotIn(invalid_uri, response.url)
def test_state_parameter_appended_to_uri(self):
"""Test state parameter is appended to validated redirect URI"""
@@ -148,7 +123,6 @@ class TestEndSessionView(OAuthTestCase):
{
"post_logout_redirect_uri": "http://testserver/logout",
"state": "test-state-123",
"id_token_hint": self._id_token_hint("testserver"),
},
)
request.user = self.user
@@ -158,7 +132,6 @@ class TestEndSessionView(OAuthTestCase):
view.request = request
view.kwargs = {"application_slug": self.app.slug}
view.resolve_provider_application()
view.validate()
self.assertIn("state=test-state-123", view.post_logout_redirect_uri)
@@ -173,7 +146,6 @@ class TestEndSessionView(OAuthTestCase):
{
"post_logout_redirect_uri": "http://testserver/logout",
"state": "xyz789",
"id_token_hint": self._id_token_hint(self.brand.domain),
},
HTTP_HOST=self.brand.domain,
)

View File

@@ -5,8 +5,6 @@ from urllib.parse import quote, urlparse
from django.http import Http404, HttpRequest, HttpResponse
from django.shortcuts import get_object_or_404
from jwt import PyJWTError
from jwt import decode as jwt_decode
from authentik.common.oauth.constants import (
FORBIDDEN_URI_SCHEMES,
@@ -23,14 +21,11 @@ from authentik.flows.planner import (
from authentik.flows.stage import SessionEndStage
from authentik.flows.views.executor import SESSION_KEY_PLAN
from authentik.lib.views import bad_request_message
from authentik.policies.views import PolicyAccessView
from authentik.policies.views import PolicyAccessView, RequestValidationError
from authentik.providers.iframe_logout import IframeLogoutStageView
from authentik.providers.oauth2.errors import TokenError
from authentik.providers.oauth2.models import (
AccessToken,
JWTAlgorithms,
OAuth2LogoutMethod,
OAuth2Provider,
RedirectURIMatchingMode,
)
from authentik.providers.oauth2.tasks import send_backchannel_logout_request
@@ -52,45 +47,21 @@ class EndSessionView(PolicyAccessView):
if not self.flow:
raise Http404
def validate(self):
# Parse end session parameters
query_dict = self.request.POST if self.request.method == "POST" else self.request.GET
state = query_dict.get("state")
request_redirect_uri = query_dict.get("post_logout_redirect_uri")
id_token_hint = query_dict.get("id_token_hint")
self.post_logout_redirect_uri = None
# OIDC Certification: Verify id_token_hint. If invalid or missing, throw an error
if id_token_hint:
# Load a fresh provider instance that's not part of the flow
# since it'll have the cryptography Certificate that can't be pickled
provider = OAuth2Provider.objects.get(pk=self.provider.pk)
key, alg = provider.jwt_key
if alg != JWTAlgorithms.HS256:
key = provider.signing_key.public_key
try:
jwt_decode(
id_token_hint,
key,
algorithms=[alg],
audience=provider.client_id,
issuer=provider.get_issuer(self.request),
# ID Tokens are short-lived; a logout request arriving
# after expiry is still legitimate and must succeed.
options={"verify_exp": False},
)
except PyJWTError:
raise TokenError("invalid_request").with_cause(
"id_token_hint_decode_failed"
) from None
# Validate post_logout_redirect_uri against registered URIs
if request_redirect_uri:
# OIDC Certification: id_token_hint required with post_logout_redirect_uri
if not id_token_hint:
raise TokenError("invalid_request").with_cause("id_token_hint_missing")
if urlparse(request_redirect_uri).scheme in FORBIDDEN_URI_SCHEMES:
raise TokenError("invalid_request").with_cause("post_logout_redirect_uri")
raise RequestValidationError(
bad_request_message(
self.request,
"Forbidden URI scheme in post_logout_redirect_uri",
)
)
for allowed in self.provider.post_logout_redirect_uris:
if allowed.matching_mode == RedirectURIMatchingMode.STRICT:
if request_redirect_uri == allowed.url:
@@ -100,10 +71,6 @@ class EndSessionView(PolicyAccessView):
if fullmatch(allowed.url, request_redirect_uri):
self.post_logout_redirect_uri = request_redirect_uri
break
# OIDC Certification: OP MUST NOT perform post-logout redirection
# if the supplied URI does not exactly match a registered one
if self.post_logout_redirect_uri is None:
raise TokenError("invalid_request").with_cause("invalid_post_logout_redirect_uri")
# Append state to the redirect URI if both are present
if self.post_logout_redirect_uri and state:
@@ -124,43 +91,50 @@ class EndSessionView(PolicyAccessView):
"<html><body>Logout successful</body></html>", content_type="text/html", status=200
)
# Otherwise, continue with normal policy checks
return super().dispatch(request, *args, **kwargs)
def get(self, request: HttpRequest, *args, **kwargs) -> HttpResponse:
"""Dispatch the flow planner for the invalidation flow"""
try:
self.validate()
except TokenError as exc:
return bad_request_message(
self.request,
exc.description,
)
planner = FlowPlanner(self.flow)
planner.allow_empty_flows = True
# Build flow context with logout parameters
context = {
PLAN_CONTEXT_APPLICATION: self.application,
}
# Get session info for logout notifications and token invalidation
auth_session = AuthenticatedSession.from_request(request, request.user)
# Add validated redirect URI (with state appended) to context if available
if self.post_logout_redirect_uri:
context[PLAN_CONTEXT_POST_LOGOUT_REDIRECT_URI] = self.post_logout_redirect_uri
# Invalidate tokens for this provider/session (RP-initiated logout:
# user stays logged into authentik, only this provider's tokens are revoked)
if request.user.is_authenticated and auth_session:
AccessToken.objects.filter(
user=request.user,
provider=self.provider,
session=auth_session,
).delete()
session_key = (
auth_session.session.session_key if auth_session and auth_session.session else None
)
# Handle frontchannel logout
frontchannel_logout_url = None
if self.provider.logout_method == OAuth2LogoutMethod.FRONTCHANNEL:
frontchannel_logout_url = build_frontchannel_logout_url(
self.provider, request, session_key
)
# Handle backchannel logout
if (
self.provider.logout_method == OAuth2LogoutMethod.BACKCHANNEL
and self.provider.logout_uri
):
# Find access token to get iss and sub for the logout token
access_token = AccessToken.objects.filter(
user=request.user,
provider=self.provider,
@@ -189,16 +163,9 @@ class EndSessionView(PolicyAccessView):
}
]
access_tokens = AccessToken.objects.filter(
user=request.user,
provider=self.provider,
)
if auth_session:
access_tokens = access_tokens.filter(session=auth_session)
access_tokens.delete()
plan = planner.plan(request, context)
# Inject iframe logout stage if frontchannel logout is configured
if frontchannel_logout_url:
plan.insert_stage(in_memory_stage(IframeLogoutStageView))

View File

@@ -1,5 +1,6 @@
"""RAC Signals"""
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.core.cache import cache
from django.db.models.signals import post_delete, post_save, pre_delete
@@ -17,7 +18,7 @@ from authentik.providers.rac.models import ConnectionToken, Endpoint
@receiver(pre_delete, sender=AuthenticatedSession)
def user_session_deleted(sender, instance: AuthenticatedSession, **_):
layer = get_channel_layer()
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_rac_client_group_session(instance.session.session_key),
{"type": "event.disconnect", "reason": "session_logout"},
)
@@ -27,7 +28,7 @@ def user_session_deleted(sender, instance: AuthenticatedSession, **_):
def pre_delete_connection_token_disconnect(sender, instance: ConnectionToken, **_):
"""Disconnect session when connection token is deleted"""
layer = get_channel_layer()
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_rac_client_group_token(instance.token),
{"type": "event.disconnect", "reason": "token_delete"},
)

View File

@@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0056_user_roles"), # must run before group field is removed
("authentik_rbac", "0009_remove_initialpermissions_mode"),
]

View File

@@ -187,7 +187,6 @@ SPECTACULAR_SETTINGS = {
"PolicyEngineMode": "authentik.policies.models.PolicyEngineMode",
"PromptTypeEnum": "authentik.stages.prompt.models.FieldTypes",
"ProxyMode": "authentik.providers.proxy.models.ProxyMode",
"RedirectURITypeEnum": "authentik.providers.oauth2.models.RedirectURIType",
"SAMLBindingsEnum": "authentik.providers.saml.models.SAMLBindings",
"SAMLLogoutMethods": "authentik.providers.saml.models.SAMLLogoutMethods",
"SAMLNameIDPolicyEnum": "authentik.sources.saml.models.SAMLNameIDPolicy",
@@ -221,7 +220,7 @@ REST_FRAMEWORK = {
"authentik.api.search.ql.QLSearch",
"authentik.rbac.filters.ObjectFilter",
"django_filters.rest_framework.DjangoFilterBackend",
"authentik.api.ordering.NullsAwareOrderingFilter",
"rest_framework.filters.OrderingFilter",
],
"DEFAULT_PERMISSION_CLASSES": ("authentik.rbac.permissions.ObjectPermissions",),
"DEFAULT_AUTHENTICATION_CLASSES": (

File diff suppressed because one or more lines are too long

View File

@@ -16,7 +16,7 @@ class RedirectMode(models.TextChoices):
class RedirectStage(Stage):
"""Redirect the user to a static URL or another flow, optionally with all gathered context."""
"""Redirect the user to another flow, potentially with all gathered context."""
keep_context = models.BooleanField(default=True)
mode = models.TextField(choices=RedirectMode.choices)

View File

@@ -7,7 +7,7 @@ from dramatiq.broker import Broker, MessageProxy, get_broker
from dramatiq.middleware.middleware import Middleware
from dramatiq.middleware.retries import Retries
from dramatiq.results.middleware import Results
from dramatiq.worker import ConsumerThread, Worker, WorkerThread
from dramatiq.worker import Worker, _ConsumerThread, _WorkerThread
from authentik.tasks.broker import PostgresBroker
@@ -20,7 +20,7 @@ class TestWorker(Worker):
self.worker_id = 1000
self.work_queue = PriorityQueue()
self.consumers = {
TESTING_QUEUE: ConsumerThread(
TESTING_QUEUE: _ConsumerThread(
broker=self.broker,
queue_name=TESTING_QUEUE,
prefetch=2,
@@ -33,7 +33,7 @@ class TestWorker(Worker):
prefetch=2,
timeout=1,
)
self._worker = WorkerThread(
self._worker = _WorkerThread(
broker=self.broker,
consumers=self.consumers,
work_queue=self.work_queue,
@@ -78,18 +78,17 @@ def use_test_broker():
actor.broker = broker
actor.broker.declare_actor(actor)
for middleware_class_path, middleware_kwargs in Conf().middlewares:
middleware_class = import_string(middleware_class_path)
if issubclass(middleware_class, Results):
middleware_kwargs["backend"] = import_string(Conf().result_backend)(
*Conf().result_backend_args,
**Conf().result_backend_kwargs,
)
middleware: Middleware = middleware_class(
for middleware_class, middleware_kwargs in Conf().middlewares:
middleware: Middleware = import_string(middleware_class)(
**middleware_kwargs,
)
if isinstance(middleware, Retries):
middleware.max_retries = 0
if isinstance(middleware, Results):
middleware.backend = import_string(Conf().result_backend)(
*Conf().result_backend_args,
**Conf().result_backend_kwargs,
)
broker.add_middleware(middleware)
broker.start()

View File

@@ -59,8 +59,6 @@ class FlagsJSONExtension(OpenApiSerializerFieldExtension):
props[_flag.key] = build_basic_type(get_args(_flag.__orig_bases__[0])[0])
if _flag.description:
props[_flag.key]["description"] = _flag.description
if _flag.deprecated:
props[_flag.key]["deprecated"] = _flag.deprecated
return build_object_type(props, required=props.keys())

View File

@@ -18,7 +18,6 @@ class Flag[T]:
Literal["none"] | Literal["public"] | Literal["authenticated"] | Literal["system"]
) = "none"
description: str | None = None
deprecated = False
def __init_subclass__(cls, key: str, **kwargs):
cls.__key = key

View File

@@ -1,211 +0,0 @@
# Minimal Invitation-based Enrollment Blueprint
#
# Companion to flows-invitation-enrollment.yaml, intended for the "New Invitation"
# wizard in the admin UI. Creates a single enrollment flow with an invitation stage
# bound to it, plus the supporting prompt/user-write/user-login stages.
#
# All user-facing fields are parameterized via !Context with fallback defaults, so
# this blueprint can be imported directly (without context) or through the wizard
# with custom values.
#
# Context keys (all optional):
# flow_name Display name of the enrollment flow.
# flow_slug URL slug of the flow and suffix for sub-entity
# identifiers (so repeated imports with different
# slugs don't overwrite each other).
# stage_name Name of the invitation stage.
# continue_flow_without_invitation Whether the flow continues when no invitation
# is supplied (default: false).
# user_type "external" or "internal" (default: "external").
# Drives the user-write stage's user_type and
# user_path_template.
version: 1
metadata:
labels:
blueprints.goauthentik.io/instantiate: "false"
name: Invitation-based Enrollment (minimal)
entries:
- identifiers:
slug: !Context [flow_slug, invitation-enrollment-flow]
model: authentik_flows.flow
id: flow
attrs:
name: !Context [flow_name, Invitation Enrollment Flow]
title: !Context [flow_name, Invitation Enrollment Flow]
designation: enrollment
authentication: require_unauthenticated
- identifiers:
name: !Context [stage_name, invitation-stage]
id: invitation-stage
model: authentik_stages_invitation.invitationstage
attrs:
continue_flow_without_invitation: !Context [continue_flow_without_invitation, false]
- identifiers:
name:
!Format [
"invitation-enrollment-field-username-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: prompt-field-username
model: authentik_stages_prompt.prompt
attrs:
field_key: username
label: Username
type: username
required: true
placeholder: Username
placeholder_expression: false
order: 0
- identifiers:
name:
!Format [
"invitation-enrollment-field-password-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: prompt-field-password
model: authentik_stages_prompt.prompt
attrs:
field_key: password
label: Password
type: password
required: true
placeholder: Password
placeholder_expression: false
order: 1
- identifiers:
name:
!Format [
"invitation-enrollment-field-password-repeat-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: prompt-field-password-repeat
model: authentik_stages_prompt.prompt
attrs:
field_key: password_repeat
label: Password (repeat)
type: password
required: true
placeholder: Password (repeat)
placeholder_expression: false
order: 2
- identifiers:
name:
!Format [
"invitation-enrollment-field-name-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: prompt-field-name
model: authentik_stages_prompt.prompt
attrs:
field_key: name
label: Name
type: text
required: true
placeholder: Name
placeholder_expression: false
order: 0
- identifiers:
name:
!Format [
"invitation-enrollment-field-email-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: prompt-field-email
model: authentik_stages_prompt.prompt
attrs:
field_key: email
label: Email
type: email
required: true
placeholder: Email
placeholder_expression: false
order: 1
- identifiers:
name:
!Format [
"invitation-enrollment-prompt-credentials-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: prompt-stage-credentials
model: authentik_stages_prompt.promptstage
attrs:
fields:
- !KeyOf prompt-field-username
- !KeyOf prompt-field-password
- !KeyOf prompt-field-password-repeat
- identifiers:
name:
!Format [
"invitation-enrollment-prompt-details-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: prompt-stage-details
model: authentik_stages_prompt.promptstage
attrs:
fields:
- !KeyOf prompt-field-name
- !KeyOf prompt-field-email
- identifiers:
name:
!Format [
"invitation-enrollment-user-write-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: user-write-stage
model: authentik_stages_user_write.userwritestage
attrs:
user_creation_mode: always_create
user_type: !Context [user_type, external]
user_path_template:
!Format ["users/%s", !Context [user_type, external]]
- identifiers:
name:
!Format [
"invitation-enrollment-user-login-%s",
!Context [flow_slug, invitation-enrollment-flow],
]
id: user-login-stage
model: authentik_stages_user_login.userloginstage
- identifiers:
target: !KeyOf flow
stage: !KeyOf invitation-stage
order: 5
model: authentik_flows.flowstagebinding
attrs:
evaluate_on_plan: true
re_evaluate_policies: true
- identifiers:
target: !KeyOf flow
stage: !KeyOf prompt-stage-credentials
order: 10
model: authentik_flows.flowstagebinding
- identifiers:
target: !KeyOf flow
stage: !KeyOf prompt-stage-details
order: 15
model: authentik_flows.flowstagebinding
- identifiers:
target: !KeyOf flow
stage: !KeyOf user-write-stage
order: 20
model: authentik_flows.flowstagebinding
- identifiers:
target: !KeyOf flow
stage: !KeyOf user-login-stage
order: 100
model: authentik_flows.flowstagebinding

View File

@@ -73,16 +73,8 @@ entries:
redirect_uris:
- matching_mode: strict
url: https://localhost:8443/test/a/authentik/callback
redirect_uri_type: authorization
- matching_mode: strict
url: https://host.docker.internal:8443/test/a/authentik/callback
redirect_uri_type: authorization
- matching_mode: strict
url: https://localhost:8443/test/a/authentik/post_logout_redirect
redirect_uri_type: logout
- matching_mode: strict
url: https://host.docker.internal:8443/test/a/authentik/post_logout_redirect
redirect_uri_type: logout
grant_types:
- authorization_code
- implicit
@@ -116,16 +108,8 @@ entries:
redirect_uris:
- matching_mode: strict
url: https://localhost:8443/test/a/authentik/callback
redirect_uri_type: authorization
- matching_mode: strict
url: https://host.docker.internal:8443/test/a/authentik/callback
redirect_uri_type: authorization
- matching_mode: strict
url: https://localhost:8443/test/a/authentik/post_logout_redirect
redirect_uri_type: logout
- matching_mode: strict
url: https://host.docker.internal:8443/test/a/authentik/post_logout_redirect
redirect_uri_type: logout
grant_types:
- authorization_code
- implicit

View File

@@ -26,8 +26,6 @@ var healthcheckCmd = &cobra.Command{
exitCode := 1
log.WithField("mode", mode).Debug("checking health")
switch strings.ToLower(mode) {
case "allinone":
fallthrough
case "server":
exitCode = check(fmt.Sprintf("http://localhost%s-/health/live/", config.Get().Web.Path))
case "worker":

go.mod
View File

@@ -7,7 +7,7 @@ require (
beryju.io/radius-eap v0.1.0
github.com/avast/retry-go/v4 v4.7.0
github.com/coreos/go-oidc/v3 v3.18.0
github.com/getsentry/sentry-go v0.46.1
github.com/getsentry/sentry-go v0.46.0
github.com/go-http-utils/etag v0.0.0-20161124023236-513ea8f21eb1
github.com/go-ldap/ldap/v3 v3.4.13
github.com/go-openapi/runtime v0.29.4

go.sum
View File

@@ -20,8 +20,8 @@ github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/felixge/httpsnoop v1.0.3 h1:s/nj+GCswXYzN5v2DpNMuMQYe+0DDwt5WVCU6CWBdXk=
github.com/felixge/httpsnoop v1.0.3/go.mod h1:m8KPJKqk1gH5J9DgRY2ASl2lWCfGKXixSwevea8zH2U=
github.com/getsentry/sentry-go v0.46.1 h1:mZyQFaQYkPxAdDG4HR8gDg6j4CnKYVWt4TF92N7i3XY=
github.com/getsentry/sentry-go v0.46.1/go.mod h1:evVbw2qotNUdYG8KxXbAdjOQWWvWIwKxpjdZZIvcIPw=
github.com/getsentry/sentry-go v0.46.0 h1:mbdDaarbUdOt9X+dx6kDdntkShLEX3/+KyOsVDTPDj0=
github.com/getsentry/sentry-go v0.46.0/go.mod h1:evVbw2qotNUdYG8KxXbAdjOQWWvWIwKxpjdZZIvcIPw=
github.com/go-asn1-ber/asn1-ber v1.5.8-0.20250403174932-29230038a667 h1:BP4M0CvQ4S3TGls2FvczZtj5Re/2ZzkV9VwqPHH/3Bo=
github.com/go-asn1-ber/asn1-ber v1.5.8-0.20250403174932-29230038a667/go.mod h1:hEBeB/ic+5LoWskz+yKT7vGhhPYkProFKoKdwZRWMe0=
github.com/go-errors/errors v1.4.2 h1:J6MZopCL4uSllY1OfXM374weqZFFItUbrImctkmUxIA=

View File

@@ -31,7 +31,7 @@ function run_authentik {
echo go run ./cmd/server "$@"
fi
;;
allinone | worker)
worker)
if [[ -x "$(command -v authentik)" ]]; then
echo authentik "$@"
else
@@ -79,7 +79,7 @@ function prepare_debug {
apt-get update
apt-get install -y --no-install-recommends krb5-kdc krb5-user krb5-admin-server libkrb5-dev gcc
source "${VENV_PATH}/bin/activate"
uv sync --active --locked
uv sync --active --frozen
touch /unittest.xml
chown authentik:authentik /unittest.xml
}
@@ -105,7 +105,7 @@ elif [[ "$1" == "test-all" ]]; then
prepare_debug
chmod 777 /root
check_if_root_and_run manage test authentik
elif [[ "$1" == "allinone" ]] || [[ "$1" == "server" ]] || [[ "$1" == "worker" ]]; then
elif [[ "$1" == "server" ]] || [[ "$1" == "worker" ]]; then
wait_for_db
check_if_root_and_run "$@"
elif [[ "$1" == "healthcheck" ]]; then

View File

@@ -9,12 +9,12 @@
"version": "0.0.0",
"license": "MIT",
"devDependencies": {
"aws-cdk": "^2.1119.0",
"aws-cdk": "^2.1118.4",
"cross-env": "^10.1.0"
},
"engines": {
"node": ">=20",
"npm": ">=11.6.2"
"node": ">=24",
"npm": ">=11.10.1"
}
},
"node_modules/@epic-web/invariant": {
@@ -25,9 +25,9 @@
"license": "MIT"
},
"node_modules/aws-cdk": {
"version": "2.1119.0",
"resolved": "https://registry.npmjs.org/aws-cdk/-/aws-cdk-2.1119.0.tgz",
"integrity": "sha512-XBxZEKH3BY4M1EX6x0qBkmOAj8viErjpww14iH6Z3z6nI0YzjZeJ05eEl7eJwzUgv7NTGagWBS9m/eDJW5+dAg==",
"version": "2.1118.4",
"resolved": "https://registry.npmjs.org/aws-cdk/-/aws-cdk-2.1118.4.tgz",
"integrity": "sha512-wJfRQdvb+FJ2cni059mYdmjhfwhMskP+PAB59BL9jhon+jYtjy8X3pbj3uzHgAOJwNhh6jGkP8xq36Cffccbbw==",
"dev": true,
"license": "Apache-2.0",
"bin": {

View File

@@ -7,11 +7,24 @@
"aws-cfn": "cross-env CI=false cdk synth --version-reporting=false > template.yaml"
},
"devDependencies": {
"aws-cdk": "^2.1119.0",
"aws-cdk": "^2.1118.4",
"cross-env": "^10.1.0"
},
"engines": {
"node": ">=20",
"npm": ">=11.6.2"
}
"node": ">=24",
"npm": ">=11.10.1"
},
"devEngines": {
"runtime": {
"name": "node",
"onFail": "warn",
"version": ">=24"
},
"packageManager": {
"name": "npm",
"version": ">=11.10.1",
"onFail": "warn"
}
},
"packageManager": "npm@11.11.0+sha512.f36811c4aae1fde639527368ae44c571d050006a608d67a191f195a801a52637a312d259186254aa3a3799b05335b7390539cf28656d18f0591a1125ba35f973"
}

View File

@@ -7,6 +7,17 @@ ARG GIT_BUILD_HASH
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
ENV NODE_ENV=production
WORKDIR /work
RUN --mount=type=bind,target=/work/package.json,src=./package.json \
--mount=type=bind,target=/work/package-lock.json,src=./package-lock.json \
--mount=type=bind,target=/work/web/package.json,src=./web/package.json \
--mount=type=bind,target=/work/web/package-lock.json,src=./web/package-lock.json \
--mount=type=bind,target=/work/scripts/node/,src=./scripts/node/ \
--mount=type=bind,target=/work/packages/logger-js/,src=./packages/logger-js/ \
node ./scripts/node/setup-corepack.mjs --force && \
node ./scripts/node/lint-runtime.mjs ./web
WORKDIR /work/web
# These files need to be copied and cannot be mounted as `npm ci` will build the client's typescript
@@ -18,7 +29,7 @@ RUN --mount=type=bind,target=/work/web/package.json,src=./web/package.json \
--mount=type=bind,target=/work/web/packages/sfe/package.json,src=./web/packages/sfe/package.json \
--mount=type=bind,target=/work/web/scripts,src=./web/scripts \
--mount=type=cache,id=npm-ak,sharing=shared,target=/root/.npm \
npm ci
corepack npm ci
COPY ./package.json /work
COPY ./web /work/web/
@@ -200,7 +211,7 @@ RUN --mount=type=bind,target=pyproject.toml,src=pyproject.toml \
--mount=type=bind,target=packages/django-postgres-cache,src=packages/django-postgres-cache \
--mount=type=bind,target=rust-toolchain.toml,src=rust-toolchain.toml \
--mount=type=cache,id=uv-python-deps-$TARGETARCH$TARGETVARIANT,target=/root/.cache/uv \
uv sync --locked --no-install-project --no-dev
uv sync --frozen --no-install-project --no-dev
# Stage: Run
FROM python-base AS final-image

View File

@@ -10,12 +10,22 @@ WORKDIR /static
COPY ./packages /packages
COPY ./web/packages /static/packages
RUN --mount=type=bind,target=/static/package.json,src=./package.json \
--mount=type=bind,target=/static/package-lock.json,src=./package-lock.json \
--mount=type=bind,target=/static/web/package.json,src=./web/package.json \
--mount=type=bind,target=/static/web/package-lock.json,src=./web/package-lock.json \
--mount=type=bind,target=/static/scripts/node/,src=./scripts/node/ \
--mount=type=bind,target=/static/packages/logger-js/,src=./packages/logger-js/ \
node ./scripts/node/setup-corepack.mjs --force && \
node ./scripts/node/lint-runtime.mjs ./web
COPY package.json /
RUN --mount=type=bind,target=/static/package.json,src=./web/package.json \
--mount=type=bind,target=/static/package-lock.json,src=./web/package-lock.json \
--mount=type=bind,target=/static/scripts,src=./web/scripts \
--mount=type=cache,target=/root/.npm \
npm ci
corepack npm ci
COPY web .
RUN npm run build-proxy

View File

@@ -28,39 +28,20 @@ class HttpHandler(BaseHTTPRequestHandler):
_ = db_conn.cursor()
def do_GET(self):
from django.db import DatabaseError, InterfaceError, OperationalError, connections
from psycopg.errors import AdminShutdown
from authentik.root.monitoring import monitoring_set
DATABASE_ERRORS = (
AdminShutdown,
InterfaceError,
DatabaseError,
OperationalError,
)
if self.path == "/-/metrics/":
try:
monitoring_set.send(self)
except DATABASE_ERRORS as exc:
LOGGER.warning("failed to send monitoring_set", exc=exc)
for db_conn in connections.all():
db_conn.close()
self.send_response(503)
else:
self.send_response(200)
from authentik.root.monitoring import monitoring_set
monitoring_set.send_robust(self)
self.send_response(200)
self.end_headers()
elif self.path == "/-/health/ready/":
from django.db.utils import OperationalError
try:
self.check_db()
except DATABASE_ERRORS as exc:
LOGGER.warning("failed to check database health", exc=exc)
for db_conn in connections.all():
db_conn.close()
except OperationalError:
self.send_response(503)
else:
self.send_response(200)
self.send_response(200)
self.end_headers()
else:
self.send_response(200)

View File

@@ -8,7 +8,7 @@ msgid ""
msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2026-05-06 00:27+0000\n"
"POT-Creation-Date: 2026-04-30 00:27+0000\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
@@ -101,14 +101,6 @@ msgstr ""
msgid "Blueprint file does not exist"
msgstr ""
#: authentik/blueprints/api.py
msgid "Context must be valid JSON"
msgstr ""
#: authentik/blueprints/api.py
msgid "Context must be a JSON object"
msgstr ""
#: authentik/blueprints/api.py
msgid "Failed to validate blueprint"
msgstr ""
@@ -880,6 +872,10 @@ msgstr ""
msgid "Grace period must be shorter than the interval."
msgstr ""
#: authentik/enterprise/lifecycle/api/rules.py
msgid "Only one type-wide rule for each object type is allowed."
msgstr ""
#: authentik/enterprise/lifecycle/models.py
msgid ""
"Select which transports should be used to notify the reviewers. If none are "
@@ -907,8 +903,7 @@ msgid "Go to {self._get_model_name()}"
msgstr ""
#: authentik/enterprise/lifecycle/models.py
msgid ""
"Access review is due for {self.content_type.name.lower()} {object_label}"
msgid "Access review is due for {self.content_type.name} {str(self.object)}"
msgstr ""
#: authentik/enterprise/lifecycle/models.py
@@ -921,7 +916,7 @@ msgid "Access review completed for {self.content_type.name} {str(self.object)}"
msgstr ""
#: authentik/enterprise/lifecycle/tasks.py
msgid "Dispatch tasks to apply lifecycle rules."
msgid "Dispatch tasks to validate lifecycle rules."
msgstr ""
#: authentik/enterprise/lifecycle/tasks.py
@@ -1234,78 +1229,6 @@ msgstr ""
msgid "Generate data export."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "User to lock. If omitted, locks the current user (self-service)."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "No lockdown flow configured."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "Lockdown flow is not applicable."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "Choose the target account, then return a flow link."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "No lockdown flow configured or the flow is not applicable"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "Permission denied (when targeting another user)"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Deactivate the user account (set is_active to False)"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Set an unusable password for the user"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Delete all active sessions for the user"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/models.py
msgid ""
"Revoke all tokens for the user (API, app password, recovery, verification, "
"OAuth)"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/models.py
msgid ""
"Flow to redirect users to after self-service lockdown. This flow should not "
"require authentication since the user's session is deleted."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Account Lockdown Stage"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Account Lockdown Stages"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "No target user specified for account lockdown"
msgstr ""
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "You do not have permission to lock down this account."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "Account lockdown failed for this account."
msgstr ""
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "Self-service account lockdown requires a completion flow."
msgstr ""
#: authentik/enterprise/stages/authenticator_endpoint_gdtc/models.py
msgid "Endpoint Authenticator Google Device Trust Connector Stage"
msgstr ""
@@ -4533,18 +4456,6 @@ msgstr ""
msgid "Static: Static value, displayed as-is."
msgstr ""
#: authentik/stages/prompt/models.py
msgid "Alert (Info): Static alert box with info styling"
msgstr ""
#: authentik/stages/prompt/models.py
msgid "Alert (Warning): Static alert box with warning styling"
msgstr ""
#: authentik/stages/prompt/models.py
msgid "Alert (Danger): Static alert box with danger styling"
msgstr ""
#: authentik/stages/prompt/models.py
msgid "authentik: Selection of locales authentik supports"
msgstr ""

View File

@@ -53,7 +53,6 @@ Relatedly
Sidero
snipeit
sonarqube
Technitium
Terrakube
Ueberauth
Veeam

Binary file not shown.

View File

@@ -15,7 +15,7 @@ msgid ""
msgstr ""
"Project-Id-Version: PACKAGE VERSION\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2026-05-01 03:47+0000\n"
"POT-Creation-Date: 2026-04-08 00:28+0000\n"
"PO-Revision-Date: 2025-12-01 19:09+0000\n"
"Last-Translator: Sp P, 2026\n"
"Language-Team: French (France) (https://app.transifex.com/authentik/teams/119923/fr_FR/)\n"
@@ -263,18 +263,6 @@ msgstr ""
"fournisseurs backchannels sont retournés. Si faux, les fournisseurs "
"backchannels sont exclus"
#: authentik/core/api/users.py
msgid "Invalid password hash format. Must be a valid Django password hash."
msgstr ""
"Format de hachage de mot de passe invalide. Cela doit être un hachage de mot"
" de passe Django valide."
#: authentik/core/api/users.py
msgid "Cannot set both password and password_hash. Use only one."
msgstr ""
"Impossible de définir à la fois password (mot de passe) et password_hash "
"(hachage de mot de passe). N'en utiliser qu'un seul."
#: authentik/core/api/users.py
msgid "No leading or trailing slashes allowed."
msgstr ""
@@ -455,11 +443,6 @@ msgid "Open launch URL in a new browser tab or window."
msgstr ""
"Ouvrir l'URL de lancement dans une nouvelle fenêtre ou un nouvel onglet."
#: authentik/core/models.py
msgid "Hide this application from the user's My applications page."
msgstr ""
"Masquer cette application dans la page Mes applications de l'utilisateur."
#: authentik/core/models.py
msgid "Application"
msgstr "Application"
@@ -827,14 +810,6 @@ msgstr "Nonce Apple"
msgid "Apple Nonces"
msgstr "Nonces Apple"
#: authentik/endpoints/connectors/agent/models.py
msgid "Apple Independent Secure Enclave"
msgstr "Secure Enclave indépendante d'Apple"
#: authentik/endpoints/connectors/agent/models.py
msgid "Apple Independent Secure Enclaves"
msgstr "Secure Enclaves indépendantes d'Apple"
#: authentik/endpoints/facts.py
msgid "Operating System name, such as 'Server 2022' or 'Ubuntu'"
msgstr "Nom du système d'exploitation, comme 'Server 2022' ou 'Ubuntu'"
@@ -961,6 +936,12 @@ msgstr "Soit un groupe de réviseurs soit un réviseur doit être défini."
msgid "Grace period must be shorter than the interval."
msgstr "La période de grâce doit être plus courte que l'intervalle."
#: authentik/enterprise/lifecycle/api/rules.py
msgid "Only one type-wide rule for each object type is allowed."
msgstr ""
"Une seule règle pour l'ensemble du type est autorisée pour chaque type "
"d'objet."
#: authentik/enterprise/lifecycle/models.py
msgid ""
"Select which transports should be used to notify the reviewers. If none are "
@@ -991,11 +972,10 @@ msgid "Go to {self._get_model_name()}"
msgstr "Aller à {self._get_model_name()}"
#: authentik/enterprise/lifecycle/models.py
msgid ""
"Access review is due for {self.content_type.name.lower()} {object_label}"
msgid "Access review is due for {self.content_type.name} {str(self.object)}"
msgstr ""
"La révision de l'accès doit être effectuée pour "
"{self.content_type.name.lower()} {object_label}"
"La révision d'accès est attendue pour {self.content_type.name} "
"{str(self.object)}"
#: authentik/enterprise/lifecycle/models.py
msgid ""
@@ -1012,8 +992,8 @@ msgstr ""
"{str(self.object)}"
#: authentik/enterprise/lifecycle/tasks.py
msgid "Dispatch tasks to apply lifecycle rules."
msgstr "Déclencher les tâches pour appliquer les règles de cycle de vie"
msgid "Dispatch tasks to validate lifecycle rules."
msgstr "Déclenche les tâches pour valider les règles de cycle de vie"
#: authentik/enterprise/lifecycle/tasks.py
msgid "Apply lifecycle rule."
@@ -1356,86 +1336,6 @@ msgstr "Télécharger"
msgid "Generate data export."
msgstr "Générer un export de données."
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "User to lock. If omitted, locks the current user (self-service)."
msgstr ""
"Utilisateur à bloquer. Si non renseigné, bloque l'utilisateur actuel (libre "
"service)."
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "No lockdown flow configured."
msgstr "Aucun flux de blocage configuré."
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "Lockdown flow is not applicable."
msgstr "Le flux de blocage n'est pas applicable."
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "Choose the target account, then return a flow link."
msgstr "Choisit le compte cible, puis renvoie un lien de flux."
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "No lockdown flow configured or the flow is not applicable"
msgstr "Aucun flux de blocage configuré, ou le flux n'est pas applicable"
#: authentik/enterprise/stages/account_lockdown/api.py
msgid "Permission denied (when targeting another user)"
msgstr "Permission refusée (lors du ciblage d'un autre utilisateur)"
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Deactivate the user account (set is_active to False)"
msgstr "Désactiver le compte de l'utilisateur (définir is_active à False)."
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Set an unusable password for the user"
msgstr "Définit un mot de passe inutilisable pour cet utilisateur."
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Delete all active sessions for the user"
msgstr "Supprimer toutes les sessions actives pour cet utilisateur."
#: authentik/enterprise/stages/account_lockdown/models.py
msgid ""
"Revoke all tokens for the user (API, app password, recovery, verification, "
"OAuth)"
msgstr ""
"Révoquer tous les jetons pour cet utilisateur (API, mot de passe applicatif,"
" récupération, vérification, OAuth)"
#: authentik/enterprise/stages/account_lockdown/models.py
msgid ""
"Flow to redirect users to after self-service lockdown. This flow should not "
"require authentication since the user's session is deleted."
msgstr ""
"Flux vers lequel rediriger les utilisateurs après le blocage en libre "
"service. Ce flux ne doit pas nécessiter d'authentification car la session "
"utilisateur est supprimée."
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Account Lockdown Stage"
msgstr "Etape de blocage de compte"
#: authentik/enterprise/stages/account_lockdown/models.py
msgid "Account Lockdown Stages"
msgstr "Etapes de blocage de compte"
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "No target user specified for account lockdown"
msgstr "Aucun utilisateur ciblé défini pour le blocage de compte"
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "You do not have permission to lock down this account."
msgstr "Vous n'avez pas la permission de bloquer ce compte."
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "Account lockdown failed for this account."
msgstr "Echec du blocage de compte pour ce compte."
#: authentik/enterprise/stages/account_lockdown/stage.py
msgid "Self-service account lockdown requires a completion flow."
msgstr ""
"Le blocage de compte en libre service nécessite un flux de finalisation."
#: authentik/enterprise/stages/authenticator_endpoint_gdtc/models.py
msgid "Endpoint Authenticator Google Device Trust Connector Stage"
msgstr ""
@@ -1569,11 +1469,11 @@ msgstr "Évènement utilisateur"
#: authentik/events/models.py
msgid "Notification Transport"
msgstr "Transport de notification"
msgstr "Transport de Notification"
#: authentik/events/models.py
msgid "Notification Transports"
msgstr "Transports de notifications"
msgstr "Transports de notification"
#: authentik/events/models.py
msgid "Notice"
@@ -1845,10 +1745,6 @@ msgstr "Jeton du flux"
msgid "Flow Tokens"
msgstr "Jetons du flux"
#: authentik/flows/planner.py
msgid "This link is invalid or has expired. Please request a new one."
msgstr "Ce lien est invalide ou a expiré. Veuillez un demander un nouveau."
#: authentik/flows/views/executor.py
msgid "Invalid next URL"
msgstr "URL suivante invalide"
@@ -2876,12 +2772,8 @@ msgstr ""
"restriction d'audience ne sera ajoutée."
#: authentik/providers/saml/models.py
msgid ""
"Also known as EntityID. Providing a value overrides the default issuer "
"generated by authentik."
msgstr ""
"Aussi appelé EntityID. Fournir une valeur remplace l'émetteur par défaut "
"généré par authentik."
msgid "Also known as EntityID"
msgstr "Aussi appelé EntityID"
#: authentik/providers/saml/models.py
msgid "SLS URL"
@@ -3102,10 +2994,6 @@ msgstr "SAML NameID pour cette session"
msgid "SAML NameID format"
msgstr "Format SAML NameID"
#: authentik/providers/saml/models.py
msgid "SAML Issuer used for this session"
msgstr "Émetteur SAML utilisé pour cette session"
#: authentik/providers/saml/models.py
msgid "SAML Session"
msgstr "Session SAML"
@@ -3138,10 +3026,6 @@ msgstr "Salesforce"
msgid "Webex"
msgstr "Webex"
#: authentik/providers/scim/models.py
msgid "vCenter"
msgstr "vCenter"
#: authentik/providers/scim/models.py
msgid "Group filters used to define sync-scope for groups."
msgstr ""
@@ -3865,8 +3749,8 @@ msgid ""
"Which servers a user has to be a member of to be granted access. Empty list "
"allows every server."
msgstr ""
"De quels serveurs un utilisateur doit être membre afin d'obtenir l'accès. "
"Une liste vide autorise tous les serveurs."
"De quels serveurs un utilisateur doit être membre afin d'être autorisé. Une "
"liste vide autorise tous les serveurs."
#: authentik/sources/plex/models.py
msgid "Allow friends to authenticate, even if you don't share a server."
@@ -4571,11 +4455,11 @@ msgstr "Activer les utilisateurs à la complétion de l'étape."
#: authentik/stages/email/models.py
msgid "Email Stage"
msgstr "Étape de Courriel"
msgstr "Étape Courriel"
#: authentik/stages/email/models.py
msgid "Email Stages"
msgstr "Étapes de Courriel"
msgstr "Étapes Courriel"
#: authentik/stages/email/stage.py
msgid "Successfully verified Email."
@@ -5049,19 +4933,6 @@ msgstr ""
msgid "Static: Static value, displayed as-is."
msgstr "Statique : valeur statique, affichée comme telle."
#: authentik/stages/prompt/models.py
msgid "Alert (Info): Static alert box with info styling"
msgstr "Alerte (Info) : message d'alerte statique au format information"
#: authentik/stages/prompt/models.py
msgid "Alert (Warning): Static alert box with warning styling"
msgstr ""
"Alerte (Avertissement) : message d'alerte statique au format avertissement"
#: authentik/stages/prompt/models.py
msgid "Alert (Danger): Static alert box with danger styling"
msgstr "Alerte (Danger) : message d'alerte statique au format danger"
#: authentik/stages/prompt/models.py
msgid "authentik: Selection of locales authentik supports"
msgstr "authentik : sélection des locales prises en charges par authentik"

View File

@@ -1,6 +1,6 @@
//! Utilities to run an axum server.
use std::{net, os::unix, path::PathBuf};
use std::{net, os::unix};
use ak_common::arbiter::{Arbiter, Tasks};
use axum::Router;
@@ -21,20 +21,26 @@ async fn run_plain(
name: &str,
router: Router,
addr: net::SocketAddr,
allow_failure: bool,
) -> Result<()> {
info!(addr = addr.to_string(), "starting {name} server");
let handle = Handle::new();
arbiter.add_net_handle(handle.clone()).await;
axum_server::Server::bind(addr)
let res = axum_server::Server::bind(addr)
.acceptor(CatchPanicAcceptor::new(
ProxyProtocolAcceptor::new().acceptor(DefaultAcceptor::new()),
arbiter.clone(),
))
.handle(handle)
.serve(router.into_make_service_with_connect_info::<net::SocketAddr>())
.await?;
.await;
if res.is_err() && allow_failure {
arbiter.shutdown().await;
return Ok(());
}
res?;
Ok(())
}
@@ -43,59 +49,60 @@ async fn run_plain(
///
/// `name` is only used for observability purposes and should describe which module is starting the
/// server.
///
/// `allow_failure` allows the server to fail silently.
pub fn start_plain(
tasks: &mut Tasks,
name: &'static str,
router: Router,
addr: net::SocketAddr,
allow_failure: bool,
) -> Result<()> {
let arbiter = tasks.arbiter();
tasks
.build_task()
.name(&format!("{}::run_plain({name}, {addr})", module_path!()))
.spawn(run_plain(arbiter, name, router, addr))?;
.spawn(run_plain(arbiter, name, router, addr, allow_failure))?;
Ok(())
}
struct UnixSocketGuard(PathBuf);
impl Drop for UnixSocketGuard {
fn drop(&mut self) {
trace!(path = ?self.0, "removing socket");
if let Err(err) = std::fs::remove_file(&self.0) {
trace!(?err, "failed to remove socket, ignoring");
}
}
}
pub(crate) async fn run_unix(
arbiter: Arbiter,
name: &str,
router: Router,
addr: unix::net::SocketAddr,
allow_failure: bool,
) -> Result<()> {
info!(?addr, "starting {name} server");
let handle = Handle::new();
arbiter.add_unix_handle(handle.clone()).await;
let _guard = if let Some(path) = addr.as_pathname() {
if !allow_failure && let Some(path) = addr.as_pathname() {
trace!(?addr, "removing socket");
if let Err(err) = std::fs::remove_file(path) {
trace!(?err, "failed to remove socket, ignoring");
}
Some(UnixSocketGuard(path.to_owned()))
} else {
None
};
axum_server::Server::bind(addr.clone())
}
let res = axum_server::Server::bind(addr.clone())
.acceptor(CatchPanicAcceptor::new(
DefaultAcceptor::new(),
arbiter.clone(),
))
.handle(handle)
.serve(router.into_make_service())
.await?;
.await;
if !allow_failure && let Some(path) = addr.as_pathname() {
trace!(?addr, "removing socket");
if let Err(err) = std::fs::remove_file(path) {
trace!(?err, "failed to remove socket, ignoring");
}
}
if res.is_err() && allow_failure {
arbiter.shutdown().await;
return Ok(());
}
res?;
Ok(())
}
@@ -104,17 +111,20 @@ pub(crate) async fn run_unix(
///
/// `name` is only used for observability purposes and should describe which module is starting the
/// server.
///
/// `allow_failure` allows the server to fail silently.
pub fn start_unix(
tasks: &mut Tasks,
name: &'static str,
router: Router,
addr: unix::net::SocketAddr,
allow_failure: bool,
) -> Result<()> {
let arbiter = tasks.arbiter();
tasks
.build_task()
.name(&format!("{}::run_unix({name}, {addr:?})", module_path!()))
.spawn(run_unix(arbiter, name, router, addr))?;
.spawn(run_unix(arbiter, name, router, addr, allow_failure))?;
Ok(())
}

View File

@@ -8,8 +8,7 @@
"url": "https://github.com/goauthentik/authentik.git"
},
"scripts": {
"clean": "tsc -b --clean tsconfig.json tsconfig.esm.json",
"build": "npm run clean && tsc -b tsconfig.json tsconfig.esm.json",
"build": "tsc && tsc -p tsconfig.esm.json",
"prepare": "npm run build"
},
"main": "./dist/index.js",

View File

@@ -47,7 +47,6 @@ export interface ManagedBlueprintsDestroyRequest {
export interface ManagedBlueprintsImportCreateRequest {
file?: Blob;
path?: string;
context?: string;
}
export interface ManagedBlueprintsListRequest {
@@ -370,10 +369,6 @@ export class ManagedApi extends runtime.BaseAPI {
formParams.append("path", requestParameters["path"] as any);
}
if (requestParameters["context"] != null) {
formParams.append("context", requestParameters["context"] as any);
}
let urlPath = `/managed/blueprints/import/`;
return {

View File

@@ -23,7 +23,7 @@ export interface AuthenticatedSessionUserAgentDevice {
* @type {string}
* @memberof AuthenticatedSessionUserAgentDevice
*/
brand: string | null;
brand: string;
/**
*
* @type {string}
@@ -35,7 +35,7 @@ export interface AuthenticatedSessionUserAgentDevice {
* @type {string}
* @memberof AuthenticatedSessionUserAgentDevice
*/
model: string | null;
model: string;
}
/**

View File

@@ -29,25 +29,25 @@ export interface AuthenticatedSessionUserAgentOs {
* @type {string}
* @memberof AuthenticatedSessionUserAgentOs
*/
major: string | null;
major: string;
/**
*
* @type {string}
* @memberof AuthenticatedSessionUserAgentOs
*/
minor: string | null;
minor: string;
/**
*
* @type {string}
* @memberof AuthenticatedSessionUserAgentOs
*/
patch: string | null;
patch: string;
/**
*
* @type {string}
* @memberof AuthenticatedSessionUserAgentOs
*/
patchMinor: string | null;
patchMinor: string;
}
/**

View File

@@ -40,7 +40,6 @@ export interface CurrentBrandFlags {
* Refresh other tabs after successful authentication.
* @type {boolean}
* @memberof CurrentBrandFlags
* @deprecated
*/
flowsRefreshOthers: boolean;
}

View File

@@ -19,20 +19,8 @@
export const EventsRequestedEnum = {
HttpsSchemasOpenidNetSeceventCaepEventTypeSessionRevoked:
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
HttpsSchemasOpenidNetSeceventCaepEventTypeTokenClaimsChange:
"https://schemas.openid.net/secevent/caep/event-type/token-claims-change",
HttpsSchemasOpenidNetSeceventCaepEventTypeCredentialChange:
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
HttpsSchemasOpenidNetSeceventCaepEventTypeAssuranceLevelChange:
"https://schemas.openid.net/secevent/caep/event-type/assurance-level-change",
HttpsSchemasOpenidNetSeceventCaepEventTypeDeviceComplianceChange:
"https://schemas.openid.net/secevent/caep/event-type/device-compliance-change",
HttpsSchemasOpenidNetSeceventCaepEventTypeSessionEstablished:
"https://schemas.openid.net/secevent/caep/event-type/session-established",
HttpsSchemasOpenidNetSeceventCaepEventTypeSessionPresented:
"https://schemas.openid.net/secevent/caep/event-type/session-presented",
HttpsSchemasOpenidNetSeceventCaepEventTypeRiskLevelChange:
"https://schemas.openid.net/secevent/caep/event-type/risk-level-change",
HttpsSchemasOpenidNetSeceventSsfEventTypeVerification:
"https://schemas.openid.net/secevent/ssf/event-type/verification",
UnknownDefaultOpenApi: "11184809",

View File

@@ -40,7 +40,6 @@ export interface PatchedSettingsRequestFlags {
* Refresh other tabs after successful authentication.
* @type {boolean}
* @memberof PatchedSettingsRequestFlags
* @deprecated
*/
flowsRefreshOthers: boolean;
}

View File

@@ -14,8 +14,8 @@
import type { MatchingModeEnum } from "./MatchingModeEnum";
import { MatchingModeEnumFromJSON, MatchingModeEnumToJSON } from "./MatchingModeEnum";
import type { RedirectURITypeEnum } from "./RedirectURITypeEnum";
import { RedirectURITypeEnumFromJSON, RedirectURITypeEnumToJSON } from "./RedirectURITypeEnum";
import type { RedirectUriTypeEnum } from "./RedirectUriTypeEnum";
import { RedirectUriTypeEnumFromJSON, RedirectUriTypeEnumToJSON } from "./RedirectUriTypeEnum";
/**
* A single allowed redirect URI entry
@@ -37,10 +37,10 @@ export interface RedirectURI {
url: string;
/**
*
* @type {RedirectURITypeEnum}
* @type {RedirectUriTypeEnum}
* @memberof RedirectURI
*/
redirectUriType?: RedirectURITypeEnum;
redirectUriType?: RedirectUriTypeEnum;
}
/**
@@ -66,7 +66,7 @@ export function RedirectURIFromJSONTyped(json: any, ignoreDiscriminator: boolean
redirectUriType:
json["redirect_uri_type"] == null
? undefined
: RedirectURITypeEnumFromJSON(json["redirect_uri_type"]),
: RedirectUriTypeEnumFromJSON(json["redirect_uri_type"]),
};
}
@@ -85,6 +85,6 @@ export function RedirectURIToJSONTyped(
return {
matching_mode: MatchingModeEnumToJSON(value["matchingMode"]),
url: value["url"],
redirect_uri_type: RedirectURITypeEnumToJSON(value["redirectUriType"]),
redirect_uri_type: RedirectUriTypeEnumToJSON(value["redirectUriType"]),
};
}

View File

@@ -14,8 +14,8 @@
import type { MatchingModeEnum } from "./MatchingModeEnum";
import { MatchingModeEnumFromJSON, MatchingModeEnumToJSON } from "./MatchingModeEnum";
import type { RedirectURITypeEnum } from "./RedirectURITypeEnum";
import { RedirectURITypeEnumFromJSON, RedirectURITypeEnumToJSON } from "./RedirectURITypeEnum";
import type { RedirectUriTypeEnum } from "./RedirectUriTypeEnum";
import { RedirectUriTypeEnumFromJSON, RedirectUriTypeEnumToJSON } from "./RedirectUriTypeEnum";
/**
* A single allowed redirect URI entry
@@ -37,10 +37,10 @@ export interface RedirectURIRequest {
url: string;
/**
*
* @type {RedirectURITypeEnum}
* @type {RedirectUriTypeEnum}
* @memberof RedirectURIRequest
*/
redirectUriType?: RedirectURITypeEnum;
redirectUriType?: RedirectUriTypeEnum;
}
/**
@@ -69,7 +69,7 @@ export function RedirectURIRequestFromJSONTyped(
redirectUriType:
json["redirect_uri_type"] == null
? undefined
: RedirectURITypeEnumFromJSON(json["redirect_uri_type"]),
: RedirectUriTypeEnumFromJSON(json["redirect_uri_type"]),
};
}
@@ -88,6 +88,6 @@ export function RedirectURIRequestToJSONTyped(
return {
matching_mode: MatchingModeEnumToJSON(value["matchingMode"]),
url: value["url"],
redirect_uri_type: RedirectURITypeEnumToJSON(value["redirectUriType"]),
redirect_uri_type: RedirectUriTypeEnumToJSON(value["redirectUriType"]),
};
}

View File

@@ -1,57 +0,0 @@
/* tslint:disable */
/* eslint-disable */
/**
* authentik
* Making authentication simple.
*
* The version of the OpenAPI document: 2026.5.0-rc1
* Contact: hello@goauthentik.io
*
* NOTE: This class is auto generated by OpenAPI Generator (https://openapi-generator.tech).
* https://openapi-generator.tech
* Do not edit the class manually.
*/
/**
*
* @export
*/
export const RedirectURITypeEnum = {
Authorization: "authorization",
Logout: "logout",
UnknownDefaultOpenApi: "11184809",
} as const;
export type RedirectURITypeEnum = (typeof RedirectURITypeEnum)[keyof typeof RedirectURITypeEnum];
export function instanceOfRedirectURITypeEnum(value: any): boolean {
for (const key in RedirectURITypeEnum) {
if (Object.prototype.hasOwnProperty.call(RedirectURITypeEnum, key)) {
if (RedirectURITypeEnum[key as keyof typeof RedirectURITypeEnum] === value) {
return true;
}
}
}
return false;
}
export function RedirectURITypeEnumFromJSON(json: any): RedirectURITypeEnum {
return RedirectURITypeEnumFromJSONTyped(json, false);
}
export function RedirectURITypeEnumFromJSONTyped(
json: any,
ignoreDiscriminator: boolean,
): RedirectURITypeEnum {
return json as RedirectURITypeEnum;
}
export function RedirectURITypeEnumToJSON(value?: RedirectURITypeEnum | null): any {
return value as any;
}
export function RedirectURITypeEnumToJSONTyped(
value: any,
ignoreDiscriminator: boolean,
): RedirectURITypeEnum {
return value as RedirectURITypeEnum;
}

View File

@@ -0,0 +1,57 @@
/* tslint:disable */
/* eslint-disable */
/**
* authentik
* Making authentication simple.
*
* The version of the OpenAPI document: 2026.5.0-rc1
* Contact: hello@goauthentik.io
*
* NOTE: This class is auto generated by OpenAPI Generator (https://openapi-generator.tech).
* https://openapi-generator.tech
* Do not edit the class manually.
*/
/**
*
* @export
*/
export const RedirectUriTypeEnum = {
Authorization: "authorization",
Logout: "logout",
UnknownDefaultOpenApi: "11184809",
} as const;
export type RedirectUriTypeEnum = (typeof RedirectUriTypeEnum)[keyof typeof RedirectUriTypeEnum];
export function instanceOfRedirectUriTypeEnum(value: any): boolean {
for (const key in RedirectUriTypeEnum) {
if (Object.prototype.hasOwnProperty.call(RedirectUriTypeEnum, key)) {
if (RedirectUriTypeEnum[key as keyof typeof RedirectUriTypeEnum] === value) {
return true;
}
}
}
return false;
}
export function RedirectUriTypeEnumFromJSON(json: any): RedirectUriTypeEnum {
return RedirectUriTypeEnumFromJSONTyped(json, false);
}
export function RedirectUriTypeEnumFromJSONTyped(
json: any,
ignoreDiscriminator: boolean,
): RedirectUriTypeEnum {
return json as RedirectUriTypeEnum;
}
export function RedirectUriTypeEnumToJSON(value?: RedirectUriTypeEnum | null): any {
return value as any;
}
export function RedirectUriTypeEnumToJSONTyped(
value: any,
ignoreDiscriminator: boolean,
): RedirectUriTypeEnum {
return value as RedirectUriTypeEnum;
}

View File

@@ -20,7 +20,6 @@ export const SSFStreamStatusEnum = {
Enabled: "enabled",
Paused: "paused",
Disabled: "disabled",
DisabledDeleted: "disabled_deleted",
UnknownDefaultOpenApi: "11184809",
} as const;
export type SSFStreamStatusEnum = (typeof SSFStreamStatusEnum)[keyof typeof SSFStreamStatusEnum];

View File

@@ -707,7 +707,7 @@ export * from "./RedirectStageModeEnum";
export * from "./RedirectStageRequest";
export * from "./RedirectURI";
export * from "./RedirectURIRequest";
export * from "./RedirectURITypeEnum";
export * from "./RedirectUriTypeEnum";
export * from "./RelatedGroup";
export * from "./RelatedRule";
export * from "./Reputation";

View File

@@ -1,7 +1,9 @@
{
"$schema": "https://json.schemastore.org/tsconfig",
"compilerOptions": {
"composite": true,
"isolatedModules": true,
"incremental": true,
"rootDir": "src",
"strict": true,
"newLine": "lf",

View File

@@ -1,7 +1,9 @@
{
"$schema": "https://json.schemastore.org/tsconfig",
"compilerOptions": {
"composite": true,
"isolatedModules": true,
"incremental": true,
"rootDir": "src",
"strict": true,
"newLine": "lf",

View File

@@ -32,17 +32,16 @@ class DjangoDramatiqPostgres(AppConfig):
middleware=[],
)
for middleware_class_path, middleware_kwargs in Conf().middlewares:
middleware_class = import_string(middleware_class_path)
if issubclass(middleware_class, Results):
middleware_kwargs["backend"] = import_string(Conf().result_backend)(
for middleware_class, middleware_kwargs in Conf().middlewares:
middleware: dramatiq.middleware.middleware.Middleware = import_string(middleware_class)(
**middleware_kwargs,
)
if isinstance(middleware, Results):
middleware.backend = import_string(Conf().result_backend)(
*Conf().result_backend_args,
**Conf().result_backend_kwargs,
)
middleware: dramatiq.middleware.middleware.Middleware = middleware_class(
**middleware_kwargs,
)
broker.add_middleware(middleware)
broker.add_middleware(middleware) # type: ignore[no-untyped-call]
dramatiq.set_broker(broker)

View File

@@ -23,9 +23,11 @@ from django.utils.functional import cached_property
from django.utils.module_loading import import_string
from dramatiq.broker import Broker, Consumer, MessageProxy
from dramatiq.common import compute_backoff, current_millis, dq_name, q_name, xq_name
from dramatiq.errors import BrokerConnectionError, QueueJoinTimeout
from dramatiq.errors import ConnectionError, QueueJoinTimeout
from dramatiq.message import Message
from dramatiq.middleware import Middleware
from dramatiq.middleware import (
Middleware,
)
from pglock.core import _cast_lock_id
from psycopg import sql
from psycopg.errors import AdminShutdown
@@ -44,6 +46,7 @@ DATABASE_ERRORS = (
AdminShutdown,
InterfaceError,
DatabaseError,
ConnectionError,
OperationalError,
)
@@ -52,7 +55,7 @@ def channel_name(queue_name: str, identifier: ChannelIdentifier) -> str:
return f"{CHANNEL_PREFIX}.{queue_name}.{identifier.value}"
def raise_broker_connection_error(func: Callable[P, R]) -> Callable[P, R]: # noqa: UP047
def raise_connection_error(func: Callable[P, R]) -> Callable[P, R]: # noqa: UP047
@functools.wraps(func)
def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
try:
@@ -63,13 +66,13 @@ def raise_broker_connection_error(func: Callable[P, R]) -> Callable[P, R]: # no
connections.close_all()
except DATABASE_ERRORS:
pass
raise BrokerConnectionError(str(exc)) from exc # type: ignore[no-untyped-call]
raise ConnectionError(str(exc)) from exc # type: ignore[no-untyped-call]
return wrapper
class PostgresBroker(Broker):
queues: set[str]
queues: set[str] # type: ignore[assignment]
def __init__(
self,
@@ -78,7 +81,7 @@ class PostgresBroker(Broker):
db_alias: str = DEFAULT_DB_ALIAS,
**kwargs: Any,
) -> None:
super().__init__(*args, middleware=[], **kwargs) # type: ignore[misc]
super().__init__(*args, middleware=[], **kwargs) # type: ignore[no-untyped-call,misc]
self.logger = get_logger(__name__, type(self))
self.queues = set()
@@ -119,10 +122,10 @@ class PostgresBroker(Broker):
def declare_queue(self, queue_name: str) -> None:
if queue_name not in self.queues:
self.emit_before("declare_queue", queue_name)
self.emit_before("declare_queue", queue_name) # type: ignore[no-untyped-call]
self.queues.add(queue_name)
# Nothing more to do, all queues are in the same table
self.emit_after("declare_queue", queue_name)
self.emit_after("declare_queue", queue_name) # type: ignore[no-untyped-call]
def model_defaults(self, message: Message[Any]) -> dict[str, Any]:
eta = None
@@ -138,7 +141,7 @@ class PostgresBroker(Broker):
}
@tenacity.retry(
retry=tenacity.retry_if_exception_type(BrokerConnectionError),
retry=tenacity.retry_if_exception_type(ConnectionError),
reraise=True,
wait=tenacity.wait_random_exponential(multiplier=1, max=5),
stop=tenacity.stop_after_attempt(3),
@@ -146,11 +149,11 @@ class PostgresBroker(Broker):
cast(logging.Logger, logger), logging.INFO, exc_info=True
),
)
@raise_broker_connection_error
@raise_connection_error
def enqueue(self, message: Message[Any], *, delay: int | None = None) -> Message[Any]:
queue_name = q_name(message.queue_name)
queue_name = q_name(message.queue_name) # type: ignore[no-untyped-call]
if delay:
message_eta = current_millis() + delay
message_eta = current_millis() + delay # type: ignore[no-untyped-call]
message.options["eta"] = message_eta
self.declare_queue(queue_name)
@@ -160,7 +163,7 @@ class PostgresBroker(Broker):
message.options["model_defaults"] = self.model_defaults(message)
message.options["model_create_defaults"] = {}
self.emit_before("enqueue", message, delay)
self.emit_before("enqueue", message, delay) # type: ignore[no-untyped-call]
with transaction.atomic(using=self.db_alias):
query = {
@@ -182,7 +185,7 @@ class PostgresBroker(Broker):
message.options["task"] = task
message.options["task_created"] = created
self.emit_after("enqueue", message, delay)
self.emit_after("enqueue", message, delay) # type: ignore[no-untyped-call]
return message
def get_declared_queues(self) -> set[str]:
@@ -190,7 +193,7 @@ class PostgresBroker(Broker):
def flush(self, queue_name: str) -> None:
self.query_set.filter(
queue_name__in=(queue_name, dq_name(queue_name), xq_name(queue_name))
queue_name__in=(queue_name, dq_name(queue_name), xq_name(queue_name)) # type: ignore[no-untyped-call]
).delete()
def flush_all(self) -> None:
@@ -372,7 +375,7 @@ class _PostgresConsumer(Consumer):
self.in_processing.add(str(message_id))
return message
@raise_broker_connection_error
@raise_connection_error
def __next__(self) -> MessageProxy | None:
# This method is called every second
@@ -392,7 +395,7 @@ class _PostgresConsumer(Consumer):
if processing >= self.prefetch:
# If we have too many messages already processing, wait and don't consume a message
# straight away, other workers will be faster.
self.misses, backoff_ms = compute_backoff(self.misses, max_backoff=1000)
self.misses, backoff_ms = compute_backoff(self.misses, max_backoff=1000) # type: ignore[no-untyped-call]
self.logger.debug(
"Too many messages in processing, Sleeping",
processing=processing,
@@ -417,7 +420,7 @@ class _PostgresConsumer(Consumer):
break
message = self._consume_one(str(message_id))
if message is not None:
return MessageProxy(message)
return MessageProxy(message) # type: ignore[no-untyped-call]
else:
self.logger.debug("Message already consumed. Skipping.", message_id=message_id)
continue
@@ -441,7 +444,7 @@ class _PostgresConsumer(Consumer):
self.to_unlock.add(str(message_id))
return False
def _post_process_message(self, message: MessageProxy, state: TaskState) -> None:
def _post_process_message(self, message: Message[Any], state: TaskState) -> None:
self.logger.debug("Post-processing message", message=message.message_id, state=state)
try:
self.in_processing.remove(str(message.message_id))
@@ -463,16 +466,16 @@ class _PostgresConsumer(Consumer):
)
message.options["task"] = task
@raise_broker_connection_error
def ack(self, message: MessageProxy) -> None:
@raise_connection_error
def ack(self, message: Message[Any]) -> None:
self._post_process_message(message, TaskState.DONE)
@raise_broker_connection_error
def nack(self, message: MessageProxy) -> None:
@raise_connection_error
def nack(self, message: Message[Any]) -> None:
self._post_process_message(message, TaskState.REJECTED)
@raise_broker_connection_error
def requeue(self, messages: Iterable[MessageProxy]) -> None:
@raise_connection_error
def requeue(self, messages: Iterable[Message[Any]]) -> None:
self.query_set.filter(
message_id__in=[message.message_id for message in messages],
).update(
@@ -511,7 +514,7 @@ class _PostgresConsumer(Consumer):
self.logger.info("Purged messages in all queues", count=count)
self.task_purge_last_run = timezone.now()
@raise_broker_connection_error
@raise_connection_error
def close(self) -> None:
try:
self._purge_locks()

View File

@@ -5,7 +5,7 @@ from signal import pause
from django_dramatiq_postgres.conf import Conf
def worker_metrics() -> int:
def worker_metrics() -> None:
import_module(Conf().autodiscovery["setup_module"])
from django_dramatiq_postgres.middleware import MetricsMiddleware
@@ -15,4 +15,3 @@ def worker_metrics() -> int:
int(os.getenv("dramatiq_prom_port", "9191")),
)
pause()
return 0

View File

@@ -10,7 +10,7 @@ from typing import TYPE_CHECKING, Any, cast
from django.db import DatabaseError, close_old_connections, connections
from dramatiq.actor import Actor
from dramatiq.broker import Broker, MessageProxy
from dramatiq.broker import Broker
from dramatiq.common import current_millis
from dramatiq.message import Message
from dramatiq.middleware.middleware import Middleware
@@ -79,7 +79,7 @@ class DbConnectionMiddleware(Middleware):
class TaskStateBeforeMiddleware(Middleware):
def before_process_message(self, broker: PostgresBroker, message: Message[Any]) -> None: # type: ignore[override]
def before_process_message(self, broker: PostgresBroker, message: Message[Any]) -> None:
broker.query_set.filter(
message_id=message.message_id,
queue_name=message.queue_name,
@@ -90,7 +90,7 @@ class TaskStateBeforeMiddleware(Middleware):
class TaskStateAfterMiddleware(Middleware):
def before_process_message(self, broker: PostgresBroker, message: MessageProxy) -> None: # type: ignore[override]
def before_process_message(self, broker: PostgresBroker, message: Message[Any]) -> None:
broker.query_set.filter(
message_id=message.message_id,
queue_name=message.queue_name,
@@ -99,7 +99,7 @@ class TaskStateAfterMiddleware(Middleware):
state=TaskState.RUNNING,
)
def after_skip_message(self, broker: PostgresBroker, message: MessageProxy) -> None: # type: ignore[override]
def after_skip_message(self, broker: PostgresBroker, message: Message[Any]) -> None:
broker.query_set.filter(
message_id=message.message_id,
queue_name=message.queue_name,
@@ -110,11 +110,11 @@ class TaskStateAfterMiddleware(Middleware):
def after_process_message(
self,
broker: PostgresBroker, # type: ignore[override]
message: MessageProxy,
broker: PostgresBroker,
message: Message[Any],
*,
result: Any | None = None,
exception: BaseException | None = None,
exception: Exception | None = None,
) -> None:
self.after_skip_message(broker, message)
@@ -147,7 +147,7 @@ class CurrentTask(Middleware):
raise CurrentTaskNotFound()
return task[-1]
def before_process_message(self, broker: Broker, message: MessageProxy) -> None:
def before_process_message(self, broker: Broker, message: Message[Any]) -> None:
tasks = self._TASKS.get()
if tasks is None:
tasks = []
@@ -157,10 +157,10 @@ class CurrentTask(Middleware):
def after_process_message(
self,
broker: Broker,
message: MessageProxy,
message: Message[Any],
*,
result: Any | None = None,
exception: BaseException | None = None,
exception: Exception | None = None,
) -> None:
tasks: list[TaskBase] | None = self._TASKS.get()
if tasks is None or len(tasks) == 0:
@@ -194,7 +194,7 @@ class CurrentTask(Middleware):
pass
self._TASKS.set(tasks[:-1])
def after_skip_message(self, broker: Broker, message: MessageProxy) -> None:
def after_skip_message(self, broker: Broker, message: Message[Any]) -> None:
self.after_process_message(broker, message)
@@ -236,7 +236,7 @@ class MetricsMiddleware(Middleware):
self.message_start_times: dict[str, int] = {}
@property
def forks(self) -> list[Callable[[], int]]:
def forks(self) -> list[Callable[[], None]]:
from django_dramatiq_postgres.forks import worker_metrics
return [worker_metrics]
@@ -310,41 +310,41 @@ class MetricsMiddleware(Middleware):
# TODO: worker_id
multiprocess.mark_process_dead(os.getpid()) # type: ignore[no-untyped-call]
def _make_labels(self, message: MessageProxy | Message[Any]) -> list[str]:
def _make_labels(self, message: Message[Any]) -> list[str]:
return [message.queue_name, message.actor_name]
def after_nack(self, broker: Broker, message: MessageProxy) -> None:
def after_nack(self, broker: Broker, message: Message[Any]) -> None:
self.total_rejected_messages.labels(*self._make_labels(message)).inc()
def after_enqueue(self, broker: Broker, message: Message[Any], delay: int) -> None:
if "retries" in message.options:
self.total_retried_messages.labels(*self._make_labels(message)).inc()
def before_delay_message(self, broker: Broker, message: MessageProxy) -> None:
def before_delay_message(self, broker: Broker, message: Message[Any]) -> None:
self.delayed_messages.add(message.message_id)
self.in_progress_delayed_messages.labels(*self._make_labels(message)).inc()
def before_process_message(self, broker: Broker, message: MessageProxy) -> None:
def before_process_message(self, broker: Broker, message: Message[Any]) -> None:
labels = self._make_labels(message)
if message.message_id in self.delayed_messages:
self.delayed_messages.remove(message.message_id)
self.in_progress_delayed_messages.labels(*labels).dec()
self.in_progress_messages.labels(*labels).inc()
self.message_start_times[message.message_id] = current_millis()
self.message_start_times[message.message_id] = current_millis() # type: ignore[no-untyped-call]
def after_process_message(
self,
broker: Broker,
message: MessageProxy,
message: Message[Any],
*,
result: Any | None = None,
exception: BaseException | None = None,
exception: Exception | None = None,
) -> None:
labels = self._make_labels(message)
message_start_time = self.message_start_times.pop(message.message_id, current_millis())
message_duration = current_millis() - message_start_time
message_start_time = self.message_start_times.pop(message.message_id, current_millis()) # type: ignore[no-untyped-call]
message_duration = current_millis() - message_start_time # type: ignore[no-untyped-call]
self.messages_durations.labels(*labels).observe(message_duration)
self.in_progress_messages.labels(*labels).dec()

View File

@@ -159,7 +159,7 @@ class ScheduleBase(models.Model):
def send(self, broker: Broker | None = None) -> Message[Any]:
broker = broker or get_broker()
actor: Actor[Any, Any] = broker.get_actor(self.actor_name)
actor: Actor[Any, Any] = broker.get_actor(self.actor_name) # type: ignore[no-untyped-call]
return actor.send_with_options(
args=pickle.loads(self.args), # nosec
kwargs=pickle.loads(self.kwargs), # nosec

View File

@@ -36,7 +36,7 @@ dependencies = [
"django >=4.2,<6.0",
"django-pglock >=1.7,<2",
"django-pgtrigger >=4,<5",
"dramatiq >=2,<3",
"dramatiq >=1.17,<1.18",
"tenacity >=9,<10",
"structlog >=25,<26",
]

View File

@@ -9,7 +9,7 @@ dependencies = [
"argon2-cffi==25.1.0",
"cachetools==7.0.6",
"channels==4.3.2",
"cryptography==48.0.0",
"cryptography==47.0.0",
"dacite==1.9.2",
"deepmerge==2.0",
"defusedxml==0.7.1",
@@ -25,7 +25,7 @@ dependencies = [
"django-prometheus==2.4.1",
"django-storages[s3]==1.14.6",
"django-tenants==3.10.1",
"django==5.2.14",
"django==5.2.13",
"djangoql==0.19.1",
"djangorestframework==3.17.1",
"docker==7.1.0",
@@ -46,9 +46,9 @@ dependencies = [
"lxml==6.1.0",
"msgraph-sdk==1.56.0",
"opencontainers==0.0.15",
"packaging==26.2",
"packaging==26.1",
"paramiko==4.0.0",
"psycopg[c,pool]==3.3.4",
"psycopg[c,pool]==3.3.3",
"pydantic-scim==0.0.8",
"pydantic==2.13.3",
"pyjwt==2.11.0",
@@ -66,7 +66,7 @@ dependencies = [
"ua-parser==1.0.2",
"unidecode==1.4.0",
"urllib3<3",
"uvicorn[standard]==0.46.0",
"uvicorn[standard]==0.45.0",
"watchdog==6.0.0",
"webauthn==2.7.1",
"wsproto==1.3.2",
@@ -76,7 +76,7 @@ dependencies = [
[dependency-groups]
dev = [
"aws-cdk-lib==2.251.0",
"aws-cdk-lib==2.250.0",
"bandit==1.9.4",
"black==26.3.1",
"bpython==0.26",

View File

@@ -34608,12 +34608,10 @@ components:
properties:
brand:
type: string
nullable: true
family:
type: string
model:
type: string
nullable: true
required:
- brand
- family
@@ -34626,16 +34624,12 @@ components:
type: string
major:
type: string
nullable: true
minor:
type: string
nullable: true
patch:
type: string
nullable: true
patch_minor:
type: string
nullable: true
required:
- family
- major
@@ -36050,8 +36044,6 @@ components:
path:
type: string
minLength: 1
context:
type: string
Brand:
type: object
description: Brand Serializer
@@ -37191,7 +37183,6 @@ components:
flows_refresh_others:
type: boolean
description: Refresh other tabs after successful authentication.
deprecated: true
required:
- core_default_app_access
- enterprise_audit_include_expanded_diff
@@ -39040,13 +39031,7 @@ components:
EventsRequestedEnum:
enum:
- https://schemas.openid.net/secevent/caep/event-type/session-revoked
- https://schemas.openid.net/secevent/caep/event-type/token-claims-change
- https://schemas.openid.net/secevent/caep/event-type/credential-change
- https://schemas.openid.net/secevent/caep/event-type/assurance-level-change
- https://schemas.openid.net/secevent/caep/event-type/device-compliance-change
- https://schemas.openid.net/secevent/caep/event-type/session-established
- https://schemas.openid.net/secevent/caep/event-type/session-presented
- https://schemas.openid.net/secevent/caep/event-type/risk-level-change
- https://schemas.openid.net/secevent/ssf/event-type/verification
type: string
ExpiringBaseGrantModel:
@@ -51202,7 +51187,6 @@ components:
flows_refresh_others:
type: boolean
description: Refresh other tabs after successful authentication.
deprecated: true
required:
- core_default_app_access
- enterprise_audit_include_expanded_diff
@@ -53643,7 +53627,7 @@ components:
type: string
redirect_uri_type:
allOf:
- $ref: '#/components/schemas/RedirectURITypeEnum'
- $ref: '#/components/schemas/RedirectUriTypeEnum'
default: authorization
required:
- matching_mode
@@ -53659,12 +53643,12 @@ components:
minLength: 1
redirect_uri_type:
allOf:
- $ref: '#/components/schemas/RedirectURITypeEnum'
- $ref: '#/components/schemas/RedirectUriTypeEnum'
default: authorization
required:
- matching_mode
- url
RedirectURITypeEnum:
RedirectUriTypeEnum:
enum:
- authorization
- logout
@@ -55634,7 +55618,6 @@ components:
- enabled
- paused
- disabled
- disabled_deleted
type: string
Schedule:
type: object
@@ -55985,7 +55968,6 @@ components:
flows_refresh_others:
type: boolean
description: Refresh other tabs after successful authentication.
deprecated: true
required:
- core_default_app_access
- enterprise_audit_include_expanded_diff
@@ -56074,7 +56056,6 @@ components:
flows_refresh_others:
type: boolean
description: Refresh other tabs after successful authentication.
deprecated: true
required:
- core_default_app_access
- enterprise_audit_include_expanded_diff

scripts/node/lint-lockfile.mjs Executable file
View File

@@ -0,0 +1,278 @@
#!/usr/bin/env node
/**
* @file Lints the package-lock.json file to ensure it is in sync with package.json.
*
* Usage:
* lint-lockfile [options] [directory]
*
* Options:
* --warn Report issues as warnings instead of failing. The lockfile is
* still regenerated on disk, but the process exits 0.
*
* Exit codes:
* 0 Lockfile is in sync (or --warn was passed)
* 1 Unexpected error
* 2 Lockfile drift detected
*/
/// <reference lib="esnext" />
import * as assert from "node:assert/strict";
import { findPackageJSON } from "node:module";
import { dirname } from "node:path";
import { isDeepStrictEqual, parseArgs } from "node:util";
import { ConsoleLogger } from "../../packages/logger-js/lib/node.js";
import { parseCWD, reportAndExit } from "./utils/commands.mjs";
import { corepack } from "./utils/corepack.mjs";
import { gitStatus } from "./utils/git.mjs";
import { findNPMPackage, loadJSON, npm, pluckDependencyFields } from "./utils/node.mjs";
//#region Constants
const logger = ConsoleLogger.prefix("lint:lockfile");
const { values: options, positionals } = parseArgs({
options: {
"warn": {
type: "boolean",
default: false,
description: "Report issues as warnings instead of failing",
},
"skip-git": {
type: "boolean",
default: !!process.env.CI,
description:
"Skip checking for uncommitted changes (use with --warn to ignore drift without reporting)",
},
},
allowPositionals: true,
});
const cwd = parseCWD(positionals);
const ignoredProperties = new Set([
// ---
"peer",
"engines",
"optional",
]);
//#region Utilities
/**
* @param {Record<string, unknown>} actual
* @param {Record<string, unknown>} expected
* @param {string[]} [prefix]
* @returns {Set<string>[]}
*/
function extractDiffedProperties(actual, expected, prefix = []) {
const a = actual ?? {};
const b = expected ?? {};
const keys = new Set([...Object.keys(a), ...Object.keys(b)]);
/** @type {Set<string>[]} */
const diffs = [];
for (const key of keys) {
const path = [...prefix, key];
const valA = a[key];
const valB = b[key];
if (
valA !== null &&
valB !== null &&
typeof valA === "object" &&
typeof valB === "object" &&
!Array.isArray(valA) &&
!Array.isArray(valB)
) {
// @ts-ignore
diffs.push(...extractDiffedProperties(valA, valB, path));
} else if (!isDeepStrictEqual(valA, valB)) {
diffs.push(new Set(path));
}
}
return diffs;
}
//#endregion
/**
* Exit code when lockfile drift is detected (distinct from general errors)
*/
const EXIT_DRIFT = 2;
/**
* @returns {Promise<string[]>} The list of issues detected.
*/
async function run() {
/** @type {string[]} */
const issues = [];
/**
* Records an issue. In strict mode, throws immediately.
* In warn mode, collects the message for later reporting.
*
* @param {boolean} ok
* @param {string} message
*/
const check = (ok, message) => {
if (ok) return;
if (options.warn) {
issues.push(message);
return;
}
assert.fail(message);
};
/**
* Checks deep equality of two values. In strict mode, throws if they are not equal.
* In warn mode, records an issue instead.
*
* @param {unknown} actual
* @param {unknown} expected
* @param {string} message
*/
const checkDeep = (actual, expected, message) => {
if (options.warn) {
if (!isDeepStrictEqual(actual, expected)) {
issues.push(message);
}
return;
}
assert.deepStrictEqual(actual, expected, message);
};
logger.info(`Linting lockfile integrity in: ${cwd}`);
// MARK: Locate files
const resolvedPath = import.meta.resolve(cwd);
const packageJSONPath = findPackageJSON(resolvedPath);
assert.ok(
packageJSONPath,
"Could not find package.json in the current directory or any parent directories",
);
const packageDir = dirname(packageJSONPath);
const { packageLockPath } = await findNPMPackage(packageDir);
const lockfileDir = dirname(packageLockPath);
const isWorkspace = lockfileDir !== packageDir;
const corepackVersion = await corepack`--version`().catch(() => null);
const useCorepack = !!corepackVersion;
logger.info(`corepack: ${corepackVersion || "disabled"}`);
const expected = {
lockfile: await loadJSON(packageLockPath),
package: await loadJSON(packageJSONPath).then(pluckDependencyFields),
};
logger.info(`package.json: ${packageJSONPath} (${expected.package.name})`);
logger.info(`package-lock.json: ${packageLockPath}${isWorkspace ? " (workspace root)" : ""}`);
// MARK: Uncommitted changes
if (options["skip-git"]) {
logger.warn("Skipping git status check");
} else {
const packageStatus = await gitStatus(packageJSONPath);
const lockfileStatus = await gitStatus(packageLockPath);
if (!packageStatus.available || !lockfileStatus.available) {
logger.warn("Git is not available; skipping uncommitted change detection.");
} else {
check(packageStatus.clean, `package.json has uncommitted changes: ${packageJSONPath}`);
check(
lockfileStatus.clean,
`package-lock.json has uncommitted changes: ${packageLockPath}`,
);
}
}
// MARK: Regenerate
const npmVersion = await npm`--version`({ useCorepack });
logger.info(`Detected npm version: ${npmVersion}`);
await npm`install --package-lock-only`({
cwd: lockfileDir,
useCorepack,
});
logger.info("npm install complete.");
const actual = {
lockfile: await loadJSON(packageLockPath),
package: await loadJSON(packageJSONPath).then(pluckDependencyFields),
};
// MARK: Compare
assert.deepStrictEqual(
actual.package,
expected.package,
`package.json was unexpectedly modified during lockfile check: ${packageJSONPath}`,
);
try {
checkDeep(
actual.lockfile,
expected.lockfile,
`package-lock.json is out of sync with package.json`,
);
} catch (error) {
if (!(error instanceof assert.AssertionError)) {
throw error;
}
// npm versions <=11.10 have issues with deterministic lockfile generation,
// especially around optional peer dependencies.
const diffedProperties = extractDiffedProperties(actual.lockfile, expected.lockfile).filter(
(segments) => segments.isDisjointFrom(ignoredProperties),
);
if (diffedProperties.length) {
const formatted = diffedProperties
.map((segments) => Array.from(segments).join("."))
.join("\n");
throw new Error(`Lockfile drift detected:\n${formatted}`, { cause: error });
}
logger.warn(
"Permissible dependency differences detected. Run `npm install` to update the lockfile.",
);
}
return issues;
}
run()
.then((issues) => {
if (issues.length) {
logger.warn(`⚠️ ${issues.length} issue(s) detected:`);
for (const issue of issues) {
logger.warn(` - ${issue}`);
}
if (options.warn) {
logger.warn(
"The lockfile on disk has been regenerated. Review and commit the changes.",
);
process.exit(EXIT_DRIFT);
}
} else {
logger.info("✅ Lockfile is in sync.");
}
})
.catch((error) => reportAndExit(error, logger));

scripts/node/lint-runtime.mjs Executable file (114 lines)

@@ -0,0 +1,114 @@
#!/usr/bin/env node
/**
* @file Lints the installed Node.js and npm versions against the requirements specified in package.json.
*
* Usage:
* lint-runtime [options] [directory]
*
* Exit codes:
* 0 Versions are in sync
* 1 Version mismatch detected
*/
import * as assert from "node:assert/strict";
import { parseArgs } from "node:util";
import { ConsoleLogger } from "../../packages/logger-js/lib/node.js";
import { CommandError, parseCWD, reportAndExit } from "./utils/commands.mjs";
import { corepack } from "./utils/corepack.mjs";
import { resolveRepoRoot } from "./utils/git.mjs";
import { compareVersions, findNPMPackage, loadJSON, node, npm, parseRange } from "./utils/node.mjs";
const logger = ConsoleLogger.prefix("lint-runtime");
/**
* @param {string} start
*/
async function readRequirements(start) {
const { packageJSONPath } = await findNPMPackage(start);
logger.info(`Checking versions in ${packageJSONPath}`);
const packageJSONData = await loadJSON(packageJSONPath);
const nodeVersion = await node`--version`().then((output) => output.replace(/^v/, ""));
const requiredNpmVersion = packageJSONData.engines?.npm;
const requiredNodeVersion = packageJSONData.engines?.node;
return { nodeVersion, requiredNpmVersion, requiredNodeVersion };
}
async function main() {
const parsedArgs = parseArgs({
allowPositionals: true,
});
const cwd = parseCWD(parsedArgs.positionals);
const repoRoot = await resolveRepoRoot(cwd).catch(() => null);
logger.info(`cwd ${cwd}`);
logger.info(`repository ${repoRoot || "not found"}`);
const corepackVersion = await corepack`--version`().catch(() => null);
const useCorepack = !!corepackVersion;
logger.info(`corepack ${corepackVersion || "disabled"}`);
const npmVersion = await npm`--version`({ cwd, useCorepack })
.then((version) => {
logger.info(`npm${corepackVersion ? " (via Corepack)" : ""} ${version}`);
return version;
})
.catch((error) => {
if (error instanceof CommandError && corepackVersion) {
logger.warn(`Failed to read npm version via Corepack ${error.message}`);
logger.info(`Attempting to read npm version directly without Corepack...`);
// Corepack might be misconfigured or outdated.
// Attempting a second read without Corepack can help us distinguish
// between a general npm issue and a Corepack-specific one.
return npm`--version`({ cwd }).then((version) => {
logger.info(`npm (direct) ${version}`);
return version;
});
}
throw error;
});
const { nodeVersion, requiredNpmVersion, requiredNodeVersion } = await readRequirements(cwd);
logger.info(`node ${nodeVersion}`);
if (requiredNpmVersion) {
logger.info(`package.json npm ${requiredNpmVersion}`);
const { operator, version: required } = parseRange(requiredNpmVersion);
const result = compareVersions(npmVersion, required);
assert.ok(
operator === ">=" ? result >= 0 : result === 0,
`npm version ${npmVersion} does not satisfy required version ${requiredNpmVersion}`,
);
}
if (requiredNodeVersion) {
logger.info(`package.json node ${requiredNodeVersion}`);
const { operator, version: required } = parseRange(requiredNodeVersion);
const result = compareVersions(nodeVersion, required);
assert.ok(
operator === ">=" ? result >= 0 : result === 0,
`Node.js version ${nodeVersion} does not satisfy required version ${requiredNodeVersion}`,
);
}
}
main()
.then(() => {
logger.info("✅ Node.js and npm versions are in sync.");
})
.catch((error) => reportAndExit(error, logger));

scripts/node/setup-corepack.mjs Executable file (94 lines)

@@ -0,0 +1,94 @@
#!/usr/bin/env node
/**
* @file Downloads the latest corepack tarball from the npm registry.
*/
import * as fs from "node:fs/promises";
import { parseArgs } from "node:util";
import { ConsoleLogger } from "../../packages/logger-js/lib/node.js";
import { $, parseCWD, reportAndExit } from "./utils/commands.mjs";
import { corepack, pullLatestCorepack } from "./utils/corepack.mjs";
import { resolveRepoRoot } from "./utils/git.mjs";
import { findNPMPackage, loadJSON, npm } from "./utils/node.mjs";
const FALLBACK_NPM_VERSION = "11.11.0";
const logger = ConsoleLogger.prefix("setup-corepack");
async function main() {
const parsedArgs = parseArgs({
options: {
force: {
type: "boolean",
default: false,
description: "Force re-download of corepack even if a version is already installed",
},
},
allowPositionals: true,
});
const cwdArg = parseCWD(parsedArgs.positionals);
const repoRoot = await resolveRepoRoot(cwdArg).catch(() => null);
const cwd = repoRoot || cwdArg;
const npmVersion = await npm`--version`({ cwd });
logger.info(`npm ${npmVersion}`);
const corepackVersion = await corepack`--version`({ cwd }).catch(() => null);
logger.info(`corepack ${corepackVersion || "not found"}`);
if (corepackVersion && !parsedArgs.values.force) {
logger.info("Corepack is already installed, skipping download (use --force to override)");
return;
}
await pullLatestCorepack(cwd);
await npm`install --force -g corepack@latest`({ cwd });
logger.info("Corepack installed successfully");
const { packageJSONPath } = await findNPMPackage(cwd);
logger.info(`Checking versions in ${packageJSONPath}`);
const packageJSONData = await loadJSON(packageJSONPath);
const packageManager = packageJSONData.packageManager || `npm@${FALLBACK_NPM_VERSION}`;
logger.info(`Setting up Corepack to use ${packageManager}...`);
await $`corepack install -g ${packageManager}`({ cwd });
const writablePackageJSON = await fs.access(packageJSONPath, fs.constants.W_OK).then(
() => true,
() => false,
);
/**
* @type {string}
*/
let subcommand;
if (!writablePackageJSON) {
if (!packageJSONData.packageManager) {
throw new Error(
`package.json is not writable and does not specify a packageManager field. Was the package.json file mounted via Docker?`,
);
}
subcommand = "install -g";
} else {
logger.info("package.json is writable");
subcommand = "use";
}
await $`corepack ${subcommand} ${packageManager}`({ cwd });
logger.info("Corepack installed npm successfully");
}
main().catch((error) => reportAndExit(error, logger));


@@ -0,0 +1,116 @@
/**
* Utility functions for running shell commands and handling their results.
*
* @import { ExecOptions } from "node:child_process"
*/
import { exec } from "node:child_process";
import { resolve, sep } from "node:path";
import { promisify } from "node:util";
import { ConsoleLogger } from "../../../packages/logger-js/lib/node.js";
const logger = ConsoleLogger.prefix("commands");
export class CommandError extends Error {
name = "CommandError";
/**
* @param {string} command
* @param {ErrorOptions & ExecOptions} [options]
*/
constructor(command, { cause, cwd, shell } = {}) {
const cwdInfo = cwd ? ` in directory ${cwd}` : "";
const shellInfo = shell ? ` using shell ${shell}` : "";
super(`Command failed: ${command}${cwdInfo}${shellInfo}`, { cause });
}
}
/**
* @param {string[]} positionals
* @returns {string} The resolved current working directory for the script
*/
export function parseCWD(positionals) {
// `INIT_CWD` is present only if the script is run via npm.
const initCWD = process.env.INIT_CWD || process.cwd();
const cwd = (positionals.length ? resolve(initCWD, positionals[0]) : initCWD) + sep;
return cwd;
}
const execAsync = promisify(exec);
/**
* @param {Awaited<ReturnType<typeof execAsync>>} result
*/
export const trimResult = (result) => String(result.stdout).trim();
/**
* @typedef {(strings: TemplateStringsArray, ...expressions: unknown[]) =>
* (options?: ExecOptions) => Promise<string>
* } CommandTag
*/
function createTag(prefix = "") {
/** @type {CommandTag} */
return (strings, ...expressions) => {
const command = (prefix ? prefix + " " : "") + String.raw(strings, ...expressions);
logger.debug(command);
return (options) =>
execAsync(command, options)
.then(trimResult)
.catch((cause) => {
throw new CommandError(command, { ...options, cause });
});
};
}
/**
* A tagged template function for running shell commands.
* @type {CommandTag & { bind(prefix: string): CommandTag }}
*/
export const $ = createTag();
/**
* @param {string} prefix
* @returns {CommandTag}
*/
$.bind = (prefix) => createTag(prefix);
/**
* Promisified version of {@linkcode exec} for easier async/await usage.
*
* @param {string} command The command to run, with space-separated arguments.
* @param {ExecOptions} [options] Optional execution options.
* @throws {CommandError} If the command fails to execute.
*/
export function $2(command, options) {
return execAsync(command, options)
.then(trimResult)
.catch((cause) => {
throw new CommandError(command, { ...options, cause });
});
}
/**
* Logs the given error and its cause (if any) and exits the process with a failure code.
* @param {unknown} error
* @param {typeof ConsoleLogger} logger
* @returns {never}
*/
export function reportAndExit(error, logger = ConsoleLogger) {
const message = error instanceof Error ? error.message : String(error);
const cause = error instanceof Error && error.cause instanceof Error ? error.cause : null;
logger.error(`${message}`);
if (cause) {
logger.error(`Caused by: ${cause.message}`);
}
process.exit(1);
}


@@ -0,0 +1,84 @@
import * as crypto from "node:crypto";
import * as fs from "node:fs/promises";
import { join, relative } from "node:path";
import { ConsoleLogger } from "../../../packages/logger-js/lib/node.js";
import { $ } from "./commands.mjs";
const REGISTRY_URL = "https://registry.npmjs.org/corepack";
const OUTPUT_DIR = join(".corepack", "releases");
const OUTPUT_FILENAME = "latest.tgz";
export const corepack = $.bind("corepack");
/**
* Reads the installed Corepack version.
*
* @param {string} [cwd] The directory to run the command in.
* @returns {Promise<string | null>} The installed Corepack version
*/
export function readCorepackVersion(cwd = process.cwd()) {
return $`corepack --version`({ cwd });
}
const logger = ConsoleLogger.prefix("setup-corepack");
/**
* @param {string} baseDirectory
*/
export async function pullLatestCorepack(baseDirectory = process.cwd()) {
logger.info("Fetching corepack metadata from registry...");
const outputDir = join(baseDirectory, OUTPUT_DIR);
const outputPath = join(outputDir, OUTPUT_FILENAME);
const res = await fetch(REGISTRY_URL, { signal: AbortSignal.timeout(1000 * 60) });
if (!res.ok) {
throw new Error(`Failed to fetch registry metadata: ${res.status} ${res.statusText}`);
}
const metadata = await res.json();
const latestVersion = metadata["dist-tags"].latest;
const versionData = metadata.versions[latestVersion];
const tarballUrl = versionData.dist.tarball;
const expectedIntegrity = versionData.dist.integrity;
logger.info(`Latest corepack version: ${latestVersion}`);
logger.info(`Tarball URL: ${tarballUrl}`);
logger.info(`Expected integrity: ${expectedIntegrity}`);
logger.info("Downloading tarball...");
const tarballRes = await fetch(tarballUrl, {
signal: AbortSignal.timeout(1000 * 60),
});
if (!tarballRes.ok) {
throw new Error(
`Failed to download tarball: ${tarballRes.status} ${tarballRes.statusText}`,
);
}
const tarballBuffer = Buffer.from(await tarballRes.arrayBuffer());
logger.info("Verifying integrity...");
const [algorithm, expectedHash] = expectedIntegrity.split("-");
const actualHash = crypto.createHash(algorithm).update(tarballBuffer).digest("base64");
if (actualHash !== expectedHash) {
throw new Error(
`Integrity mismatch!\n Expected: ${expectedHash}\n Actual: ${actualHash}`,
);
}
logger.info("Integrity verified.");
await fs.mkdir(outputDir, { recursive: true });
await fs.writeFile(outputPath, tarballBuffer);
logger.info(`Saved to ${relative(baseDirectory, outputPath)}`);
logger.info(`corepack@${latestVersion} (${expectedIntegrity})`);
}


@@ -0,0 +1,25 @@
import { $ } from "./commands.mjs";
/**
* Checks whether the given file has uncommitted changes in git.
*
* @param {string} filePath
* @param {string} [cwd]
* @returns {Promise<{ clean: boolean, available: boolean }>}
*/
export async function gitStatus(filePath, cwd = process.cwd()) {
return $`git status --porcelain ${filePath}`({ cwd })
.then((output) => ({ clean: !output, available: true }))
.catch(() => ({ clean: false, available: false }));
}
/**
* Finds the root directory of the git repository containing the given directory.
*
* @param {string} cwd
* @returns {Promise<string>} The path to the git repository root.
* @throws {Error} If the command fails (e.g., not a git repository).
*/
export function resolveRepoRoot(cwd = process.cwd()) {
return $`git rev-parse --show-toplevel`({ cwd });
}

scripts/node/utils/node.mjs Normal file (175 lines)

@@ -0,0 +1,175 @@
/**
* Utility functions for working with npm packages and versions.
*
* @import { ExecOptions } from "node:child_process"
*/
import * as fs from "node:fs/promises";
import { dirname, join } from "node:path";
import { $ } from "./commands.mjs";
/**
* Find the nearest directory containing both package.json and package-lock.json,
* starting from the given directory and walking upward.
*
* @param {string} start The directory to start searching from.
* @returns {Promise<{ packageJSONPath: string, packageLockPath: string }>}
* @throws {Error} If no co-located package.json and package-lock.json are found.
*/
export async function findNPMPackage(start) {
let currentDir = start;
while (currentDir !== dirname(currentDir)) {
const packageJSONPath = join(currentDir, "package.json");
const packageLockPath = join(currentDir, "package-lock.json");
try {
await Promise.all([fs.access(packageJSONPath), fs.access(packageLockPath)]);
return {
packageJSONPath,
packageLockPath,
};
} catch {
// Continue searching up the directory tree
}
currentDir = dirname(currentDir);
}
throw new Error(`No co-located package.json and package-lock.json found above ${start}`);
}
/**
* @typedef {object} PackageJSON
* @property {string} name
* @property {string} version
* @property {Record<string, string>} [dependencies]
* @property {Record<string, string>} [devDependencies]
* @property {Record<string, string>} [peerDependencies]
* @property {Record<string, string>} [optionalDependencies]
* @property {Record<string, { optional?: boolean }>} [peerDependenciesMeta]
* @property {Record<string, string>} [engines]
* @property {Record<string, unknown>} [devEngines]
* @property {string} [packageManager]
*/
/**
* @param {string} jsonPath
* @returns {Promise<PackageJSON>}
*/
export function loadJSON(jsonPath) {
return fs
.readFile(jsonPath, "utf-8")
.then(JSON.parse)
.catch((cause) => {
throw new Error(`Failed to load JSON file at ${jsonPath}`, { cause });
});
}
const PackageJSONComparisonFields = /** @type {const} */ ([
"name",
"dependencies",
"devDependencies",
"optionalDependencies",
"peerDependencies",
"peerDependenciesMeta",
]);
/**
* @typedef {typeof PackageJSONComparisonFields[number]} PackageJSONComparisonField
*/
/**
* Extracts only the dependency fields from a package.json object for comparison purposes.
*
* @param {PackageJSON} data
* @returns {Pick<PackageJSON, PackageJSONComparisonField>}
*/
export function pluckDependencyFields(data) {
/**
* @type {Record<string, unknown>}
*/
const result = {};
for (const field of PackageJSONComparisonFields) {
if (data[field]) {
result[field] = data[field];
}
}
return /** @type {Pick<PackageJSON, PackageJSONComparisonField>} */ (result);
}
//#region Versioning
/**
* Compares two semantic version strings (e.g., "14.17.0").
*
* @param {string} a The first version string.
* @param {string} b The second version string.
* @returns {number} 1 if a > b, -1 if a < b, or 0 if they are equal.
*/
export function compareVersions(a, b) {
const pa = a.split(".").map(Number);
const pb = b.split(".").map(Number);
for (let i = 0; i < 3; i++) {
if (pa[i] > pb[i]) return 1;
if (pa[i] < pb[i]) return -1;
}
return 0;
}
/**
* Runs a Node.js command and returns its stdout output as a string.
*
* @type {import("./commands.mjs").CommandTag}
*/
export const node = $.bind("node");
/**
* @typedef {object} NPMCommandOptions
* @property {boolean} [useCorepack] Whether to prefix the command with "corepack " to use Corepack's shims.
*/
/**
* Runs an npm command and returns its stdout output as a string.
*
* @param {TemplateStringsArray} strings
* @param {...unknown} expressions
* @returns {(options?: ExecOptions & NPMCommandOptions) => Promise<string>}
*/
export function npm(strings, ...expressions) {
const subcommand = String.raw(strings, ...expressions);
return ({ useCorepack, ...options } = {}) => {
const command = [useCorepack ? "corepack" : "", "npm", subcommand]
.filter(Boolean)
.join(" ");
return $`${command}`(options);
};
}
/**
* Parses a version range string, stripping any leading >= and normalizing to three parts.
* @param {string} range
* @returns {{ operator: ">=" | "=", version: string }}
*/
export function parseRange(range) {
const hasGte = range.startsWith(">=");
const raw = hasGte ? range.slice(2) : range;
const parts = raw.split(".").map(Number);
while (parts.length < 3) parts.push(0);
return {
operator: hasGte ? ">=" : "=",
version: parts.join("."),
};
}
//#endregion
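`parseRange` and `compareVersions` together cover the only two range shapes the engines check needs (`>=X.Y.Z` and exact). A sketch of the gate as lint-runtime applies it, where `satisfies` is a hypothetical wrapper around the two functions above, not part of the script:

```javascript
// Strip a leading ">=", pad the version to three parts, compare component-wise.
function parseRange(range) {
  const hasGte = range.startsWith(">=");
  const raw = hasGte ? range.slice(2) : range;
  const parts = raw.split(".").map(Number);
  while (parts.length < 3) parts.push(0);
  return { operator: hasGte ? ">=" : "=", version: parts.join(".") };
}

function compareVersions(a, b) {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] > pb[i]) return 1;
    if (pa[i] < pb[i]) return -1;
  }
  return 0;
}

function satisfies(installed, range) {
  const { operator, version } = parseRange(range);
  const result = compareVersions(installed, version);
  return operator === ">=" ? result >= 0 : result === 0;
}

console.log(satisfies("22.5.1", ">=22")); // true: 22.5.1 >= 22.0.0
console.log(satisfies("20.19.0", ">=22")); // false
```

Padding `">=22"` to `22.0.0` is what lets the three-component loop handle shorthand ranges without a full semver parser.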


@@ -23,23 +23,10 @@ struct Cli {
#[derive(Debug, FromArgs, PartialEq)]
#[argh(subcommand)]
enum Command {
#[cfg(feature = "core")]
AllInOne(AllInOne),
#[cfg(feature = "core")]
Server(server::Cli),
#[cfg(feature = "core")]
Worker(worker::Cli),
}
#[derive(Debug, FromArgs, PartialEq)]
/// Run both the authentik server and worker.
#[argh(subcommand, name = "allinone")]
#[expect(
clippy::empty_structs_with_brackets,
reason = "argh doesn't support unit structs"
)]
pub(crate) struct AllInOne {}
fn main() -> Result<()> {
let tracing_crude = ak_tracing::install_crude();
info!(version = authentik_full_version(), "authentik is starting");
@@ -47,10 +34,6 @@ fn main() -> Result<()> {
let cli: Cli = argh::from_env();
match &cli.command {
#[cfg(feature = "core")]
Command::AllInOne(_) => Mode::set(Mode::AllInOne)?,
#[cfg(feature = "core")]
Command::Server(_) => Mode::set(Mode::Server)?,
#[cfg(feature = "core")]
Command::Worker(_) => Mode::set(Mode::Worker)?,
}
@@ -93,16 +76,6 @@ fn main() -> Result<()> {
}
match cli.command {
#[cfg(feature = "core")]
Command::AllInOne(_) => {
server::start(server::Cli::default(), &mut tasks).await?;
let workers = worker::start(worker::Cli::default(), &mut tasks)?;
metrics.workers.store(Some(workers));
}
#[cfg(feature = "core")]
Command::Server(args) => {
server::start(args, &mut tasks).await?;
}
#[cfg(feature = "core")]
Command::Worker(args) => {
let workers = worker::start(args, &mut tasks)?;


@@ -2,7 +2,6 @@ use std::{env::temp_dir, os::unix, path::PathBuf, sync::Arc};
use ak_axum::{router::wrap_router, server};
use ak_common::{
Mode,
arbiter::{Arbiter, Tasks},
config,
};
@@ -78,20 +77,25 @@ pub(crate) fn start(tasks: &mut Tasks) -> Result<Arc<Metrics>> {
.name(&format!("{}::run_upkeep", module_path!()))
.spawn(run_upkeep(arbiter, Arc::clone(&metrics)))?;
// Only run HTTP server in worker mode, in server or allinone mode, they're handled by the
// server.
if Mode::get() == Mode::Worker {
for addr in config::get().listen.metrics.iter().copied() {
server::start_plain(tasks, "metrics", router.clone(), addr)?;
}
server::start_unix(
for addr in config::get().listen.metrics.iter().copied() {
server::start_plain(
tasks,
"metrics",
router,
unix::net::SocketAddr::from_pathname(socket_path())?,
router.clone(),
addr,
config::get().debug, /* Allow failure in case the server is running on the same
* machine, like in dev */
)?;
}
server::start_unix(
tasks,
"metrics",
router,
unix::net::SocketAddr::from_pathname(socket_path())?,
config::get().debug, /* Allow failure in case the server is running on the same machine,
* like in dev */
)?;
Ok(metrics)
}


@@ -1,124 +1,5 @@
use std::{env::temp_dir, path::PathBuf, process::Stdio, sync::Arc};
use ak_common::{Arbiter, Tasks, config};
use argh::FromArgs;
use eyre::{Result, eyre};
use nix::{
sys::signal::{Signal, kill},
unistd::Pid,
};
use tokio::{
process::{Child, Command},
sync::Mutex,
time::{Duration, sleep, timeout},
};
use tracing::{info, warn};
#[derive(Debug, Default, FromArgs, PartialEq, Eq)]
/// Run the authentik server.
#[argh(subcommand, name = "server")]
#[expect(
clippy::empty_structs_with_brackets,
reason = "argh doesn't support unit structs"
)]
pub(crate) struct Cli {}
use std::{env::temp_dir, path::PathBuf};
pub(crate) fn socket_path() -> PathBuf {
temp_dir().join("authentik.sock")
}
pub(crate) struct Server {
server: Mutex<Child>,
}
impl Server {
async fn new() -> Result<Self> {
info!("starting server");
let server = if config::get().debug && which::which("authentik-server").is_err() {
let build_status = Command::new("go")
.args(["build", "-o", "server", "./cmd/server"])
.stdin(Stdio::null())
.status()
.await?;
if !build_status.success() {
return Err(eyre!("golang server failed to compile"));
}
Command::new("./server")
.kill_on_drop(true)
.stdin(Stdio::null())
.stdout(Stdio::inherit())
.stderr(Stdio::inherit())
.spawn()?
} else {
Command::new("authentik-server")
.kill_on_drop(true)
.stdin(Stdio::null())
.stdout(Stdio::inherit())
.stderr(Stdio::inherit())
.spawn()?
};
Ok(Self {
server: Mutex::new(server),
})
}
async fn shutdown(&self) -> Result<()> {
info!("shutting down server");
let mut server = self.server.lock().await;
if let Some(id) = server.id() {
kill(Pid::from_raw(id.cast_signed()), Signal::SIGINT)?;
}
timeout(Duration::from_secs(1), server.wait()).await??;
Ok(())
}
async fn is_alive(&self) -> bool {
let try_wait = self.server.lock().await.try_wait();
match try_wait {
Ok(Some(code)) => {
warn!(?code, "server has exited");
false
}
Ok(None) => true,
Err(err) => {
warn!(
?err,
"failed to check the status of server process, ignoring"
);
true
}
}
}
}
async fn watch_server(arbiter: Arbiter, server: Arc<Server>) -> Result<()> {
info!("starting server watcher");
loop {
tokio::select! {
() = sleep(Duration::from_secs(5)) => {
if !server.is_alive().await {
return Err(eyre!("server has exited unexpectedly"));
}
}
() = arbiter.shutdown() => {
server.shutdown().await?;
return Ok(());
}
}
}
}
pub(crate) async fn start(_cli: Cli, tasks: &mut Tasks) -> Result<Arc<Server>> {
let arbiter = tasks.arbiter();
let server = Arc::new(Server::new().await?);
tasks
.build_task()
.name(&format!("{}::watch_server", module_path!()))
.spawn(watch_server(arbiter.clone(), Arc::clone(&server)))?;
Ok(server)
}


@@ -54,7 +54,6 @@ const INITIAL_WORKER_ID: usize = 1000;
static INITIAL_WORKER_READY: AtomicBool = AtomicBool::new(false);
pub(crate) struct Worker {
worker_id: usize,
worker: Child,
client: Client<UnixSocketConnector<PathBuf>, Body>,
socket_path: PathBuf,
@@ -76,7 +75,6 @@ impl Worker {
.build(UnixSocketConnector::new(socket_path.clone()));
Ok(Self {
worker_id,
worker: cmd
.kill_on_drop(true)
.stdin(Stdio::null())
@@ -110,7 +108,7 @@ impl Worker {
self.shutdown(Signal::SIGINT).await
}
#[instrument(skip(self), fields(worker_id = self.worker_id))]
#[instrument(skip_all)]
fn is_alive(&mut self) -> bool {
let try_wait = self.worker.try_wait();
match try_wait {
@@ -135,52 +133,34 @@ impl Worker {
result.is_ok()
}
#[instrument(skip(self), fields(worker_id = self.worker_id))]
#[instrument(skip_all)]
async fn health_live(&self) -> Result<bool> {
trace!("sending health live request to worker");
let req = Request::builder()
.method("GET")
.uri("http://localhost:8000/-/health/live/")
.header(HOST, "localhost")
.body(Body::from(""))?;
Ok(self
.client
.request(req)
.await
.inspect_err(|err| warn!(?err, "failed to send health live request to worker"))?
.status()
.is_success())
Ok(self.client.request(req).await?.status().is_success())
}
#[instrument(skip(self), fields(worker_id = self.worker_id))]
#[instrument(skip_all)]
async fn health_ready(&self) -> Result<bool> {
trace!("sending health ready request to worker");
let req = Request::builder()
.method("GET")
.uri("http://localhost:8000/-/health/ready/")
.header(HOST, "localhost")
.body(Body::from(""))?;
Ok(self
.client
.request(req)
.await
.inspect_err(|err| warn!(?err, "failed to send health ready request to worker"))?
.status()
.is_success())
Ok(self.client.request(req).await?.status().is_success())
}
#[instrument(skip(self), fields(worker_id = self.worker_id))]
#[instrument(skip_all)]
async fn notify_metrics(&self) -> Result<()> {
trace!("sending metrics request to worker");
let req = Request::builder()
.method("GET")
.uri("http://localhost:8000/-/metrics/")
.header(HOST, "localhost")
.body(Body::from(""))?;
self.client
.request(req)
.await
.inspect_err(|err| warn!(?err, "failed to send metrics request to worker"))?;
self.client.request(req).await?;
Ok(())
}
}
@@ -343,7 +323,14 @@ pub(crate) fn start(_cli: Cli, tasks: &mut Tasks) -> Result<Arc<Workers>> {
let router = healthcheck::build_router(Arc::clone(&workers));
for addr in config::get().listen.http.iter().copied() {
ak_axum::server::start_plain(tasks, "worker", router.clone(), addr)?;
ak_axum::server::start_plain(
tasks,
"worker",
router.clone(),
addr,
config::get().debug, /* Allow failure in case the server is running on the same
* machine, like in dev. */
)?;
}
ak_axum::server::start_unix(
@@ -351,6 +338,8 @@ pub(crate) fn start(_cli: Cli, tasks: &mut Tasks) -> Result<Arc<Workers>> {
"worker",
router,
unix::net::SocketAddr::from_pathname(socket_path())?,
config::get().debug, /* Allow failure in case the server is running on the same
* machine, like in dev. */
)?;
}


@@ -111,12 +111,8 @@ class TestFlowsEnroll(SeleniumTestCase):
identification_stage = self.get_shadow_root("ak-stage-identification", flow_executor)
wait = WebDriverWait(identification_stage, self.wait_timeout)
wait.until(
ec.presence_of_element_located((By.CSS_SELECTOR, "a[data-ouia-component-id='enroll']"))
)
identification_stage.find_element(
By.CSS_SELECTOR, "a[data-ouia-component-id='enroll']"
).click()
wait.until(ec.presence_of_element_located((By.CSS_SELECTOR, "a[ouiaId='enroll']")))
identification_stage.find_element(By.CSS_SELECTOR, "a[ouiaId='enroll']").click()
# First prompt stage
flow_executor = self.get_shadow_root("ak-flow-executor")


@@ -27,14 +27,8 @@ class TestFlowsRecovery(SeleniumTestCase):
identification_stage = self.get_shadow_root("ak-stage-identification", flow_executor)
wait = WebDriverWait(identification_stage, self.wait_timeout)
wait.until(
ec.presence_of_element_located(
(By.CSS_SELECTOR, "a[data-ouia-component-id='recovery']")
)
)
identification_stage.find_element(
By.CSS_SELECTOR, "a[data-ouia-component-id='recovery']"
).click()
wait.until(ec.presence_of_element_located((By.CSS_SELECTOR, "a[ouiaId='recovery']")))
identification_stage.find_element(By.CSS_SELECTOR, "a[ouiaId='recovery']").click()
# First prompt stage
flow_executor = self.get_shadow_root("ak-flow-executor")


@@ -1,16 +1,10 @@
-from os import unlink, write
 from sys import stderr
-from tempfile import mkstemp
-from urllib.parse import urlencode
 from channels.testing import ChannelsLiveServerTestCase
 from django.apps import apps
 from django.contrib.staticfiles.testing import StaticLiveServerTestCase
 from django.urls import reverse
-from docker.types import Healthcheck
 from dramatiq import get_broker
 from structlog.stdlib import get_logger
-from yaml import safe_dump
 from authentik.core.apps import Setup
 from authentik.core.models import User
@@ -53,84 +47,6 @@ class E2ETestMixin(DockerTestCase):
         print("::endgroup::", file=stderr)
         super().tearDown()
-    def url(self, view: str, query: dict | None = None, **kwargs) -> str:
-        """reverse `view` with `**kwargs` into full URL using live_server_url"""
-        url = self.live_server_url + reverse(view, kwargs=kwargs)
-        if query:
-            return url + "?" + urlencode(query)
-        return url
-class SSLLiveMixin(DockerTestCase):
-    """Mixin to provide an SSL-enabled webserver for integration/e2e tests that require it.
-    Overrides `live_server_url` and as such other all usual helper functions will return an HTTPS
-    URL. Certificate is self-signed and random on each run."""
-    def setUp(self):
-        super().setUp()
-        self._setup_traefik()
-    def tearDown(self):
-        super().tearDown()
-        unlink(self._traefik_config)
-    @property
-    def live_server_url(self):
-        return f"https://{self.host}:{self._traefik_port}"
-    def _setup_traefik(self):
-        config = {
-            "http": {
-                "routers": {
-                    "authentik": {
-                        "rule": "PathPrefix(`/`)",
-                        "entryPoints": ["websecure"],
-                        "service": "authentik",
-                        "tls": {},
-                    }
-                },
-                "services": {
-                    "authentik": {"loadBalancer": {"servers": [{"url": super().live_server_url}]}}
-                },
-            }
-        }
-        fd, self._traefik_config = mkstemp()
-        write(fd, safe_dump(config).encode())
-        traefik = self.run_container(
-            image="docker.io/library/traefik:3.1",
-            command=[
-                "--providers.file.filename=/etc/traefik/dynamic.yml",
-                "--providers.file.watch=true",
-                "--entrypoints.websecure.address=:9443",
-                "--log.level=DEBUG",
-                "--api=true",
-                "--api.dashboard=true",
-                "--api.insecure=true",
-                "--ping=true",
-            ],
-            healthcheck=Healthcheck(
-                test=["CMD", "traefik", "healthcheck", "--ping"],
-                interval=5 * 1_000 * 1_000_000,
-                start_period=1 * 1_000 * 1_000_000,
-            ),
-            ports={
-                "9443": None,
-            },
-            volumes={
-                self._traefik_config: {
-                    "bind": "/etc/traefik/dynamic.yml",
-                }
-            },
-        )
-        # {
-        #     "8443/tcp": [
-        #         {"HostIp": "0.0.0.0", "HostPort": "8443"},
-        #         {"HostIp": "::", "HostPort": "8443"},
-        #     ],
-        # }
-        self._traefik_port = traefik.ports["9443/tcp"][0]["HostPort"]
 class E2ETestCase(E2ETestMixin, StaticLiveServerTestCase):
     """E2E Test case with django static live server"""
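Two docker-py details in the SSL mixin above are easy to misread. As a hedged aside (the values below are hypothetical, shaped like what docker-py returns, not taken from this repository): Docker Engine API healthcheck durations are expressed in nanoseconds, which is why the mixin writes `5 * 1_000 * 1_000_000` for a five-second interval, and a container port published to a random host port is read back out of the `Container.ports` mapping as a string.

```python
NANOSECOND = 1_000_000_000

# Healthcheck durations are nanoseconds; the mixin's literal is five seconds.
interval = 5 * NANOSECOND
assert interval == 5 * 1_000 * 1_000_000

# Hypothetical port mapping in the shape docker-py's Container.ports returns
# when "9443" is published to a random host port: one binding per address
# family, HostPort as a string.
ports = {
    "9443/tcp": [
        {"HostIp": "0.0.0.0", "HostPort": "49154"},
        {"HostIp": "::", "HostPort": "49154"},
    ],
}
# First binding's HostPort is the host-side port, as used for live_server_url.
host_port = ports["9443/tcp"][0]["HostPort"]
print(host_port)  # → 49154
```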


@@ -1,19 +1,17 @@
 from os import makedirs
 from pathlib import Path
 from time import sleep
-from typing import Any
 from selenium.webdriver.common.by import By
 from selenium.webdriver.support import expected_conditions as ec
 from authentik.blueprints.tests import apply_blueprint, reconcile_app
 from authentik.providers.oauth2.models import OAuth2Provider
-from tests.live import SSLLiveMixin
 from tests.openid_conformance.conformance import Conformance
 from tests.selenium import SeleniumTestCase
-class TestOpenIDConformance(SSLLiveMixin, SeleniumTestCase):
+class TestOpenIDConformance(SeleniumTestCase):
     conformance: Conformance
@@ -61,28 +59,32 @@ class TestOpenIDConformance(SSLLiveMixin, SeleniumTestCase):
             },
             "consent": {},
         }
+        self.test_variant = {
+            "server_metadata": "discovery",
+            "client_registration": "static_client",
+        }
-    def run_test(
-        self, test_name: str, test_plan_config: dict[str, Any], test_variant: dict[str, Any]
-    ):
+    def run_test(self, test_name: str, test_plan_config: dict):
         # Create a Conformance instance...
         self.conformance = Conformance(f"https://{self.host}:8443/", None, verify_ssl=False)
         test_plan = self.conformance.create_test_plan(
             test_name,
             test_plan_config,
-            test_variant,
+            self.test_variant,
         )
         plan_id = test_plan["id"]
         for test in test_plan["modules"]:
-            # Fetch name and variant of the next test to run
-            module_name = test["testModule"]
-            variant = test["variant"]
-            module_instance = self.conformance.create_test_from_plan_with_variant(
-                plan_id, module_name, variant
-            )
-            module_id = module_instance["id"]
-            self.run_single_test(module_id)
-            self.conformance.wait_for_state(module_id, ["FINISHED"], timeout=self.wait_timeout)
+            with self.subTest(test["testModule"], **test["variant"]):
+                # Fetch name and variant of the next test to run
+                module_name = test["testModule"]
+                variant = test["variant"]
+                module_instance = self.conformance.create_test_from_plan_with_variant(
+                    plan_id, module_name, variant
+                )
+                module_id = module_instance["id"]
+                self.run_single_test(module_id)
+                self.conformance.wait_for_state(module_id, ["FINISHED"], timeout=self.wait_timeout)
         sleep(2)
         self.conformance.export_html(plan_id, Path(__file__).parent / "exports")
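The point of wrapping each module in `self.subTest` is that a failing conformance module is reported individually, under its module name and variant, without aborting the remaining modules in the plan. A minimal, self-contained sketch of that pattern (the module dicts below are hypothetical stand-ins, not authentik's data):

```python
import unittest

class SubTestDemo(unittest.TestCase):
    def test_plan_modules(self):
        # Hypothetical module list, shaped like test_plan["modules"] above.
        modules = [
            {"testModule": "oidcc-server", "variant": {"response_type": "code"}},
            {"testModule": "oidcc-server", "variant": {"response_type": "id_token"}},
        ]
        for test in modules:
            # Each module gets its own subTest scope; an assertion failure
            # here is recorded per module/variant and the loop continues.
            with self.subTest(test["testModule"], **test["variant"]):
                # Stand-in for run_single_test / wait_for_state.
                self.assertIn("response_type", test["variant"])

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(SubTestDemo)
)
print(result.wasSuccessful())  # → True
```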


@@ -2,14 +2,14 @@ services:
   mongodb:
     image: mongo:6.0.13
   nginx:
-    image: ghcr.io/beryju/oidc-conformance-suite-nginx:v5.1.43
+    image: ghcr.io/beryju/oidc-conformance-suite-nginx:v5.1.41
     ports:
       - "8443:8443"
       - "8444:8444"
     depends_on:
       - server
   server:
-    image: ghcr.io/beryju/oidc-conformance-suite-server:v5.1.43
+    image: ghcr.io/beryju/oidc-conformance-suite-server:v5.1.41
     ports:
       - "9999:9999"
     extra_hosts:
@@ -19,8 +19,8 @@ services:
       -Xdebug -Xrunjdwp:transport=dt_socket,address=*:9999,server=y,suspend=n
       -jar /server/fapi-test-suite.jar
       -Djdk.tls.maxHandshakeMessageSize=65536
-      --fintechlabs.base_url=https://localhost:8443
-      --fintechlabs.base_mtls_url=https://localhost:8444
+      --fintechlabs.base_url=https://host.docker.internal:8443
+      --fintechlabs.base_mtls_url=https://host.docker.internal:8444
       --fintechlabs.devmode=true
       --fintechlabs.startredir=true
     links:

@@ -0,0 +1,10 @@
+from tests.decorators import retry
+from tests.openid_conformance.base import TestOpenIDConformance
+class TestOpenIDConformanceBasic(TestOpenIDConformance):
+    @retry()
+    def test_oidcc_basic_certification_test(self):
+        test_plan_name = "oidcc-basic-certification-test-plan"
+        self.run_test(test_plan_name, self.test_plan_config)


@@ -0,0 +1,10 @@
+from tests.decorators import retry
+from tests.openid_conformance.base import TestOpenIDConformance
+class TestOpenIDConformanceImplicit(TestOpenIDConformance):
+    @retry()
+    def test_oidcc_implicit_certification_test_plan(self):
+        test_plan_name = "oidcc-implicit-certification-test-plan"
+        self.run_test(test_plan_name, self.test_plan_config)

Some files were not shown because too many files have changed in this diff.