Compare commits


24 Commits

Author SHA1 Message Date
Teffen Ellis
90f27f93e1 web/e2e: fix three regressions blocking the parallel suite
Locally and in CI the new `e2e (playwright)` job appeared to "hang"
under `fullyParallel: true` + `workers: "50%"`. The hang was actually
five tests sharing two unrelated bugs that all manifest as 30s test
timeouts; the cluster only *looks* like a parallelism issue because
multiple workers stall on the same wall-clock window. With these three
fixes the full suite is green in 1m48s on `--workers=2` (was: 5 failed
/ 17 passed in 5m30s).

1. `web/test/browser/600-providers.test.ts`
   PR #21647 dropped the `to:` argument on the `session.login()` call
   in this file's `beforeEach`. Without it, `SessionFixture.login()`
   waits for the auth-flow URL pattern to re-appear — which it does
   immediately, since we just navigated there — so the helper returns
   *before* the post-login redirect lands. The wizard buttons probed
   afterward live on `/if/admin/#/core/providers`, which the user never
   actually reaches; every test in the file then hits the 30s
   `beforeEach` timeout. Pin the destination explicitly, matching the
   shape of every other test file.
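The race above can be sketched in plain TypeScript; the fixture API and URL patterns here are illustrative stand-ins, not authentik's real `SessionFixture`:

```typescript
// Minimal sketch of the beforeEach race. Without an explicit `to:` target,
// the helper waits for the auth-flow URL pattern -- which already matches
// the page we just navigated to, so it "succeeds" before the redirect lands.
type Page = { url: string };

function loginWaitTarget(page: Page, to?: string): string {
    // Illustrative fallback pattern for the auth flow.
    const authFlowPattern = /\/if\/flow\//;
    if (to !== undefined) return to; // fix: pin the post-login destination
    return authFlowPattern.test(page.url) ? page.url : "<redirect>";
}

const page = { url: "/if/flow/default-authentication-flow/" };
// Bug: the helper returns the auth-flow URL it is already sitting on.
console.log(loginWaitTarget(page));
// Fix: the helper only resolves once the pinned destination is reached.
console.log(loginWaitTarget(page, "/if/admin/#/core/providers"));
```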

2. `web/src/admin/roles/ak-role-list.ts`
   The role-list row anchor had no aria-label, so its accessible name
   was the (random, generated) role name. `500-roles.test.ts` searches
   for that anchor with `getByRole("link", { name: "view details" })`
   — the same selector `400-groups.test.ts` uses against the group
   list, where `GroupListPage.row()` *does* set
   `aria-label="View details of group ..."`. Bring the role row to
   parity with groups; the test wasn't wrong, the UI was missing the
   accessibility hook.
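The parity being restored is just a stable accessible-name convention; a sketch (the helper name is illustrative, the label shape mirrors what the group list already emits):

```typescript
// Both list rows expose a deterministic accessible name, so
// getByRole("link", { name: "view details" }) -- a case-insensitive
// substring match by default -- works regardless of the randomly
// generated entity name.
function rowDetailsLabel(kind: "group" | "role", name: string): string {
    return `View details of ${kind} ${name}`;
}

console.log(rowDetailsLabel("role", "test-role-8f3a"));
// View details of role test-role-8f3a
```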

3. `web/test/browser/500-roles.test.ts` ("Edit role from view page")
   The post-edit verification used `page.getByText(updatedName)`, but
   on the role view page the new name renders in two places (the
   "Role <name>" page-navbar heading and the description-list value),
   so the bare text match resolves to two elements and trips
   strict-mode. Add `{ exact: true }` so we assert the canonical value
   the edit wrote rather than the heading template.
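The strict-mode trip can be reproduced with Playwright's matching rules reduced to a predicate (case-insensitive substring by default, whole-string with `{ exact: true }`); the names below are illustrative:

```typescript
// Sketch of why page.getByText(updatedName) resolves to two elements.
function matchesText(haystack: string, needle: string, exact = false): boolean {
    return exact
        ? haystack.trim() === needle
        : haystack.toLowerCase().includes(needle.toLowerCase());
}

const updatedName = "qa-role-renamed";
// The heading template and the description-list value both contain the name.
const candidates = [`Role ${updatedName}`, updatedName];

// Default substring matching: two hits -> strict-mode error.
console.log(candidates.filter((t) => matchesText(t, updatedName)).length); // 2
// { exact: true }: only the canonical value matches.
console.log(candidates.filter((t) => matchesText(t, updatedName, true)).length); // 1
```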

Co-Authored-By: Agent (authentik-i21996-internal-achievable-raisin) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 15:42:07 +00:00
Teffen Ellis
d92bfc473f ci/web: post Playwright result comment + gated S3 upload + !cancelled() guards
Three reviewer-facing improvements to the e2e job:

1. Idempotent PR comment summarising Playwright pass/fail/flaky/skipped
   counts. Marker `<!-- playwright-result -->` lets re-runs edit the
   same comment instead of piling up. Skipped on fork PRs where the
   default GITHUB_TOKEN is read-only.

2. Optional S3 publish of the HTML report to
   `s3://authentik-playwright-artifacts/pr-<n>/run-<id>/attempt-<n>/`,
   gated behind `vars.PLAYWRIGHT_S3_ENABLED == 'true'`. The bucket is
   pending infra provisioning; the public URL pattern is already wired
   into the comment so flipping the variable on later requires no
   workflow changes. Borrows the OIDC + IAM role plumbing from
   `.github/workflows/release-publish.yml`.
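The key layout above is the whole contract; a sketch of the prefix builder (the bucket name comes from this message, the helper itself is illustrative):

```typescript
// Report prefix convention: pr-<n>/run-<id>/attempt-<n>/ under the
// (pending) artifacts bucket. Wired into the comment now so enabling
// the upload later needs no workflow change.
function reportPrefix(pr: number, runId: number, attempt: number): string {
    return `s3://authentik-playwright-artifacts/pr-${pr}/run-${runId}/attempt-${attempt}/`;
}

console.log(reportPrefix(21996, 987654321, 1));
// s3://authentik-playwright-artifacts/pr-21996/run-987654321/attempt-1/
```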

3. Switch the failure-guarded reporting/upload steps to `!cancelled()`
   so a superseded (cancelled) run no longer emits failure-shaped noise,
   and so successful runs still produce the artifact bundle reviewers
   expect.
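In TypeScript terms, the `if:` guard change reduces to a predicate over the job's conclusion (a simplification of GitHub's status functions):

```typescript
type Conclusion = "success" | "failure" | "cancelled";

const failureOnly = (c: Conclusion) => c === "failure"; // old guard: failure()
const notCancelled = (c: Conclusion) => c !== "cancelled"; // new: !cancelled()

// Superseded runs stay quiet; green and red runs both still report.
console.log(notCancelled("cancelled")); // false
console.log(notCancelled("success")); // true
console.log(failureOnly("success")); // false: old guard skipped green runs
```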

Adds the Playwright JSON reporter so the parse step can pull pass/fail
counts from `playwright-report/results.json` for the comment body.
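The parse step amounts to reading the JSON reporter's `stats` object (expected/unexpected/flaky/skipped) and rendering it behind the marker; a hedged sketch, with the comment layout as an illustrative assumption:

```typescript
// Shape of the `stats` object in Playwright's JSON report.
interface PlaywrightStats {
    expected: number; // passed
    unexpected: number; // failed
    flaky: number;
    skipped: number;
}

const MARKER = "<!-- playwright-result -->";

function commentBody(stats: PlaywrightStats): string {
    const ok = stats.unexpected === 0;
    return [
        MARKER,
        `### Playwright ${ok ? "passed" : "failed"}`,
        `passed: ${stats.expected} / failed: ${stats.unexpected} / flaky: ${stats.flaky} / skipped: ${stats.skipped}`,
    ].join("\n");
}

// Re-runs search existing comments for MARKER and edit in place
// instead of posting a new comment.
console.log(commentBody({ expected: 22, unexpected: 0, flaky: 0, skipped: 0 }));
```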

Co-Authored-By: Agent (authentik-i21996-internal-achievable-raisin) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 14:57:35 +00:00
Teffen Ellis
4d3ac3f63a Flesh out types. 2026-05-01 16:05:16 +02:00
Teffen Ellis
e29d37bb8a Always parallel. 2026-05-01 15:48:29 +02:00
Teffen Ellis
6fee629926 Update expected path. 2026-05-01 04:04:57 +02:00
Teffen Ellis
cad8395dad Ignore playwright-traces. 2026-05-01 04:04:57 +02:00
Teffen Ellis
191ecc51bd Reorder tests. 2026-05-01 04:04:57 +02:00
Teffen Ellis
71e9092810 Remove guard. 2026-05-01 04:04:56 +02:00
Teffen Ellis
34dbb78df0 Use parallelism. 2026-05-01 04:04:56 +02:00
Teffen Ellis
e29a3eb35a ci/web: format test-admin-user.yaml with prettier
Pick up the 4-space indent that web/'s prettier config enforces. The
file was added under issue #21994 with 2-space indent and tripped the
ci-web format check on push.

Co-Authored-By: Agent (authentik-i21994-better-mobile-tangelo) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 04:04:56 +02:00
Teffen Ellis
fe8b3d1687 ci/web: make test-admin blueprint self-contained
The previous blueprint used `!Find` to look up the authentik Admins group,
which raced against `system/bootstrap.yaml` and resolved to `None` when the
explicit `apply_blueprint` step ran before the worker had applied bootstrap.
The serializer rejected `groups: [None]` with `Invalid pk "None"`.

Define the group in the same blueprint with `state: present` and reference
it via `!KeyOf`, so the test-admin setup does not depend on any pre-existing
data. If bootstrap has already created the group, `state: present` is a
no-op on the identifiers; otherwise the group is created here.

Co-Authored-By: Agent (authentik-i21994-better-mobile-tangelo) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 04:04:56 +02:00
Teffen Ellis
0fd2a13b35 Bump package.json 2026-05-01 04:04:55 +02:00
Teffen Ellis
17d49aa99a ci/web: run Playwright e2e suite on every PR
Boots the full authentik stack (postgres + Go server + Rust worker)
inside the existing ci-web workflow, applies migrations and the
test-admin user blueprint, then runs `corepack npm run --prefix web
test:e2e` against http://localhost:9000. Uploads the HTML report,
traces/videos, and authentik logs as artifacts on failure so reviewers
can debug without rerunning locally.

Also enables the HTML reporter and screenshot/video capture on CI in
playwright.config.js, and updates the full dev-environment docs to
point at the same npm scripts CI uses so local and CI runs stay in
lockstep.

Closes #21994

Co-Authored-By: Agent (authentik-i21994-better-mobile-tangelo) <279763771+playpen-agent@users.noreply.github.com>
2026-05-01 04:04:55 +02:00
Teffen Ellis
b9389544be Flesh out github actions. 2026-05-01 01:23:13 +02:00
Teffen Ellis
ed82b3f623 Lint. 2026-05-01 01:21:05 +02:00
Teffen Ellis
b281bb819d Bump. 2026-05-01 01:21:05 +02:00
Teffen Ellis
3ca633c10a lint. 2026-05-01 01:21:05 +02:00
Teffen Ellis
ad1582c43f Clean up docs container. 2026-05-01 01:21:05 +02:00
Teffen Ellis
53f2826ea1 Flesh out github actions. 2026-05-01 01:21:03 +02:00
Teffen Ellis
ccf7225f03 Update makefile. 2026-05-01 01:20:18 +02:00
Teffen Ellis
85469c86d1 Prep containers. 2026-05-01 01:20:18 +02:00
Teffen Ellis
0bfce2d1f9 Bump engines. 2026-05-01 01:20:18 +02:00
Teffen Ellis
7b4e175d59 Flesh out node scripts. 2026-05-01 01:20:18 +02:00
Teffen Ellis
b47f6f8e56 Fix mounted references. 2026-05-01 01:20:17 +02:00
1889 changed files with 18288 additions and 56358 deletions

.github/actions/setup-node/action.yml

@@ -0,0 +1,81 @@
name: "Setup Node.js and NPM"
description: "Sets up Node.js with a specific NPM version via Corepack"
inputs:
working-directory:
description: "Path to the working directory containing the package.json file"
required: false
default: "."
dependencies:
required: false
description: "List of dependencies to setup"
default: "monorepo,working-directory"
node-version-file:
description: "Path to file containing the Node.js version"
required: false
default: "package.json"
cache-dependency-path:
description: "Path to dependency lock file for caching"
required: false
default: "package-lock.json"
cache:
description: "Package manager to cache"
default: "npm"
registry-url:
description: "npm registry URL"
default: "https://registry.npmjs.org"
runs:
using: "composite"
steps:
- name: Setup Node.js (Corepack bootstrap)
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: ${{ inputs.node-version-file }}
registry-url: ${{ inputs.registry-url }}
# The setup-node action will attempt to create a cache using a version of
# npm that may not be compatible with the range specified in package.json.
# This can be enabled **after** corepack is installed and the correct npm version is available.
package-manager-cache: false
- name: Install Corepack
working-directory: ${{ github.workspace}}
shell: bash
run: | #shell
node ./scripts/node/lint-runtime.mjs
node ./scripts/node/setup-corepack.mjs --force
corepack enable
- name: Lint Node.js and NPM versions
shell: bash
run: node ./scripts/node/lint-runtime.mjs
- name: Setup Node.js (Monorepo Root)
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: ${{ inputs.node-version-file }}
cache: ${{ inputs.cache }}
cache-dependency-path: ${{ inputs.cache-dependency-path }}
registry-url: ${{ inputs.registry-url }}
- name: Install monorepo dependencies
if: ${{ contains(inputs.dependencies, 'monorepo') }}
shell: bash
run: | #shell
node ./scripts/node/lint-lockfile.mjs
corepack npm ci
- name: Setup Node.js (Working Directory)
if: ${{ contains(inputs.dependencies, 'working-directory') }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: ${{ inputs.working-directory }}/${{ inputs.node-version-file }}
cache: ${{ inputs.cache }}
cache-dependency-path: ${{ inputs.working-directory }}/${{ inputs.cache-dependency-path }}
registry-url: ${{ inputs.registry-url }}
- name: Install working directory dependencies
if: ${{ contains(inputs.dependencies, 'working-directory') }}
shell: bash
run: | # shell
corepack install
echo "node version: $(node --version)"
echo "npm version: $(corepack npm --version)"
node ./scripts/node/lint-lockfile.mjs ${{ inputs.working-directory }}
corepack npm ci --prefix ${{ inputs.working-directory }}


@@ -18,19 +18,24 @@ runs:
using: "composite"
steps:
- name: Cleanup apt
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies, 'python') }}
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies,
'python') }}
shell: bash
run: sudo apt-get remove --purge man-db
- name: Install apt deps
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies, 'python') }}
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies,
'python') }}
uses: gerlero/apt-install@f4fa5265092af9e750549565d28c99aec7189639
with:
packages: libpq-dev openssl libxmlsec1-dev pkg-config gettext libclang-dev libkadm5clnt-mit12 libkadm5clnt7t64-heimdal libkrb5-dev krb5-kdc krb5-user krb5-admin-server
packages: libpq-dev openssl libxmlsec1-dev pkg-config gettext krb5-multidev
libkrb5-dev heimdal-multidev libclang-dev krb5-kdc krb5-user
krb5-admin-server
update: true
upgrade: false
install-recommends: false
- name: Make space on disk
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies, 'python') }}
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies,
'python') }}
shell: bash
run: |
sudo mkdir -p /tmp/empty/
@@ -49,45 +54,30 @@ runs:
if: ${{ contains(inputs.dependencies, 'python') }}
shell: bash
working-directory: ${{ inputs.working-directory }}
run: uv sync --all-extras --dev --locked
run: uv sync --all-extras --dev --frozen
- name: Setup rust (stable)
if: ${{ contains(inputs.dependencies, 'rust') && !contains(inputs.dependencies, 'rust-nightly') }}
uses: actions-rust-lang/setup-rust-toolchain@46268bd060767258de96ed93c1251119784f2ab6 # v1
if: ${{ contains(inputs.dependencies, 'rust') && !contains(inputs.dependencies,
'rust-nightly') }}
uses: actions-rust-lang/setup-rust-toolchain@2b1f5e9b395427c92ee4e3331786ca3c37afe2d7 # v1
with:
rustflags: ""
- name: Setup rust (nightly)
if: ${{ contains(inputs.dependencies, 'rust-nightly') }}
uses: actions-rust-lang/setup-rust-toolchain@46268bd060767258de96ed93c1251119784f2ab6 # v1
uses: actions-rust-lang/setup-rust-toolchain@2b1f5e9b395427c92ee4e3331786ca3c37afe2d7 # v1
with:
toolchain: nightly
components: rustfmt
rustflags: ""
- name: Setup rust dependencies
if: ${{ contains(inputs.dependencies, 'rust') }}
uses: taiki-e/install-action@ec28e287910af896fd98e04056d31fa68607e7ad # v2
uses: taiki-e/install-action@481c34c1cf3a84c68b5e46f4eccfc82af798415a # v2
with:
tool: cargo-deny cargo-machete cargo-llvm-cov nextest
- name: Setup node (web)
- name: Setup node (root, web)
if: ${{ contains(inputs.dependencies, 'node') }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
uses: ./.github/actions/setup-node
with:
node-version-file: "${{ inputs.working-directory }}web/package.json"
cache: "npm"
cache-dependency-path: "${{ inputs.working-directory }}web/package-lock.json"
registry-url: "https://registry.npmjs.org"
- name: Setup node (root)
if: ${{ contains(inputs.dependencies, 'node') }}
uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v4
with:
node-version-file: "${{ inputs.working-directory }}package.json"
cache: "npm"
cache-dependency-path: "${{ inputs.working-directory }}package-lock.json"
registry-url: "https://registry.npmjs.org"
- name: Install Node deps
if: ${{ contains(inputs.dependencies, 'node') }}
shell: bash
working-directory: ${{ inputs.working-directory }}
run: npm ci
working-directory: web
- name: Setup go
if: ${{ contains(inputs.dependencies, 'go') }}
uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v5
@@ -97,7 +87,9 @@ runs:
if: ${{ contains(inputs.dependencies, 'runtime') }}
uses: AndreKurait/docker-cache@0fe76702a40db986d9663c24954fc14c6a6031b7
with:
key: docker-images-${{ runner.os }}-${{ hashFiles('.github/actions/setup/compose.yml', 'Makefile') }}-${{ inputs.postgresql_version }}
key: docker-images-${{ runner.os }}-${{
hashFiles('.github/actions/setup/compose.yml', 'Makefile') }}-${{
inputs.postgresql_version }}
- name: Setup dependencies
if: ${{ contains(inputs.dependencies, 'runtime') }}
shell: bash
@@ -105,7 +97,7 @@ runs:
run: |
export PSQL_TAG=${{ inputs.postgresql_version }}
docker compose -f .github/actions/setup/compose.yml up -d --wait
cd web && npm ci
corepack npm ci --prefix web
- name: Generate config
if: ${{ contains(inputs.dependencies, 'python') }}
shell: uv run python {0}


@@ -67,6 +67,16 @@ jobs:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: ./.github/actions/setup-node
with:
working-directory: web
dependencies: "monorepo"
- uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # v6
with:
go-version-file: "go.mod"
- name: Generate API Clients
run: |
make gen-client-ts
- name: Build Docker Image
uses: docker/build-push-action@bcafcacb16a39f128d818304e6c9c0c18556b85f # v7.1.0
id: push
@@ -81,7 +91,8 @@ jobs:
${{ steps.ev.outputs.imageBuildArgs }}
tags: ${{ steps.ev.outputs.imageTags }}
platforms: linux/${{ inputs.image_arch }}
cache-from: type=registry,ref=${{ steps.ev.outputs.attestImageNames }}:buildcache-${{ inputs.image_arch }}
cache-from: type=registry,ref=${{ steps.ev.outputs.attestImageNames
}}:buildcache-${{ inputs.image_arch }}
cache-to: ${{ steps.ev.outputs.cacheTo }}
- uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v3
id: attest


@@ -90,7 +90,7 @@ jobs:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: int128/docker-manifest-create-action@fa55f72001a6c74b0f4997dca65c70d334905180 # v2
- uses: int128/docker-manifest-create-action@7df7f9e221d927eaadf87db231ddf728047308a4 # v2
id: build
with:
tags: ${{ matrix.tag }}

.github/workflows/api-ts-publish.yml

@@ -0,0 +1,65 @@
---
name: API - Publish Typescript client
on:
push:
branches: [main]
paths:
- "schema.yml"
workflow_dispatch:
permissions:
# Required for NPM OIDC trusted publisher
id-token: write
contents: read
jobs:
build:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: actions/create-github-app-token@f8d387b68d61c58ab83c6c016672934102569859 # v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIV_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
token: ${{ steps.generate_token.outputs.token }}
- uses: ./.github/actions/setup-node
with:
working-directory: web
- name: Generate API Client
run: make gen-client-ts
- name: Publish package
working-directory: gen-ts-api/
run: |
npm i
npm publish --tag generated
- name: Upgrade /web
working-directory: web
run: |
export VERSION=`node -e 'import mod from "./gen-ts-api/package.json" with { type: "json" };console.log(mod.version);'`
npm i @goauthentik/api@$VERSION
- name: Upgrade /web/packages/sfe
working-directory: web/packages/sfe
run: |
export VERSION=`node -e 'import mod from "./gen-ts-api/package.json" with { type: "json" };console.log(mod.version);'`
npm i @goauthentik/api@$VERSION
- uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
id: cpr
with:
token: ${{ steps.generate_token.outputs.token }}
branch: update-web-api-client
commit-message: "web: bump API Client version"
title: "web: bump API Client version"
body: "web: bump API Client version"
delete-branch: true
signoff: true
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
labels: dependencies
- uses: peter-evans/enable-pull-request-automerge@a660677d5469627102a1c1e11409dd063606628d # v3
with:
token: ${{ steps.generate_token.outputs.token }}
pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}
merge-method: squash


@@ -22,25 +22,19 @@ jobs:
- prettier-check
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Install Dependencies
working-directory: website/
run: npm ci
- uses: ./.github/actions/setup-node
with:
working-directory: website
- name: Lint
working-directory: website/
run: npm run ${{ matrix.command }}
run: corepack npm run ${{ matrix.command }} --prefix website
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/
name: Install Dependencies
run: npm ci
working-directory: website
- uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
with:
path: |
@@ -54,7 +48,7 @@ jobs:
working-directory: website
env:
NODE_ENV: production
run: npm run build -w api
run: corepack npm run build -w api
- uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v4
with:
name: api-docs
@@ -71,11 +65,9 @@ jobs:
with:
name: api-docs
path: website/api/build
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
working-directory: website
- name: Deploy Netlify (Production)
working-directory: website/api
if: github.event_name == 'push' && github.ref == 'refs/heads/main'


@@ -24,14 +24,9 @@ jobs:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
uses: ./.github/actions/setup
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: lifecycle/aws/package.json
cache: "npm"
cache-dependency-path: lifecycle/aws/package-lock.json
- working-directory: lifecycle/aws/
run: |
npm ci
working-directory: lifecycle/aws
- name: Check changes have been applied
run: |
uv run make aws-cfn


@@ -24,46 +24,34 @@ jobs:
- prettier-check
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Install dependencies
working-directory: website/
run: npm ci
- uses: ./.github/actions/setup-node
with:
working-directory: website
- name: Lint
working-directory: website/
run: npm run ${{ matrix.command }}
run: corepack npm run ${{ matrix.command }} --prefix website
build-docs:
runs-on: ubuntu-latest
env:
NODE_ENV: production
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
name: Setup Node.js
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/
name: Install Dependencies
run: npm ci
working-directory: website
- name: Build Documentation via Docusaurus
working-directory: website/
run: npm run build
run: corepack npm run build --prefix website
build-integrations:
runs-on: ubuntu-latest
env:
NODE_ENV: production
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: website/package.json
cache: "npm"
cache-dependency-path: website/package-lock.json
- working-directory: website/
name: Install Dependencies
run: npm ci
working-directory: website
- name: Build Integrations via Docusaurus
working-directory: website/
run: npm run build -w integrations
run: corepack npm run build -w integrations --prefix website
build-container:
runs-on: ubuntu-latest
permissions:
@@ -104,7 +92,9 @@ jobs:
platforms: linux/amd64,linux/arm64
context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache,mode=max' || '' }}
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' &&
'type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache,mode=max'
|| '' }}
- uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v3
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}


@@ -73,7 +73,8 @@ jobs:
- name: generate API clients
run: make gen-clients
- name: ensure schema is up-to-date
run: git diff --exit-code -- schema.yml blueprints/schema.json packages/client-go packages/client-rust packages/client-ts
run: git diff --exit-code -- schema.yml blueprints/schema.json
packages/client-go packages/client-rust packages/client-ts
test-migrations:
runs-on: ubuntu-latest
steps:
@@ -91,7 +92,8 @@ jobs:
outputs:
seed: ${{ steps.seed.outputs.seed }}
test-migrations-from-stable:
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }} - Run ${{
matrix.run_id }}/5
runs-on: ubuntu-latest
timeout-minutes: 30
needs: test-make-seed
@@ -101,7 +103,7 @@ jobs:
psql:
- 14-alpine
- 18-alpine
run_id: [1, 2, 3, 4, 5]
run_id: [ 1, 2, 3, 4, 5 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
@@ -109,8 +111,13 @@ jobs:
- name: checkout stable
run: |
set -e -o pipefail
cp -R .github ..
cp -R scripts ..
mkdir -p ../packages
cp -R packages/logger-js ../packages/logger-js
# Previous stable tag
prev_stable=$(git tag --sort=version:refname | grep '^version/' | grep -vE -- '-rc[0-9]+$' | tail -n1)
# Current version family based on
@@ -118,10 +125,13 @@ jobs:
if [[ -n $current_version_family ]]; then
prev_stable="version/${current_version_family}"
fi
echo "::notice::Checking out ${prev_stable} as stable version..."
git checkout ${prev_stable}
rm -rf .github/ scripts/
rm -rf .github/ scripts/ packages/logger-js/
mv ../.github ../scripts .
mv ../packages/logger-js ./packages/
- name: Setup authentik env (stable)
uses: ./.github/actions/setup
with:
@@ -169,7 +179,7 @@ jobs:
psql:
- 14-alpine
- 18-alpine
run_id: [1, 2, 3, 4, 5]
run_id: [ 1, 2, 3, 4, 5 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
@@ -252,19 +262,22 @@ jobs:
COMPOSE_PROFILES: ${{ matrix.job.profiles }}
run: |
docker compose -f tests/e2e/compose.yml up -d --quiet-pull
- uses: ./.github/actions/setup-node
- id: cache-web
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
if: contains(matrix.job.profiles, 'selenium')
with:
path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json',
'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
- name: prepare web ui
if: steps.cache-web.outputs.cache-hit != 'true' && contains(matrix.job.profiles, 'selenium')
if: steps.cache-web.outputs.cache-hit != 'true' && contains(matrix.job.profiles,
'selenium')
working-directory: web
run: |
npm ci
npm run build
npm run build:sfe
corepack npm ci
corepack npm run build
corepack npm run build:sfe
- name: run e2e
run: |
uv run coverage run manage.py test ${{ matrix.job.glob }}
@@ -282,18 +295,10 @@ jobs:
fail-fast: false
matrix:
job:
- name: oidc_basic
glob: tests/openid_conformance/test_oidc_basic.py
- name: oidc_implicit
glob: tests/openid_conformance/test_oidc_implicit.py
- name: oidc_rp-initiated
glob: tests/openid_conformance/test_oidc_rp_initiated.py
- name: oidc_frontchannel
glob: tests/openid_conformance/test_oidc_frontchannel.py
- name: oidc_backchannel
glob: tests/openid_conformance/test_oidc_backchannel.py
- name: ssf_transmitter
glob: tests/openid_conformance/test_ssf_transmitter.py
- name: basic
glob: tests/openid_conformance/test_basic.py
- name: implicit
glob: tests/openid_conformance/test_implicit.py
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
@@ -310,14 +315,14 @@ jobs:
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
with:
path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**',
'web/packages/sfe/src/**') }}-b
- name: prepare web ui
if: steps.cache-web.outputs.cache-hit != 'true'
working-directory: web
run: |
npm ci
npm run build
npm run build:sfe
corepack npm ci --prefix web
corepack npm run build --prefix web
corepack npm run build:sfe --prefix web
- name: run conformance
run: |
uv run coverage run manage.py test ${{ matrix.job.glob }}
@@ -383,7 +388,9 @@ jobs:
uses: ./.github/workflows/_reusable-docker-build.yml
secrets: inherit
with:
image_name: ${{ github.repository == 'goauthentik/authentik-internal' && 'ghcr.io/goauthentik/internal-server' || 'ghcr.io/goauthentik/dev-server' }}
image_name: ${{ github.repository == 'goauthentik/authentik-internal' &&
'ghcr.io/goauthentik/internal-server' ||
'ghcr.io/goauthentik/dev-server' }}
release: false
pr-comment:
needs:


@@ -114,8 +114,11 @@ jobs:
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
platforms: linux/amd64,linux/arm64
context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-${{ matrix.type }}:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && format('type=registry,ref=ghcr.io/goauthentik/dev-{0}:buildcache,mode=max', matrix.type) || '' }}
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-${{ matrix.type
}}:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' &&
format('type=registry,ref=ghcr.io/goauthentik/dev-{0}:buildcache,mode=max',
matrix.type) || '' }}
- uses: actions/attest-build-provenance@a2bbfa25375fe432b6a289bc6b6cd05ecd0c4c32 # v3
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
@@ -136,8 +139,8 @@ jobs:
- ldap
- radius
- rac
goos: [linux]
goarch: [amd64, arm64]
goos: [ linux ]
goarch: [ amd64, arm64 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
@@ -145,16 +148,11 @@ jobs:
- uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6
with:
go-version-file: "go.mod"
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
working-directory: web
- name: Build web
working-directory: web/
run: |
npm ci
npm run build-proxy
run: corepack npm run build-proxy --prefix web
- name: Build outpost
run: |
set -x


@@ -12,51 +12,280 @@ on:
- main
- version-*
env:
POSTGRES_DB: authentik
POSTGRES_USER: authentik
POSTGRES_PASSWORD: "EK-5jnKfjrGRm<77"
AUTHENTIK_BLUEPRINTS_DIR: "./blueprints"
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST: "true"
# Drives the system/bootstrap.yaml blueprint at startup: creates akadmin with
# these credentials and flips the Setup flag (Setup.set(True)) so the SPA's
# post-login redirect to "/" doesn't bounce through /setup, which would 500
# because the OOBE policy refuses to run once akadmin already has a usable
# password. See authentik/core/setup/signals.py and blueprints/default/flow-oobe.yaml.
AUTHENTIK_BOOTSTRAP_EMAIL: "test-admin@goauthentik.io"
AUTHENTIK_BOOTSTRAP_PASSWORD: "test-runner"
jobs:
lint:
runs-on: ubuntu-latest
strategy:
fail-fast: false
matrix:
command:
- lint
- lint:lockfile
- tsc
- prettier-check
project:
- web
include:
- command: tsc
project: web
- command: lit-analyse
project: web
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: ${{ matrix.project }}/package.json
cache: "npm"
cache-dependency-path: ${{ matrix.project }}/package-lock.json
- working-directory: ${{ matrix.project }}/
run: |
npm ci
working-directory: web
- name: Lint
working-directory: ${{ matrix.project }}/
run: npm run ${{ matrix.command }}
run: corepack npm run lint --prefix web
- name: Check types
run: corepack npm run tsc --prefix web
- name: Check formatting
run: corepack npm run prettier-check --prefix web
- name: Lit analyse
run: corepack npm run lit-analyse --prefix web
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
working-directory: web
- name: build
working-directory: web/
run: npm run build
run: corepack npm run build
e2e:
name: e2e (playwright)
runs-on: ubuntu-latest
timeout-minutes: 60
permissions:
contents: read
# Required so the "Comment Playwright result on PR" step can update its
# marker comment via the gh CLI / REST API.
pull-requests: write
# Required so the optional "Upload HTML report to S3" step can mint OIDC
# credentials with aws-actions/configure-aws-credentials. Harmless when
# the upload is gated off.
id-token: write
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
uses: ./.github/actions/setup
with:
dependencies: system,python,node,go,rust,runtime
- name: Build web UI
run: corepack npm run --prefix web build
- name: Build authentik server (Go)
run: | # shell
go build -o ./bin/authentik-server ./cmd/server
sudo install -m 0755 ./bin/authentik-server /usr/local/bin/authentik-server
- name: Build authentik worker (Rust)
run: | # shell
cargo build --release --bin authentik
sudo install -m 0755 ./target/release/authentik /usr/local/bin/authentik
- name: Apply migrations
run: uv run python -m lifecycle.migrate
- name: Resolve Playwright version
id: playwright-version
working-directory: web
run: | # shell
version=$(node -p "require('@playwright/test/package.json').version")
if [ -z "$version" ]; then
echo "Failed to resolve @playwright/test version" >&2
exit 1
fi
echo "version=${version}" >> "$GITHUB_OUTPUT"
- name: Cache Playwright browsers
id: playwright-cache
uses: actions/cache@27d5ce7f107fe9357f9df03efb73ab90386fccae # v4
with:
path: ~/.cache/ms-playwright
key: ${{ runner.os }}-playwright-${{ steps.playwright-version.outputs.version }}
- name: Install Playwright browsers
working-directory: web
run: | # shell
if [ "${{ steps.playwright-cache.outputs.cache-hit }}" = "true" ]; then
corepack npm exec -- playwright install-deps chromium
else
corepack npm exec -- playwright install --with-deps chromium
fi
- name: Start authentik server and worker
run: | # shell
set -euo pipefail
mkdir -p /tmp/ak-logs
# The Go server (authentik-server) spawns gunicorn as a child via PATH lookup
# and inherits the env that `uv run` set up for it. Verify gunicorn resolves
# under the same launcher so we fail fast here instead of waiting for an
# empty 200 from the proxy fallback later.
uv run --frozen sh -c 'command -v gunicorn' \
|| { echo "gunicorn not resolvable from uv run"; exit 1; }
uv run ak server > /tmp/ak-logs/server.log 2>&1 &
echo $! > /tmp/ak-logs/server.pid
# The Rust worker also opens an HTTP/metrics server on listen.http /
# listen.metrics (default :9000 / :9300). On a single CI host that races the
# Go server's binds and silently steals :9000, leaving Playwright talking to
# a healthcheck-only axum router that returns 200/empty for /if/* paths.
# Pin the worker to disjoint ports so the Go server keeps the public 9000.
AUTHENTIK_LISTEN__HTTP="[::]:9001" \
AUTHENTIK_LISTEN__METRICS="[::]:9301" \
uv run ak worker > /tmp/ak-logs/worker.log 2>&1 &
echo $! > /tmp/ak-logs/worker.pid
- name: Wait for authentik to be ready
run: | # shell
set -euo pipefail
# Readiness probes must verify the Go server is actually serving the request,
# not just that *something* on :9000 returned 200. The Go proxy stamps
# `X-authentik-version` on its static responses and the rendered flow page
# contains the <ak-flow-executor> custom element — both are absent from the
# worker's axum healthcheck router, so checking either rules out the
# port-collision failure mode.
timeout 240 bash -c '
until curl -fsS -o /dev/null http://localhost:9000/-/health/ready/; do
sleep 2
done'
timeout 300 bash -c '
until curl -fsS http://localhost:9000/if/flow/default-authentication-flow/ \
| grep -q "ak-flow-executor"; do
sleep 3
done'
- name: Run Playwright tests
working-directory: web
env:
AK_TEST_RUNNER_PAGE_URL: http://localhost:9000
run: corepack npm run test:e2e
# Reporting / upload steps below intentionally use `!cancelled()` rather
# than `failure()`: a cancelled run (e.g. superseded by a newer push) is
# not a real result and shouldn't produce reviewer-facing artifacts or
# comments. `if-no-files-found: ignore` keeps the "passed" case quiet.
- name: Upload Playwright HTML report
if: ${{ !cancelled() }}
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: playwright-report
path: web/playwright-report/
retention-days: 14
if-no-files-found: ignore
- name: Upload Playwright traces and videos
if: ${{ !cancelled() }}
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: playwright-traces
path: web/test-results/
retention-days: 14
if-no-files-found: ignore
- name: Upload authentik server and worker logs
if: ${{ !cancelled() }}
uses: actions/upload-artifact@043fb46d1a93c77aae656e7c1c64a875d1fc6a0a # v7.0.1
with:
name: authentik-logs
path: /tmp/ak-logs/
retention-days: 14
if-no-files-found: ignore
- name: Parse Playwright results
id: playwright-results
if: ${{ !cancelled() }}
run: | # shell
set -euo pipefail
report=web/playwright-report/results.json
if [ ! -f "$report" ]; then
{
echo "available=false"
echo "passed=0"
echo "failed=0"
echo "flaky=0"
echo "skipped=0"
} >> "$GITHUB_OUTPUT"
exit 0
fi
{
echo "available=true"
echo "passed=$(jq -r '.stats.expected // 0' "$report")"
echo "failed=$(jq -r '.stats.unexpected // 0' "$report")"
echo "flaky=$(jq -r '.stats.flaky // 0' "$report")"
echo "skipped=$(jq -r '.stats.skipped // 0' "$report")"
} >> "$GITHUB_OUTPUT"
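The jq extraction above can be exercised offline against a synthetic `results.json`; the `.stats` field names (`expected`, `unexpected`, `flaky`, `skipped`) mirror the step, while the sample counts are invented for illustration:

```shell
# Build a stand-in for Playwright's JSON-reporter output.
report=$(mktemp)
cat > "$report" <<'EOF'
{"stats": {"expected": 17, "unexpected": 0, "flaky": 1, "skipped": 2}}
EOF

# Same jq expressions as the workflow step; `// 0` defaults missing fields.
passed=$(jq -r '.stats.expected // 0' "$report")
failed=$(jq -r '.stats.unexpected // 0' "$report")
flaky=$(jq -r '.stats.flaky // 0' "$report")
echo "passed=${passed} failed=${failed} flaky=${flaky}"
rm -f "$report"
```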
# The S3 publishing pair below is intentionally gated off until the
# `authentik-playwright-artifacts` bucket is provisioned by infra. Flip
# the repo variable `PLAYWRIGHT_S3_ENABLED=true` to turn it on; the URL
# baked into the PR comment below already points at the eventual key.
- name: Configure AWS credentials for HTML report upload
if: ${{ !cancelled() && github.event_name == 'pull_request' && github.event.pull_request.head.repo.full_name == github.repository && vars.PLAYWRIGHT_S3_ENABLED == 'true' }}
uses: aws-actions/configure-aws-credentials@ec61189d14ec14c8efccab744f656cffd0e33f37 # v6.1.0
with:
role-to-assume: "arn:aws:iam::016170277896:role/github_goauthentik_authentik"
aws-region: eu-central-1
- name: Upload HTML report to S3
if: ${{ !cancelled() && github.event_name == 'pull_request' && github.event.pull_request.head.repo.full_name == github.repository && vars.PLAYWRIGHT_S3_ENABLED == 'true' }}
env:
S3_BUCKET: authentik-playwright-artifacts
S3_KEY_PREFIX: pr-${{ github.event.pull_request.number }}/run-${{ github.run_id }}/attempt-${{ github.run_attempt }}
run: | # shell
set -euo pipefail
if [ ! -d web/playwright-report ]; then
echo "No playwright-report/ produced; skipping S3 upload"
exit 0
fi
aws s3 cp \
--recursive \
--acl=public-read \
--cache-control "public, max-age=600" \
web/playwright-report/ \
"s3://${S3_BUCKET}/${S3_KEY_PREFIX}/"
# Same-repo guard: fork PRs run with a read-only GITHUB_TOKEN (even after
# maintainer approval), so the comment + edit calls would 403. Skip cleanly
# rather than failing the job on every fork PR run.
- name: Comment Playwright result on PR
if: ${{ !cancelled() && github.event_name == 'pull_request' && github.event.pull_request.head.repo.full_name == github.repository }}
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
PR_NUMBER: ${{ github.event.pull_request.number }}
AVAILABLE: ${{ steps.playwright-results.outputs.available }}
PASSED: ${{ steps.playwright-results.outputs.passed }}
FAILED: ${{ steps.playwright-results.outputs.failed }}
FLAKY: ${{ steps.playwright-results.outputs.flaky }}
SKIPPED: ${{ steps.playwright-results.outputs.skipped }}
S3_ENABLED: ${{ vars.PLAYWRIGHT_S3_ENABLED }}
REPORT_URL: https://authentik-playwright-artifacts.s3.amazonaws.com/pr-${{ github.event.pull_request.number }}/run-${{ github.run_id }}/attempt-${{ github.run_attempt }}/index.html
RUN_URL: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}/attempts/${{ github.run_attempt }}
run: | # shell
set -euo pipefail
marker='<!-- playwright-result -->'
if [ "$AVAILABLE" = "true" ]; then
if [ "$FAILED" -gt 0 ]; then
status='❌ Failed'
elif [ "$FLAKY" -gt 0 ]; then
status='⚠️ Passed with flakes'
else
status='✅ Passed'
fi
stats=$(printf '| Result | Count |\n|---|---|\n| ✅ Passed | %s |\n| ❌ Failed | %s |\n| ⚠️ Flaky | %s |\n| ⏭️ Skipped | %s |\n' "$PASSED" "$FAILED" "$FLAKY" "$SKIPPED")
else
status='⚠️ No results produced'
stats='The job did not produce `playwright-report/results.json`. The suite likely crashed before the JSON reporter wrote its output — see the workflow run for setup-step failures.'
fi
if [ "$S3_ENABLED" = "true" ]; then
report_line=$(printf '[HTML report](%s) · [Workflow run](%s)' "$REPORT_URL" "$RUN_URL")
else
report_line=$(printf '[Workflow run](%s) · _HTML report hosting is gated off until the `authentik-playwright-artifacts` S3 bucket is provisioned (`vars.PLAYWRIGHT_S3_ENABLED`). Until then, download the `playwright-report` artifact from the run page._' "$RUN_URL")
fi
body=$(printf '%s\n## Playwright e2e — %s\n\n%s\n\n%s\n' "$marker" "$status" "$stats" "$report_line")
existing=$(gh api "repos/${GITHUB_REPOSITORY}/issues/${PR_NUMBER}/comments" \
--paginate \
--jq "[.[] | select(.body != null and (.body | startswith(\"$marker\")))] | .[0].id // empty")
if [ -n "$existing" ]; then
gh api -X PATCH "repos/${GITHUB_REPOSITORY}/issues/comments/${existing}" -f body="$body" > /dev/null
echo "Updated existing comment ${existing}"
else
gh api -X POST "repos/${GITHUB_REPOSITORY}/issues/${PR_NUMBER}/comments" -f body="$body" > /dev/null
echo "Created new playwright-result comment"
fi
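The marker lookup in the step above can be sanity-checked locally by feeding the same `--jq` filter a synthetic comment list (IDs and bodies here are invented for the demo):

```shell
marker='<!-- playwright-result -->'
# Two fake comments: only the second starts with the marker.
comments='[{"id": 11, "body": "unrelated"},
           {"id": 42, "body": "<!-- playwright-result -->\n## Playwright e2e"}]'
# Same filter the workflow passes to `gh api --jq`.
existing=$(printf '%s' "$comments" \
  | jq -r "[.[] | select(.body != null and (.body | startswith(\"$marker\")))] | .[0].id // empty")
echo "existing=${existing}"
```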
ci-web-mark:
if: always()
needs:
@@ -73,13 +302,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- working-directory: web/
run: npm ci
working-directory: web
- name: test
working-directory: web/
run: npm run test || exit 0
run: corepack npm run test || exit 0


@@ -3,7 +3,7 @@ name: Packages - Publish NPM packages
on:
push:
branches: [main]
branches: [ main ]
paths:
- packages/tsconfig/**
- packages/eslint-config/**
@@ -35,22 +35,19 @@ jobs:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
with:
fetch-depth: 2
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: ${{ matrix.package }}/package.json
registry-url: "https://registry.npmjs.org"
working-directory: ${{ matrix.package }}
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@9426d40962ed5378910ee2e21d5f8c6fcbf2dd96 # 24d32ffd492484c1d75e0c0b894501ddb9d30d62
with:
files: |
${{ matrix.package }}/package.json
- name: Install Dependencies
run: npm ci
- name: Publish package
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ${{ matrix.package }}
run: |
npm ci
npm run build
npm publish
corepack npm ci
corepack npm run build
corepack npm publish


@@ -28,10 +28,10 @@ jobs:
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Initialize CodeQL
uses: github/codeql-action/init@v4.35.4
uses: github/codeql-action/init@v4
with:
languages: ${{ matrix.language }}
- name: Autobuild
uses: github/codeql-action/autobuild@v4.35.4
uses: github/codeql-action/autobuild@v4
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v4.35.4
uses: github/codeql-action/analyze@v4


@@ -5,7 +5,7 @@ on:
workflow_dispatch:
inputs:
next_version:
description: Next version (for example, if you're currently releasing 2026.5, then enter 2026.8)
description: Next major version (for example, if releasing 2042.2, this is 2042.4)
required: true
type: string
@@ -68,14 +68,10 @@ jobs:
token: ${{ steps.generate_token.outputs.token }}
- name: Setup authentik env
uses: ./.github/actions/setup
with:
dependencies: "system,python,go,node,runtime,rust-nightly"
- name: Run migrations
run: make migrate
- name: Bump version
run: "make bump version=${{ inputs.next_version }}.0-rc1"
- name: Re-generate API Clients
run: make gen
- name: Create pull request
uses: peter-evans/create-pull-request@5f6978faf089d4d20b00c7766989d076bb2fc7f1 # v7
with:


@@ -3,7 +3,7 @@ name: Release - On publish
on:
release:
types: [published, created]
types: [ published, created ]
jobs:
build-server:
@@ -87,11 +87,9 @@ jobs:
- uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6
with:
go-version-file: "go.mod"
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
working-directory: web
- name: Set up QEMU
uses: docker/setup-qemu-action@ce360397dd3f832beb865e1373c09c0e9f86d70a # v4.0.0
- name: Set up Docker Buildx
@@ -144,22 +142,16 @@ jobs:
- proxy
- ldap
- radius
goos: [linux, darwin]
goarch: [amd64, arm64]
goos: [ linux, darwin ]
goarch: [ amd64, arm64 ]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-go@4a3601121dd01d1626a1e23e37211e3254c1c06c # v6
with:
go-version-file: "go.mod"
- uses: actions/setup-node@48b55a011bda9f5d6aeb4c2d9c7362e8dae4041e # v5
- uses: ./.github/actions/setup-node
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- name: Install web dependencies
working-directory: web/
run: |
npm ci
working-directory: web
- name: Build web
working-directory: web/
run: |
@@ -175,8 +167,10 @@ jobs:
uses: svenstaro/upload-release-action@29e53e917877a24fad85510ded594ab3c9ca12de # v2
with:
repo_token: ${{ secrets.GITHUB_TOKEN }}
file: ./authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{ matrix.goarch }}
asset_name: authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{ matrix.goarch }}
file: ./authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{
matrix.goarch }}
asset_name: authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{
matrix.goarch }}
tag: ${{ github.ref }}
upload-aws-cfn-template:
permissions:
@@ -191,7 +185,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: aws-actions/configure-aws-credentials@d979d5b3a71173a29b74b5b88418bfda9437d885 # v6.1.1
- uses: aws-actions/configure-aws-credentials@ec61189d14ec14c8efccab744f656cffd0e33f37 # v6.1.0
with:
role-to-assume: "arn:aws:iam::016170277896:role/github_goauthentik_authentik"
aws-region: ${{ env.AWS_REGION }}


@@ -82,14 +82,10 @@ jobs:
token: "${{ steps.app-token.outputs.token }}"
- name: Setup authentik env
uses: ./.github/actions/setup
with:
dependencies: "system,python,go,node,runtime,rust-nightly"
- name: Run migrations
run: make migrate
- name: Bump version
run: "make bump version=${{ inputs.version }}"
- name: Re-generate API Clients
run: make gen
- name: Commit and push
run: |
# ID from https://api.github.com/users/authentik-automation[bot]

.gitignore vendored

@@ -14,6 +14,8 @@ media
# Node
node_modules
corepack.tgz
.corepack
.cspellcache
cspell-report.*
@@ -229,11 +231,6 @@ source_docs/
### Golang ###
/vendor/
server
proxy
ldap
rac
radius
### Docker ###
tests/openid_conformance/exports/*.zip

.npmrc

@@ -1,20 +0,0 @@
# Block lifecycle scripts (preinstall/install/postinstall/prepare) from dependencies.
# This neutralizes the dominant npm supply-chain attack vector.
#
# Packages that legitimately need a build step (e.g. esbuild, chromedriver, tree-sitter)
# must be rebuilt explicitly:
#
# npm rebuild --foreground-scripts esbuild chromedriver tree-sitter tree-sitter-json
ignore-scripts=true
# Fail fast if the active Node/npm doesn't match the "engines" field.
engine-strict=true
# Pin exact versions so `npm install <pkg>` writes "1.2.3" not "^1.2.3".
save-exact=true
# Surface CVE warnings during install; doesn't block.
audit=true
# Suppress funding banners.
fund=false


@@ -34,7 +34,6 @@ packages/django-channels-postgres @goauthentik/backend
packages/django-postgres-cache @goauthentik/backend
packages/django-dramatiq-postgres @goauthentik/backend
# Web packages
.npmrc @goauthentik/frontend
tsconfig.json @goauthentik/frontend
package.json @goauthentik/frontend
package-lock.json @goauthentik/frontend

Cargo.lock generated

@@ -17,6 +17,18 @@ version = "2.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "320119579fcad9c21884f5c4861d16174d0e06250625266f50fe6898340abefa"
[[package]]
name = "ahash"
version = "0.8.12"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5a15f179cd60c4584b8a8c596927aadc462e27f2ca70c04e0071964a73ba7a75"
dependencies = [
"cfg-if",
"once_cell",
"version_check",
"zerocopy",
]
[[package]]
name = "aho-corasick"
version = "1.1.4"
@@ -171,7 +183,7 @@ checksum = "1505bd5d3d116872e7271a6d4e16d81d0c8570876c8de68093a09ac269d8aac0"
[[package]]
name = "authentik"
version = "2026.8.0-rc1"
version = "2026.5.0-rc1"
dependencies = [
"arc-swap",
"argh",
@@ -191,12 +203,11 @@ dependencies = [
"tokio",
"tracing",
"uuid",
"which",
]
[[package]]
name = "authentik-axum"
version = "2026.8.0-rc1"
version = "2026.5.0-rc1"
dependencies = [
"authentik-common",
"axum",
@@ -216,7 +227,7 @@ dependencies = [
[[package]]
name = "authentik-client"
version = "2026.8.0-rc1"
version = "2026.5.0-rc1"
dependencies = [
"aws-lc-rs",
"reqwest",
@@ -232,7 +243,7 @@ dependencies = [
[[package]]
name = "authentik-common"
version = "2026.8.0-rc1"
version = "2026.5.0-rc1"
dependencies = [
"arc-swap",
"authentik-client",
@@ -1003,17 +1014,6 @@ dependencies = [
"pin-project-lite",
]
[[package]]
name = "evmap"
version = "11.0.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1b8874945f036109c72242964c1174cf99434e30cfa45bf45fedc983f50046f8"
dependencies = [
"hashbag",
"left-right",
"smallvec",
]
[[package]]
name = "eyre"
version = "0.6.12"
@@ -1230,21 +1230,6 @@ dependencies = [
"slab",
]
[[package]]
name = "generator"
version = "0.8.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "52f04ae4152da20c76fe800fa48659201d5cf627c5149ca0b707b69d7eef6cf9"
dependencies = [
"cc",
"cfg-if",
"libc",
"log",
"rustversion",
"windows-link",
"windows-result",
]
[[package]]
name = "generic-array"
version = "0.14.7"
@@ -1326,12 +1311,6 @@ dependencies = [
"tracing",
]
[[package]]
name = "hashbag"
version = "0.1.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7040a10f52cba493ddb09926e15d10a9d8a28043708a405931fe4c6f19fac064"
[[package]]
name = "hashbrown"
version = "0.15.5"
@@ -1744,6 +1723,16 @@ dependencies = [
"serde",
]
[[package]]
name = "iri-string"
version = "0.7.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d8e7418f59cc01c88316161279a7f665217ae316b388e58a0d10e29f54f1e5eb"
dependencies = [
"memchr",
"serde",
]
[[package]]
name = "is_terminal_polyfill"
version = "1.70.2"
@@ -1879,17 +1868,6 @@ version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09edd9e8b54e49e587e4f6295a7d29c3ea94d469cb40ab8ca70b288248a81db2"
[[package]]
name = "left-right"
version = "0.11.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0f0c21e4c8ff95f487fb34e6f9182875f42c84cef966d29216bf115d9bba835a"
dependencies = [
"crossbeam-utils",
"loom",
"slab",
]
[[package]]
name = "libc"
version = "0.2.183"
@@ -1961,19 +1939,6 @@ version = "0.4.29"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5e5032e24019045c762d3c0f28f5b6b8bbf38563a65908389bf7978758920897"
[[package]]
name = "loom"
version = "0.7.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "419e0dc8046cb947daa77eb95ae174acfbddb7673b4151f56d1eed8e93fbfaca"
dependencies = [
"cfg-if",
"generator",
"scoped-tls",
"tracing",
"tracing-subscriber",
]
[[package]]
name = "lru-slab"
version = "0.1.2"
@@ -2013,22 +1978,21 @@ checksum = "f8ca58f447f06ed17d5fc4043ce1b10dd205e060fb3ce5b979b8ed8e59ff3f79"
[[package]]
name = "metrics"
version = "0.24.5"
version = "0.24.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ff56c2e7dce6bd462e3b8919986a617027481b1dcc703175b58cf9dd98a2f071"
checksum = "5d5312e9ba3771cfa961b585728215e3d972c950a3eed9252aa093d6301277e8"
dependencies = [
"ahash",
"portable-atomic",
"rapidhash",
]
[[package]]
name = "metrics-exporter-prometheus"
version = "0.18.3"
version = "0.18.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1db0d8f1fc9e62caebd0319e11eaec5822b0186c171568f0480b46a0137f9108"
checksum = "3589659543c04c7dc5526ec858591015b87cd8746583b51b48ef4353f99dbcda"
dependencies = [
"base64 0.22.1",
"evmap",
"indexmap",
"metrics",
"metrics-util",
@@ -2047,7 +2011,7 @@ dependencies = [
"hashbrown 0.16.1",
"metrics",
"quanta",
"rand 0.9.4",
"rand 0.9.2",
"rand_xoshiro",
"sketches-ddsketch",
]
@@ -2734,7 +2698,7 @@ dependencies = [
"bytes",
"getrandom 0.3.4",
"lru-slab",
"rand 0.9.4",
"rand 0.9.2",
"ring",
"rustc-hash",
"rustls",
@@ -2794,9 +2758,9 @@ dependencies = [
[[package]]
name = "rand"
version = "0.9.4"
version = "0.9.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "44c5af06bb1b7d3216d91932aed5265164bf384dc89cd6ba05cf59a35f5f76ea"
checksum = "6db2770f06117d490610c7488547d543617b21bfa07796d7a12f6f1bd53850d1"
dependencies = [
"rand_chacha 0.9.0",
"rand_core 0.9.5",
@@ -2849,15 +2813,6 @@ dependencies = [
"rand_core 0.9.5",
]
[[package]]
name = "rapidhash"
version = "4.4.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b5e48930979c155e2f33aa36ab3119b5ee81332beb6482199a8ecd6029b80b59"
dependencies = [
"rustversion",
]
[[package]]
name = "raw-cpuid"
version = "11.6.0"
@@ -2916,9 +2871,9 @@ checksum = "dc897dd8d9e8bd1ed8cdad82b5966c3e0ecae09fb1907d58efaa013543185d0a"
[[package]]
name = "reqwest"
version = "0.13.3"
version = "0.13.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "62e0021ea2c22aed41653bc7e1419abb2c97e038ff2c33d0e1309e49a97deec0"
checksum = "ab3f43e3283ab1488b624b44b0e988d0acea0b3214e694730a055cb6b2efa801"
dependencies = [
"base64 0.22.1",
"bytes",
@@ -3150,12 +3105,6 @@ dependencies = [
"windows-sys 0.61.2",
]
[[package]]
name = "scoped-tls"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e1cf6437eb19a8f4a6cc0f7dca544973b0b78843adbfeb3683d1a94a0024a294"
[[package]]
name = "scopeguard"
version = "1.2.0"
@@ -3193,9 +3142,9 @@ checksum = "d767eb0aabc880b29956c35734170f26ed551a859dbd361d140cdbeca61ab1e2"
[[package]]
name = "sentry"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "b93b3e19f45495ddd41d8222a152c48c84f6ba45abe9c69e2527e9cdea29bb5b"
checksum = "eb25f439f97d26fea01d717fa626167ceffcd981addaa670001e70505b72acbb"
dependencies = [
"cfg_aliases",
"httpdate",
@@ -3214,9 +3163,9 @@ dependencies = [
[[package]]
name = "sentry-backtrace"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "dc84c325ace9ca2388e510fe7d6672b5d60cd8b3bd0eb4bb4ee8314c323cd686"
checksum = "46a8c2c1bd5c1f735e84f28b48e7d72efcaafc362b7541bc8253e60e8fcdffc6"
dependencies = [
"backtrace",
"regex",
@@ -3225,9 +3174,9 @@ dependencies = [
[[package]]
name = "sentry-contexts"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "896c1ab62dbfe1746fb262bbf72e6feb2fb9dfb2c14709077bf71beb532e44b2"
checksum = "9b88a90baa654d7f0e1f4b667f6b434293d9f72c71bef16b197c76af5b7d5803"
dependencies = [
"hostname",
"libc",
@@ -3239,11 +3188,11 @@ dependencies = [
[[package]]
name = "sentry-core"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d5f5abf20c42cb1593ec1638976e2647da55f79bccac956444c1707b6cce259a"
checksum = "0ac170a5bba8bec6e3339c90432569d89641fa7a3d3e4f44987d24f0762e6adf"
dependencies = [
"rand 0.9.4",
"rand 0.9.2",
"sentry-types",
"serde",
"serde_json",
@@ -3252,9 +3201,9 @@ dependencies = [
[[package]]
name = "sentry-debug-images"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4b88bbe6a760d5724bb40689827e82e8db1e275947df2c59abe171bfc30bb671"
checksum = "dd9646a972b57896d4a92ed200cf76139f8e30b3cfd03b6662ae59926d26633c"
dependencies = [
"findshlibs",
"sentry-core",
@@ -3262,9 +3211,9 @@ dependencies = [
[[package]]
name = "sentry-panic"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0260dcb52562b6a79ae7702312a26dba94b79fb5baee7301087529e5ca4e872e"
checksum = "6127d3d304ba5ce0409401e85aae538e303a569f8dbb031bf64f9ba0f7174346"
dependencies = [
"sentry-backtrace",
"sentry-core",
@@ -3272,9 +3221,9 @@ dependencies = [
[[package]]
name = "sentry-tower"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d669616d5d5279b5712febfc80c343acc3695e499de0d101ed70fceacadf37f2"
checksum = "61c5253dc4ad89863a866b93aeaaac1c9d60f2f774663b5024afe2d57e0a101c"
dependencies = [
"sentry-core",
"tower-layer",
@@ -3283,9 +3232,9 @@ dependencies = [
[[package]]
name = "sentry-tracing"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a1c035f3a0a8671ae1a231c5b457abb68b71acba2bf3054dab2a09a9d4ea487e"
checksum = "27701acc51e68db5281802b709010395bfcbcb128b1d0a4e5873680d3b47ff0c"
dependencies = [
"bitflags 2.11.0",
"sentry-backtrace",
@@ -3296,13 +3245,13 @@ dependencies = [
[[package]]
name = "sentry-types"
version = "0.48.1"
version = "0.47.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "82d8e81058ec155992191f61c7b29bfa7b2cf12012131e7cdc0678020898a7c9"
checksum = "56780cb5597d676bf22e6c11d1f062eb4def46390ea3bfb047bcbcf7dfd19bdb"
dependencies = [
"debugid",
"hex",
"rand 0.9.4",
"rand 0.9.2",
"serde",
"serde_json",
"thiserror 2.0.18",
@@ -3390,9 +3339,9 @@ dependencies = [
[[package]]
name = "serde_with"
version = "3.19.0"
version = "3.18.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f05839ce67618e14a09b286535c0d9c94e85ef25469b0e13cb4f844e5593eb19"
checksum = "dd5414fad8e6907dbdd5bc441a50ae8d6e26151a03b1de04d89a5576de61d01f"
dependencies = [
"base64 0.22.1",
"chrono",
@@ -3924,9 +3873,9 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "tokio"
version = "1.52.3"
version = "1.52.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "8fc7f01b389ac15039e4dc9531aa973a135d7a4135281b12d7c1bc79fd57fffe"
checksum = "b67dee974fe86fd92cc45b7a95fdd2f99a36a6d7b0d431a231178d3d670bbcc6"
dependencies = [
"bytes",
"libc",
@@ -4072,21 +4021,21 @@ dependencies = [
[[package]]
name = "tower-http"
version = "0.6.10"
version = "0.6.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "68d6fdd9f81c2819c9a8b0e0cd91660e7746a8e6ea2ba7c6b2b057985f6bcb51"
checksum = "d4e6559d53cc268e5031cd8429d05415bc4cb4aefc4aa5d6cc35fbf5b924a1f8"
dependencies = [
"bitflags 2.11.0",
"bytes",
"futures-util",
"http",
"http-body",
"iri-string",
"pin-project-lite",
"tokio",
"tower",
"tower-layer",
"tower-service",
"url",
]
[[package]]
@@ -4204,7 +4153,7 @@ dependencies = [
"http",
"httparse",
"log",
"rand 0.9.4",
"rand 0.9.2",
"sha1",
"thiserror 2.0.18",
]
@@ -4566,15 +4515,6 @@ dependencies = [
"rustls-pki-types",
]
[[package]]
name = "which"
version = "8.0.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "81995fafaaaf6ae47a7d0cc83c67caf92aeb7e5331650ae6ff856f7c0c60c459"
dependencies = [
"libc",
]
[[package]]
name = "whoami"
version = "1.6.1"


@@ -8,7 +8,7 @@ members = [
resolver = "3"
[workspace.package]
version = "2026.8.0-rc1"
version = "2026.5.0-rc1"
authors = ["authentik Team <hello@goauthentik.io>"]
description = "Making authentication simple."
edition = "2024"
@@ -43,15 +43,15 @@ hyper-unix-socket = "= 0.6.1"
hyper-util = "= 0.1.20"
ipnet = { version = "= 2.12.0", features = ["serde"] }
json-subscriber = "= 0.2.8"
metrics = "= 0.24.5"
metrics-exporter-prometheus = { version = "= 0.18.3", default-features = false }
metrics = "= 0.24.3"
metrics-exporter-prometheus = { version = "= 0.18.1", default-features = false }
nix = { version = "= 0.31.2", features = ["hostname", "signal"] }
notify = "= 8.2.0"
pin-project-lite = "= 0.2.17"
pyo3 = "= 0.28.3"
pyo3-build-config = "= 0.28.3"
regex = "= 1.12.3"
reqwest = { version = "= 0.13.3", features = [
reqwest = { version = "= 0.13.2", features = [
"form",
"json",
"multipart",
@@ -67,7 +67,7 @@ reqwest-middleware = { version = "= 0.5.1", features = [
"rustls",
] }
rustls = { version = "= 0.23.40", features = ["fips"] }
sentry = { version = "= 0.48.1", default-features = false, features = [
sentry = { version = "= 0.47.0", default-features = false, features = [
"backtrace",
"contexts",
"debug-images",
@@ -80,7 +80,7 @@ sentry = { version = "= 0.48.1", default-features = false, features = [
serde = { version = "= 1.0.228", features = ["derive"] }
serde_json = "= 1.0.149"
serde_repr = "= 0.1.20"
serde_with = { version = "= 3.19.0", default-features = false, features = [
serde_with = { version = "= 3.18.0", default-features = false, features = [
"base64",
] }
sqlx = { version = "= 0.8.6", default-features = false, features = [
@@ -97,12 +97,12 @@ sqlx = { version = "= 0.8.6", default-features = false, features = [
tempfile = "= 3.27.0"
thiserror = "= 2.0.18"
time = { version = "= 0.3.47", features = ["macros"] }
tokio = { version = "= 1.52.3", features = ["full", "tracing"] }
tokio = { version = "= 1.52.1", features = ["full", "tracing"] }
tokio-retry2 = "= 0.9.1"
tokio-rustls = "= 0.26.4"
tokio-util = { version = "= 0.7.18", features = ["full"] }
tower = "= 0.5.3"
tower-http = { version = "= 0.6.10", features = ["timeout"] }
tower-http = { version = "= 0.6.8", features = ["timeout"] }
tracing = "= 0.1.44"
tracing-error = "= 0.2.1"
tracing-subscriber = { version = "= 0.3.23", features = [
@@ -113,11 +113,10 @@ tracing-subscriber = { version = "= 0.3.23", features = [
] }
url = "= 2.5.8"
uuid = { version = "= 1.23.1", features = ["serde", "v4"] }
which = "= 8.0.2"
ak-axum = { package = "authentik-axum", version = "2026.8.0-rc1", path = "./packages/ak-axum" }
ak-client = { package = "authentik-client", version = "2026.8.0-rc1", path = "./packages/client-rust" }
ak-common = { package = "authentik-common", version = "2026.8.0-rc1", path = "./packages/ak-common", default-features = false }
ak-axum = { package = "authentik-axum", version = "2026.5.0-rc1", path = "./packages/ak-axum" }
ak-client = { package = "authentik-client", version = "2026.5.0-rc1", path = "./packages/client-rust" }
ak-common = { package = "authentik-common", version = "2026.5.0-rc1", path = "./packages/ak-common", default-features = false }
[workspace.lints.rust]
ambiguous_negative_literals = "warn"
@@ -283,7 +282,6 @@ sqlx = { workspace = true, optional = true }
tokio.workspace = true
tracing.workspace = true
uuid.workspace = true
which.workspace = true
[lints]
workspace = true


@@ -106,14 +106,18 @@ migrate: ## Run the Authentik Django server's migrations
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service
aws-cfn:
cd lifecycle/aws && npm i && $(UV) run npm run aws-cfn
aws-cfn: node-install
corepack npm install --prefix lifecycle/aws
$(UV) run corepack npm run aws-cfn --prefix lifecycle/aws
run: ## Run the main authentik server and worker processes
$(UV) run ak allinone
run-server: ## Run the main authentik server process
$(UV) run ak server
run-watch: ## Run the authentik server and worker, with auto reloading
watchexec --on-busy-update=restart --stop-signal=SIGINT --exts py,rs,go --no-meta --notify -- $(UV) run ak allinone
run-worker: ## Run the main authentik worker process
$(UV) run ak worker
run-worker-watch: ## Run the authentik worker, with auto reloading
watchexec --on-busy-update=restart --stop-signal=SIGINT --exts py,rs --no-meta --notify -- $(UV) run ak worker
core-i18n-extract:
$(UV) run ak makemessages \
@@ -160,7 +164,7 @@ endif
$(eval current_version := $(shell cat ${PWD}/internal/constants/VERSION))
$(SED_INPLACE) 's/^version = ".*"/version = "$(version)"/' ${PWD}/pyproject.toml
$(SED_INPLACE) 's/^VERSION = ".*"/VERSION = "$(version)"/' ${PWD}/authentik/__init__.py
$(SED_INPLACE) "s/version = \"${current_version}\"/version = \"$(version)\"/" ${PWD}/Cargo.toml ${PWD}/Cargo.lock
$(SED_INPLACE) "s/version = \"${current_version}\"/version = \"$(version)\"/" ${PWD}/Cargo.toml ${PWD}/Cargo.lock
$(MAKE) gen-build gen-compose aws-cfn
$(SED_INPLACE) "s/\"${current_version}\"/\"$(version)\"/" ${PWD}/package.json ${PWD}/package-lock.json ${PWD}/web/package.json ${PWD}/web/package-lock.json
echo -n $(version) > ${PWD}/internal/constants/VERSION
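The `$(SED_INPLACE)` substitution in the bump recipe can be sketched against a throwaway file; `sed -i` here assumes GNU sed (the Makefile's `SED_INPLACE` abstracts the macOS difference), and the version strings are sample values:

```shell
f=$(mktemp)
printf 'version = "2026.5.0-rc1"\n' > "$f"
current_version='2026.5.0-rc1'
version='2026.8.0-rc1'
# Mirror of the recipe's substitution; note the trailing `/` that closes
# the s command -- without it GNU sed fails with "unterminated `s' command".
sed -i "s/version = \"${current_version}\"/version = \"${version}\"/" "$f"
out=$(cat "$f")
echo "$out"
rm -f "$f"
```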
@@ -228,51 +232,47 @@ gen-dev-config: ## Generate a local development config file
## Node.js
#########################
# Packages whose install/postinstall scripts are required for correct
# operation (binary downloads, native bindings). The root .npmrc sets
# `ignore-scripts=true` to block dependency lifecycle scripts by default;
# this list is rebuilt explicitly with scripts re-enabled. Audit any
# additions: each entry runs arbitrary code at install time.
TRUSTED_INSTALL_SCRIPTS := esbuild chromedriver tree-sitter tree-sitter-json
node-install: ## Install the necessary libraries to build Node.js packages
npm ci
node ./scripts/node/setup-corepack.mjs
node ./scripts/node/lint-runtime.mjs
#########################
## Web
#########################
web-install: ## Install the necessary libraries to build the Authentik UI
npm ci --prefix web
node ./scripts/node/lint-runtime.mjs web
web-postinstall: ## Trigger postinstall scripts for packages with native bindings or binary downloads, which are blocked by default for security reasons.
npm rebuild --prefix web --ignore-scripts=false --foreground-scripts $(TRUSTED_INSTALL_SCRIPTS)
corepack npm ci
corepack npm ci --prefix web
web-build: node-install ## Build the Authentik UI
npm run --prefix web build
web-build: ## Build the Authentik UI
corepack npm run --prefix web build
web: web-lint-fix web-lint web-check-compile ## Automatically fix formatting issues in the Authentik UI source code, lint the code, and compile it
web-test: ## Run tests for the Authentik UI
npm run --prefix web test
corepack npm run --prefix web test
web-watch: ## Build and watch the Authentik UI for changes, updating automatically
npm run --prefix web watch
corepack npm run --prefix web watch
web-storybook-watch: ## Build and run the storybook documentation server
npm run --prefix web storybook
corepack npm run --prefix web storybook
web-lint-fix:
npm run --prefix web prettier
corepack npm run --prefix web prettier
web-lint:
npm run --prefix web lint
npm run --prefix web lit-analyse
corepack npm run --prefix web lint
corepack npm run --prefix web lit-analyse
web-check-compile:
npm run --prefix web tsc
corepack npm run --prefix web tsc
web-i18n-extract:
npm run --prefix web extract-locales
corepack npm run --prefix web extract-locales
#########################
## Docs
@@ -280,35 +280,40 @@ web-i18n-extract:
docs: docs-lint-fix docs-build ## Automatically fix formatting issues in the Authentik docs source code, lint the code, and compile it
docs-install: node-install
npm ci --prefix website
docs-install: node-install ## Install the necessary libraries to build the Authentik documentation
node ./scripts/node/lint-runtime.mjs
corepack npm ci
corepack npm ci --prefix website
docs-lint-fix: lint-spellcheck
npm run --prefix website prettier
corepack npm run --prefix website prettier
docs-build:
npm run --prefix website build
node ./scripts/node/lint-runtime.mjs website
corepack npm run --prefix website build
docs-watch: ## Build and watch the topics documentation
npm run --prefix website start
corepack npm run --prefix website start
integrations: docs-lint-fix integrations-build ## Fix formatting issues in the integrations source code, lint the code, and compile it
integrations-build:
npm run --prefix website -w integrations build
corepack npm run --prefix website -w integrations build
integrations-watch: ## Build and watch the Integrations documentation
npm run --prefix website -w integrations start
corepack npm run --prefix website -w integrations start
docs-api-build:
npm run --prefix website -w api build
corepack npm run --prefix website -w api build
docs-api-watch: ## Build and watch the API documentation
npm run --prefix website -w api generate
npm run --prefix website -w api start
corepack npm run --prefix website -w api generate
corepack npm run --prefix website -w api start
docs-api-clean: ## Clean generated API documentation
npm run --prefix website -w api build:api:clean
corepack npm run --prefix website -w api build:api:clean
#########################
## Docker

View File

@@ -36,8 +36,6 @@ Our [enterprise offering](https://goauthentik.io/pricing) is available for organ
See the [Developer Documentation](https://docs.goauthentik.io/docs/developer-docs/) for information about setting up local build environments, testing your contributions, and our contribution process.
When you contribute documentation, either to accompany a code change or as a standalone contribution, please be sure to follow our documentation [Style Guide](website/docs/developer-docs/docs/style-guide.mdx).
## Security
Please see [SECURITY.md](SECURITY.md).

View File

@@ -3,7 +3,7 @@
from functools import lru_cache
from os import environ
VERSION = "2026.8.0-rc1"
VERSION = "2026.5.0-rc1"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"

View File

@@ -42,29 +42,11 @@ def validate_auth(header: bytes, format="bearer") -> str | None:
return auth_credentials
class VirtualUser(AnonymousUser):
is_active = True
@property
def type(self):
return UserTypes.INTERNAL_SERVICE_ACCOUNT
@property
def is_anonymous(self):
return False
@property
def is_authenticated(self):
return True
def all_roles(self):
return []
class IPCUser(VirtualUser):
class IPCUser(AnonymousUser):
"""'Virtual' user for IPC communication between authentik core and the authentik router"""
username = "authentik:system"
is_active = True
is_superuser = True
@property
@@ -80,6 +62,17 @@ class IPCUser(VirtualUser):
def has_module_perms(self, module):
return True
@property
def is_anonymous(self):
return False
@property
def is_authenticated(self):
return True
def all_roles(self):
return []
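The refactor above hoists the authenticated-but-not-a-database-row behaviour into a shared `VirtualUser` base that `IPCUser` (and later `DeviceUser`) inherit. A dependency-free sketch of the pattern, with Django's `AnonymousUser` replaced by a hypothetical stand-in:

```python
class AnonymousUserStandIn:
    # Hypothetical stand-in for django.contrib.auth.models.AnonymousUser.
    is_active = False

    @property
    def is_anonymous(self):
        return True

    @property
    def is_authenticated(self):
        return False


class VirtualUser(AnonymousUserStandIn):
    """Shared base: presents as authenticated while backed by no DB row."""

    is_active = True

    @property
    def is_anonymous(self):
        return False

    @property
    def is_authenticated(self):
        return True

    def all_roles(self):
        return []


class IPCUser(VirtualUser):
    # Only the identity differs per virtual-user subclass.
    username = "authentik:system"
    is_superuser = True
```

Subclasses then only override identity and permissions, rather than re-declaring the authentication properties each time.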
class TokenAuthentication(BaseAuthentication):
"""Token-based authentication using HTTP Bearer authentication"""

View File

@@ -1,36 +0,0 @@
from django.db.models import F, QuerySet
from rest_framework.filters import OrderingFilter
from rest_framework.request import Request
from rest_framework.views import APIView
class NullsAwareOrderingFilter(OrderingFilter):
"""OrderingFilter that sorts NULL values consistently.
For any nullable field, NULLs are treated as the smallest possible value:
- ascending → NULLs appear first (nulls_first=True)
- descending → NULLs appear last (nulls_last=True)
"""
def _nullable_field_names(self, queryset: QuerySet) -> set[str]:
return {f.name for f in queryset.model._meta.get_fields() if hasattr(f, "null") and f.null}
def filter_queryset(self, request: Request, queryset: QuerySet, view: APIView):
queryset = super().filter_queryset(request, queryset, view)
ordering = queryset.query.order_by
if not ordering:
return queryset
nullable = self._nullable_field_names(queryset)
new_ordering = []
changed = False
for term in ordering:
name = term.lstrip("-")
if name in nullable:
changed = True
if term.startswith("-"):
new_ordering.append(F(name).desc(nulls_last=True))
else:
new_ordering.append(F(name).asc(nulls_first=True))
else:
new_ordering.append(term)
return queryset.order_by(*new_ordering) if changed else queryset
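The rewrite rule in `filter_queryset` is independent of the ORM. A pure-Python sketch of the same logic, with Django's `F(...).asc()/.desc()` expressions replaced by plain tuples for illustration:

```python
def rewrite_ordering(ordering, nullable_fields):
    # Same rule as NullsAwareOrderingFilter: for nullable fields, ascending
    # terms sort NULLs first and descending terms sort NULLs last; other
    # terms pass through unchanged as plain strings.
    rewritten = []
    for term in ordering:
        name = term.lstrip("-")
        if name in nullable_fields:
            if term.startswith("-"):
                rewritten.append((name, "desc", "nulls_last"))
            else:
                rewritten.append((name, "asc", "nulls_first"))
        else:
            rewritten.append(term)
    return rewritten
```

This mirrors the mixed-ordering test below: only the nullable term is rewritten, and NULLs end up at the same end of the result set regardless of sort direction.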

View File

@@ -1,59 +0,0 @@
from django.db.models import OrderBy
from django.test import TestCase
from rest_framework.request import Request
from rest_framework.test import APIRequestFactory
from authentik.api.ordering import NullsAwareOrderingFilter
from authentik.core.models import Token, User
class MockView:
ordering_fields = "__all__"
ordering = None
class TestNullsAwareOrderingFilter(TestCase):
def setUp(self):
self.filter = NullsAwareOrderingFilter()
self.view = MockView()
factory = APIRequestFactory()
self._req = lambda ordering: Request(factory.get("/", {"ordering": ordering}))
def _order_by(self, model, ordering):
qs = model.objects.all()
return self.filter.filter_queryset(self._req(ordering), qs, self.view).query.order_by
def test_nullable_asc_nulls_first(self):
"""Ascending sort on a nullable field rewrites to nulls_first=True."""
(expr,) = self._order_by(User, "last_login")
self.assertIsInstance(expr, OrderBy)
self.assertFalse(expr.descending)
self.assertTrue(expr.nulls_first)
def test_nullable_desc_nulls_last(self):
"""Descending sort on a nullable field rewrites to nulls_last=True."""
(expr,) = self._order_by(User, "-last_login")
self.assertIsInstance(expr, OrderBy)
self.assertTrue(expr.descending)
self.assertTrue(expr.nulls_last)
def test_non_nullable_passes_through(self):
"""Non-nullable fields are left as plain string terms."""
(expr,) = self._order_by(User, "username")
self.assertEqual(expr, "username")
def test_mixed_ordering(self):
"""Only nullable terms are rewritten; non-nullable terms pass through unchanged."""
first, second = self._order_by(User, "username,-last_login")
self.assertEqual(first, "username")
self.assertIsInstance(second, OrderBy)
self.assertTrue(second.descending)
self.assertTrue(second.nulls_last)
def test_expires_nullable(self):
"""expires on ExpiringModel is nullable and is rewritten correctly."""
(expr,) = self._order_by(Token, "-expires")
self.assertIsInstance(expr, OrderBy)
self.assertTrue(expr.descending)
self.assertTrue(expr.nulls_last)

View File

@@ -1,6 +1,5 @@
"""Serializer mixin for managed models"""
from json import JSONDecodeError, loads
from typing import cast
from django.conf import settings
@@ -45,7 +44,6 @@ class BlueprintUploadSerializer(PassiveSerializer):
file = FileField(required=False)
path = CharField(required=False)
context = CharField(required=False, allow_blank=True)
def validate_path(self, path: str) -> str:
"""Ensure the path (if set) specified is retrievable"""
@@ -56,18 +54,6 @@ class BlueprintUploadSerializer(PassiveSerializer):
raise ValidationError(_("Blueprint file does not exist"))
return path
def validate_context(self, context: str) -> dict:
"""Parse context as a JSON object"""
if not context:
return {}
try:
parsed = loads(context)
except JSONDecodeError as exc:
raise ValidationError(_("Context must be valid JSON")) from exc
if not isinstance(parsed, dict):
raise ValidationError(_("Context must be a JSON object"))
return parsed
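Framework aside, `validate_context` reduces to three checks: blank normalizes to an empty dict, the string must parse as JSON, and the parsed value must be an object. A standalone sketch, with `ValueError` standing in for DRF's `ValidationError`:

```python
import json


def parse_context(raw: str) -> dict:
    # Blank input normalizes to an empty context.
    if not raw:
        return {}
    try:
        parsed = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError("Context must be valid JSON") from exc
    # Arrays and scalars are valid JSON but not valid contexts.
    if not isinstance(parsed, dict):
        raise ValueError("Context must be a JSON object")
    return parsed
```

The three error paths correspond one-to-one with the blank-context, invalid-JSON, and non-object-context API tests further down.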
class ManagedSerializer:
"""Managed Serializer"""
@@ -217,7 +203,10 @@ class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet):
@extend_schema(
request={"multipart/form-data": BlueprintUploadSerializer},
responses={200: BlueprintImportResultSerializer},
responses={
204: BlueprintImportResultSerializer,
400: BlueprintImportResultSerializer,
},
)
@action(url_path="import", detail=False, methods=["POST"], parser_classes=(MultiPartParser,))
@validate(
@@ -235,8 +224,7 @@ class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet):
).retrieve_file()
else:
raise ValidationError("Either path or file must be set")
context = body.validated_data.get("context") or {}
importer = Importer.from_string(string_contents, context)
importer = Importer.from_string(string_contents)
check_blueprint_perms(importer.blueprint, request.user)
@@ -244,13 +232,21 @@ class BlueprintInstanceViewSet(UsedByMixin, ModelViewSet):
import_response = self.BlueprintImportResultSerializer(
data={
"logs": [LogEventSerializer(log).data for log in logs],
"success": valid,
"logs": [],
"success": False,
}
)
import_response.is_valid(raise_exception=True)
if valid:
import_response.initial_data["success"] = importer.apply()
import_response.is_valid()
import_response.initial_data["logs"] = [LogEventSerializer(log).data for log in logs]
import_response.initial_data["success"] = valid
import_response.is_valid()
if not valid:
return Response(data=import_response.initial_data, status=200)
successful = importer.apply()
import_response.initial_data["success"] = successful
import_response.is_valid()
if not successful:
return Response(data=import_response.initial_data, status=200)
return Response(data=import_response.initial_data, status=200)
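The restructured import flow always returns a 200-style result payload and instead downgrades the `success` flag at each stage that can fail (validation, then apply). A condensed sketch of that control flow, with the serializer machinery elided:

```python
def build_import_result(valid: bool, logs: list, apply) -> dict:
    # Stage 1: validation failed -> report logs, success=False.
    result = {"logs": list(logs), "success": valid}
    if not valid:
        return result
    # Stage 2: blueprint applied -> success reflects the apply() outcome.
    result["success"] = apply()
    return result
```

Callers can branch on `result["success"]` rather than on the HTTP status, which is what the updated tests assert.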

View File

@@ -1,19 +1,14 @@
"""Test blueprints v1 api"""
from json import dumps, loads
from json import loads
from tempfile import NamedTemporaryFile, mkdtemp
from django.core.files.uploadedfile import SimpleUploadedFile
from django.urls import reverse
from rest_framework.test import APITestCase
from yaml import dump
from authentik.core.tests.utils import create_test_admin_user
from authentik.flows.models import Flow
from authentik.lib.config import CONFIG
from authentik.lib.generators import generate_id
from authentik.stages.invitation.models import InvitationStage
from authentik.stages.user_write.models import UserWriteStage
TMP = mkdtemp("authentik-blueprints")
@@ -85,121 +80,3 @@ class TestBlueprintsV1API(APITestCase):
res.content.decode(),
{"content": ["Failed to validate blueprint", "- Invalid blueprint version"]},
)
def test_api_import_with_context(self):
"""Test that the import endpoint applies the supplied context to the real blueprint"""
slug = f"invitation-enrollment-{generate_id()}"
flow_name = f"Invitation Enrollment {generate_id()}"
stage_name = f"invitation-stage-{generate_id()}"
user_type = "internal"
continue_without_invitation = True
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": dumps(
{
"flow_slug": slug,
"flow_name": flow_name,
"stage_name": stage_name,
"continue_flow_without_invitation": continue_without_invitation,
"user_type": user_type,
}
),
},
format="multipart",
)
self.assertEqual(res.status_code, 200)
self.assertTrue(res.json()["success"])
flow = Flow.objects.get(slug=slug)
self.assertEqual(flow.name, flow_name)
self.assertEqual(flow.title, flow_name)
invitation_stage = InvitationStage.objects.get(name=stage_name)
self.assertEqual(
invitation_stage.continue_flow_without_invitation,
continue_without_invitation,
)
user_write_stage = UserWriteStage.objects.get(
name=f"invitation-enrollment-user-write-{slug}"
)
self.assertEqual(user_write_stage.user_type, user_type)
self.assertEqual(user_write_stage.user_path_template, f"users/{user_type}")
def test_api_import_blank_path(self):
"""Validator returns empty path unchanged (covers api.py:53)."""
with NamedTemporaryFile(mode="w+", suffix=".yaml") as file:
file.write(dump({"version": 1, "entries": []}))
file.flush()
file.seek(0)
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={"path": "", "file": file},
format="multipart",
)
self.assertEqual(res.status_code, 200)
def test_api_import_invalid_blueprint_returns_result_payload(self):
"""Invalid blueprint content returns a result payload instead of a 400 response."""
file = SimpleUploadedFile("invalid-blueprint.yaml", b'{"version": 3}')
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={"file": file},
format="multipart",
)
self.assertEqual(res.status_code, 200)
self.assertFalse(res.json()["success"])
self.assertGreater(len(res.json()["logs"]), 0)
def test_api_import_unknown_path(self):
"""Path not in available blueprints is rejected (covers api.py:56)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={"path": "does/not/exist.yaml"},
format="multipart",
)
self.assertEqual(res.status_code, 400)
self.assertIn("Blueprint file does not exist", res.content.decode())
def test_api_import_blank_context(self):
"""Blank context is normalized to empty dict (covers api.py:62)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": "",
},
format="multipart",
)
self.assertEqual(res.status_code, 200)
def test_api_import_invalid_json_context(self):
"""Malformed JSON context raises ValidationError (covers api.py:65-66)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": "{not json",
},
format="multipart",
)
self.assertEqual(res.status_code, 400)
self.assertIn("Context must be valid JSON", res.content.decode())
def test_api_import_non_object_context(self):
"""JSON context that isn't an object is rejected (covers api.py:68)."""
res = self.client.post(
reverse("authentik_api:blueprintinstance-import-"),
data={
"path": "example/flows-invitation-enrollment-minimal.yaml",
"context": "[1, 2, 3]",
},
format="multipart",
)
self.assertEqual(res.status_code, 400)
self.assertIn("Context must be a JSON object", res.content.decode())

View File

@@ -32,19 +32,19 @@ from authentik.rbac.decorators import permission_required
class UserAgentDeviceDict(TypedDict):
"""User agent device"""
brand: str | None = None
brand: str
family: str
model: str | None = None
model: str
class UserAgentOSDict(TypedDict):
"""User agent os"""
family: str
major: str | None = None
minor: str | None = None
patch: str | None = None
patch_minor: str | None = None
major: str
minor: str
patch: str
patch_minor: str
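The removed `= None` defaults explain this hunk: `TypedDict` fields cannot carry default values, so optionality has to be expressed through totality (or `NotRequired`) rather than assignments. A minimal sketch of the general rule, not the exact types above:

```python
from typing import TypedDict


class UserAgentOS(TypedDict, total=False):
    # With total=False every key is optional at construction time;
    # writing `family: str = None` inside a TypedDict body is invalid.
    family: str
    major: str
    minor: str


partial: UserAgentOS = {"family": "macOS"}
```

The dicts in this diff instead declare every key required (`total=True`, the default), which matches what the user-agent parser actually produces.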
class UserAgentBrowserDict(TypedDict):

View File

@@ -246,25 +246,6 @@ class GroupSerializer(ModelSerializer):
)
return superuser
def validate_users(self, users: list) -> list:
"""Require add_user_to_group permission when adding new members via group PATCH."""
request: Request = self.context.get("request", None)
if not request:
return users
if not self.instance:
return users
# BulkManyRelatedField returns raw PKs, not model instances
current_user_pks = set(self.instance.users.values_list("pk", flat=True))
new_users = [u for u in users if u not in current_user_pks]
if not new_users:
return users
has_perm = request.user.has_perm(
"authentik_core.add_user_to_group"
) or request.user.has_perm("authentik_core.add_user_to_group", self.instance)
if not has_perm:
raise ValidationError(_("User does not have permission to add members to this group."))
return users
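The new-member detection above hinges on comparing raw primary keys, since `BulkManyRelatedField` submits PKs rather than model instances. The set difference can be sketched on its own, with hypothetical integer PKs:

```python
def added_members(current_pks, submitted_pks):
    # Only PKs not already in the group are "new" and need the
    # add_user_to_group permission check; resubmitting existing
    # members passes through without one.
    current = set(current_pks)
    return [pk for pk in submitted_pks if pk not in current]
```

This is why the `test_patch_existing_users_no_perm` case below succeeds: resubmitting the current membership yields an empty new-member list.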
class Meta:
model = Group
fields = [

View File

@@ -297,36 +297,6 @@ class UserSerializer(ModelSerializer):
raise ValidationError(_("Setting a user to internal service account is not allowed."))
return user_type
def validate_groups(self, groups: list) -> list:
"""Require enable_group_superuser permission when adding a user to a superuser group."""
request: Request = self.context.get("request", None)
if not request:
return groups
current_groups = set(self.instance.groups.all()) if self.instance else set()
for group in groups:
if not group.is_superuser:
continue
if group in current_groups:
continue
if not request.user.has_perm("authentik_core.enable_group_superuser"):
raise ValidationError(
_("User does not have permission to add members to a superuser group.")
)
return groups
def validate_roles(self, roles: list) -> list:
"""Require change_role permission when assigning new roles to a user."""
request: Request = self.context.get("request", None)
if not request:
return roles
current_roles = set(self.instance.roles.all()) if self.instance else set()
new_roles = [r for r in roles if r not in current_roles]
if not new_roles:
return roles
if not request.user.has_perm("authentik_rbac.change_role"):
raise ValidationError(_("User does not have permission to assign roles."))
return roles
def validate(self, attrs: dict) -> dict:
if self.instance and self.instance.type == UserTypes.INTERNAL_SERVICE_ACCOUNT:
raise ValidationError(_("Can't modify internal service account users"))

View File

@@ -1,5 +1,6 @@
"""authentik core signals"""
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.contrib.auth.signals import user_logged_in
from django.core.cache import cache
@@ -58,7 +59,7 @@ def user_logged_in_session(sender, request: HttpRequest, user: User, **_):
layer = get_channel_layer()
device_cookie = request.COOKIES.get("authentik_device")
if device_cookie:
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_device_group(device_cookie),
{"type": "event.session.authenticated"},
)
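`async_to_sync` bridges the coroutine-only channel-layer API into this synchronous signal handler. A dependency-free sketch of that bridge using `asyncio` directly (asgiref's real implementation additionally copes with already-running event loops and thread affinity):

```python
import asyncio


async def group_send(group: str, message: dict):
    # Stand-in for the channel layer's async group_send.
    return (group, message)


def async_to_sync_call(coro_fn, *args):
    # Minimal bridge: drive the coroutine to completion on a fresh loop.
    return asyncio.run(coro_fn(*args))


sent = async_to_sync_call(
    group_send, "device-abc123", {"type": "event.session.authenticated"}
)
```

The `group_send_blocking` call it replaces does not exist on standard channel layers, which is the bug this hunk fixes.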

View File

@@ -158,58 +158,3 @@ class TestGroupsAPI(APITestCase):
data={"name": generate_id(), "is_superuser": True},
)
self.assertEqual(res.status_code, 201)
def test_patch_users_no_perm(self):
"""PATCH group with new users without add_user_to_group must be rejected."""
group = Group.objects.create(name=generate_id())
self.login_user.assign_perms_to_managed_role("authentik_core.view_group", group)
self.login_user.assign_perms_to_managed_role("authentik_core.change_group", group)
self.client.force_login(self.login_user)
res = self.client.patch(
reverse("authentik_api:group-detail", kwargs={"pk": group.pk}),
data={"users": [self.user.pk]},
content_type="application/json",
)
self.assertEqual(res.status_code, 400)
def test_patch_users_with_global_perm(self):
"""PATCH group with new users with global add_user_to_group must succeed."""
group = Group.objects.create(name=generate_id())
self.login_user.assign_perms_to_managed_role("authentik_core.view_group", group)
self.login_user.assign_perms_to_managed_role("authentik_core.change_group", group)
self.login_user.assign_perms_to_managed_role("authentik_core.add_user_to_group")
self.client.force_login(self.login_user)
res = self.client.patch(
reverse("authentik_api:group-detail", kwargs={"pk": group.pk}),
data={"users": [self.user.pk]},
content_type="application/json",
)
self.assertEqual(res.status_code, 200)
def test_patch_users_with_obj_perm(self):
"""PATCH group with new users with object-level add_user_to_group must succeed."""
group = Group.objects.create(name=generate_id())
self.login_user.assign_perms_to_managed_role("authentik_core.view_group", group)
self.login_user.assign_perms_to_managed_role("authentik_core.change_group", group)
self.login_user.assign_perms_to_managed_role("authentik_core.add_user_to_group", group)
self.client.force_login(self.login_user)
res = self.client.patch(
reverse("authentik_api:group-detail", kwargs={"pk": group.pk}),
data={"users": [self.user.pk]},
content_type="application/json",
)
self.assertEqual(res.status_code, 200)
def test_patch_existing_users_no_perm(self):
"""PATCH group keeping existing membership without add_user_to_group must succeed."""
group = Group.objects.create(name=generate_id())
group.users.add(self.user)
self.login_user.assign_perms_to_managed_role("authentik_core.view_group", group)
self.login_user.assign_perms_to_managed_role("authentik_core.change_group", group)
self.client.force_login(self.login_user)
res = self.client.patch(
reverse("authentik_api:group-detail", kwargs={"pk": group.pk}),
data={"users": [self.user.pk]},
content_type="application/json",
)
self.assertEqual(res.status_code, 200)

View File

@@ -12,7 +12,6 @@ from authentik.brands.models import Brand
from authentik.core.models import (
USER_ATTRIBUTE_TOKEN_EXPIRING,
AuthenticatedSession,
Group,
Session,
Token,
User,
@@ -26,7 +25,6 @@ from authentik.core.tests.utils import (
)
from authentik.flows.models import FlowAuthenticationRequirement, FlowDesignation
from authentik.lib.generators import generate_id, generate_key
from authentik.rbac.models import Role
from authentik.stages.email.models import EmailStage
INVALID_PASSWORD_HASH = "not-a-valid-hash"
@@ -941,79 +939,3 @@ class TestUsersAPI(APITestCase):
self.assertIn(user2.pk, pks)
# Verify user2 comes before user1 in descending order
self.assertLess(pks.index(user2.pk), pks.index(user1.pk))
class TestUsersAPIGroupRoleValidation(APITestCase):
"""Test that PATCH /api/v3/core/users/{pk}/ enforces group and role permission checks."""
def setUp(self) -> None:
self.actor = create_test_user()
self.target = create_test_user()
def _patch(self, data: dict):
self.client.force_login(self.actor)
return self.client.patch(
reverse("authentik_api:user-detail", kwargs={"pk": self.target.pk}),
data=data,
content_type="application/json",
)
def test_patch_superuser_group_no_perm(self):
"""Assigning a superuser group without enable_group_superuser must be rejected."""
self.actor.assign_perms_to_managed_role("authentik_core.view_user")
self.actor.assign_perms_to_managed_role("authentik_core.change_user", self.target)
group = Group.objects.create(name=generate_id(), is_superuser=True)
res = self._patch({"groups": [str(group.pk)]})
self.assertEqual(res.status_code, 400)
def test_patch_superuser_group_with_perm(self):
"""Assigning a superuser group with enable_group_superuser must succeed."""
self.actor.assign_perms_to_managed_role("authentik_core.view_user")
self.actor.assign_perms_to_managed_role("authentik_core.change_user", self.target)
self.actor.assign_perms_to_managed_role("authentik_core.enable_group_superuser")
group = Group.objects.create(name=generate_id(), is_superuser=True)
res = self._patch({"groups": [str(group.pk)]})
self.assertEqual(res.status_code, 200)
def test_patch_non_superuser_group_no_perm(self):
"""Assigning a non-superuser group without special permission must succeed."""
self.actor.assign_perms_to_managed_role("authentik_core.view_user")
self.actor.assign_perms_to_managed_role("authentik_core.change_user", self.target)
group = Group.objects.create(name=generate_id(), is_superuser=False)
res = self._patch({"groups": [str(group.pk)]})
self.assertEqual(res.status_code, 200)
def test_patch_existing_superuser_group_no_perm(self):
"""Keeping an existing superuser group membership without the permission must succeed."""
self.actor.assign_perms_to_managed_role("authentik_core.view_user")
self.actor.assign_perms_to_managed_role("authentik_core.change_user", self.target)
group = Group.objects.create(name=generate_id(), is_superuser=True)
self.target.groups.add(group)
res = self._patch({"groups": [str(group.pk)]})
self.assertEqual(res.status_code, 200)
def test_patch_role_no_perm(self):
"""Assigning a new role without change_role must be rejected."""
self.actor.assign_perms_to_managed_role("authentik_core.view_user")
self.actor.assign_perms_to_managed_role("authentik_core.change_user", self.target)
role = Role.objects.create(name=generate_id())
res = self._patch({"roles": [str(role.pk)]})
self.assertEqual(res.status_code, 400)
def test_patch_role_with_perm(self):
"""Assigning a new role with change_role must succeed."""
self.actor.assign_perms_to_managed_role("authentik_core.view_user")
self.actor.assign_perms_to_managed_role("authentik_core.change_user", self.target)
self.actor.assign_perms_to_managed_role("authentik_rbac.change_role")
role = Role.objects.create(name=generate_id())
res = self._patch({"roles": [str(role.pk)]})
self.assertEqual(res.status_code, 200)
def test_patch_existing_role_no_perm(self):
"""Keeping an existing role without change_role must succeed."""
self.actor.assign_perms_to_managed_role("authentik_core.view_user")
self.actor.assign_perms_to_managed_role("authentik_core.change_user", self.target)
role = Role.objects.create(name=generate_id())
self.target.roles.add(role)
res = self._patch({"roles": [str(role.pk)]})
self.assertEqual(res.status_code, 200)

View File

@@ -7,7 +7,7 @@ from drf_spectacular.utils import OpenApiParameter, OpenApiResponse, extend_sche
from rest_framework.decorators import action
from rest_framework.exceptions import PermissionDenied, ValidationError
from rest_framework.fields import ChoiceField
from rest_framework.permissions import AllowAny, IsAuthenticated
from rest_framework.permissions import IsAuthenticated
from rest_framework.relations import PrimaryKeyRelatedField
from rest_framework.request import Request
from rest_framework.response import Response
@@ -44,6 +44,7 @@ from authentik.stages.password.stage import PLAN_CONTEXT_METHOD, PLAN_CONTEXT_ME
class AgentConnectorSerializer(ConnectorSerializer):
class Meta(ConnectorSerializer.Meta):
model = AgentConnector
fields = ConnectorSerializer.Meta.fields + [
@@ -62,6 +63,7 @@ class AgentConnectorSerializer(ConnectorSerializer):
class MDMConfigSerializer(PassiveSerializer):
platform = ChoiceField(choices=OSFamily.choices)
enrollment_token = PrimaryKeyRelatedField(
queryset=EnrollmentToken.objects.including_expired().all()
@@ -87,6 +89,7 @@ class AgentConnectorViewSet(
UsedByMixin,
ModelViewSet,
):
queryset = AgentConnector.objects.all()
serializer_class = AgentConnectorSerializer
search_fields = ["name"]
@@ -118,8 +121,6 @@ class AgentConnectorViewSet(
methods=["POST"],
detail=False,
authentication_classes=[AgentEnrollmentAuth],
# Permissions are handled via AgentEnrollmentAuth
permission_classes=[AllowAny],
)
def enroll(self, request: Request):
token: EnrollmentToken = request.auth
@@ -150,13 +151,7 @@ class AgentConnectorViewSet(
request=OpenApiTypes.NONE,
responses=AgentConfigSerializer(),
)
@action(
methods=["GET"],
detail=False,
authentication_classes=[AgentAuth],
# Permissions are handled via AgentAuth
permission_classes=[AllowAny],
)
@action(methods=["GET"], detail=False, authentication_classes=[AgentAuth])
def agent_config(self, request: Request):
token: DeviceToken = request.auth
connector: AgentConnector = token.device.connector.agentconnector
@@ -170,13 +165,7 @@ class AgentConnectorViewSet(
request=DeviceFacts(),
responses={204: OpenApiResponse(description="Successfully checked in")},
)
@action(
methods=["POST"],
detail=False,
authentication_classes=[AgentAuth],
# Permissions are handled via AgentAuth
permission_classes=[AllowAny],
)
@action(methods=["POST"], detail=False, authentication_classes=[AgentAuth])
def check_in(self, request: Request):
token: DeviceToken = request.auth
data = DeviceFacts(data=request.data)

View File

@@ -1,6 +1,5 @@
from typing import Any
from django.db.models import Model
from django.http import HttpRequest
from django.utils.timezone import now
from drf_spectacular.extensions import OpenApiAuthenticationExtension
@@ -10,7 +9,7 @@ from rest_framework.exceptions import PermissionDenied
from rest_framework.request import Request
from structlog.stdlib import get_logger
from authentik.api.authentication import VirtualUser, validate_auth
from authentik.api.authentication import IPCUser, validate_auth
from authentik.core.middleware import CTX_AUTH_VIA
from authentik.core.models import User
from authentik.crypto.apps import MANAGED_KEY
@@ -26,18 +25,9 @@ LOGGER = get_logger()
PLATFORM_ISSUER = "goauthentik.io/platform"
class DeviceUser(VirtualUser):
class DeviceUser(IPCUser):
username = "authentik:endpoints:device"
def has_perm(self, perm: str, obj: Model | None = None) -> bool:
if perm in [
"authentik_core.view_user",
"authentik_core.view_group",
]:
return True
return False
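The `has_perm` override reduces device identities to a fixed read-only allowlist: everything outside it is denied. Sketched standalone:

```python
READ_ONLY_PERMS = {
    "authentik_core.view_user",
    "authentik_core.view_group",
}


def device_has_perm(perm: str) -> bool:
    # Device users may only read users and groups; all other permissions
    # are denied, which is why the application-list request in the tests
    # below comes back 403 while the user-list request succeeds.
    return perm in READ_ONLY_PERMS
```

Default-deny with an explicit allowlist keeps any newly added permission out of reach of device tokens until it is deliberately granted.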
class AgentEnrollmentAuth(BaseAuthentication):

View File

@@ -223,17 +223,3 @@ class TestAgentAPI(APITestCase):
data={"platform": OSFamily.macOS, "enrollment_token": self.token.pk},
)
self.assertEqual(res.status_code, 200)
def test_users_list(self):
response = self.client.get(
reverse("authentik_api:user-list"),
HTTP_AUTHORIZATION=f"Bearer+agent {self.device_token.key}",
)
self.assertEqual(response.status_code, 200)
def test_other_api_forbidden(self):
response = self.client.get(
reverse("authentik_api:application-list"),
HTTP_AUTHORIZATION=f"Bearer+agent {self.device_token.key}",
)
self.assertEqual(response.status_code, 403)

View File

@@ -2,7 +2,6 @@ from django.urls import reverse
from drf_spectacular.types import OpenApiTypes
from drf_spectacular.utils import extend_schema
from rest_framework.decorators import action
from rest_framework.permissions import AllowAny
from rest_framework.request import Request
from rest_framework.response import Response
from structlog.stdlib import get_logger
@@ -26,13 +25,7 @@ class AgentConnectorViewSetMixin:
request=OpenApiTypes.NONE,
responses=AgentAuthenticationResponse(),
)
@action(
methods=["POST"],
detail=False,
authentication_classes=[AgentAuth],
# Permissions are handled via AgentAuth
permission_classes=[AllowAny],
)
@action(methods=["POST"], detail=False, authentication_classes=[AgentAuth])
@enterprise_action
def auth_ia(self, request: Request) -> Response:
token: DeviceToken = request.auth

View File

@@ -1,72 +1,14 @@
from datetime import datetime
from django.urls import reverse
from django.utils.translation import gettext as _
from rest_framework.exceptions import ValidationError
from authentik.enterprise.license import LicenseKey
from authentik.providers.scim.models import SCIMAuthenticationMode, SCIMProvider
from authentik.sources.oauth.models import UserOAuthSourceConnection
from authentik.providers.scim.models import SCIMAuthenticationMode
class SCIMProviderSerializerMixin:
def _get_token(self, instance: SCIMProvider) -> UserOAuthSourceConnection | None:
user = instance.auth_oauth_user
conn = UserOAuthSourceConnection.objects.filter(
user=user, source=instance.auth_oauth
).first()
return conn
def get_auth_oauth_token_last_updated(self, instance: SCIMProvider) -> datetime | None:
conn = self._get_token(instance)
return conn.last_updated if conn else None
def get_auth_oauth_token_expires(self, instance: SCIMProvider) -> datetime | None:
conn = self._get_token(instance)
return conn.expires if conn else None
def get_auth_oauth_url_callback(self, instance: SCIMProvider) -> str | None:
if (
instance.auth_mode
in [
SCIMAuthenticationMode.TOKEN,
SCIMAuthenticationMode.OAUTH_SILENT,
]
or not instance.backchannel_application
):
return None
relative_url = reverse(
"authentik_enterprise_providers_scim:callback",
kwargs={"application_slug": instance.backchannel_application.slug},
)
if "request" not in self.context:
return relative_url
return self.context["request"].build_absolute_uri(relative_url)
def get_auth_oauth_url_start(self, instance: SCIMProvider) -> str | None:
if (
instance.auth_mode
in [
SCIMAuthenticationMode.TOKEN,
SCIMAuthenticationMode.OAUTH_SILENT,
]
or not instance.backchannel_application
):
return None
relative_url = reverse(
"authentik_enterprise_providers_scim:start",
kwargs={"application_slug": instance.backchannel_application.slug},
)
if "request" not in self.context:
return relative_url
return self.context["request"].build_absolute_uri(relative_url)
def validate_auth_mode(self, auth_mode: SCIMAuthenticationMode) -> SCIMAuthenticationMode:
if auth_mode in [
SCIMAuthenticationMode.OAUTH_SILENT,
SCIMAuthenticationMode.OAUTH_INTERACTIVE,
]:
if auth_mode == SCIMAuthenticationMode.OAUTH:
if not LicenseKey.cached_summary().status.is_valid:
raise ValidationError(_("Enterprise is required to use the OAuth mode."))
return auth_mode
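The `validate_auth_mode` hunk gates the OAuth auth modes behind a valid enterprise license. A standalone sketch of that validator shape, with the license check passed in as a plain boolean instead of `LicenseKey.cached_summary()` (names and mode strings are assumptions for illustration):

```python
# Hypothetical mirror of the serializer hook: OAuth modes require a valid
# enterprise license; other modes pass through untouched.
OAUTH_MODES = {"oauth_silent", "oauth_interactive"}

class ValidationError(Exception):
    """Stand-in for rest_framework.exceptions.ValidationError."""

def validate_auth_mode(auth_mode: str, license_valid: bool) -> str:
    if auth_mode in OAUTH_MODES and not license_valid:
        raise ValidationError("Enterprise is required to use the OAuth mode.")
    return auth_mode
```

Raising inside `validate_<field>` is what produces the 400 response with `{"auth_mode": [...]}` asserted by `test_api_create_no_license` below.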

View File

@@ -7,4 +7,3 @@ class AuthentikEnterpriseProviderSCIMConfig(EnterpriseConfig):
label = "authentik_enterprise_providers_scim"
verbose_name = "authentik Enterprise.Providers.SCIM"
default = True
mountpoint = "application/scim/"

View File

@@ -1,14 +1,12 @@
from datetime import timedelta
from typing import TYPE_CHECKING, Any
from typing import TYPE_CHECKING
from django.utils.timezone import now
from requests import Request, RequestException
from structlog.stdlib import get_logger
from authentik.common.oauth.constants import GRANT_TYPE_PASSWORD, GRANT_TYPE_REFRESH_TOKEN
from authentik.providers.scim.clients.exceptions import SCIMRequestException
from authentik.providers.scim.models import SCIMAuthenticationMode
from authentik.sources.oauth.clients.base import BaseOAuthClient
from authentik.sources.oauth.clients.oauth2 import OAuth2Client
from authentik.sources.oauth.models import OAuthSource, UserOAuthSourceConnection
if TYPE_CHECKING:
@@ -20,26 +18,23 @@ class SCIMOAuthException(SCIMRequestException):
class SCIMOAuthAuth:
def __init__(self, provider: SCIMProvider):
self.provider = provider
self.user = provider.auth_oauth_user
self.logger = get_logger().bind()
self.connection = self.get_connection()
def retrieve_token(self, conn: UserOAuthSourceConnection | None) -> dict[str, Any]:
def retrieve_token(self):
if not self.provider.auth_oauth:
return None
source: OAuthSource = self.provider.auth_oauth
client: BaseOAuthClient = source.source_type.callback_view(request=None).get_client(source)
client = OAuth2Client(source, None)
access_token_url = source.source_type.access_token_url or ""
if source.source_type.urls_customizable and source.access_token_url:
access_token_url = source.access_token_url
data = client.get_access_token_args(None, None)
if self.provider.auth_mode == SCIMAuthenticationMode.OAUTH_SILENT:
data["grant_type"] = GRANT_TYPE_PASSWORD
elif self.provider.auth_mode == SCIMAuthenticationMode.OAUTH_INTERACTIVE:
data["grant_type"] = GRANT_TYPE_REFRESH_TOKEN
if not conn:
raise SCIMOAuthException(None, "Could not refresh SCIM OAuth token")
data["refresh_token"] = conn.refresh_token
data["grant_type"] = "password"
data.update(self.provider.auth_oauth_params)
try:
response = client.do_request(
@@ -59,14 +54,12 @@ class SCIMOAuthAuth:
raise SCIMOAuthException(exc.response, message="Failed to get OAuth token") from exc
def get_connection(self):
if not self.provider.auth_oauth:
return None
conn = UserOAuthSourceConnection.objects.filter(
source=self.provider.auth_oauth, user=self.user
token = UserOAuthSourceConnection.objects.filter(
source=self.provider.auth_oauth, user=self.user, expires__gt=now()
).first()
if conn and conn.access_token and conn.expires > now():
return conn
token = self.retrieve_token(conn)
if token and token.access_token:
return token
token = self.retrieve_token()
access_token = token["access_token"]
expires_in = int(token.get("expires_in", 0))
token, _ = UserOAuthSourceConnection.objects.update_or_create(
@@ -74,10 +67,7 @@ class SCIMOAuthAuth:
user=self.user,
defaults={
"access_token": access_token,
"refresh_token": token.get("refresh_token"),
"expires": now() + timedelta(seconds=expires_in),
# When using `update_or_create`, `last_updated` is not updated
"last_updated": now(),
},
)
return token
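The `get_connection` hunk above implements a reuse-or-refresh pattern: return the stored connection while its access token is unexpired, otherwise fetch a new token and persist it (with `last_updated` forced, since `update_or_create` alone would not touch it). A minimal framework-free sketch of that caching logic, with a dict standing in for the ORM (all names here are illustrative):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Connection:
    access_token: str
    expires: datetime

def get_connection(cache: dict, fetch_token) -> Connection:
    """Reuse a cached, unexpired token; otherwise fetch and store a new one."""
    conn = cache.get("conn")
    if conn and conn.access_token and conn.expires > datetime.now(timezone.utc):
        return conn
    # e.g. a grant_type=password request against the source's token URL
    payload = fetch_token()
    conn = Connection(
        access_token=payload["access_token"],
        expires=datetime.now(timezone.utc)
        + timedelta(seconds=int(payload.get("expires_in", 0))),
    )
    cache["conn"] = conn
    return conn
```

This is why `test_existing_token` later asserts zero HTTP requests: a valid cached connection short-circuits the token exchange entirely.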

View File

@@ -14,10 +14,7 @@ def scim_provider_post_save(sender: type[Model], instance: SCIMProvider, created
"""Create service account before provider is saved"""
identifier = f"ak-providers-scim-{instance.pk}"
with audit_ignore():
if instance.auth_mode in [
SCIMAuthenticationMode.OAUTH_SILENT,
SCIMAuthenticationMode.OAUTH_INTERACTIVE,
]:
if instance.auth_mode == SCIMAuthenticationMode.OAUTH:
user, user_created = User.objects.update_or_create(
username=identifier,
defaults={

View File

@@ -2,7 +2,7 @@
from base64 import b64encode
from datetime import timedelta
from urllib.parse import parse_qs, urlencode, urlparse
from unittest.mock import MagicMock, PropertyMock, patch
from django.urls import reverse
from django.utils.timezone import now
@@ -11,14 +11,17 @@ from rest_framework.test import APITestCase
from authentik.blueprints.tests import apply_blueprint
from authentik.core.models import Application, Group, User
from authentik.core.tests.utils import create_test_admin_user
from authentik.enterprise.license import LicenseKey
from authentik.enterprise.models import License
from authentik.enterprise.tests.test_license import expiry_valid
from authentik.lib.generators import generate_id
from authentik.providers.scim.models import SCIMAuthenticationMode, SCIMMapping, SCIMProvider
from authentik.sources.oauth.models import OAuthSource, UserOAuthSourceConnection
from authentik.tenants.models import Tenant
from tests.live import create_test_admin_user
class TestSCIMOAuthToken(APITestCase):
class SCIMOAuthTests(APITestCase):
"""SCIM User tests"""
@apply_blueprint("system/providers-scim.yaml")
@@ -39,7 +42,7 @@ class TestSCIMOAuthToken(APITestCase):
self.provider = SCIMProvider.objects.create(
name=generate_id(),
url="https://localhost",
auth_mode=SCIMAuthenticationMode.OAUTH_SILENT,
auth_mode=SCIMAuthenticationMode.OAUTH,
auth_oauth=self.source,
auth_oauth_params={
"foo": "bar",
@@ -57,9 +60,8 @@ class TestSCIMOAuthToken(APITestCase):
self.provider.property_mappings_group.add(
SCIMMapping.objects.get(managed="goauthentik.io/providers/scim/group")
)
self.admin = create_test_admin_user()
def test_retrieve_token_silent(self):
def test_retrieve_token(self):
"""Test token retrieval"""
with Mocker() as mocker:
token = generate_id()
@@ -84,44 +86,6 @@ class TestSCIMOAuthToken(APITestCase):
)
self.assertEqual(mocker.request_history[0].body, "grant_type=password&foo=bar")
def test_retrieve_token_interactive(self):
"""Test token retrieval"""
self.provider.auth_mode = SCIMAuthenticationMode.OAUTH_INTERACTIVE
self.provider.save()
refresh_token = generate_id()
access_token = generate_id()
UserOAuthSourceConnection.objects.create(
user=self.provider.auth_oauth_user,
source=self.source,
refresh_token=refresh_token,
access_token=access_token,
)
with Mocker() as mocker:
token = generate_id()
mocker.post("http://localhost/token", json={"access_token": token, "expires_in": 3600})
self.provider.scim_auth()
conn = UserOAuthSourceConnection.objects.filter(
source=self.source,
user=self.provider.auth_oauth_user,
).first()
self.assertIsNotNone(conn)
self.assertTrue(conn.is_valid)
auth = (
b64encode(
b":".join((self.source.consumer_key.encode(), self.source.consumer_secret.encode()))
)
.strip()
.decode()
)
self.assertEqual(
mocker.request_history[0].headers["Authorization"],
f"Basic {auth}",
)
self.assertEqual(
mocker.request_history[0].body,
f"grant_type=refresh_token&refresh_token={refresh_token}&foo=bar",
)
def test_existing_token(self):
"""Test existing token"""
UserOAuthSourceConnection.objects.create(
@@ -134,54 +98,96 @@ class TestSCIMOAuthToken(APITestCase):
self.provider.scim_auth()
self.assertEqual(len(mocker.request_history), 0)
def test_interactive_start(self):
self.client.force_login(self.admin)
res = self.client.get(
reverse(
"authentik_enterprise_providers_scim:start",
kwargs={
"application_slug": self.app.slug,
@Mocker()
def test_user_create(self, mock: Mocker):
"""Test user creation"""
scim_id = generate_id()
token = generate_id()
mock.post("http://localhost/token", json={"access_token": token, "expires_in": 3600})
mock.get(
"https://localhost/ServiceProviderConfig",
json={},
)
mock.post(
"https://localhost/Users",
json={
"id": scim_id,
},
)
uid = generate_id()
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
self.assertEqual(mock.call_count, 3)
self.assertEqual(mock.request_history[1].method, "GET")
self.assertEqual(mock.request_history[2].method, "POST")
self.assertJSONEqual(
mock.request_history[2].body,
{
"schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
"active": True,
"emails": [
{
"primary": True,
"type": "other",
"value": f"{uid}@goauthentik.io",
}
],
"externalId": user.uid,
"name": {
"familyName": uid,
"formatted": f"{uid} {uid}",
"givenName": uid,
},
)
"displayName": f"{uid} {uid}",
"userName": uid,
},
)
self.assertEqual(res.status_code, 302)
query = parse_qs(urlparse(res.url).query)
self.assertEqual(query["client_id"], [self.source.consumer_key])
self.assertEqual(
query["redirect_uri"],
[f"http://testserver/application/scim/{self.app.slug}/oauth2/callback/"],
)
self.assertEqual(query["response_type"], ["code"])
def test_interactive_callback(self):
self.client.force_login(self.admin)
res = self.client.get(
reverse(
"authentik_enterprise_providers_scim:start",
kwargs={
"application_slug": self.app.slug,
},
@patch(
"authentik.enterprise.license.LicenseKey.validate",
MagicMock(
return_value=LicenseKey(
aud="",
exp=expiry_valid,
name=generate_id(),
internal_users=100,
external_users=100,
)
),
)
def test_api_create(self):
License.objects.create(key=generate_id())
self.client.force_login(create_test_admin_user())
res = self.client.post(
reverse("authentik_api:scimprovider-list"),
{
"name": generate_id(),
"url": "http://localhost",
"auth_mode": "oauth",
"auth_oauth": str(self.source.pk),
},
)
self.assertEqual(res.status_code, 302)
query = parse_qs(urlparse(res.url).query)
self.assertEqual(res.status_code, 201)
with Mocker() as mock:
token = generate_id()
mock.post("http://localhost/token", json={"access_token": token, "expires_in": 3600})
res = self.client.get(
reverse(
"authentik_enterprise_providers_scim:callback",
kwargs={
"application_slug": self.app.slug,
},
)
+ "?"
+ urlencode({"state": query["state"][0], "code": generate_id()})
)
self.assertEqual(res.status_code, 302)
conn = UserOAuthSourceConnection.objects.filter(source=self.source).first()
self.assertIsNotNone(conn)
self.assertTrue(conn.is_valid)
@patch(
"authentik.enterprise.models.LicenseUsageStatus.is_valid",
PropertyMock(return_value=False),
)
def test_api_create_no_license(self):
self.client.force_login(create_test_admin_user())
res = self.client.post(
reverse("authentik_api:scimprovider-list"),
{
"name": generate_id(),
"url": "http://localhost",
"auth_mode": "oauth",
"auth_oauth": str(self.source.pk),
},
)
self.assertEqual(res.status_code, 400)
self.assertJSONEqual(
res.content, {"auth_mode": ["Enterprise is required to use the OAuth mode."]}
)
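`test_api_create_no_license` patches `LicenseUsageStatus.is_valid` with a `PropertyMock` so the serializer sees an invalid license without touching the database. The pattern in isolation, using a toy class (the class name here is hypothetical):

```python
from unittest.mock import PropertyMock, patch

class LicenseStatus:
    @property
    def is_valid(self) -> bool:
        return True

# Patch the property on the class, not the instance; instances see the
# mocked value only inside the context manager.
with patch.object(
    LicenseStatus, "is_valid", new_callable=PropertyMock, return_value=False
):
    assert LicenseStatus().is_valid is False
assert LicenseStatus().is_valid is True
```

Properties live on the class, which is why `patch` must target the class attribute rather than an instance attribute.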

View File

@@ -1,73 +0,0 @@
"""SCIM OAuth tests"""
from unittest.mock import MagicMock, PropertyMock, patch
from django.urls import reverse
from rest_framework.test import APITestCase
from authentik.core.tests.utils import create_test_admin_user
from authentik.enterprise.license import LicenseKey
from authentik.enterprise.models import License
from authentik.enterprise.tests.test_license import expiry_valid
from authentik.lib.generators import generate_id
from authentik.sources.oauth.models import OAuthSource
class TestSCIMOAuthAPI(APITestCase):
"""SCIM User tests"""
def setUp(self):
self.source = OAuthSource.objects.create(
name=generate_id(),
slug=generate_id(),
access_token_url="http://localhost/token", # nosec
consumer_key=generate_id(),
consumer_secret=generate_id(),
provider_type="openidconnect",
)
@patch(
"authentik.enterprise.license.LicenseKey.validate",
MagicMock(
return_value=LicenseKey(
aud="",
exp=expiry_valid,
name=generate_id(),
internal_users=100,
external_users=100,
)
),
)
def test_api_create(self):
License.objects.create(key=generate_id())
self.client.force_login(create_test_admin_user())
res = self.client.post(
reverse("authentik_api:scimprovider-list"),
{
"name": generate_id(),
"url": "http://localhost",
"auth_mode": "oauth",
"auth_oauth": str(self.source.pk),
},
)
self.assertEqual(res.status_code, 201)
@patch(
"authentik.enterprise.models.LicenseUsageStatus.is_valid",
PropertyMock(return_value=False),
)
def test_api_create_no_license(self):
self.client.force_login(create_test_admin_user())
res = self.client.post(
reverse("authentik_api:scimprovider-list"),
{
"name": generate_id(),
"url": "http://localhost",
"auth_mode": "oauth",
"auth_oauth": str(self.source.pk),
},
)
self.assertEqual(res.status_code, 400)
self.assertJSONEqual(
res.content, {"auth_mode": ["Enterprise is required to use the OAuth mode."]}
)

View File

@@ -1,100 +0,0 @@
"""SCIM OAuth tests"""
from requests_mock import Mocker
from rest_framework.test import APITestCase
from authentik.blueprints.tests import apply_blueprint
from authentik.core.models import Application, Group, User
from authentik.lib.generators import generate_id
from authentik.providers.scim.models import SCIMAuthenticationMode, SCIMMapping, SCIMProvider
from authentik.sources.oauth.models import OAuthSource
from authentik.tenants.models import Tenant
class TestSCIMOAuthAuth(APITestCase):
"""SCIM User tests"""
@apply_blueprint("system/providers-scim.yaml")
def setUp(self) -> None:
# Delete all users and groups as the mocked HTTP responses only return one ID
# which will cause errors with multiple users
Tenant.objects.update(avatars="none")
User.objects.all().exclude_anonymous().delete()
Group.objects.all().delete()
self.source = OAuthSource.objects.create(
name=generate_id(),
slug=generate_id(),
access_token_url="http://localhost/token", # nosec
consumer_key=generate_id(),
consumer_secret=generate_id(),
provider_type="openidconnect",
)
self.provider = SCIMProvider.objects.create(
name=generate_id(),
url="https://localhost",
auth_mode=SCIMAuthenticationMode.OAUTH_SILENT,
auth_oauth=self.source,
auth_oauth_params={
"foo": "bar",
},
exclude_users_service_account=True,
)
self.app: Application = Application.objects.create(
name=generate_id(),
slug=generate_id(),
)
self.app.backchannel_providers.add(self.provider)
self.provider.property_mappings.add(
SCIMMapping.objects.get(managed="goauthentik.io/providers/scim/user")
)
self.provider.property_mappings_group.add(
SCIMMapping.objects.get(managed="goauthentik.io/providers/scim/group")
)
@Mocker()
def test_user_create(self, mock: Mocker):
"""Test user creation"""
scim_id = generate_id()
token = generate_id()
mock.post("http://localhost/token", json={"access_token": token, "expires_in": 3600})
mock.get(
"https://localhost/ServiceProviderConfig",
json={},
)
mock.post(
"https://localhost/Users",
json={
"id": scim_id,
},
)
uid = generate_id()
user = User.objects.create(
username=uid,
name=f"{uid} {uid}",
email=f"{uid}@goauthentik.io",
)
self.assertEqual(mock.call_count, 3)
self.assertEqual(mock.request_history[1].method, "GET")
self.assertEqual(mock.request_history[2].method, "POST")
self.assertJSONEqual(
mock.request_history[2].body,
{
"schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
"active": True,
"emails": [
{
"primary": True,
"type": "other",
"value": f"{uid}@goauthentik.io",
}
],
"externalId": user.uid,
"name": {
"familyName": uid,
"formatted": f"{uid} {uid}",
"givenName": uid,
},
"displayName": f"{uid} {uid}",
"userName": uid,
},
)

View File

@@ -1,10 +0,0 @@
from django.urls import path
from authentik.enterprise.providers.scim.views import SCIMOAuthStart, SCIMRedirectCallback
urlpatterns = [
path("<slug:application_slug>/oauth2/start/", SCIMOAuthStart.as_view(), name="start"),
path(
"<slug:application_slug>/oauth2/callback/", SCIMRedirectCallback.as_view(), name="callback"
),
]

View File

@@ -1,70 +0,0 @@
from datetime import timedelta
from django.core.exceptions import PermissionDenied
from django.http import HttpRequest
from django.shortcuts import redirect
from django.urls import reverse
from django.utils.timezone import now
from authentik.core.models import Application
from authentik.providers.scim.models import SCIMProvider
from authentik.sources.oauth.clients.base import BaseOAuthClient
from authentik.sources.oauth.models import OAuthSource, UserOAuthSourceConnection
from authentik.sources.oauth.types.registry import RequestKind, registry
from authentik.sources.oauth.views.callback import OAuthCallback
from authentik.sources.oauth.views.redirect import OAuthRedirect
class SCIMOAuthViewMixin:
provider: SCIMProvider
def get_client(self, source: OAuthSource, **kwargs) -> BaseOAuthClient:
source: OAuthSource = self.provider.auth_oauth
source_cls = registry.find(source.provider_type, kind=RequestKind.CALLBACK)
if not source_cls.client_class:
return super().get_client(source, **kwargs)
return source_cls.client_class(source, self.request, **kwargs)
def _get_scim_provider(self, app_slug: str):
app = Application.objects.filter(slug=app_slug).first()
if not app:
return None
provider = SCIMProvider.objects.filter(backchannel_application=app)
return provider.first()
def dispatch(self, request: HttpRequest, application_slug: str):
if not request.user.is_authenticated:
raise PermissionDenied()
provider = self._get_scim_provider(application_slug)
if not provider or not provider.auth_oauth:
raise PermissionDenied()
if not request.user.has_perm(
"authentik_providers_scim.change_scimprovider",
provider,
):
raise PermissionDenied()
self.provider = provider
return super().dispatch(request, source_slug=provider.auth_oauth.slug)
class SCIMOAuthStart(SCIMOAuthViewMixin, OAuthRedirect):
def get_callback_url(self, source: OAuthSource):
return reverse("authentik_enterprise_providers_scim:callback", kwargs=self.kwargs)
class SCIMRedirectCallback(SCIMOAuthViewMixin, OAuthCallback):
def redirect_flow_manager(self, client: BaseOAuthClient):
expires_in = int(self.token.get("expires_in", 0))
UserOAuthSourceConnection.objects.update_or_create(
source=self.provider.auth_oauth,
user=self.provider.auth_oauth_user,
defaults={
"access_token": self.token.get("access_token"),
"refresh_token": self.token.get("refresh_token"),
"expires": now() + timedelta(seconds=expires_in),
},
)
return redirect("authentik_core:if-admin")

View File

@@ -1,7 +1,6 @@
# Generated by Django 5.2.12 on 2026-04-04 16:58
from django.db import migrations, models
import django.contrib.postgres.fields
class Migration(migrations.Migration):
@@ -41,109 +40,4 @@ class Migration(migrations.Migration):
]
),
),
migrations.AlterField(
model_name="stream",
name="events_requested",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.TextField(
choices=[
(
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
"Caep Session Revoked",
),
(
"https://schemas.openid.net/secevent/caep/event-type/token-claims-change",
"Caep Token Claims Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"Caep Credential Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/assurance-level-change",
"Caep Assurance Level Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/device-compliance-change",
"Caep Device Compliance Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-established",
"Caep Session Established",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-presented",
"Caep Session Presented",
),
(
"https://schemas.openid.net/secevent/caep/event-type/risk-level-change",
"Caep Risk Level Change",
),
(
"https://schemas.openid.net/secevent/ssf/event-type/verification",
"Set Verification",
),
]
),
default=list,
size=None,
),
),
migrations.AlterField(
model_name="stream",
name="status",
field=models.TextField(
choices=[
("enabled", "Enabled"),
("paused", "Paused"),
("disabled", "Disabled"),
("disabled_deleted", "Disabled Deleted"),
],
default="enabled",
),
),
migrations.AlterField(
model_name="streamevent",
name="type",
field=models.TextField(
choices=[
(
"https://schemas.openid.net/secevent/caep/event-type/session-revoked",
"Caep Session Revoked",
),
(
"https://schemas.openid.net/secevent/caep/event-type/token-claims-change",
"Caep Token Claims Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/credential-change",
"Caep Credential Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/assurance-level-change",
"Caep Assurance Level Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/device-compliance-change",
"Caep Device Compliance Change",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-established",
"Caep Session Established",
),
(
"https://schemas.openid.net/secevent/caep/event-type/session-presented",
"Caep Session Presented",
),
(
"https://schemas.openid.net/secevent/caep/event-type/risk-level-change",
"Caep Risk Level Change",
),
(
"https://schemas.openid.net/secevent/ssf/event-type/verification",
"Set Verification",
),
]
),
),
]

View File

@@ -24,31 +24,8 @@ class EventTypes(models.TextChoices):
"""SSF Event types supported by authentik"""
CAEP_SESSION_REVOKED = "https://schemas.openid.net/secevent/caep/event-type/session-revoked"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.1"""
CAEP_TOKEN_CLAIMS_CHANGE = (
"https://schemas.openid.net/secevent/caep/event-type/token-claims-change"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.2"""
CAEP_CREDENTIAL_CHANGE = "https://schemas.openid.net/secevent/caep/event-type/credential-change"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.3"""
CAEP_ASSURANCE_LEVEL_CHANGE = (
"https://schemas.openid.net/secevent/caep/event-type/assurance-level-change"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.4"""
CAEP_DEVICE_COMPLIANCE_CHANGE = (
"https://schemas.openid.net/secevent/caep/event-type/device-compliance-change"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.5"""
CAEP_SESSION_ESTABLISHED = (
"https://schemas.openid.net/secevent/caep/event-type/session-established"
)
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.6"""
CAEP_SESSION_PRESENTED = "https://schemas.openid.net/secevent/caep/event-type/session-presented"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.7"""
CAEP_RISK_LEVEL_CHANGE = "https://schemas.openid.net/secevent/caep/event-type/risk-level-change"
"""https://openid.net/specs/openid-caep-1_0-final.html#section-3.8"""
SET_VERIFICATION = "https://schemas.openid.net/secevent/ssf/event-type/verification"
"""https://openid.net/specs/openid-sharedsignals-framework-1_0.html#section-8.1.4.1"""
class DeliveryMethods(models.TextChoices):
@@ -69,12 +46,10 @@ class SSFEventStatus(models.TextChoices):
class StreamStatus(models.TextChoices):
"""SSF Stream status"""
ENABLED = "enabled"
PAUSED = "paused"
DISABLED = "disabled"
DISABLED_DELETED = "disabled_deleted"
class SSFProvider(TasksModel, BackchannelProvider):
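The `EventTypes` choices above pair each CAEP/SSF event-type URI with a label, and the views diff later replaces a hand-maintained `events_supported` list with `[x.value for x in EventTypes]`. A plain-`enum` sketch of that enumeration trick, using a small subset of the URIs (Django's `models.TextChoices` behaves analogously):

```python
from enum import Enum

class EventTypes(str, Enum):
    """Illustrative subset of the SSF/CAEP event-type URIs above."""
    CAEP_SESSION_REVOKED = (
        "https://schemas.openid.net/secevent/caep/event-type/session-revoked"
    )
    CAEP_CREDENTIAL_CHANGE = (
        "https://schemas.openid.net/secevent/caep/event-type/credential-change"
    )
    SET_VERIFICATION = "https://schemas.openid.net/secevent/ssf/event-type/verification"

# Enumerating the enum keeps the "supported events" list in lockstep with
# the model choices, instead of duplicating URIs in the serializer.
supported = [x.value for x in EventTypes]
```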

View File

@@ -108,13 +108,13 @@ def send_ssf_event(stream_uuid: UUID, event_data: dict[str, Any]):
event.save()
self.info("Event successfully sent", status=response.status_code)
# Cleanup, if we were the last pending message for this stream and it has been deleted
# (status=StreamStatus.DISABLED_DELETED), then we can delete the stream
# (status=StreamStatus.DISABLED), then we can delete the stream
if (
not StreamEvent.objects.filter(
stream=stream,
status__in=[SSFEventStatus.PENDING_FAILED, SSFEventStatus.PENDING_NEW],
).exists()
and stream.status == StreamStatus.DISABLED_DELETED
and stream.status == StreamStatus.DISABLED
):
LOGGER.info(
"Deleting inactive stream as all pending messages were sent.", stream=stream

View File

@@ -62,7 +62,7 @@ class TestSSFAuth(APITestCase):
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {}},
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_stream_add_oidc(self):
@@ -115,7 +115,7 @@ class TestSSFAuth(APITestCase):
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {}},
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_token_invalid(self):

View File

@@ -54,7 +54,7 @@ class TestStream(APITestCase):
self.assertEqual(event.status, SSFEventStatus.PENDING_FAILED)
self.assertEqual(
event.payload["events"],
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {}},
{"https://schemas.openid.net/secevent/ssf/event-type/verification": {"state": None}},
)
def test_stream_add_poll(self):
@@ -96,7 +96,7 @@ class TestStream(APITestCase):
)
self.assertEqual(res.status_code, 204)
stream.refresh_from_db()
self.assertEqual(stream.status, StreamStatus.DISABLED_DELETED)
self.assertEqual(stream.status, StreamStatus.DISABLED)
def test_stream_get(self):
"""get stream"""
@@ -225,26 +225,3 @@ class TestStream(APITestCase):
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 404)
def test_stream_status_update(self):
stream = Stream.objects.create(provider=self.provider)
res = self.client.post(
reverse(
"authentik_providers_ssf:stream-status",
kwargs={"application_slug": self.application.slug},
),
data={
"stream_id": str(stream.pk),
"status": StreamStatus.DISABLED,
},
HTTP_AUTHORIZATION=f"Bearer {self.provider.token.key}",
)
self.assertEqual(res.status_code, 200)
stream.refresh_from_db()
self.assertJSONEqual(
res.content,
{
"stream_id": str(stream.pk),
"status": str(stream.status),
},
)

View File

@@ -33,7 +33,7 @@ class TestTasks(APITestCase):
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:
@@ -46,7 +46,7 @@ class TestTasks(APITestCase):
)
jwt = decode_complete(mocker.request_history[0].body, options={"verify_signature": False})
self.assertEqual(jwt["header"]["typ"], "secevent+jwt")
self.assertEqual(jwt["payload"]["events"][EventTypes.SET_VERIFICATION], {})
self.assertIsNone(jwt["payload"]["events"][EventTypes.SET_VERIFICATION]["state"])
def test_push_auth(self):
auth = generate_id()
@@ -58,7 +58,7 @@ class TestTasks(APITestCase):
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:
@@ -72,7 +72,7 @@ class TestTasks(APITestCase):
)
jwt = decode_complete(mocker.request_history[0].body, options={"verify_signature": False})
self.assertEqual(jwt["header"]["typ"], "secevent+jwt")
self.assertEqual(jwt["payload"]["events"][EventTypes.SET_VERIFICATION], {})
self.assertIsNone(jwt["payload"]["events"][EventTypes.SET_VERIFICATION]["state"])
def test_push_stream_disable(self):
auth = generate_id()
@@ -81,11 +81,11 @@ class TestTasks(APITestCase):
delivery_method=DeliveryMethods.RFC_PUSH,
endpoint_url="http://localhost/ssf-push",
authorization_header=auth,
status=StreamStatus.DISABLED_DELETED,
status=StreamStatus.DISABLED,
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:
@@ -95,7 +95,7 @@ class TestTasks(APITestCase):
).get_result(block=True, timeout=1)
jwt = decode_complete(mocker.request_history[0].body, options={"verify_signature": False})
self.assertEqual(jwt["header"]["typ"], "secevent+jwt")
self.assertEqual(jwt["payload"]["events"][EventTypes.SET_VERIFICATION], {})
self.assertIsNone(jwt["payload"]["events"][EventTypes.SET_VERIFICATION]["state"])
self.assertFalse(Stream.objects.filter(pk=stream.pk).exists())
def test_push_error(self):
@@ -106,7 +106,7 @@ class TestTasks(APITestCase):
)
event_data = stream.prepare_event_payload(
EventTypes.SET_VERIFICATION,
{},
{"state": None},
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
with Mocker() as mocker:

View File

@@ -24,10 +24,10 @@ class SSFView(APIView):
class SSFStreamView(SSFView):
def get_object(self) -> Stream:
streams = Stream.objects.filter(provider=self.provider).exclude(
status=StreamStatus.DISABLED_DELETED
)
def get_object(self, any_status=False) -> Stream:
streams = Stream.objects.filter(provider=self.provider)
if not any_status:
streams = streams.filter(status__in=[StreamStatus.ENABLED, StreamStatus.PAUSED])
if "stream_id" in self.request.query_params:
streams = streams.filter(pk=self.request.query_params["stream_id"])
if "stream_id" in self.request.data:

View File

@@ -1,6 +1,6 @@
from uuid import uuid4
from django.http import Http404, HttpRequest
from django.http import HttpRequest
from django.urls import reverse
from rest_framework.exceptions import PermissionDenied, ValidationError
from rest_framework.fields import CharField, ChoiceField, ListField, SerializerMethodField
@@ -106,11 +106,7 @@ class StreamResponseSerializer(PassiveSerializer):
}
def get_events_supported(self, instance: Stream) -> list[str]:
return [
EventTypes.CAEP_SESSION_REVOKED,
EventTypes.CAEP_CREDENTIAL_CHANGE,
EventTypes.SET_VERIFICATION,
]
return [x.value for x in EventTypes]
class StreamView(SSFStreamView):
@@ -132,9 +128,10 @@ class StreamView(SSFStreamView):
LOGGER.info("Sending verification event", stream=instance)
send_ssf_events(
EventTypes.SET_VERIFICATION,
{},
{
"state": None,
},
stream_filter={"pk": instance.uuid},
request=request,
sub_id={"format": "opaque", "id": str(instance.uuid)},
)
response = StreamResponseSerializer(instance=instance, context={"request": request}).data
@@ -162,9 +159,7 @@ class StreamView(SSFStreamView):
def delete(self, request: Request, *args, **kwargs) -> Response:
stream = self.get_object()
if stream.status == StreamStatus.DISABLED_DELETED:
raise Http404
stream.status = StreamStatus.DISABLED_DELETED
stream.status = StreamStatus.DISABLED
stream.save()
return Response(status=204)
@@ -180,7 +175,6 @@ class StreamVerifyView(SSFStreamView):
"state": state,
},
stream_filter={"pk": stream.uuid},
request=request,
sub_id={"format": "opaque", "id": str(stream.uuid)},
)
return Response(status=204)
@@ -188,25 +182,8 @@ class StreamVerifyView(SSFStreamView):
class StreamStatusView(SSFStreamView):
class StreamStatusSerializer(PassiveSerializer):
stream_id = CharField()
status = ChoiceField(choices=StreamStatus.choices)
def get(self, request: Request, *args, **kwargs):
stream = self.get_object()
return Response(
{
"stream_id": str(stream.pk),
"status": str(stream.status),
}
)
def post(self, request: Request, *args, **kwargs):
stream = self.get_object()
serializer = self.StreamStatusSerializer(stream, data=request.data)
serializer.is_valid(raise_exception=True)
stream.status = serializer.validated_data["status"]
stream.save()
stream = self.get_object(any_status=True)
return Response(
{
"stream_id": str(stream.pk),

View File

@@ -1,5 +1,4 @@
from dataclasses import dataclass
from urllib.parse import urlparse
from django.http import HttpRequest
from django.shortcuts import get_object_or_404
@@ -56,9 +55,7 @@ class SignInRequest:
_, provider = req.get_app_provider()
if not req.wreply:
req.wreply = provider.acs_url
reply = urlparse(req.wreply)
configured = urlparse(provider.acs_url)
if not (reply[:2] == configured[:2] and reply.path.startswith(configured.path)):
if not req.wreply.startswith(provider.acs_url):
raise ValueError("Invalid wreply")
return req
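The two `wreply` checks in this hunk differ in a security-relevant way: a plain `startswith` on the raw URL accepts lookalike hosts, while comparing the parsed scheme, host, and path does not. A minimal runnable sketch, using an ACS URL borrowed from the test fixtures below:

```python
# Sketch: why a prefix check on the raw wreply string is weaker than
# comparing parsed URL components. ACS URL assumed from the tests.
from urllib.parse import urlparse

acs_url = "https://t.goauthentik.io"

def is_valid_startswith(wreply: str) -> bool:
    # Naive check: raw string prefix.
    return wreply.startswith(acs_url)

def is_valid_parsed(wreply: str) -> bool:
    # Stricter check: scheme and netloc must match exactly,
    # and only the path may extend the configured one.
    reply = urlparse(wreply)
    configured = urlparse(acs_url)
    return reply[:2] == configured[:2] and reply.path.startswith(configured.path)

# A lookalike host passes the naive prefix check...
evil = "https://t.goauthentik.io.invalid.com/cb"
assert is_valid_startswith(evil)
# ...but fails once scheme and host are compared separately.
assert not is_valid_parsed(evil)
# A genuine sub-path on the configured host passes both.
assert is_valid_parsed("https://t.goauthentik.io/foo")
```

This is the same distinction the `test_wreply` case below exercises with `t.goauthentik.io.invalid.com`.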

View File

@@ -1,5 +1,4 @@
from dataclasses import dataclass
from urllib.parse import urlparse
from django.http import HttpRequest
from django.shortcuts import get_object_or_404
@@ -33,9 +32,7 @@ class SignOutRequest:
_, provider = req.get_app_provider()
if not req.wreply:
req.wreply = provider.acs_url
reply = urlparse(req.wreply)
configured = urlparse(provider.acs_url)
if not (reply[:2] == configured[:2] and reply.path.startswith(configured.path)):
if not req.wreply.startswith(provider.acs_url):
raise ValueError("Invalid wreply")
return req

View File

@@ -27,27 +27,12 @@ class TestWSFedSignIn(TestCase):
name=generate_id(),
authorization_flow=self.flow,
signing_kp=self.cert,
acs_url="https://t.goauthentik.io",
audience="foo",
)
self.app = Application.objects.create(
name=generate_id(), slug=generate_id(), provider=self.provider
)
self.factory = RequestFactory()
def test_wreply(self):
request = self.factory.get(
"/?wreply=https://t.goauthentik.io/foo&wa=wsignin1.0&wtrealm=foo",
user=get_anonymous_user(),
)
SignInRequest.parse(request)
with self.assertRaises(ValueError):
request = self.factory.get(
"/?wreply=https://t.goauthentik.io.invalid.com&wa=wsignin1.0&wtrealm=foo",
user=get_anonymous_user(),
)
SignInRequest.parse(request)
def test_token_gen(self):
request = self.factory.get("/", user=get_anonymous_user())
proc = SignInProcessor(

View File

@@ -11,9 +11,7 @@ from authentik.events.models import NotificationRule
class NotificationRuleSerializer(ModelSerializer):
"""NotificationRule Serializer"""
destination_group_obj = GroupSerializer(
read_only=True, source="destination_group", required=False, allow_null=True
)
destination_group_obj = GroupSerializer(read_only=True, source="destination_group")
class Meta:
model = NotificationRule

View File

@@ -8,6 +8,7 @@ from inspect import currentframe
from typing import Any
from uuid import uuid4
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.apps import apps
from django.db import models
@@ -409,7 +410,7 @@ class NotificationTransport(TasksModel, SerializerModel):
)
notification.save()
layer = get_channel_layer()
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_user_group(notification.user),
{
"type": "event.notification",

View File

@@ -29,7 +29,6 @@ class RefreshOtherFlowsAfterAuthentication(Flag[bool], key="flows_refresh_others
default = False
visibility = "public"
description = _("Refresh other tabs after successful authentication.")
deprecated = True
class ContinuousLogin(Flag[bool], key="flows_continuous_login"):

View File

@@ -9,10 +9,10 @@ from rest_framework.fields import CharField, ListField, SerializerMethodField
from rest_framework.filters import OrderingFilter, SearchFilter
from rest_framework.viewsets import GenericViewSet
from authentik.core.api.providers import ProviderSerializer
from authentik.core.api.used_by import UsedByMixin
from authentik.core.api.users import UserSerializer
from authentik.core.api.utils import MetaNameSerializer, ModelSerializer
from authentik.providers.oauth2.api.providers import OAuth2ProviderSerializer
from authentik.providers.oauth2.models import AccessToken, AuthorizationCode, RefreshToken
@@ -20,7 +20,7 @@ class ExpiringBaseGrantModelSerializer(ModelSerializer, MetaNameSerializer):
"""Serializer for BaseGrantModel and ExpiringBaseGrant"""
user = UserSerializer()
provider = ProviderSerializer()
provider = OAuth2ProviderSerializer()
scope = ListField(child=CharField())
class Meta:

View File

@@ -53,16 +53,6 @@ class TestEndSessionView(OAuthTestCase):
self.brand.flow_invalidation = self.invalidation_flow
self.brand.save()
def _id_token_hint(self, host: str) -> str:
"""Issue a valid id_token_hint for the test provider under the given host."""
return self.provider.encode(
{
"iss": f"http://{host}/application/o/{self.app.slug}/",
"aud": self.provider.client_id,
"sub": str(self.user.pk),
}
)
def test_post_logout_redirect_uri_strict_match(self):
"""Test strict URI matching redirects to flow"""
self.client.force_login(self.user)
@@ -71,10 +61,7 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": "http://testserver/logout",
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": "http://testserver/logout"},
HTTP_HOST=self.brand.domain,
)
# Should redirect to the invalidation flow
@@ -82,12 +69,7 @@ class TestEndSessionView(OAuthTestCase):
self.assertIn(self.invalidation_flow.slug, response.url)
def test_post_logout_redirect_uri_strict_no_match(self):
"""Test strict URI not matching returns an error and does not start logout flow.
Required by OIDC RP-Initiated Logout 1.0: on an unregistered
post_logout_redirect_uri, the OP MUST NOT redirect and MUST NOT proceed with
logout that targets the RP.
"""
"""Test strict URI not matching still proceeds with flow (no redirect URI in context)"""
self.client.force_login(self.user)
invalid_uri = "http://testserver/other"
response = self.client.get(
@@ -95,14 +77,12 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": invalid_uri,
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": invalid_uri},
HTTP_HOST=self.brand.domain,
)
self.assertEqual(response.status_code, 400)
self.assertNotIn(invalid_uri, response.content.decode())
# Should still redirect to flow, but invalid URI should not be in response
self.assertEqual(response.status_code, 302)
self.assertNotIn(invalid_uri, response.url)
def test_post_logout_redirect_uri_regex_match(self):
"""Test regex URI matching redirects to flow"""
@@ -112,10 +92,7 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": "https://app.example.com/logout",
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": "https://app.example.com/logout"},
HTTP_HOST=self.brand.domain,
)
# Should redirect to the invalidation flow
@@ -123,7 +100,7 @@ class TestEndSessionView(OAuthTestCase):
self.assertIn(self.invalidation_flow.slug, response.url)
def test_post_logout_redirect_uri_regex_no_match(self):
"""Test regex URI not matching returns an error and does not start logout flow."""
"""Test regex URI not matching"""
self.client.force_login(self.user)
invalid_uri = "https://malicious.com/logout"
response = self.client.get(
@@ -131,14 +108,12 @@ class TestEndSessionView(OAuthTestCase):
"authentik_providers_oauth2:end-session",
kwargs={"application_slug": self.app.slug},
),
{
"post_logout_redirect_uri": invalid_uri,
"id_token_hint": self._id_token_hint(self.brand.domain),
},
{"post_logout_redirect_uri": invalid_uri},
HTTP_HOST=self.brand.domain,
)
self.assertEqual(response.status_code, 400)
self.assertNotIn(invalid_uri, response.content.decode())
# Should still proceed to flow, but invalid URI should not be in response
self.assertEqual(response.status_code, 302)
self.assertNotIn(invalid_uri, response.url)
def test_state_parameter_appended_to_uri(self):
"""Test state parameter is appended to validated redirect URI"""
@@ -148,7 +123,6 @@ class TestEndSessionView(OAuthTestCase):
{
"post_logout_redirect_uri": "http://testserver/logout",
"state": "test-state-123",
"id_token_hint": self._id_token_hint("testserver"),
},
)
request.user = self.user
@@ -158,7 +132,6 @@ class TestEndSessionView(OAuthTestCase):
view.request = request
view.kwargs = {"application_slug": self.app.slug}
view.resolve_provider_application()
view.validate()
self.assertIn("state=test-state-123", view.post_logout_redirect_uri)
@@ -173,7 +146,6 @@ class TestEndSessionView(OAuthTestCase):
{
"post_logout_redirect_uri": "http://testserver/logout",
"state": "xyz789",
"id_token_hint": self._id_token_hint(self.brand.domain),
},
HTTP_HOST=self.brand.domain,
)

View File

@@ -5,8 +5,6 @@ from urllib.parse import quote, urlparse
from django.http import Http404, HttpRequest, HttpResponse
from django.shortcuts import get_object_or_404
from jwt import PyJWTError
from jwt import decode as jwt_decode
from authentik.common.oauth.constants import (
FORBIDDEN_URI_SCHEMES,
@@ -23,14 +21,11 @@ from authentik.flows.planner import (
from authentik.flows.stage import SessionEndStage
from authentik.flows.views.executor import SESSION_KEY_PLAN
from authentik.lib.views import bad_request_message
from authentik.policies.views import PolicyAccessView
from authentik.policies.views import PolicyAccessView, RequestValidationError
from authentik.providers.iframe_logout import IframeLogoutStageView
from authentik.providers.oauth2.errors import TokenError
from authentik.providers.oauth2.models import (
AccessToken,
JWTAlgorithms,
OAuth2LogoutMethod,
OAuth2Provider,
RedirectURIMatchingMode,
)
from authentik.providers.oauth2.tasks import send_backchannel_logout_request
@@ -52,45 +47,21 @@ class EndSessionView(PolicyAccessView):
if not self.flow:
raise Http404
def validate(self):
# Parse end session parameters
query_dict = self.request.POST if self.request.method == "POST" else self.request.GET
state = query_dict.get("state")
request_redirect_uri = query_dict.get("post_logout_redirect_uri")
id_token_hint = query_dict.get("id_token_hint")
self.post_logout_redirect_uri = None
# OIDC Certification: Verify id_token_hint. If invalid or missing, throw an error
if id_token_hint:
# Load a fresh provider instance that's not part of the flow
# since it'll have the cryptography Certificate that can't be pickled
provider = OAuth2Provider.objects.get(pk=self.provider.pk)
key, alg = provider.jwt_key
if alg != JWTAlgorithms.HS256:
key = provider.signing_key.public_key
try:
jwt_decode(
id_token_hint,
key,
algorithms=[alg],
audience=provider.client_id,
issuer=provider.get_issuer(self.request),
# ID Tokens are short-lived; a logout request arriving
# after expiry is still legitimate and must succeed.
options={"verify_exp": False},
)
except PyJWTError:
raise TokenError("invalid_request").with_cause(
"id_token_hint_decode_failed"
) from None
# Validate post_logout_redirect_uri against registered URIs
if request_redirect_uri:
# OIDC Certification: id_token_hint required with post_logout_redirect_uri
if not id_token_hint:
raise TokenError("invalid_request").with_cause("id_token_hint_missing")
if urlparse(request_redirect_uri).scheme in FORBIDDEN_URI_SCHEMES:
raise TokenError("invalid_request").with_cause("post_logout_redirect_uri")
raise RequestValidationError(
bad_request_message(
self.request,
"Forbidden URI scheme in post_logout_redirect_uri",
)
)
for allowed in self.provider.post_logout_redirect_uris:
if allowed.matching_mode == RedirectURIMatchingMode.STRICT:
if request_redirect_uri == allowed.url:
@@ -100,10 +71,6 @@ class EndSessionView(PolicyAccessView):
if fullmatch(allowed.url, request_redirect_uri):
self.post_logout_redirect_uri = request_redirect_uri
break
# OIDC Certification: OP MUST NOT perform post-logout redirection
# if the supplied URI does not exactly match a registered one
if self.post_logout_redirect_uri is None:
raise TokenError("invalid_request").with_cause("invalid_post_logout_redirect_uri")
# Append state to the redirect URI if both are present
if self.post_logout_redirect_uri and state:
@@ -124,43 +91,50 @@ class EndSessionView(PolicyAccessView):
"<html><body>Logout successful</body></html>", content_type="text/html", status=200
)
# Otherwise, continue with normal policy checks
return super().dispatch(request, *args, **kwargs)
def get(self, request: HttpRequest, *args, **kwargs) -> HttpResponse:
"""Dispatch the flow planner for the invalidation flow"""
try:
self.validate()
except TokenError as exc:
return bad_request_message(
self.request,
exc.description,
)
planner = FlowPlanner(self.flow)
planner.allow_empty_flows = True
# Build flow context with logout parameters
context = {
PLAN_CONTEXT_APPLICATION: self.application,
}
# Get session info for logout notifications and token invalidation
auth_session = AuthenticatedSession.from_request(request, request.user)
# Add validated redirect URI (with state appended) to context if available
if self.post_logout_redirect_uri:
context[PLAN_CONTEXT_POST_LOGOUT_REDIRECT_URI] = self.post_logout_redirect_uri
# Invalidate tokens for this provider/session (RP-initiated logout:
# user stays logged into authentik, only this provider's tokens are revoked)
if request.user.is_authenticated and auth_session:
AccessToken.objects.filter(
user=request.user,
provider=self.provider,
session=auth_session,
).delete()
session_key = (
auth_session.session.session_key if auth_session and auth_session.session else None
)
# Handle frontchannel logout
frontchannel_logout_url = None
if self.provider.logout_method == OAuth2LogoutMethod.FRONTCHANNEL:
frontchannel_logout_url = build_frontchannel_logout_url(
self.provider, request, session_key
)
# Handle backchannel logout
if (
self.provider.logout_method == OAuth2LogoutMethod.BACKCHANNEL
and self.provider.logout_uri
):
# Find access token to get iss and sub for the logout token
access_token = AccessToken.objects.filter(
user=request.user,
provider=self.provider,
@@ -189,16 +163,9 @@ class EndSessionView(PolicyAccessView):
}
]
access_tokens = AccessToken.objects.filter(
user=request.user,
provider=self.provider,
)
if auth_session:
access_tokens = access_tokens.filter(session=auth_session)
access_tokens.delete()
plan = planner.plan(request, context)
# Inject iframe logout stage if frontchannel logout is configured
if frontchannel_logout_url:
plan.insert_stage(in_memory_stage(IframeLogoutStageView))
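The `id_token_hint` check earlier in this view delegates to PyJWT's `jwt_decode` with the audience and issuer pinned and `verify_exp` disabled. For intuition only, here is a from-scratch HS256 encode/verify sketch using the stdlib; it is not the code the view runs, and real deployments should keep using PyJWT:

```python
# Stdlib-only HS256 JWT sketch: sign, then verify signature, aud, and iss.
# Illustrative only; the view uses PyJWT with the provider's key.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def encode_hs256(claims: dict, key: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_hs256(token: str, key: bytes, audience: str, issuer: str) -> dict:
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    pad = "=" * (-len(payload) % 4)
    claims = json.loads(base64.urlsafe_b64decode(payload + pad))
    if claims.get("aud") != audience or claims.get("iss") != issuer:
        raise ValueError("invalid aud/iss")
    return claims

key = b"client-secret"
token = encode_hs256({"iss": "https://op.example", "aud": "client-id", "sub": "1"}, key)
claims = verify_hs256(token, key, audience="client-id", issuer="https://op.example")
assert claims["sub"] == "1"
```

Note the production path deliberately leaves `exp` unverified (`options={"verify_exp": False}`), since a logout request may legitimately arrive after the ID token expires.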

View File

@@ -1,5 +1,6 @@
"""RAC Signals"""
from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.core.cache import cache
from django.db.models.signals import post_delete, post_save, pre_delete
@@ -17,7 +18,7 @@ from authentik.providers.rac.models import ConnectionToken, Endpoint
@receiver(pre_delete, sender=AuthenticatedSession)
def user_session_deleted(sender, instance: AuthenticatedSession, **_):
layer = get_channel_layer()
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_rac_client_group_session(instance.session.session_key),
{"type": "event.disconnect", "reason": "session_logout"},
)
@@ -27,7 +28,7 @@ def user_session_deleted(sender, instance: AuthenticatedSession, **_):
def pre_delete_connection_token_disconnect(sender, instance: ConnectionToken, **_):
"""Disconnect session when connection token is deleted"""
layer = get_channel_layer()
layer.group_send_blocking(
async_to_sync(layer.group_send)(
build_rac_client_group_token(instance.token),
{"type": "event.disconnect", "reason": "token_delete"},
)

View File

@@ -61,11 +61,6 @@ class SAMLProviderSerializer(ProviderSerializer):
url_download_metadata = SerializerMethodField()
url_issuer = SerializerMethodField()
# Unified SAML endpoint (primary)
url_unified = SerializerMethodField()
url_unified_init = SerializerMethodField()
# Legacy endpoints (for backward compatibility)
url_sso_post = SerializerMethodField()
url_sso_redirect = SerializerMethodField()
url_sso_init = SerializerMethodField()
@@ -102,21 +97,6 @@ class SAMLProviderSerializer(ProviderSerializer):
if "request" not in self._context:
return DEFAULT_ISSUER
request: HttpRequest = self._context["request"]._request
try:
return request.build_absolute_uri(
reverse(
"authentik_providers_saml:metadata-download",
kwargs={"application_slug": instance.application.slug},
)
)
except Provider.application.RelatedObjectDoesNotExist:
return DEFAULT_ISSUER
def get_url_unified(self, instance: SAMLProvider) -> str:
"""Get unified SAML endpoint URL (handles SSO and SLO)"""
if "request" not in self._context:
return ""
request: HttpRequest = self._context["request"]._request
try:
return request.build_absolute_uri(
reverse(
@@ -125,22 +105,7 @@ class SAMLProviderSerializer(ProviderSerializer):
)
)
except Provider.application.RelatedObjectDoesNotExist:
return "-"
def get_url_unified_init(self, instance: SAMLProvider) -> str:
"""Get IdP-initiated SAML URL"""
if "request" not in self._context:
return ""
request: HttpRequest = self._context["request"]._request
try:
return request.build_absolute_uri(
reverse(
"authentik_providers_saml:init",
kwargs={"application_slug": instance.application.slug},
)
)
except Provider.application.RelatedObjectDoesNotExist:
return "-"
return DEFAULT_ISSUER
def get_url_sso_post(self, instance: SAMLProvider) -> str:
"""Get SSO Post URL"""
@@ -278,8 +243,6 @@ class SAMLProviderSerializer(ProviderSerializer):
"default_name_id_policy",
"url_download_metadata",
"url_issuer",
"url_unified",
"url_unified_init",
"url_sso_post",
"url_sso_redirect",
"url_sso_init",

View File

@@ -241,7 +241,7 @@ class SAMLProvider(Provider):
"""Use IDP-Initiated SAML flow as launch URL"""
try:
return reverse(
"authentik_providers_saml:init",
"authentik_providers_saml:sso-init",
kwargs={"application_slug": self.application.slug},
)
except Provider.application.RelatedObjectDoesNotExist:

View File

@@ -147,7 +147,7 @@ class AssertionProcessor:
return self.http_request.build_absolute_uri(
reverse(
"authentik_providers_saml:metadata-download",
"authentik_providers_saml:base",
kwargs={"application_slug": self.provider.application.slug},
)
)

View File

@@ -48,7 +48,7 @@ class MetadataProcessor:
return self.http_request.build_absolute_uri(
reverse(
"authentik_providers_saml:metadata-download",
"authentik_providers_saml:base",
kwargs={"application_slug": self.provider.application.slug},
)
)
@@ -81,35 +81,54 @@ class MetadataProcessor:
element.text = name_id_format
yield element
def _get_unified_url(self) -> str:
"""Get the unified SAML endpoint URL"""
return self.http_request.build_absolute_uri(
reverse(
"authentik_providers_saml:base",
kwargs={"application_slug": self.provider.application.slug},
)
)
def get_sso_bindings(self) -> Iterator[Element]:
"""Get all SSO Bindings - both point to unified endpoint"""
unified_url = self._get_unified_url()
for binding in [SAML_BINDING_REDIRECT, SAML_BINDING_POST]:
"""Get all Bindings supported"""
binding_url_map = {
(SAML_BINDING_REDIRECT, "SingleSignOnService"): self.http_request.build_absolute_uri(
reverse(
"authentik_providers_saml:sso-redirect",
kwargs={"application_slug": self.provider.application.slug},
)
),
(SAML_BINDING_POST, "SingleSignOnService"): self.http_request.build_absolute_uri(
reverse(
"authentik_providers_saml:sso-post",
kwargs={"application_slug": self.provider.application.slug},
)
),
}
for binding_svc, url in binding_url_map.items():
binding, svc = binding_svc
if self.force_binding and self.force_binding != binding:
continue
element = Element(f"{{{NS_SAML_METADATA}}}SingleSignOnService")
element = Element(f"{{{NS_SAML_METADATA}}}{svc}")
element.attrib["Binding"] = binding
element.attrib["Location"] = unified_url
element.attrib["Location"] = url
yield element
def get_slo_bindings(self) -> Iterator[Element]:
"""Get all SLO Bindings - both point to unified endpoint"""
unified_url = self._get_unified_url()
for binding in [SAML_BINDING_REDIRECT, SAML_BINDING_POST]:
"""Get all Bindings supported"""
binding_url_map = {
(SAML_BINDING_REDIRECT, "SingleLogoutService"): self.http_request.build_absolute_uri(
reverse(
"authentik_providers_saml:slo-redirect",
kwargs={"application_slug": self.provider.application.slug},
)
),
(SAML_BINDING_POST, "SingleLogoutService"): self.http_request.build_absolute_uri(
reverse(
"authentik_providers_saml:slo-post",
kwargs={"application_slug": self.provider.application.slug},
)
),
}
for binding_svc, url in binding_url_map.items():
binding, svc = binding_svc
if self.force_binding and self.force_binding != binding:
continue
element = Element(f"{{{NS_SAML_METADATA}}}SingleLogoutService")
element = Element(f"{{{NS_SAML_METADATA}}}{svc}")
element.attrib["Binding"] = binding
element.attrib["Location"] = unified_url
element.attrib["Location"] = url
yield element
def _prepare_signature(self, entity_descriptor: _Element):

View File

@@ -4,26 +4,19 @@ from django.urls import path
from authentik.providers.saml.api.property_mappings import SAMLPropertyMappingViewSet
from authentik.providers.saml.api.providers import SAMLProviderViewSet
from authentik.providers.saml.views import metadata, sso, unified
from authentik.providers.saml.views import metadata, sso
from authentik.providers.saml.views.sp_slo import (
SPInitiatedSLOBindingPOSTView,
SPInitiatedSLOBindingRedirectView,
)
urlpatterns = [
# Unified Endpoint - handles SSO and SLO based on message type
# Base path for Issuer/Entity ID
path(
"<slug:application_slug>/",
unified.SAMLUnifiedView.as_view(),
sso.SAMLSSOBindingRedirectView.as_view(),
name="base",
),
# IdP-initiated
path(
"<slug:application_slug>/init/",
sso.SAMLSSOBindingInitView.as_view(),
name="init",
),
# LEGACY Endpoints (backward compatibility)
# SSO Bindings
path(
"<slug:application_slug>/sso/binding/redirect/",

View File

@@ -1,118 +0,0 @@
"""Unified SAML endpoint - handles SSO and SLO based on message type"""
from base64 import b64decode
from defusedxml.lxml import fromstring
from django.http import HttpRequest, HttpResponse
from django.utils.decorators import method_decorator
from django.views import View
from django.views.decorators.clickjacking import xframe_options_sameorigin
from django.views.decorators.csrf import csrf_exempt
from structlog.stdlib import get_logger
from authentik.common.saml.constants import NS_MAP
from authentik.flows.views.executor import SESSION_KEY_POST
from authentik.lib.views import bad_request_message
from authentik.providers.saml.utils.encoding import decode_base64_and_inflate
from authentik.providers.saml.views.flows import (
REQUEST_KEY_SAML_REQUEST,
REQUEST_KEY_SAML_RESPONSE,
)
from authentik.providers.saml.views.sp_slo import (
SPInitiatedSLOBindingPOSTView,
SPInitiatedSLOBindingRedirectView,
)
from authentik.providers.saml.views.sso import (
SAMLSSOBindingPOSTView,
SAMLSSOBindingRedirectView,
)
LOGGER = get_logger()
# SAML message type constants
SAML_MESSAGE_TYPE_AUTHN_REQUEST = "AuthnRequest"
SAML_MESSAGE_TYPE_LOGOUT_REQUEST = "LogoutRequest"
def detect_saml_message_type(saml_request: str, is_post_binding: bool) -> str | None:
"""Parse SAML request to determine if AuthnRequest or LogoutRequest."""
try:
if is_post_binding:
decoded_xml = b64decode(saml_request.encode())
else:
decoded_xml = decode_base64_and_inflate(saml_request)
root = fromstring(decoded_xml)
if len(root.xpath("//samlp:AuthnRequest", namespaces=NS_MAP)):
return SAML_MESSAGE_TYPE_AUTHN_REQUEST
if len(root.xpath("//samlp:LogoutRequest", namespaces=NS_MAP)):
return SAML_MESSAGE_TYPE_LOGOUT_REQUEST
return None
except Exception: # noqa: BLE001
return None
@method_decorator(xframe_options_sameorigin, name="dispatch")
@method_decorator(csrf_exempt, name="dispatch")
class SAMLUnifiedView(View):
"""Unified SAML endpoint - handles SSO and SLO based on message type.
The operation type is determined by parsing
the incoming SAML message:
- AuthnRequest -> SSO flow (delegates to SAMLSSOBindingRedirectView/POSTView)
- LogoutRequest -> SLO flow (delegates to SPInitiatedSLOBindingRedirectView/POSTView)
- LogoutResponse -> SLO completion (delegates to SPInitiatedSLOBindingRedirectView/POSTView)
"""
def dispatch(self, request: HttpRequest, application_slug: str) -> HttpResponse:
"""Route the request based on SAML message type."""
# ak user was not logged in, redirected to login, and is back with POST payload in session

if SESSION_KEY_POST in request.session:
return self._delegate_to_sso(request, application_slug, is_post_binding=True)
# Determine binding from HTTP method
is_post_binding = request.method == "POST"
data = request.POST if is_post_binding else request.GET
# LogoutResponse - delegate to SLO view (handles it in dispatch)
if REQUEST_KEY_SAML_RESPONSE in data:
return self._delegate_to_slo(request, application_slug, is_post_binding)
# Check for SAML request
if REQUEST_KEY_SAML_REQUEST not in data:
LOGGER.info("SAML payload missing")
return bad_request_message(request, "The SAML request payload is missing.")
# Detect message type and delegate
saml_request = data[REQUEST_KEY_SAML_REQUEST]
message_type = detect_saml_message_type(saml_request, is_post_binding)
if message_type == SAML_MESSAGE_TYPE_AUTHN_REQUEST:
return self._delegate_to_sso(request, application_slug, is_post_binding)
elif message_type == SAML_MESSAGE_TYPE_LOGOUT_REQUEST:
return self._delegate_to_slo(request, application_slug, is_post_binding)
else:
LOGGER.warning("Unknown SAML message type", message_type=message_type)
return bad_request_message(
request, f"Unsupported SAML message type: {message_type or 'unknown'}"
)
def _delegate_to_sso(
self, request: HttpRequest, application_slug: str, is_post_binding: bool
) -> HttpResponse:
"""Delegate to the appropriate SSO view."""
if is_post_binding:
view = SAMLSSOBindingPOSTView.as_view()
else:
view = SAMLSSOBindingRedirectView.as_view()
return view(request, application_slug=application_slug)
def _delegate_to_slo(
self, request: HttpRequest, application_slug: str, is_post_binding: bool
) -> HttpResponse:
"""Delegate to the appropriate SLO view."""
if is_post_binding:
view = SPInitiatedSLOBindingPOSTView.as_view()
else:
view = SPInitiatedSLOBindingRedirectView.as_view()
return view(request, application_slug=application_slug)

View File

@@ -1,6 +1,5 @@
"""SCIM Provider API Views"""
from rest_framework.fields import SerializerMethodField
from rest_framework.viewsets import ModelViewSet
from authentik.core.api.providers import ProviderSerializer
@@ -17,11 +16,6 @@ class SCIMProviderSerializer(
):
"""SCIMProvider Serializer"""
auth_oauth_token_last_updated = SerializerMethodField()
auth_oauth_token_expires = SerializerMethodField()
auth_oauth_url_callback = SerializerMethodField()
auth_oauth_url_start = SerializerMethodField()
class Meta:
model = SCIMProvider
fields = [
@@ -41,10 +35,6 @@ class SCIMProviderSerializer(
"auth_mode",
"auth_oauth",
"auth_oauth_params",
"auth_oauth_token_last_updated",
"auth_oauth_token_expires",
"auth_oauth_url_callback",
"auth_oauth_url_start",
"compatibility_mode",
"service_provider_config_cache_timeout",
"exclude_users_service_account",

View File

@@ -102,16 +102,4 @@ class Migration(migrations.Migration):
verbose_name="SCIM Compatibility Mode",
),
),
migrations.AlterField(
model_name="scimprovider",
name="auth_mode",
field=models.TextField(
choices=[
("token", "Token"),
("oauth", "OAuth (Silent)"),
("oauth_interactive", "OAuth (interactive)"),
],
default="token",
),
),
]

View File

@@ -72,8 +72,7 @@ class SCIMAuthenticationMode(models.TextChoices):
"""SCIM authentication modes"""
TOKEN = "token", _("Token")
OAUTH_SILENT = "oauth", _("OAuth (Silent)")
OAUTH_INTERACTIVE = "oauth_interactive", _("OAuth (interactive)")
OAUTH = "oauth", _("OAuth")
class SCIMCompatibilityMode(models.TextChoices):
@@ -145,10 +144,7 @@ class SCIMProvider(OutgoingSyncProvider, BackchannelProvider):
)
def scim_auth(self) -> AuthBase:
if self.auth_mode in [
SCIMAuthenticationMode.OAUTH_SILENT,
SCIMAuthenticationMode.OAUTH_INTERACTIVE,
]:
if self.auth_mode == SCIMAuthenticationMode.OAUTH:
try:
from authentik.enterprise.providers.scim.auth_oauth2 import SCIMOAuthAuth

View File

@@ -6,7 +6,6 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_core", "0056_user_roles"), # must run before group field is removed
("authentik_rbac", "0009_remove_initialpermissions_mode"),
]

View File

@@ -187,7 +187,6 @@ SPECTACULAR_SETTINGS = {
"PolicyEngineMode": "authentik.policies.models.PolicyEngineMode",
"PromptTypeEnum": "authentik.stages.prompt.models.FieldTypes",
"ProxyMode": "authentik.providers.proxy.models.ProxyMode",
"RedirectURITypeEnum": "authentik.providers.oauth2.models.RedirectURIType",
"SAMLBindingsEnum": "authentik.providers.saml.models.SAMLBindings",
"SAMLLogoutMethods": "authentik.providers.saml.models.SAMLLogoutMethods",
"SAMLNameIDPolicyEnum": "authentik.sources.saml.models.SAMLNameIDPolicy",
@@ -221,7 +220,7 @@ REST_FRAMEWORK = {
"authentik.api.search.ql.QLSearch",
"authentik.rbac.filters.ObjectFilter",
"django_filters.rest_framework.DjangoFilterBackend",
"authentik.api.ordering.NullsAwareOrderingFilter",
"rest_framework.filters.OrderingFilter",
],
"DEFAULT_PERMISSION_CLASSES": ("authentik.rbac.permissions.ObjectPermissions",),
"DEFAULT_AUTHENTICATION_CLASSES": (

View File

@@ -1,5 +1,6 @@
"""Source type manager"""
from collections.abc import Callable
from enum import Enum
from typing import Any
@@ -113,7 +114,7 @@ class SourceTypeRegistry:
)
return found_type
def find(self, type_name: str, kind: RequestKind) -> type[OAuthCallback | OAuthRedirect]:
def find(self, type_name: str, kind: RequestKind) -> Callable:
"""Find fitting Source Type"""
found_type = self.find_type(type_name)
if kind == RequestKind.CALLBACK:

View File

@@ -15,7 +15,6 @@ from structlog.stdlib import get_logger
from authentik.core.sources.flow_manager import SourceFlowManager
from authentik.events.models import Event, EventAction
from authentik.sources.oauth.clients.base import BaseOAuthClient
from authentik.sources.oauth.models import (
GroupOAuthSourceConnection,
OAuthSource,
@@ -30,7 +29,7 @@ class OAuthCallback(OAuthClientMixin, View):
"Base OAuth callback view."
source: OAuthSource
token: dict[str, Any] | None = None
token: dict | None = None
def dispatch(self, request: HttpRequest, *_, **kwargs) -> HttpResponse:
"""View Get handler"""
@@ -50,31 +49,20 @@ class OAuthCallback(OAuthClientMixin, View):
if "error" in self.token:
return self.handle_login_failure(self.token["error"])
# Fetch profile info
try:
res = self.redirect_flow_manager(client)
except ValueError as exc:
# if we're authenticated and not in a source stage and this new flag is enabled,
# just continue
if self.request.user.is_authenticated:
pass
return self.handle_login_failure(exc.args[0])
return res
def redirect_flow_manager(self, client: BaseOAuthClient) -> HttpResponse:
try:
raw_info = client.get_profile_info(self.token)
if raw_info is None:
raise ValueError("Could not retrieve profile.")
return self.handle_login_failure("Could not retrieve profile.")
except JSONDecodeError as exc:
Event.new(
EventAction.CONFIGURATION_ERROR,
message="Failed to JSON-decode profile.",
raw_profile=exc.doc,
).from_http(self.request)
raise ValueError("Could not retrieve profile.") from None
return self.handle_login_failure("Could not retrieve profile.")
identifier = self.get_user_id(info=raw_info)
if identifier is None:
raise ValueError("Could not determine id.")
return self.handle_login_failure("Could not determine id.")
sfm = OAuthSourceFlowManager(
source=self.source,
request=self.request,

View File

@@ -1,7 +1,6 @@
"""authentik saml source processor"""
from base64 import b64decode
from datetime import UTC, datetime
from time import mktime
from typing import TYPE_CHECKING
@@ -41,7 +40,6 @@ from authentik.sources.saml.exceptions import (
InvalidSignature,
MismatchedRequestID,
MissingSAMLResponse,
SAMLException,
UnsupportedNameIDFormat,
)
from authentik.sources.saml.models import (
@@ -97,7 +95,6 @@ class ResponseProcessor:
self._verify_request_id()
self._verify_status()
self._verify_conditions()
def _decrypt_response(self):
"""Decrypt SAMLResponse EncryptedAssertion Element"""
@@ -129,20 +126,6 @@ class ResponseProcessor:
)
self._assertion = decrypted_assertion
def _verify_conditions(self):
conditions = self.get_assertion().find(f"{{{NS_SAML_ASSERTION}}}Conditions")
if conditions is None:
return
_now = now()
before = conditions.attrib.get("NotBefore")
if before:
if datetime.fromisoformat(before).replace(tzinfo=UTC) > _now:
raise SAMLException("Assertion is not valid yet or expired.")
on_or_after = conditions.attrib.get("NotOnOrAfter")
if on_or_after:
if datetime.fromisoformat(on_or_after).replace(tzinfo=UTC) < _now:
raise SAMLException("Assertion is not valid yet or expired.")
def _verify_signature(self, signature_node: _Element):
"""Verify a single signature node"""
xmlsec.tree.add_ids(self._root, ["ID"])
@@ -232,9 +215,10 @@ class ResponseProcessor:
user has an attribute that refers to our Source for cleanup. The user is also deleted
on logout and periodically."""
# Create a temporary User
name_id_el, name_id = self._get_name_id()
name_id = self._get_name_id()
username = name_id.text
# trim username to ensure it is max 150 chars
username = f"ak-{name_id[: USERNAME_MAX_LENGTH - 14]}-transient"
username = f"ak-{username[: USERNAME_MAX_LENGTH - 14]}-transient"
expiry = mktime(
(now() + timedelta_from_string(self._source.temporary_user_delete_after)).timetuple()
)
@@ -250,18 +234,20 @@ class ResponseProcessor:
},
path=self._source.get_user_path(),
)
LOGGER.debug("Created temporary user for NameID Transient", username=name_id)
LOGGER.debug("Created temporary user for NameID Transient", username=name_id.text)
user.set_unusable_password()
user.save()
UserSAMLSourceConnection.objects.create(source=self._source, user=user, identifier=name_id)
UserSAMLSourceConnection.objects.create(
source=self._source, user=user, identifier=name_id.text
)
return SAMLSourceFlowManager(
source=self._source,
request=self._http_request,
identifier=str(name_id),
identifier=str(name_id.text),
user_info={
"root": self._root,
"assertion": self.get_assertion(),
"name_id": name_id_el,
"name_id": name_id,
},
policy_context={},
)
@@ -272,7 +258,7 @@ class ResponseProcessor:
return self._assertion
return self._root.find(f"{{{NS_SAML_ASSERTION}}}Assertion")
def _get_name_id(self) -> tuple[Element, str]:
def _get_name_id(self) -> Element:
"""Get NameID Element"""
assertion = self.get_assertion()
if assertion is None:
@@ -283,11 +269,12 @@ class ResponseProcessor:
name_id = subject.find(f"{{{NS_SAML_ASSERTION}}}NameID")
if name_id is None:
raise ValueError("NameID element not found")
return name_id, "".join(name_id.itertext())
return name_id
def _get_name_id_filter(self) -> dict[str, str]:
"""Returns the subject's NameID as a Filter for the `User`"""
name_id_el, name_id = self._get_name_id()
name_id_el = self._get_name_id()
name_id = name_id_el.text
if not name_id:
raise UnsupportedNameIDFormat("Subject's NameID is empty.")
_format = name_id_el.attrib["Format"]
@@ -308,26 +295,26 @@ class ResponseProcessor:
def prepare_flow_manager(self) -> SourceFlowManager:
"""Prepare flow plan depending on whether or not the user exists"""
name_id_el, name_id = self._get_name_id()
name_id = self._get_name_id()
# Sanity check, show a warning if NameIDPolicy doesn't match what we got
if self._source.name_id_policy != name_id_el.attrib["Format"]:
if self._source.name_id_policy != name_id.attrib["Format"]:
LOGGER.warning(
"NameID from IdP doesn't match our policy",
expected=self._source.name_id_policy,
got=name_id_el.attrib["Format"],
got=name_id.attrib["Format"],
)
# transient NameIDs are handled separately as they don't have to go through flows.
if name_id_el.attrib["Format"] == SAML_NAME_ID_FORMAT_TRANSIENT:
if name_id.attrib["Format"] == SAML_NAME_ID_FORMAT_TRANSIENT:
return self._handle_name_id_transient()
return SAMLSourceFlowManager(
source=self._source,
request=self._http_request,
identifier=str(name_id),
identifier=str(name_id.text),
user_info={
"root": self._root,
"assertion": self.get_assertion(),
"name_id": name_id_el,
"name_id": name_id,
},
policy_context={
"saml_response": etree.tostring(self._root),

View File

@@ -4,7 +4,6 @@ from base64 import b64encode
from defusedxml.lxml import fromstring
from django.test import TestCase
from freezegun import freeze_time
from authentik.common.saml.constants import NS_SAML_ASSERTION
from authentik.core.tests.utils import RequestFactory, create_test_flow
@@ -35,7 +34,6 @@ class TestPropertyMappings(TestCase):
pre_authentication_flow=create_test_flow(),
)
@freeze_time("2022-10-14T14:15:00")
def test_user_base_properties(self):
"""Test user base properties"""
properties = self.source.get_base_user_properties(
@@ -63,7 +61,6 @@ class TestPropertyMappings(TestCase):
properties = self.source.get_base_group_properties(root=ROOT, group_id=group_id)
self.assertEqual(properties, {"name": group_id})
@freeze_time("2022-10-14T14:15:00")
def test_user_property_mappings(self):
"""Test user property mappings"""
self.source.user_property_mappings.add(
@@ -97,7 +94,6 @@ class TestPropertyMappings(TestCase):
},
)
@freeze_time("2022-10-14T14:15:00")
def test_group_property_mappings(self):
"""Test group property mappings"""
self.source.group_property_mappings.add(

View File

@@ -3,7 +3,6 @@
from base64 import b64encode
from django.test import TestCase
from freezegun import freeze_time
from authentik.core.tests.utils import RequestFactory, create_test_cert, create_test_flow
from authentik.crypto.models import CertificateKeyPair
@@ -47,7 +46,6 @@ class TestResponseProcessor(TestCase):
):
ResponseProcessor(self.source, request).parse()
@freeze_time("2022-10-14T14:15:00")
def test_success(self):
"""Test success"""
request = self.factory.post(
@@ -74,7 +72,6 @@ class TestResponseProcessor(TestCase):
},
)
@freeze_time("2022-10-14T14:16:40Z")
def test_success_with_status_message_and_detail(self):
"""Test success with StatusMessage and StatusDetail present (should not raise error)"""
request = self.factory.post(
@@ -91,7 +88,6 @@ class TestResponseProcessor(TestCase):
sfm = parser.prepare_flow_manager()
self.assertEqual(sfm.user_properties["username"], "jens@goauthentik.io")
@freeze_time("2022-10-14T14:16:40Z")
def test_error_with_message_and_detail(self):
"""Test error status with StatusMessage and StatusDetail includes both in error"""
request = self.factory.post(
@@ -109,7 +105,6 @@ class TestResponseProcessor(TestCase):
self.assertIn("User account is disabled", str(ctx.exception))
self.assertIn("Authentication failed", str(ctx.exception))
@freeze_time("2024-08-07T15:48:09.325Z")
def test_encrypted_correct(self):
"""Test encrypted"""
key = load_fixture("fixtures/encrypted-key.pem")
@@ -147,7 +142,6 @@ class TestResponseProcessor(TestCase):
with self.assertRaises(InvalidEncryption):
parser.parse()
@freeze_time("2022-10-14T14:16:40Z")
def test_verification_assertion(self):
"""Test verifying signature inside assertion"""
key = load_fixture("fixtures/signature_cert.pem")
@@ -170,7 +164,6 @@ class TestResponseProcessor(TestCase):
parser = ResponseProcessor(self.source, request)
parser.parse()
@freeze_time("2014-07-17T01:02:18Z")
def test_verification_assertion_duplicate(self):
"""Test verifying signature inside assertion, where the response has another assertion
before our signed assertion"""
@@ -193,35 +186,9 @@ class TestResponseProcessor(TestCase):
parser = ResponseProcessor(self.source, request)
parser.parse()
self.assertNotEqual(parser._get_name_id()[1], "bad")
self.assertEqual(parser._get_name_id()[1], "_ce3d2948b4cf20146dee0a0b3dd6f69b6cf86f62d7")
self.assertNotEqual(parser._get_name_id().text, "bad")
self.assertEqual(parser._get_name_id().text, "_ce3d2948b4cf20146dee0a0b3dd6f69b6cf86f62d7")
@freeze_time("2022-10-14T14:15:00")
def test_name_id_comment(self):
"""Test comment in name ID"""
fixture = load_fixture("fixtures/response_signed_assertion_dup.xml")
fixture = fixture.replace(
"_ce3d2948b4cf20146dee0a0b3dd6f69b6cf86f62d7",
"_ce3d2948b4cf20146dee0a0b3dd6f<!--x-->69b6cf86f62d7",
)
key = load_fixture("fixtures/signature_cert.pem")
kp = CertificateKeyPair.objects.create(
name=generate_id(),
certificate_data=key,
)
self.source.verification_kp = kp
self.source.signed_assertion = True
self.source.signed_response = False
request = self.factory.post(
"/",
data={"SAMLResponse": b64encode(fixture.encode()).decode()},
)
parser = ResponseProcessor(self.source, request)
parser.parse()
self.assertEqual(parser._get_name_id()[1], "_ce3d2948b4cf20146dee0a0b3dd6f69b6cf86f62d7")
@freeze_time("2014-07-17T01:02:18Z")
def test_verification_response(self):
"""Test verifying signature inside response"""
key = load_fixture("fixtures/signature_cert.pem")
@@ -244,7 +211,6 @@ class TestResponseProcessor(TestCase):
parser = ResponseProcessor(self.source, request)
parser.parse()
@freeze_time("2024-01-18T06:20:48Z")
def test_verification_response_and_assertion(self):
"""Test verifying signature inside response and assertion"""
key = load_fixture("fixtures/signature_cert.pem")
@@ -291,7 +257,6 @@ class TestResponseProcessor(TestCase):
with self.assertRaisesMessage(InvalidSignature, ""):
parser.parse()
@freeze_time("2022-10-14T14:15:00")
def test_verification_no_signature(self):
"""Test rejecting response without signature when signed_assertion is True"""
key = load_fixture("fixtures/signature_cert.pem")
@@ -338,7 +303,6 @@ class TestResponseProcessor(TestCase):
with self.assertRaisesMessage(InvalidSignature, ""):
parser.parse()
@freeze_time("2025-10-30T05:45:47.619Z")
def test_signed_encrypted_response(self):
"""Test signed & encrypted response"""
verification_key = load_fixture("fixtures/signature_cert2.pem")
@@ -366,7 +330,6 @@ class TestResponseProcessor(TestCase):
parser = ResponseProcessor(self.source, request)
parser.parse()
@freeze_time("2026-01-21T14:23")
def test_transient(self):
"""Test SAML transient NameID"""
verification_key = load_fixture("fixtures/signature_cert2.pem")

View File

@@ -4,7 +4,6 @@ from base64 import b64encode
from django.test import RequestFactory, TestCase
from django.urls import reverse
from freezegun import freeze_time
from authentik.core.tests.utils import create_test_flow
from authentik.flows.planner import PLAN_CONTEXT_REDIRECT, FlowPlan
@@ -27,7 +26,6 @@ class TestViews(TestCase):
pre_authentication_flow=create_test_flow(),
)
@freeze_time("2022-10-14T14:15:00")
def test_enroll(self):
"""Enroll"""
flow = create_test_flow()
@@ -54,7 +52,6 @@ class TestViews(TestCase):
plan: FlowPlan = self.client.session.get(SESSION_KEY_PLAN)
self.assertIsNotNone(plan)
@freeze_time("2022-10-14T14:15:00")
def test_enroll_redirect(self):
"""Enroll when attempting to access a provider"""
initial_redirect = f"http://{generate_id()}"

View File

@@ -389,19 +389,17 @@ class ThrottlingMixin(models.Model):
"""Check if throttling is enabled"""
return self.get_throttle_factor() > 0
def get_throttle_factor(self) -> float: # pragma: no cover
def get_throttle_factor(self): # pragma: no cover
"""
Returns the throttling factor.
"""
return getattr(self, "_throttle_factor", 1.0)
def set_throttle_factor(self, throttle_factor: float) -> None:
"""
Sets the throttle factor to use. Call this to override the default value of 1.
This must be implemented to return the throttle factor.
The number of seconds required between verification attempts will be
:math:`c2^{n-1}` where `c` is this factor and `n` is the number of
previous failures. A factor of 1 translates to delays of 1, 2, 4, 8,
etc. seconds. A factor of 0 disables the throttling.
Normally this is just a wrapper for a plugin-specific setting like
:setting:`OTP_EMAIL_THROTTLE_FACTOR`.
"""
self._throttle_factor = throttle_factor
raise NotImplementedError()
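The restored docstring describes the backoff as :math:`c2^{n-1}` seconds after `n` successive failures, with a factor of 0 disabling throttling. A standalone sketch of that schedule (the `throttle_delay` helper name is illustrative, not part of the mixin):

```python
from datetime import timedelta

def throttle_delay(factor: float, failure_count: int) -> timedelta:
    """Seconds required before the next attempt: factor * 2**(n-1)."""
    if factor <= 0 or failure_count == 0:
        return timedelta(0)  # factor 0 disables throttling
    return timedelta(seconds=factor * 2 ** (failure_count - 1))

# factor 1 -> delays of 1, 2, 4, 8 seconds after successive failures
assert [throttle_delay(1, n).total_seconds() for n in (1, 2, 3, 4)] == [1, 2, 4, 8]
assert throttle_delay(0, 5) == timedelta(0)
```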

View File

@@ -6,6 +6,7 @@ from threading import Thread
from django.contrib.auth.models import AnonymousUser
from django.db import connection
from django.test import TestCase, TransactionTestCase
from django.test.utils import override_settings
from django.utils import timezone
from freezegun import freeze_time
@@ -109,24 +110,8 @@ class ThrottlingTestMixin:
self.assertEqual(verify_is_allowed3, True)
self.assertEqual(data3, None)
def test_set_throttle_factor_is_reflected(self):
"""`set_throttle_factor` must drive `get_throttle_factor`."""
self.device.set_throttle_factor(5.5)
self.assertEqual(self.device.get_throttle_factor(), 5.5)
self.device.set_throttle_factor(0)
self.assertEqual(self.device.get_throttle_factor(), 0)
def test_throttling_disabled_by_factor_zero(self):
"""Setting the throttle factor to 0 must actually disable throttling.
A failed attempt followed by a successful one must succeed. The lockout
path must not kick in when the factor is 0.
"""
self.device.set_throttle_factor(0)
self.assertFalse(self.device.verify_token(self.invalid_token()))
self.assertTrue(self.device.verify_token(self.valid_token()))
@override_settings(OTP_STATIC_THROTTLE_FACTOR=0)
class APITestCase(TestCase):
"""Test API"""
@@ -134,7 +119,6 @@ class APITestCase(TestCase):
self.alice = create_test_admin_user("alice")
self.bob = create_test_admin_user("bob")
device = self.alice.staticdevice_set.create()
device.set_throttle_factor(0)
self.valid = generate_id(length=16)
device.token_set.create(token=self.valid)
@@ -154,8 +138,6 @@ class APITestCase(TestCase):
verified = verify_token(self.alice, device.persistent_id, "bogus")
self.assertIsNone(verified)
self.alice.staticdevice_set.get().throttle_reset()
verified = verify_token(self.alice, device.persistent_id, self.valid)
self.assertIsNotNone(verified)
@@ -164,12 +146,11 @@ class APITestCase(TestCase):
verified = match_token(self.alice, "bogus")
self.assertIsNone(verified)
self.alice.staticdevice_set.get().throttle_reset()
verified = match_token(self.alice, self.valid)
self.assertEqual(verified, self.alice.staticdevice_set.first())
@override_settings(OTP_STATIC_THROTTLE_FACTOR=0)
class ConcurrencyTestCase(TransactionTestCase):
"""Test concurrent verifications"""

View File

@@ -1,33 +0,0 @@
# Generated by Django 5.2.12 on 2026-04-02 15:14
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
(
"authentik_stages_authenticator_email",
"0002_alter_authenticatoremailstage_friendly_name",
),
]
operations = [
migrations.AddField(
model_name="emaildevice",
name="throttling_failure_count",
field=models.PositiveIntegerField(
default=0, help_text="Number of successive failed attempts."
),
),
migrations.AddField(
model_name="emaildevice",
name="throttling_failure_timestamp",
field=models.DateTimeField(
blank=True,
default=None,
help_text="A timestamp of the last failed verification attempt. Null if last attempt succeeded.",
null=True,
),
),
]

View File

@@ -14,7 +14,7 @@ from authentik.flows.models import ConfigurableStage, FriendlyNamedStage, Stage
from authentik.lib.config import CONFIG
from authentik.lib.models import SerializerModel
from authentik.lib.utils.time import timedelta_string_validator
from authentik.stages.authenticator.models import SideChannelDevice, ThrottlingMixin
from authentik.stages.authenticator.models import SideChannelDevice
from authentik.stages.email.models import EmailTemplates
from authentik.stages.email.utils import TemplateEmailMessage
@@ -116,7 +116,7 @@ class AuthenticatorEmailStage(ConfigurableStage, FriendlyNamedStage, Stage):
verbose_name_plural = _("Email Authenticator Setup Stages")
class EmailDevice(SerializerModel, ThrottlingMixin, SideChannelDevice):
class EmailDevice(SerializerModel, SideChannelDevice):
"""Email Device"""
user = models.ForeignKey(get_user_model(), on_delete=models.CASCADE)
@@ -130,20 +130,6 @@ class EmailDevice(SerializerModel, ThrottlingMixin, SideChannelDevice):
return EmailDeviceSerializer
def verify_token(self, token: str) -> bool:
verify_allowed, _ = self.verify_is_allowed()
if verify_allowed:
verified = super().verify_token(token)
if verified:
self.throttle_reset()
else:
self.throttle_increment()
else:
verified = False
return verified
def _compose_email(self) -> TemplateEmailMessage:
try:
pending_user = self.user

View File

@@ -8,7 +8,6 @@ from django.core.mail.backends.locmem import EmailBackend
from django.core.mail.backends.smtp import EmailBackend as SMTPEmailBackend
from django.db.utils import IntegrityError
from django.template.exceptions import TemplateDoesNotExist
from django.test import TestCase
from django.urls import reverse
from django.utils.timezone import now
@@ -17,7 +16,6 @@ from authentik.flows.models import FlowStageBinding
from authentik.flows.tests import FlowTestCase
from authentik.lib.config import CONFIG
from authentik.lib.utils.email import mask_email
from authentik.stages.authenticator.tests import ThrottlingTestMixin
from authentik.stages.authenticator_email.api import (
AuthenticatorEmailStageSerializer,
EmailDeviceSerializer,
@@ -81,7 +79,6 @@ class TestAuthenticatorEmailStage(FlowTestCase):
self.assertFalse(self.device.verify_token("000000"))
# Verify correct token (should clear token after verification)
self.device.throttle_reset(commit=False)
self.assertTrue(self.device.verify_token(token))
self.assertIsNone(self.device.token)
@@ -332,27 +329,3 @@ class TestAuthenticatorEmailStage(FlowTestCase):
# Test AuthenticatorEmailStage send method
self.stage.send(self.device)
self.assertEqual(len(mail.outbox), 1)
class TestEmailDeviceThrottling(ThrottlingTestMixin, TestCase):
def setUp(self):
super().setUp()
flow = create_test_flow()
user = create_test_user()
stage = AuthenticatorEmailStage.objects.create(
name="email-authenticator-throttle",
use_global_settings=True,
from_address="test@authentik.local",
configure_flow=flow,
token_expiry="minutes=30",
) # nosec
self.device = EmailDevice.objects.create(
user=user, stage=stage, email="throttle@authentik.local"
)
self.device.generate_token()
def valid_token(self):
return self.device.token
def invalid_token(self):
return "000000" if self.device.token != "000000" else "111111"

View File

@@ -1,30 +0,0 @@
# Generated by Django 5.2.12 on 2026-04-16 17:28
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("authentik_stages_authenticator_sms", "0008_alter_authenticatorsmsstage_friendly_name"),
]
operations = [
migrations.AddField(
model_name="smsdevice",
name="throttling_failure_count",
field=models.PositiveIntegerField(
default=0, help_text="Number of successive failed attempts."
),
),
migrations.AddField(
model_name="smsdevice",
name="throttling_failure_timestamp",
field=models.DateTimeField(
blank=True,
default=None,
help_text="A timestamp of the last failed verification attempt. Null if last attempt succeeded.",
null=True,
),
),
]

View File

@@ -20,7 +20,7 @@ from authentik.events.utils import sanitize_item
from authentik.flows.models import ConfigurableStage, FriendlyNamedStage, Stage
from authentik.lib.models import SerializerModel
from authentik.lib.utils.http import get_http_session
from authentik.stages.authenticator.models import SideChannelDevice, ThrottlingMixin
from authentik.stages.authenticator.models import SideChannelDevice
LOGGER = get_logger()
@@ -197,7 +197,7 @@ def hash_phone_number(phone_number: str) -> str:
return "hash:" + sha256(phone_number.encode()).hexdigest()
class SMSDevice(SerializerModel, ThrottlingMixin, SideChannelDevice):
class SMSDevice(SerializerModel, SideChannelDevice):
"""SMS Device"""
user = models.ForeignKey(get_user_model(), on_delete=models.CASCADE)
@@ -224,19 +224,11 @@ class SMSDevice(SerializerModel, ThrottlingMixin, SideChannelDevice):
return SMSDeviceSerializer
def verify_token(self, token: str) -> bool:
verify_allowed, _ = self.verify_is_allowed()
if verify_allowed:
verified = super().verify_token(token)
if verified:
self.throttle_reset()
else:
self.throttle_increment()
else:
verified = False
return verified
def verify_token(self, token):
valid = super().verify_token(token)
if valid:
self.save()
return valid
def __str__(self):
return str(self.name) or str(self.user_id)

View File

@@ -3,7 +3,6 @@
from unittest.mock import MagicMock, patch
from urllib.parse import parse_qsl
from django.test import TestCase
from django.urls import reverse
from requests_mock import Mocker
@@ -13,7 +12,6 @@ from authentik.flows.planner import FlowPlan
from authentik.flows.tests import FlowTestCase
from authentik.flows.views.executor import SESSION_KEY_PLAN
from authentik.lib.generators import generate_id
from authentik.stages.authenticator.tests import ThrottlingTestMixin
from authentik.stages.authenticator_sms.models import (
AuthenticatorSMSStage,
SMSDevice,
@@ -359,30 +357,3 @@ class AuthenticatorSMSStageTests(FlowTestCase):
},
phone_number_required=False,
)
class TestSMSDeviceThrottling(ThrottlingTestMixin, TestCase):
"""Test ThrottlingMixin behaviour on SMSDevice.verify_token"""
def setUp(self):
super().setUp()
flow = create_test_flow()
user = create_test_admin_user()
stage = AuthenticatorSMSStage.objects.create(
flow=flow,
name="sms-throttle",
provider=SMSProviders.GENERIC,
from_number="1234",
)
self.device = SMSDevice.objects.create(
user=user,
stage=stage,
phone_number="+15551230001",
)
self.device.generate_token()
def valid_token(self):
return self.device.token
def invalid_token(self):
return "000000" if self.device.token != "000000" else "111111"

View File

@@ -3,6 +3,7 @@
from base64 import b32encode
from os import urandom
from django.conf import settings
from django.core.validators import MaxValueValidator
from django.db import models
from django.utils.translation import gettext_lazy as _
@@ -77,6 +78,9 @@ class StaticDevice(SerializerModel, ThrottlingMixin, Device):
return StaticDeviceSerializer
def get_throttle_factor(self):
return getattr(settings, "OTP_STATIC_THROTTLE_FACTOR", 1)
def verify_token(self, token):
verify_allowed, _ = self.verify_is_allowed()
if verify_allowed:

View File

@@ -1,5 +1,6 @@
"""Test Static API"""
from django.test.utils import override_settings
from django.urls import reverse
from rest_framework.test import APITestCase
@@ -43,6 +44,9 @@ class DeviceTest(TestCase):
str(device)
@override_settings(
OTP_STATIC_THROTTLE_FACTOR=1,
)
class ThrottlingTestCase(ThrottlingTestMixin, TestCase):
"""Test static device throttling"""

View File

@@ -194,6 +194,9 @@ class TOTPDevice(SerializerModel, ThrottlingMixin, Device):
return verified
def get_throttle_factor(self):
return getattr(settings, "OTP_TOTP_THROTTLE_FACTOR", 1)
@property
def config_url(self):
"""

View File

@@ -63,14 +63,11 @@ class TOTPDeviceMixin:
@override_settings(
OTP_TOTP_SYNC=False,
OTP_TOTP_THROTTLE_FACTOR=0,
)
class TOTPTest(TOTPDeviceMixin, TestCase):
"""TOTP tests"""
def setUp(self):
super().setUp()
self.device.set_throttle_factor(0)
def test_default_key(self):
"""Ensure default_key is valid"""
device = self.alice.totpdevice_set.create()
@@ -193,6 +190,9 @@ class TOTPTest(TOTPDeviceMixin, TestCase):
self.assertEqual(params["image"][0], image_url)
@override_settings(
OTP_TOTP_THROTTLE_FACTOR=1,
)
class ThrottlingTestCase(TOTPDeviceMixin, ThrottlingTestMixin, TestCase):
"""Test TOTP Throttling"""

View File

@@ -39,10 +39,6 @@ class AuthenticatorValidateStageSerializer(StageSerializer):
"webauthn_hints",
"webauthn_allowed_device_types",
"webauthn_allowed_device_types_obj",
"email_otp_throttling_factor",
"sms_otp_throttling_factor",
"totp_otp_throttling_factor",
"static_otp_throttling_factor",
]

View File

@@ -3,7 +3,6 @@
from typing import TYPE_CHECKING
from urllib.parse import urlencode
from django.db import transaction
from django.http import HttpRequest
from django.http.response import Http404
from django.shortcuts import get_object_or_404
@@ -30,8 +29,8 @@ from authentik.flows.stage import StageView
from authentik.lib.utils.email import mask_email
from authentik.lib.utils.time import timedelta_from_string
from authentik.root.middleware import ClientIPMiddleware
from authentik.stages.authenticator import devices_for_user
from authentik.stages.authenticator.models import Device, ThrottlingMixin
from authentik.stages.authenticator import match_token
from authentik.stages.authenticator.models import Device
from authentik.stages.authenticator_duo.models import AuthenticatorDuoStage, DuoDevice
from authentik.stages.authenticator_email.models import EmailDevice
from authentik.stages.authenticator_sms.models import SMSDevice
@@ -144,20 +143,7 @@ def select_challenge_email(request: HttpRequest, device: EmailDevice):
def validate_challenge_code(code: str, stage_view: StageView, user: User) -> Device:
"""Validate code-based challenges. We test against every device, on purpose, as
the user mustn't choose between totp and static devices."""
with transaction.atomic():
for device in devices_for_user(user, for_verify=True):
if isinstance(device, ThrottlingMixin):
throttling_factor = stage_view.executor.current_stage.get_throttling_factor(
DeviceClasses.from_model_label(device.model_label())
)
if throttling_factor is not None:
device.set_throttle_factor(throttling_factor)
if device.verify_token(code):
break
else:
device = None
device = match_token(user, code)
if not device:
login_failed.send(
sender=__name__,
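The rewritten `validate_challenge_code` delegates to `match_token`, which keeps the deliberate every-device behaviour of the old loop: the token is tried against each of the user's devices and the first one that verifies wins, so the user never has to choose between TOTP and static devices. A self-contained sketch of that semantics (the `Device` stub is illustrative, not authentik's model):

```python
from typing import Iterable, Optional

class Device:
    """Minimal stand-in for an OTP device that accepts one fixed token."""
    def __init__(self, token: str):
        self._token = token
    def verify_token(self, token: str) -> bool:
        return token == self._token

def match_token(devices: Iterable[Device], token: str) -> Optional[Device]:
    # First device that accepts the token wins; None if nobody does.
    for device in devices:
        if device.verify_token(token):
            return device
    return None

totp, static = Device("123456"), Device("abcdef")
assert match_token([totp, static], "abcdef") is static
assert match_token([totp, static], "999999") is None
```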

View File

@@ -1,36 +0,0 @@
# Generated by Django 5.2.12 on 2026-04-16 16:33
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
(
"authentik_stages_authenticator_validate",
"0015_authenticatorvalidatestage_webauthn_hints",
),
]
operations = [
migrations.AddField(
model_name="authenticatorvalidatestage",
name="email_otp_throttling_factor",
field=models.FloatField(default=1),
),
migrations.AddField(
model_name="authenticatorvalidatestage",
name="sms_otp_throttling_factor",
field=models.FloatField(default=1),
),
migrations.AddField(
model_name="authenticatorvalidatestage",
name="static_otp_throttling_factor",
field=models.FloatField(default=1),
),
migrations.AddField(
model_name="authenticatorvalidatestage",
name="totp_otp_throttling_factor",
field=models.FloatField(default=1),
),
]

View File

@@ -22,12 +22,6 @@ class DeviceClasses(models.TextChoices):
SMS = "sms", _("SMS")
EMAIL = "email", _("Email")
@staticmethod
def from_model_label(model_label: str) -> DeviceClasses:
return getattr(
DeviceClasses, model_label.rsplit(".", maxsplit=1)[-1][: -len("device")].upper()
)
def default_device_classes() -> list:
"""By default, accept all device classes"""
@@ -88,11 +82,6 @@ class AuthenticatorValidateStage(Stage):
"authentik_stages_authenticator_webauthn.WebAuthnDeviceType", blank=True
)
email_otp_throttling_factor = models.FloatField(default=1)
sms_otp_throttling_factor = models.FloatField(default=1)
totp_otp_throttling_factor = models.FloatField(default=1)
static_otp_throttling_factor = models.FloatField(default=1)
@property
def serializer(self) -> type[BaseSerializer]:
from authentik.stages.authenticator_validate.api import AuthenticatorValidateStageSerializer
@@ -109,17 +98,6 @@ class AuthenticatorValidateStage(Stage):
def component(self) -> str:
return "ak-stage-authenticator-validate-form"
def get_throttling_factor(self, device_class: DeviceClasses) -> float | None:
if device_class == DeviceClasses.EMAIL:
return self.email_otp_throttling_factor
elif device_class == DeviceClasses.SMS:
return self.sms_otp_throttling_factor
elif device_class == DeviceClasses.TOTP:
return self.totp_otp_throttling_factor
elif device_class == DeviceClasses.STATIC:
return self.static_otp_throttling_factor
return None
class Meta:
verbose_name = _("Authenticator Validation Stage")
verbose_name_plural = _("Authenticator Validation Stages")
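The removed `DeviceClasses.from_model_label` derived the enum member from a Django model label by dropping the app label and the trailing `device` suffix, then upper-casing the remainder. The same parsing in isolation (hypothetical free function returning the member name as a string):

```python
def class_from_model_label(model_label: str) -> str:
    # "authentik_stages_authenticator_totp.totpdevice" -> "TOTP"
    model_name = model_label.rsplit(".", maxsplit=1)[-1]
    return model_name[: -len("device")].upper()

assert class_from_model_label("authentik_stages_authenticator_totp.totpdevice") == "TOTP"
assert class_from_model_label("authentik_stages_authenticator_sms.smsdevice") == "SMS"
```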

Some files were not shown because too many files have changed in this diff.