Compare commits


168 Commits

Author SHA1 Message Date
Marc 'risson' Schmitt
49b46b5338 ci: pull latest changes before tagging new version (cherry-pick #20413 to version-2025.8) (#20417) 2026-02-19 14:32:34 +01:00
authentik-automation[bot]
cfcdc70542 root: do not rely on npm cli for version bump (cherry-pick #20276 to version-2025.8) (#20318)
root: do not rely on npm cli for version bump

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2026-02-17 13:02:29 +01:00
Marc 'risson' Schmitt
471f0d65e0 ci: fix binary outpost build on release (cherry-pick #20248 to version-2025.8) (#20282)
fix binary outpost build on release (#20248)
2026-02-13 13:38:45 +01:00
authentik-automation[bot]
3c0731ab6d website/docs: 2025.8.6 release notes (cherry-pick #20243 to version-2025.8) (#20254)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2026-02-12 16:57:03 +01:00
authentik-automation[bot]
bce6ba2afa release: 2025.8.6 2026-02-12 15:27:43 +00:00
Marc 'risson' Schmitt
1b490c1b5a website/docs: fix lint
Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2026-02-12 15:41:11 +01:00
Marc 'risson' Schmitt
9a038aa0fe lib: fix lint
Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2026-02-12 15:41:07 +01:00
authentik-automation[bot]
c80112e5f8 security: CVE-2026-25227 (#20233)
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2026-02-12 15:24:09 +01:00
authentik-automation[bot]
65ae736fe2 security: CVE-2026-25748 (#20234)
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2026-02-12 15:23:47 +01:00
authentik-automation[bot]
271e7eae1c security: CVE-2026-25922 (#20235)
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2026-02-12 15:20:41 +01:00
authentik-automation[bot]
8bf66b8987 website/docs: generate CVE sidebar (cherry-pick #20098 to version-2025.8) (#20099)
website/docs: generate CVE sidebar (#20098)

* website/docs: generate CVE sidebar

* docs

* slightly less warnings

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2026-02-08 17:09:40 +01:00
Marc 'risson' Schmitt
6f9369283c root: update client-go generation (cherry-pick #19762 and #19906 to version-2025.8) (#19934)
* root: update client-go generation (cherry-pick #19762 to version-2025.12) (#19791)

Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>

* ci: always generate API clients (#19906)

* ci: always generate API clients

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix missing respective actions

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix flaky test

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* mount generated client

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* sigh

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* ci: fix test_docker.sh

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>

* bump go

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix dind

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
Co-authored-by: Jens L. <jens@goauthentik.io>
2026-02-07 23:00:10 +01:00
authentik-automation[bot]
859ef3a722 website/docs: Remove stale 2024 version directives (cherry-pick #19888 to version-2025.8) (#20021)
* Cherry-pick #19888 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #19888
Original commit: 469bc0b6b4

* fix conflicts

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* sigh

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* not fail

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Teffen Ellis <592134+GirlBossRush@users.noreply.github.com>
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
2026-02-04 22:20:48 +01:00
authentik-automation[bot]
b47ed8894c release: 2025.8.5 2025-11-19 15:05:03 +00:00
authentik-automation[bot]
8c1f08fecb website/docs: add 2025.8.5 and 2025.10.2 release notes (cherry-pick #18268 to version-2025.8) (#18269)
Cherry-pick #18268 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #18268
Original commit: cd1693be28

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-11-19 15:30:29 +01:00
authentik-automation[bot]
cc676793b0 internal: Automated internal backport: 5000-sidebar.sec.patch to authentik-2025.8 (#18263)
Automated internal backport of patch 5000-sidebar.sec.patch to authentik-2025.8

Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-11-19 15:21:03 +01:00
authentik-automation[bot]
76815cd5b9 internal: Automated internal backport: 1487-invitation-expiry.sec.patch to authentik-2025.8 (#18261)
* Automated internal backport of patch 1487-invitation-expiry.sec.patch to authentik-2025.8

* lint

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
2025-11-19 15:10:10 +01:00
authentik-automation[bot]
57a378bbd0 internal: Automated internal backport: 1498-oauth2-cc-user-active.sec.patch to authentik-2025.8 (#18262)
Automated internal backport of patch 1498-oauth2-cc-user-active.sec.patch to authentik-2025.8

Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2025-11-19 15:01:25 +01:00
Marcelo Elizeche Landó
f542a0d415 core: bump Django from 5.1.13 to 5.1.14 for 2025.8 (#17968)
* upgrade django from 5.1.13 to 5.1.14

* longer urls

* sync package-lock.json

* fix go deprecation for +build

* update package.lock
2025-11-11 12:20:45 +01:00
Jens L.
f69bbbe85e ci: fix migrate-from-stable for old versions (#18018)
* ci: fix migrate-from-stable for old versions

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* version from authentik

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* npm i

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* npm i on linux

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix outpost lint

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* gha notice

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* idk man I hate node

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-11-07 17:45:12 +01:00
Jens L.
0845e23337 ci: rework internal repo (#17797) (#17830)
* ci: rework internal repo (#17797)

* ci: rework internal repo

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* also fix retention workflow

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
# Conflicts:
#	.github/workflows/gh-ghcr-retention.yml
#	.github/workflows/repo-mirror-cleanup.yml
#	.github/workflows/repo-mirror.yml

* fix package

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-11-01 20:08:15 +01:00
authentik-automation[bot]
de6d2a02b2 website/docs: update SAML provider docs (cherry-pick #15887 to version-2025.8) (#17583)
* Cherry-pick #15887 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #15887
Original commit: 83603a528f

* Fix sidebar conflict

* Remove SAML logout link

---------

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: dewi-tik <dewi@goauthentik.io>
2025-10-22 18:37:48 +02:00
authentik-automation[bot]
73be5321a9 website: add powershell syntax highlighting and bump package (cherry-pick #16683) (#16721)
website: add powershell syntax highlighting and bump package (#16683)

Add powershell syntax highlighting and bump package

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-10-10 13:43:24 +02:00
authentik-automation[bot]
ff8ad31a6b website/docs: add email config section (cherry-pick #16727 to version-2025.8) (#17364)
website/docs: add email config section (#16727)

* Add email section and link to it from install guide

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* WIP

* Typo

* WIP

* Apply suggestion

* Added TLS email config

* Apply suggestions

* Apply suggestions

* fix linting

* fix broken anchor

* Apply suggestions

* Fix extra line

---------

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Marcelo Elizeche Landó <marcelo@goauthentik.io>
2025-10-09 22:19:40 +00:00
authentik-automation[bot]
ad2cedac98 website/docs: add entra id scim source (cherry-pick #17357 to version-2025.8) (#17362)
* Cherry-pick #17357 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #17357
Original commit: 2e03270dcf

* Conflict fix

---------

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-10-09 18:40:47 +00:00
authentik-automation[bot]
7a0bd0d0c1 web/admin: fix incorrect placeholder for scim provider (cherry-pick #17308 to version-2025.8) (#17309)
* Cherry-pick #17308 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #17308
Original commit: 88583ae46b

* Update LDAPProviderFormForm.ts

Signed-off-by: Jens L. <jens@goauthentik.io>

* Update LDAPProviderFormForm.ts

Signed-off-by: Jens L. <jens@goauthentik.io>

---------

Signed-off-by: Jens L. <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-10-08 16:54:18 +02:00
authentik-automation[bot]
269551cf4d blueprints: ensure tasks retry on database errors (cherry-pick #17333 to version-2025.8) (#17334)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-10-08 16:48:04 +02:00
authentik-automation[bot]
23a6ab915b lib/sync/outgoing: revert reduce number of db queries made (revert #14177) (cherry-pick #17306 to version-2025.8) (#17330)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-10-08 11:30:08 +00:00
authentik-automation[bot]
0c77e5c33e web: Fix behavior for modals configured with closeAfterSuccessfulSubmit (cherry-pick #17277 to version-2025.8) (#17299)
web: Fix behavior for modals configured with closeAfterSuccessfulSubmit (#17277)

When a form inside a modal submits successfully, it dispatches an EVENT_REFRESH event that bubbles up through the DOM; parent components like TablePage listen for this event to refresh their data. When the parent component re-renders in response to EVENT_REFRESH, it destroys and recreates the entire row, including the modal element, which causes the modal to disappear even though the ModalForm component never explicitly closed it.

Co-authored-by: Dominic R <dominic@sdko.org>
2025-10-07 13:58:44 -04:00
authentik-automation[bot]
93aeae5405 core: fix absolute and relative path file uploads (cherry-pick #17269 to version-2025.8) (#17272)
core: fix absolute and relative path file uploads (#17269)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-10-06 19:05:09 +02:00
authentik-automation[bot]
cd3ff47221 tasks/middlewares/messages: make sure exceptions are always logged (cherry-pick #17237 to version-2025.8) (#17248)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-10-04 16:42:26 +02:00
authentik-automation[bot]
a370a54db8 packages/django-dramatiq-postgres: fix error when updating task with no changes (cherry-pick #16728 to version-2025.8) (#17238)
packages/django-dramatiq-postgres: fix error when updating task with no changes (#16728)

Co-authored-by: Jens L. <jens@goauthentik.io>
2025-10-03 17:12:02 +02:00
authentik-automation[bot]
1d0e45abe8 packages/django-dramatiq-postgres: broker: fix task expiration (cherry-pick #17178 to version-2025.8) (#17217)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-10-02 14:49:57 +02:00
authentik-automation[bot]
a501b627eb build(deps): bump django from 5.1.12 to 5.1.13 (cherry-pick #17198 to version-2025.8) (#17199)
build(deps): bump django from 5.1.12 to 5.1.13 (#17198)

* build(deps): bump django from 5.1.12 to 5.1.13

Bumps [django](https://github.com/django/django) from 5.1.12 to 5.1.13.
- [Commits](https://github.com/django/django/compare/5.1.12...5.1.13)

---
updated-dependencies:
- dependency-name: django
  dependency-version: 5.1.13
  dependency-type: direct:production
...

* lock

---------

Signed-off-by: dependabot[bot] <support@github.com>
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
2025-10-02 11:57:19 +02:00
authentik-automation[bot]
78a4c08fc8 website/docs: developer docs: adjust sentence for writing docs (cherry-pick #17137 to version-2025.8) (#17142)
website/docs: developer docs: adjust sentence for writing docs (#17137)

As per Tana's request

Signed-off-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-30 15:52:22 +00:00
authentik-automation[bot]
8d3a289d12 release: 2025.8.4 2025-09-30 00:02:15 +00:00
Jens Langhammer
99e92bf998 root: fix rustup error during build when buildcache version is outdated
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-09-30 01:32:01 +02:00
authentik-automation[bot]
1df7b22d29 release: 2025.8.4 2025-09-29 21:52:52 +00:00
authentik-automation[bot]
af484738f6 website/docs: 2025.8.4 release notes (cherry-pick #17119 to version-2025.8) (#17120) 2025-09-29 23:44:03 +02:00
authentik-automation[bot]
19ba3c8453 tasks: reduce default number of retries and max backoff (cherry-pick #17107 to version-2025.8) (#17109)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-29 17:33:19 +00:00
authentik-automation[bot]
9d1b384baa packages/django-dramatiq-postgres: broker: fix new messages not being picked up when too many messages are waiting (cherry-pick #17106 to version-2025.8) (#17108)
packages/django-dramatiq-postgres: broker: fix new messages not being picked up when too many messages are waiting (#17106)

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-29 17:11:11 +00:00
authentik-automation[bot]
8a702b148f providers/oauth2: fix authentication error with identical app passwords (cherry-pick #17100 to version-2025.8) (#17103)
providers/oauth2: fix authentication error with identical app passwords (#17100)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-29 17:29:14 +02:00
authentik-automation[bot]
2db42a3a04 stages/identification: fix mismatched error messages (cherry-pick #17090 to version-2025.8) (#17104)
stages/identification: fix mismatched error messages (#17090)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-29 17:29:03 +02:00
authentik-automation[bot]
4403baaa28 outposts/ldap: add pwdChangeTime attribute (cherry-pick #17010 to version-2025.8) (#17101)
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
fix ldap tests following #17010 (#17021)
2025-09-29 15:44:35 +02:00
authentik-automation[bot]
8030d4d734 cmd/server/healthcheck: info log success instead of debug (cherry-pick #17093 to version-2025.8) (#17097)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-29 15:26:48 +02:00
authentik-automation[bot]
4b7a78453c web: Fix layout class for row in LibraryPage (cherry-pick #16752 to version-2025.8) (#17091)
web: Fix layout class for 'row' in LibraryPage (#16752)

Fix layout class for 'row' in LibraryPage

Signed-off-by: Jérôme W. <jerome@wnetworks.org>
Co-authored-by: Jérôme W. <jerome@wnetworks.org>
2025-09-29 14:56:15 +02:00
authentik-automation[bot]
b9746106a0 rbac: optimize rbac assigned by users query (cherry-pick #17015 to version-2025.8) (#17092)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-29 12:47:45 +00:00
authentik-automation[bot]
faa47670ef web/admin: fix federation sources automatically selected (cherry-pick #17069 to version-2025.8) (#17070)
web/admin: fix federation sources automatically selected (#17069)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-28 20:30:19 +02:00
authentik-automation[bot]
add5f4e0cb tasks: fix logger name (cherry-pick #17009 to version-2025.8) (#17060)
* tasks: fix logger name (#17009)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* fix tests

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-27 02:33:23 +02:00
authentik-automation[bot]
7636c038d9 */bindings: order by pk (cherry-pick #17027) (#17053)
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-26 18:37:29 +02:00
authentik-automation[bot]
557f781f5c lib/config: fix listen settings (cherry-pick #17005) (#17023)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
fix listen settings (#17005)
2025-09-26 16:59:23 +02:00
authentik-automation[bot]
1997011328 website/docs: Update Github expression to handle non-OAuth sources gracefully (cherry-pick #17014) (#17029)
website/docs: Update Github expression to handle non-OAuth sources gracefully (#17014)

Co-authored-by: Patrick <tradiuz@gmail.com>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-26 14:24:10 +02:00
authentik-automation[bot]
dfdd5ebc8c lib: match exception_to_dict locals behaviour (cherry-pick #17006) (#17016)
lib: match exception_to_dict locals behaviour (#17006)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-25 17:53:36 +02:00
authentik-automation[bot]
42977caa6a core: add index on Group.is_superuser (cherry-pick #17011) (#17017)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-25 15:29:15 +00:00
authentik-automation[bot]
bce46c880d rbac: fix typo (cherry-pick #16476) (#17018)
Co-authored-by: Simonyi Gergő <28359278+gergosimonyi@users.noreply.github.com>
fix typo (#16476)
2025-09-25 15:12:03 +00:00
authentik-automation[bot]
9c987c5694 website/docs: improve discord policies when also bound to non-oauth sources (cherry-pick #17008) (#17012)
website/docs: improve discord policies when also bound to non-oauth sources (#17008)

Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-25 15:53:40 +02:00
authentik-automation[bot]
f33e50c28c website/docs: add missing oauth endpoints (cherry-pick #16995) (#16999)
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-25 15:00:21 +02:00
authentik-automation[bot]
280d6febf7 website/docs: oauth provider: Add device and introspect to reserved slugs (cherry-pick #16994) (#16998)
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-25 14:59:56 +02:00
authentik-automation[bot]
3a32918ceb providers/ldap: add include_children parameter to cached search mode (cherry-pick #16918) (#17000)
Co-authored-by: Daniel Adu-Gyan <26229521+danieladugyan@users.noreply.github.com>
2025-09-25 12:58:24 +00:00
authentik-automation[bot]
723bd4bbbc website/developer docs: What domain for what doc version (cherry-pick #16987) (#16996)
website/developer docs: What domain for what doc version (#16987)

* website/developer docs: What domain for what doc version

Closes: AUTH-1316

* Apply suggestions from code review

---------

Signed-off-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-25 08:35:30 -04:00
authentik-automation[bot]
7b130e1832 website/docs: update url to docker-compose.yml (cherry-pick #16901) (#16986)
website/docs: update url to docker-compose.yml (#16901)

Update URL of docker-compose.yml from "https://goauthentik.io/docker-compose.yml" to "https://docs.goauthentik.io/docker-compose.yml"

Co-authored-by: Nuno Alves <nuno.alves02@gmail.com>
2025-09-24 17:41:40 -04:00
authentik-automation[bot]
3aa969d64e website/docs: add docs for source switch expression policy (cherry-pick #16878) (#16972)
website/docs: add docs for source switch expression policy (#16878)

* draft for source switch docs

* add python

* add sidebar entry

* Update website/docs/customize/policies/expression/source_switch.md

* Update website/docs/customize/policies/index.md

* Update website/docs/customize/policies/working_with_policies.md

* Update website/docs/customize/policies/working_with_policies.md

* add section about binding the policy

* tweak

* Update website/docs/customize/policies/index.md

* Update website/docs/customize/policies/working_with_policies.md

* Update website/docs/customize/policies/working_with_policies.md

* Update website/docs/customize/policies/working_with_policies.md

* Update website/docs/customize/policies/working_with_policies.md

* Update website/docs/customize/policies/expression.mdx

---------

Signed-off-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-24 13:24:33 +01:00
authentik-automation[bot]
04688c86b3 website/docs: random typo fixes (cherry-pick #16956) (#16959)
website/docs: random typo fixes (#16956)

Co-authored-by: Jared Harrison <50091069+Jared-Is-Coding@users.noreply.github.com>
2025-09-23 23:57:14 +02:00
authentik-automation[bot]
4f8dbb5efb website/docs: fix capitalization (cherry-pick #16944) (#16958)
website/docs: fix capitalization (#16944)

* fix capitalization

* tweak

* Update website/docs/add-secure-apps/outposts/manual-deploy-docker-compose.md

---------

Signed-off-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.io>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-23 21:15:57 +01:00
authentik-automation[bot]
3b382199eb website/docs: 2025.8: fix worker concurrency setting rename (cherry-pick #16946) (#16950)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
fix worker concurrency setting rename (#16946)
2025-09-23 19:52:20 +02:00
authentik-automation[bot]
db50991e9c providers/scim: fix string formatting for SCIM user filter (cherry-pick #16465) (#16852)
providers/scim: fix string formatting for SCIM user filter (#16465)

* providers/scim: fix string formatting for SCIM user filter

* format

---------

Signed-off-by: Adam Shirt <adamshirt@outlook.com>
Co-authored-by: Adam Shirt <adamshirt@outlook.com>
Co-authored-by: Jens Langhammer <jens.langhammer@beryju.org>
2025-09-17 13:48:18 +02:00
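The SCIM filter fix above comes down to how string values are quoted: per RFC 7644, string literals in SCIM filter expressions follow JSON string rules, so embedded backslashes and double quotes must be escaped rather than interpolated naively. The actual change in #16465 is not reproduced here; the following is a minimal sketch of the idea, with hypothetical helper names (`scim_quote`, `user_filter`):

```python
# Minimal sketch of building a SCIM filter with a safely quoted string value,
# per RFC 7644 section 3.4.2.2. Helper names are illustrative, not authentik's code.
def scim_quote(value: str) -> str:
    """Escape backslashes and double quotes inside a SCIM string literal."""
    return value.replace("\\", "\\\\").replace('"', '\\"')

def user_filter(user_name: str) -> str:
    """Build a userName equality filter instead of naive string interpolation."""
    return f'userName eq "{scim_quote(user_name)}"'

print(user_filter('alice "admin" example'))
```

With naive interpolation, a userName containing a double quote would terminate the string literal early and produce an invalid (or worse, attacker-influenced) filter expression.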
authentik-automation[bot]
88036ebe14 website/docs: improve user ref doc (cherry-pick #16779) (#16839)
website/docs: improve user ref doc (#16779)

* WIP

* WIP

* Update website/docs/users-sources/user/user_ref.mdx

* Language change

* Update website/docs/users-sources/user/user_ref.mdx

---------

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
2025-09-16 22:17:41 +00:00
authentik-automation[bot]
680feaefa1 release: 2025.8.3 2025-09-16 15:09:16 +00:00
authentik-automation[bot]
8676cd3a43 website/docs: 2025.8.3 release notes (cherry-pick #16809) (#16810)
website/docs: 2025.8.3 release notes (#16809)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-16 16:40:32 +02:00
authentik-automation[bot]
53e0f6b734 stages/email_authenticator: Fix email mfa loop (cherry-pick #16579) (#16807)
stages/email_authenticator: Fix email mfa loop (#16579)

* Return error message instead of infinite loop

* Remove unused code

* check for existing email device

* revert to initial behaviour

* Add test for failing when the user already has an email device

Co-authored-by: Marcelo Elizeche Landó <marcelo@goauthentik.io>
2025-09-16 16:20:34 +02:00
authentik-automation[bot]
eaee475662 website/docs: update create oauth provider page (cherry-pick #16617) (#16806)
website/docs: update create oauth provider page (#16617)

* Updated the page to be more consistent with upcoming changes to the saml page

* Add note

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-16 13:33:53 +00:00
authentik-automation[bot]
19b672b3bc lifecycle: fix permission error when running worker as root (cherry-pick #16735) (#16784)
lifecycle: fix permission error when running worker as root (#16735)

* lifecycle: fix permission error when running worker as root

* fix maybe?

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-16 14:49:31 +02:00
authentik-automation[bot]
bf0a31ce86 website/docs: fix docker tabs not rendering properly (cherry-pick #16799) (#16802)
website/docs: fix docker tabs not rendering properly (#16799)

docs: fix docker tabs not rendering properly

Co-authored-by: Connor Peshek <connor@connorpeshek.me>
2025-09-16 11:00:27 +01:00
authentik-automation[bot]
28ff561400 release: 2025.8.2 2025-09-15 15:47:14 +00:00
authentik-automation[bot]
ff2472a551 website/docs: 2025.8.2 release notes (cherry-pick #16773) (#16778)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-15 17:15:44 +02:00
authentik-automation[bot]
dac302e8be sources/oauth/entra_id: do not assume group_id comes from entra (cherry-pick #16456) (#16777)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-09-15 16:43:34 +02:00
authentik-automation[bot]
930a6f7c6f lib/logging: only show locals when in debug mode (cherry-pick #16772) (#16774)
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-15 15:34:52 +02:00
authentik-automation[bot]
b661b0bb39 website/docs: update ssh rac doc (cherry-pick #16695) (#16720)
website/docs: update ssh rac doc (#16695)

* Added linebreak preservation and changed blocks to yaml syntax

* Update website/docs/add-secure-apps/providers/rac/rac-public-key.md

* Update website/docs/add-secure-apps/providers/rac/rac-public-key.md

---------

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-12 16:05:57 +00:00
authentik-automation[bot]
5203713ca0 website/docs: re-fix sentence about Go (backport of #16736) (#16737)
* Cherry-pick #16736 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #16736
Original commit: 515a065831

* fixed conflict

---------

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.io>
2025-09-12 03:33:56 -05:00
authentik-automation[bot]
94794b106e web: Docker versioning compatibility (cherry-pick #16139) (#16732)
web: Docker versioning compatibility (#16139)

* website/docs: update frontend dev docs to always use latest dev images

* website: Clarify instructions.

---------

Signed-off-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
Co-authored-by: Teffen Ellis <teffen@goauthentik.io>
2025-09-11 14:39:12 -05:00
authentik-automation[bot]
eb8c21cf04 website/docs: dev-docs: Minimal landing + landing for dev env (cherry-pick #15246) (#16734)
website/docs: dev-docs: Minimal landing + landing for dev env (#15246)

* wip

* Update index.mdx

* Update website/docs/developer-docs/contributing.md

* Update website/docs/developer-docs/contributing.md

---------

Signed-off-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-11 14:06:48 -05:00
authentik-automation[bot]
8a501377f2 website/docs: fix typos (cherry-pick #16716) (#16719)
website/docs: fix typos (#16716)

Fix typos

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-11 14:17:33 +00:00
authentik-automation[bot]
5c67a0fecd website/docs: moves display source notes (cherry-pick #16704) (#16718)
website/docs: moves display source notes (#16704)

Moves display source note location to better location

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-11 13:37:43 +00:00
authentik-automation[bot]
145e6a3a4f website/docs: clarify docker compose install (cherry-pick #16696) (#16699)
website/docs: clarify docker compose install (#16696)

* Change order

* WIP

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-10 21:28:56 +01:00
authentik-automation[bot]
0219ed73f5 website/docs: add missing notification rule example doc to sidebar (cherry-pick #16478) (#16479)
website/docs: add missing notification rule example doc to sidebar (#16478)

Adds missing doc to sidebar

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-10 15:54:29 +00:00
authentik-automation[bot]
88ccb7857b website: Redirect Azure to Entra. Add tags for search indexing. (cherry-pick #16474) (#16483)
website: Redirect Azure to Entra. Add tags for search indexing. (#16474)

* website: Add Azure redirects, tags for  search indexing.

* website: Fix redirects tab alignment.

Co-authored-by: Teffen Ellis <592134+GirlBossRush@users.noreply.github.com>
2025-09-10 14:38:57 +00:00
authentik-automation[bot]
e33d6becba website/docs: add rate limiting info to Email stage docs (cherry-pick #16668) (#16691)
website/docs: add rate limiting info to Email stage docs (#16668)

* add rate limiting info

* added Jens' edits

* Update website/docs/add-secure-apps/flows-stages/stages/email/index.mdx

* Update website/docs/add-secure-apps/flows-stages/stages/email/index.mdx

* Update website/docs/add-secure-apps/flows-stages/stages/email/index.mdx

---------

Signed-off-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-10 10:35:15 +00:00
authentik-automation[bot]
41417affc0 website/docs: fix typo (cherry-pick #16681) (#16682)
website/docs: fix typo (#16681)

Fix typo

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-09 13:28:51 +00:00
authentik-automation[bot]
1b4a6c3f6d tasks: fix status and healthcheck breaking with connection issues (cherry-pick #16504) (#16673)
tasks: fix status and healthcheck breaking with connection issues (#16504)

* add high priority for inline tasks in API

* also catch psycopg errors directly

* retry locking worker state after failure

* format

---------

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-08 17:04:17 -05:00
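The bullet points above sketch the shape of the fix: database connection errors (psycopg) are caught directly, and locking the worker state is retried after a failure. A generic retry helper along those lines could look like this; `ConnectionGone` is a stand-in exception for illustration, not authentik's actual error type:

```python
import time

class ConnectionGone(Exception):
    """Stand-in for a database connection error (e.g. psycopg's OperationalError)."""

def retry(fn, attempts=3, delay=0.0, exceptions=(ConnectionGone,)):
    """Call fn, retrying on the given exceptions; re-raise after the last attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exceptions:
            if attempt == attempts:
                raise
            time.sleep(delay)  # back off before trying again

# Example: a "lock worker state" call that fails twice, then succeeds.
calls = {"n": 0}
def lock_worker_state():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionGone("connection lost")
    return "locked"

print(retry(lock_worker_state))  # → locked
```

The point of catching the driver's exceptions directly (rather than only an ORM wrapper) is that transient connection drops surface as retryable failures instead of breaking the status and healthcheck endpoints outright.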
authentik-automation[bot]
aa98edd661 providers/scim: improve error message when object fails to sync (cherry-pick #16625) (#16643)
providers/scim: improve error message when object fails to sync (#16625)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-05 15:14:20 +02:00
authentik-automation[bot]
cb70331c82 providers/oauth2: add missing exp claim for logout token (cherry-pick #16593) (#16598)
providers/oauth2: add missing exp claim for logout token (#16593)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-04 00:23:20 +02:00
authentik-automation[bot]
31e105b190 web/flows: only disable login button when interactive captcha is configured and not loaded (cherry-pick #16586) (#16590)
web/flows: only disable login button when interactive captcha is configured and not loaded (#16586)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-09-03 18:26:33 +02:00
authentik-automation[bot]
08d2615f71 website: Update license text to use "directory" (cherry-pick #16589) (#16592)
website: Update license text to use "directory" (#16589)

To remove any potential confusion

Signed-off-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-03 16:08:50 +00:00
authentik-automation[bot]
bb9e8b1c42 core: bump django from 5.1.11 to v5.1.12 (cherry-pick #16584) (#16591)
core: bump django from 5.1.11 to v5.1.12 (#16584)

bump django from 5.1.11 to 5.1.12

Co-authored-by: Marcelo Elizeche Landó <marcelo@goauthentik.io>
2025-09-03 17:39:33 +02:00
authentik-automation[bot]
6b8d7376a6 sources/ldap: fix malformed filter error with special characters in group DN (cherry-pick #16243) (#16585)
sources/ldap: fix malformed filter error with special characters in group DN (#16243)

* sources/ldap: fix malformed filter error with special characters in group DN

Escape special characters in LDAP group DN when constructing membership filters using ldap3's escape_filter_chars() function to prevent LDAPInvalidFilterError exceptions.

* sources/ldap: add tests for special characters in group DN filter escaping

Add comprehensive tests to verify that LDAP group DN values with special characters (parentheses, backslashes, asterisks) are properly escaped when constructing LDAP filters. Tests cover both the membership synchronization process and the escape_filter_chars function directly to ensure malformed filter errors are prevented.

* sources/ldap: fix line length for ruff linting compliance

Split long line in group filter construction to comply with 100 character limit.

* sources/ldap: move escape_filter_chars import to top-level in tests; audit other filter usages

---------

Co-authored-by: Dongwoo Jeong <dongwoo@duck.com>
Co-authored-by: Dongwoo Jeong <dongwoo.jeong@lge.com>
2025-09-03 17:20:34 +02:00
authentik-automation[bot]
4f680d8c06 root: clean up README (cherry-pick #16286) (#16573)
root: clean up README (#16286)

* wip





* Delete website/LICENSE



---------

Signed-off-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
2025-09-03 15:49:26 +02:00
authentik-automation[bot]
3eeb741975 website: Add text copy of license (cherry-pick #16559) (#16574)
website: Add text copy of license (#16559)

* website: Add License

According to the root LICENSE, the website is licensed under Creative Commons: CC BY-SA 4.0 license. To match the root of the project and the EE dir, a text copy of the license is now included.

Updates AUTH-1246



* Update LICENSE



* Create LICENSE for proprietary logo assets

Added license information for proprietary assets.



---------

Signed-off-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-03 09:56:28 +00:00
authentik-automation[bot]
39cb638132 website/docs: remove base providers redirect. (cherry-pick #16576) (#16580)
website/docs: remove base providers redirect. (#16576)

Co-authored-by: Connor Peshek <connor@connorpeshek.me>
2025-09-03 10:26:19 +01:00
authentik-automation[bot]
be8f5d21cb website/docs: changes all -> arrows to > (cherry-pick #16569) (#16571)
website/docs: changes all -> arrows to > (#16569)

Changes all -> arrows to > as per style guide. Also fixes some bolding and capitalization

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-02 22:27:26 +00:00
authentik-automation[bot]
acf18836e8 website/docs: update external user information (cherry-pick #16493) (#16568)
website/docs: update external user information (#16493)

Updated information

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-02 14:09:52 +00:00
authentik-automation[bot]
d688621de4 website/docs: improve customize your instance page layout (cherry-pick #16367) (#16566)
website/docs: improve customize your instance page layout (#16367)

* Changes bullet points to a table. WIP

* Finished sentence and added punctuation

* Updated to match other overview/landing pages

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-09-02 14:27:14 +01:00
authentik-automation[bot]
5173d09191 website/docs: update how to rac and outpost landing page (cherry-pick #16442) (#16498)
website/docs: update how to rac and outpost landing page (#16442)

* Adds outpost deployment to how to rac guide and updates the outpost landing page

* Applied suggestions

* Apply suggestion

* Update website/docs/add-secure-apps/providers/rac/how-to-rac.md




* Update website/docs/add-secure-apps/outposts/index.mdx



* Removed cards

---------

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-01 08:00:48 -05:00
authentik-automation[bot]
45bab4d32f website/integrations: bitwarden: fix ent notice (cherry-pick #16502) (#16511)
website/integrations: bitwarden: fix ent notice (#16502)

Signed-off-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-09-01 08:27:39 +00:00
authentik-automation[bot]
f383e54c72 policies: remove object pk from authentik_policies_execution_time to reduce cardinality (cherry-pick #16500) (#16501)
policies: remove object pk from authentik_policies_execution_time to reduce cardinality (#16500)

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

#16386

Co-authored-by: Jens L. <jens@goauthentik.io>
2025-08-31 14:16:24 +01:00
authentik-automation[bot]
ab6b9b27cc website: Page redirect guide, documentation (backport of #16466) (#16475)
* website/docs: merge docs development environment and writing docs pages (#16402)

* Merges the docs development environment doc into the writing documentation guide to avoid duplication. Also updates the formatting of the writing docs page to make it easier to copy commands.

* Updates sidebar

* Update website/docs/developer-docs/docs/writing-documentation.md

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Signed-off-by: Dewi Roberts <dewi@goauthentik.io>

* Update website/docs/developer-docs/docs/writing-documentation.md

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Signed-off-by: Dewi Roberts <dewi@goauthentik.io>

* Update website/docs/developer-docs/docs/writing-documentation.md

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Signed-off-by: Dewi Roberts <dewi@goauthentik.io>

* Update language on -watch commands

---------

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>

* website: Unify Netlify redirects with Docusaurus's client-side router. (backport of #16430) (#16472)

Cherry-pick #16430 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #16430
Original commit: 6d81aea5aa

Co-authored-by: Teffen Ellis <592134+GirlBossRush@users.noreply.github.com>

* Cherry-pick #16466 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #16466
Original commit: 66e17fe280

---------

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
Co-authored-by: Teffen Ellis <592134+GirlBossRush@users.noreply.github.com>
2025-08-29 15:13:32 +00:00
authentik-automation[bot]
db3fb0bf2e website: Unify Netlify redirects with Docusaurus's client-side router. (backport of #16430) (#16472)
Cherry-pick #16430 to version-2025.8 (with conflicts)

This cherry-pick has conflicts that need manual resolution.

Original PR: #16430
Original commit: 6d81aea5aa

Co-authored-by: Teffen Ellis <592134+GirlBossRush@users.noreply.github.com>
2025-08-29 14:08:36 +00:00
authentik-automation[bot]
5859e6a5e5 core: fix client-side only validation allowing admin to set blank user password (cherry-pick #16467) (#16469)
Co-authored-by: Jens L. <jens@goauthentik.io>
fix client-side only validation allowing admin to set blank user password (#16467)
2025-08-29 15:20:46 +02:00
authentik-automation[bot]
1d3271fec7 lib/sync/outgoing: fix single object sync timeout (cherry-pick #16447) (#16453)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
fix single object sync timeout (#16447)
2025-08-28 19:35:33 +02:00
authentik-automation[bot]
673c8ef62c core: bump h2 from 4.2.0 to 4.3.0 (cherry-pick #16446) (#16449)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
2025-08-28 17:14:51 +02:00
authentik-automation[bot]
4614ae320f website/docs: capitalized proper name of stages, removed old version references. (cherry-pick #16414) (#16450)
website/docs: capitalized proper name of stages, removed old version references. (#16414)

* capitalized proper name of stages, removed old version references.

* ken and dominics edits

---------

Co-authored-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.io>
2025-08-28 09:52:59 -05:00
authentik-automation[bot]
10b103c0bf lib/sync: fix missing f for string (cherry-pick #16423) (#16427)
Co-authored-by: Jens L. <jens@goauthentik.io>
fix missing f for string (#16423)
2025-08-28 15:38:24 +02:00
authentik-automation[bot]
361e64a8a1 website/docs: update internal vs external user information (cherry-pick #16359) (#16421)
website/docs: update internal vs external user information (#16359)

* Update external vs internal user information

* Language updates

* Spelling

* Applied suggestion

Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-08-27 19:45:48 +01:00
authentik-automation[bot]
e56081b863 providers/rac: fix AuthenticatedSession migration (cherry-pick #16400) (#16419)
Co-authored-by: Simonyi Gergő <28359278+gergosimonyi@users.noreply.github.com>
fix `AuthenticatedSession` migration (#16400)
2025-08-27 18:43:31 +02:00
authentik-automation[bot]
01a44b281b root: fix security.md (cherry-pick #16345) (#16415)
Co-authored-by: Dominic R <dominic@sdko.org>
fix security.md (#16345)
2025-08-27 18:15:05 +02:00
authentik-automation[bot]
7a6631c6e8 website/docs: adds details to certificates doc (cherry-pick #16335) (#16394)
website/docs: adds details to certificates doc (#16335)

* Clarifies certs directory mounting and adds instruction for manually re-triggering discovery.

* Fixed mounting info

* Update website/docs/sys-mgmt/certificates.md




* Update website/docs/sys-mgmt/certificates.md




---------

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-08-27 16:14:45 +00:00
authentik-automation[bot]
ada973dd44 tasks/schedules: fix api search fields (cherry-pick #16405) (#16410)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
fix api search fields (#16405)
2025-08-27 17:04:32 +02:00
authentik-automation[bot]
3a7e962bde lifecycle: fix PROMETHEUS_MULTIPROC_DIR missing suffix (cherry-pick #16401) (#16407)
Co-authored-by: Marc 'risson' Schmitt <marc.schmitt@risson.space>
fix PROMETHEUS_MULTIPROC_DIR missing suffix (#16401)
2025-08-27 15:32:23 +02:00
authentik-automation[bot]
ae297e2f60 root: update security.md with github reporting link (cherry-pick #16332) (#16395)
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-08-27 14:31:06 +02:00
Teffen Ellis
2bc2b6bd41 website: cherry-pick 2025.8/client side redirects (#16372)
website: Fix stale links that depend on initial route not triggering redirects. (#16369)

website: Fix issue where stale links that depend on initial route are
not redirected.
2025-08-26 14:36:58 -05:00
Marc 'risson' Schmitt
8199371172 providers/oauth2: avoid deadlock during session migration (cherry-pick #16361) (#16364) 2025-08-25 18:29:53 +02:00
Marc 'risson' Schmitt
dac1879de5 website/docs: 2025.8.1 release notes (cherry-pick #16343) (#16344) 2025-08-22 16:53:58 +02:00
authentik-automation[bot]
dd7c6b29d9 release: 2025.8.1 2025-08-22 14:40:58 +00:00
Marc 'risson' Schmitt
41b7e05f59 packages/django-dramatiq-postgres: broker: fix various timing issues (cherry-pick #16340) (#16342)
fix various timing issues (#16340)
2025-08-22 16:08:53 +02:00
Teffen Ellis
c5f5714e02 website: Hotfix version 2025.8/website versioning (#16336)
Co-authored-by: Dominic R <dominic@sdko.org>
Fix version origin detection, build-time URLs  (#15774)
2025-08-22 16:08:34 +02:00
Marc 'risson' Schmitt
d20d8322af outposts: allow ingress path type configuration (cherry-pick #16339) (#16341) 2025-08-22 15:49:33 +02:00
Marc 'risson' Schmitt
288f5d5015 outposts: fix service connection update task arguments (cherry-pick #16312) (#16313) 2025-08-22 14:31:56 +02:00
Marc 'risson' Schmitt
a640eb9180 packages/django-dramatiq-postgres: middleware: fix listening on hosts where ipv6 is not supported (cherry-pick #16308) (#16305) 2025-08-22 14:26:49 +02:00
Marc 'risson' Schmitt
7d48baab3e core: use email backend for test_email management command (cherry-pick #16311) (#16337)
Co-authored-by: Marcelo Elizeche Landó <marcelo@goauthentik.io>
2025-08-22 14:25:06 +02:00
Tana M Berry
b1cd6d34fc website/docs: add link in 2025.8 rel notes to back-channel logout doc (cherry pick #16306) (#16318)
Co-authored-by: Tana M Berry <tana@goauthentik.io>
2025-08-21 22:35:36 +02:00
Teffen Ellis
f14d033cef web: Username truncation, field alignment. (cherry-pick #16283) (#16284) 2025-08-21 18:03:55 +02:00
Teffen Ellis
ec75e161e2 web: Fix form group slots (cherry-pick #16276) (#16277) 2025-08-21 17:56:36 +02:00
Teffen Ellis
2e8fb8f2c6 web: Improvements to ReCaptcha resizing (cherry-pick #16171) (#16281) 2025-08-21 17:56:11 +02:00
Teffen Ellis
362bf22139 web: Fix issue where clicking a list item scrolls container (cherry-pick #16174) (#16278) 2025-08-21 17:49:43 +02:00
Marc 'risson' Schmitt
890da9b287 lifecycle: set PROMETHEUS_MULTIPROC_DIR as early as possible (cherry-pick #16298) (#16302) 2025-08-21 16:48:50 +02:00
Marc 'risson' Schmitt
7b8dadf945 providers/oauth2: fix logout token missing sid, fix wrong sub mode used (cherry-pick #16295) (#16299)
fix logout token missing sid, fix wrong sub mode used (#16295)
2025-08-21 16:39:18 +02:00
Teffen Ellis
8f70dbb963 web: Fix hidden textarea required attribute. (cherry-pick #16168) (#16275)
Fix hidden textarea `required` attribute. (#16168)
2025-08-21 14:50:36 +02:00
Teffen Ellis
15505f5caf web: fix "Explore integrations" link in Quick actions (cherry-pick #16274) (#16285)
Co-authored-by: Max <mail@heavygale.de>
fix "Explore integrations" link in Quick actions (#16274)
2025-08-21 14:20:07 +02:00
Marc 'risson' Schmitt
b2d770c0a4 *: Fix dead doc link (cherry-pick #16288) (#16297)
Co-authored-by: Dominic R <dominic@sdko.org>
Fix dead doc link (#16288)
2025-08-21 14:10:33 +02:00
authentik-automation[bot]
f7e25fff1a release: 2025.8.0 2025-08-20 18:01:34 +00:00
Marc 'risson' Schmitt
688b3a9f8b tasks: add rel_obj to system task exception event (cherry-pick #16270) (#16272) 2025-08-20 19:31:02 +02:00
Marc 'risson' Schmitt
8f48e18854 website/docs: update 2025.8 release notes (cherry-pick #16269) (#16271) 2025-08-20 19:20:18 +02:00
Marc 'risson' Schmitt
306f75be59 security: Bump supported versions (cherry-pick #16261) (#16267)
Co-authored-by: Dominic R <dominic@sdko.org>
2025-08-20 19:17:08 +02:00
Tana M Berry
982c3cf4dc website/docs: sys-mgmt/s3: Clean up and improve (cherry-pick #16242) (#16266)
Co-authored-by: Dominic R <dominic@sdko.org>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-08-20 13:24:51 +02:00
Tana M Berry
156cda6cb6 website/docs: update Advanced Queries docs and Rel Notes with new Int Guide (cherry-pick #16191) (#16258)
website/docs: Advanced queries, remove reference to QL and add more examples (#16191)

* remove reference to QL

* add Jens' examples

* tweak

* Update website/docs/users-sources/user/user_basic_operations.md




* Update website/docs/users-sources/user/user_basic_operations.md




* add note about UX ticks

* tweak

* argh

* clarify there are more values

* add link to Event actions list

* tweaks, typo

* Update website/docs/users-sources/user/user_basic_operations.md




* Update website/docs/sys-mgmt/events/logging-events.md




* jens edits

---------

Signed-off-by: Tana M Berry <tanamarieberry@yahoo.com>
Co-authored-by: Tana M Berry <tana@goauthentik.io>
Co-authored-by: Jens L. <jens@goauthentik.io>
Co-authored-by: Dewi Roberts <dewi@goauthentik.io>
2025-08-20 04:36:47 -05:00
authentik-automation[bot]
9f5125cf6b release: 2025.8.0-rc7 2025-08-18 18:44:42 +00:00
Marc 'risson' Schmitt
a84411363e web: Fix ak-flow-card footer alignment. (cherry-pick #16236) (#16238)
Co-authored-by: Teffen Ellis <592134+GirlBossRush@users.noreply.github.com>
Fix ak-flow-card footer alignment. (#16236)
2025-08-18 20:15:21 +02:00
Marc 'risson' Schmitt
9f3bb0210b brands: revert sort matched brand by match length (revert #15413) (cherry-pick #16233) (#16235) 2025-08-18 19:28:19 +02:00
Marc 'risson' Schmitt
d1271502ef web: bump API Client version (cherry-pick #16203) (#16226)
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2025-08-18 16:42:35 +02:00
Marc 'risson' Schmitt
d1065b2d49 policies/password: Fix amount_uppercase in password policy check (cherry-pick #16197) (#16228)
Co-authored-by: M-Slanec <mcslanec@gmail.com>
Co-authored-by: Matthew Slanec <matthewslanec@Matthews-MacBook-Pro.local>
Fix amount_uppercase in password policy check (#16197)
2025-08-18 16:42:26 +02:00
Marc 'risson' Schmitt
aa19227e30 web/a11y: QL Search Input (cherry-pick #16198) (#16229)
Co-authored-by: Teffen Ellis <592134+GirlBossRush@users.noreply.github.com>
2025-08-18 16:41:50 +02:00
Marc 'risson' Schmitt
d7a2861bbe stages/authenticator_webauthn: Update FIDO MDS3 & Passkey aaguid blobs (cherry-pick #16196) (#16227)
Co-authored-by: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
2025-08-18 16:41:33 +02:00
Marc 'risson' Schmitt
5aae9c9afa core: Add email template selector (cherry-pick #16170) (#16225)
Co-authored-by: Marcelo Elizeche Landó <marcelo@goauthentik.io>
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
2025-08-18 16:41:22 +02:00
Tana M Berry
93cb48c928 website/docs: add content about new Advanced Query searches (cherry-pick #16019) (#16188)
Co-authored-by: Tana M Berry <tana@goauthentik.io>
2025-08-14 18:32:05 +02:00
Marc 'risson' Schmitt
55f7f93a24 web/admin: fix settings saving (cherry-pick #16184) (#16187)
Co-authored-by: Jens Langhammer <jens@goauthentik.io>
2025-08-14 13:02:30 +00:00
authentik-automation[bot]
5a608a4235 release: 2025.8.0-rc6 2025-08-14 11:29:06 +00:00
Marc 'risson' Schmitt
77d023758f tasks: add sentry dramatiq integration (cherry-pick #16167) (#16183)
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-08-14 13:05:59 +02:00
Marc 'risson' Schmitt
f9edafd374 ci: release tag: fix missing env variables (cherry-pick #16172) (#16173) 2025-08-13 21:31:01 +02:00
Marc 'risson' Schmitt
d94219eb0e ci: release: consolidation bump version and on tag (cherry-pick #16164) (#16169)
Co-authored-by: Dominic R <dominic@sdko.org>
2025-08-13 18:22:11 +02:00
Marc 'risson' Schmitt
0871aa2cf3 ci: fix docker hub credentials (cherry-pick #16165) (#16166) 2025-08-13 15:20:03 +02:00
authentik-automation[bot]
e0fe99d0b8 release: 2025.8.0-rc5 2025-08-13 00:21:34 +00:00
Marc 'risson' Schmitt
c1acf53585 root: fix custom packages installation in docker (cherry-pick #16157) (#16158) 2025-08-13 02:08:56 +02:00
authentik-automation[bot]
baf4eed0d9 release: 2025.8.0-rc4 2025-08-12 22:26:57 +00:00
Marc 'risson' Schmitt
60e1192a7a ci: release publish: fix missing permissions (cherry-pick #16155) (#16156) 2025-08-13 00:20:52 +02:00
authentik-automation[bot]
2608e02d6e release: 2025.8.0-rc3 2025-08-12 21:49:51 +00:00
Marc 'risson' Schmitt
0a5928fbcb ci: docker push: fix version missing dash (cherry-pick #16153) (#16154) 2025-08-12 23:44:01 +02:00
authentik-automation[bot]
7b99b02b4a release: 2025.8.0-rc2 2025-08-12 21:12:07 +00:00
Marc 'risson' Schmitt
dfeadc9ebe root: fix custom packages installation in docker (cherry-pick #16150) (#16151) 2025-08-12 23:08:40 +02:00
authentik-automation[bot]
7ff64fbd09 release: 2025.8.0-rc1 2025-08-12 20:31:40 +00:00
2800 changed files with 262270 additions and 452001 deletions


@@ -1,81 +0,0 @@
name: Bug report
description: Create a report to help us improve
labels: ["bug", "triage"]
type: bug
body:
  - type: markdown
    attributes:
      value: |
        Thank you for taking the time to fill out this bug report!
  - type: textarea
    id: describe-the-bug
    attributes:
      label: Describe the bug
      description: "A clear and concise description of what the bug is."
      placeholder: "Describe the issue"
    validations:
      required: true
  - type: textarea
    id: how-to-reproduce
    attributes:
      label: How to reproduce
      description: "Steps to reproduce the behavior."
      placeholder: |
        1. Go to '...'
        2. Click on '....'
        3. Scroll down to '....'
        4. See error
    validations:
      required: true
  - type: textarea
    id: expected-behavior
    attributes:
      label: Expected behavior
      description: "A clear and concise description of what you expected to happen."
      placeholder: "The behavior that I expect to see is [...]"
    validations:
      required: true
  - type: textarea
    id: screenshots
    attributes:
      label: Screenshots
      description: "If applicable, add screenshots to help explain your problem."
    validations:
      required: false
  - type: textarea
    id: additional-context
    attributes:
      label: Additional context
      description: "Add any other context about the problem here."
      placeholder: "Also note that [...]"
    validations:
      required: false
  - type: dropdown
    id: deployment-method
    attributes:
      label: Deployment Method
      description: "What deployment method are you using for authentik? Only Docker, Kubernetes and AWS CloudFormation are supported."
      options:
        - Docker
        - Kubernetes
        - AWS CloudFormation
        - Other (please specify)
      default: 0
    validations:
      required: true
  - type: input
    id: version
    attributes:
      label: Version
      description: "What version of authentik are you using?"
      placeholder: "[e.g. 2025.10.1]"
    validations:
      required: true
  - type: textarea
    id: logs
    attributes:
      label: Relevant log output
      description: "Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks."
      render: shell
    validations:
      required: false


@@ -1,49 +0,0 @@
name: Documentation suggestion/problem
description: Suggest an improvement or report a problem in our docs
labels: ["area: docs", "triage"]
type: task
body:
  - type: markdown
    attributes:
      value: |
        Thank you for taking the time to fill out this documentation issue!
  - type: markdown
    attributes:
      value: |
        **Consider opening a PR!**
        If the issue is one that you can fix, or even make a good pass at, we'd appreciate a PR.
        For more information about making a contribution to the docs, and using our Style Guide and our templates, refer to ["Writing documentation"](https://docs.goauthentik.io/docs/developer-docs/docs/writing-documentation).
  - type: textarea
    id: issue
    attributes:
      label: Do you see an area that can be clarified or expanded, a technical inaccuracy, or a broken link?
      description: "A clear and concise description of what the problem is, or where the document can be improved."
      placeholder: "I believe we need more details about [...]"
    validations:
      required: true
  - type: input
    id: link
    attributes:
      label: Link
      description: "Provide the URL or link to the exact page in the documentation to which you are referring."
      placeholder: "If there are multiple pages, list them all"
    validations:
      required: true
  - type: textarea
    id: solution
    attributes:
      label: Solution
      description: "A clear and concise description of what you suggest as a solution"
      placeholder: "This issue could be resolved by [...]"
    validations:
      required: true
  - type: textarea
    id: additional-context
    attributes:
      label: Additional context
      description: "Add any other context or screenshots about the documentation issue here."
      placeholder: "Also note that [...]"
    validations:
      required: false


@@ -1,41 +0,0 @@
name: Feature request
description: Suggest an idea for a feature
labels: ["enhancement", "triage"]
type: feature
body:
  - type: markdown
    attributes:
      value: |
        Thank you for taking the time to fill out this feature request!
  - type: textarea
    id: related-to-problem
    attributes:
      label: Is your feature request related to a problem?
      description: "A clear and concise description of what the problem is."
      placeholder: "I'm always frustrated when [...]"
    validations:
      required: true
  - type: textarea
    id: feature
    attributes:
      label: Describe the solution you'd like
      description: A clear and concise description of what you want to happen.
      placeholder: "I'd like authentik to have [...]"
    validations:
      required: false
  - type: textarea
    id: alternatives
    attributes:
      label: Describe alternatives that you've considered
      description: "A clear and concise description of any alternative solutions or features you've considered."
      placeholder: "I've tried this but [...]"
    validations:
      required: true
  - type: textarea
    id: additional-context
    attributes:
      label: Additional context
      description: "Add any other context or screenshots about the feature request here."
      placeholder: "Also note that [...]"
    validations:
      required: false

.github/ISSUE_TEMPLATE/bug_report.md

@@ -0,0 +1,39 @@
---
name: Bug report
about: Create a report to help us improve
title: ""
labels: bug
assignees: ""
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Logs**
Output of docker-compose logs or kubectl logs respectively
**Version and Deployment (please complete the following information):**
<!--
Notice: authentik supports installation via Docker, Kubernetes, and AWS CloudFormation only. Support is not available for other methods. For detailed installation and configuration instructions, please refer to the official documentation at https://docs.goauthentik.io/docs/install-config/.
-->
- authentik version: [e.g. 2025.2.0]
- Deployment: [e.g. docker-compose, helm]
**Additional context**
Add any other context about the problem here.


@@ -1,8 +0,0 @@
blank_issues_enabled: false
contact_links:
- name: Question
url: https://github.com/goauthentik/authentik/discussions
about: Please ask questions via GitHub Discussions rather than creating issues.
- name: authentik Discord
url: https://discord.com/invite/jg33eMhnj6
about: For community support, visit our Discord server.

.github/ISSUE_TEMPLATE/docs_issue.md

@@ -0,0 +1,22 @@
---
name: Documentation issue
about: Suggest an improvement or report a problem
title: ""
labels: documentation
assignees: ""
---
**Do you see an area that can be clarified or expanded, a technical inaccuracy, or a broken link? Please describe.**
A clear and concise description of what the problem is, or where the document can be improved. Ex. I believe we need more details about [...]
**Provide the URL or link to the exact page in the documentation to which you are referring.**
If there are multiple pages, list them all, and be sure to state the header or section where the content is.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Additional context**
Add any other context or screenshots about the documentation issue here.
**Consider opening a PR!**
If the issue is one that you can fix, or even make a good pass at, we'd appreciate a PR. For more information about making a contribution to the docs, and using our Style Guide and our templates, refer to ["Writing documentation"](https://docs.goauthentik.io/docs/developer-docs/docs/writing-documentation).


@@ -0,0 +1,19 @@
---
name: Feature request
about: Suggest an idea for this project
title: ""
labels: enhancement
assignees: ""
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.


@@ -0,0 +1,17 @@
---
name: Hackathon Idea
about: Propose an idea for the hackathon
title: ""
labels: hackathon
assignees: ""
---
**Describe the idea**
A clear concise description of the idea you want to implement
You're also free to work on existing GitHub issues, whether they be feature requests or bugs, just link the existing GitHub issue here.
<!-- Don't modify below here -->
If you want to help working on this idea or want to contribute in any other way, react to this issue with a :rocket:


@@ -1,7 +0,0 @@
---
name: Blank issue
about: This issue type is only for internal use
title:
labels:
assignees:
---

.github/ISSUE_TEMPLATE/question.md

@@ -0,0 +1,32 @@
---
name: Question
about: Ask a question about a feature or specific configuration
title: ""
labels: question
assignees: ""
---
**Describe your question/**
A clear and concise description of what you're trying to do.
**Relevant info**
i.e. Version of other software you're using, specifics of your setup
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Logs**
Output of docker-compose logs or kubectl logs respectively
**Version and Deployment (please complete the following information):**
<!--
Notice: authentik supports installation via Docker, Kubernetes, and AWS CloudFormation only. Support is not available for other methods. For detailed installation and configuration instructions, please refer to the official documentation at https://docs.goauthentik.io/docs/install-config/.
-->
- authentik version: [e.g. 2025.2.0]
- Deployment: [e.g. docker-compose, helm]
**Additional context**
Add any other context about the problem here.


@@ -1,273 +0,0 @@
name: "Cherry-picker"
description: "Cherry-pick PRs based on their labels"
inputs:
  token:
    description: "GitHub Token"
    required: true
  git_user:
    description: "Git user for pushing the cherry-pick PR"
    required: true
  git_user_email:
    description: "Git user email for pushing the cherry-pick PR"
    required: true
runs:
  using: "composite"
  steps:
    - name: Check if workflow should run
      id: should_run
      shell: bash
      env:
        GITHUB_TOKEN: ${{ inputs.token }}
      run: |
        set -e -o pipefail
        # For issues events, check if it's actually a PR
        if [ "${{ github.event_name }}" = "issues" ]; then
          # Check if this issue is actually a PR
          PR_DATA=$(gh api repos/${{ github.repository }}/pulls/${{ github.event.issue.number }} 2>/dev/null || echo "null")
          if [ "$PR_DATA" = "null" ]; then
            echo "should_run=false" >> $GITHUB_OUTPUT
            echo "reason=not_a_pr" >> $GITHUB_OUTPUT
            echo "This is an issue, not a PR. Skipping."
            exit 0
          fi
          # Get PR data
          PR_MERGED=$(echo "$PR_DATA" | jq -r '.merged')
          PR_NUMBER="${{ github.event.issue.number }}"
          MERGE_COMMIT_SHA=$(echo "$PR_DATA" | jq -r '.merge_commit_sha')
          # Check if it's a backport label
          LABEL_NAME="${{ github.event.label.name }}"
          if [[ "$LABEL_NAME" =~ ^backport/(.+)$ ]]; then
            if [ "$PR_MERGED" = "true" ]; then
              echo "should_run=true" >> $GITHUB_OUTPUT
              echo "reason=label_added_to_merged_pr" >> $GITHUB_OUTPUT
              echo "pr_number=$PR_NUMBER" >> $GITHUB_OUTPUT
              echo "merge_commit_sha=$MERGE_COMMIT_SHA" >> $GITHUB_OUTPUT
              exit 0
            else
              echo "should_run=false" >> $GITHUB_OUTPUT
              echo "reason=label_added_to_open_pr" >> $GITHUB_OUTPUT
              echo "Backport label added to open PR. Will run after PR is merged."
              exit 0
            fi
          else
            echo "should_run=false" >> $GITHUB_OUTPUT
            echo "reason=non_backport_label" >> $GITHUB_OUTPUT
            exit 0
          fi
        fi
        # For pull_request and pull_request_target events
        PR_NUMBER="${{ github.event.pull_request.number }}"
        MERGE_COMMIT_SHA="${{ github.event.pull_request.merge_commit_sha }}"
        # Case 1: PR was just merged (closed + merged = true)
        if [ "${{ github.event.action }}" = "closed" ] && [ "${{ github.event.pull_request.merged }}" = "true" ]; then
          echo "should_run=true" >> $GITHUB_OUTPUT
          echo "reason=pr_merged" >> $GITHUB_OUTPUT
          echo "pr_number=$PR_NUMBER" >> $GITHUB_OUTPUT
          echo "merge_commit_sha=$MERGE_COMMIT_SHA" >> $GITHUB_OUTPUT
          exit 0
        fi
        # Case 2: Label was added
        if [ "${{ github.event.action }}" = "labeled" ]; then
          LABEL_NAME="${{ github.event.label.name }}"
          # Check if it's a backport label
          if [[ "$LABEL_NAME" =~ ^backport/(.+)$ ]]; then
            # Check if PR is already merged
            if [ "${{ github.event.pull_request.merged }}" = "true" ]; then
              echo "should_run=true" >> $GITHUB_OUTPUT
              echo "reason=label_added_to_merged_pr" >> $GITHUB_OUTPUT
              echo "pr_number=$PR_NUMBER" >> $GITHUB_OUTPUT
              echo "merge_commit_sha=$MERGE_COMMIT_SHA" >> $GITHUB_OUTPUT
              exit 0
            else
              echo "should_run=false" >> $GITHUB_OUTPUT
              echo "reason=label_added_to_open_pr" >> $GITHUB_OUTPUT
              echo "Backport label added to open PR. Will run after PR is merged."
              exit 0
            fi
          else
            echo "should_run=false" >> $GITHUB_OUTPUT
            echo "reason=non_backport_label" >> $GITHUB_OUTPUT
            exit 0
          fi
        fi
        echo "should_run=false" >> $GITHUB_OUTPUT
        echo "reason=unknown" >> $GITHUB_OUTPUT
- name: Configure Git
if: steps.should_run.outputs.should_run == 'true'
shell: bash
env:
user: ${{ inputs.git_user }}
email: ${{ inputs.git_user_email }}
run: |
git config --global user.name "${user}"
git config --global user.email "${email}"
- name: Get PR details and extract backport labels
if: steps.should_run.outputs.should_run == 'true'
id: pr_details
shell: bash
env:
GITHUB_TOKEN: ${{ inputs.token }}
run: |
set -e -o pipefail
PR_NUMBER="${{ steps.should_run.outputs.pr_number }}"
# Get PR details
PR_DATA=$(gh api repos/${{ github.repository }}/pulls/$PR_NUMBER)
PR_TITLE=$(echo "$PR_DATA" | jq -r '.title')
PR_AUTHOR=$(echo "$PR_DATA" | jq -r '.user.login')
echo "pr_title=$PR_TITLE" >> $GITHUB_OUTPUT
echo "pr_author=$PR_AUTHOR" >> $GITHUB_OUTPUT
# Determine which labels to process
if [ "${{ steps.should_run.outputs.reason }}" = "label_added_to_merged_pr" ]; then
# Only process the specific label that was just added
if [ "${{ github.event_name }}" = "issues" ]; then
LABEL_NAME="${{ github.event.label.name }}"
else
LABEL_NAME="${{ github.event.label.name }}"
fi
if [[ "$LABEL_NAME" =~ ^backport/(.+)$ ]]; then
echo "labels=$LABEL_NAME" >> $GITHUB_OUTPUT
else
echo "Label $LABEL_NAME does not match backport pattern"
echo "labels=" >> $GITHUB_OUTPUT
fi
else
# PR was just merged, process all backport labels
LABELS=$(gh pr view $PR_NUMBER --json labels --jq '.labels[].name' | grep '^backport/' | tr '\n' ' ' || true)
echo "labels=$LABELS" >> $GITHUB_OUTPUT
fi
- name: Cherry-pick to target branches
if: steps.should_run.outputs.should_run == 'true' && steps.pr_details.outputs.labels != ''
shell: bash
env:
GITHUB_TOKEN: ${{ inputs.token }}
run: |
set -e -o pipefail
PR_NUMBER='${{ steps.should_run.outputs.pr_number }}'
COMMIT_SHA='${{ steps.should_run.outputs.merge_commit_sha }}'
PR_TITLE='${{ steps.pr_details.outputs.pr_title }}'
PR_AUTHOR='${{ steps.pr_details.outputs.pr_author }}'
LABELS='${{ steps.pr_details.outputs.labels }}'
echo "Processing PR #$PR_NUMBER (reason: ${{ steps.should_run.outputs.reason }})"
echo "Found backport labels: $LABELS"
# Process each backport label
for label in $LABELS; do
if [[ "$label" =~ ^backport/(.+)$ ]]; then
TARGET_BRANCH="${BASH_REMATCH[1]}"
echo "Processing backport to branch: $TARGET_BRANCH"
# Check if target branch exists
if ! git ls-remote --heads origin "$TARGET_BRANCH" | grep -q "$TARGET_BRANCH"; then
echo "❌ Target branch $TARGET_BRANCH does not exist, skipping"
# Comment on the original PR about the missing branch
gh pr comment $PR_NUMBER --body "⚠️ Cannot backport to \`$TARGET_BRANCH\`: branch does not exist."
continue
fi
# Create a unique branch name for the cherry-pick
CHERRY_PICK_BRANCH="cherry-pick/${PR_NUMBER}-to-${TARGET_BRANCH}"
# Check if a cherry-pick PR already exists
EXISTING_PR=$(gh pr list --head "$CHERRY_PICK_BRANCH" --json number --jq '.[0].number' 2>/dev/null || echo "")
if [ -n "$EXISTING_PR" ]; then
echo "⚠️ Cherry-pick PR already exists: #$EXISTING_PR"
gh pr comment $PR_NUMBER --body "Cherry-pick to \`$TARGET_BRANCH\` already exists: #$EXISTING_PR"
continue
fi
# Fetch and checkout target branch
git fetch origin "$TARGET_BRANCH"
git checkout -b "$CHERRY_PICK_BRANCH" "origin/$TARGET_BRANCH"
# Attempt cherry-pick
if git cherry-pick "$COMMIT_SHA"; then
echo "✅ Cherry-pick successful for $TARGET_BRANCH"
# Push the cherry-pick branch
git push origin "$CHERRY_PICK_BRANCH"
# Create PR for the cherry-pick
CHERRY_PICK_TITLE="$PR_TITLE (cherry-pick #$PR_NUMBER to $TARGET_BRANCH)"
CHERRY_PICK_BODY="Cherry-pick of #$PR_NUMBER to \`$TARGET_BRANCH\` branch.
**Original PR:** #$PR_NUMBER
**Original Author:** @$PR_AUTHOR
**Cherry-picked commit:** $COMMIT_SHA"
NEW_PR=$(gh pr create \
--title "$CHERRY_PICK_TITLE" \
--body "$CHERRY_PICK_BODY" \
--base "$TARGET_BRANCH" \
--head "$CHERRY_PICK_BRANCH" \
--label "cherry-pick")
# Assign the PR to the original author
gh pr edit "$NEW_PR" --add-assignee "$PR_AUTHOR" || true
echo "✅ Created cherry-pick PR $NEW_PR for $TARGET_BRANCH"
# Comment on original PR
gh pr comment $PR_NUMBER --body "🍒 Cherry-pick to \`$TARGET_BRANCH\` created: $NEW_PR"
else
echo "⚠️ Cherry-pick failed for $TARGET_BRANCH, creating conflict resolution PR"
# Add conflicted files and commit
git add .
git commit -m "Cherry-pick #$PR_NUMBER to $TARGET_BRANCH (with conflicts)
This cherry-pick has conflicts that need manual resolution.
Original PR: #$PR_NUMBER
Original commit: $COMMIT_SHA"
# Push the branch with conflicts
git push origin "$CHERRY_PICK_BRANCH"
# Create PR with conflict notice
CONFLICT_TITLE="$PR_TITLE (cherry-pick #$PR_NUMBER to $TARGET_BRANCH)"
CONFLICT_BODY="⚠️ **This cherry-pick has conflicts that require manual resolution.**
Cherry-pick of #$PR_NUMBER to \`$TARGET_BRANCH\` branch.
**Original PR:** #$PR_NUMBER
**Original Author:** @$PR_AUTHOR
**Cherry-picked commit:** $COMMIT_SHA
**Please resolve the conflicts in this PR before merging.**"
NEW_PR=$(gh pr create \
--title "$CONFLICT_TITLE" \
--body "$CONFLICT_BODY" \
--base "$TARGET_BRANCH" \
--head "$CHERRY_PICK_BRANCH" \
--label "cherry-pick")
# Assign the PR to the original author
gh pr edit "$NEW_PR" --add-assignee "$PR_AUTHOR" || true
echo "⚠️ Created conflict resolution PR $NEW_PR for $TARGET_BRANCH"
# Comment on original PR
gh pr comment $PR_NUMBER --body "⚠️ Cherry-pick to \`$TARGET_BRANCH\` has conflicts: $NEW_PR"
fi
# Clean up - go back to main branch
git checkout main
git branch -D "$CHERRY_PICK_BRANCH" 2>/dev/null || true
fi
done
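The `backport/<branch>` label convention the script relies on can be exercised in isolation. A minimal sketch using the same bash regex as the cherry-pick step above (the label names here are made up for illustration):

```shell
#!/usr/bin/env bash
# Extract the target branch from a backport label, mirroring the
# BASH_REMATCH pattern used in the cherry-pick step above.
extract_target() {
    local label="$1"
    if [[ "$label" =~ ^backport/(.+)$ ]]; then
        echo "${BASH_REMATCH[1]}"
    else
        return 1
    fi
}

extract_target "backport/version-2025.8"        # prints: version-2025.8
extract_target "enhancement" || echo "not a backport label"
```

A label like `backport/version-2025.8` thus both triggers the workflow and names its target branch, which is why no separate branch input is needed.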


@@ -10,14 +10,14 @@ runs:
using: "composite"
steps:
- name: Find Comment
uses: peter-evans/find-comment@b30e6a3c0ed37e7c023ccd3f1db5c6c0b0c23aad # v2
uses: peter-evans/find-comment@v2
id: fc
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: "github-actions[bot]"
body-includes: authentik PR Installation instructions
- name: Create or update comment
uses: peter-evans/create-or-update-comment@e8674b075228eee787fea43ef493e45ece1004c9 # v2
uses: peter-evans/create-or-update-comment@v2
with:
comment-id: ${{ steps.fc.outputs.comment-id }}
issue-number: ${{ github.event.pull_request.number }}


@@ -2,28 +2,16 @@
import os
from json import dumps
from sys import exit as sysexit
from time import time
from authentik import authentik_version
def must_or_fail(input: str | None, error: str) -> str:
if not input:
print(f"::error::{error}")
sysexit(1)
return input
# Decide if we should push the image or not
should_push = True
if len(os.environ.get("DOCKER_USERNAME", "")) < 1:
# Don't push if we don't have DOCKER_USERNAME, i.e. no secrets are available
should_push = False
if (
must_or_fail(os.environ.get("GITHUB_REPOSITORY"), "Repo required").lower()
== "goauthentik/authentik-internal"
):
if os.environ.get("GITHUB_REPOSITORY").lower() == "goauthentik/authentik-internal":
# Don't push on the internal repo
should_push = False
@@ -32,16 +20,13 @@ if os.environ.get("GITHUB_HEAD_REF", "") != "":
branch_name = os.environ["GITHUB_HEAD_REF"]
safe_branch_name = branch_name.replace("refs/heads/", "").replace("/", "-").replace("'", "-")
image_names = must_or_fail(os.getenv("IMAGE_NAME"), "Image name required").split(",")
image_names = os.getenv("IMAGE_NAME").split(",")
image_arch = os.getenv("IMAGE_ARCH") or None
is_pull_request = bool(os.getenv("PR_HEAD_SHA"))
is_release = "dev" not in image_names[0]
sha = must_or_fail(
os.environ["GITHUB_SHA"] if not is_pull_request else os.getenv("PR_HEAD_SHA"),
"could not determine SHA",
)
sha = os.environ["GITHUB_SHA"] if not is_pull_request else os.getenv("PR_HEAD_SHA")
# 2042.1.0 or 2042.1.0-rc1
version = authentik_version()
@@ -73,7 +58,7 @@ else:
image_main_tag = image_tags[0].split(":")[-1]
def get_attest_image_names(image_with_tags: list[str]) -> str:
def get_attest_image_names(image_with_tags: list[str]):
"""Attestation only for GHCR"""
image_tags = []
for image_name in set(name.split(":")[0] for name in image_with_tags):
@@ -97,6 +82,7 @@ if os.getenv("RELEASE", "false").lower() == "true":
image_build_args = [f"VERSION={os.getenv('REF')}"]
else:
image_build_args = [f"GIT_BUILD_HASH={sha}"]
image_build_args = "\n".join(image_build_args)
with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
print(f"shouldPush={str(should_push).lower()}", file=_output)
@@ -109,4 +95,4 @@ with open(os.environ["GITHUB_OUTPUT"], "a+", encoding="utf-8") as _output:
print(f"imageMainTag={image_main_tag}", file=_output)
print(f"imageMainName={image_tags[0]}", file=_output)
print(f"cacheTo={cache_to}", file=_output)
print(f"imageBuildArgs={"\n".join(image_build_args)}", file=_output)
print(f"imageBuildArgs={image_build_args}", file=_output)
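Both this Python script and the composite-action steps share one mechanism: appending `key=value` lines to the file named by `$GITHUB_OUTPUT`, which GitHub Actions then exposes as `steps.<id>.outputs.<key>`. A minimal sketch outside of CI (the temp file stands in for the real `$GITHUB_OUTPUT`):

```shell
#!/usr/bin/env bash
set -e -o pipefail
# Outside of CI, point GITHUB_OUTPUT at a scratch file to inspect
# what the runner would ingest as step outputs.
GITHUB_OUTPUT="$(mktemp)"

echo "shouldPush=true" >> "$GITHUB_OUTPUT"
echo "imageMainTag=gh-main" >> "$GITHUB_OUTPUT"

cat "$GITHUB_OUTPUT"
# shouldPush=true
# imageMainTag=gh-main
```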


@@ -12,22 +12,21 @@ inputs:
runs:
using: "composite"
steps:
- name: Install apt deps & cleanup
- name: Install apt deps
if: ${{ contains(inputs.dependencies, 'system') || contains(inputs.dependencies, 'python') }}
shell: bash
run: |
sudo apt-get remove --purge man-db
sudo apt-get update
sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext krb5-multidev libkrb5-dev heimdal-multidev libclang-dev krb5-kdc krb5-user krb5-admin-server
sudo rm -rf /usr/local/lib/android
sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext libkrb5-dev krb5-kdc krb5-user krb5-admin-server
- name: Install uv
if: ${{ contains(inputs.dependencies, 'python') }}
uses: astral-sh/setup-uv@eac588ad8def6316056a12d4907a9d4d84ff7a3b # v5
uses: astral-sh/setup-uv@v5
with:
enable-cache: true
- name: Setup python
if: ${{ contains(inputs.dependencies, 'python') }}
uses: actions/setup-python@a309ff8b426b58ec0e2a45f0f869d46889d02405 # v5
uses: actions/setup-python@v5
with:
python-version-file: "pyproject.toml"
- name: Install Python deps
@@ -36,29 +35,28 @@ runs:
run: uv sync --all-extras --dev --frozen
- name: Setup node
if: ${{ contains(inputs.dependencies, 'node') }}
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v4
uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
registry-url: 'https://registry.npmjs.org'
- name: Setup go
if: ${{ contains(inputs.dependencies, 'go') }}
uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v5
uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- name: Setup docker cache
if: ${{ contains(inputs.dependencies, 'runtime') }}
uses: AndreKurait/docker-cache@0fe76702a40db986d9663c24954fc14c6a6031b7
with:
key: docker-images-${{ runner.os }}-${{ hashFiles('.github/actions/setup/compose.yml', 'Makefile') }}-${{ inputs.postgresql_version }}
key: docker-images-${{ runner.os }}-${{ hashFiles('.github/actions/setup/docker-compose.yml', 'Makefile') }}-${{ inputs.postgresql_version }}
- name: Setup dependencies
if: ${{ contains(inputs.dependencies, 'runtime') }}
shell: bash
run: |
export PSQL_TAG=${{ inputs.postgresql_version }}
docker compose -f .github/actions/setup/compose.yml up -d
cd web && npm i
docker compose -f .github/actions/setup/docker-compose.yml up -d
cd web && npm ci
- name: Generate config
if: ${{ contains(inputs.dependencies, 'python') }}
shell: uv run python {0}


@@ -1,34 +0,0 @@
services:
postgresql:
image: docker.io/library/postgres:${PSQL_TAG:-16}
volumes:
- db-data:/var/lib/postgresql/data
command: "-c log_statement=all"
environment:
POSTGRES_USER: authentik
POSTGRES_PASSWORD: "EK-5jnKfjrGRm<77"
POSTGRES_DB: authentik
ports:
- 5432:5432
restart: always
s3:
container_name: s3
image: docker.io/zenko/cloudserver
environment:
REMOTE_MANAGEMENT_DISABLE: "1"
SCALITY_ACCESS_KEY_ID: accessKey1
SCALITY_SECRET_ACCESS_KEY: secretKey1
ports:
- 8020:8000
volumes:
- s3-data:/usr/src/app/localData
- s3-metadata:/usr/src/app/localMetadata
restart: always
volumes:
db-data:
driver: local
s3-data:
driver: local
s3-metadata:
driver: local


@@ -0,0 +1,21 @@
services:
postgresql:
image: docker.io/library/postgres:${PSQL_TAG:-16}
volumes:
- db-data:/var/lib/postgresql/data
environment:
POSTGRES_USER: authentik
POSTGRES_PASSWORD: "EK-5jnKfjrGRm<77"
POSTGRES_DB: authentik
ports:
- 5432:5432
restart: always
redis:
image: docker.io/library/redis:7
ports:
- 6379:6379
restart: always
volumes:
db-data:
driver: local


@@ -1,28 +0,0 @@
name: "Process test results"
description: Convert test results to JUnit, add them to GitHub Actions and codecov
inputs:
flags:
description: Codecov flags
runs:
using: "composite"
steps:
- uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5
with:
flags: ${{ inputs.flags }}
use_oidc: true
- uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5
with:
flags: ${{ inputs.flags }}
use_oidc: true
report_type: test_results
- name: PostgreSQL Logs
shell: bash
run: |
if [[ $RUNNER_DEBUG == '1' ]]; then
docker stop setup-postgresql-1
echo "::group::PostgreSQL Logs"
docker logs setup-postgresql-1
echo "::endgroup::"
fi
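The `::group::`/`::endgroup::` markers in the step above are GitHub Actions workflow commands that fold the enclosed output into a collapsible section of the job log; outside a runner they are printed verbatim. A small sketch of the same `RUNNER_DEBUG` gate (the log line is a stand-in for `docker logs`):

```shell
#!/usr/bin/env bash
# Only dump verbose logs when the run has debug logging enabled;
# GitHub sets RUNNER_DEBUG=1 when "debug logging" is requested.
dump_logs() {
    if [[ "${RUNNER_DEBUG:-0}" == "1" ]]; then
        echo "::group::Service Logs"
        echo "example log line"   # stand-in for `docker logs <container>`
        echo "::endgroup::"
    fi
}

RUNNER_DEBUG=1 dump_logs
```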

.github/cherry-pick-bot.yml

@@ -0,0 +1,2 @@
enabled: true
preservePullRequestTitle: true

.github/dependabot.yml

@@ -1,17 +1,7 @@
version: 2
updates:
#region Github Actions
- package-ecosystem: "github-actions"
directories:
- /
# Required to update composite actions
# https://github.com/dependabot/dependabot-core/issues/6704
- /.github/actions/cherry-pick
- /.github/actions/setup
- /.github/actions/docker-push-variables
- /.github/actions/comment-pr-instructions
- /.github/actions/test-results
directory: "/"
schedule:
interval: daily
time: "04:00"
@@ -20,11 +10,6 @@ updates:
prefix: "ci:"
labels:
- dependencies
#endregion
#region Golang
- package-ecosystem: gomod
directory: "/"
schedule:
@@ -35,74 +20,11 @@ updates:
prefix: "core:"
labels:
- dependencies
#endregion
#region Web
- package-ecosystem: npm
directories:
- "/"
- "/web"
- "/web/packages/*"
schedule:
interval: daily
time: "04:00"
labels:
- dependencies
open-pull-requests-limit: 10
commit-message:
prefix: "web:"
groups:
sentry:
patterns:
- "@sentry/*"
babel:
patterns:
- "@babel/*"
- "babel-*"
eslint:
patterns:
- "@eslint/*"
- "@typescript-eslint/*"
- "eslint-*"
- "eslint"
- "typescript-eslint"
storybook:
patterns:
- "@storybook/*"
- "*storybook*"
bundler:
patterns:
- "@esbuild/*"
- "esbuild*"
- "@vitest/*"
- "vitest"
rollup:
patterns:
- "@rollup/*"
- "rollup-*"
- "rollup*"
swc:
patterns:
- "@swc/*"
- "swc-*"
goauthentik:
patterns:
- "@goauthentik/*"
react:
patterns:
- "react"
- "react-dom"
- "@types/react"
- "@types/react-dom"
#endregion
#region NPM Packages
- package-ecosystem: npm
directories:
- "/web/packages/sfe"
- "/web/packages/core"
- "/packages/esbuild-plugin-live-reload"
- "/packages/prettier-config"
- "/packages/tsconfig"
@@ -115,11 +37,12 @@ updates:
- dependencies
open-pull-requests-limit: 10
commit-message:
prefix: "core, web:"
prefix: "web:"
groups:
sentry:
patterns:
- "@sentry/*"
- "@spotlightjs/*"
babel:
patterns:
- "@babel/*"
@@ -135,12 +58,10 @@ updates:
patterns:
- "@storybook/*"
- "*storybook*"
bundler:
esbuild:
patterns:
- "@esbuild/*"
- "esbuild*"
- "@vitest/*"
- "vitest"
rollup:
patterns:
- "@rollup/*"
@@ -150,20 +71,12 @@ updates:
patterns:
- "@swc/*"
- "swc-*"
wdio:
patterns:
- "@wdio/*"
goauthentik:
patterns:
- "@goauthentik/*"
react:
patterns:
- "react"
- "react-dom"
- "@types/react"
- "@types/react-dom"
#endregion
# #region Documentation
- package-ecosystem: npm
directory: "/website"
schedule:
@@ -178,7 +91,6 @@ updates:
docusaurus:
patterns:
- "@docusaurus/*"
- "@goauthentik/docusaurus-config"
build:
patterns:
- "@swc/*"
@@ -187,9 +99,7 @@ updates:
- "@rspack/binding*"
goauthentik:
patterns:
- "@goauthentik/eslint-config"
- "@goauthentik/prettier-config"
- "@goauthentik/tsconfig"
- "@goauthentik/*"
eslint:
patterns:
- "@eslint/*"
@@ -197,11 +107,6 @@ updates:
- "eslint-*"
- "eslint"
- "typescript-eslint"
#endregion
# AWS Lifecycle
- package-ecosystem: npm
directory: "/lifecycle/aws"
schedule:
@@ -212,11 +117,6 @@ updates:
prefix: "lifecycle/aws:"
labels:
- dependencies
#endregion
#region Python
- package-ecosystem: uv
directory: "/"
schedule:
@@ -227,15 +127,8 @@ updates:
prefix: "core:"
labels:
- dependencies
#endregion
#region Docker
- package-ecosystem: docker
directories:
- /
- /website
directory: "/"
schedule:
interval: daily
time: "04:00"
@@ -247,7 +140,6 @@ updates:
- package-ecosystem: docker-compose
directories:
# - /scripts # Maybe
- /scripts/api
- /tests/e2e
schedule:
interval: daily
@@ -257,5 +149,3 @@ updates:
prefix: "core:"
labels:
- dependencies
#endregion


@@ -2,10 +2,6 @@
👋 Hi there! Welcome.
Please check the Contributing guidelines: https://docs.goauthentik.io/docs/developer-docs/#how-can-i-contribute
⚠️ IMPORTANT: Make sure you are opening this PR from a FEATURE BRANCH, not from your main branch!
If you opened this PR from your main branch, please close it and create a new feature branch instead.
For more information, see: https://docs.goauthentik.io/developer-docs/contributing/#always-use-feature-branches
-->
## Details


@@ -1,4 +1,3 @@
---
git:
filters:
- filter_type: file


@@ -42,9 +42,9 @@ jobs:
# Needed for checkout
contents: read
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: docker/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
- uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
- uses: actions/checkout@v5
- uses: docker/setup-qemu-action@v3.6.0
- uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -56,23 +56,25 @@ jobs:
release: ${{ inputs.release }}
- name: Login to Docker Hub
if: ${{ inputs.registry_dockerhub }}
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_CORP_USERNAME }}
password: ${{ secrets.DOCKER_CORP_PASSWORD }}
- name: Login to GitHub Container Registry
if: ${{ inputs.registry_ghcr }}
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- name: Setup node
uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6
- name: Setup go
uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- name: Generate API Clients
@@ -80,11 +82,10 @@ jobs:
make gen-client-ts
make gen-client-go
- name: Build Docker Image
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
uses: docker/build-push-action@v6
id: push
with:
context: .
file: lifecycle/container/Dockerfile
push: ${{ steps.ev.outputs.shouldPush == 'true' }}
secrets: |
GEOIPUPDATE_ACCOUNT_ID=${{ secrets.GEOIPUPDATE_ACCOUNT_ID }}
@@ -95,7 +96,7 @@ jobs:
platforms: linux/${{ inputs.image_arch }}
cache-from: type=registry,ref=${{ steps.ev.outputs.attestImageNames }}:buildcache-${{ inputs.image_arch }}
cache-to: ${{ steps.ev.outputs.cacheTo }}
- uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:


@@ -21,7 +21,7 @@ on:
jobs:
build-server-amd64:
uses: ./.github/workflows/_reusable-docker-build-single.yml
uses: ./.github/workflows/_reusable-docker-build-single.yaml
secrets: inherit
with:
image_name: ${{ inputs.image_name }}
@@ -31,7 +31,7 @@ jobs:
registry_ghcr: ${{ inputs.registry_ghcr }}
release: ${{ inputs.release }}
build-server-arm64:
uses: ./.github/workflows/_reusable-docker-build-single.yml
uses: ./.github/workflows/_reusable-docker-build-single.yaml
secrets: inherit
with:
image_name: ${{ inputs.image_name }}
@@ -49,7 +49,7 @@ jobs:
tags: ${{ steps.ev.outputs.imageTagsJSON }}
shouldPush: ${{ steps.ev.outputs.shouldPush }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -69,7 +69,7 @@ jobs:
matrix:
tag: ${{ fromJson(needs.get-tags.outputs.tags) }}
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -79,25 +79,25 @@ jobs:
image-name: ${{ inputs.image_name }}
- name: Login to Docker Hub
if: ${{ inputs.registry_dockerhub }}
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_CORP_USERNAME }}
password: ${{ secrets.DOCKER_CORP_PASSWORD }}
- name: Login to GitHub Container Registry
if: ${{ inputs.registry_ghcr }}
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- uses: int128/docker-manifest-create-action@1a059c021f1d5e9f2bd39de745d5dd3a0ef6df90 # v2
- uses: int128/docker-manifest-create-action@v2
id: build
with:
tags: ${{ matrix.tag }}
sources: |
${{ steps.ev.outputs.attestImageNames }}@${{ needs.build-server-amd64.outputs.image-digest }}
${{ steps.ev.outputs.attestImageNames }}@${{ needs.build-server-arm64.outputs.image-digest }}
- uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3
- uses: actions/attest-build-provenance@v2
id: attest
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}

.github/workflows/api-py-publish.yml

@@ -0,0 +1,68 @@
---
name: API - Publish Python client
on:
push:
branches: [main]
paths:
- "schema.yml"
workflow_dispatch:
jobs:
build:
if: ${{ github.repository != 'goauthentik/authentik-internal' }}
runs-on: ubuntu-latest
permissions:
id-token: write
steps:
- id: generate_token
uses: tibdex/github-app-token@v2
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v5
with:
token: ${{ steps.generate_token.outputs.token }}
- name: Install poetry & deps
shell: bash
run: |
pipx install poetry || true
sudo apt-get update
sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext
- name: Setup python and restore poetry
uses: actions/setup-python@v5
with:
python-version-file: "pyproject.toml"
- name: Generate API Client
run: make gen-client-py
- name: Publish package
working-directory: gen-py-api/
run: |
poetry build
- name: Publish package to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
packages-dir: gen-py-api/dist/
# We can't easily upgrade the API client being used due to poetry being poetry
# so we'll have to rely on dependabot
# - name: Upgrade /
# run: |
# export VERSION=$(cd gen-py-api && poetry version -s)
# poetry add "authentik_client=$VERSION" --allow-prereleases --lock
# - uses: peter-evans/create-pull-request@v6
# id: cpr
# with:
# token: ${{ steps.generate_token.outputs.token }}
# branch: update-root-api-client
# commit-message: "root: bump API Client version"
# title: "root: bump API Client version"
# body: "root: bump API Client version"
# delete-branch: true
# signoff: true
# # ID from https://api.github.com/users/authentik-automation[bot]
# author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
# - uses: peter-evans/enable-pull-request-automerge@v3
# with:
# token: ${{ steps.generate_token.outputs.token }}
# pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}
# merge-method: squash


@@ -8,24 +8,19 @@ on:
- "schema.yml"
workflow_dispatch:
permissions:
# Required for NPM OIDC trusted publisher
id-token: write
contents: read
jobs:
build:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: tibdex/github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v5
with:
token: ${{ steps.generate_token.outputs.token }}
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
registry-url: "https://registry.npmjs.org"
@@ -36,6 +31,8 @@ jobs:
run: |
npm i
npm publish --tag generated
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }}
- name: Upgrade /web
working-directory: web
run: |
@@ -46,7 +43,7 @@ jobs:
run: |
export VERSION=`node -e 'console.log(require("../gen-ts-api/package.json").version)'`
npm i @goauthentik/api@$VERSION
- uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
- uses: peter-evans/create-pull-request@v7
id: cpr
with:
token: ${{ steps.generate_token.outputs.token }}
@@ -59,7 +56,7 @@ jobs:
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
labels: dependencies
- uses: peter-evans/enable-pull-request-automerge@a660677d5469627102a1c1e11409dd063606628d # v3
- uses: peter-evans/enable-pull-request-automerge@v3
with:
token: ${{ steps.generate_token.outputs.token }}
pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}


@@ -21,7 +21,7 @@ jobs:
command:
- prettier-check
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Install Dependencies
working-directory: website/
run: npm ci
@@ -32,8 +32,8 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/checkout@v5
- uses: actions/setup-node@v4
with:
node-version-file: website/package.json
cache: "npm"
@@ -41,7 +41,7 @@ jobs:
- working-directory: website/
name: Install Dependencies
run: npm ci
- uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v4
- uses: actions/cache@v4
with:
path: |
${{ github.workspace }}/website/api/.docusaurus
@@ -55,7 +55,7 @@ jobs:
env:
NODE_ENV: production
run: npm run build -w api
- uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v4
- uses: actions/upload-artifact@v4
with:
name: api-docs
path: website/api/build
@@ -66,12 +66,12 @@ jobs:
- lint
- build
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v5
- uses: actions/checkout@v5
- uses: actions/download-artifact@v5
with:
name: api-docs
path: website/api/build
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/setup-node@v4
with:
node-version-file: website/package.json
cache: "npm"


@@ -21,10 +21,10 @@ jobs:
check-changes-applied:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/setup-node@v4
with:
node-version-file: lifecycle/aws/package.json
cache: "npm"
@@ -35,13 +35,13 @@ jobs:
- name: Check changes have been applied
run: |
uv run make aws-cfn
git diff --exit-code lifecycle/aws/template.yaml
git diff --exit-code
ci-aws-cfn-mark:
if: always()
needs:
- check-changes-applied
runs-on: ubuntu-latest
steps:
- uses: re-actors/alls-green@05ac9388f0aebcb5727afa17fcccfecd6f8ec5fe # release/v1
- uses: re-actors/alls-green@release/v1
with:
jobs: ${{ toJSON(needs) }}


@@ -16,7 +16,7 @@ jobs:
runs-on: ubuntu-latest
timeout-minutes: 120
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: generate docs
@@ -24,9 +24,9 @@ jobs:
uv run make migrate
uv run ak build_source_docs
- name: Publish
uses: netlify/actions/cli@master
with:
args: deploy --dir=source_docs --prod
env:
NETLIFY_SITE_ID: eb246b7b-1d83-4f69-89f7-01a936b4ca59
NETLIFY_AUTH_TOKEN: ${{ secrets.NETLIFY_AUTH_TOKEN }}
run: |
npm install -g netlify-cli
netlify deploy --dir=source_docs --prod


@@ -15,15 +15,13 @@ on:
jobs:
lint:
runs-on: ubuntu-latest
env:
NODE_ENV: production
strategy:
fail-fast: false
matrix:
command:
- prettier-check
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Install dependencies
working-directory: website/
run: npm ci
@@ -32,11 +30,10 @@ jobs:
run: npm run ${{ matrix.command }}
build-docs:
runs-on: ubuntu-latest
env:
NODE_ENV: production
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/checkout@v5
- uses: actions/setup-node@v4
with:
node-version-file: website/package.json
cache: "npm"
@@ -49,11 +46,10 @@ jobs:
run: npm run build
build-integrations:
runs-on: ubuntu-latest
env:
NODE_ENV: production
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/checkout@v5
- uses: actions/setup-node@v4
with:
node-version-file: website/package.json
cache: "npm"
@@ -73,13 +69,13 @@ jobs:
id-token: write
attestations: write
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
uses: docker/setup-qemu-action@v3.6.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -89,14 +85,14 @@ jobs:
image-name: ghcr.io/goauthentik/dev-docs
- name: Login to Container Registry
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image
id: push
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
uses: docker/build-push-action@v6
with:
tags: ${{ steps.ev.outputs.imageTags }}
file: website/Dockerfile
@@ -105,7 +101,7 @@ jobs:
context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && 'type=registry,ref=ghcr.io/goauthentik/dev-docs:buildcache,mode=max' || '' }}
- uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:
@@ -121,6 +117,6 @@ jobs:
- build-container
runs-on: ubuntu-latest
steps:
- uses: re-actors/alls-green@05ac9388f0aebcb5727afa17fcccfecd6f8ec5fe # release/v1
- uses: re-actors/alls-green@release/v1
with:
jobs: ${{ toJSON(needs) }}


@@ -18,11 +18,11 @@ jobs:
- version-2025-4
- version-2025-2
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- run: |
current="$(pwd)"
dir="/tmp/authentik/${{ matrix.version }}"
mkdir -p $dir
cd $dir
wget https://${{ matrix.version }}.goauthentik.io/compose.yml
wget https://${{ matrix.version }}.goauthentik.io/docker-compose.yml
${current}/scripts/test_docker.sh


@@ -34,28 +34,17 @@ jobs:
- codespell
- pending-migrations
- ruff
- mypy
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: run job
run: uv run make ci-${{ matrix.job }}
test-gen-build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: generate schema
run: make migrate gen-build
- name: ensure schema is up-to-date
run: git diff --exit-code -- schema.yml blueprints/schema.json
test-migrations:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: run migrations
@@ -71,33 +60,35 @@ jobs:
test-migrations-from-stable:
name: test-migrations-from-stable - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5
runs-on: ubuntu-latest
timeout-minutes: 30
timeout-minutes: 20
needs: test-make-seed
strategy:
fail-fast: false
matrix:
psql:
- 14-alpine
- 18-alpine
- 15-alpine
- 16-alpine
- 17-alpine
run_id: [1, 2, 3, 4, 5]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
fetch-depth: 0
- name: checkout stable
run: |
set -e -o pipefail
# Copy current, latest config to local
cp authentik/lib/default.yml local.env.yml
cp -R .github ..
cp -R scripts ..
# Previous stable tag
prev_stable=$(git tag --sort=version:refname | grep '^version/' | grep -vE -- '-rc[0-9]+$' | tail -n1)
# Current version family based on
current_version_family=$(cat internal/constants/VERSION | grep -vE -- 'rc[0-9]+$' || true)
current_version_family=$(python -c "from authentik import VERSION; print(VERSION)" | grep -vE -- 'rc[0-9]+$')
if [[ -n $current_version_family ]]; then
prev_stable="version/${current_version_family}"
prev_stable=$current_version_family
fi
echo "::notice::Checking out ${prev_stable} as stable version..."
git checkout ${prev_stable}
git checkout $(prev_stable)
rm -rf .github/ scripts/
mv ../.github ../scripts .
- name: Setup authentik env (stable)
@@ -129,24 +120,21 @@ jobs:
CI_TOTAL_RUNS: "5"
run: |
uv run make ci-test
- uses: ./.github/actions/test-results
if: ${{ always() }}
with:
flags: unit-migrate
test-unittest:
name: test-unittest - PostgreSQL ${{ matrix.psql }} - Run ${{ matrix.run_id }}/5
runs-on: ubuntu-latest
timeout-minutes: 30
timeout-minutes: 20
needs: test-make-seed
strategy:
fail-fast: false
matrix:
psql:
- 14-alpine
- 18-alpine
- 15-alpine
- 16-alpine
- 17-alpine
run_id: [1, 2, 3, 4, 5]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
with:
@@ -158,27 +146,41 @@ jobs:
CI_TOTAL_RUNS: "5"
run: |
uv run make ci-test
- uses: ./.github/actions/test-results
if: ${{ always() }}
- if: ${{ always() }}
uses: codecov/codecov-action@v5
with:
flags: unit
use_oidc: true
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: unit
file: unittest.xml
use_oidc: true
test-integration:
runs-on: ubuntu-latest
timeout-minutes: 30
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Create k8s Kind Cluster
uses: helm/kind-action@92086f6be054225fa813e0a4b13787fc9088faab # v1.13.0
uses: helm/kind-action@v1.12.0
- name: run integration
run: |
uv run coverage run manage.py test tests/integration
uv run coverage xml
- uses: ./.github/actions/test-results
if: ${{ always() }}
- if: ${{ always() }}
uses: codecov/codecov-action@v5
with:
flags: integration
use_oidc: true
- if: ${{ !cancelled() }}
uses: codecov/test-results-action@v1
with:
flags: integration
file: unittest.xml
use_oidc: true
test-e2e:
name: test-e2e (${{ matrix.job.name }})
runs-on: ubuntu-latest
@@ -197,25 +199,21 @@ jobs:
glob: tests/e2e/test_provider_saml* tests/e2e/test_source_saml*
- name: ldap
glob: tests/e2e/test_provider_ldap* tests/e2e/test_source_ldap*
- name: ws-fed
glob: tests/e2e/test_provider_ws_fed*
- name: radius
glob: tests/e2e/test_provider_radius*
- name: scim
glob: tests/e2e/test_source_scim*
- name: flows
glob: tests/e2e/test_flows*
- name: endpoints
glob: tests/e2e/test_endpoints_*
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Setup e2e env (chrome, etc)
run: |
docker compose -f tests/e2e/compose.yml up -d --quiet-pull
docker compose -f tests/e2e/docker-compose.yml up -d --quiet-pull
- id: cache-web
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v4
uses: actions/cache@v4
with:
path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
@@ -231,63 +229,21 @@ jobs:
run: |
uv run coverage run manage.py test ${{ matrix.job.glob }}
uv run coverage xml
- uses: ./.github/actions/test-results
if: ${{ always() }}
- if: ${{ always() }}
uses: codecov/codecov-action@v5
with:
flags: e2e
test-openid-conformance:
name: test-openid-conformance (${{ matrix.job.name }})
runs-on: ubuntu-latest
timeout-minutes: 30
strategy:
fail-fast: false
matrix:
job:
- name: basic
glob: tests/openid_conformance/test_basic.py
- name: implicit
glob: tests/openid_conformance/test_implicit.py
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Setup e2e env (chrome, etc)
run: |
docker compose -f tests/e2e/compose.yml up -d --quiet-pull
- name: Setup conformance suite
run: |
docker compose -f tests/openid_conformance/compose.yml up -d --quiet-pull
- id: cache-web
uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # v4
with:
path: web/dist
key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
- name: prepare web ui
if: steps.cache-web.outputs.cache-hit != 'true'
working-directory: web
run: |
npm ci
make -C .. gen-client-ts
npm run build
npm run build:sfe
- name: run conformance
run: |
uv run coverage run manage.py test ${{ matrix.job.glob }}
uv run coverage xml
- uses: ./.github/actions/test-results
if: ${{ always() }}
with:
flags: conformance
use_oidc: true
- if: ${{ !cancelled() }}
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6
uses: codecov/test-results-action@v1
with:
name: conformance-certification-${{ matrix.job.name }}
path: tests/openid_conformance/exports/
flags: e2e
file: unittest.xml
use_oidc: true
ci-core-mark:
if: always()
needs:
- lint
- test-gen-build
- test-migrations
- test-migrations-from-stable
- test-unittest
@@ -295,7 +251,7 @@ jobs:
- test-e2e
runs-on: ubuntu-latest
steps:
- uses: re-actors/alls-green@05ac9388f0aebcb5727afa17fcccfecd6f8ec5fe # release/v1
- uses: re-actors/alls-green@release/v1
with:
jobs: ${{ toJSON(needs) }}
build:
@@ -308,7 +264,7 @@ jobs:
# Needed for checkout
contents: read
needs: ci-core-mark
uses: ./.github/workflows/_reusable-docker-build.yml
uses: ./.github/workflows/_reusable-docker-build.yaml
secrets: inherit
with:
image_name: ${{ github.repository == 'goauthentik/authentik-internal' && 'ghcr.io/goauthentik/internal-server' || 'ghcr.io/goauthentik/dev-server' }}
@@ -323,7 +279,7 @@ jobs:
pull-requests: write
timeout-minutes: 120
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: prepare variables


@@ -12,17 +12,12 @@ on:
- main
- version-*
env:
POSTGRES_DB: authentik
POSTGRES_USER: authentik
POSTGRES_PASSWORD: "EK-5jnKfjrGRm<77"
jobs:
lint-golint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6
- uses: actions/checkout@v5
- uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- name: Prepare and generate API
@@ -34,7 +29,7 @@ jobs:
- name: Generate API
run: make gen-client-go
- name: golangci-lint
uses: golangci/golangci-lint-action@1e7e51e771db61008b38414a730f564565cf7c20 # v8
uses: golangci/golangci-lint-action@v8
with:
version: latest
args: --timeout 5000s --verbose
@@ -42,17 +37,14 @@ jobs:
test-unittest:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6
- uses: actions/checkout@v5
- uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Generate API
run: make gen-client-go
- name: prepare database
run: |
uv run make migrate
- name: Go unittests
run: |
go test -timeout 0 -v -race -coverprofile=coverage.out -covermode=atomic -cover ./...
@@ -63,7 +55,7 @@ jobs:
- test-unittest
runs-on: ubuntu-latest
steps:
- uses: re-actors/alls-green@05ac9388f0aebcb5727afa17fcccfecd6f8ec5fe # release/v1
- uses: re-actors/alls-green@release/v1
with:
jobs: ${{ toJSON(needs) }}
build-container:
@@ -86,13 +78,13 @@ jobs:
id-token: write
attestations: write
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
ref: ${{ github.event.pull_request.head.sha }}
- name: Set up QEMU
uses: docker/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
uses: docker/setup-qemu-action@v3.6.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -102,7 +94,7 @@ jobs:
image-name: ghcr.io/goauthentik/dev-${{ matrix.type }}
- name: Login to Container Registry
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@@ -111,10 +103,10 @@ jobs:
run: make gen-client-go
- name: Build Docker Image
id: push
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
uses: docker/build-push-action@v6
with:
tags: ${{ steps.ev.outputs.imageTags }}
file: lifecycle/container/${{ matrix.type }}.Dockerfile
file: ${{ matrix.type }}.Dockerfile
push: ${{ steps.ev.outputs.shouldPush == 'true' }}
build-args: |
GIT_BUILD_HASH=${{ steps.ev.outputs.sha }}
@@ -122,7 +114,7 @@ jobs:
context: .
cache-from: type=registry,ref=ghcr.io/goauthentik/dev-${{ matrix.type }}:buildcache
cache-to: ${{ steps.ev.outputs.shouldPush == 'true' && format('type=registry,ref=ghcr.io/goauthentik/dev-{0}:buildcache,mode=max', matrix.type) || '' }}
- uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3
- uses: actions/attest-build-provenance@v2
id: attest
if: ${{ steps.ev.outputs.shouldPush == 'true' }}
with:
@@ -145,13 +137,13 @@ jobs:
goos: [linux]
goarch: [amd64, arm64]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
ref: ${{ github.event.pull_request.head.sha }}
- uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6
- uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"


@@ -31,8 +31,8 @@ jobs:
- command: lit-analyse
project: web
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/checkout@v5
- uses: actions/setup-node@v4
with:
node-version-file: ${{ matrix.project }}/package.json
cache: "npm"
@@ -48,8 +48,8 @@ jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/checkout@v5
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
@@ -68,7 +68,7 @@ jobs:
- lint
runs-on: ubuntu-latest
steps:
- uses: re-actors/alls-green@05ac9388f0aebcb5727afa17fcccfecd6f8ec5fe # release/v1
- uses: re-actors/alls-green@release/v1
with:
jobs: ${{ toJSON(needs) }}
test:
@@ -76,8 +76,8 @@ jobs:
- ci-web-mark
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/checkout@v5
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"


@@ -29,32 +29,32 @@ jobs:
github.event.pull_request.head.repo.full_name == github.repository)
steps:
- id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: tibdex/github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v5
with:
token: ${{ steps.generate_token.outputs.token }}
- name: Compress images
id: compress
uses: calibreapp/image-actions@d9c8ee5c3dc52ae4622c82ead88d658f4b16b65f # main
uses: calibreapp/image-actions@main
with:
GITHUB_TOKEN: ${{ steps.generate_token.outputs.token }}
githubToken: ${{ steps.generate_token.outputs.token }}
compressOnly: ${{ github.event_name != 'pull_request' }}
- uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
- uses: peter-evans/create-pull-request@v7
if: "${{ github.event_name != 'pull_request' && steps.compress.outputs.markdown != '' }}"
id: cpr
with:
token: ${{ steps.generate_token.outputs.token }}
title: "*: Auto compress images"
branch-suffix: timestamp
commit-message: "*: compress images"
commit-messsage: "*: compress images"
body: ${{ steps.compress.outputs.markdown }}
delete-branch: true
signoff: true
labels: dependencies
- uses: peter-evans/enable-pull-request-automerge@a660677d5469627102a1c1e11409dd063606628d # v3
- uses: peter-evans/enable-pull-request-automerge@v3
if: "${{ github.event_name != 'pull_request' && steps.compress.outputs.markdown != '' }}"
with:
token: ${{ steps.generate_token.outputs.token }}


@@ -16,17 +16,17 @@ jobs:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: tibdex/github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v5
with:
token: ${{ steps.generate_token.outputs.token }}
- name: Setup authentik env
uses: ./.github/actions/setup
- run: uv run ak update_webauthn_mds
- uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
- uses: peter-evans/create-pull-request@v7
id: cpr
with:
token: ${{ steps.generate_token.outputs.token }}
@@ -39,7 +39,7 @@ jobs:
# ID from https://api.github.com/users/authentik-automation[bot]
author: authentik-automation[bot] <135050075+authentik-automation[bot]@users.noreply.github.com>
labels: dependencies
- uses: peter-evans/enable-pull-request-automerge@a660677d5469627102a1c1e11409dd063606628d # v3
- uses: peter-evans/enable-pull-request-automerge@v3
with:
token: ${{ steps.generate_token.outputs.token }}
pull-request-number: ${{ steps.cpr.outputs.pull-request-number }}


@@ -1,36 +0,0 @@
name: GH - Cherry-pick
on:
pull_request_target:
types: [closed, labeled]
jobs:
cherry-pick:
runs-on: ubuntu-latest
steps:
- id: app-token
name: Generate app token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
if: ${{ env.GH_APP_ID != '' }}
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
env:
GH_APP_ID: ${{ secrets.GH_APP_ID }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
if: ${{ steps.app-token.outcome != 'skipped' }}
with:
fetch-depth: 0
token: "${{ steps.app-token.outputs.token }}"
- id: get-user-id
if: ${{ steps.app-token.outcome != 'skipped' }}
name: Get GitHub app user ID
run: echo "user-id=$(gh api "/users/${{ steps.app-token.outputs.app-slug }}[bot]" --jq .id)" >> "$GITHUB_OUTPUT"
env:
GH_TOKEN: "${{ steps.app-token.outputs.token }}"
- uses: ./.github/actions/cherry-pick
if: ${{ steps.app-token.outcome != 'skipped' }}
with:
token: ${{ steps.app-token.outputs.token }}
git_user: ${{ steps.app-token.outputs.app-slug }}[bot]
git_user_email: '${{ steps.get-user-id.outputs.user-id }}+${{ steps.app-token.outputs.app-slug }}[bot]@users.noreply.github.com'


@@ -16,7 +16,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out code
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
uses: actions/checkout@v5
- name: Cleanup
run: |


@@ -16,10 +16,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: tibdex/github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- name: Delete 'dev' containers older than a week
uses: snok/container-retention-policy@3b0972b2276b171b212f8c4efbca59ebba26eceb # v3.0.1
with:


@@ -5,18 +5,13 @@ on:
push:
branches: [main]
paths:
- packages/tsconfig/**
- packages/docusaurus-config/**
- packages/eslint-config/**
- packages/prettier-config/**
- packages/docusaurus-config/**
- packages/tsconfig/**
- packages/esbuild-plugin-live-reload/**
workflow_dispatch:
permissions:
# Required for NPM OIDC trusted publisher
id-token: write
contents: read
jobs:
publish:
runs-on: ubuntu-latest
@@ -24,28 +19,25 @@ jobs:
fail-fast: false
matrix:
package:
# The order of the `*config` packages should not be changed, as they depend on each other.
- packages/tsconfig
- packages/docusaurus-config
- packages/eslint-config
- packages/prettier-config
- packages/docusaurus-config
- packages/tsconfig
- packages/esbuild-plugin-live-reload
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
fetch-depth: 2
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/setup-node@v4
with:
node-version-file: ${{ matrix.package }}/package.json
registry-url: "https://registry.npmjs.org"
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@8cba46e29c11878d930bca7870bb54394d3e8b21 # 24d32ffd492484c1d75e0c0b894501ddb9d30d62
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c
with:
files: |
${{ matrix.package }}/package.json
- name: Install Dependencies
run: npm ci
- name: Publish package
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ${{ matrix.package }}
@@ -53,3 +45,5 @@ jobs:
npm ci
npm run build
npm publish
env:
NODE_AUTH_TOKEN: ${{ secrets.NPM_PUBLISH_TOKEN }}


@@ -24,14 +24,14 @@ jobs:
language: ["go", "javascript", "python"]
steps:
- name: Checkout repository
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
uses: actions/checkout@v5
- name: Setup authentik env
uses: ./.github/actions/setup
- name: Initialize CodeQL
uses: github/codeql-action/init@v4
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
- name: Autobuild
uses: github/codeql-action/autobuild@v4
uses: github/codeql-action/autobuild@v3
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v4
uses: github/codeql-action/analyze@v3


@@ -26,5 +26,5 @@ jobs:
image: semgrep/semgrep
if: (github.actor != 'dependabot[bot]')
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- run: semgrep ci


@@ -29,12 +29,12 @@ jobs:
steps:
- id: app-token
name: Generate app token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: actions/create-github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- name: Checkout main
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
uses: actions/checkout@v5
with:
ref: main
token: "${{ steps.app-token.outputs.token }}"
@@ -43,13 +43,10 @@ jobs:
with:
dependencies: python
- name: Create version branch
env:
GH_TOKEN: "${{ steps.app-token.outputs.token }}"
run: |
current_major_version="$(uv version --short | grep -oE "^[0-9]{4}\.[0-9]{1,2}")"
git checkout -b "version-${current_major_version}"
git push origin "version-${current_major_version}"
gh label create "backport/version-${current_major_version}" --description "Add this label to PRs to backport changes to version-${current_major_version}" --color "fbca04"
bump-version-pr:
name: Open version bump PR
needs:
@@ -57,12 +54,12 @@ jobs:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: tibdex/github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- name: Checkout main
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
uses: actions/checkout@v5
with:
ref: main
token: ${{ steps.generate_token.outputs.token }}
@@ -73,7 +70,7 @@ jobs:
- name: Bump version
run: "make bump version=${{ inputs.next_version }}.0-rc1"
- name: Create pull request
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
uses: peter-evans/create-pull-request@v7
with:
token: ${{ steps.generate_token.outputs.token }}
branch: release-bump-${{ inputs.next_version }}


@@ -15,7 +15,7 @@ jobs:
runs-on: ubuntu-latest
environment: internal-production
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
ref: main
- run: |


@@ -7,7 +7,7 @@ on:
jobs:
build-server:
uses: ./.github/workflows/_reusable-docker-build.yml
uses: ./.github/workflows/_reusable-docker-build.yaml
secrets: inherit
permissions:
contents: read
@@ -31,11 +31,11 @@ jobs:
id-token: write
attestations: write
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Set up QEMU
uses: docker/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
uses: docker/setup-qemu-action@v3.6.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -44,21 +44,21 @@ jobs:
with:
image-name: ghcr.io/goauthentik/docs
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image
id: push
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
uses: docker/build-push-action@v6
with:
tags: ${{ steps.ev.outputs.imageTags }}
file: website/Dockerfile
push: true
platforms: linux/amd64,linux/arm64
context: .
- uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3
- uses: actions/attest-build-provenance@v2
id: attest
if: true
with:
@@ -83,19 +83,19 @@ jobs:
- radius
- rac
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6
- uses: actions/checkout@v5
- uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/setup-node@v5
with:
node-version-file: web/package.json
cache: "npm"
cache-dependency-path: web/package-lock.json
- name: Set up QEMU
uses: docker/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3.7.0
uses: docker/setup-qemu-action@v3.6.0
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
uses: docker/setup-buildx-action@v3
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -108,28 +108,28 @@ jobs:
make gen-client-ts
make gen-client-go
- name: Docker Login Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_CORP_USERNAME }}
password: ${{ secrets.DOCKER_CORP_PASSWORD }}
- name: Login to GitHub Container Registry
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build Docker Image
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
uses: docker/build-push-action@v6
id: push
with:
push: true
build-args: |
VERSION=${{ github.ref }}
tags: ${{ steps.ev.outputs.imageTags }}
file: lifecycle/container/${{ matrix.type }}.Dockerfile
file: ${{ matrix.type }}.Dockerfile
platforms: linux/amd64,linux/arm64
context: .
- uses: actions/attest-build-provenance@96278af6caaf10aea03fd8d33a09a777ca52d62f # v3
- uses: actions/attest-build-provenance@v2
id: attest
with:
subject-name: ${{ steps.ev.outputs.attestImageNames }}
@@ -151,11 +151,11 @@ jobs:
goos: [linux, darwin]
goarch: [amd64, arm64]
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/setup-go@7a3fe6cf4cb3a834922a1244abfce67bcef6a0c5 # v6
- uses: actions/checkout@v5
- uses: actions/setup-go@v5
with:
go-version-file: "go.mod"
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v5
- uses: actions/setup-node@v4
with:
node-version-file: web/package.json
cache: "npm"
@@ -180,7 +180,7 @@ jobs:
export CGO_ENABLED=0
go build -tags=outpost_static_embed -v -o ./authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{ matrix.goarch }} ./cmd/${{ matrix.type }}
- name: Upload binaries to release
uses: svenstaro/upload-release-action@6b7fa9f267e90b50a19fef07b3596790bb941741 # v2
uses: svenstaro/upload-release-action@v2
with:
repo_token: ${{ secrets.GITHUB_TOKEN }}
file: ./authentik-outpost-${{ matrix.type }}_${{ matrix.goos }}_${{ matrix.goarch }}
@@ -198,8 +198,8 @@ jobs:
AWS_REGION: eu-central-1
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: aws-actions/configure-aws-credentials@8df5847569e6427dd6c4fb1cf565c83acfa8afa7 # v6.0.0
- uses: actions/checkout@v5
- uses: aws-actions/configure-aws-credentials@v4
with:
role-to-assume: "arn:aws:iam::016170277896:role/github_goauthentik_authentik"
aws-region: ${{ env.AWS_REGION }}
@@ -214,15 +214,15 @@ jobs:
- build-outpost-binary
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: Run test suite in final docker images
run: |
echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> lifecycle/container/.env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> lifecycle/container/.env
docker compose -f lifecycle/container/compose.yml pull -q
docker compose -f lifecycle/container/compose.yml up --no-start
docker compose -f lifecycle/container/compose.yml start postgresql
docker compose -f lifecycle/container/compose.yml run -u root server test-all
echo "PG_PASS=$(openssl rand 32 | base64 -w 0)" >> .env
echo "AUTHENTIK_SECRET_KEY=$(openssl rand 32 | base64 -w 0)" >> .env
docker compose pull -q
docker compose up --no-start
docker compose start postgresql redis
docker compose run -u root server test-all
sentry-release:
needs:
- build-server
@@ -230,7 +230,7 @@ jobs:
- build-outpost-binary
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
- name: prepare variables
uses: ./.github/actions/docker-push-variables
id: ev
@@ -244,7 +244,7 @@ jobs:
container=$(docker container create ${{ steps.ev.outputs.imageMainName }})
docker cp ${container}:web/ .
- name: Create a Sentry.io release
uses: getsentry/action-release@dab6548b3c03c4717878099e43782cf5be654289 # v3
uses: getsentry/action-release@v3
continue-on-error: true
env:
SENTRY_AUTH_TOKEN: ${{ secrets.SENTRY_AUTH_TOKEN }}


@@ -35,10 +35,8 @@ jobs:
echo "major_version=${{ inputs.version }}" | grep -oE "^major_version=[0-9]{4}\.[0-9]{1,2}" >> "$GITHUB_OUTPUT"
- id: changelog-url
run: |
if [ "${{ inputs.release_reason }}" = "feature" ]; then
if [ "${{ inputs.release_reason }}" = "feature" ] || [ "${{ inputs.release_reason }}" = "prerelease" ]; then
changelog_url="https://docs.goauthentik.io/docs/releases/${{ steps.check.outputs.major_version }}"
elif [ "${{ inputs.release_reason }}" = "prerelease" ]; then
changelog_url="https://next.goauthentik.io/docs/releases/${{ steps.check.outputs.major_version }}"
else
changelog_url="https://docs.goauthentik.io/docs/releases/${{ steps.check.outputs.major_version }}#fixed-in-$(echo -n ${{ inputs.version }} | sed 's/\.//g')"
fi
@@ -52,7 +50,7 @@ jobs:
needs:
- check-inputs
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
ref: "version-${{ needs.check-inputs.outputs.major_version }}"
- name: Setup authentik env
@@ -67,7 +65,7 @@ jobs:
steps:
- id: app-token
name: Generate app token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: actions/create-github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
@@ -76,7 +74,7 @@ jobs:
run: echo "user-id=$(gh api "/users/${{ steps.app-token.outputs.app-slug }}[bot]" --jq .id)" >> "$GITHUB_OUTPUT"
env:
GH_TOKEN: "${{ steps.app-token.outputs.token }}"
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
ref: "version-${{ needs.check-inputs.outputs.major_version }}"
token: "${{ steps.app-token.outputs.token }}"
@@ -91,11 +89,12 @@ jobs:
# ID from https://api.github.com/users/authentik-automation[bot]
git config --global user.name '${{ steps.app-token.outputs.app-slug }}[bot]'
git config --global user.email '${{ steps.get-user-id.outputs.user-id }}+${{ steps.app-token.outputs.app-slug }}[bot]@users.noreply.github.com'
git pull
git commit -a -m "release: ${{ inputs.version }}" --allow-empty
git tag "version/${{ inputs.version }}" HEAD -m "version/${{ inputs.version }}"
git push --follow-tags
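The commit/tag/push sequence above (including the `git pull` added by the cherry-pick in this branch) can be exercised against a throwaway local remote; every path here is a scratch directory, not the real workflow environment:

```shell
set -eu
VERSION=2025.8.6
tmp=$(mktemp -d)
git init -q --bare "$tmp/origin.git"           # stand-in for the GitHub remote
git init -q "$tmp/work"
cd "$tmp/work"
git config user.name 'authentik-automation[bot]'
git config user.email 'bot@example.com'
git remote add origin "$tmp/origin.git"
git commit -q --allow-empty -m "init"
git push -q -u origin HEAD
git pull -q                                    # pick up anything pushed since checkout
git commit -qa -m "release: ${VERSION}" --allow-empty
git tag "version/${VERSION}" HEAD -m "version/${VERSION}"
git push -q --follow-tags                      # commit and annotated tag in one push
```

`--follow-tags` pushes annotated tags reachable from the pushed commits, which is why the tag is created with `-m` (annotated) rather than lightweight.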
- name: Create Release
uses: softprops/action-gh-release@a06a81a03ee405af7f2048a818ed3f03bbf83c7b # v2.5.0
uses: softprops/action-gh-release@v2
with:
token: "${{ steps.app-token.outputs.token }}"
tag_name: "version/${{ inputs.version }}"
@@ -114,7 +113,7 @@ jobs:
steps:
- id: app-token
name: Generate app token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: actions/create-github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
@@ -124,7 +123,7 @@ jobs:
run: echo "user-id=$(gh api "/users/${{ steps.app-token.outputs.app-slug }}[bot]" --jq .id)" >> "$GITHUB_OUTPUT"
env:
GH_TOKEN: "${{ steps.app-token.outputs.token }}"
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
repository: "${{ github.repository_owner }}/helm"
token: "${{ steps.app-token.outputs.token }}"
@@ -136,7 +135,7 @@ jobs:
sed -E -i 's/[0-9]{4}\.[0-9]{1,2}\.[0-9]+$/${{ inputs.version }}/' charts/authentik/Chart.yaml
./scripts/helm-docs.sh
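The `sed -E` above pins every trailing `x.y.z` version in `Chart.yaml` to the release being cut. A self-contained sketch with an illustrative chart file:

```shell
chart=$(mktemp)
printf 'apiVersion: v2\nversion: 2025.8.5\nappVersion: 2025.8.5\n' > "$chart"
version=2025.8.6
# The $ anchor restricts the replacement to a full x.y.z at end of line,
# so values like "v2" on the apiVersion line are left untouched.
sed -E -i "s/[0-9]{4}\.[0-9]{1,2}\.[0-9]+$/${version}/" "$chart"
cat "$chart"
```

Both `version` and `appVersion` end up at `2025.8.6`; `apiVersion: v2` is unchanged.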
- name: Create pull request
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
uses: peter-evans/create-pull-request@v7
with:
token: "${{ steps.app-token.outputs.token }}"
branch: bump-${{ inputs.version }}
@@ -156,7 +155,7 @@ jobs:
steps:
- id: app-token
name: Generate app token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: actions/create-github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
@@ -166,7 +165,7 @@ jobs:
run: echo "user-id=$(gh api "/users/${{ steps.app-token.outputs.app-slug }}[bot]" --jq .id)" >> "$GITHUB_OUTPUT"
env:
GH_TOKEN: "${{ steps.app-token.outputs.token }}"
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
with:
repository: "${{ github.repository_owner }}/version"
token: "${{ steps.app-token.outputs.token }}"
@@ -191,7 +190,7 @@ jobs:
'.stable.version = $version | .stable.changelog = $changelog | .stable.changelog_url = $changelog_url' version.json > version.new.json
mv version.new.json version.json
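The jq filter above rewrites three fields of `version.json` in one pass. A sketch against an illustrative file; `--arg` is the assumed mechanism for binding `$version` and friends, since the workflow lines that bind them are outside this hunk:

```shell
f=$(mktemp)
echo '{"stable": {"version": "2025.8.5", "changelog": "", "changelog_url": ""}}' > "$f"
jq --arg version "2025.8.6" \
   --arg changelog "release: 2025.8.6" \
   --arg changelog_url "https://docs.goauthentik.io/docs/releases/2025.8" \
   '.stable.version = $version | .stable.changelog = $changelog | .stable.changelog_url = $changelog_url' \
   "$f" > "$f.new"
mv "$f.new" "$f"            # same write-then-rename as the workflow step
```

Writing to a temporary file and renaming avoids truncating `version.json` while jq is still reading it.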
- name: Create pull request
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
uses: peter-evans/create-pull-request@v7
with:
token: "${{ steps.app-token.outputs.token }}"
branch: bump-${{ inputs.version }}


@@ -15,11 +15,11 @@ jobs:
runs-on: ubuntu-latest
steps:
- id: generate_token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: tibdex/github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/stale@997185467fa4f803885201cee163a9f38240193d # v10
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/stale@v9
with:
repo-token: ${{ steps.generate_token.outputs.token }}
days-before-stale: 60


@@ -20,14 +20,14 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Find Comment
uses: peter-evans/find-comment@b30e6a3c0ed37e7c023ccd3f1db5c6c0b0c23aad # v4
uses: peter-evans/find-comment@v3
id: fc
with:
issue-number: ${{ github.event.pull_request.number }}
comment-author: "github-actions[bot]"
body-includes: authentik translations instructions
- name: Create or update comment
uses: peter-evans/create-or-update-comment@e8674b075228eee787fea43ef493e45ece1004c9 # v5
uses: peter-evans/create-or-update-comment@v4
with:
comment-id: ${{ steps.fc.outputs.comment-id }}
issue-number: ${{ github.event.pull_request.number }}


@@ -21,15 +21,15 @@ jobs:
steps:
- id: generate_token
if: ${{ github.event_name != 'pull_request' }}
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2
uses: tibdex/github-app-token@v2
with:
app-id: ${{ secrets.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- uses: actions/checkout@v5
if: ${{ github.event_name != 'pull_request' }}
with:
token: ${{ steps.generate_token.outputs.token }}
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v5
- uses: actions/checkout@v5
if: ${{ github.event_name == 'pull_request' }}
- name: Setup authentik env
uses: ./.github/actions/setup
@@ -44,7 +44,7 @@ jobs:
make web-check-compile
- name: Create Pull Request
if: ${{ github.event_name != 'pull_request' }}
uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v7
uses: peter-evans/create-pull-request@v7
with:
token: ${{ steps.generate_token.outputs.token }}
branch: extract-compile-backend-translation


@@ -0,0 +1,41 @@
---
# Rename transifex pull requests to have a correct naming
# Also enables auto squash-merge
name: Translation - Auto-rename Transifex PRs
on:
pull_request:
types: [opened, reopened]
permissions:
# Permission to rename PR
pull-requests: write
jobs:
rename_pr:
runs-on: ubuntu-latest
if: ${{ github.event.pull_request.user.login == 'transifex-integration[bot]'}}
steps:
- uses: actions/checkout@v5
- id: generate_token
uses: tibdex/github-app-token@v2
with:
app_id: ${{ secrets.GH_APP_ID }}
private_key: ${{ secrets.GH_APP_PRIVATE_KEY }}
- name: Get current title
id: title
env:
GH_TOKEN: ${{ steps.generate_token.outputs.token }}
run: |
title=$(gh pr view ${{ github.event.pull_request.number }} --json "title" -q ".title")
echo "title=${title}" >> "$GITHUB_OUTPUT"
- name: Rename
env:
GH_TOKEN: ${{ steps.generate_token.outputs.token }}
run: |
gh pr edit ${{ github.event.pull_request.number }} -t "translate: ${{ steps.title.outputs.title }}" --add-label dependencies
- uses: peter-evans/enable-pull-request-automerge@v3
with:
token: ${{ steps.generate_token.outputs.token }}
pull-request-number: ${{ github.event.pull_request.number }}
merge-method: squash

.gitignore vendored

@@ -72,7 +72,7 @@ unittest.xml
# Translations
# Have to include binary mo files as they are annoying to compile at build time
# since a full postgres instance is required
# since a full postgres and redis instance are required
# *.mo
# Django stuff:
@@ -211,5 +211,4 @@ source_docs/
/vendor/
### Docker ###
tests/openid_conformance/exports/*.zip
compose.override.yml
docker-compose.override.yml


@@ -26,10 +26,6 @@ website/api/reference
node_modules
coverage
## Vendored files
vendored
*.min.js
## Configs
*.log
*.yaml

.vscode/settings.json vendored

@@ -1,19 +1,4 @@
{
"[css]": {
"editor.minimap.markSectionHeaderRegex": "#\\bregion\\s*(?<separator>-?)\\s*(?<label>.*)\\*/$"
},
"[makefile]": {
"editor.minimap.markSectionHeaderRegex": "^#{25}\n##\\s\\s*(?<separator>-?)\\s*(?<label>[^\n]*)\n#{25}$"
},
"[dockerfile]": {
"editor.minimap.markSectionHeaderRegex": "\\bStage\\s*\\d:(?<separator>-?)\\s*(?<label>.*)$"
},
"[jsonc]": {
"editor.minimap.markSectionHeaderRegex": "#\\bregion\\s*(?<separator>-?)\\s*(?<label>.*)$"
},
"[xml]": {
"editor.minimap.markSectionHeaderRegex": "<!--\\s*#\\bregion\\s*(?<separator>-?)\\s*(?<label>.*)\\s*-->"
},
"todo-tree.tree.showCountsInTree": true,
"todo-tree.tree.showBadges": true,
"yaml.customTags": [
@@ -52,9 +37,6 @@
"go.testFlags": [
"-count=1"
],
"go.testEnvVars": {
"WORKSPACE_DIR": "${workspaceFolder}"
},
"github-actions.workflows.pinned.workflows": [
".github/workflows/ci-main.yml"
]


@@ -16,20 +16,16 @@ go.sum @goauthentik/backend
# Infrastructure
.github/ @goauthentik/infrastructure
lifecycle/aws/ @goauthentik/infrastructure
lifecycle/container/ @goauthentik/infrastructure
Dockerfile @goauthentik/infrastructure
*Dockerfile @goauthentik/infrastructure
.dockerignore @goauthentik/infrastructure
docker-compose.yml @goauthentik/infrastructure
Makefile @goauthentik/infrastructure
.editorconfig @goauthentik/infrastructure
CODEOWNERS @goauthentik/infrastructure
# Backend packages
packages/django-channels-postgres @goauthentik/backend
packages/django-postgres-cache @goauthentik/backend
packages/django-dramatiq-postgres @goauthentik/backend
# Web packages
package.json @goauthentik/frontend
package-lock.json @goauthentik/frontend
packages/package.json @goauthentik/frontend
packages/package-lock.json @goauthentik/frontend
packages/docusaurus-config @goauthentik/frontend
packages/esbuild-plugin-live-reload @goauthentik/frontend
packages/eslint-config @goauthentik/frontend
@@ -37,12 +33,17 @@ packages/prettier-config @goauthentik/frontend
packages/tsconfig @goauthentik/frontend
# Web
web/ @goauthentik/frontend
tests/wdio/ @goauthentik/frontend
# Locale
/locale/ @goauthentik/backend @goauthentik/frontend
locale/ @goauthentik/backend @goauthentik/frontend
web/xliff/ @goauthentik/backend @goauthentik/frontend
# Docs
# Docs & Website
docs/ @goauthentik/docs
# TODO Remove after moving website to docs
website/ @goauthentik/docs
CODE_OF_CONDUCT.md @goauthentik/docs
# Security
SECURITY.md @goauthentik/security @goauthentik/docs
# TODO Remove after moving website to docs
website/security/ @goauthentik/security @goauthentik/docs
docs/security/ @goauthentik/security @goauthentik/docs


@@ -1,7 +1,7 @@
# syntax=docker/dockerfile:1
# Stage 1: Build webui
FROM --platform=${BUILDPLATFORM} docker.io/library/node:24-trixie-slim@sha256:45babd1b4ce0349fb12c4e24bf017b90b96d52806db32e001e3013f341bef0fe AS node-builder
FROM --platform=${BUILDPLATFORM} docker.io/library/node:24-slim AS node-builder
ARG GIT_BUILD_HASH
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
@@ -26,7 +26,7 @@ RUN npm run build && \
npm run build:sfe
# Stage 2: Build go proxy
FROM --platform=${BUILDPLATFORM} docker.io/library/golang:1.25.5-trixie@sha256:8e8f9c84609b6005af0a4a8227cee53d6226aab1c6dcb22daf5aeeb8b05480e1 AS go-builder
FROM --platform=${BUILDPLATFORM} docker.io/library/golang:1.25-bookworm AS go-builder
ARG TARGETOS
ARG TARGETARCH
@@ -65,7 +65,7 @@ RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \
go build -o /go/authentik ./cmd/server
# Stage 3: MaxMind GeoIP
FROM --platform=${BUILDPLATFORM} ghcr.io/maxmind/geoipupdate:v7.1.1@sha256:faecdca22579730ab0b7dea5aa9af350bb3c93cb9d39845c173639ead30346d2 AS geoip
FROM --platform=${BUILDPLATFORM} ghcr.io/maxmind/geoipupdate:v7.1.1 AS geoip
ENV GEOIPUPDATE_EDITION_IDS="GeoLite2-City GeoLite2-ASN"
ENV GEOIPUPDATE_VERBOSE="1"
@@ -78,9 +78,9 @@ RUN --mount=type=secret,id=GEOIPUPDATE_ACCOUNT_ID \
/bin/sh -c "GEOIPUPDATE_LICENSE_KEY_FILE=/run/secrets/GEOIPUPDATE_LICENSE_KEY /usr/bin/entry.sh || echo 'Failed to get GeoIP database, disabling'; exit 0"
# Stage 4: Download uv
FROM ghcr.io/astral-sh/uv:0.9.18@sha256:5713fa8217f92b80223bc83aac7db36ec80a84437dbc0d04bbc659cae030d8c9 AS uv
FROM ghcr.io/astral-sh/uv:0.8.8 AS uv
# Stage 5: Base python image
FROM ghcr.io/goauthentik/fips-python:3.14.2-slim-trixie-fips@sha256:46c0658052e43ad303da39e461ad106c499a03fabd3512d05ff586e506507242 AS python-base
FROM ghcr.io/goauthentik/fips-python:3.13.6-slim-bookworm-fips AS python-base
ENV VENV_PATH="/ak-root/.venv" \
PATH="/lifecycle:/ak-root/.venv/bin:$PATH" \
@@ -116,7 +116,7 @@ RUN --mount=type=cache,id=apt-$TARGETARCH$TARGETVARIANT,sharing=locked,target=/v
# postgresql
libpq-dev \
# python-kadmin-rs
krb5-multidev libkrb5-dev heimdal-multidev libclang-dev \
clang libkrb5-dev sccache \
# xmlsec
libltdl-dev && \
curl https://sh.rustup.rs -sSf | sh -s -- -y
@@ -141,7 +141,6 @@ ARG GIT_BUILD_HASH
ENV GIT_BUILD_HASH=$GIT_BUILD_HASH
LABEL org.opencontainers.image.authors="Authentik Security Inc." \
org.opencontainers.image.source="https://github.com/goauthentik/authentik" \
org.opencontainers.image.description="goauthentik.io Main server image, see https://goauthentik.io for more info." \
org.opencontainers.image.documentation="https://docs.goauthentik.io" \
org.opencontainers.image.licenses="https://github.com/goauthentik/authentik/blob/main/LICENSE" \
@@ -158,22 +157,17 @@ WORKDIR /
RUN apt-get update && \
apt-get upgrade -y && \
# Required for runtime
apt-get install -y --no-install-recommends \
libpq5 libmaxminddb0 ca-certificates \
krb5-multidev libkrb5-3 libkdb5-10 libkadm5clnt-mit12 \
heimdal-multidev libkadm5clnt7t64-heimdal \
libltdl7 libxslt1.1 && \
apt-get install -y --no-install-recommends libpq5 libmaxminddb0 ca-certificates libkrb5-3 libkadm5clnt-mit12 libkdb5-10 libltdl7 libxslt1.1 && \
# Required for bootstrap & healthcheck
apt-get install -y --no-install-recommends runit && \
pip3 install --no-cache-dir --upgrade pip && \
apt-get clean && \
rm -rf /tmp/* /var/lib/apt/lists/* /var/tmp/ && \
adduser --system --no-create-home --uid 1000 --group --home /authentik authentik && \
mkdir -p /certs /data /media /blueprints && \
ln -s /media /data/media && \
mkdir -p /certs /media /blueprints && \
mkdir -p /authentik/.ssh && \
mkdir -p /ak-root && \
chown authentik:authentik /certs /data /data/media /media /authentik/.ssh /ak-root
chown authentik:authentik /certs /media /authentik/.ssh /ak-root
COPY ./authentik/ /authentik
COPY ./pyproject.toml /

Makefile

@@ -5,59 +5,20 @@ SHELL := /usr/bin/env bash
PWD = $(shell pwd)
UID = $(shell id -u)
GID = $(shell id -g)
NPM_VERSION = $(shell python -m scripts.generate_semver)
PY_SOURCES = authentik packages tests scripts lifecycle .github
DOCKER_IMAGE ?= "authentik:test"
UNAME_S := $(shell uname -s)
ifeq ($(UNAME_S),Darwin)
SED_INPLACE = sed -i ''
else
SED_INPLACE = sed -i
endif
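The `SED_INPLACE` dispatch above exists because in-place editing differs between sed implementations. The same branch at shell level, runnable here (the Linux arm is what executes on this runner):

```shell
f=$(mktemp)
echo 'VERSION = "2025.8.5"' > "$f"
case "$(uname -s)" in
  # BSD/macOS sed: -i takes a mandatory (possibly empty) backup-suffix argument
  Darwin) sed -i '' 's/2025\.8\.5/2025.8.6/' "$f" ;;
  # GNU sed: -i takes no separate argument
  *)      sed -i    's/2025\.8\.5/2025.8.6/' "$f" ;;
esac
```

On macOS, plain `sed -i 's/…/…/' file` would instead treat the script as the backup suffix, which is why the Makefile cannot use one form for both platforms.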
GEN_API_TS = gen-ts-api
GEN_API_PY = gen-py-api
GEN_API_GO = gen-go-api
BREW_LDFLAGS :=
BREW_CPPFLAGS :=
BREW_PKG_CONFIG_PATH :=
pg_user := $(shell uv run python -m authentik.lib.config postgresql.user 2>/dev/null)
pg_host := $(shell uv run python -m authentik.lib.config postgresql.host 2>/dev/null)
pg_name := $(shell uv run python -m authentik.lib.config postgresql.name 2>/dev/null)
redis_db := $(shell uv run python -m authentik.lib.config redis.db 2>/dev/null)
UV := uv
# For macOS users, add the libxml2 installed from brew libxmlsec1 to the build path
# to prevent SAML-related tests from failing and ensure correct pip dependency compilation
ifeq ($(UNAME_S),Darwin)
# Only add for brew users who installed libxmlsec1
BREW_EXISTS := $(shell command -v brew 2> /dev/null)
ifdef BREW_EXISTS
LIBXML2_EXISTS := $(shell brew list libxml2 2> /dev/null)
ifdef LIBXML2_EXISTS
_xml_pref := $(shell brew --prefix libxml2)
BREW_LDFLAGS += -L${_xml_pref}/lib
BREW_CPPFLAGS += -I${_xml_pref}/include
BREW_PKG_CONFIG_PATH = ${_xml_pref}/lib/pkgconfig:$(PKG_CONFIG_PATH)
endif
KRB5_EXISTS := $(shell brew list krb5 2> /dev/null)
ifdef KRB5_EXISTS
_krb5_pref := $(shell brew --prefix krb5)
BREW_LDFLAGS += -L${_krb5_pref}/lib
BREW_CPPFLAGS += -I${_krb5_pref}/include
BREW_PKG_CONFIG_PATH = ${_krb5_pref}/lib/pkgconfig:$(PKG_CONFIG_PATH)
endif
UV := LDFLAGS="$(BREW_LDFLAGS)" CPPFLAGS="$(BREW_CPPFLAGS)" PKG_CONFIG_PATH="$(BREW_PKG_CONFIG_PATH)" uv
endif
endif
NPM_VERSION :=
UV_EXISTS := $(shell command -v uv 2> /dev/null)
ifdef UV_EXISTS
NPM_VERSION := $(shell $(UV) run python -m scripts.generate_semver)
else
NPM_VERSION = $(shell python -m scripts.generate_semver)
endif
all: lint-fix lint gen web test ## Lint, build, and test everything
all: lint-fix lint test gen web ## Lint, build, and test everything
HELP_WIDTH := $(shell grep -h '^[a-z][^ ]*:.*\#\#' $(MAKEFILE_LIST) 2>/dev/null | \
cut -d':' -f1 | awk '{printf "%d\n", length}' | sort -rn | head -1)
@@ -73,46 +34,40 @@ go-test:
go test -timeout 0 -v -race -cover ./...
test: ## Run the server tests and produce a coverage report (locally)
$(UV) run coverage run manage.py test --keepdb $(or $(filter-out $@,$(MAKECMDGOALS)),authentik)
$(UV) run coverage html
$(UV) run coverage report
uv run coverage run manage.py test --keepdb authentik
uv run coverage html
uv run coverage report
lint-fix: lint-codespell ## Lint and automatically fix errors in the python source code. Reports spelling errors.
$(UV) run black $(PY_SOURCES)
$(UV) run ruff check --fix $(PY_SOURCES)
uv run black $(PY_SOURCES)
uv run ruff check --fix $(PY_SOURCES)
lint-codespell: ## Reports spelling errors.
$(UV) run codespell -w
uv run codespell -w
lint: ci-bandit ci-mypy ## Lint the python and golang sources
lint: ## Lint the python and golang sources
uv run bandit -c pyproject.toml -r $(PY_SOURCES)
golangci-lint run -v
core-install:
ifdef ($(BREW_EXISTS))
# Clear cache to ensure fresh compilation
$(UV) cache clean
# Force compilation from source for lxml and xmlsec with correct environment
$(UV) sync --frozen --reinstall-package lxml --reinstall-package xmlsec --no-binary-package lxml --no-binary-package xmlsec
else
$(UV) sync --frozen
endif
uv sync --frozen
migrate: ## Run the Authentik Django server's migrations
$(UV) run python -m lifecycle.migrate
uv run python -m lifecycle.migrate
i18n-extract: core-i18n-extract web-i18n-extract ## Extract strings that require translation into files to send to a translation service
aws-cfn:
cd lifecycle/aws && npm i && $(UV) run npm run aws-cfn
cd lifecycle/aws && npm i && uv run npm run aws-cfn
run-server: ## Run the main authentik server process
$(UV) run ak server
uv run ak server
run-worker: ## Run the main authentik worker process
$(UV) run ak worker
uv run ak worker
core-i18n-extract:
$(UV) run ak makemessages \
uv run ak makemessages \
--add-location file \
--no-obsolete \
--ignore web \
@@ -125,17 +80,12 @@ core-i18n-extract:
install: node-install docs-install core-install ## Install all requires dependencies for `node`, `docs` and `core`
dev-drop-db:
$(eval pg_user := $(shell $(UV) run python -m authentik.lib.config postgresql.user 2>/dev/null))
$(eval pg_host := $(shell $(UV) run python -m authentik.lib.config postgresql.host 2>/dev/null))
$(eval pg_name := $(shell $(UV) run python -m authentik.lib.config postgresql.name 2>/dev/null))
dropdb -U ${pg_user} -h ${pg_host} ${pg_name} || true
# Also remove the test-db if it exists
dropdb -U ${pg_user} -h ${pg_host} test_${pg_name} || true
redis-cli -n ${redis_db} flushall
dev-create-db:
$(eval pg_user := $(shell $(UV) run python -m authentik.lib.config postgresql.user 2>/dev/null))
$(eval pg_host := $(shell $(UV) run python -m authentik.lib.config postgresql.host 2>/dev/null))
$(eval pg_name := $(shell $(UV) run python -m authentik.lib.config postgresql.name 2>/dev/null))
createdb -U ${pg_user} -h ${pg_host} ${pg_name}
dev-reset: dev-drop-db dev-create-db migrate ## Drop and restore the Authentik PostgreSQL instance to a "fresh install" state.
@@ -148,11 +98,11 @@ bump: ## Bump authentik version. Usage: make bump version=20xx.xx.xx
ifndef version
$(error Usage: make bump version=20xx.xx.xx )
endif
$(SED_INPLACE) 's/^version = ".*"/version = "$(version)"/' pyproject.toml
$(SED_INPLACE) 's/^VERSION = ".*"/VERSION = "$(version)"/' authentik/__init__.py
$(eval current_version := $(shell cat ${PWD}/internal/constants/VERSION))
sed -i 's/^version = ".*"/version = "$(version)"/' pyproject.toml
sed -i 's/^VERSION = ".*"/VERSION = "$(version)"/' authentik/__init__.py
$(MAKE) gen-build gen-compose aws-cfn
npm version --no-git-tag-version --allow-same-version $(version)
cd ${PWD}/web && npm version --no-git-tag-version --allow-same-version $(version)
sed -i "s/\"${current_version}\"/\"$(version)\"/" ${PWD}/package.json ${PWD}/package-lock.json ${PWD}/web/package.json ${PWD}/web/package-lock.json
echo -n $(version) > ${PWD}/internal/constants/VERSION
#########################
@@ -163,59 +113,72 @@ gen-build: ## Extract the schema from the database
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
$(UV) run ak build_schema
uv run ak make_blueprint_schema --file blueprints/schema.json
AUTHENTIK_DEBUG=true \
AUTHENTIK_TENANTS__ENABLED=true \
AUTHENTIK_OUTPOSTS__DISABLE_EMBEDDED_OUTPOST=true \
uv run ak spectacular --file schema.yml
gen-compose:
$(UV) run scripts/generate_compose.py
uv run scripts/generate_docker_compose.py
gen-changelog: ## (Release) generate the changelog based from the commits since the last tag
git log --pretty=format:" - %s" $(shell git describe --tags $(shell git rev-list --tags --max-count=1))...$(shell git branch --show-current) | sort > changelog.md
npx prettier --write changelog.md
gen-diff: ## (Release) generate the changelog diff between the current schema and the last tag
git show $(shell git describe --tags $(shell git rev-list --tags --max-count=1)):schema.yml > schema-old.yml
docker compose -f scripts/api/compose.yml run --rm --user "${UID}:${GID}" diff \
--markdown \
/local/diff.md \
/local/schema-old.yml \
/local/schema.yml
rm schema-old.yml
$(SED_INPLACE) 's/{/&#123;/g' diff.md
$(SED_INPLACE) 's/}/&#125;/g' diff.md
git show $(shell git describe --tags $(shell git rev-list --tags --max-count=1)):schema.yml > old_schema.yml
docker run \
--rm -v ${PWD}:/local \
--user ${UID}:${GID} \
docker.io/openapitools/openapi-diff:2.1.0-beta.8 \
--markdown /local/diff.md \
/local/old_schema.yml /local/schema.yml
rm old_schema.yml
sed -i 's/{/&#123;/g' diff.md
sed -i 's/}/&#125;/g' diff.md
npx prettier --write diff.md
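The two `sed` passes above escape `{` and `}` in the generated `diff.md` so the docs site's MDX renderer does not treat them as JSX expressions. A sketch on an illustrative file (note that in a sed replacement an unescaped `&` re-inserts the matched text, so the ampersand is written as `\&` here):

```shell
f=$(mktemp)
echo 'schema change: {"type": "string"}' > "$f"
sed -i 's/{/\&#123;/g' "$f"   # { -> &#123;
sed -i 's/}/\&#125;/g' "$f"   # } -> &#125;
cat "$f"
```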
gen-clean-ts: ## Remove generated API client for TypeScript
rm -rf ${PWD}/${GEN_API_TS}/
rm -rf ${PWD}/web/node_modules/@goauthentik/api/
gen-clean-py: ## Remove generated API client for Python
rm -rf ${PWD}/${GEN_API_PY}
gen-clean-go: ## Remove generated API client for Go
rm -rf ${PWD}/${GEN_API_GO}
gen-clean-py: ## Remove generated API client for Python
rm -rf ${PWD}/${GEN_API_PY}/
gen-clean: gen-clean-ts gen-clean-go gen-clean-py ## Remove generated API clients
gen-client-ts: gen-clean-ts ## Build and install the authentik API for Typescript into the authentik UI Application
docker compose -f scripts/api/compose.yml run --rm --user "${UID}:${GID}" gen \
generate \
docker run \
--rm -v ${PWD}:/local \
--user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.11.0 generate \
-i /local/schema.yml \
-g typescript-fetch \
-o /local/${GEN_API_TS} \
-c /local/scripts/api/ts-config.yaml \
-c /local/scripts/api-ts-config.yaml \
--additional-properties=npmVersion=${NPM_VERSION} \
--git-repo-id authentik \
--git-user-id goauthentik
cd ${PWD}/${GEN_API_TS} && npm i
cd ${PWD}/${GEN_API_TS} && npm link
cd ${PWD}/web && npm link @goauthentik/api
gen-client-py: gen-clean-py ## Build and install the authentik API for Python
mkdir -p ${PWD}/${GEN_API_PY}
git clone --depth 1 https://github.com/goauthentik/client-python.git ${PWD}/${GEN_API_PY}
cp ${PWD}/schema.yml ${PWD}/${GEN_API_PY}
make -C ${PWD}/${GEN_API_PY} build version=${NPM_VERSION}
docker run \
--rm -v ${PWD}:/local \
--user ${UID}:${GID} \
docker.io/openapitools/openapi-generator-cli:v7.11.0 generate \
-i /local/schema.yml \
-g python \
-o /local/${GEN_API_PY} \
-c /local/scripts/api-py-config.yaml \
--additional-properties=packageVersion=${NPM_VERSION} \
--git-repo-id authentik \
--git-user-id goauthentik
gen-client-go: gen-clean-go ## Build and install the authentik API for Golang
mkdir -p ${PWD}/${GEN_API_GO}
@@ -225,7 +188,7 @@ gen-client-go: gen-clean-go ## Build and install the authentik API for Golang
go mod edit -replace goauthentik.io/api/v3=./${GEN_API_GO}
gen-dev-config: ## Generate a local development config file
$(UV) run scripts/generate_config.py
uv run scripts/generate_config.py
gen: gen-build gen-client-ts
@@ -242,30 +205,34 @@ node-install: ## Install the necessary libraries to build Node.js packages
#########################
web-build: node-install ## Build the Authentik UI
npm run --prefix web build
cd web && npm run build
web: web-lint-fix web-lint web-check-compile ## Automatically fix formatting issues in the Authentik UI source code, lint the code, and compile it
web-test: ## Run tests for the Authentik UI
npm run --prefix web test
cd web && npm run test
web-watch: ## Build and watch the Authentik UI for changes, updating automatically
npm run --prefix web watch
rm -rf web/dist/
mkdir web/dist/
touch web/dist/.gitkeep
cd web && npm run watch
web-storybook-watch: ## Build and run the storybook documentation server
npm run --prefix web storybook
cd web && npm run storybook
web-lint-fix:
npm run --prefix web prettier
cd web && npm run prettier
web-lint:
npm run --prefix web lint
npm run --prefix web lit-analyse
cd web && npm run lint
cd web && npm run lit-analyse
web-check-compile:
npm run --prefix web tsc
cd web && npm run tsc
web-i18n-extract:
npm run --prefix web extract-locales
cd web && npm run extract-locales
#########################
## Docs
@@ -277,31 +244,31 @@ docs-install:
npm ci --prefix website
docs-lint-fix: lint-codespell
npm run --prefix website prettier
npm run prettier --prefix website
docs-build:
npm run --prefix website build
npm run build --prefix website
docs-watch: ## Build and watch the topics documentation
npm run --prefix website start
npm run start --prefix website
integrations: docs-lint-fix integrations-build ## Fix formatting issues in the integrations source code, lint the code, and compile it
integrations-build:
npm run --prefix website -w integrations build
npm run build --prefix website -w integrations
integrations-watch: ## Build and watch the Integrations documentation
npm run --prefix website -w integrations start
npm run start --prefix website -w integrations
docs-api-build:
npm run --prefix website -w api build
npm run build --prefix website -w api
docs-api-watch: ## Build and watch the API documentation
npm run --prefix website -w api build:api
npm run --prefix website -w api start
npm run build:api --prefix website -w api
npm run start --prefix website -w api
docs-api-clean: ## Clean generated API documentation
npm run --prefix website -w api build:api:clean
npm run build:api:clean --prefix website -w api
#########################
## Docker
@@ -309,7 +276,7 @@ docs-api-clean: ## Clean generated API documentation
docker: ## Build a docker image of the current source tree
mkdir -p ${GEN_API_TS}
DOCKER_BUILDKIT=1 docker build . -f lifecycle/container/Dockerfile --progress plain --tag ${DOCKER_IMAGE}
DOCKER_BUILDKIT=1 docker build . --progress plain --tag ${DOCKER_IMAGE}
test-docker:
BUILD=true ${PWD}/scripts/test_docker.sh
@@ -321,28 +288,25 @@ test-docker:
# which makes the YAML File a lot smaller
ci--meta-debug:
$(UV) run python -V
python -V
node --version
ci-mypy: ci--meta-debug
$(UV) run mypy --strict $(PY_SOURCES)
ci-black: ci--meta-debug
$(UV) run black --check $(PY_SOURCES)
uv run black --check $(PY_SOURCES)
ci-ruff: ci--meta-debug
$(UV) run ruff check $(PY_SOURCES)
uv run ruff check $(PY_SOURCES)
ci-codespell: ci--meta-debug
$(UV) run codespell -s
uv run codespell -s
ci-bandit: ci--meta-debug
$(UV) run bandit -c pyproject.toml -r $(PY_SOURCES) -iii
uv run bandit -r $(PY_SOURCES)
ci-pending-migrations: ci--meta-debug
$(UV) run ak makemigrations --check
uv run ak makemigrations --check
ci-test: ci--meta-debug
$(UV) run coverage run manage.py test --keepdb authentik
$(UV) run coverage report
$(UV) run coverage xml
uv run coverage run manage.py test --keepdb --randomly-seed ${CI_TEST_SEED} authentik
uv run coverage report
uv run coverage xml


@@ -9,8 +9,9 @@
[![GitHub Workflow Status](https://img.shields.io/github/actions/workflow/status/goauthentik/authentik/ci-outpost.yml?branch=main&label=outpost%20build&style=for-the-badge)](https://github.com/goauthentik/authentik/actions/workflows/ci-outpost.yml)
[![GitHub Workflow Status](https://img.shields.io/github/actions/workflow/status/goauthentik/authentik/ci-web.yml?branch=main&label=web%20build&style=for-the-badge)](https://github.com/goauthentik/authentik/actions/workflows/ci-web.yml)
[![Code Coverage](https://img.shields.io/codecov/c/gh/goauthentik/authentik?style=for-the-badge)](https://codecov.io/gh/goauthentik/authentik)
![Docker pulls](https://img.shields.io/docker/pulls/authentik/server.svg?style=for-the-badge)
![Latest version](https://img.shields.io/docker/v/authentik/server?sort=semver&style=for-the-badge)
[![](https://img.shields.io/badge/Help%20translate-transifex-blue?style=for-the-badge)](https://explore.transifex.com/authentik/authentik/)
[![](https://img.shields.io/badge/Help%20translate-transifex-blue?style=for-the-badge)](https://www.transifex.com/authentik/authentik/)
## What is authentik?


@@ -18,10 +18,10 @@ Even if the issue is not a CVE, we still greatly appreciate your help in hardeni
(.x being the latest patch release for each version)
| Version | Supported |
| ---------- | ---------- |
| 2025.10.x | ✅ |
| 2025.12.x | ✅ |
| Version | Supported |
| --------- | --------- |
| 2025.6.x | ✅ |
| 2025.8.x | ✅ |
## Reporting a Vulnerability


@@ -3,7 +3,7 @@
from functools import lru_cache
from os import environ
VERSION = "2026.5.0-rc1"
VERSION = "2025.8.6"
ENV_GIT_HASH_KEY = "GIT_BUILD_HASH"


@@ -18,6 +18,7 @@ from rest_framework.views import APIView
from authentik import authentik_full_version
from authentik.core.api.utils import PassiveSerializer
from authentik.enterprise.license import LicenseKey
from authentik.lib.config import CONFIG
from authentik.lib.utils.reflection import get_env
from authentik.outposts.apps import MANAGED_OUTPOST
@@ -25,15 +26,6 @@ from authentik.outposts.models import Outpost
from authentik.rbac.permissions import HasPermission
def fips_enabled():
try:
from authentik.enterprise.license import LicenseKey
return backend._fips_enabled if LicenseKey.get_total().status().is_valid else None
except ModuleNotFoundError:
return None
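The helper above wraps the enterprise import in a `try`/`except ModuleNotFoundError` so that builds without `authentik.enterprise` report `None` ("unknown") instead of crashing at import time. The same guarded-optional-import pattern, reduced to a self-contained sketch (the function name below is illustrative, not authentik code):

```python
import importlib


def optional_module_status(module_name: str):
    """Return None when an optional module is absent, mirroring fips_enabled()."""
    try:
        # Deferred import: only attempted when the status is actually requested
        mod = importlib.import_module(module_name)
    except ModuleNotFoundError:
        return None
    # With the module present, a real implementation would inspect it here.
    return mod.__name__


# Enterprise code missing entirely is treated as "unknown", not an error:
print(optional_module_status("authentik.enterprise.license"))
```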
class RuntimeDict(TypedDict):
"""Runtime information"""
@@ -88,7 +80,9 @@ class SystemInfoSerializer(PassiveSerializer):
"architecture": platform.machine(),
"authentik_version": authentik_full_version(),
"environment": get_env(),
"openssl_fips_enabled": fips_enabled(),
"openssl_fips_enabled": (
backend._fips_enabled if LicenseKey.get_total().status().is_valid else None
),
"openssl_version": OPENSSL_VERSION,
"platform": platform.platform(),
"python_version": python_version,


@@ -37,7 +37,7 @@ class VersionSerializer(PassiveSerializer):
def get_version_latest(self, _) -> str:
"""Get latest version from cache"""
if get_current_tenant().schema_name != get_public_schema_name():
if get_current_tenant().schema_name == get_public_schema_name():
return authentik_version()
version_in_cache = cache.get(VERSION_CACHE_KEY)
if not version_in_cache: # pragma: no cover


@@ -1,256 +0,0 @@
from django.db.models import Q
from django.utils.translation import gettext as _
from drf_spectacular.utils import extend_schema
from guardian.shortcuts import get_objects_for_user
from rest_framework.exceptions import ValidationError
from rest_framework.fields import BooleanField, CharField, ChoiceField, FileField
from rest_framework.parsers import MultiPartParser
from rest_framework.permissions import SAFE_METHODS
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.views import APIView
from authentik.admin.files.backends.base import get_content_type
from authentik.admin.files.fields import FileField as AkFileField
from authentik.admin.files.manager import get_file_manager
from authentik.admin.files.usage import FileApiUsage
from authentik.admin.files.validation import validate_upload_file_name
from authentik.api.validation import validate
from authentik.core.api.used_by import DeleteAction, UsedBySerializer
from authentik.core.api.utils import PassiveSerializer, ThemedUrlsSerializer
from authentik.events.models import Event, EventAction
from authentik.lib.utils.reflection import get_apps
from authentik.rbac.permissions import HasPermission
MAX_FILE_SIZE_BYTES = 25 * 1024 * 1024 # 25MB
class FileView(APIView):
pagination_class = None
parser_classes = [MultiPartParser]
def get_permissions(self):
return [
HasPermission(
"authentik_rbac.view_media_files"
if self.request.method in SAFE_METHODS
else "authentik_rbac.manage_media_files"
)()
]
class FileListParameters(PassiveSerializer):
usage = ChoiceField(choices=list(FileApiUsage), default=FileApiUsage.MEDIA.value)
search = CharField(required=False)
manageable_only = BooleanField(required=False, default=False)
class FileListSerializer(PassiveSerializer):
name = CharField()
mime_type = CharField()
url = CharField()
themed_urls = ThemedUrlsSerializer(required=False, allow_null=True)
@extend_schema(
parameters=[FileListParameters],
responses={200: FileListSerializer(many=True)},
)
@validate(FileListParameters, location="query")
def get(self, request: Request, query: FileListParameters) -> Response:
"""List files from storage backend."""
params = query.validated_data
try:
usage = FileApiUsage(params.get("usage", FileApiUsage.MEDIA.value))
except ValueError as exc:
raise ValidationError(
f"Invalid usage parameter provided: {params.get('usage')}"
) from exc
# Backend is source of truth - list all files from storage
manager = get_file_manager(usage)
files = manager.list_files(manageable_only=params.get("manageable_only", False))
search_query = params.get("search", "")
if search_query:
files = filter(lambda file: search_query.lower() in file.lower(), files)
files = [
FileView.FileListSerializer(
data={
"name": file,
"url": manager.file_url(file, request),
"mime_type": get_content_type(file),
"themed_urls": manager.themed_urls(file, request),
}
)
for file in files
]
for file in files:
file.is_valid(raise_exception=True)
return Response([file.data for file in files])
class FileUploadSerializer(PassiveSerializer):
file = FileField(required=True)
name = CharField(required=False, allow_blank=True)
usage = CharField(required=False, default=FileApiUsage.MEDIA.value)
@extend_schema(
request=FileUploadSerializer,
responses={200: None},
)
@validate(FileUploadSerializer)
def post(self, request: Request, body: FileUploadSerializer) -> Response:
"""Upload file to storage backend."""
file = body.validated_data["file"]
name = body.validated_data.get("name", "").strip()
usage_value = body.validated_data.get("usage", FileApiUsage.MEDIA.value)
# Validate file size and type
if file.size > MAX_FILE_SIZE_BYTES:
raise ValidationError(
{
"file": [
_(
f"File size ({file.size}B) exceeds maximum allowed "
f"size ({MAX_FILE_SIZE_BYTES}B)."
)
]
}
)
try:
usage = FileApiUsage(usage_value)
except ValueError as exc:
raise ValidationError(f"Invalid usage parameter provided: {usage_value}") from exc
# Use original filename
if not name:
name = file.name
# Sanitize path to prevent directory traversal
validate_upload_file_name(name, ValidationError)
manager = get_file_manager(usage)
# Check if file already exists
if manager.file_exists(name):
raise ValidationError({"name": ["A file with this name already exists."]})
# Save to backend
with manager.save_file_stream(name) as f:
f.write(file.read())
Event.new(
EventAction.MODEL_CREATED,
model={
"app": "authentik_admin_files",
"model_name": "File",
"pk": name,
"name": name,
"usage": usage.value,
"mime_type": get_content_type(name),
},
).from_http(request)
return Response()
class FileDeleteParameters(PassiveSerializer):
name = CharField()
usage = ChoiceField(choices=list(FileApiUsage), default=FileApiUsage.MEDIA.value)
@extend_schema(
parameters=[FileDeleteParameters],
responses={200: None},
)
@validate(FileDeleteParameters, location="query")
def delete(self, request: Request, query: FileDeleteParameters) -> Response:
"""Delete file from storage backend."""
params = query.validated_data
validate_upload_file_name(params.get("name", ""), ValidationError)
try:
usage = FileApiUsage(params.get("usage", FileApiUsage.MEDIA.value))
except ValueError as exc:
raise ValidationError(
f"Invalid usage parameter provided: {params.get('usage')}"
) from exc
manager = get_file_manager(usage)
# Delete from backend
manager.delete_file(params.get("name"))
# Audit log for file deletion
Event.new(
EventAction.MODEL_DELETED,
model={
"app": "authentik_admin_files",
"model_name": "File",
"pk": params.get("name"),
"name": params.get("name"),
"usage": usage.value,
},
).from_http(request)
return Response()
class FileUsedByView(APIView):
pagination_class = None
def get_permissions(self):
return [
HasPermission(
"authentik_rbac.view_media_files"
if self.request.method in SAFE_METHODS
else "authentik_rbac.manage_media_files"
)()
]
class FileUsedByParameters(PassiveSerializer):
name = CharField()
@extend_schema(
parameters=[FileUsedByParameters],
responses={200: UsedBySerializer(many=True)},
)
@validate(FileUsedByParameters, location="query")
def get(self, request: Request, query: FileUsedByParameters) -> Response:
params = query.validated_data
models_and_fields = {}
for app in get_apps():
for model in app.get_models():
if model._meta.abstract:
continue
for field in model._meta.get_fields():
if isinstance(field, AkFileField):
models_and_fields.setdefault(model, []).append(field.name)
used_by = []
for model, fields in models_and_fields.items():
app = model._meta.app_label
model_name = model._meta.model_name
q = Q()
for field in fields:
q |= Q(**{field: params.get("name")})
objs = get_objects_for_user(
request.user, f"{app}.view_{model_name}", model.objects.all()
)
objs = objs.filter(q)
for obj in objs:
serializer = UsedBySerializer(
data={
"app": model._meta.app_label,
"model_name": model._meta.model_name,
"pk": str(obj.pk),
"name": str(obj),
"action": DeleteAction.LEFT_DANGLING,
}
)
serializer.is_valid()
used_by.append(serializer.data)
return Response(used_by)
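The upload path in `FileView.post` enforces a size cap and rejects duplicate names before writing anything to the backend. A minimal standalone sketch of those pre-checks (`check_upload` is an illustrative name, not an authentik API):

```python
# Sketch of FileView.post's upload pre-checks, standalone for illustration.
MAX_FILE_SIZE_BYTES = 25 * 1024 * 1024  # 25MB

def check_upload(name: str, size: int, existing: set[str]) -> list[str]:
    """Return validation errors for an upload candidate; empty list means OK."""
    errors = []
    if size > MAX_FILE_SIZE_BYTES:
        errors.append(
            f"File size ({size}B) exceeds maximum allowed size ({MAX_FILE_SIZE_BYTES}B)."
        )
    if name in existing:
        errors.append("A file with this name already exists.")
    return errors
```

In the real view, failures raise a DRF `ValidationError` keyed by field instead of returning a list.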

View File

@@ -1,8 +0,0 @@
from authentik.blueprints.apps import ManagedAppConfig
class AuthentikFilesConfig(ManagedAppConfig):
name = "authentik.admin.files"
label = "authentik_admin_files"
verbose_name = "authentik Files"
default = True

View File

@@ -1,212 +0,0 @@
import mimetypes
from collections.abc import Callable, Generator, Iterator
from typing import cast
from django.core.cache import cache
from django.http.request import HttpRequest
from structlog.stdlib import get_logger
from authentik.admin.files.usage import FileUsage
CACHE_PREFIX = "goauthentik.io/admin/files"
LOGGER = get_logger()
# Theme variable placeholder for theme-specific files like logo-%(theme)s.png
THEME_VARIABLE = "%(theme)s"
def get_content_type(name: str) -> str:
"""Get MIME type for a file based on its extension."""
content_type, _ = mimetypes.guess_type(name)
return content_type or "application/octet-stream"
def get_valid_themes() -> list[str]:
"""Get valid themes that can be substituted for %(theme)s."""
from authentik.brands.api import Themes
return [t.value for t in Themes if t != Themes.AUTOMATIC]
def has_theme_variable(name: str) -> bool:
"""Check if filename contains %(theme)s variable."""
return THEME_VARIABLE in name
def substitute_theme(name: str, theme: str) -> str:
"""Replace %(theme)s with the given theme."""
return name.replace(THEME_VARIABLE, theme)
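Since the substitution is a plain `str.replace`, every occurrence of the placeholder is swapped, so theme variables can appear in directory names as well as filenames. A self-contained sketch (restating the helpers above for illustration):

```python
# Standalone copy of the theme-substitution helper, for illustration.
THEME_VARIABLE = "%(theme)s"

def substitute_theme(name: str, theme: str) -> str:
    """Replace every %(theme)s occurrence with the given theme."""
    return name.replace(THEME_VARIABLE, theme)

# One URL per theme variant, as themed_urls() builds below.
themed = {
    theme: substitute_theme("%(theme)s/logo-%(theme)s.svg", theme)
    for theme in ("light", "dark")
}
```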
class Backend:
"""
Base class for file storage backends.
Class attributes:
allowed_usages: List of usages that can be used with this backend
"""
allowed_usages: list[FileUsage]
def __init__(self, usage: FileUsage):
"""
Initialize backend for the given usage type.
Args:
usage: FileUsage type enum value
"""
self.usage = usage
LOGGER.debug(
"Initializing storage backend",
backend=self.__class__.__name__,
usage=usage.value,
)
def supports_file(self, name: str) -> bool:
"""
Check if this backend can handle the given file path.
Args:
name: File path to check
Returns:
True if this backend supports this file path
"""
raise NotImplementedError
def list_files(self) -> Generator[str]:
"""
List all files stored in this backend.
Yields:
Relative file paths
"""
raise NotImplementedError
def file_url(
self,
name: str,
request: HttpRequest | None = None,
use_cache: bool = True,
) -> str:
"""
Get URL for accessing the file.
Args:
name: Relative file path
request: Optional Django HttpRequest for fully qualified URL building
use_cache: whether to retrieve the URL from cache
Returns:
URL to access the file (may be relative or absolute depending on backend)
"""
raise NotImplementedError
def themed_urls(
self,
name: str,
request: HttpRequest | None = None,
) -> dict[str, str] | None:
"""
Get URLs for each theme variant when filename contains %(theme)s.
Args:
name: File path potentially containing %(theme)s
request: Optional Django HttpRequest for URL building
Returns:
Dict mapping theme to URL if %(theme)s present, None otherwise
"""
if not has_theme_variable(name):
return None
return {
theme: self.file_url(substitute_theme(name, theme), request, use_cache=True)
for theme in get_valid_themes()
}
class ManageableBackend(Backend):
"""
Base class for manageable file storage backends.
Class attributes:
name: Canonical name of the storage backend, for use in configuration.
"""
name: str
@property
def manageable(self) -> bool:
"""
Whether this backend can actually be used for management.
Used only for the management check, not for creating the backend
"""
raise NotImplementedError
def save_file(self, name: str, content: bytes) -> None:
"""
Save file content to storage.
Args:
name: Relative file path
content: File content as bytes
"""
raise NotImplementedError
def save_file_stream(self, name: str) -> Iterator:
"""
Context manager for streaming file writes.
Args:
name: Relative file path
Returns:
Context manager that yields a writable file-like object
Example:
with backend.save_file_stream("output.csv") as f:
f.write(b"data...")
"""
raise NotImplementedError
def delete_file(self, name: str) -> None:
"""
Delete file from storage.
Args:
name: Relative file path
"""
raise NotImplementedError
def file_exists(self, name: str) -> bool:
"""
Check if a file exists.
Args:
name: Relative file path
Returns:
True if file exists, False otherwise
"""
raise NotImplementedError
def _cache_get_or_set(
self,
name: str,
request: HttpRequest | None,
default: Callable[[str, HttpRequest | None], str],
timeout: int,
) -> str:
timeout_ignore = 60
# Cache for 2/3 of the expiry so cached URLs never outlive their signature;
# expiries short enough to fall under timeout_ignore disable caching entirely.
timeout = int(timeout * 0.67)
if timeout < timeout_ignore:
timeout = 0
request_key = "None"
if request is not None:
request_key = f"{request.build_absolute_uri('/')}"
cache_key = f"{CACHE_PREFIX}/{self.name}/{self.usage}/{request_key}/{name}"
return cast(str, cache.get_or_set(cache_key, lambda: default(name, request), timeout))
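`_cache_get_or_set` derates the cache timeout to two-thirds of the URL expiry so a cached signed URL is never handed out after its signature has lapsed, and it disables caching entirely for short-lived URLs. The arithmetic, sketched standalone (`cache_timeout` is an illustrative name):

```python
def cache_timeout(expiry_seconds: int, minimum: int = 60) -> int:
    """Cache for 2/3 of the URL lifetime; 0 (no caching) below the minimum."""
    timeout = int(expiry_seconds * 0.67)
    return timeout if timeout >= minimum else 0
```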

View File

@@ -1,131 +0,0 @@
import os
from collections.abc import Generator, Iterator
from contextlib import contextmanager
from datetime import timedelta
from hashlib import sha256
from pathlib import Path
import jwt
from django.conf import settings
from django.db import connection
from django.http.request import HttpRequest
from django.utils.timezone import now
from authentik.admin.files.backends.base import ManageableBackend
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG
from authentik.lib.utils.time import timedelta_from_string
class FileBackend(ManageableBackend):
"""Local filesystem backend for file storage.
Stores files in a local directory structure:
- Path: {base_dir}/{usage}/{schema}/{name}
- Supports full file management (upload, delete, list)
- Used when storage.backend=file (default)
"""
name = "file"
allowed_usages = list(FileUsage) # All usages
@property
def _base_dir(self) -> Path:
return Path(
CONFIG.get(
f"storage.{self.usage.value}.{self.name}.path",
CONFIG.get(f"storage.{self.name}.path", "./data"),
)
)
@property
def base_path(self) -> Path:
"""Path structure: {base_dir}/{usage}/{schema}"""
return self._base_dir / self.usage.value / connection.schema_name
@property
def manageable(self) -> bool:
# Check _base_dir (the mount point, e.g. /data) rather than base_path
# (which includes usage/schema subdirs, e.g. /data/media/public).
# The subdirectories are created on first file write via mkdir(parents=True)
# in save_file(), so requiring them to exist beforehand would prevent
# file creation on fresh installs.
return (
self._base_dir.exists()
and (self._base_dir.is_mount() or (self._base_dir / self.usage.value).is_mount())
or (settings.DEBUG or settings.TEST)
)
def supports_file(self, name: str) -> bool:
"""We support all files"""
return True
def list_files(self) -> Generator[str]:
"""List all files returning relative paths from base_path."""
for root, _, files in os.walk(self.base_path):
for file in files:
full_path = Path(root) / file
rel_path = full_path.relative_to(self.base_path)
yield str(rel_path)
def file_url(
self,
name: str,
request: HttpRequest | None = None,
use_cache: bool = True,
) -> str:
"""Get URL for accessing the file."""
expires_in = timedelta_from_string(
CONFIG.get(
f"storage.{self.usage.value}.{self.name}.url_expiry",
CONFIG.get(f"storage.{self.name}.url_expiry", "minutes=15"),
)
)
def _file_url(name: str, request: HttpRequest | None) -> str:
prefix = CONFIG.get("web.path", "/")[:-1]
path = f"{self.usage.value}/{connection.schema_name}/{name}"
token = jwt.encode(
payload={
"path": path,
"exp": now() + expires_in,
"nbf": now() - timedelta(seconds=15),
},
key=sha256(f"{settings.SECRET_KEY}:{self.usage}".encode()).hexdigest(),
algorithm="HS256",
)
url = f"{prefix}/files/{path}?token={token}"
if request is None:
return url
return request.build_absolute_uri(url)
if use_cache:
timeout = int(expires_in.total_seconds())
return self._cache_get_or_set(name, request, _file_url, timeout)
else:
return _file_url(name, request)
def save_file(self, name: str, content: bytes) -> None:
"""Save file to local filesystem."""
path = self.base_path / Path(name)
path.parent.mkdir(parents=True, exist_ok=True)
with open(path, "w+b") as f:
f.write(content)
@contextmanager
def save_file_stream(self, name: str) -> Iterator:
"""Context manager for streaming file writes to local filesystem."""
path = self.base_path / Path(name)
path.parent.mkdir(parents=True, exist_ok=True)
with open(path, "wb") as f:
yield f
def delete_file(self, name: str) -> None:
"""Delete file from local filesystem."""
path = self.base_path / Path(name)
path.unlink(missing_ok=True)
def file_exists(self, name: str) -> bool:
"""Check if a file exists."""
path = self.base_path / Path(name)
return path.exists()
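`FileBackend` signs its file-access tokens with a key derived from `SECRET_KEY` and the usage, so a token minted for one usage (e.g. media) cannot be replayed against another (e.g. reports). The derivation, sketched with stdlib `hashlib` (`signing_key` is an illustrative name):

```python
from hashlib import sha256

def signing_key(secret_key: str, usage: str) -> str:
    """Derive a per-usage HS256 key: hex sha256 of "SECRET_KEY:usage"."""
    return sha256(f"{secret_key}:{usage}".encode()).hexdigest()
```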

View File

@@ -1,70 +0,0 @@
from collections.abc import Generator
from django.http.request import HttpRequest
from authentik.admin.files.backends.base import Backend
from authentik.admin.files.usage import FileUsage
EXTERNAL_URL_SCHEMES = ["http://", "https://"]
FONT_AWESOME_SCHEME = "fa://"
class PassthroughBackend(Backend):
"""Passthrough backend for external URLs and special schemes.
Handles external resources that aren't stored in authentik:
- Font Awesome icons (fa://...)
- HTTP/HTTPS URLs (http://..., https://...)
Files that are "managed" by this backend are just passed through as-is.
No upload, delete, or listing operations are supported.
Only accessible through resolve_file_url when an external URL is detected.
"""
allowed_usages = [FileUsage.MEDIA]
def supports_file(self, name: str) -> bool:
"""Check if file path is an external URL or Font Awesome icon."""
if name.startswith(FONT_AWESOME_SCHEME):
return True
for scheme in EXTERNAL_URL_SCHEMES:
if name.startswith(scheme):
return True
return False
def list_files(self) -> Generator[str]:
"""External files cannot be listed."""
yield from []
def file_url(
self,
name: str,
request: HttpRequest | None = None,
use_cache: bool = True,
) -> str:
"""Return the URL as-is for passthrough files."""
return name
def themed_urls(
self,
name: str,
request: HttpRequest | None = None,
) -> dict[str, str] | None:
"""Support themed URLs for external URLs with %(theme)s placeholder.
If the external URL contains %(theme)s, substitute it for each theme.
We can't verify that themed variants exist at the external location,
but we trust the user to provide valid URLs.
"""
from authentik.admin.files.backends.base import (
get_valid_themes,
has_theme_variable,
substitute_theme,
)
if not has_theme_variable(name):
return None
return {theme: substitute_theme(name, theme) for theme in get_valid_themes()}
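Passthrough detection is a pure prefix check; anything that does not look like an external URL or an `fa://` icon falls through to the storage backends. Sketched standalone (`is_passthrough` is an illustrative name):

```python
# Standalone sketch of PassthroughBackend.supports_file's prefix matching.
EXTERNAL_URL_SCHEMES = ("http://", "https://")
FONT_AWESOME_SCHEME = "fa://"

def is_passthrough(name: str) -> bool:
    """True for external URLs and Font Awesome icons, False for stored files."""
    return name.startswith(FONT_AWESOME_SCHEME) or name.startswith(EXTERNAL_URL_SCHEMES)
```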

View File

@@ -1,243 +0,0 @@
from collections.abc import Generator, Iterator
from contextlib import contextmanager
from tempfile import SpooledTemporaryFile
from urllib.parse import urlsplit
import boto3
from botocore.config import Config
from botocore.exceptions import ClientError
from django.db import connection
from django.http.request import HttpRequest
from authentik.admin.files.backends.base import ManageableBackend, get_content_type
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG
from authentik.lib.utils.time import timedelta_from_string
class S3Backend(ManageableBackend):
"""S3-compatible object storage backend.
Stores files in s3-compatible storage:
- Key prefix: {usage}/{schema}/{name}
- Supports full file management (upload, delete, list)
- Generates presigned URLs for file access
- Used when storage.backend=s3
"""
allowed_usages = list(FileUsage) # All usages
name = "s3"
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self._config = {}
self._session = None
def _get_config(self, key: str, default: str | None) -> tuple[str | None, bool]:
unset = object()
current = self._config.get(key, unset)
refreshed = CONFIG.refresh(
f"storage.{self.usage.value}.{self.name}.{key}",
CONFIG.refresh(f"storage.{self.name}.{key}", default),
)
if current is unset:
current = refreshed
self._config[key] = refreshed
return (refreshed, current != refreshed)
@property
def base_path(self) -> str:
"""S3 key prefix: {usage}/{schema}/"""
return f"{self.usage.value}/{connection.schema_name}"
@property
def bucket_name(self) -> str:
return CONFIG.get(
f"storage.{self.usage.value}.{self.name}.bucket_name",
CONFIG.get(f"storage.{self.name}.bucket_name"),
)
@property
def session(self) -> boto3.Session:
"""Create boto3 session with configured credentials."""
session_profile, session_profile_r = self._get_config("session_profile", None)
if session_profile is not None:
if session_profile_r or self._session is None:
self._session = boto3.Session(profile_name=session_profile)
return self._session
else:
return self._session
else:
access_key, access_key_r = self._get_config("access_key", None)
secret_key, secret_key_r = self._get_config("secret_key", None)
session_token, session_token_r = self._get_config("session_token", None)
if access_key_r or secret_key_r or session_token_r or self._session is None:
self._session = boto3.Session(
aws_access_key_id=access_key,
aws_secret_access_key=secret_key,
aws_session_token=session_token,
)
return self._session
else:
return self._session
@property
def client(self):
"""Create S3 client with configured endpoint and region."""
endpoint_url = CONFIG.get(
f"storage.{self.usage.value}.{self.name}.endpoint",
CONFIG.get(f"storage.{self.name}.endpoint", None),
)
use_ssl = CONFIG.get(
f"storage.{self.usage.value}.{self.name}.use_ssl",
CONFIG.get(f"storage.{self.name}.use_ssl", True),
)
region_name = CONFIG.get(
f"storage.{self.usage.value}.{self.name}.region",
CONFIG.get(f"storage.{self.name}.region", None),
)
addressing_style = CONFIG.get(
f"storage.{self.usage.value}.{self.name}.addressing_style",
CONFIG.get(f"storage.{self.name}.addressing_style", "auto"),
)
return self.session.client(
"s3",
endpoint_url=endpoint_url,
use_ssl=use_ssl,
region_name=region_name,
config=Config(signature_version="s3v4", s3={"addressing_style": addressing_style}),
)
@property
def manageable(self) -> bool:
return True
def supports_file(self, name: str) -> bool:
"""We support all files"""
return True
def list_files(self) -> Generator[str]:
"""List all files returning relative paths from base_path."""
paginator = self.client.get_paginator("list_objects_v2")
pages = paginator.paginate(Bucket=self.bucket_name, Prefix=f"{self.base_path}/")
for page in pages:
for obj in page.get("Contents", []):
key = obj["Key"]
# Remove base path prefix to get relative path
rel_path = key.removeprefix(f"{self.base_path}/")
if rel_path: # Skip if it's just the directory itself
yield rel_path
def file_url(
self,
name: str,
request: HttpRequest | None = None,
use_cache: bool = True,
) -> str:
"""Generate presigned URL for file access."""
use_https = CONFIG.get_bool(
f"storage.{self.usage.value}.{self.name}.secure_urls",
CONFIG.get_bool(f"storage.{self.name}.secure_urls", True),
)
expires_in = int(
timedelta_from_string(
CONFIG.get(
f"storage.{self.usage.value}.{self.name}.url_expiry",
CONFIG.get(f"storage.{self.name}.url_expiry", "minutes=15"),
)
).total_seconds()
)
def _file_url(name: str, request: HttpRequest | None) -> str:
params = {
"Bucket": self.bucket_name,
"Key": f"{self.base_path}/{name}",
}
url = self.client.generate_presigned_url(
"get_object",
Params=params,
ExpiresIn=expires_in,
HttpMethod="GET",
)
# Support a custom domain for file URLs (typical when fronting
# S3-compatible storage with a CDN or reverse proxy)
custom_domain = CONFIG.get(
f"storage.{self.usage.value}.{self.name}.custom_domain",
CONFIG.get(f"storage.{self.name}.custom_domain", None),
)
if custom_domain:
parsed = urlsplit(url)
scheme = "https" if use_https else "http"
path = parsed.path
# When using path-style addressing, the presigned URL contains the bucket
# name in the path (e.g., /bucket-name/key). Since custom_domain must
# include the bucket name (per docs), strip it from the path to avoid
# duplication. See: https://github.com/goauthentik/authentik/issues/19521
# Check with trailing slash to ensure exact bucket name match
if path.startswith(f"/{self.bucket_name}/"):
path = path.removeprefix(f"/{self.bucket_name}")
# Normalize to avoid double slashes
custom_domain = custom_domain.rstrip("/")
if not path.startswith("/"):
path = f"/{path}"
url = f"{scheme}://{custom_domain}{path}?{parsed.query}"
return url
if use_cache:
return self._cache_get_or_set(name, request, _file_url, expires_in)
else:
return _file_url(name, request)
def save_file(self, name: str, content: bytes) -> None:
"""Save file to S3."""
self.client.put_object(
Bucket=self.bucket_name,
Key=f"{self.base_path}/{name}",
Body=content,
ACL="private",
ContentType=get_content_type(name),
)
@contextmanager
def save_file_stream(self, name: str) -> Iterator:
"""Context manager for streaming file writes to S3."""
# Keep files in memory up to 5 MB
with SpooledTemporaryFile(max_size=5 * 1024 * 1024, suffix=".S3File") as file:
yield file
file.seek(0)
self.client.upload_fileobj(
Fileobj=file,
Bucket=self.bucket_name,
Key=f"{self.base_path}/{name}",
ExtraArgs={
"ACL": "private",
"ContentType": get_content_type(name),
},
)
def delete_file(self, name: str) -> None:
"""Delete file from S3."""
self.client.delete_object(
Bucket=self.bucket_name,
Key=f"{self.base_path}/{name}",
)
def file_exists(self, name: str) -> bool:
"""Check if a file exists in S3."""
try:
self.client.head_object(
Bucket=self.bucket_name,
Key=f"{self.base_path}/{name}",
)
return True
except ClientError:
return False
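The custom-domain handling above rewrites only the scheme, host, and a possible path-style bucket prefix while preserving the presigned query string, which carries the signature. A standalone sketch of that rewrite using stdlib `urllib.parse`:

```python
from urllib.parse import urlsplit

def rewrite_presigned_url(url: str, bucket: str, custom_domain: str,
                          use_https: bool = True) -> str:
    """Swap a presigned URL's host for a custom domain, stripping a
    path-style bucket prefix to avoid duplication (sketch of S3Backend's logic)."""
    parsed = urlsplit(url)
    path = parsed.path
    # Path-style URLs embed the bucket in the path; custom_domain already
    # includes it, so strip the prefix (exact match via trailing slash).
    if path.startswith(f"/{bucket}/"):
        path = path.removeprefix(f"/{bucket}")
    custom_domain = custom_domain.rstrip("/")
    if not path.startswith("/"):
        path = f"/{path}"
    scheme = "https" if use_https else "http"
    return f"{scheme}://{custom_domain}{path}?{parsed.query}"
```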

View File

@@ -1,58 +0,0 @@
from collections.abc import Generator
from pathlib import Path
from django.http.request import HttpRequest
from authentik.admin.files.backends.base import Backend
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG
STATIC_ASSETS_BASE_DIR = Path("web/dist")
STATIC_ASSETS_DIRS = [Path(p) for p in ("assets/icons", "assets/images")]
STATIC_ASSETS_SOURCES_DIR = Path("web/authentik/sources")
STATIC_FILE_EXTENSIONS = [".svg", ".png", ".jpg", ".jpeg"]
STATIC_PATH_PREFIX = "/static"
class StaticBackend(Backend):
"""Read-only backend for static files from web/dist/assets.
- Used for serving built-in static assets like icons and images.
- Files cannot be uploaded or deleted through this backend.
- Only accessible through resolve_file_url when a static path is detected.
"""
allowed_usages = [FileUsage.MEDIA]
def supports_file(self, name: str) -> bool:
"""Check if file path is a static path."""
return name.startswith(STATIC_PATH_PREFIX)
def list_files(self) -> Generator[str]:
"""List all static files."""
# List built-in source icons
if STATIC_ASSETS_SOURCES_DIR.exists():
for file_path in STATIC_ASSETS_SOURCES_DIR.iterdir():
if file_path.is_file() and (file_path.suffix in STATIC_FILE_EXTENSIONS):
yield f"{STATIC_PATH_PREFIX}/authentik/sources/{file_path.name}"
# List other static assets
for dir in STATIC_ASSETS_DIRS:
dist_dir = STATIC_ASSETS_BASE_DIR / dir
if dist_dir.exists():
for file_path in dist_dir.rglob("*"):
if file_path.is_file() and (file_path.suffix in STATIC_FILE_EXTENSIONS):
yield f"{STATIC_PATH_PREFIX}/dist/{dir}/{file_path.name}"
def file_url(
self,
name: str,
request: HttpRequest | None = None,
use_cache: bool = True,
) -> str:
"""Get URL for static file."""
prefix = CONFIG.get("web.path", "/")[:-1]
url = f"{prefix}{name}"
if request is None:
return url
return request.build_absolute_uri(url)
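Because `web.path` always ends with a slash and static names always start with `/static`, `StaticBackend` joins them by trimming the trailing slash from the prefix. A minimal sketch (`static_url` is an illustrative name):

```python
def static_url(web_path: str, name: str) -> str:
    """Join authentik's web.path (ends with '/') with an absolute /static name."""
    return f"{web_path[:-1]}{name}"
```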

View File

@@ -1,195 +0,0 @@
from pathlib import Path
from django.test import TestCase
from authentik.admin.files.backends.file import FileBackend
from authentik.admin.files.tests.utils import FileTestFileBackendMixin
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG
class TestFileBackend(FileTestFileBackendMixin, TestCase):
"""Test FileBackend class"""
def setUp(self):
"""Set up test fixtures"""
super().setUp()
self.backend = FileBackend(FileUsage.MEDIA)
def test_allowed_usages(self):
"""Test that FileBackend supports all usage types"""
self.assertEqual(self.backend.allowed_usages, list(FileUsage))
def test_base_path(self):
"""Test base_path property constructs correct path"""
base_path = self.backend.base_path
expected = Path(self.media_backend_path) / "media" / "public"
self.assertEqual(base_path, expected)
def test_base_path_reports_usage(self):
"""Test base_path with reports usage"""
backend = FileBackend(FileUsage.REPORTS)
base_path = backend.base_path
expected = Path(self.reports_backend_path) / "reports" / "public"
self.assertEqual(base_path, expected)
def test_list_files_empty_directory(self):
"""Test list_files returns empty when directory is empty"""
# Create the directory but keep it empty
self.backend.base_path.mkdir(parents=True, exist_ok=True)
files = list(self.backend.list_files())
self.assertEqual(files, [])
def test_list_files_with_files(self):
"""Test list_files returns all files in directory"""
base_path = self.backend.base_path
base_path.mkdir(parents=True, exist_ok=True)
# Create some test files
(base_path / "file1.txt").write_text("content1")
(base_path / "file2.png").write_text("content2")
(base_path / "subdir").mkdir()
(base_path / "subdir" / "file3.csv").write_text("content3")
files = sorted(list(self.backend.list_files()))
expected = sorted(["file1.txt", "file2.png", "subdir/file3.csv"])
self.assertEqual(files, expected)
def test_list_files_nonexistent_directory(self):
"""Test list_files returns empty when directory doesn't exist"""
files = list(self.backend.list_files())
self.assertEqual(files, [])
def test_save_file(self):
"""Test save_file writes content to the filesystem"""
content = b"test file content"
file_name = "test.txt"
self.backend.save_file(file_name, content)
# Verify file was created
file_path = self.backend.base_path / file_name
self.assertTrue(file_path.exists())
self.assertEqual(file_path.read_bytes(), content)
def test_save_file_creates_subdirectories(self):
"""Test save_file creates parent directories as needed"""
content = b"nested file content"
file_name = "subdir1/subdir2/nested.txt"
self.backend.save_file(file_name, content)
# Verify file and directories were created
file_path = self.backend.base_path / file_name
self.assertTrue(file_path.exists())
self.assertEqual(file_path.read_bytes(), content)
def test_save_file_stream(self):
"""Test save_file_stream context manager writes file correctly"""
content = b"streamed content"
file_name = "stream_test.txt"
with self.backend.save_file_stream(file_name) as f:
f.write(content)
# Verify file was created
file_path = self.backend.base_path / file_name
self.assertTrue(file_path.exists())
self.assertEqual(file_path.read_bytes(), content)
def test_save_file_stream_creates_subdirectories(self):
"""Test save_file_stream creates parent directories as needed"""
content = b"nested stream content"
file_name = "dir1/dir2/stream.bin"
with self.backend.save_file_stream(file_name) as f:
f.write(content)
# Verify file and directories were created
file_path = self.backend.base_path / file_name
self.assertTrue(file_path.exists())
self.assertEqual(file_path.read_bytes(), content)
def test_delete_file(self):
"""Test delete_file removes existing file"""
file_name = "to_delete.txt"
# Create file first
self.backend.save_file(file_name, b"content")
file_path = self.backend.base_path / file_name
self.assertTrue(file_path.exists())
# Delete it
self.backend.delete_file(file_name)
self.assertFalse(file_path.exists())
def test_delete_file_nonexistent(self):
"""Test delete_file handles nonexistent file gracefully"""
file_name = "does_not_exist.txt"
self.backend.delete_file(file_name)
def test_file_url(self):
"""Test file_url generates correct URL"""
file_name = "icon.png"
url = self.backend.file_url(file_name).split("?")[0]
expected = "/files/media/public/icon.png"
self.assertEqual(url, expected)
@CONFIG.patch("web.path", "/authentik/")
def test_file_url_with_prefix(self):
"""Test file_url with web path prefix"""
file_name = "logo.svg"
url = self.backend.file_url(file_name).split("?")[0]
expected = "/authentik/files/media/public/logo.svg"
self.assertEqual(url, expected)
def test_file_url_nested_path(self):
"""Test file_url with nested file path"""
file_name = "path/to/file.png"
url = self.backend.file_url(file_name).split("?")[0]
expected = "/files/media/public/path/to/file.png"
self.assertEqual(url, expected)
def test_file_exists_true(self):
"""Test file_exists returns True for existing file"""
file_name = "exists.txt"
self.backend.base_path.mkdir(parents=True, exist_ok=True)
(self.backend.base_path / file_name).touch()
self.assertTrue(self.backend.file_exists(file_name))
def test_file_exists_false(self):
"""Test file_exists returns False for nonexistent file"""
self.assertFalse(self.backend.file_exists("does_not_exist.txt"))
def test_themed_urls_without_theme_variable(self):
"""Test themed_urls returns None when filename has no %(theme)s"""
file_name = "logo.png"
result = self.backend.themed_urls(file_name)
self.assertIsNone(result)
def test_themed_urls_with_theme_variable(self):
"""Test themed_urls returns dict of URLs for each theme"""
file_name = "logo-%(theme)s.png"
result = self.backend.themed_urls(file_name)
self.assertIsInstance(result, dict)
self.assertIn("light", result)
self.assertIn("dark", result)
# Check URLs contain the substituted theme
self.assertIn("logo-light.png", result["light"])
self.assertIn("logo-dark.png", result["dark"])
def test_themed_urls_multiple_theme_variables(self):
"""Test themed_urls with multiple %(theme)s in path"""
file_name = "%(theme)s/logo-%(theme)s.svg"
result = self.backend.themed_urls(file_name)
self.assertIsInstance(result, dict)
self.assertIn("light/logo-light.svg", result["light"])
self.assertIn("dark/logo-dark.svg", result["dark"])

View File

@@ -1,67 +0,0 @@
"""Test passthrough backend"""
from django.test import TestCase
from authentik.admin.files.backends.passthrough import PassthroughBackend
from authentik.admin.files.usage import FileUsage
class TestPassthroughBackend(TestCase):
"""Test PassthroughBackend class"""
def setUp(self):
"""Set up test fixtures"""
self.backend = PassthroughBackend(FileUsage.MEDIA)
def test_allowed_usages(self):
"""Test that PassthroughBackend only supports MEDIA usage"""
self.assertEqual(self.backend.allowed_usages, [FileUsage.MEDIA])
def test_supports_file_path_font_awesome(self):
"""Test supports_file returns True for Font Awesome icons"""
self.assertTrue(self.backend.supports_file("fa://user"))
self.assertTrue(self.backend.supports_file("fa://home"))
self.assertTrue(self.backend.supports_file("fa://shield"))
def test_supports_file_path_http(self):
"""Test supports_file returns True for HTTP URLs"""
self.assertTrue(self.backend.supports_file("http://example.com/icon.png"))
self.assertTrue(self.backend.supports_file("http://cdn.example.com/logo.svg"))
def test_supports_file_path_https(self):
"""Test supports_file returns True for HTTPS URLs"""
self.assertTrue(self.backend.supports_file("https://example.com/icon.png"))
self.assertTrue(self.backend.supports_file("https://cdn.example.com/logo.svg"))
def test_supports_file_path_false(self):
"""Test supports_file returns False for regular paths"""
self.assertFalse(self.backend.supports_file("icon.png"))
self.assertFalse(self.backend.supports_file("/static/icon.png"))
self.assertFalse(self.backend.supports_file("media/logo.svg"))
self.assertFalse(self.backend.supports_file(""))
def test_supports_file_path_invalid_scheme(self):
"""Test supports_file returns False for unsupported schemes"""
self.assertFalse(self.backend.supports_file("ftp://example.com/file.png"))
self.assertFalse(self.backend.supports_file("file:///path/to/file.png"))
self.assertFalse(self.backend.supports_file("data:image/png;base64,abc123"))
def test_list_files(self):
"""Test list_files returns empty generator"""
files = list(self.backend.list_files())
self.assertEqual(files, [])
def test_file_url(self):
"""Test file_url returns the URL as-is"""
url = "https://example.com/icon.png"
self.assertEqual(self.backend.file_url(url), url)
def test_file_url_font_awesome(self):
"""Test file_url returns Font Awesome URL as-is"""
url = "fa://user"
self.assertEqual(self.backend.file_url(url), url)
def test_file_url_http(self):
"""Test file_url returns HTTP URL as-is"""
url = "http://cdn.example.com/logo.svg"
self.assertEqual(self.backend.file_url(url), url)

View File
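The `PassthroughBackend` tests above amount to a URL-scheme allowlist: `fa://`, `http://`, and `https://` references are served as-is, everything else is declined. A sketch of that check, under the assumption that scheme matching is all that is involved (`PASSTHROUGH_SCHEMES` and `supports_passthrough` are illustrative names):

```python
from urllib.parse import urlparse

# Hypothetical re-creation of the allowlist the PassthroughBackend tests imply.
PASSTHROUGH_SCHEMES = {"fa", "http", "https"}

def supports_passthrough(path: str) -> bool:
    """True when the value is an external reference to be returned unchanged."""
    # Relative paths parse with an empty scheme, so they fall through
    # to other backends; ftp://, file://, and data: URIs are rejected too.
    return urlparse(path).scheme in PASSTHROUGH_SCHEMES
```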

@@ -1,215 +0,0 @@
from unittest import skipUnless
from django.test import TestCase
from authentik.admin.files.tests.utils import FileTestS3BackendMixin, s3_test_server_available
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG
@skipUnless(s3_test_server_available(), "S3 test server not available")
class TestS3Backend(FileTestS3BackendMixin, TestCase):
"""Test S3 backend functionality"""
def setUp(self):
super().setUp()
def test_base_path(self):
"""Test base_path property generates correct S3 key prefix"""
expected = "media/public"
self.assertEqual(self.media_s3_backend.base_path, expected)
def test_supports_file_path_s3(self):
"""Test supports_file returns True for the S3 backend"""
self.assertTrue(self.media_s3_backend.supports_file("path/to/any-file.png"))
self.assertTrue(self.media_s3_backend.supports_file("any-file.png"))
def test_list_files(self):
"""Test list_files returns relative paths"""
self.media_s3_backend.client.put_object(
Bucket=self.media_s3_bucket_name,
Key="media/public/file1.png",
Body=b"test content",
ACL="private",
)
self.media_s3_backend.client.put_object(
Bucket=self.media_s3_bucket_name,
Key="media/other/file1.png",
Body=b"test content",
ACL="private",
)
files = list(self.media_s3_backend.list_files())
self.assertEqual(len(files), 1)
self.assertIn("file1.png", files)
def test_list_files_empty(self):
"""Test list_files with no files"""
files = list(self.media_s3_backend.list_files())
self.assertEqual(len(files), 0)
def test_save_file(self):
"""Test save_file uploads to S3"""
content = b"test file content"
self.media_s3_backend.save_file("test.png", content)
def test_save_file_stream(self):
"""Test save_file_stream uploads to S3 using context manager"""
with self.media_s3_backend.save_file_stream("test.csv") as f:
f.write(b"header1,header2\n")
f.write(b"value1,value2\n")
def test_delete_file(self):
"""Test delete_file removes from S3"""
self.media_s3_backend.client.put_object(
Bucket=self.media_s3_bucket_name,
Key="media/public/test.png",
Body=b"test content",
ACL="private",
)
self.media_s3_backend.delete_file("test.png")
@CONFIG.patch("storage.s3.secure_urls", True)
@CONFIG.patch("storage.s3.custom_domain", None)
def test_file_url_basic(self):
"""Test file_url generates presigned URL with AWS signature format"""
url = self.media_s3_backend.file_url("test.png")
self.assertIn("X-Amz-Algorithm=AWS4-HMAC-SHA256", url)
self.assertIn("X-Amz-Signature=", url)
self.assertIn("test.png", url)
@CONFIG.patch("storage.s3.bucket_name", "test-bucket")
def test_file_exists_true(self):
"""Test file_exists returns True for existing file"""
self.media_s3_backend.client.put_object(
Bucket=self.media_s3_bucket_name,
Key="media/public/test.png",
Body=b"test content",
ACL="private",
)
exists = self.media_s3_backend.file_exists("test.png")
self.assertTrue(exists)
@CONFIG.patch("storage.s3.bucket_name", "test-bucket")
def test_file_exists_false(self):
"""Test file_exists returns False for non-existent file"""
exists = self.media_s3_backend.file_exists("nonexistent.png")
self.assertFalse(exists)
def test_allowed_usages(self):
"""Test that S3Backend supports all usage types"""
self.assertEqual(self.media_s3_backend.allowed_usages, list(FileUsage))
def test_reports_usage(self):
"""Test S3Backend with REPORTS usage"""
self.assertEqual(self.reports_s3_backend.usage, FileUsage.REPORTS)
self.assertEqual(self.reports_s3_backend.base_path, "reports/public")
@CONFIG.patch("storage.s3.secure_urls", True)
@CONFIG.patch("storage.s3.addressing_style", "path")
def test_file_url_custom_domain_with_bucket_no_duplicate(self):
"""Test file_url doesn't duplicate bucket name when custom_domain includes bucket.
Regression test for https://github.com/goauthentik/authentik/issues/19521
When using:
- Path-style addressing (bucket name goes in URL path, not subdomain)
- Custom domain that includes the bucket name (e.g., s3.example.com/bucket-name)
The bucket name should NOT appear twice in the final URL.
Example of the bug:
- custom_domain = "s3.example.com/authentik-media"
- boto3 presigned URL = "http://s3.example.com/authentik-media/media/public/file.png?..."
- Buggy result = "https://s3.example.com/authentik-media/authentik-media/media/public/file.png?..."
"""
bucket_name = self.media_s3_bucket_name
# Custom domain includes the bucket name
custom_domain = f"localhost:8020/{bucket_name}"
with CONFIG.patch("storage.media.s3.custom_domain", custom_domain):
url = self.media_s3_backend.file_url("application-icons/test.svg", use_cache=False)
# The bucket name should appear exactly once in the URL path, not twice
bucket_occurrences = url.count(bucket_name)
self.assertEqual(
bucket_occurrences,
1,
f"Bucket name '{bucket_name}' appears {bucket_occurrences} times in URL, expected 1. "
f"URL: {url}",
)
def test_themed_urls_without_theme_variable(self):
"""Test themed_urls returns None when filename has no %(theme)s"""
result = self.media_s3_backend.themed_urls("logo.png")
self.assertIsNone(result)
def test_themed_urls_with_theme_variable(self):
"""Test themed_urls returns dict of presigned URLs for each theme"""
result = self.media_s3_backend.themed_urls("logo-%(theme)s.png")
self.assertIsInstance(result, dict)
self.assertIn("light", result)
self.assertIn("dark", result)
# Check URLs are valid presigned URLs with correct file paths
self.assertIn("logo-light.png", result["light"])
self.assertIn("logo-dark.png", result["dark"])
self.assertIn("X-Amz-Signature=", result["light"])
self.assertIn("X-Amz-Signature=", result["dark"])
def test_themed_urls_multiple_theme_variables(self):
"""Test themed_urls with multiple %(theme)s in path"""
result = self.media_s3_backend.themed_urls("%(theme)s/logo-%(theme)s.svg")
self.assertIsInstance(result, dict)
self.assertIn("light/logo-light.svg", result["light"])
self.assertIn("dark/logo-dark.svg", result["dark"])
def test_save_file_sets_content_type_svg(self):
"""Test save_file sets correct ContentType for SVG files"""
self.media_s3_backend.save_file("test.svg", b"<svg></svg>")
response = self.media_s3_backend.client.head_object(
Bucket=self.media_s3_bucket_name,
Key="media/public/test.svg",
)
self.assertEqual(response["ContentType"], "image/svg+xml")
def test_save_file_sets_content_type_png(self):
"""Test save_file sets correct ContentType for PNG files"""
self.media_s3_backend.save_file("test.png", b"\x89PNG\r\n\x1a\n")
response = self.media_s3_backend.client.head_object(
Bucket=self.media_s3_bucket_name,
Key="media/public/test.png",
)
self.assertEqual(response["ContentType"], "image/png")
def test_save_file_stream_sets_content_type(self):
"""Test save_file_stream sets correct ContentType"""
with self.media_s3_backend.save_file_stream("test.css") as f:
f.write(b"body { color: red; }")
response = self.media_s3_backend.client.head_object(
Bucket=self.media_s3_bucket_name,
Key="media/public/test.css",
)
self.assertEqual(response["ContentType"], "text/css")
def test_save_file_unknown_extension_octet_stream(self):
"""Test save_file sets octet-stream for unknown extensions"""
self.media_s3_backend.save_file("test.unknownext123", b"data")
response = self.media_s3_backend.client.head_object(
Bucket=self.media_s3_bucket_name,
Key="media/public/test.unknownext123",
)
self.assertEqual(response["ContentType"], "application/octet-stream")

View File
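The regression test for issue 19521 above checks that a custom domain which already embeds the bucket name does not produce a doubled bucket segment in the rewritten presigned URL. A sketch of the kind of rewrite that avoids the duplication (this is a hypothetical helper illustrating the fix the test pins down, not the authentik implementation):

```python
from urllib.parse import urlparse, urlunparse

def rewrite_custom_domain(presigned_url: str, custom_domain: str, secure: bool = True) -> str:
    """Swap a presigned URL's endpoint for a custom domain without duplicating
    a bucket segment that the domain already carries (hypothetical helper)."""
    parsed = urlparse(presigned_url)
    host, _, domain_path = custom_domain.partition("/")
    path = parsed.path
    if domain_path:
        stripped = path.lstrip("/")
        # With path-style addressing the presigned path starts with the bucket;
        # drop that prefix when the custom domain already ends with it.
        if stripped.startswith(domain_path + "/"):
            stripped = stripped[len(domain_path) + 1:]
        path = f"/{domain_path}/{stripped}"
    scheme = "https" if secure else "http"
    return urlunparse((scheme, host, path, parsed.params, parsed.query, parsed.fragment))
```

With `custom_domain = "localhost:8020/authentik-media"` and a boto3-style presigned path of `/authentik-media/media/public/file.png`, the bucket name appears exactly once in the result, which is what the test asserts.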

@@ -1,42 +0,0 @@
from django.test import TestCase
from authentik.admin.files.backends.static import StaticBackend
from authentik.admin.files.usage import FileUsage
class TestStaticBackend(TestCase):
"""Test Static backend functionality"""
def setUp(self):
"""Set up test fixtures"""
self.usage = FileUsage.MEDIA
self.backend = StaticBackend(self.usage)
def test_init(self):
"""Test StaticBackend initialization"""
self.assertEqual(self.backend.usage, self.usage)
def test_allowed_usages(self):
"""Test that StaticBackend only supports MEDIA usage"""
self.assertEqual(self.backend.allowed_usages, [FileUsage.MEDIA])
def test_supports_file_path_static_prefix(self):
"""Test supports_file returns True for /static prefix"""
self.assertTrue(self.backend.supports_file("/static/assets/icons/test.svg"))
self.assertTrue(self.backend.supports_file("/static/authentik/sources/icon.png"))
def test_supports_file_path_not_static(self):
"""Test supports_file returns False for non-static paths"""
self.assertFalse(self.backend.supports_file("web/dist/assets/icons/test.svg"))
self.assertFalse(self.backend.supports_file("web/dist/assets/images/logo.png"))
self.assertFalse(self.backend.supports_file("media/public/test.png"))
self.assertFalse(self.backend.supports_file("/media/test.svg"))
self.assertFalse(self.backend.supports_file("test.jpg"))
def test_list_files(self):
"""Test list_files includes expected files"""
files = list(self.backend.list_files())
self.assertIn("/static/authentik/sources/ldap.png", files)
self.assertIn("/static/authentik/sources/openidconnect.svg", files)
self.assertIn("/static/authentik/sources/saml.png", files)

View File

@@ -1,7 +0,0 @@
from django.db import models
from authentik.admin.files.validation import validate_file_name
class FileField(models.TextField):
default_validators = [validate_file_name]

View File

@@ -1,164 +0,0 @@
from collections.abc import Generator, Iterator
from django.core.exceptions import ImproperlyConfigured
from django.http.request import HttpRequest
from rest_framework.request import Request
from structlog.stdlib import get_logger
from authentik.admin.files.backends.base import ManageableBackend
from authentik.admin.files.backends.file import FileBackend
from authentik.admin.files.backends.passthrough import PassthroughBackend
from authentik.admin.files.backends.s3 import S3Backend
from authentik.admin.files.backends.static import StaticBackend
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG
LOGGER = get_logger()
_FILE_BACKENDS = [
StaticBackend,
PassthroughBackend,
FileBackend,
S3Backend,
]
class FileManager:
def __init__(self, usage: FileUsage) -> None:
management_backend_name = CONFIG.get(
f"storage.{usage.value}.backend",
CONFIG.get("storage.backend", "file"),
)
self.management_backend = None
for backend in _FILE_BACKENDS:
if issubclass(backend, ManageableBackend) and backend.name == management_backend_name:
self.management_backend = backend(usage)
if self.management_backend is None:
LOGGER.warning(
f"Storage backend configuration for {usage.value} is "
f"invalid: {management_backend_name}"
)
self.backends = []
for backend in _FILE_BACKENDS:
if usage not in backend.allowed_usages:
continue
if isinstance(self.management_backend, backend):
self.backends.append(self.management_backend)
elif not issubclass(backend, ManageableBackend):
self.backends.append(backend(usage))
@property
def manageable(self) -> bool:
"""
Whether this file manager is able to manage files.
"""
return self.management_backend is not None and self.management_backend.manageable
def list_files(self, manageable_only: bool = False) -> Generator[str]:
"""
List available files.
"""
for backend in self.backends:
if manageable_only and not isinstance(backend, ManageableBackend):
continue
yield from backend.list_files()
def file_url(
self,
name: str | None,
request: HttpRequest | Request | None = None,
use_cache: bool = True,
) -> str:
"""
Get URL for accessing the file.
"""
if not name:
return ""
if isinstance(request, Request):
request = request._request
for backend in self.backends:
if backend.supports_file(name):
return backend.file_url(name, request)
LOGGER.warning(f"Could not find file backend for file: {name}")
return ""
def themed_urls(
self,
name: str | None,
request: HttpRequest | Request | None = None,
) -> dict[str, str] | None:
"""
Get URLs for each theme variant when filename contains %(theme)s.
Returns dict mapping theme to URL if %(theme)s present, None otherwise.
"""
if not name:
return None
if isinstance(request, Request):
request = request._request
for backend in self.backends:
if backend.supports_file(name):
return backend.themed_urls(name, request)
return None
def _check_manageable(self) -> None:
if not self.manageable:
raise ImproperlyConfigured("No file management backend configured.")
def save_file(self, file_path: str, content: bytes) -> None:
"""
Save file contents to storage.
"""
self._check_manageable()
assert self.management_backend is not None # nosec
return self.management_backend.save_file(file_path, content)
def save_file_stream(self, file_path: str) -> Iterator:
"""
Context manager for streaming file writes.
Args:
file_path: Relative file path
Returns:
Context manager that yields a writable file-like object
Usage:
with manager.save_file_stream("output.csv") as f:
f.write(b"data...")
"""
self._check_manageable()
assert self.management_backend is not None # nosec
return self.management_backend.save_file_stream(file_path)
def delete_file(self, file_path: str) -> None:
"""
Delete file from storage.
"""
self._check_manageable()
assert self.management_backend is not None # nosec
return self.management_backend.delete_file(file_path)
def file_exists(self, file_path: str) -> bool:
"""
Check if a file exists.
"""
self._check_manageable()
assert self.management_backend is not None # nosec
return self.management_backend.file_exists(file_path)
MANAGERS = {usage: FileManager(usage) for usage in list(FileUsage)}
def get_file_manager(usage: FileUsage) -> FileManager:
return MANAGERS[usage]

View File
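`FileManager.file_url` above resolves a name by asking each backend in order whether it supports the path; the first backend that claims it wins, and an unmatched name logs a warning and yields an empty string. Stripped of authentik specifics, the dispatch pattern looks like this (class names here only echo the real ones; a minimal sketch, not the actual backends):

```python
# Minimal stand-in for the backend chain in FileManager.file_url.
class StaticLike:
    def supports_file(self, name: str) -> bool:
        return name.startswith("/static/")

    def file_url(self, name: str) -> str:
        return name  # static assets are served at their given path

class PassthroughLike:
    def supports_file(self, name: str) -> bool:
        return "://" in name and name.split("://")[0] in {"fa", "http", "https"}

    def file_url(self, name: str) -> str:
        return name  # external references pass through unchanged

class FileLike:
    def supports_file(self, name: str) -> bool:
        return True  # local file storage is the catch-all

    def file_url(self, name: str) -> str:
        return f"/files/media/public/{name}"

BACKENDS = [StaticLike(), PassthroughLike(), FileLike()]

def file_url(name: str) -> str:
    """First backend that claims the path wins; unmatched names yield ''."""
    for backend in BACKENDS:
        if backend.supports_file(name):
            return backend.file_url(name)
    return ""
```

Because the catch-all backend comes last, ordering in `_FILE_BACKENDS` is significant: static and passthrough checks must run before the storage backend claims everything else.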

@@ -1 +0,0 @@
"""authentik files tests"""

View File

@@ -1,264 +0,0 @@
"""test file api"""
from io import BytesIO
from django.test import TestCase
from django.urls import reverse
from authentik.admin.files.manager import FileManager
from authentik.admin.files.tests.utils import FileTestFileBackendMixin
from authentik.admin.files.usage import FileUsage
from authentik.core.tests.utils import create_test_admin_user
from authentik.events.models import Event, EventAction
class TestFileAPI(FileTestFileBackendMixin, TestCase):
"""test file api"""
def setUp(self) -> None:
super().setUp()
self.user = create_test_admin_user()
self.client.force_login(self.user)
def test_upload_creates_event(self):
"""Test that uploading a file creates a FILE_UPLOADED event"""
manager = FileManager(FileUsage.MEDIA)
file_content = b"test file content"
file_name = "test-upload.png"
# Upload file
response = self.client.post(
reverse("authentik_api:files"),
{
"file": BytesIO(file_content),
"name": file_name,
"usage": FileUsage.MEDIA.value,
},
format="multipart",
)
self.assertEqual(response.status_code, 200)
# Verify event was created
event = Event.objects.filter(action=EventAction.MODEL_CREATED).first()
self.assertIsNotNone(event)
assert event is not None # nosec
self.assertEqual(event.context["model"]["name"], file_name)
self.assertEqual(event.context["model"]["usage"], FileUsage.MEDIA.value)
self.assertEqual(event.context["model"]["mime_type"], "image/png")
# Verify user is captured
self.assertEqual(event.user["username"], self.user.username)
self.assertEqual(event.user["pk"], self.user.pk)
manager.delete_file(file_name)
def test_delete_creates_event(self):
"""Test that deleting a file creates an event"""
manager = FileManager(FileUsage.MEDIA)
file_name = "test-delete.png"
manager.save_file(file_name, b"test content")
# Delete file
response = self.client.delete(
reverse(
"authentik_api:files",
query={
"name": file_name,
"usage": FileUsage.MEDIA.value,
},
)
)
self.assertEqual(response.status_code, 200)
# Verify event was created
event = Event.objects.filter(action=EventAction.MODEL_DELETED).first()
self.assertIsNotNone(event)
assert event is not None # nosec
self.assertEqual(event.context["model"]["name"], file_name)
self.assertEqual(event.context["model"]["usage"], FileUsage.MEDIA.value)
# Verify user is captured
self.assertEqual(event.user["username"], self.user.username)
self.assertEqual(event.user["pk"], self.user.pk)
def test_list_files_basic(self):
"""Test listing files with default parameters"""
response = self.client.get(reverse("authentik_api:files"))
self.assertEqual(response.status_code, 200)
self.assertIn(
{
"name": "/static/authentik/sources/ldap.png",
"url": "http://testserver/static/authentik/sources/ldap.png",
"mime_type": "image/png",
"themed_urls": None,
},
response.data,
)
def test_list_files_invalid_usage(self):
"""Test listing files with invalid usage parameter"""
response = self.client.get(
reverse(
"authentik_api:files",
query={
"usage": "invalid",
},
)
)
self.assertEqual(response.status_code, 400)
self.assertIn("not a valid choice", str(response.data))
def test_list_files_with_search(self):
"""Test listing files with search query"""
response = self.client.get(
reverse(
"authentik_api:files",
query={
"search": "ldap.png",
},
)
)
self.assertEqual(response.status_code, 200)
self.assertIn(
{
"name": "/static/authentik/sources/ldap.png",
"url": "http://testserver/static/authentik/sources/ldap.png",
"mime_type": "image/png",
"themed_urls": None,
},
response.data,
)
def test_list_files_with_manageable_only(self):
"""Test listing files with the manageableOnly parameter"""
response = self.client.get(
reverse(
"authentik_api:files",
query={
"manageableOnly": "true",
},
)
)
self.assertEqual(response.status_code, 200)
self.assertNotIn(
{
"name": "/static/dist/assets/images/flow_background.jpg",
"mime_type": "image/jpeg",
},
response.data,
)
def test_upload_file_with_custom_path(self):
"""Test uploading file with custom path"""
manager = FileManager(FileUsage.MEDIA)
file_name = "custom/test"
file_content = b"test content"
response = self.client.post(
reverse("authentik_api:files"),
{
"file": BytesIO(file_content),
"name": file_name,
"usage": FileUsage.MEDIA.value,
},
format="multipart",
)
self.assertEqual(response.status_code, 200)
self.assertTrue(manager.file_exists(file_name))
manager.delete_file(file_name)
def test_upload_file_duplicate(self):
"""Test uploading file that already exists"""
manager = FileManager(FileUsage.MEDIA)
file_name = "test-file.png"
file_content = b"test content"
manager.save_file(file_name, file_content)
response = self.client.post(
reverse("authentik_api:files"),
{
"file": BytesIO(file_content),
"name": file_name,
},
format="multipart",
)
self.assertEqual(response.status_code, 400)
self.assertIn("already exists", str(response.data))
manager.delete_file(file_name)
def test_delete_without_name_parameter(self):
"""Test delete without name parameter"""
response = self.client.delete(reverse("authentik_api:files"))
self.assertEqual(response.status_code, 400)
self.assertIn("field is required", str(response.data))
def test_list_files_includes_themed_urls_none(self):
"""Test listing files includes themed_urls as None for non-themed files"""
manager = FileManager(FileUsage.MEDIA)
file_name = "test-no-theme.png"
manager.save_file(file_name, b"test content")
response = self.client.get(
reverse("authentik_api:files", query={"search": file_name, "manageableOnly": "true"})
)
self.assertEqual(response.status_code, 200)
file_entry = next((f for f in response.data if f["name"] == file_name), None)
self.assertIsNotNone(file_entry)
self.assertIn("themed_urls", file_entry)
self.assertIsNone(file_entry["themed_urls"])
manager.delete_file(file_name)
def test_list_files_includes_themed_urls_dict(self):
"""Test listing files includes themed_urls as dict for themed files"""
manager = FileManager(FileUsage.MEDIA)
file_name = "logo-%(theme)s.svg"
manager.save_file("logo-light.svg", b"<svg>light</svg>")
manager.save_file("logo-dark.svg", b"<svg>dark</svg>")
manager.save_file(file_name, b"<svg>placeholder</svg>")
response = self.client.get(
reverse("authentik_api:files", query={"search": "%(theme)s", "manageableOnly": "true"})
)
self.assertEqual(response.status_code, 200)
file_entry = next((f for f in response.data if f["name"] == file_name), None)
self.assertIsNotNone(file_entry)
self.assertIn("themed_urls", file_entry)
self.assertIsInstance(file_entry["themed_urls"], dict)
self.assertIn("light", file_entry["themed_urls"])
self.assertIn("dark", file_entry["themed_urls"])
manager.delete_file(file_name)
manager.delete_file("logo-light.svg")
manager.delete_file("logo-dark.svg")
def test_upload_file_with_theme_variable(self):
"""Test uploading file with %(theme)s in name"""
manager = FileManager(FileUsage.MEDIA)
file_name = "brand-logo-%(theme)s.svg"
file_content = b"<svg></svg>"
response = self.client.post(
reverse("authentik_api:files"),
{
"file": BytesIO(file_content),
"name": file_name,
"usage": FileUsage.MEDIA.value,
},
format="multipart",
)
self.assertEqual(response.status_code, 200)
self.assertTrue(manager.file_exists(file_name))
manager.delete_file(file_name)

View File

@@ -1,175 +0,0 @@
"""Test file service layer"""
from unittest import skipUnless
from urllib.parse import urlparse
from django.http import HttpRequest
from django.test import TestCase
from authentik.admin.files.manager import FileManager
from authentik.admin.files.tests.utils import (
FileTestFileBackendMixin,
FileTestS3BackendMixin,
s3_test_server_available,
)
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG
class TestResolveFileUrlBasic(TestCase):
def test_resolve_empty_path(self):
"""Test resolving empty file path"""
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("")
self.assertEqual(result, "")
def test_resolve_none_path(self):
"""Test resolving None file path"""
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url(None)
self.assertEqual(result, "")
def test_resolve_font_awesome(self):
"""Test resolving Font Awesome icon"""
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("fa://fa-check")
self.assertEqual(result, "fa://fa-check")
def test_resolve_http_url(self):
"""Test resolving HTTP URL"""
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("http://example.com/icon.png")
self.assertEqual(result, "http://example.com/icon.png")
def test_resolve_https_url(self):
"""Test resolving HTTPS URL"""
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("https://example.com/icon.png")
self.assertEqual(result, "https://example.com/icon.png")
def test_resolve_static_path(self):
"""Test resolving static file path"""
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("/static/authentik/sources/icon.svg")
self.assertEqual(result, "/static/authentik/sources/icon.svg")
class TestResolveFileUrlFileBackend(FileTestFileBackendMixin, TestCase):
def test_resolve_storage_file(self):
"""Test resolving uploaded storage file"""
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("test.png").split("?")[0]
self.assertEqual(result, "/files/media/public/test.png")
def test_resolve_full_static_with_request(self):
"""Test resolving static file with request builds absolute URI"""
mock_request = HttpRequest()
mock_request.META = {
"HTTP_HOST": "example.com",
"SERVER_NAME": "example.com",
}
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("/static/icon.svg", mock_request)
self.assertEqual(result, "http://example.com/static/icon.svg")
def test_resolve_full_file_backend_with_request(self):
"""Test resolving FileBackend file with request"""
mock_request = HttpRequest()
mock_request.META = {
"HTTP_HOST": "example.com",
"SERVER_NAME": "example.com",
}
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("test.png", mock_request).split("?")[0]
self.assertEqual(result, "http://example.com/files/media/public/test.png")
@skipUnless(s3_test_server_available(), "S3 test server not available")
class TestResolveFileUrlS3Backend(FileTestS3BackendMixin, TestCase):
@CONFIG.patch("storage.media.s3.custom_domain", "s3.test:8080/test")
@CONFIG.patch("storage.media.s3.secure_urls", False)
def test_resolve_full_s3_backend(self):
"""Test resolving S3Backend returns presigned URL as-is"""
mock_request = HttpRequest()
mock_request.META = {
"HTTP_HOST": "example.com",
"SERVER_NAME": "example.com",
}
manager = FileManager(FileUsage.MEDIA)
result = manager.file_url("test.png", mock_request)
# S3 URLs should be returned as-is (already absolute)
self.assertTrue(result.startswith("http://s3.test:8080/test"))
class TestThemedUrls(FileTestFileBackendMixin, TestCase):
"""Test FileManager.themed_urls method"""
def test_themed_urls_none_path(self):
"""Test themed_urls returns None for None path"""
manager = FileManager(FileUsage.MEDIA)
result = manager.themed_urls(None)
self.assertIsNone(result)
def test_themed_urls_empty_path(self):
"""Test themed_urls returns None for empty path"""
manager = FileManager(FileUsage.MEDIA)
result = manager.themed_urls("")
self.assertIsNone(result)
def test_themed_urls_no_theme_variable(self):
"""Test themed_urls returns None when no %(theme)s in path"""
manager = FileManager(FileUsage.MEDIA)
result = manager.themed_urls("logo.png")
self.assertIsNone(result)
def test_themed_urls_with_theme_variable(self):
"""Test themed_urls returns dict of URLs for each theme"""
manager = FileManager(FileUsage.MEDIA)
result = manager.themed_urls("logo-%(theme)s.png")
self.assertIsInstance(result, dict)
self.assertIn("light", result)
self.assertIn("dark", result)
self.assertIn("logo-light.png", result["light"])
self.assertIn("logo-dark.png", result["dark"])
def test_themed_urls_with_request(self):
"""Test themed_urls builds absolute URLs with request"""
mock_request = HttpRequest()
mock_request.META = {
"HTTP_HOST": "example.com",
"SERVER_NAME": "example.com",
}
manager = FileManager(FileUsage.MEDIA)
result = manager.themed_urls("logo-%(theme)s.svg", mock_request)
self.assertIsInstance(result, dict)
light_url = urlparse(result["light"])
dark_url = urlparse(result["dark"])
self.assertEqual(light_url.scheme, "http")
self.assertEqual(light_url.netloc, "example.com")
self.assertEqual(dark_url.scheme, "http")
self.assertEqual(dark_url.netloc, "example.com")
def test_themed_urls_passthrough_with_theme_variable(self):
"""Test themed_urls returns dict for passthrough URLs with %(theme)s"""
manager = FileManager(FileUsage.MEDIA)
# External URLs with %(theme)s should return themed URLs
result = manager.themed_urls("https://example.com/logo-%(theme)s.png")
self.assertIsInstance(result, dict)
self.assertEqual(result["light"], "https://example.com/logo-light.png")
self.assertEqual(result["dark"], "https://example.com/logo-dark.png")
def test_themed_urls_passthrough_without_theme_variable(self):
"""Test themed_urls returns None for passthrough URLs without %(theme)s"""
manager = FileManager(FileUsage.MEDIA)
# External URLs without %(theme)s should return None
result = manager.themed_urls("https://example.com/logo.png")
self.assertIsNone(result)

View File

@@ -1,137 +0,0 @@
from django.core.exceptions import ValidationError
from django.test import TestCase
from authentik.admin.files.validation import (
MAX_FILE_NAME_LENGTH,
MAX_PATH_COMPONENT_LENGTH,
validate_file_name,
)
class TestSanitizeFilePath(TestCase):
"""Test validate_file_name function"""
def test_sanitize_valid_filename(self):
"""Test sanitizing valid filename"""
validate_file_name("test.png")
def test_sanitize_valid_path_with_directory(self):
"""Test sanitizing valid path with directory"""
validate_file_name("images/test.png")
def test_sanitize_valid_path_with_nested_dirs(self):
"""Test sanitizing valid path with nested directories"""
validate_file_name("dir1/dir2/dir3/test.png")
def test_sanitize_with_hyphens(self):
"""Test sanitizing filename with hyphens"""
validate_file_name("test-file-name.png")
def test_sanitize_with_underscores(self):
"""Test sanitizing filename with underscores"""
validate_file_name("test_file_name.png")
def test_sanitize_with_dots(self):
"""Test sanitizing filename with multiple dots"""
validate_file_name("test.file.name.png")
def test_sanitize_strips_whitespace(self):
"""Test filename with surrounding whitespace raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name(" test.png ")
def test_sanitize_removes_duplicate_slashes(self):
"""Test path with duplicate slashes raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name("dir1//dir2///test.png")
def test_sanitize_empty_path_raises(self):
"""Test sanitizing empty path raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name("")
def test_sanitize_whitespace_only_raises(self):
"""Test sanitizing whitespace-only path raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name(" ")
def test_sanitize_invalid_characters_raises(self):
"""Test sanitizing path with invalid characters raises ValidationError"""
invalid_paths = [
"test file.png", # space
"test@file.png", # @
"test#file.png", # #
"test$file.png", # $
"test%file.png", # % (but %(theme)s is allowed)
"test&file.png", # &
"test*file.png", # *
"test(file).png", # parentheses (but %(theme)s is allowed)
"test[file].png", # brackets
"test{file}.png", # braces
]
for path in invalid_paths:
with self.assertRaises(ValidationError):
validate_file_name(path)
def test_sanitize_absolute_path_raises(self):
"""Test sanitizing absolute path raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name("/absolute/path/test.png")
def test_sanitize_parent_directory_raises(self):
"""Test sanitizing path with parent directory reference raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name("../test.png")
def test_sanitize_nested_parent_directory_raises(self):
"""Test sanitizing path with nested parent directory reference raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name("dir1/../test.png")
def test_sanitize_starts_with_dot_raises(self):
"""Test sanitizing path starting with dot raises ValidationError"""
with self.assertRaises(ValidationError):
validate_file_name(".hidden")
def test_sanitize_too_long_path_raises(self):
"""Test sanitizing too long path raises ValidationError"""
long_path = "a" * (MAX_FILE_NAME_LENGTH + 1) + ".png"
with self.assertRaises(ValidationError):
validate_file_name(long_path)
def test_sanitize_too_long_component_raises(self):
"""Test sanitizing path with too long component raises ValidationError"""
long_component = "a" * (MAX_PATH_COMPONENT_LENGTH + 1)
path = f"dir/{long_component}.png"
with self.assertRaises(ValidationError):
validate_file_name(path)
def test_sanitize_theme_variable_valid(self):
"""Test sanitizing filename with %(theme)s variable"""
# These should all be valid
validate_file_name("logo-%(theme)s.png")
validate_file_name("brand/logo-%(theme)s.svg")
validate_file_name("images/icon-%(theme)s.png")
validate_file_name("%(theme)s/logo.png")
validate_file_name("brand/%(theme)s/logo.png")
def test_sanitize_theme_variable_multiple(self):
"""Test sanitizing filename with multiple %(theme)s variables"""
validate_file_name("%(theme)s/logo-%(theme)s.png")
def test_sanitize_theme_variable_invalid_format(self):
"""Test that partial or malformed theme variables are rejected"""
invalid_paths = [
"test%(theme.png", # missing )s
"test%theme)s.png", # missing (
"test%(themes).png", # wrong variable name
"test%(THEME)s.png", # wrong case
"test%()s.png", # empty variable name
]
for path in invalid_paths:
with self.assertRaises(ValidationError):
validate_file_name(path)
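The rules these tests exercise can be sketched as a standalone predicate; this is a minimal re-implementation for illustration, not authentik's actual validator module:

```python
import re
from pathlib import PurePosixPath

MAX_FILE_NAME_LENGTH = 1024
MAX_PATH_COMPONENT_LENGTH = 255
THEME_VARIABLE = "%(theme)s"

def is_valid_file_name(name: str) -> bool:
    """Apply the same checks the tests above exercise, returning a bool."""
    if not name:
        return False
    # Substitute the theme placeholder before applying the character whitelist,
    # so only the literal %(theme)s form survives the regex
    if not re.match(r"^[a-zA-Z0-9._/-]+$", name.replace(THEME_VARIABLE, "theme")):
        return False
    if "//" in name:
        return False
    path = PurePosixPath(name)
    if path.is_absolute() or ".." in path.parts or str(path).startswith("."):
        return False
    if len(str(path)) > MAX_FILE_NAME_LENGTH:
        return False
    return all(len(part) <= MAX_PATH_COMPONENT_LENGTH for part in path.parts)

print(is_valid_file_name("logo-%(theme)s.png"))  # True
print(is_valid_file_name("dir1/../test.png"))    # False
```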

View File

@@ -1,129 +0,0 @@
import shutil
import socket
from tempfile import mkdtemp
from urllib.parse import urlparse
from authentik.admin.files.backends.s3 import S3Backend
from authentik.admin.files.usage import FileUsage
from authentik.lib.config import CONFIG, UNSET
from authentik.lib.generators import generate_id
S3_TEST_ENDPOINT = "http://localhost:8020"
def s3_test_server_available() -> bool:
"""Check if the S3 test server is reachable."""
parsed = urlparse(S3_TEST_ENDPOINT)
try:
with socket.create_connection((parsed.hostname, parsed.port), timeout=2):
return True
except OSError:
return False
class FileTestFileBackendMixin:
def setUp(self):
self.original_media_backend = CONFIG.get("storage.media.backend", UNSET)
self.original_media_backend_path = CONFIG.get("storage.media.file.path", UNSET)
self.media_backend_path = mkdtemp()
CONFIG.set("storage.media.backend", "file")
CONFIG.set("storage.media.file.path", str(self.media_backend_path))
self.original_reports_backend = CONFIG.get("storage.reports.backend", UNSET)
self.original_reports_backend_path = CONFIG.get("storage.reports.file.path", UNSET)
self.reports_backend_path = mkdtemp()
CONFIG.set("storage.reports.backend", "file")
CONFIG.set("storage.reports.file.path", str(self.reports_backend_path))
def tearDown(self):
if self.original_media_backend is not UNSET:
CONFIG.set("storage.media.backend", self.original_media_backend)
else:
CONFIG.delete("storage.media.backend")
if self.original_media_backend_path is not UNSET:
CONFIG.set("storage.media.file.path", self.original_media_backend_path)
else:
CONFIG.delete("storage.media.file.path")
shutil.rmtree(self.media_backend_path)
if self.original_reports_backend is not UNSET:
CONFIG.set("storage.reports.backend", self.original_reports_backend)
else:
CONFIG.delete("storage.reports.backend")
if self.original_reports_backend_path is not UNSET:
CONFIG.set("storage.reports.file.path", self.original_reports_backend_path)
else:
CONFIG.delete("storage.reports.file.path")
shutil.rmtree(self.reports_backend_path)
class FileTestS3BackendMixin:
def setUp(self):
s3_config_keys = {
"endpoint",
"access_key",
"secret_key",
"bucket_name",
}
self.original_media_backend = CONFIG.get("storage.media.backend", UNSET)
CONFIG.set("storage.media.backend", "s3")
self.original_media_s3_settings = {}
for key in s3_config_keys:
self.original_media_s3_settings[key] = CONFIG.get(f"storage.media.s3.{key}", UNSET)
self.media_s3_bucket_name = f"authentik-test-{generate_id(10)}".lower()
CONFIG.set("storage.media.s3.endpoint", S3_TEST_ENDPOINT)
CONFIG.set("storage.media.s3.access_key", "accessKey1")
CONFIG.set("storage.media.s3.secret_key", "secretKey1")
CONFIG.set("storage.media.s3.bucket_name", self.media_s3_bucket_name)
self.media_s3_backend = S3Backend(FileUsage.MEDIA)
self.media_s3_backend.client.create_bucket(Bucket=self.media_s3_bucket_name, ACL="private")
self.original_reports_backend = CONFIG.get("storage.reports.backend", UNSET)
CONFIG.set("storage.reports.backend", "s3")
self.original_reports_s3_settings = {}
for key in s3_config_keys:
self.original_reports_s3_settings[key] = CONFIG.get(f"storage.reports.s3.{key}", UNSET)
self.reports_s3_bucket_name = f"authentik-test-{generate_id(10)}".lower()
CONFIG.set("storage.reports.s3.endpoint", S3_TEST_ENDPOINT)
CONFIG.set("storage.reports.s3.access_key", "accessKey1")
CONFIG.set("storage.reports.s3.secret_key", "secretKey1")
CONFIG.set("storage.reports.s3.bucket_name", self.reports_s3_bucket_name)
self.reports_s3_backend = S3Backend(FileUsage.REPORTS)
self.reports_s3_backend.client.create_bucket(
Bucket=self.reports_s3_bucket_name, ACL="private"
)
def tearDown(self):
def delete_objects_in_bucket(client, bucket_name):
paginator = client.get_paginator("list_objects_v2")
pages = paginator.paginate(Bucket=bucket_name)
for page in pages:
if "Contents" not in page:
continue
for obj in page["Contents"]:
client.delete_object(Bucket=bucket_name, Key=obj["Key"])
delete_objects_in_bucket(self.media_s3_backend.client, self.media_s3_bucket_name)
self.media_s3_backend.client.delete_bucket(Bucket=self.media_s3_bucket_name)
if self.original_media_backend is not UNSET:
CONFIG.set("storage.media.backend", self.original_media_backend)
else:
CONFIG.delete("storage.media.backend")
for k, v in self.original_media_s3_settings.items():
if v is not UNSET:
CONFIG.set(f"storage.media.s3.{k}", v)
else:
CONFIG.delete(f"storage.media.s3.{k}")
delete_objects_in_bucket(self.reports_s3_backend.client, self.reports_s3_bucket_name)
self.reports_s3_backend.client.delete_bucket(Bucket=self.reports_s3_bucket_name)
if self.original_reports_backend is not UNSET:
CONFIG.set("storage.reports.backend", self.original_reports_backend)
else:
CONFIG.delete("storage.reports.backend")
for k, v in self.original_reports_s3_settings.items():
if v is not UNSET:
CONFIG.set(f"storage.reports.s3.{k}", v)
else:
CONFIG.delete(f"storage.reports.s3.{k}")

View File

@@ -1,8 +0,0 @@
from django.urls import path
from authentik.admin.files.api import FileUsedByView, FileView
api_urlpatterns = [
path("admin/file/", FileView.as_view(), name="files"),
path("admin/file/used_by/", FileUsedByView.as_view(), name="files-used-by"),
]

View File

@@ -1,17 +0,0 @@
from enum import StrEnum
from itertools import chain
class FileApiUsage(StrEnum):
"""Usage types for file API"""
MEDIA = "media"
class FileManagedUsage(StrEnum):
"""Usage types for managed files"""
REPORTS = "reports"
FileUsage = StrEnum("FileUsage", [(v.name, v.value) for v in chain(FileApiUsage, FileManagedUsage)])

View File

@@ -1,85 +0,0 @@
import re
from pathlib import PurePosixPath
from django.core.exceptions import ValidationError
from django.utils.translation import gettext as _
from authentik.admin.files.backends.base import THEME_VARIABLE
from authentik.admin.files.backends.passthrough import PassthroughBackend
from authentik.admin.files.backends.static import StaticBackend
from authentik.admin.files.usage import FileUsage
# File upload limits
MAX_FILE_NAME_LENGTH = 1024
MAX_PATH_COMPONENT_LENGTH = 255
def validate_file_name(name: str) -> None:
if PassthroughBackend(FileUsage.MEDIA).supports_file(name) or StaticBackend(
FileUsage.MEDIA
).supports_file(name):
return
validate_upload_file_name(name)
def validate_upload_file_name(
name: str,
ValidationError: type[Exception] = ValidationError,
) -> None:
"""Sanitize file path.
Args:
file_path: The file path to sanitize
Returns:
Sanitized file path
Raises:
ValidationError: If file path is invalid
"""
if not name:
raise ValidationError(_("File name cannot be empty"))
# Allow %(theme)s placeholder for theme-specific files
# Replace with placeholder for validation, then check the result
name_for_validation = name.replace(THEME_VARIABLE, "theme")
# Same regex is used in the frontend as well (with %(theme)s handling)
if not re.match(r"^[a-zA-Z0-9._/-]+$", name_for_validation):
raise ValidationError(
_(
"File name can only contain letters (a-z, A-Z), numbers (0-9), "
"dots (.), hyphens (-), underscores (_), forward slashes (/), "
"and the placeholder %(theme)s for theme-specific files"
)
)
if "//" in name:
raise ValidationError(_("File name cannot contain duplicate /"))
# Convert to posix path
path = PurePosixPath(name)
# Check for absolute paths
# Absolute paths need the leading /. A path without it may still be unsafe, hence the checks below
if path.is_absolute():
raise ValidationError(_("Absolute paths are not allowed"))
# Check for parent directory references
if ".." in path.parts:
raise ValidationError(_("Parent directory references ('..') are not allowed"))
# Disallow paths starting with dot (hidden files at root level)
if str(path).startswith("."):
raise ValidationError(_("Paths cannot start with '.'"))
# Check path length limits
normalized = str(path)
if len(normalized) > MAX_FILE_NAME_LENGTH:
raise ValidationError(_(f"File name too long (max {MAX_FILE_NAME_LENGTH} characters)"))
for part in path.parts:
if len(part) > MAX_PATH_COMPONENT_LENGTH:
raise ValidationError(
_(f"Path component too long (max {MAX_PATH_COMPONENT_LENGTH} characters)")
)

View File

@@ -1,9 +0,0 @@
from django.dispatch import receiver
from authentik.admin.tasks import _set_prom_info
from authentik.root.signals import post_startup
@receiver(post_startup)
def post_startup_admin_metrics(sender, **_):
_set_prom_info()

View File

@@ -2,6 +2,7 @@
from django.core.cache import cache
from django.utils.translation import gettext_lazy as _
from django_dramatiq_postgres.middleware import CurrentTask
from dramatiq import actor
from packaging.version import parse
from requests import RequestException
@@ -12,7 +13,7 @@ from authentik.admin.apps import PROM_INFO
from authentik.events.models import Event, EventAction
from authentik.lib.config import CONFIG
from authentik.lib.utils.http import get_http_session
from authentik.tasks.middleware import CurrentTask
from authentik.tasks.models import Task
LOGGER = get_logger()
VERSION_NULL = "0.0.0"
@@ -34,7 +35,7 @@ def _set_prom_info():
@actor(description=_("Update latest version info."))
def update_latest_version():
self = CurrentTask.get_task()
self: Task = CurrentTask.get_task()
if CONFIG.get_bool("disable_update_check"):
cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT)
self.info("Version check disabled.")
@@ -71,3 +72,6 @@ def update_latest_version():
except (RequestException, IndexError) as exc:
cache.set(VERSION_CACHE_KEY, VERSION_NULL, VERSION_CACHE_TIMEOUT)
raise exc
_set_prom_info()

View File

@@ -13,10 +13,10 @@ from rest_framework.exceptions import AuthenticationFailed
from rest_framework.request import Request
from structlog.stdlib import get_logger
from authentik.common.oauth.constants import SCOPE_AUTHENTIK_API
from authentik.core.middleware import CTX_AUTH_VIA
from authentik.core.models import Token, TokenIntents, User, UserTypes
from authentik.outposts.models import Outpost
from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API
LOGGER = get_logger()
_tmp = Path(gettempdir())
@@ -27,21 +27,83 @@ except OSError:
ipc_key = None
def validate_auth(header: bytes, format="bearer") -> str | None:
def validate_auth(header: bytes) -> str | None:
"""Validate that the header is in a correct format,
returns type and credentials"""
auth_credentials = header.decode().strip()
if auth_credentials == "" or " " not in auth_credentials:
return None
auth_type, _, auth_credentials = auth_credentials.partition(" ")
if not compare_digest(auth_type.lower(), format):
if auth_type.lower() != "bearer":
LOGGER.debug("Unsupported authentication type, denying", type=auth_type.lower())
return None
raise AuthenticationFailed("Unsupported authentication type")
if auth_credentials == "": # nosec # noqa
raise AuthenticationFailed("Malformed header")
return auth_credentials
def bearer_auth(raw_header: bytes) -> User | None:
"""raw_header in the Format of `Bearer ....`"""
user = auth_user_lookup(raw_header)
if not user:
return None
if not user.is_active:
raise AuthenticationFailed("Token invalid/expired")
return user
def auth_user_lookup(raw_header: bytes) -> User | None:
"""raw_header in the Format of `Bearer ....`"""
from authentik.providers.oauth2.models import AccessToken
auth_credentials = validate_auth(raw_header)
if not auth_credentials:
return None
# first, check traditional tokens
key_token = Token.filter_not_expired(
key=auth_credentials, intent=TokenIntents.INTENT_API
).first()
if key_token:
CTX_AUTH_VIA.set("api_token")
return key_token.user
# then try to auth via JWT
jwt_token = AccessToken.filter_not_expired(
token=auth_credentials, _scope__icontains=SCOPE_AUTHENTIK_API
).first()
if jwt_token:
# Double-check scopes, since they are saved in a single string
# we want to check the parsed version too
if SCOPE_AUTHENTIK_API not in jwt_token.scope:
raise AuthenticationFailed("Token invalid/expired")
CTX_AUTH_VIA.set("jwt")
return jwt_token.user
# then try to auth via secret key (for embedded outpost/etc)
user = token_secret_key(auth_credentials)
if user:
CTX_AUTH_VIA.set("secret_key")
return user
# then try to auth via the IPC key (core <-> router communication)
user = token_ipc(auth_credentials)
if user:
CTX_AUTH_VIA.set("ipc")
return user
raise AuthenticationFailed("Token invalid/expired")
def token_secret_key(value: str) -> User | None:
"""Check if the token is the secret key
and return the service account for the managed outpost"""
from authentik.outposts.apps import MANAGED_OUTPOST
if not compare_digest(value, settings.SECRET_KEY):
return None
outposts = Outpost.objects.filter(managed=MANAGED_OUTPOST)
if not outposts:
return None
outpost = outposts.first()
return outpost.user
class IPCUser(AnonymousUser):
"""'Virtual' user for IPC communication between authentik core and the authentik router"""
@@ -70,8 +132,13 @@ class IPCUser(AnonymousUser):
def is_authenticated(self):
return True
def all_roles(self):
return []
def token_ipc(value: str) -> User | None:
"""Check if the token is the secret key
and return the service account for the managed outpost"""
if not ipc_key or not compare_digest(value, ipc_key):
return None
return IPCUser()
class TokenAuthentication(BaseAuthentication):
@@ -81,79 +148,12 @@ class TokenAuthentication(BaseAuthentication):
"""Token-based authentication using HTTP Bearer authentication"""
auth = get_authorization_header(request)
user_ctx = self.bearer_auth(auth)
user = bearer_auth(auth)
# None is only returned when the header isn't set.
if not user_ctx:
if not user:
return None
return user_ctx
def bearer_auth(self, raw_header: bytes) -> tuple[User, Any] | None:
"""raw_header in the Format of `Bearer ....`"""
user_ctx = self.auth_user_lookup(raw_header)
if not user_ctx:
return None
user, ctx = user_ctx
if not user.is_active:
raise AuthenticationFailed("Token invalid/expired")
return user, ctx
def auth_user_lookup(self, raw_header: bytes) -> tuple[User, Any] | None:
"""raw_header in the Format of `Bearer ....`"""
from authentik.providers.oauth2.models import AccessToken
auth_credentials = validate_auth(raw_header)
if not auth_credentials:
return None
# first, check traditional tokens
key_token = Token.filter_not_expired(
key=auth_credentials, intent=TokenIntents.INTENT_API
).first()
if key_token:
CTX_AUTH_VIA.set("api_token")
return key_token.user, key_token
# then try to auth via JWT
jwt_token = AccessToken.filter_not_expired(
token=auth_credentials, _scope__icontains=SCOPE_AUTHENTIK_API
).first()
if jwt_token:
# Double-check scopes, since they are saved in a single string
# we want to check the parsed version too
if SCOPE_AUTHENTIK_API not in jwt_token.scope:
raise AuthenticationFailed("Token invalid/expired")
CTX_AUTH_VIA.set("jwt")
return jwt_token.user, jwt_token
# then try to auth via secret key (for embedded outpost/etc)
user_outpost = self.token_secret_key(auth_credentials)
if user_outpost:
CTX_AUTH_VIA.set("secret_key")
return user_outpost
# then try to auth via secret key (for embedded outpost/etc)
user = self.token_ipc(auth_credentials)
if user:
CTX_AUTH_VIA.set("ipc")
return user
raise AuthenticationFailed("Token invalid/expired")
def token_ipc(self, value: str) -> tuple[User, None] | None:
"""Check if the token is the secret key
and return the service account for the managed outpost"""
if not ipc_key or not compare_digest(value, ipc_key):
return None
return IPCUser(), None
def token_secret_key(self, value: str) -> tuple[User, Outpost] | None:
"""Check if the token is the secret key
and return the service account for the managed outpost"""
from authentik.outposts.apps import MANAGED_OUTPOST
if not compare_digest(value, settings.SECRET_KEY):
return None
outposts = Outpost.objects.filter(managed=MANAGED_OUTPOST)
if not outposts:
return None
outpost = outposts.first()
return outpost.user, outpost
return (user, None) # pragma: no cover
class TokenSchema(OpenApiAuthenticationExtension):

View File

@@ -1,45 +0,0 @@
from json import dumps
from django.core.management.base import BaseCommand, no_translations
from drf_spectacular.drainage import GENERATOR_STATS
from drf_spectacular.generators import SchemaGenerator
from drf_spectacular.renderers import OpenApiYamlRenderer
from drf_spectacular.validation import validate_schema
from structlog.stdlib import get_logger
from authentik.blueprints.v1.schema import SchemaBuilder
class Command(BaseCommand):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.logger = get_logger()
def add_arguments(self, parser):
parser.add_argument("--blueprint-file", type=str, default="blueprints/schema.json")
parser.add_argument("--api-file", type=str, default="schema.yml")
@no_translations
def handle(self, *args, blueprint_file: str, api_file: str, **options):
self.build_blueprint(blueprint_file)
self.build_api(api_file)
def build_blueprint(self, file: str):
self.logger.debug("Building blueprint schema...", file=file)
blueprint_builder = SchemaBuilder()
blueprint_builder.build()
with open(file, "w") as _schema:
_schema.write(
dumps(blueprint_builder.schema, indent=4, default=SchemaBuilder.json_default)
)
def build_api(self, file: str):
self.logger.debug("Building API schema...", file=file)
generator = SchemaGenerator()
schema = generator.get_schema(request=None, public=True)
GENERATOR_STATS.emit_summary()
validate_schema(schema)
output = OpenApiYamlRenderer().render(schema, renderer_context={})
with open(file, "wb") as f:
f.write(output)

View File

@@ -1,10 +1,44 @@
"""Pagination which includes total pages and current page"""
from drf_spectacular.plumbing import build_object_type
from rest_framework import pagination
from rest_framework.response import Response
from authentik.api.v3.schema.response import PAGINATION
PAGINATION_COMPONENT_NAME = "Pagination"
PAGINATION_SCHEMA = {
"type": "object",
"properties": {
"next": {
"type": "number",
},
"previous": {
"type": "number",
},
"count": {
"type": "number",
},
"current": {
"type": "number",
},
"total_pages": {
"type": "number",
},
"start_index": {
"type": "number",
},
"end_index": {
"type": "number",
},
},
"required": [
"next",
"previous",
"count",
"current",
"total_pages",
"start_index",
"end_index",
],
}
class Pagination(pagination.PageNumberPagination):
@@ -13,13 +47,6 @@ class Pagination(pagination.PageNumberPagination):
page_query_param = "page"
page_size_query_param = "page_size"
def get_page_size(self, request):
if self.page_size_query_param in request.query_params:
page_size = super().get_page_size(request)
if page_size is not None:
return min(super().get_page_size(request), request.tenant.pagination_max_page_size)
return request.tenant.pagination_default_page_size
def get_paginated_response(self, data):
previous_page_number = 0
if self.page.has_previous():
@@ -43,13 +70,14 @@ class Pagination(pagination.PageNumberPagination):
)
def get_paginated_response_schema(self, schema):
return build_object_type(
properties={
"pagination": PAGINATION.ref,
return {
"type": "object",
"properties": {
"pagination": {"$ref": f"#/components/schemas/{PAGINATION_COMPONENT_NAME}"},
"results": schema,
},
required=["pagination", "results"],
)
"required": ["pagination", "results"],
}
class SmallerPagination(Pagination):

View File

@@ -1,60 +1,96 @@
"""Error Response schema, from https://github.com/axnsan12/drf-yasg/issues/224"""
from collections.abc import Callable
from typing import Any
from django.utils.translation import gettext_lazy as _
from drf_spectacular.generators import SchemaGenerator
from drf_spectacular.plumbing import ResolvedComponent
from drf_spectacular.renderers import OpenApiJsonRenderer
from drf_spectacular.plumbing import (
ResolvedComponent,
build_array_type,
build_basic_type,
build_object_type,
)
from drf_spectacular.settings import spectacular_settings
from structlog.stdlib import get_logger
from drf_spectacular.types import OpenApiTypes
from rest_framework.settings import api_settings
from authentik.api.apps import AuthentikAPIConfig
from authentik.api.v3.schema.query import QUERY_PARAMS
from authentik.api.v3.schema.response import (
GENERIC_ERROR,
GENERIC_ERROR_RESPONSE,
PAGINATION,
VALIDATION_ERROR,
VALIDATION_ERROR_RESPONSE,
from authentik.api.pagination import PAGINATION_COMPONENT_NAME, PAGINATION_SCHEMA
def build_standard_type(obj, **kwargs):
"""Build a basic type with optional add owns."""
schema = build_basic_type(obj)
schema.update(kwargs)
return schema
GENERIC_ERROR = build_object_type(
description=_("Generic API Error"),
properties={
"detail": build_standard_type(OpenApiTypes.STR),
"code": build_standard_type(OpenApiTypes.STR),
},
required=["detail"],
)
VALIDATION_ERROR = build_object_type(
description=_("Validation Error"),
properties={
api_settings.NON_FIELD_ERRORS_KEY: build_array_type(build_standard_type(OpenApiTypes.STR)),
"code": build_standard_type(OpenApiTypes.STR),
},
required=[],
additionalProperties={},
)
LOGGER = get_logger()
def create_component(generator: SchemaGenerator, name, schema, type_=ResolvedComponent.SCHEMA):
"""Register a component and return a reference to it."""
component = ResolvedComponent(
name=name,
type=type_,
schema=schema,
object=name,
)
generator.registry.register_on_missing(component)
return component
def preprocess_schema_exclude_non_api(endpoints: list[tuple[str, Any, Any, Callable]], **kwargs):
"""Filter out all API Views which are not mounted under /api"""
return [
(path, path_regex, method, callback)
for path, path_regex, method, callback in endpoints
if path.startswith("/" + AuthentikAPIConfig.mountpoint)
]
def postprocess_schema_responses(result, generator: SchemaGenerator, **kwargs):
"""Workaround to set a default response for endpoints.
Workaround suggested at
<https://github.com/tfranzel/drf-spectacular/issues/119#issuecomment-656970357>
for the missing drf-spectacular feature discussed in
<https://github.com/tfranzel/drf-spectacular/issues/101>.
"""
create_component(generator, PAGINATION_COMPONENT_NAME, PAGINATION_SCHEMA)
def postprocess_schema_register(
result: dict[str, Any], generator: SchemaGenerator, **kwargs
) -> dict[str, Any]:
"""Register custom schema components"""
LOGGER.debug("Registering custom schemas")
generator.registry.register_on_missing(PAGINATION)
generator.registry.register_on_missing(GENERIC_ERROR)
generator.registry.register_on_missing(GENERIC_ERROR_RESPONSE)
generator.registry.register_on_missing(VALIDATION_ERROR)
generator.registry.register_on_missing(VALIDATION_ERROR_RESPONSE)
for query in QUERY_PARAMS.values():
generator.registry.register_on_missing(query)
return result
generic_error = create_component(generator, "GenericError", GENERIC_ERROR)
validation_error = create_component(generator, "ValidationError", VALIDATION_ERROR)
def postprocess_schema_responses(
result: dict[str, Any], generator: SchemaGenerator, **kwargs
) -> dict[str, Any]:
"""Default error responses"""
LOGGER.debug("Adding default error responses")
for path in result["paths"].values():
for method in path.values():
method["responses"].setdefault("400", VALIDATION_ERROR_RESPONSE.ref)
method["responses"].setdefault("403", GENERIC_ERROR_RESPONSE.ref)
method["responses"].setdefault(
"400",
{
"content": {
"application/json": {
"schema": validation_error.ref,
}
},
"description": "",
},
)
method["responses"].setdefault(
"403",
{
"content": {
"application/json": {
"schema": generic_error.ref,
}
},
"description": "",
},
)
result["components"] = generator.registry.build(spectacular_settings.APPEND_COMPONENTS)
@@ -68,36 +104,10 @@ def postprocess_schema_responses(
return result
def postprocess_schema_query_params(
result: dict[str, Any], generator: SchemaGenerator, **kwargs
) -> dict[str, Any]:
"""Optimize pagination parameters, instead of redeclaring parameters for each endpoint
declare them globally and refer to them"""
LOGGER.debug("Deduplicating query parameters")
for path in result["paths"].values():
for method in path.values():
for idx, param in enumerate(method.get("parameters", [])):
if param["name"] not in QUERY_PARAMS:
continue
method["parameters"][idx] = QUERY_PARAMS[param["name"]].ref
return result
def postprocess_schema_remove_unused(
result: dict[str, Any], generator: SchemaGenerator, **kwargs
) -> dict[str, Any]:
"""Remove unused components"""
# To check whether a schema is used, render the document to JSON and substring-search it.
# Less efficient than walking the tree, but a lot simpler, with no
# possibility of missing a reference
raw = OpenApiJsonRenderer().render(result, renderer_context={}).decode()
count = 0
for key in result["components"][ResolvedComponent.SCHEMA].keys():
schema_usages = raw.count(f"#/components/{ResolvedComponent.SCHEMA}/{key}")
if schema_usages >= 1:
continue
del generator.registry[(key, ResolvedComponent.SCHEMA)]
count += 1
LOGGER.debug("Removing unused components", count=count)
result["components"] = generator.registry.build(spectacular_settings.APPEND_COMPONENTS)
return result
def preprocess_schema_exclude_non_api(endpoints, **kwargs):
"""Filter out all API Views which are not mounted under /api"""
return [
(path, path_regex, method, callback)
for path, path_regex, method, callback in endpoints
if path.startswith("/" + AuthentikAPIConfig.mountpoint)
]
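The removed postprocess_schema_remove_unused hook relies on rendering the schema once and substring-counting `$ref` occurrences; a minimal sketch of that approach, with a hypothetical document:

```python
import json

def remove_unused_components(schema: dict) -> dict:
    """Render the document once, then drop any component whose
    $ref path never appears in the rendered output."""
    raw = json.dumps(schema)
    components = schema.get("components", {}).get("schemas", {})
    for name in list(components):
        if raw.count(f"#/components/schemas/{name}") == 0:
            del components[name]
    return schema

doc = {
    "paths": {"/items/": {"get": {"responses": {"200": {
        "schema": {"$ref": "#/components/schemas/Item"}}}}}},
    "components": {"schemas": {
        "Item": {"type": "object"},
        "Orphan": {"type": "object"},
    }},
}
remove_unused_components(doc)
print(sorted(doc["components"]["schemas"]))  # ['Item']
```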

View File

@@ -2,21 +2,20 @@
import json
from base64 import b64encode
from unittest.mock import patch
from django.conf import settings
from django.test import TestCase
from django.utils import timezone
from rest_framework.exceptions import AuthenticationFailed
from authentik.api.authentication import IPCUser, TokenAuthentication
from authentik.api.authentication import bearer_auth
from authentik.blueprints.tests import reconcile_app
from authentik.common.oauth.constants import SCOPE_AUTHENTIK_API
from authentik.core.models import Token, TokenIntents, UserTypes
from authentik.core.models import Token, TokenIntents, User, UserTypes
from authentik.core.tests.utils import create_test_admin_user, create_test_flow
from authentik.lib.generators import generate_id
from authentik.outposts.apps import MANAGED_OUTPOST
from authentik.outposts.models import Outpost
from authentik.providers.oauth2.constants import SCOPE_AUTHENTIK_API
from authentik.providers.oauth2.models import AccessToken, OAuth2Provider
@@ -25,24 +24,24 @@ class TestAPIAuth(TestCase):
def test_invalid_type(self):
"""Test invalid type"""
self.assertIsNone(TokenAuthentication().bearer_auth(b"foo bar"))
with self.assertRaises(AuthenticationFailed):
bearer_auth(b"foo bar")
def test_invalid_empty(self):
"""Test invalid type"""
self.assertIsNone(TokenAuthentication().bearer_auth(b"Bearer "))
self.assertIsNone(TokenAuthentication().bearer_auth(b""))
self.assertIsNone(bearer_auth(b"Bearer "))
self.assertIsNone(bearer_auth(b""))
def test_invalid_no_token(self):
"""Test invalid with no token"""
auth = b64encode(b":abc").decode()
self.assertIsNone(TokenAuthentication().bearer_auth(f"Basic :{auth}".encode()))
with self.assertRaises(AuthenticationFailed):
auth = b64encode(b":abc").decode()
self.assertIsNone(bearer_auth(f"Basic :{auth}".encode()))
def test_bearer_valid(self):
"""Test valid token"""
token = Token.objects.create(intent=TokenIntents.INTENT_API, user=create_test_admin_user())
user, tk = TokenAuthentication().bearer_auth(f"Bearer {token.key}".encode())
self.assertEqual(user, token.user)
self.assertEqual(token, token)
self.assertEqual(bearer_auth(f"Bearer {token.key}".encode()), token.user)
def test_bearer_valid_deactivated(self):
"""Test valid token"""
@@ -51,7 +50,7 @@ class TestAPIAuth(TestCase):
user.save()
token = Token.objects.create(intent=TokenIntents.INTENT_API, user=user)
with self.assertRaises(AuthenticationFailed):
TokenAuthentication().bearer_auth(f"Bearer {token.key}".encode())
bearer_auth(f"Bearer {token.key}".encode())
@reconcile_app("authentik_outposts")
def test_managed_outpost_fail(self):
@@ -60,21 +59,20 @@ class TestAPIAuth(TestCase):
outpost.user.delete()
outpost.delete()
with self.assertRaises(AuthenticationFailed):
TokenAuthentication().bearer_auth(f"Bearer {settings.SECRET_KEY}".encode())
bearer_auth(f"Bearer {settings.SECRET_KEY}".encode())
@reconcile_app("authentik_outposts")
def test_managed_outpost_success(self):
"""Test managed outpost"""
user, outpost = TokenAuthentication().bearer_auth(f"Bearer {settings.SECRET_KEY}".encode())
user: User = bearer_auth(f"Bearer {settings.SECRET_KEY}".encode())
self.assertEqual(user.type, UserTypes.INTERNAL_SERVICE_ACCOUNT)
self.assertEqual(outpost, Outpost.objects.filter(managed=MANAGED_OUTPOST).first())
def test_jwt_valid(self):
"""Test valid JWT"""
provider = OAuth2Provider.objects.create(
name=generate_id(), client_id=generate_id(), authorization_flow=create_test_flow()
)
access = AccessToken.objects.create(
refresh = AccessToken.objects.create(
user=create_test_admin_user(),
provider=provider,
token=generate_id(),
@@ -82,16 +80,14 @@ class TestAPIAuth(TestCase):
_scope=SCOPE_AUTHENTIK_API,
_id_token=json.dumps({}),
)
user, token = TokenAuthentication().bearer_auth(f"Bearer {access.token}".encode())
self.assertEqual(user, access.user)
self.assertEqual(token, access)
self.assertEqual(bearer_auth(f"Bearer {refresh.token}".encode()), refresh.user)
def test_jwt_missing_scope(self):
"""Test valid JWT"""
provider = OAuth2Provider.objects.create(
name=generate_id(), client_id=generate_id(), authorization_flow=create_test_flow()
)
access = AccessToken.objects.create(
refresh = AccessToken.objects.create(
user=create_test_admin_user(),
provider=provider,
token=generate_id(),
@@ -100,12 +96,4 @@ class TestAPIAuth(TestCase):
_id_token=json.dumps({}),
)
with self.assertRaises(AuthenticationFailed):
TokenAuthentication().bearer_auth(f"Bearer {access.token}".encode())
def test_ipc(self):
"""Test IPC auth (mock key)"""
key = generate_id()
with patch("authentik.api.authentication.ipc_key", key):
user, ctx = TokenAuthentication().bearer_auth(f"Bearer {key}".encode())
self.assertEqual(user, IPCUser())
self.assertEqual(ctx, None)
self.assertEqual(bearer_auth(f"Bearer {refresh.token}".encode()), refresh.user)

View File

@@ -1,16 +1,9 @@
"""Schema generation tests"""
from pathlib import Path
from tempfile import gettempdir
from uuid import uuid4
from django.core.management import call_command
from django.urls import reverse
from rest_framework.test import APITestCase
from yaml import safe_load
from authentik.lib.config import CONFIG
class TestSchemaGeneration(APITestCase):
"""Generic admin tests"""
@@ -28,17 +21,3 @@ class TestSchemaGeneration(APITestCase):
reverse("authentik_api:schema-browser"),
)
self.assertEqual(response.status_code, 200)
def test_build_schema(self):
"""Test schema build command"""
tmp = Path(gettempdir())
blueprint_file = tmp / f"{str(uuid4())}.json"
api_file = tmp / f"{str(uuid4())}.yml"
with (
CONFIG.patch("debug", True),
CONFIG.patch("tenants.enabled", True),
CONFIG.patch("outposts.disable_embedded_outpost", True),
):
call_command("build_schema", blueprint_file=blueprint_file, api_file=api_file)
self.assertTrue(blueprint_file.exists())
self.assertTrue(api_file.exists())

View File

@@ -1,62 +0,0 @@
from collections.abc import Callable
from inspect import getmembers
from django.urls import reverse
from rest_framework.test import APITestCase
from rest_framework.views import APIView
from rest_framework.viewsets import GenericViewSet
from authentik.lib.utils.reflection import all_subclasses
class TestAPIViewAuthnAuthz(APITestCase): ...
def api_viewset_action(viewset: GenericViewSet, member: Callable) -> Callable:
"""Test API Viewset action"""
def tester(self: TestAPIViewAuthnAuthz):
if "permission_classes" in member.kwargs:
self.assertNotEqual(
member.kwargs["permission_classes"], [], "permission_classes should not be empty"
)
if "authentication_classes" in member.kwargs:
self.assertNotEqual(
member.kwargs["authentication_classes"],
[],
"authentication_classes should not be empty",
)
return tester
def api_view(view: APIView) -> Callable:
def tester(self: TestAPIViewAuthnAuthz):
self.assertNotEqual(view.permission_classes, [], "permission_classes should not be empty")
self.assertNotEqual(
view.authentication_classes,
[],
"authentication_classes should not be empty",
)
return tester
# Tell django to load all URLs
reverse("authentik_core:root-redirect")
for viewset in all_subclasses(GenericViewSet):
for act_name, member in getmembers(viewset(), lambda x: isinstance(x, Callable)):
if not hasattr(member, "kwargs") or not hasattr(member, "mapping"):
continue
setattr(
TestAPIViewAuthnAuthz,
f"test_viewset_{viewset.__name__}_action_{act_name}",
api_viewset_action(viewset, member),
)
for view in all_subclasses(APIView):
setattr(
TestAPIViewAuthnAuthz,
f"test_view_{view.__name__}",
api_view(view),
)
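The file above attaches one generated test method per discovered view class via `setattr`. The pattern can be sketched without Django, using only the standard `unittest` module (the `Greeter` class below is a made-up stand-in for the discovered viewsets):

```python
import unittest
from collections.abc import Callable


class Greeter:
    """Example class whose methods we want to generate tests for."""

    def hello(self) -> str:
        return "hello"

    def goodbye(self) -> str:
        return "goodbye"


class TestGenerated(unittest.TestCase):
    """Test case that starts empty; methods are attached below."""


def make_test(method_name: str) -> Callable:
    """Build one test function that checks the method returns a non-empty string."""

    def tester(self: TestGenerated):
        result = getattr(Greeter(), method_name)()
        self.assertIsInstance(result, str)
        self.assertNotEqual(result, "")

    return tester


# Attach one generated test per method, mirroring the setattr loop above.
for name in ("hello", "goodbye"):
    setattr(TestGenerated, f"test_generated_{name}", make_test(name))
```

Because the methods are attached before the test loader inspects the class, they are collected like hand-written tests.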


@@ -1,9 +1,10 @@
"""core Configs API"""
+from pathlib import Path
from django.conf import settings
from django.db import models
from django.dispatch import Signal
from django.http import HttpRequest
from drf_spectacular.utils import extend_schema
from rest_framework.fields import (
BooleanField,
@@ -18,8 +19,6 @@ from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.views import APIView
-from authentik.admin.files.manager import get_file_manager
-from authentik.admin.files.usage import FileUsage
from authentik.core.api.utils import PassiveSerializer
from authentik.events.context_processors.base import get_context_processors
from authentik.lib.config import CONFIG
@@ -31,7 +30,6 @@ class Capabilities(models.TextChoices):
"""Define capabilities which influence which APIs can/should be used"""
CAN_SAVE_MEDIA = "can_save_media"
-CAN_SAVE_REPORTS = "can_save_reports"
CAN_GEO_IP = "can_geo_ip"
CAN_ASN = "can_asn"
CAN_IMPERSONATE = "can_impersonate"
@@ -58,6 +56,7 @@ class ConfigSerializer(PassiveSerializer):
cache_timeout = IntegerField(required=True)
cache_timeout_flows = IntegerField(required=True)
cache_timeout_policies = IntegerField(required=True)
+cache_timeout_reputation = IntegerField(required=True)
class ConfigView(APIView):
@@ -65,30 +64,31 @@ class ConfigView(APIView):
permission_classes = [AllowAny]
-@staticmethod
-def get_capabilities(request: HttpRequest) -> list[Capabilities]:
+def get_capabilities(self) -> list[Capabilities]:
"""Get all capabilities this server instance supports"""
caps = []
-if get_file_manager(FileUsage.MEDIA).manageable:
+deb_test = settings.DEBUG or settings.TEST
+if (
+CONFIG.get("storage.media.backend", "file") == "s3"
+or Path(settings.STORAGES["default"]["OPTIONS"]["location"]).is_mount()
+or deb_test
+):
caps.append(Capabilities.CAN_SAVE_MEDIA)
-if get_file_manager(FileUsage.REPORTS).manageable:
-caps.append(Capabilities.CAN_SAVE_REPORTS)
for processor in get_context_processors():
if cap := processor.capability():
caps.append(cap)
-if request.tenant.impersonation:
+if self.request.tenant.impersonation:
caps.append(Capabilities.CAN_IMPERSONATE)
if settings.DEBUG: # pragma: no cover
caps.append(Capabilities.CAN_DEBUG)
if "authentik.enterprise" in settings.INSTALLED_APPS:
caps.append(Capabilities.IS_ENTERPRISE)
-for _, result in capabilities.send(sender=ConfigView):
+for _, result in capabilities.send(sender=self):
if result:
caps.append(result)
return caps
-@staticmethod
-def get_config(request: HttpRequest) -> ConfigSerializer:
+def get_config(self) -> ConfigSerializer:
"""Get Config"""
return ConfigSerializer(
{
@@ -99,14 +99,15 @@ class ConfigView(APIView):
"send_pii": CONFIG.get("error_reporting.send_pii"),
"traces_sample_rate": float(CONFIG.get("error_reporting.sample_rate", 0.4)),
},
-"capabilities": ConfigView.get_capabilities(request),
+"capabilities": self.get_capabilities(),
"cache_timeout": CONFIG.get_int("cache.timeout"),
"cache_timeout_flows": CONFIG.get_int("cache.timeout_flows"),
"cache_timeout_policies": CONFIG.get_int("cache.timeout_policies"),
+"cache_timeout_reputation": CONFIG.get_int("cache.timeout_reputation"),
}
)
@extend_schema(responses={200: ConfigSerializer(many=False)})
def get(self, request: Request) -> Response:
"""Retrieve public configuration options"""
-return Response(ConfigView.get_config(request).data)
+return Response(self.get_config().data)


@@ -1,65 +0,0 @@
from django.utils.translation import gettext_lazy as _
from drf_spectacular.plumbing import (
ResolvedComponent,
build_basic_type,
build_parameter_type,
)
from drf_spectacular.types import OpenApiTypes
QUERY_PARAMS = {
"ordering": ResolvedComponent(
name="QueryPaginationOrdering",
type=ResolvedComponent.PARAMETER,
object="QueryPaginationOrdering",
schema=build_parameter_type(
name="ordering",
schema=build_basic_type(OpenApiTypes.STR),
location="query",
description=_("Which field to use when ordering the results."),
),
),
"page": ResolvedComponent(
name="QueryPaginationPage",
type=ResolvedComponent.PARAMETER,
object="QueryPaginationPage",
schema=build_parameter_type(
name="page",
schema=build_basic_type(OpenApiTypes.INT),
location="query",
description=_("A page number within the paginated result set."),
),
),
"page_size": ResolvedComponent(
name="QueryPaginationPageSize",
type=ResolvedComponent.PARAMETER,
object="QueryPaginationPageSize",
schema=build_parameter_type(
name="page_size",
schema=build_basic_type(OpenApiTypes.INT),
location="query",
description=_("Number of results to return per page."),
),
),
"search": ResolvedComponent(
name="QuerySearch",
type=ResolvedComponent.PARAMETER,
object="QuerySearch",
schema=build_parameter_type(
name="search",
schema=build_basic_type(OpenApiTypes.STR),
location="query",
description=_("A search term."),
),
),
# Not related to pagination but a very common query param
"name": ResolvedComponent(
name="QueryName",
type=ResolvedComponent.PARAMETER,
object="QueryName",
schema=build_parameter_type(
name="name",
schema=build_basic_type(OpenApiTypes.STR),
location="query",
),
),
}


@@ -1,84 +0,0 @@
from django.utils.translation import gettext_lazy as _
from drf_spectacular.plumbing import (
ResolvedComponent,
build_array_type,
build_basic_type,
build_object_type,
)
from drf_spectacular.types import OpenApiTypes
from rest_framework.settings import api_settings
GENERIC_ERROR = ResolvedComponent(
name="GenericError",
type=ResolvedComponent.SCHEMA,
object="GenericError",
schema=build_object_type(
description=_("Generic API Error"),
properties={
"detail": build_basic_type(OpenApiTypes.STR),
"code": build_basic_type(OpenApiTypes.STR),
},
required=["detail"],
),
)
GENERIC_ERROR_RESPONSE = ResolvedComponent(
name="GenericErrorResponse",
type=ResolvedComponent.RESPONSE,
object="GenericErrorResponse",
schema={
"content": {"application/json": {"schema": GENERIC_ERROR.ref}},
"description": "",
},
)
VALIDATION_ERROR = ResolvedComponent(
"ValidationError",
object="ValidationError",
type=ResolvedComponent.SCHEMA,
schema=build_object_type(
description=_("Validation Error"),
properties={
api_settings.NON_FIELD_ERRORS_KEY: build_array_type(build_basic_type(OpenApiTypes.STR)),
"code": build_basic_type(OpenApiTypes.STR),
},
required=[],
additionalProperties={},
),
)
VALIDATION_ERROR_RESPONSE = ResolvedComponent(
name="ValidationErrorResponse",
type=ResolvedComponent.RESPONSE,
object="ValidationErrorResponse",
schema={
"content": {
"application/json": {
"schema": VALIDATION_ERROR.ref,
}
},
"description": "",
},
)
PAGINATION = ResolvedComponent(
name="Pagination",
type=ResolvedComponent.SCHEMA,
object="Pagination",
schema=build_object_type(
properties={
"next": build_basic_type(OpenApiTypes.NUMBER),
"previous": build_basic_type(OpenApiTypes.NUMBER),
"count": build_basic_type(OpenApiTypes.NUMBER),
"current": build_basic_type(OpenApiTypes.NUMBER),
"total_pages": build_basic_type(OpenApiTypes.NUMBER),
"start_index": build_basic_type(OpenApiTypes.NUMBER),
"end_index": build_basic_type(OpenApiTypes.NUMBER),
},
required=[
"next",
"previous",
"count",
"current",
"total_pages",
"start_index",
"end_index",
],
),
)
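The `GenericError` component above requires `detail` and allows an optional string `code`. A hand-rolled sketch of a check that a response body matches that shape (not using drf-spectacular; purely illustrative):

```python
def matches_generic_error(payload: dict) -> bool:
    """Check a response body against the GenericError schema:
    'detail' is a required string, 'code' is an optional string."""
    if not isinstance(payload.get("detail"), str):
        return False
    if "code" in payload and not isinstance(payload["code"], str):
        return False
    return True


# A well-formed error body passes; one missing 'detail' does not.
assert matches_generic_error({"detail": "Not found.", "code": "not_found"})
assert not matches_generic_error({"code": "no_detail"})
```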


@@ -1,50 +0,0 @@
from collections.abc import Callable
from functools import wraps
from typing import Literal
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.serializers import Serializer
from rest_framework.viewsets import ViewSet
def validate(serializer_type: type[Serializer], location: Literal["body", "query"] = "body"):
"""Validate incoming data with the specified serializer. Raw data can either be taken
from request body or query string, defaulting to body.
Validated data is added to the function this decorator is used on with a named parameter
based on the location of the data.
Example:
@validate(MySerializer)
@validate(MyQuerySerializer, location="query")
def my_action(self, request, *, body: MySerializer, query: MyQuerySerializer):
...
"""
def wrapper_outer(func: Callable):
@wraps(func)
def wrapper(self: ViewSet, request: Request, *args, **kwargs) -> Response:
data = {}
if location == "body":
data = request.data
elif location == "query":
data = request.query_params
else:
raise ValueError(f"Invalid data location '{location}'")
instance = serializer_type(
data=data,
context={
"request": request,
},
)
instance.is_valid(raise_exception=True)
kwargs[location] = instance
return func(self, request, *args, **kwargs)
return wrapper
return wrapper_outer
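Since the decorator itself is plain Python, its control flow can be exercised without DRF. A minimal framework-free sketch of the same pattern (the `FakeSerializer`, `FakeRequest`, and `ValidationError` stand-ins below are hypothetical substitutes for the rest_framework types):

```python
from collections.abc import Callable
from functools import wraps
from typing import Literal


class ValidationError(Exception):
    """Stand-in for rest_framework's ValidationError."""


class FakeSerializer:
    """Stand-in serializer: requires a 'name' key in the input data."""

    def __init__(self, data=None, context=None):
        self.data = data or {}
        self.context = context or {}

    def is_valid(self, raise_exception=False):
        valid = "name" in self.data
        if not valid and raise_exception:
            raise ValidationError("'name' is required")
        return valid


def validate(serializer_type, location: Literal["body", "query"] = "body"):
    """Validate incoming data and pass the serializer as a named parameter."""

    def wrapper_outer(func: Callable):
        @wraps(func)
        def wrapper(self, request, *args, **kwargs):
            # Pick the raw data source based on the declared location.
            data = request.data if location == "body" else request.query_params
            instance = serializer_type(data=data, context={"request": request})
            instance.is_valid(raise_exception=True)
            kwargs[location] = instance  # expose validated data to the action
            return func(self, request, *args, **kwargs)

        return wrapper

    return wrapper_outer


class FakeRequest:
    def __init__(self, data=None, query_params=None):
        self.data = data or {}
        self.query_params = query_params or {}


class View:
    @validate(FakeSerializer)
    def my_action(self, request, *, body: FakeSerializer):
        return body.data["name"]
```

Calling `View().my_action(FakeRequest(data={"name": "x"}))` returns `"x"`, while invalid input raises `ValidationError` before the action body runs.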


@@ -1,12 +1,10 @@
"""authentik Blueprints app"""
-import traceback
from collections.abc import Callable
from importlib import import_module
from inspect import ismethod
from django.apps import AppConfig
from django.conf import settings
from django.db import DatabaseError, InternalError, ProgrammingError
from dramatiq.broker import get_broker
from structlog.stdlib import BoundLogger, get_logger
@@ -46,21 +44,8 @@ class ManagedAppConfig(AppConfig):
module_name = f"{self.name}.{rel_module}"
import_module(module_name)
self.logger.info("Imported related module", module=module_name)
-except ModuleNotFoundError as exc:
-if settings.DEBUG:
-# This is a heuristic for determining whether the exception was caused
-# "directly" by the `import_module` call or whether the initial import
-# succeeded and a later import (within the existing module) failed.
-# 1. <the calling function>
-# 2. importlib.import_module
-# 3. importlib._bootstrap._gcd_import
-# 4. importlib._bootstrap._find_and_load
-# 5. importlib._bootstrap._find_and_load_unlocked
-STACK_LENGTH_HEURISTIC = 5
-stack_length = len(traceback.extract_tb(exc.__traceback__))
-if stack_length > STACK_LENGTH_HEURISTIC:
-raise
+except ModuleNotFoundError:
+pass
import_relative("checks")
import_relative("tasks")
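The stack-depth heuristic being removed above can be demonstrated in isolation: a module that is itself missing fails with a short importlib traceback, while a module that exists but performs a failing import inside its body produces a deeper one. A standalone sketch (the module names below are made up for the demo):

```python
import sys
import tempfile
import traceback
from importlib import import_module
from pathlib import Path


def import_failure_depth(module_name: str) -> int:
    """Return the traceback depth of a failed import_module call."""
    try:
        import_module(module_name)
    except ModuleNotFoundError as exc:
        return len(traceback.extract_tb(exc.__traceback__))
    raise AssertionError("import unexpectedly succeeded")


# Case 1: the module itself does not exist -> shallow traceback
# (only the caller plus importlib's bootstrap frames).
direct_depth = import_failure_depth("definitely_not_a_real_module_xyz")

# Case 2: the module exists but its body imports something missing ->
# deeper traceback, because the failing frame sits inside the loaded
# module's own code, past the importlib frames.
tmp = Path(tempfile.mkdtemp())
(tmp / "demo_broken_module.py").write_text("import also_not_a_real_module_xyz\n")
sys.path.insert(0, str(tmp))
indirect_depth = import_failure_depth("demo_broken_module")
```

In the removed heuristic, a depth beyond the importlib frames meant the module itself loaded and a nested import failed, so the error was re-raised instead of swallowed.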

Some files were not shown because too many files have changed in this diff.