Compare commits

..

26 Commits

Author SHA1 Message Date
Jens Langhammer
99a56a5b9c add context
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-16 15:01:39 +01:00
Jens Langhammer
73afaed115 more tests, get rid of code smells
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-16 14:55:02 +01:00
Jens Langhammer
8b758402c0 more
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-16 14:36:38 +01:00
Jens Langhammer
050c9c31af no more relative imports
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-16 13:44:07 +01:00
Jens Langhammer
921269f990 more typing
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-16 13:43:29 +01:00
Jens Langhammer
87732a413c add some tests, some typing, introduce variable
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-16 13:36:38 +01:00
Jens Langhammer
8cfe83bd47 Merge branch 'main' into packages/akql 2025-12-16 12:55:37 +01:00
dependabot[bot]
42c4fee053 core: bump goauthentik/fips-debian from c10cd2c to 2f19fc1 (#18856)
Bumps goauthentik/fips-debian from `c10cd2c` to `2f19fc1`.

---
updated-dependencies:
- dependency-name: goauthentik/fips-debian
  dependency-version: trixie-slim-fips
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-16 12:33:13 +01:00
Dominic R
26cfbe67f3 admin/files: fix get_objects_for_user queryset argument in FileUsedByView (#18845)
Co-authored-by: Marcelo Elizeche Landó <marcelo@goauthentik.io>
2025-12-16 00:39:13 +00:00
Marcelo Elizeche Landó
2a17024afc core: skip s3 tests if endpoint isn't available (#18841)
skip s3 tests if endpoint isn't available
2025-12-15 20:22:59 -03:00
Connor Peshek
c557b55e0e crypto: Store details parsed from includeDetails in database instead (#18013)
* crypto: Store details parsed from includeDetails in database instead

* fix signal for tests

* Update authentik/crypto/signals.py

Co-authored-by: Jens L. <jens@goauthentik.io>
Signed-off-by: Connor Peshek <connor@connorpeshek.me>

* Update authentik/crypto/apps.py

Co-authored-by: Jens L. <jens@goauthentik.io>
Signed-off-by: Connor Peshek <connor@connorpeshek.me>

* Update authentik/crypto/signals.py

Co-authored-by: Jens L. <jens@goauthentik.io>
Signed-off-by: Connor Peshek <connor@connorpeshek.me>

* Add feedback

* cleanup

* update

* cleanup

* simplify serializer

Signed-off-by: Jens Langhammer <jens@goauthentik.io>

* Update KID for when updating certificates

* lint

---------

Signed-off-by: Connor Peshek <connor@connorpeshek.me>
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
Co-authored-by: connor peshek <connorpeshek@connors-MacBook-Pro.local>
Co-authored-by: Jens L. <jens@goauthentik.io>
2025-12-15 13:50:16 -06:00
Roi Gabay
f56e354e38 website/docs: add jellyseer integration doc (#18812)
* website/docs: add jellyseer integration doc

* Slight tweaks

* Apply suggestions from code review

Co-authored-by: Dominic R <dominic@sdko.org>
Signed-off-by: Dewi Roberts <dewi@goauthentik.io>

* Update website/integrations/media/jellyseerr/index.md

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>

* Apply suggestions from code review

Co-authored-by: Dominic R <dominic@sdko.org>
Signed-off-by: Dewi Roberts <dewi@goauthentik.io>

---------

Signed-off-by: Dewi Roberts <dewi@goauthentik.io>
Co-authored-by: dewi-tik <dewi@goauthentik.io>
Co-authored-by: Dominic R <dominic@sdko.org>
2025-12-15 17:12:06 +00:00
Marc 'risson' Schmitt
c50c2b0e0c admin/files: revert add check for /media existence (#18636) (#18829) 2025-12-15 15:29:21 +00:00
dependabot[bot]
662124cac9 core: bump goauthentik.io/api/v3 from 3.2025120.26 to 3.2026020.1 (#18815)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-15 15:20:48 +00:00
Marc 'risson' Schmitt
3d671a901b packages/django-dramatiq-postgres: broker: close django connections on consumer close (#18833)
Co-authored-by: Norman Ziebal <norman.ziebal@mail.schwarz>
2025-12-15 14:59:51 +01:00
Simonyi Gergő
a7fb031b64 core: remove superuser check from Token list (#18684) 2025-12-15 14:29:42 +01:00
Dewi Roberts
2818b0bbdf website/docs: add icon info to style guide (#18832) 2025-12-15 13:27:22 +00:00
Ryan Pesek
60075e39fb core: list applications fix (#18798) 2025-12-15 13:16:07 +01:00
dependabot[bot]
c112f702b3 ci: bump actions/cache from 5.0.0 to 5.0.1 (#18823)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-15 13:15:05 +01:00
dependabot[bot]
42b3323b3d ci: bump actions/download-artifact from 6.0.0 to 7.0.0 (#18825)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-15 13:14:22 +01:00
dependabot[bot]
78380831de core: bump goauthentik/fips-debian from 07f41ce to c10cd2c (#18822)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-15 13:14:09 +01:00
dependabot[bot]
8b5195aeff ci: bump actions/upload-artifact from 5.0.0 to 6.0.0 (#18824)
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 5.0.0 to 6.0.0.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](330a01c490...b7c566a772)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-15 12:45:16 +01:00
dependabot[bot]
d762e38027 ci: bump astral-sh/setup-uv from 7.1.5 to 7.1.6 in /.github/actions/setup (#18826)
ci: bump astral-sh/setup-uv in /.github/actions/setup

Bumps [astral-sh/setup-uv](https://github.com/astral-sh/setup-uv) from 7.1.5 to 7.1.6.
- [Release notes](https://github.com/astral-sh/setup-uv/releases)
- [Commits](ed21f2f24f...681c641aba)

---
updated-dependencies:
- dependency-name: astral-sh/setup-uv
  dependency-version: 7.1.6
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-15 12:36:47 +01:00
Marcelo Elizeche Landó
e427cb611e root: Add macOS support for sed in Makefile (#18795)
Add macOS support for sed
2025-12-15 12:09:35 +01:00
Jens Langhammer
1df84d68dd first cleanup pass
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-08 18:04:30 +01:00
Jens Langhammer
7f8527461a initial fork
Signed-off-by: Jens Langhammer <jens@goauthentik.io>
2025-12-08 17:57:47 +01:00
60 changed files with 3058 additions and 754 deletions

View File

@@ -1,63 +0,0 @@
# syntax=docker/dockerfile:1
# Start from the same FIPS Python base as production (python-base stage)
FROM ghcr.io/goauthentik/fips-python:3.13.9-slim-trixie-fips@sha256:700fc8c1e290bd14e5eaca50b1d8e8c748c820010559cbfb4c4f8dfbe2c4c9ff
USER root
# Setup environment matching production python-base stage
ENV VENV_PATH="/ak-root/.venv" \
PATH="/lifecycle:/ak-root/.venv/bin:$PATH" \
UV_COMPILE_BYTECODE=1 \
UV_LINK_MODE=copy \
UV_NATIVE_TLS=1 \
UV_PYTHON_DOWNLOADS=0
WORKDIR /ak-root
# Copy uv package manager
COPY --from=ghcr.io/astral-sh/uv:0.9.7@sha256:ba4857bf2a068e9bc0e64eed8563b065908a4cd6bfb66b531a9c424c8e25e142 /uv /uvx /bin/
# Install build dependencies
RUN rm -f /etc/apt/apt.conf.d/docker-clean && \
echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache && \
apt-get update && \
apt-get install -y --no-install-recommends \
# Build essentials
build-essential pkg-config libffi-dev git binutils \
# cryptography
curl \
# libxml
libxslt-dev zlib1g-dev \
# postgresql
libpq-dev \
# python-kadmin-rs and kerberos testing
clang libkrb5-dev sccache krb5-kdc krb5-admin-server \
# xmlsec
libltdl-dev \
# runit (for chpst command used by lifecycle/ak)
runit \
# sudo (required by devcontainer features)
sudo && \
rm -rf /var/lib/apt/lists/*
# Environment for building native Python packages
ENV UV_NO_BINARY_PACKAGE="cryptography lxml python-kadmin-rs xmlsec" \
RUSTUP_PERMIT_COPY_RENAME="true"
# Create authentik user with proper home directory (required for devcontainer features)
RUN adduser --disabled-password --gecos "" --uid 1000 --home /home/authentik authentik && \
mkdir -p /certs /media /ak-root && \
chown -R authentik:authentik /certs /media /ak-root /home/authentik && \
echo "authentik ALL=(ALL) NOPASSWD:ALL" >> /etc/sudoers.d/authentik
# FIPS configuration for Go development
# Don't set GOFIPS/GOFIPS140 globally to avoid breaking Go tools like docker-compose
# These will be set when building/running authentik Go code (see lifecycle/ak and Makefile)
ENV CGO_ENABLED=1
# Set TMPDIR for PID files and temp data
# Use /tmp instead of /dev/shm for development because go run needs to execute binaries
ENV TMPDIR=/tmp
USER authentik

View File

@@ -1,68 +0,0 @@
{
    "name": "authentik",
    "dockerComposeFile": "docker-compose.yml",
    "service": "app",
    "workspaceFolder": "/ak-root",
    "containerUser": "authentik",
    "remoteUser": "authentik",
    "shutdownAction": "stopCompose",
    "containerEnv": {
        "LOCAL_PROJECT_DIR": "/ak-root"
    },
    "features": {
        "ghcr.io/devcontainers/features/go:1": {
            "version": "1.24"
        },
        "ghcr.io/devcontainers/features/node:1": {
            "version": "24"
        },
        "ghcr.io/devcontainers/features/rust:1": {
            "version": "latest"
        },
        "ghcr.io/devcontainers/features/docker-in-docker:2": {
            "version": "latest",
            "moby": false
        }
    },
    "mounts": [],
    "forwardPorts": [9000, 9443],
    "portsAttributes": {
        "8000": {
            "onAutoForward": "ignore"
        },
        "3963": {
            "onAutoForward": "ignore"
        },
        "35151": {
            "onAutoForward": "ignore"
        },
        "9901": {
            "onAutoForward": "ignore"
        }
    },
    "postCreateCommand": "bash .devcontainer/setup.sh",
    "customizations": {
        "vscode": {
            "extensions": [
                "EditorConfig.EditorConfig",
                "bashmish.es6-string-css",
                "dbaeumer.vscode-eslint",
                "esbenp.prettier-vscode",
                "golang.go",
                "Gruntfuggly.todo-tree",
                "ms-python.black-formatter",
                "ms-python.isort",
                "ms-python.pylint",
                "ms-python.python",
                "ms-python.vscode-pylance",
                "redhat.vscode-yaml",
                "Tobermory.es6-string-html",
                "charliermarsh.ruff"
            ],
            "settings": {
                "python.defaultInterpreterPath": "/ak-root/.venv/bin/python",
                "python.terminal.activateEnvironment": true
            }
        }
    }
}

View File

@@ -1,50 +0,0 @@
services:
  app:
    build:
      context: ..
      dockerfile: .devcontainer/Dockerfile
    user: authentik
    privileged: true
    volumes:
      - ../:/ak-root
    entrypoint: []
    command: sleep infinity
    depends_on:
      postgresql:
        condition: service_healthy
    env_file: .env
    environment:
      PATH: "/ak-root/.venv/bin:${PATH}"
    ports:
      - "9000:9000"
      - "9443:9443"
  postgresql:
    image: docker.io/library/postgres:16
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -d authentik -U postgres"]
      interval: 10s
      timeout: 5s
      retries: 5
      start_period: 20s
    volumes:
      - postgres-data:/var/lib/postgresql/data
    env_file: .env
    command: ["postgres", "-c", "log_statement=all", "-c", "log_destination=stderr"]
  s3:
    image: docker.io/zenko/cloudserver
    env_file: .env
    environment:
      REMOTE_MANAGEMENT_DISABLE: "1"
    ports:
      - "8020:8000"
    volumes:
      - s3-data:/usr/src/app/localData
      - s3-metadata:/usr/src/app/localMetadata
volumes:
  postgres-data:
  s3-data:
  s3-metadata:

View File

@@ -1,37 +0,0 @@
#!/usr/bin/env bash
set -e
echo "======================================"
echo "Running authentik devcontainer setup"
echo "======================================"
echo ""
echo "Step 1/5: Installing dependencies"
make install
echo ""
echo "Step 2/5: Generating development config"
make gen-dev-config
echo ""
echo "Step 3/5: Running database migrations"
make migrate
echo ""
echo "Step 4/5: Generating API clients"
make gen
echo ""
echo "Step 5/5: Building web assets"
make web
echo ""
echo "======================================"
echo "Setup complete!"
echo "======================================"
echo ""
echo "You can now run:"
echo " - 'make run-server' to start the backend server"
echo " - 'make run-worker' to start the worker (must be run once after initial setup)"
echo " - 'make web-watch' for live web development"
echo ""

View File

@@ -21,7 +21,7 @@ runs:
         sudo apt-get install --no-install-recommends -y libpq-dev openssl libxmlsec1-dev pkg-config gettext libkrb5-dev krb5-kdc krb5-user krb5-admin-server
     - name: Install uv
       if: ${{ contains(inputs.dependencies, 'python') }}
-      uses: astral-sh/setup-uv@ed21f2f24f8dd64503750218de024bcf64c7250a # v5
+      uses: astral-sh/setup-uv@681c641aba71e4a1c380be3ab5e12ad51f415867 # v5
       with:
         enable-cache: true
     - name: Setup python

View File

@@ -41,7 +41,7 @@ jobs:
       - working-directory: website/
         name: Install Dependencies
        run: npm ci
-      - uses: actions/cache@a7833574556fa59680c1b7cb190c1735db73ebf0 # v4
+      - uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v4
        with:
          path: |
            ${{ github.workspace }}/website/api/.docusaurus
@@ -55,7 +55,7 @@ jobs:
        env:
          NODE_ENV: production
        run: npm run build -w api
-      - uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v4
+      - uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v4
        with:
          name: api-docs
          path: website/api/build
@@ -67,7 +67,7 @@ jobs:
        - build
      steps:
        - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v5
-       - uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v5
+       - uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v5
        with:
          name: api-docs
          path: website/api/build

View File

@@ -201,7 +201,7 @@ jobs:
         run: |
           docker compose -f tests/e2e/docker-compose.yml up -d --quiet-pull
       - id: cache-web
-        uses: actions/cache@a7833574556fa59680c1b7cb190c1735db73ebf0 # v4
+        uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v4
         with:
           path: web/dist
           key: ${{ runner.os }}-web-${{ hashFiles('web/package-lock.json', 'package-lock.json', 'web/src/**', 'web/packages/sfe/src/**') }}-b
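The action bumps in these workflow diffs all follow the same convention: pin each action to a full, immutable commit SHA and keep the human-readable tag in a trailing comment. A generic sketch of the pattern (SHA shown is from one of the diffs above):

```yaml
steps:
  # Pin to a commit SHA so the workflow cannot change if the tag is moved;
  # the comment records which release the SHA corresponds to.
  - uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v4
```

Tools like dependabot understand this convention and update both the SHA and the comment together.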

View File

@@ -9,6 +9,13 @@
 NPM_VERSION = $(shell python -m scripts.generate_semver)
 PY_SOURCES = authentik packages tests scripts lifecycle .github
 DOCKER_IMAGE ?= "authentik:test"
+UNAME_S := $(shell uname -s)
+ifeq ($(UNAME_S),Darwin)
+SED_INPLACE = sed -i ''
+else
+SED_INPLACE = sed -i
+endif
 GEN_API_TS = gen-ts-api
 GEN_API_PY = gen-py-api
 GEN_API_GO = gen-go-api
@@ -46,7 +53,7 @@ help: ## Show this help
 	@echo ""

 go-test:
-	GOFIPS140=latest CGO_ENABLED=1 go test -timeout 0 -v -race -cover ./...
+	go test -timeout 0 -v -race -cover ./...

 test: ## Run the server tests and produce a coverage report (locally)
 	$(KRB_PATH) uv run coverage run manage.py test --keepdb $(or $(filter-out $@,$(MAKECMDGOALS)),authentik)
@@ -119,8 +126,8 @@ bump: ## Bump authentik version. Usage: make bump version=20xx.xx.xx
 ifndef version
 	$(error Usage: make bump version=20xx.xx.xx )
 endif
-	sed -i 's/^version = ".*"/version = "$(version)"/' pyproject.toml
-	sed -i 's/^VERSION = ".*"/VERSION = "$(version)"/' authentik/__init__.py
+	$(SED_INPLACE) 's/^version = ".*"/version = "$(version)"/' pyproject.toml
+	$(SED_INPLACE) 's/^VERSION = ".*"/VERSION = "$(version)"/' authentik/__init__.py
 	$(MAKE) gen-build gen-compose aws-cfn
 	npm version --no-git-tag-version --allow-same-version $(version)
 	cd ${PWD}/web && npm version --no-git-tag-version --allow-same-version $(version)
@@ -155,8 +162,8 @@ gen-diff: ## (Release) generate the changelog diff between the current schema a
 		/local/schema-old.yml \
 		/local/schema.yml
 	rm schema-old.yml
-	sed -i 's/{/&#123;/g' diff.md
-	sed -i 's/}/&#125;/g' diff.md
+	$(SED_INPLACE) 's/{/&#123;/g' diff.md
+	$(SED_INPLACE) 's/}/&#125;/g' diff.md
 	npx prettier --write diff.md

 gen-clean-ts: ## Remove generated API client for TypeScript
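The `SED_INPLACE` variable exists because BSD sed on macOS requires an explicit (possibly empty) backup suffix after `-i`, while GNU sed accepts `-i` with no argument. A shell sketch of the same selection logic (the temp-file path is illustrative, not from the Makefile):

```shell
# Pick the in-place sed invocation for the current platform, as the Makefile does.
case "$(uname -s)" in
    Darwin) SED_INPLACE="sed -i ''" ;;  # BSD sed: empty backup suffix required
    *)      SED_INPLACE="sed -i" ;;     # GNU sed: -i takes no argument
esac

# Demonstrate the substitution the bump target performs.
printf 'version = "1.0.0"\n' > /tmp/akql_sed_demo.toml
eval "$SED_INPLACE 's/1\\.0\\.0/2.0.0/' /tmp/akql_sed_demo.toml"
cat /tmp/akql_sed_demo.toml
```

Without the guard, `sed -i 's/…/…/' file` on macOS would treat the script as the backup suffix and fail.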

View File

@@ -240,7 +240,9 @@ class FileUsedByView(APIView):
         for field in fields:
             q |= Q(**{field: params.get("name")})

-        objs = get_objects_for_user(request.user, f"{app}.view_{model_name}", model)
+        objs = get_objects_for_user(
+            request.user, f"{app}.view_{model_name}", model.objects.all()
+        )
         objs = objs.filter(q)
         for obj in objs:
             serializer = UsedBySerializer(

View File

@@ -1,9 +1,4 @@
-from pathlib import Path
-
-from django.conf import settings
-
 from authentik.blueprints.apps import ManagedAppConfig
-from authentik.lib.config import CONFIG


 class AuthentikFilesConfig(ManagedAppConfig):
@@ -11,20 +6,3 @@ class AuthentikFilesConfig(ManagedAppConfig):
     label = "authentik_admin_files"
     verbose_name = "authentik Files"
     default = True
-
-    @ManagedAppConfig.reconcile_global
-    def check_for_media_mount(self):
-        if settings.TEST:
-            return
-        from authentik.events.models import Event, EventAction
-
-        if (
-            CONFIG.get("storage.media.backend", CONFIG.get("storage.backend", "file")) == "file"
-            and Path("/media").exists()
-        ):
-            Event.new(
-                EventAction.CONFIGURATION_ERROR,
-                message="/media has been moved to /data/media. "
-                "Check the release notes for migration steps.",
-            ).save()

View File

@@ -1,10 +1,13 @@
+from unittest import skipUnless
+
 from django.test import TestCase

-from authentik.admin.files.tests.utils import FileTestS3BackendMixin
+from authentik.admin.files.tests.utils import FileTestS3BackendMixin, s3_test_server_available
 from authentik.admin.files.usage import FileUsage
 from authentik.lib.config import CONFIG


+@skipUnless(s3_test_server_available(), "S3 test server not available")
 class TestS3Backend(FileTestS3BackendMixin, TestCase):
     """Test S3 backend functionality"""
View File

@@ -1,11 +1,26 @@
 import shutil
+import socket
 from tempfile import mkdtemp
+from urllib.parse import urlparse

 from authentik.admin.files.backends.s3 import S3Backend
 from authentik.admin.files.usage import FileUsage
 from authentik.lib.config import CONFIG, UNSET
 from authentik.lib.generators import generate_id

+S3_TEST_ENDPOINT = "http://localhost:8020"
+
+
+def s3_test_server_available() -> bool:
+    """Check if the S3 test server is reachable."""
+    parsed = urlparse(S3_TEST_ENDPOINT)
+    try:
+        with socket.create_connection((parsed.hostname, parsed.port), timeout=2):
+            return True
+    except OSError:
+        return False
+
+
 class FileTestFileBackendMixin:
     def setUp(self):
@@ -57,7 +72,7 @@ class FileTestS3BackendMixin:
         for key in s3_config_keys:
             self.original_media_s3_settings[key] = CONFIG.get(f"storage.media.s3.{key}", UNSET)
         self.media_s3_bucket_name = f"authentik-test-{generate_id(10)}".lower()
-        CONFIG.set("storage.media.s3.endpoint", "http://localhost:8020")
+        CONFIG.set("storage.media.s3.endpoint", S3_TEST_ENDPOINT)
         CONFIG.set("storage.media.s3.access_key", "accessKey1")
         CONFIG.set("storage.media.s3.secret_key", "secretKey1")
         CONFIG.set("storage.media.s3.bucket_name", self.media_s3_bucket_name)
@@ -70,7 +85,7 @@ class FileTestS3BackendMixin:
         for key in s3_config_keys:
             self.original_reports_s3_settings[key] = CONFIG.get(f"storage.reports.s3.{key}", UNSET)
         self.reports_s3_bucket_name = f"authentik-test-{generate_id(10)}".lower()
-        CONFIG.set("storage.reports.s3.endpoint", "http://localhost:8020")
+        CONFIG.set("storage.reports.s3.endpoint", S3_TEST_ENDPOINT)
        CONFIG.set("storage.reports.s3.access_key", "accessKey1")
        CONFIG.set("storage.reports.s3.secret_key", "secretKey1")
        CONFIG.set("storage.reports.s3.bucket_name", self.reports_s3_bucket_name)
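The reachability probe added in this file is self-contained and can be exercised standalone. This sketch mirrors the diff, with the endpoint made a parameter so it is testable against arbitrary hosts:

```python
import socket
from urllib.parse import urlparse

S3_TEST_ENDPOINT = "http://localhost:8020"


def s3_test_server_available(endpoint: str = S3_TEST_ENDPOINT) -> bool:
    """Return True when a TCP connection to the endpoint's host/port succeeds within 2s."""
    parsed = urlparse(endpoint)
    try:
        # create_connection raises OSError (refused, timeout, unresolvable) on failure
        with socket.create_connection((parsed.hostname, parsed.port), timeout=2):
            return True
    except OSError:
        return False
```

Because the probe only opens a TCP connection, it cannot distinguish the S3 test server from any other listener on that port; that is sufficient here, since it only gates whether the test suite attempts S3 calls at all.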

View File

@@ -180,10 +180,10 @@ class ApplicationViewSet(UsedByMixin, ModelViewSet):
     )
     def _filter_applications_with_launch_url(
-        self, applications: QuerySet[Application]
+        self, paginated_apps: QuerySet[Application]
     ) -> list[Application]:
         applications = []
-        for app in applications:
+        for app in paginated_apps:
             if app.get_launch_url():
                 applications.append(app)
         return applications
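The rename in this hunk matters because the original parameter was shadowed: `applications = []` rebound the name before the loop, so the loop iterated over the freshly created empty list and the method always returned `[]`. A minimal, hypothetical reproduction (the launch-URL check is replaced by simple truthiness for illustration):

```python
def filter_broken(applications):
    """Buggy version: the result list shadows the input parameter."""
    applications = []            # rebinds the name, discarding the input
    for app in applications:     # iterates over the empty list, never the input
        applications.append(app)
    return applications


def filter_fixed(paginated_apps):
    """Fixed version: the renamed parameter keeps the input reachable."""
    applications = []
    for app in paginated_apps:   # input is actually iterated now
        if app:                  # stand-in predicate for app.get_launch_url()
            applications.append(app)
    return applications
```

Linters often flag this pattern as a redefinition of an argument; it is the kind of code smell the "get rid of code smells" commit in this compare targets.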

View File

@@ -246,11 +246,7 @@ class GroupViewSet(UsedByMixin, ModelViewSet):
     ]

     def get_ql_fields(self):
-        from djangoql.schema import BoolField, StrField
-
-        from authentik.enterprise.search.fields import (
-            JSONSearchField,
-        )
+        from akql.schema import BoolField, JSONSearchField, StrField

         return [
             StrField(Group, "name"),
View File

@@ -4,7 +4,6 @@ from typing import Any
 from django.utils.timezone import now
 from drf_spectacular.utils import OpenApiResponse, extend_schema
-from guardian.shortcuts import get_anonymous_user
 from rest_framework.decorators import action
 from rest_framework.exceptions import ValidationError
 from rest_framework.fields import CharField
@@ -145,12 +144,6 @@ class TokenViewSet(UsedByMixin, ModelViewSet):
     owner_field = "user"
     rbac_allow_create_without_perm = True

-    def get_queryset(self):
-        user = self.request.user if self.request else get_anonymous_user()
-        if user.is_superuser:
-            return super().get_queryset()
-        return super().get_queryset().filter(user=user.pk)
-
     def perform_create(self, serializer: TokenSerializer):
         if not self.request.user.is_superuser:
             instance = serializer.save(

View File

@@ -504,12 +504,7 @@ class UserViewSet(
     ]

     def get_ql_fields(self):
-        from djangoql.schema import BoolField, StrField
-
-        from authentik.enterprise.search.fields import (
-            ChoiceSearchField,
-            JSONSearchField,
-        )
+        from akql.schema import BoolField, ChoiceSearchField, JSONSearchField, StrField

         return [
             StrField(User, "username"),

View File

@@ -183,16 +183,16 @@ class TestTokenAPI(APITestCase):
self.assertEqual(len(body["results"]), 1)
self.assertEqual(body["results"][0]["identifier"], token_should.identifier)
def test_list_admin(self):
"""Test Token List (Test with admin auth)"""
def test_list_with_permission(self):
"""Test Token List (Test with `view_token` permission)"""
Token.objects.all().delete()
self.client.force_login(self.admin)
token_should: Token = Token.objects.create(
identifier="test", expiring=False, user=self.user
)
token_should_not: Token = Token.objects.create(
identifier="test-2", expiring=False, user=get_anonymous_user()
)
self.user.assign_perms_to_managed_role("authentik_core.view_token")
response = self.client.get(reverse("authentik_api:token-list"))
body = loads(response.content)
self.assertEqual(len(body["results"]), 2)

View File

@@ -1,7 +1,5 @@
 """Crypto API Views"""

-from datetime import datetime
-
 from cryptography.hazmat.backends import default_backend
 from cryptography.hazmat.primitives.serialization import load_pem_private_key
 from cryptography.x509 import load_pem_x509_certificate
@@ -15,14 +13,12 @@ from drf_spectacular.utils import (
     OpenApiParameter,
     OpenApiResponse,
     extend_schema,
-    extend_schema_field,
 )
 from rest_framework.decorators import action
 from rest_framework.exceptions import ValidationError
 from rest_framework.fields import (
     CharField,
-    ChoiceField,
-    DateTimeField,
     IntegerField,
     SerializerMethodField,
 )
@@ -51,59 +47,15 @@ LOGGER = get_logger()
 class CertificateKeyPairSerializer(ModelSerializer):
     """CertificateKeyPair Serializer"""

-    fingerprint_sha256 = SerializerMethodField()
-    fingerprint_sha1 = SerializerMethodField()
-    cert_expiry = SerializerMethodField()
-    cert_subject = SerializerMethodField()
     private_key_available = SerializerMethodField()
-    key_type = SerializerMethodField()
     certificate_download_url = SerializerMethodField()
     private_key_download_url = SerializerMethodField()

-    @property
-    def _should_include_details(self) -> bool:
-        request: Request = self.context.get("request", None)
-        if not request:
-            return True
-        return str(request.query_params.get("include_details", "true")).lower() == "true"
-
-    def get_fingerprint_sha256(self, instance: CertificateKeyPair) -> str | None:
-        "Get certificate Hash (SHA256)"
-        if not self._should_include_details:
-            return None
-        return instance.fingerprint_sha256
-
-    def get_fingerprint_sha1(self, instance: CertificateKeyPair) -> str | None:
-        "Get certificate Hash (SHA1)"
-        if not self._should_include_details:
-            return None
-        return instance.fingerprint_sha1
-
-    def get_cert_expiry(self, instance: CertificateKeyPair) -> datetime | None:
-        "Get certificate expiry"
-        if not self._should_include_details:
-            return None
-        return DateTimeField().to_representation(instance.certificate.not_valid_after_utc)
-
-    def get_cert_subject(self, instance: CertificateKeyPair) -> str | None:
-        """Get certificate subject as full rfc4514"""
-        if not self._should_include_details:
-            return None
-        return instance.certificate.subject.rfc4514_string()
-
     def get_private_key_available(self, instance: CertificateKeyPair) -> bool:
         """Show if this keypair has a private key configured or not"""
         return instance.key_data != "" and instance.key_data is not None

-    @extend_schema_field(ChoiceField(choices=KeyType.choices, allow_null=True))
-    def get_key_type(self, instance: CertificateKeyPair) -> str | None:
-        """Get the key algorithm type from the certificate's public key"""
-        if not self._should_include_details:
-            return None
-        return instance.key_type
-
     def get_certificate_download_url(self, instance: CertificateKeyPair) -> str:
         """Get URL to download certificate"""
         return (
@@ -175,6 +127,11 @@ class CertificateKeyPairSerializer(ModelSerializer):
             "managed": {"read_only": True},
             "key_data": {"write_only": True},
             "certificate_data": {"write_only": True},
+            "fingerprint_sha256": {"read_only": True},
+            "fingerprint_sha1": {"read_only": True},
+            "cert_expiry": {"read_only": True},
+            "cert_subject": {"read_only": True},
+            "key_type": {"read_only": True},
         }
@@ -216,17 +173,12 @@ class CertificateKeyPairFilter(FilterSet):
         return queryset.exclude(key_data__exact="")

     def filter_key_type(self, queryset, name, value):  # pragma: no cover
-        """Filter certificates by key type using the public key from the certificate"""
+        """Filter certificates by key type using the stored database field"""
         if not value:
             return queryset
-        # value is a list of KeyType enum values from MultipleChoiceFilter
-        filtered_pks = []
-        for cert in queryset:
-            if cert.key_type in value:
-                filtered_pks.append(cert.pk)
-        return queryset.filter(pk__in=filtered_pks)
+        return queryset.filter(key_type__in=value)

     class Meta:
         model = CertificateKeyPair
@@ -263,7 +215,6 @@ class CertificateKeyPairViewSet(UsedByMixin, ModelViewSet):
                     "Can be specified multiple times (e.g. '?key_type=rsa&key_type=ec')"
                 ),
             ),
-            OpenApiParameter("include_details", bool, default=True),
         ]
     )
     def list(self, request, *args, **kwargs):

View File

@@ -0,0 +1,117 @@
# Generated by Django 5.2.9 on 2025-12-09 06:22

from hashlib import md5

from cryptography.hazmat.backends import default_backend
from cryptography.x509 import load_pem_x509_certificate
from django.db import migrations, models

from authentik.crypto.signals import extract_certificate_metadata


def backfill_certificate_metadata(apps, schema_editor):  # noqa: ARG001
    """Backfill certificate metadata and kid for existing records."""
    CertificateKeyPair = apps.get_model("authentik_crypto", "CertificateKeyPair")
    for cert in CertificateKeyPair.objects.all():
        updated_fields = []
        if cert.certificate_data:
            try:
                certificate = load_pem_x509_certificate(
                    cert.certificate_data.encode("utf-8"), default_backend()
                )
                metadata = extract_certificate_metadata(certificate)
                cert.key_type = metadata["key_type"]
                cert.cert_expiry = metadata["cert_expiry"]
                cert.cert_subject = metadata["cert_subject"]
                cert.fingerprint_sha256 = metadata["fingerprint_sha256"]
                cert.fingerprint_sha1 = metadata["fingerprint_sha1"]
                updated_fields.extend(
                    [
                        "key_type",
                        "cert_expiry",
                        "cert_subject",
                        "fingerprint_sha256",
                        "fingerprint_sha1",
                    ]
                )
            except (ValueError, TypeError, AttributeError):
                pass
        # Backfill kid with MD5 for backwards compatibility
        if cert.key_data:
            cert.kid = md5(cert.key_data.encode("utf-8"), usedforsecurity=False).hexdigest()
            updated_fields.append("kid")
        if updated_fields:
            cert.save(update_fields=updated_fields)


class Migration(migrations.Migration):
    dependencies = [
        ("authentik_crypto", "0005_alter_certificatekeypair_options"),
    ]

    operations = [
        migrations.AddField(
            model_name="certificatekeypair",
            name="cert_expiry",
            field=models.DateTimeField(blank=True, help_text="Certificate expiry date", null=True),
        ),
        migrations.AddField(
            model_name="certificatekeypair",
            name="cert_subject",
            field=models.TextField(
                blank=True, help_text="Certificate subject as RFC4514 string", null=True
            ),
        ),
        migrations.AddField(
            model_name="certificatekeypair",
            name="fingerprint_sha1",
            field=models.CharField(
                blank=True,
                help_text="SHA1 fingerprint of the certificate",
                max_length=59,
                null=True,
            ),
        ),
        migrations.AddField(
            model_name="certificatekeypair",
            name="fingerprint_sha256",
            field=models.CharField(
                blank=True,
                help_text="SHA256 fingerprint of the certificate",
                max_length=95,
                null=True,
            ),
        ),
        migrations.AddField(
            model_name="certificatekeypair",
            name="key_type",
            field=models.CharField(
                blank=True,
                choices=[
                    ("rsa", "RSA"),
                    ("ec", "Elliptic Curve"),
                    ("dsa", "DSA"),
                    ("ed25519", "Ed25519"),
                    ("ed448", "Ed448"),
                ],
                help_text="Key algorithm type detected from the certificate's public key",
                max_length=16,
                null=True,
            ),
        ),
        migrations.AddField(
            model_name="certificatekeypair",
            name="kid",
            field=models.CharField(
                blank=True, help_text="Key ID generated from private key", max_length=128, null=True
            ),
        ),
        migrations.RunPython(backfill_certificate_metadata, migrations.RunPython.noop),
    ]

View File

@@ -1,7 +1,8 @@
 """authentik crypto models"""

+from base64 import urlsafe_b64encode
 from binascii import hexlify
-from hashlib import md5
+from hashlib import md5, sha512
 from ssl import PEM_FOOTER, PEM_HEADER
 from textwrap import wrap
 from uuid import uuid4
@@ -47,6 +48,39 @@ def fingerprint_sha256(cert: Certificate) -> str:
     return hexlify(cert.fingerprint(hashes.SHA256()), ":").decode("utf-8")


+def detect_key_type(certificate: Certificate) -> str | None:
+    """Detect the key algorithm type by parsing the certificate's public key"""
+    try:
+        public_key = certificate.public_key()
+        if isinstance(public_key, RSAPublicKey):
+            return KeyType.RSA
+        if isinstance(public_key, EllipticCurvePublicKey):
+            return KeyType.EC
+        if isinstance(public_key, DSAPublicKey):
+            return KeyType.DSA
+        if isinstance(public_key, Ed25519PublicKey):
+            return KeyType.ED25519
+        if isinstance(public_key, Ed448PublicKey):
+            return KeyType.ED448
+    except (ValueError, TypeError, AttributeError) as exc:
+        LOGGER.warning("Failed to detect key type", exc=exc)
+    return None
+
+
+def generate_key_id(key_data: str) -> str:
+    """Generate Key ID using SHA512 + urlsafe_b64encode."""
+    if not key_data:
+        return ""
+    return urlsafe_b64encode(sha512(key_data.encode("utf-8")).digest()).decode("utf-8").rstrip("=")
+
+
+def generate_key_id_legacy(key_data: str) -> str:
+    """Generate Key ID using MD5 (legacy format for backwards compatibility)."""
+    if not key_data:
+        return ""
+    return md5(key_data.encode("utf-8")).hexdigest()  # nosec
+
+
 class CertificateKeyPair(SerializerModel, ManagedModel, CreatedUpdatedModel):
     """CertificateKeyPair that can be used for signing or encrypting if `key_data`
     is set, otherwise it can be used to verify remote data."""
@@ -62,6 +96,41 @@ class CertificateKeyPair(SerializerModel, ManagedModel, CreatedUpdatedModel):
         blank=True,
         default="",
     )
+    key_type = models.CharField(
+        max_length=16,
+        choices=KeyType.choices,
+        null=True,
+        blank=True,
+        help_text=_("Key algorithm type detected from the certificate's public key"),
+    )
+    cert_expiry = models.DateTimeField(
+        null=True,
+        blank=True,
+        help_text=_("Certificate expiry date"),
+    )
+    cert_subject = models.TextField(
+        null=True,
+        blank=True,
+        help_text=_("Certificate subject as RFC4514 string"),
+    )
+    fingerprint_sha256 = models.CharField(
+        max_length=95,
+        null=True,
+        blank=True,
+        help_text=_("SHA256 fingerprint of the certificate"),
+    )
+    fingerprint_sha1 = models.CharField(
+        max_length=59,
+        null=True,
+        blank=True,
+        help_text=_("SHA1 fingerprint of the certificate"),
+    )
+    kid = models.CharField(
+        max_length=128,
+        null=True,
+        blank=True,
+        help_text=_("Key ID generated from private key"),
+    )

     _cert: Certificate | None = None
     _private_key: PrivateKeyTypes | None = None
@@ -106,41 +175,6 @@ class CertificateKeyPair(SerializerModel, ManagedModel, CreatedUpdatedModel):
return None
return self._private_key
@property
def fingerprint_sha256(self) -> str:
"""Get SHA256 Fingerprint of certificate_data"""
return fingerprint_sha256(self.certificate)
@property
def fingerprint_sha1(self) -> str:
"""Get SHA1 Fingerprint of certificate_data"""
return hexlify(self.certificate.fingerprint(hashes.SHA1()), ":").decode("utf-8") # nosec
@property
def kid(self):
"""Get Key ID used for JWKS"""
return (
md5(self.key_data.encode("utf-8"), usedforsecurity=False).hexdigest()
if self.key_data
else ""
) # nosec
@property
def key_type(self) -> str | None:
"""Get the key algorithm type from the certificate's public key"""
public_key = self.certificate.public_key()
if isinstance(public_key, RSAPublicKey):
return KeyType.RSA
if isinstance(public_key, EllipticCurvePublicKey):
return KeyType.EC
if isinstance(public_key, DSAPublicKey):
return KeyType.DSA
if isinstance(public_key, Ed25519PublicKey):
return KeyType.ED25519
if isinstance(public_key, Ed448PublicKey):
return KeyType.ED448
return None
def __str__(self) -> str:
return f"Certificate-Key Pair {self.name}"
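The removed `fingerprint_sha1`/`fingerprint_sha256` properties are replaced by the stored fields above, and the chosen `max_length` values (95 and 59) are exactly the length of a colon-separated hex digest. A stdlib-only sketch of that format, assuming raw certificate DER bytes as input:

```python
from binascii import hexlify
from hashlib import sha1, sha256

def fingerprint(der_bytes: bytes, algorithm) -> str:
    # Colon-separated lowercase hex, matching the format the model stores
    # (the real code hashes the certificate via cryptography's fingerprint()).
    return hexlify(algorithm(der_bytes).digest(), ":").decode("utf-8")

print(fingerprint(b"example certificate DER", sha256)[:8])
```

A SHA256 digest is 32 bytes, so 64 hex characters plus 31 separators gives 95; SHA1 gives 40 + 19 = 59, matching the field definitions.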


@@ -0,0 +1,70 @@
"""authentik crypto signals"""
from binascii import hexlify
from datetime import datetime
from ssl import CertificateError
from cryptography.hazmat.primitives import hashes
from cryptography.x509 import Certificate
from django.db.models.signals import pre_save
from django.dispatch import receiver
from structlog.stdlib import get_logger
from authentik.crypto.models import (
CertificateKeyPair,
detect_key_type,
fingerprint_sha256,
generate_key_id,
generate_key_id_legacy,
)
LOGGER = get_logger()
def extract_certificate_metadata(certificate: Certificate) -> dict[str, str | datetime]:
"""Extract all metadata fields from a certificate."""
metadata = {}
try:
metadata["key_type"] = detect_key_type(certificate)
metadata["cert_expiry"] = certificate.not_valid_after_utc
metadata["cert_subject"] = certificate.subject.rfc4514_string()
metadata["fingerprint_sha256"] = fingerprint_sha256(certificate)
metadata["fingerprint_sha1"] = hexlify(
certificate.fingerprint(hashes.SHA1()), ":" # nosec
).decode("utf-8")
except (ValueError, TypeError, AttributeError) as exc:
raise CertificateError(f"Invalid certificate metadata: {exc}") from exc
return metadata
@receiver(pre_save, sender="authentik_crypto.CertificateKeyPair")
def certificate_key_pair_pre_save(
sender: type[CertificateKeyPair], instance: CertificateKeyPair, **_
):
"""Automatically populate certificate metadata fields before saving"""
# Only extract metadata if certificate_data is present
if not instance.certificate_data:
return
try:
metadata = extract_certificate_metadata(instance.certificate)
except (CertificateError, ValueError, TypeError, AttributeError) as exc:
LOGGER.warning("Failed to extract certificate metadata", exc=exc)
return
instance.key_type = metadata["key_type"]
instance.cert_expiry = metadata["cert_expiry"]
instance.cert_subject = metadata["cert_subject"]
instance.fingerprint_sha256 = metadata["fingerprint_sha256"]
instance.fingerprint_sha1 = metadata["fingerprint_sha1"]
# Generate kid if not set, or regenerate if key_data has changed
# Preserve existing kid (MD5 or SHA512) if it matches the current key_data
if instance.key_data:
new_kid = generate_key_id(instance.key_data)
legacy_kid = generate_key_id_legacy(instance.key_data)
if instance.kid not in (new_kid, legacy_kid):
instance.kid = new_kid


@@ -20,7 +20,7 @@ from authentik.core.tests.utils import (
)
from authentik.crypto.api import CertificateKeyPairSerializer
from authentik.crypto.builder import CertificateBuilder
from authentik.crypto.models import CertificateKeyPair
from authentik.crypto.models import CertificateKeyPair, generate_key_id, generate_key_id_legacy
from authentik.crypto.tasks import MANAGED_DISCOVERED, certificate_discovery
from authentik.lib.config import CONFIG
from authentik.lib.generators import generate_id, generate_key
@@ -173,21 +173,24 @@ class TestCrypto(APITestCase):
self.assertEqual(api_cert["fingerprint_sha1"], cert.fingerprint_sha1)
self.assertEqual(api_cert["fingerprint_sha256"], cert.fingerprint_sha256)
def test_list_without_details(self):
"""Test API List (no details)"""
def test_list_always_includes_details(self):
"""Test API List always includes certificate details"""
cert = create_test_cert()
self.client.force_login(create_test_admin_user())
response = self.client.get(
reverse(
"authentik_api:certificatekeypair-list",
),
data={"name": cert.name, "include_details": False},
data={"name": cert.name},
)
self.assertEqual(response.status_code, 200)
body = loads(response.content.decode())
api_cert = [x for x in body["results"] if x["name"] == cert.name][0]
self.assertEqual(api_cert["fingerprint_sha1"], None)
self.assertEqual(api_cert["fingerprint_sha256"], None)
# All details should now always be included
self.assertEqual(api_cert["fingerprint_sha1"], cert.fingerprint_sha1)
self.assertEqual(api_cert["fingerprint_sha256"], cert.fingerprint_sha256)
self.assertIsNotNone(api_cert["cert_expiry"])
self.assertIsNotNone(api_cert["cert_subject"])
def test_certificate_download(self):
"""Test certificate export (download)"""
@@ -426,3 +429,114 @@ class TestCrypto(APITestCase):
self.assertEqual(
1, final_count, "Should not create duplicate cert for same private key"
)
def test_metadata_extraction_with_cert_and_key(self):
"""Test that metadata is extracted when creating keypair with certificate and key"""
cert = create_test_cert()
# Verify all metadata fields are populated
self.assertIsNotNone(cert.key_type)
self.assertIsNotNone(cert.cert_expiry)
self.assertIsNotNone(cert.cert_subject)
self.assertIsNotNone(cert.fingerprint_sha256)
self.assertIsNotNone(cert.fingerprint_sha1)
# Verify kid is generated using SHA512 for new records
self.assertIsNotNone(cert.kid)
self.assertEqual(cert.kid, generate_key_id(cert.key_data))
def test_metadata_extraction_without_key(self):
"""Test that metadata is extracted when creating keypair without private key"""
builder = CertificateBuilder(generate_id())
builder.build(subject_alt_names=[], validity_days=3)
# Create keypair with only certificate, no key
cert = CertificateKeyPair.objects.create(
name=generate_id(),
certificate_data=builder.certificate,
key_data="",
)
# Verify certificate metadata fields are populated
self.assertIsNotNone(cert.key_type)
self.assertIsNotNone(cert.cert_expiry)
self.assertIsNotNone(cert.cert_subject)
self.assertIsNotNone(cert.fingerprint_sha256)
self.assertIsNotNone(cert.fingerprint_sha1)
# Verify kid is None when no key_data
self.assertIsNone(cert.kid)
def test_metadata_extraction_invalid_cert(self):
"""Test that invalid certificate data doesn't crash, just skips metadata"""
cert = CertificateKeyPair.objects.create(
name=generate_id(),
certificate_data="invalid certificate data",
key_data="",
)
# Verify metadata fields are None for invalid cert
self.assertIsNone(cert.key_type)
self.assertIsNone(cert.cert_expiry)
self.assertIsNone(cert.cert_subject)
self.assertIsNone(cert.fingerprint_sha256)
self.assertIsNone(cert.fingerprint_sha1)
self.assertIsNone(cert.kid)
def test_kid_legacy_preservation(self):
"""Test that legacy MD5 kid is preserved when key_data hasn't changed"""
cert = create_test_cert()
# Simulate a legacy MD5 kid (as if backfilled from old system)
legacy_kid = generate_key_id_legacy(cert.key_data)
CertificateKeyPair.objects.filter(pk=cert.pk).update(kid=legacy_kid)
cert.refresh_from_db()
self.assertEqual(cert.kid, legacy_kid)
# Save the cert again (e.g., name change) - kid should be preserved
cert.name = generate_id()
cert.save()
cert.refresh_from_db()
self.assertEqual(cert.kid, legacy_kid)
def test_kid_regenerated_on_key_change(self):
"""Test that kid is regenerated when key_data changes"""
cert = create_test_cert()
original_kid = cert.kid
# Generate a new key and update the keypair
builder = CertificateBuilder(generate_id())
builder.build(subject_alt_names=[], validity_days=3)
cert.key_data = builder.private_key
cert.certificate_data = builder.certificate
cert.save()
cert.refresh_from_db()
# Kid should be regenerated for the new key
self.assertNotEqual(cert.kid, original_kid)
self.assertEqual(cert.kid, generate_key_id(cert.key_data))
def test_kid_regenerated_on_key_change_from_legacy(self):
"""Test that kid is regenerated from legacy MD5 when key_data changes"""
cert = create_test_cert()
# Simulate a legacy MD5 kid
legacy_kid = generate_key_id_legacy(cert.key_data)
CertificateKeyPair.objects.filter(pk=cert.pk).update(kid=legacy_kid)
cert.refresh_from_db()
self.assertEqual(cert.kid, legacy_kid)
# Generate a new key and update the keypair
builder = CertificateBuilder(generate_id())
builder.build(subject_alt_names=[], validity_days=3)
cert.key_data = builder.private_key
cert.certificate_data = builder.certificate
cert.save()
cert.refresh_from_db()
# Kid should now be SHA512 for the new key
self.assertNotEqual(cert.kid, legacy_kid)
self.assertEqual(cert.kid, generate_key_id(cert.key_data))


@@ -1,128 +0,0 @@
"""DjangoQL search"""
from collections import OrderedDict, defaultdict
from collections.abc import Generator
from django.db import connection
from django.db.models import Model, Q
from djangoql.compat import text_type
from djangoql.schema import StrField
class JSONSearchField(StrField):
"""JSON field for DjangoQL"""
model: Model
def __init__(self, model=None, name=None, nullable=None, suggest_nested=True):
# Set this in the constructor to not clobber the type variable
self.type = "relation"
self.suggest_nested = suggest_nested
super().__init__(model, name, nullable)
def get_lookup(self, path, operator, value):
search = "__".join(path)
op, invert = self.get_operator(operator)
q = Q(**{f"{search}{op}": self.get_lookup_value(value)})
return ~q if invert else q
def json_field_keys(self) -> Generator[tuple[str]]:
with connection.cursor() as cursor:
cursor.execute(
f"""
WITH RECURSIVE "{self.name}_keys" AS (
SELECT
ARRAY[jsonb_object_keys("{self.name}")] AS key_path_array,
"{self.name}" -> jsonb_object_keys("{self.name}") AS value
FROM {self.model._meta.db_table}
WHERE "{self.name}" IS NOT NULL
AND jsonb_typeof("{self.name}") = 'object'
UNION ALL
SELECT
ck.key_path_array || jsonb_object_keys(ck.value),
ck.value -> jsonb_object_keys(ck.value) AS value
FROM "{self.name}_keys" ck
WHERE jsonb_typeof(ck.value) = 'object'
),
unique_paths AS (
SELECT DISTINCT key_path_array
FROM "{self.name}_keys"
)
SELECT key_path_array FROM unique_paths;
""" # nosec
)
return (x[0] for x in cursor.fetchall())
def get_nested_options(self) -> OrderedDict:
"""Get keys of all nested objects to show autocomplete"""
if not self.suggest_nested:
return OrderedDict()
base_model_name = f"{self.model._meta.app_label}.{self.model._meta.model_name}_{self.name}"
def recursive_function(parts: list[str], parent_parts: list[str] | None = None):
if not parent_parts:
parent_parts = []
path = parts.pop(0)
parent_parts.append(path)
relation_key = "_".join(parent_parts)
if len(parts) > 1:
out_dict = {
relation_key: {
parts[0]: {
"type": "relation",
"relation": f"{relation_key}_{parts[0]}",
}
}
}
child_paths = recursive_function(parts.copy(), parent_parts.copy())
child_paths.update(out_dict)
return child_paths
else:
return {relation_key: {parts[0]: {}}}
relation_structure = defaultdict(dict)
for relations in self.json_field_keys():
result = recursive_function([base_model_name] + relations)
for relation_key, value in result.items():
for sub_relation_key, sub_value in value.items():
if not relation_structure[relation_key].get(sub_relation_key, None):
relation_structure[relation_key][sub_relation_key] = sub_value
else:
relation_structure[relation_key][sub_relation_key].update(sub_value)
final_dict = defaultdict(dict)
for key, value in relation_structure.items():
for sub_key, sub_value in value.items():
if not sub_value:
final_dict[key][sub_key] = {
"type": "str",
"nullable": True,
}
else:
final_dict[key][sub_key] = sub_value
return OrderedDict(final_dict)
def relation(self) -> str:
return f"{self.model._meta.app_label}.{self.model._meta.model_name}_{self.name}"
class ChoiceSearchField(StrField):
def __init__(self, model=None, name=None, nullable=None):
super().__init__(model, name, nullable, suggest_options=True)
def get_options(self, search):
result = []
choices = self._field_choices()
if choices:
search = search.lower()
for c in choices:
choice = text_type(c[0])
if search in choice.lower():
result.append(choice)
return result
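The recursive CTE in the removed `json_field_keys` enumerates every key path in a JSONB column, including intermediate object keys. A pure-Python analog over a nested dict shows the same traversal (a sketch for illustration, not the SQL the code actually ran):

```python
def json_key_paths(obj: dict, prefix: tuple = ()):
    """Yield every key path in a nested JSON object, mirroring the
    recursive CTE: each key at every depth, as a list of path parts."""
    for key, value in obj.items():
        path = prefix + (key,)
        yield list(path)
        if isinstance(value, dict):
            yield from json_key_paths(value, path)

attrs = {"ldap": {"dn": "cn=user", "groups": {"primary": "users"}}}
paths = list(json_key_paths(attrs))
# paths contains ["ldap"], ["ldap", "dn"], ["ldap", "groups"],
# and ["ldap", "groups", "primary"]
```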


@@ -1,18 +1,15 @@
"""DjangoQL search"""
"""QL search"""
from akql.exceptions import AKQLError
from akql.queryset import apply_search
from akql.schema import AKQLSchema
from django.apps import apps
from django.db.models import QuerySet
from djangoql.ast import Name
from djangoql.exceptions import DjangoQLError
from djangoql.queryset import apply_search
from djangoql.schema import DjangoQLSchema
from drf_spectacular.plumbing import ResolvedComponent, build_object_type
from rest_framework.filters import SearchFilter
from rest_framework.request import Request
from structlog.stdlib import get_logger
from authentik.enterprise.search.fields import JSONSearchField
LOGGER = get_logger()
AUTOCOMPLETE_SCHEMA = ResolvedComponent(
name="Autocomplete",
@@ -22,27 +19,8 @@ AUTOCOMPLETE_SCHEMA = ResolvedComponent(
)
class BaseSchema(DjangoQLSchema):
"""Base Schema which deals with JSON Fields"""
def resolve_name(self, name: Name):
model = self.model_label(self.current_model)
root_field = name.parts[0]
field = self.models[model].get(root_field)
# If the query goes into a JSON field, return the root
# field as the JSON field will do the rest
if isinstance(field, JSONSearchField):
# This is a workaround; build_filter will remove the right-most
# entry in the path as that is intended to be the same as the field
# however for JSON that is not the case
if name.parts[-1] != root_field:
name.parts.append(root_field)
return field
return super().resolve_name(name)
class QLSearch(SearchFilter):
"""rest_framework search filter which uses DjangoQL"""
"""rest_framework search filter which uses AKQL"""
def __init__(self):
super().__init__()
@@ -59,24 +37,30 @@ class QLSearch(SearchFilter):
params = params.replace("\x00", "") # strip null characters
return params
def get_schema(self, request: Request, view) -> BaseSchema:
def get_schema(self, request: Request, view) -> AKQLSchema:
ql_fields = []
if hasattr(view, "get_ql_fields"):
ql_fields = view.get_ql_fields()
class InlineSchema(BaseSchema):
class InlineSchema(AKQLSchema):
def get_fields(self, model):
return ql_fields or []
return InlineSchema
def get_search_context(self, request: Request):
return {
"$ak_user": request.user.pk,
}
def filter_queryset(self, request: Request, queryset: QuerySet, view) -> QuerySet:
search_query = self.get_search_terms(request)
schema = self.get_schema(request, view)
if len(search_query) == 0 or not self.enabled:
return self._fallback.filter_queryset(request, queryset, view)
context = self.get_search_context(request)
try:
return apply_search(queryset, search_query, schema=schema)
except DjangoQLError as exc:
return apply_search(queryset, search_query, context=context, schema=schema)
except AKQLError as exc:
LOGGER.debug("Failed to parse search expression", exc=exc)
return self._fallback.filter_queryset(request, queryset, view)
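The new `get_search_context` hook feeds a dict into the parser, so a query can reference `$ak_user` and have it resolved at evaluation time. A minimal stand-in for `akql.ast.Variable` (the real class holds a parser reference rather than the context directly):

```python
class Variable:
    """Minimal sketch of akql.ast.Variable: a $name token whose value
    is looked up in the search context when the query is evaluated."""
    def __init__(self, name: str, context: dict):
        self.name = name
        self.context = context

    @property
    def value(self):
        return self.context.get(self.name)

context = {"$ak_user": 42}
assert Variable("$ak_user", context).value == 42
```

Note that the lexer's `t_VARIABLE` rule keeps the leading `$`, which is why the context key in `get_search_context` is spelled `"$ak_user"`.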


@@ -1,11 +1,11 @@
from djangoql.serializers import DjangoQLSchemaSerializer
from akql.schema import JSONSearchField
from akql.serializers import AKQLSchemaSerializer
from drf_spectacular.generators import SchemaGenerator
from authentik.enterprise.search.fields import JSONSearchField
from authentik.enterprise.search.ql import AUTOCOMPLETE_SCHEMA
class AKQLSchemaSerializer(DjangoQLSchemaSerializer):
class AKQLSchemaSerializer(AKQLSchemaSerializer):
def serialize(self, schema):
serialization = super().serialize(schema)
for _, fields in schema.models.items():
@@ -15,12 +15,6 @@ class AKQLSchemaSerializer(DjangoQLSchemaSerializer):
serialization["models"].update(field.get_nested_options())
return serialization
def serialize_field(self, field):
result = super().serialize_field(field)
if isinstance(field, JSONSearchField):
result["relation"] = field.relation()
return result
def postprocess_schema_search_autocomplete(result, generator: SchemaGenerator, **kwargs):
generator.registry.register_on_missing(AUTOCOMPLETE_SCHEMA)


@@ -136,9 +136,7 @@ class EventViewSet(
filterset_class = EventsFilter
def get_ql_fields(self):
from djangoql.schema import DateTimeField, StrField
from authentik.enterprise.search.fields import ChoiceSearchField, JSONSearchField
from akql.schema import ChoiceSearchField, DateTimeField, JSONSearchField, StrField
return [
ChoiceSearchField(Event, "action"),

go.mod

@@ -32,7 +32,7 @@ require (
github.com/spf13/cobra v1.10.2
github.com/stretchr/testify v1.11.1
github.com/wwt/guac v1.3.2
goauthentik.io/api/v3 v3.2025120.26
goauthentik.io/api/v3 v3.2026020.1
golang.org/x/exp v0.0.0-20230210204819-062eb4c674ab
golang.org/x/oauth2 v0.34.0
golang.org/x/sync v0.19.0

go.sum

@@ -214,8 +214,8 @@ go.yaml.in/yaml/v2 v2.4.2 h1:DzmwEr2rDGHl7lsFgAHxmNz/1NlQ7xLIrlN2h5d1eGI=
go.yaml.in/yaml/v2 v2.4.2/go.mod h1:081UH+NErpNdqlCXm3TtEran0rJZGxAYx9hb/ELlsPU=
go.yaml.in/yaml/v3 v3.0.4 h1:tfq32ie2Jv2UxXFdLJdh3jXuOzWiL1fo0bu/FbuKpbc=
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
goauthentik.io/api/v3 v3.2025120.26 h1:2lTMtjCWtdOeQe7kwjpGUx39qUEpcxcxTirIqMvn0Os=
goauthentik.io/api/v3 v3.2025120.26/go.mod h1:82lqAz4jxzl6Cg0YDbhNtvvTG2rm6605ZhdJFnbbsl8=
goauthentik.io/api/v3 v3.2026020.1 h1:R7WdvVmfm066d3Zu7R+WfjDGdFqC/X2gONHIGPfcLzk=
goauthentik.io/api/v3 v3.2026020.1/go.mod h1:82lqAz4jxzl6Cg0YDbhNtvvTG2rm6605ZhdJFnbbsl8=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.13.0/go.mod h1:y6Z2r+Rw4iayiXXAIxJIDAJ1zMW4yaTpebo8fPOliYc=


@@ -31,7 +31,7 @@ RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \
go build -o /go/ldap ./cmd/ldap
# Stage 2: Run
FROM ghcr.io/goauthentik/fips-debian:trixie-slim-fips@sha256:07f41ce3f15b2bb5eb5bcd4e6efc0cb42bb7e5609e7244f636da1a91166817ca
FROM ghcr.io/goauthentik/fips-debian:trixie-slim-fips@sha256:2f19fc114923ec0842329bf638cb155e597c4be9c8119a3db038ffc3fede9228
ARG VERSION
ARG GIT_BUILD_HASH


@@ -38,14 +38,14 @@ function check_if_root {
chown -R authentik:authentik /media /certs "${PROMETHEUS_MULTIPROC_DIR}"
chmod ug+rwx /media
chmod ug+rx /certs
exec chpst -u authentik:$GROUP env HOME=/authentik PATH="$PATH" $1
exec chpst -u authentik:$GROUP env HOME=/authentik $1
}
function run_authentik {
if [[ -x "$(command -v authentik)" ]]; then
exec authentik $@
else
exec env GOFIPS140=latest CGO_ENABLED=1 go run -v ./cmd/server/ $@
exec go run -v ./cmd/server/ $@
fi
}

packages/akql/LICENSE

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2017 ivelum
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

packages/akql/README.md

@@ -0,0 +1,3 @@
This is a fork of djangoql.
https://github.com/ivelum/djangoql


@@ -0,0 +1 @@
__version__ = "0.18.1"

packages/akql/akql/ast.py

@@ -0,0 +1,91 @@
from typing import TYPE_CHECKING, Any
if TYPE_CHECKING:
from akql.parser import AKQLParser
class Node:
def __str__(self):
children = []
for k, v in self.__dict__.items():
vv = v
if isinstance(v, list | tuple):
vv = "[{}]".format(", ".join([str(v) for v in v if v]))
children.append(f"{k}={vv}")
return "<{}{}{}>".format(
self.__class__.__name__,
": " if children else "",
", ".join(children),
)
__repr__ = __str__
def __eq__(self, other):
if not isinstance(other, self.__class__):
return False
for k, v in self.__dict__.items():
if getattr(other, k) != v:
return False
return True
def __ne__(self, other):
return not self.__eq__(other)
class Expression(Node):
def __init__(self, left, operator, right):
self.left = left
self.operator = operator
self.right = right
class Name(Node):
def __init__(self, parts):
if isinstance(parts, list):
self.parts = parts
elif isinstance(parts, tuple):
self.parts = list(parts)
else:
self.parts = [parts]
@property
def value(self):
return ".".join(self.parts)
class Const(Node):
def __init__(self, value):
self.value = value
class List(Node):
def __init__(self, items):
self.items = items
@property
def value(self):
return [i.value for i in self.items]
class Operator(Node):
def __init__(self, operator):
self.operator = operator
class Logical(Operator):
pass
class Comparison(Operator):
pass
class Variable(Node):
def __init__(self, name: str, parser: "AKQLParser"):
self.name = name
self.parser = parser
@property
def value(self) -> Any:
return self.parser.context.get(self.name)


@@ -0,0 +1,32 @@
class AKQLError(Exception):
def __init__(self, message=None, value=None, line=None, column=None):
self.value = value
self.line = line
self.column = column
super().__init__(message)
def __str__(self):
message = super().__str__()
if self.line:
position_info = f"Line {self.line}"
if self.column:
position_info += f", col {self.column}"
return f"{position_info}: {message}"
else:
return message
class AKQLSyntaxError(AKQLError):
pass
class AKQLLexerError(AKQLSyntaxError):
pass
class AKQLParserError(AKQLSyntaxError):
pass
class AKQLSchemaError(AKQLError):
pass
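`AKQLError.__str__` renders positions as `Line N, col M: message` when a location is known, and falls back to the bare message otherwise. A condensed copy of that logic, for illustration:

```python
class QLError(Exception):
    """Condensed copy of AKQLError's position formatting."""
    def __init__(self, message=None, value=None, line=None, column=None):
        self.value = value
        self.line = line
        self.column = column
        super().__init__(message)

    def __str__(self):
        message = super().__str__()
        if self.line:
            position = f"Line {self.line}"
            if self.column:
                position += f", col {self.column}"
            return f"{position}: {message}"
        return message

assert str(QLError("Unexpected token", line=2, column=5)) == "Line 2, col 5: Unexpected token"
assert str(QLError("Unexpected end of input")) == "Unexpected end of input"
```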

packages/akql/akql/lexer.py

@@ -0,0 +1,181 @@
from ply import lex
from ply.lex import TOKEN, Lexer, LexToken
from akql.exceptions import AKQLLexerError
class AKQLLexer:
_lexer: Lexer
def __init__(self, **kwargs):
self._lexer = lex.lex(module=self, **kwargs)
self.reset()
def reset(self):
self.text = ""
self._lexer.lineno = 1
return self
def input(self, s):
self.reset()
self.text = s
self._lexer.input(s)
return self
def token(self):
return self._lexer.token()
# Iterator interface
def __iter__(self):
return self
def next(self):
t = self.token()
if t is None:
raise StopIteration
return t
__next__ = next
def find_column(self, t: LexToken):
"""
Returns token position in current text, starting from 1
"""
cr = max(self.text.rfind(lt, 0, t.lexpos) for lt in self.line_terminators)
if cr == -1:
return t.lexpos + 1
return t.lexpos - cr
whitespace = " \t\v\f\u00a0"
line_terminators = "\n\r\u2028\u2029"
re_line_terminators = r"\n\r\u2028\u2029"
re_escaped_char = r"\\[\"\\/bfnrt]"
re_escaped_unicode = r"\\u[0-9A-Fa-f]{4}"
re_string_char = r"[^\"\\" + re_line_terminators + "]"
re_int_value = r"(-?0|-?[1-9][0-9]*)"
re_fraction_part = r"\.[0-9]+"
re_exponent_part = r"[eE][\+-]?[0-9]+"
tokens = [
"COMMA",
"OR",
"AND",
"NOT",
"IN",
"TRUE",
"FALSE",
"NONE",
"NAME",
"STRING_VALUE",
"FLOAT_VALUE",
"INT_VALUE",
"PAREN_L",
"PAREN_R",
"EQUALS",
"NOT_EQUALS",
"GREATER",
"GREATER_EQUAL",
"LESS",
"LESS_EQUAL",
"CONTAINS",
"NOT_CONTAINS",
"STARTSWITH",
"ENDSWITH",
"VARIABLE",
]
t_COMMA = ","
t_PAREN_L = r"\("
t_PAREN_R = r"\)"
t_EQUALS = "="
t_NOT_EQUALS = "!="
t_GREATER = ">"
t_GREATER_EQUAL = ">="
t_LESS = "<"
t_LESS_EQUAL = "<="
t_CONTAINS = "~"
t_NOT_CONTAINS = "!~"
t_NAME = r"[_A-Za-z][_0-9A-Za-z]*(\.[_A-Za-z][_0-9A-Za-z]*)*"
t_ignore = whitespace
@TOKEN(r"\$([_A-Za-z\.]+)")
def t_VARIABLE(self, t: LexToken):
return t
@TOKEN(r"\"(" + re_escaped_char + "|" + re_escaped_unicode + "|" + re_string_char + r")*\"")
def t_STRING_VALUE(self, t: LexToken):
t.value = t.value[1:-1] # cut leading and trailing quotes ""
return t
@TOKEN(
re_int_value
+ re_fraction_part
+ re_exponent_part
+ "|"
+ re_int_value
+ re_fraction_part
+ "|"
+ re_int_value
+ re_exponent_part
)
def t_FLOAT_VALUE(self, t: LexToken):
return t
@TOKEN(re_int_value)
def t_INT_VALUE(self, t: LexToken):
return t
not_followed_by_name = "(?![_0-9A-Za-z])"
@TOKEN("or" + not_followed_by_name)
def t_OR(self, t: LexToken):
return t
@TOKEN("and" + not_followed_by_name)
def t_AND(self, t: LexToken):
return t
@TOKEN("not" + not_followed_by_name)
def t_NOT(self, t: LexToken):
return t
@TOKEN("in" + not_followed_by_name)
def t_IN(self, t: LexToken):
return t
@TOKEN("startswith" + not_followed_by_name)
def t_STARTSWITH(self, t: LexToken):
return t
@TOKEN("endswith" + not_followed_by_name)
def t_ENDSWITH(self, t: LexToken):
return t
@TOKEN("True" + not_followed_by_name)
def t_TRUE(self, t: LexToken):
return t
@TOKEN("False" + not_followed_by_name)
def t_FALSE(self, t: LexToken):
return t
@TOKEN("None" + not_followed_by_name)
def t_NONE(self, t: LexToken):
return t
def t_error(self, t: LexToken):
raise AKQLLexerError(
message=f"Illegal character {repr(t.value[0])}",
value=t.value,
line=t.lineno,
column=self.find_column(t),
)
@TOKEN("[" + re_line_terminators + "]+")
def t_newline(self, t: LexToken):
t.lexer.lineno += len(t.value)
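`AKQLLexer.find_column` computes a 1-based column by measuring back to the last line terminator before the token's `lexpos`. The same logic as a standalone function (a sketch, with `lexpos` passed directly instead of a ply token):

```python
def find_column(text: str, lexpos: int, line_terminators: str = "\n\r\u2028\u2029") -> int:
    """1-based column of a token at lexpos: distance from the closest
    preceding line terminator, or lexpos + 1 on the first line."""
    cr = max(text.rfind(lt, 0, lexpos) for lt in line_terminators)
    if cr == -1:
        return lexpos + 1
    return lexpos - cr

text = 'name = "x"\nage > 3'
assert find_column(text, 0) == 1    # "name" starts at column 1
assert find_column(text, 11) == 1   # "age" starts line 2, column 1
assert find_column(text, 15) == 5   # ">" on line 2, column 5
```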


@@ -0,0 +1,239 @@
import re
from decimal import Decimal
from typing import Any
from ply import yacc
from ply.yacc import LRParser, YaccProduction
from akql.ast import Comparison, Const, Expression, List, Logical, Name, Variable
from akql.exceptions import AKQLParserError
from akql.lexer import AKQLLexer
unescape_pattern = re.compile(
"(" + AKQLLexer.re_escaped_char + "|" + AKQLLexer.re_escaped_unicode + ")",
)
def unescape_repl(m: re.Match[str]) -> str:
contents = m.group(1)
if len(contents) == 2: # noqa
return contents[1]
else:
return contents.encode("utf8").decode("unicode_escape")
def unescape(value):
if isinstance(value, bytes):
value = value.decode("utf8")
return re.sub(unescape_pattern, unescape_repl, value)
class AKQLParser:
yacc: LRParser
context: dict[str, Any]
def __init__(self, debug=False, context: dict[str, Any] | None = None, **kwargs):
self.default_lexer = AKQLLexer()
self.tokens = self.default_lexer.tokens
kwargs["debug"] = debug
if "write_tables" not in kwargs:
kwargs["write_tables"] = False
self.context = context or {}
self.yacc = yacc.yacc(module=self, **kwargs)
def parse(
self, input=None, lexer: AKQLLexer | None = None, **kwargs
) -> Expression: # noqa: A002
lexer = lexer or self.default_lexer
return self.yacc.parse(input=input, lexer=lexer, **kwargs)
start = "expression"
def p_expression_parens(self, p: YaccProduction):
"""
expression : PAREN_L expression PAREN_R
"""
p[0] = p[2]
def p_expression_logical(self, p: YaccProduction):
"""
expression : expression logical expression
"""
p[0] = Expression(left=p[1], operator=p[2], right=p[3])
def p_expression_comparison(self, p: YaccProduction):
"""
expression : name comparison_number number
| name comparison_string string
| name comparison_equality boolean_value
| name comparison_equality none
| name comparison_in_list const_list_value
| name comparison_number variable
| name comparison_string variable
| name comparison_equality variable
| name comparison_in_list variable
"""
p[0] = Expression(left=p[1], operator=p[2], right=p[3])
def p_name(self, p: YaccProduction):
"""
name : NAME
"""
p[0] = Name(parts=p[1].split("."))
def p_logical(self, p: YaccProduction):
"""
logical : AND
| OR
"""
p[0] = Logical(operator=p[1])
def p_comparison_number(self, p: YaccProduction):
"""
comparison_number : comparison_equality
| comparison_greater_less
"""
p[0] = p[1]
def p_comparison_string(self, p: YaccProduction):
"""
comparison_string : comparison_equality
| comparison_greater_less
| comparison_string_specific
"""
p[0] = p[1]
def p_comparison_equality(self, p: YaccProduction):
"""
comparison_equality : EQUALS
| NOT_EQUALS
"""
p[0] = Comparison(operator=p[1])
def p_comparison_greater_less(self, p: YaccProduction):
"""
comparison_greater_less : GREATER
| GREATER_EQUAL
| LESS
| LESS_EQUAL
"""
p[0] = Comparison(operator=p[1])
def p_comparison_string_specific(self, p: YaccProduction):
"""
comparison_string_specific : CONTAINS
| NOT_CONTAINS
| STARTSWITH
| NOT STARTSWITH
| ENDSWITH
| NOT ENDSWITH
"""
p[0] = Comparison(operator=" ".join(p[1:]))
def p_comparison_in_list(self, p: YaccProduction):
"""
comparison_in_list : IN
| NOT IN
"""
p[0] = Comparison(operator=" ".join(p[1:]))
def p_const_value(self, p: YaccProduction):
"""
const_value : number
| string
| none
| boolean_value
"""
p[0] = p[1]
def p_variable(self, p: YaccProduction):
"""
variable : VARIABLE
"""
p[0] = Variable(name=unescape(p[1]), parser=self)
def p_number_int(self, p: YaccProduction):
"""
number : INT_VALUE
"""
p[0] = Const(value=int(p[1]))
def p_number_float(self, p: YaccProduction):
"""
number : FLOAT_VALUE
"""
p[0] = Const(value=Decimal(p[1]))
def p_string(self, p: YaccProduction):
"""
string : STRING_VALUE
"""
p[0] = Const(value=unescape(p[1]))
def p_none(self, p: YaccProduction):
"""
none : NONE
"""
p[0] = Const(value=None)
def p_boolean_value(self, p: YaccProduction):
"""
boolean_value : true
| false
"""
p[0] = p[1]
def p_true(self, p: YaccProduction):
"""
true : TRUE
"""
p[0] = Const(value=True)
def p_false(self, p: YaccProduction):
"""
false : FALSE
"""
p[0] = Const(value=False)
def p_const_list_value(self, p: YaccProduction):
"""
const_list_value : PAREN_L const_value_list PAREN_R
"""
p[0] = List(items=p[2])
def p_const_value_list(self, p: YaccProduction):
"""
const_value_list : const_value_list COMMA const_value
"""
p[0] = p[1] + [p[3]]
def p_const_value_list_single(self, p: YaccProduction):
"""
const_value_list : const_value
"""
p[0] = [p[1]]
def p_error(self, token):
if token is None:
self.raise_syntax_error("Unexpected end of input")
else:
fragment = str(token.value)
self.raise_syntax_error(
f"Syntax error at {repr(fragment)}",
token=token,
)
def raise_syntax_error(self, message, token=None):
if token is None:
raise AKQLParserError(message)
lexer = token.lexer
if callable(getattr(lexer, "find_column", None)):
column = lexer.find_column(token)
else:
column = None
raise AKQLParserError(
message=message,
value=token.value,
line=token.lineno,
column=column,
)
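The parser's `unescape` helper resolves two-character escapes by dropping the backslash and decodes `\uXXXX` sequences via `unicode_escape`. A self-contained condensed copy, reusing the lexer's escape patterns:

```python
import re

# Escape patterns matching the lexer's definitions
re_escaped_char = r"\\[\"\\/bfnrt]"
re_escaped_unicode = r"\\u[0-9A-Fa-f]{4}"
unescape_pattern = re.compile("(" + re_escaped_char + "|" + re_escaped_unicode + ")")

def unescape(value: str) -> str:
    """Condensed copy of the parser's unescape helper."""
    def repl(m: re.Match) -> str:
        contents = m.group(1)
        if len(contents) == 2:
            # Two-character escape: keep the character after the backslash
            return contents[1]
        # \uXXXX escape: decode to the actual code point
        return contents.encode("utf8").decode("unicode_escape")
    return unescape_pattern.sub(repl, value)

assert unescape(r"a\"b") == 'a"b'
assert unescape(r"\u0041BC") == "ABC"
```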

File diff suppressed because it is too large

@@ -0,0 +1,47 @@
from typing import Any
from django.db.models import QuerySet
from akql.ast import Logical
from akql.parser import AKQLParser
from akql.schema import AKQLField, AKQLSchema
def build_filter(expr: str, schema_instance: AKQLSchema):
if isinstance(expr.operator, Logical):
left = build_filter(expr.left, schema_instance)
right = build_filter(expr.right, schema_instance)
if expr.operator.operator == "or":
return left | right
else:
return left & right
field = schema_instance.resolve_name(expr.left)
if not field:
# That must be a reference to a model without specifying a field.
# Let's construct an abstract lookup field for it
field = AKQLField(
name=expr.left.parts[-1],
nullable=True,
)
return field.get_lookup(
path=expr.left.parts[:-1],
operator=expr.operator.operator,
value=expr.right.value,
)
def apply_search(
queryset: QuerySet,
search: str,
context: dict[str, Any] | None = None,
schema: type[AKQLSchema] | None = None,
) -> QuerySet:
"""
Applies a search expression written in the DjangoQL mini-language to the given queryset
"""
ast = AKQLParser(context=context).parse(search)
schema = schema or AKQLSchema
schema_instance = schema(queryset.model)
schema_instance.validate(ast)
return queryset.filter(build_filter(ast, schema_instance))
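`build_filter` recurses over the AST, turning comparisons into leaves and combining `Logical` nodes with `|` and `&` on Django `Q` objects. The same recursion, sketched with plain predicate functions and tuple-shaped nodes instead of Django objects:

```python
# Simplified sketch of build_filter's recursion: comparisons become leaf
# predicates; "and"/"or" nodes combine them, just as the real code
# combines Q objects with & and |. Node shapes here are hypothetical.
def build_predicate(node):
    kind = node[0]
    if kind in ("and", "or"):
        left = build_predicate(node[1])
        right = build_predicate(node[2])
        if kind == "or":
            return lambda obj: left(obj) or right(obj)
        return lambda obj: left(obj) and right(obj)
    _, field, op, value = node  # ("cmp", field, operator, value)
    ops = {"=": lambda a, b: a == b, ">": lambda a, b: a > b}
    return lambda obj: ops[op](obj.get(field), value)

# name = "foo" and age > 3
pred = build_predicate(("and", ("cmp", "name", "=", "foo"), ("cmp", "age", ">", 3)))
assert pred({"name": "foo", "age": 5}) is True
assert pred({"name": "foo", "age": 2}) is False
```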


@@ -0,0 +1,618 @@
import inspect
import warnings
from collections import OrderedDict, defaultdict, deque
from collections.abc import Generator
from datetime import datetime
from decimal import Decimal
from django.conf import settings
from django.core.exceptions import FieldDoesNotExist
from django.db import connection, models
from django.db.models import ManyToManyRel, ManyToOneRel, Model, Q
from django.db.models.fields.related import ForeignObjectRel
from django.utils.timezone import get_current_timezone
from akql.ast import Comparison, Const, List, Logical, Name, Node, Variable
from akql.exceptions import AKQLSchemaError
class AKQLField:
"""
Abstract searchable field
"""
model = None
name = None
nullable = False
suggest_options = False
type = "unknown"
value_types = []
value_types_description = ""
def __init__(self, model=None, name=None, nullable=None, suggest_options=None):
if model is not None:
self.model = model
if name is not None:
self.name = name
if nullable is not None:
self.nullable = nullable
if suggest_options is not None:
self.suggest_options = suggest_options
def _field_choices(self):
if self.model:
try:
return self.model._meta.get_field(self.name).choices
except (AttributeError, FieldDoesNotExist):
pass
return []
@property
def async_options(self):
return not self._field_choices()
def get_options(self, search):
"""
Override this method to provide custom suggestion options
"""
result = []
choices = self._field_choices()
if choices:
search = search.lower()
for c in choices:
choice = str(c[1])
if search in choice.lower():
result.append(choice)
return result
def get_lookup_name(self):
"""
Override this method to provide custom lookup name
"""
return self.name
def get_lookup_value(self, value):
"""
Override this method to convert displayed values to lookup values
"""
choices = self._field_choices()
if choices:
if isinstance(value, list):
return [c[0] for c in choices if c[0] in value or c[1] in value]
else:
for c in choices:
if value in c:
return c[0]
return value
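The choices handling in `get_lookup_value` can be exercised in isolation; the choices list below is hypothetical, shaped like Django's `(db_value, display_label)` pairs:

```python
# Hypothetical choices as Django stores them: (db_value, display_label).
CHOICES = [(0, "Pending"), (1, "Active")]

def lookup_value(value, choices=CHOICES):
    # Map display labels (or raw db values) back to db values,
    # as get_lookup_value does for fields with choices.
    if isinstance(value, list):
        return [c[0] for c in choices if c[0] in value or c[1] in value]
    for c in choices:
        if value in c:
            return c[0]
    return value
```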
def get_operator(self, operator):
"""
Get a comparison suffix to be used in Django ORM & inversion flag for it
:param operator: string, DjangoQL comparison operator
:return: (suffix, invert) - a tuple with 2 values:
suffix - suffix to be used in ORM query, for example '__gt' for '>'
invert - boolean, True if this comparison needs to be inverted
"""
op = {
"=": "",
">": "__gt",
">=": "__gte",
"<": "__lt",
"<=": "__lte",
"~": "__icontains",
"in": "__in",
"startswith": "__istartswith",
"endswith": "__iendswith",
}.get(operator)
if op is not None:
return op, False
op = {
"!=": "",
"!~": "__icontains",
"not in": "__in",
"not startswith": "__istartswith",
"not endswith": "__iendswith",
}[operator]
return op, True
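A standalone sketch of the operator-to-ORM-suffix table that `get_operator` implements, with the negated operators reusing the positive suffix plus an invert flag:

```python
_POSITIVE = {
    "=": "", ">": "__gt", ">=": "__gte", "<": "__lt", "<=": "__lte",
    "~": "__icontains", "in": "__in",
    "startswith": "__istartswith", "endswith": "__iendswith",
}
_NEGATED = {
    "!=": "", "!~": "__icontains", "not in": "__in",
    "not startswith": "__istartswith", "not endswith": "__iendswith",
}

def get_operator(operator: str) -> tuple[str, bool]:
    # Return (ORM lookup suffix, invert flag); negated operators reuse the
    # positive suffix and flip the resulting Q object instead.
    if operator in _POSITIVE:
        return _POSITIVE[operator], False
    return _NEGATED[operator], True
```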
def get_lookup(self, path, operator, value):
"""
Performs a lookup for this field with given path, operator and value.
Override this if you'd like to implement a fully custom lookup. It
should support all comparison operators compatible with the field type.
:param path: a list of names preceding current lookup. For example,
if expression looks like 'author.groups.name = "Foo"' path would
be ['author', 'groups']. 'name' is not included, because it's the
current field instance itself.
:param operator: a string with comparison operator. It could be one of
the following: '=', '!=', '>', '>=', '<', '<=', '~', '!~', 'in',
'not in', 'startswith', 'not startswith', 'endswith', 'not endswith'.
Depending on the field type, some operators may be
excluded. '~' and '!~' can be applied to StrField only and aren't
allowed for any other fields. BoolField can't be used with less or
greater operators, '>', '>=', '<' and '<=' are excluded for it.
:param value: value passed for comparison
:return: Q-object
"""
search = "__".join(path + [self.get_lookup_name()])
op, invert = self.get_operator(operator)
q = models.Q(**{f"{search}{op}": self.get_lookup_value(value)})
return ~q if invert else q
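Given a path and operator suffix, the kwargs handed to `models.Q` reduce to a single joined lookup key; a minimal sketch:

```python
def q_kwargs(path: list[str], field_name: str, suffix: str, value) -> dict:
    # 'author.groups.name = "Foo"' -> {"author__groups__name": "Foo"}
    return {"__".join(path + [field_name]) + suffix: value}
```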
def validate(self, value):
if not self.nullable and value is None:
raise AKQLSchemaError(
f"Field {self.name} is not nullable, can't compare it to None",
)
if value is not None and type(value) not in self.value_types:
if self.nullable:
msg = (
'Field "{field}" has "nullable {field_type}" type. '
"It can be compared to {possible_values} or None, "
"but not to {value}"
)
else:
msg = (
'Field "{field}" has "{field_type}" type. It can '
"be compared to {possible_values}, "
"but not to {value}"
)
raise AKQLSchemaError(
msg.format(
field=self.name,
field_type=self.type,
possible_values=self.value_types_description,
value=repr(value),
)
)
class IntField(AKQLField):
type = "int"
value_types = [int]
value_types_description = "integer numbers"
def validate(self, value):
"""
Support enum-like choices defined on an integer field
"""
return super().validate(self.get_lookup_value(value))
class FloatField(AKQLField):
type = "float"
value_types = [int, float, Decimal]
value_types_description = "floating point numbers"
class StrField(AKQLField):
type = "str"
value_types = [str]
value_types_description = "strings"
def get_options(self, search):
choice_options = super().get_options(search)
if choice_options:
return choice_options
lookup = {}
if search:
lookup[f"{self.name}__icontains"] = search
return (
self.model.objects.filter(**lookup)
.order_by(self.name)
.values_list(self.name, flat=True)
.distinct()
)
class BoolField(AKQLField):
type = "bool"
value_types = [bool]
value_types_description = "True or False"
class DateField(AKQLField):
type = "date"
value_types = [str]
value_types_description = 'dates in "YYYY-MM-DD" format'
def validate(self, value):
super().validate(value)
try:
self.get_lookup_value(value)
except ValueError as exc:
raise AKQLSchemaError(
f'Field "{self.name}" can be compared to dates in '
f'"YYYY-MM-DD" format, but not to {repr(value)}',
) from exc
def get_lookup_value(self, value):
if not value:
return None
return datetime.strptime(value, "%Y-%m-%d").date()
class DateTimeField(AKQLField):
type = "datetime"
value_types = [str]
value_types_description = 'timestamps in "YYYY-MM-DD HH:MM" format'
def validate(self, value):
super().validate(value)
try:
self.get_lookup_value(value)
except ValueError as exc:
raise AKQLSchemaError(
f'Field "{self.name}" can be compared to timestamps in '
f'"YYYY-MM-DD HH:MM" format, but not to {repr(value)}',
) from exc
def get_lookup_value(self, value):
if not value:
return None
for fmt in [
"%Y-%m-%d",
"%Y-%m-%d %H:%M",
"%Y-%m-%d %H:%M:%S",
]:
try:
dt = datetime.strptime(value, fmt)
if settings.USE_TZ:
dt = dt.replace(tzinfo=get_current_timezone())
return dt
except ValueError:
pass
return None
def get_lookup(self, path, operator, value):
search = "__".join(path + [self.get_lookup_name()])
op, invert = self.get_operator(operator)
# Add LIKE operator support for datetime fields. For LIKE comparisons
# we don't want to convert source value to datetime instance, because
# it would effectively kill the idea. What we want is expressions like
# 'created ~ "2017-01-30"'
# to be translated to
# 'created LIKE %2017-01-30%',
# but it would work only if we pass a string as a parameter. If we pass
# a datetime instance, it would add time part in a form of 00:00:00,
# and resulting comparison would look like
# 'created LIKE %2017-01-30 00:00:00%'
# which is not what we want for this case.
val = value if operator in ("~", "!~") else self.get_lookup_value(value)
q = models.Q(**{f"{search}{op}": val})
return ~q if invert else q
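The comment above boils down to: keep the raw string for `~`/`!~` so the LIKE pattern stays a date prefix. A sketch of that decision, assuming only the `YYYY-MM-DD` format for brevity:

```python
from datetime import datetime

def like_value(operator: str, value: str):
    # For ~ / !~ keep the raw string: LIKE '%2017-01-30%' matches any time on
    # that day, while a datetime instance would force '... 00:00:00' into the pattern.
    if operator in ("~", "!~"):
        return value
    return datetime.strptime(value, "%Y-%m-%d")
```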
class RelationField(AKQLField):
type = "relation"
def __init__(self, model, name, related_model, nullable=False, suggest_options=False):
super().__init__(
model=model,
name=name,
nullable=nullable,
suggest_options=suggest_options,
)
self.related_model = related_model
@property
def relation(self):
return AKQLSchema.model_label(self.related_model)
class JSONSearchField(StrField):
"""JSON field for DjangoQL"""
model: Model
def __init__(self, model=None, name=None, nullable=None, suggest_nested=True):
# Set this in the constructor to not clobber the type variable
self.type = "relation"
self.suggest_nested = suggest_nested
super().__init__(model, name, nullable)
def get_lookup(self, path, operator, value):
search = "__".join(path)
op, invert = self.get_operator(operator)
q = Q(**{f"{search}{op}": self.get_lookup_value(value)})
return ~q if invert else q
def json_field_keys(self) -> Generator[list[str]]:
with connection.cursor() as cursor:
cursor.execute(
f"""
WITH RECURSIVE "{self.name}_keys" AS (
SELECT
ARRAY[jsonb_object_keys("{self.name}")] AS key_path_array,
"{self.name}" -> jsonb_object_keys("{self.name}") AS value
FROM {self.model._meta.db_table}
WHERE "{self.name}" IS NOT NULL
AND jsonb_typeof("{self.name}") = 'object'
UNION ALL
SELECT
ck.key_path_array || jsonb_object_keys(ck.value),
ck.value -> jsonb_object_keys(ck.value) AS value
FROM "{self.name}_keys" ck
WHERE jsonb_typeof(ck.value) = 'object'
),
unique_paths AS (
SELECT DISTINCT key_path_array
FROM "{self.name}_keys"
)
SELECT key_path_array FROM unique_paths;
""" # nosec
)
return (x[0] for x in cursor.fetchall())
def get_nested_options(self) -> OrderedDict:
"""Get keys of all nested objects to show autocomplete"""
if not self.suggest_nested:
return OrderedDict()
base_model_name = f"{self.model._meta.app_label}.{self.model._meta.model_name}_{self.name}"
def recursive_function(parts: list[str], parent_parts: list[str] | None = None):
if not parent_parts:
parent_parts = []
path = parts.pop(0)
parent_parts.append(path)
relation_key = "_".join(parent_parts)
if len(parts) > 1:
out_dict = {
relation_key: {
parts[0]: {
"type": "relation",
"relation": f"{relation_key}_{parts[0]}",
}
}
}
child_paths = recursive_function(parts.copy(), parent_parts.copy())
child_paths.update(out_dict)
return child_paths
else:
return {relation_key: {parts[0]: {}}}
relation_structure = defaultdict(dict)
for relations in self.json_field_keys():
result = recursive_function([base_model_name] + relations)
for relation_key, value in result.items():
for sub_relation_key, sub_value in value.items():
if not relation_structure[relation_key].get(sub_relation_key, None):
relation_structure[relation_key][sub_relation_key] = sub_value
else:
relation_structure[relation_key][sub_relation_key].update(sub_value)
final_dict = defaultdict(dict)
for key, value in relation_structure.items():
for sub_key, sub_value in value.items():
if not sub_value:
final_dict[key][sub_key] = {
"type": "str",
"nullable": True,
}
else:
final_dict[key][sub_key] = sub_value
return OrderedDict(final_dict)
def relation(self) -> str:
return f"{self.model._meta.app_label}.{self.model._meta.model_name}_{self.name}"
class ChoiceSearchField(StrField):
def __init__(self, model=None, name=None, nullable=None):
super().__init__(model, name, nullable, suggest_options=True)
def get_options(self, search):
result = []
choices = self._field_choices()
if choices:
search = search.lower()
for c in choices:
choice = str(c[0])
if search in choice.lower():
result.append(choice)
return result
class AKQLSchema:
include = () # models to include into introspection
exclude = () # models to exclude from introspection
suggest_options = None
def __init__(self, model):
if not inspect.isclass(model) or not issubclass(model, models.Model):
raise AKQLSchemaError(
"Schema must be initialized with a subclass of Django model",
)
if self.include and self.exclude:
raise AKQLSchemaError(
"Either include or exclude can be specified, but not both",
)
if self.excluded(model):
raise AKQLSchemaError(
f"{model} can't be used with {self.__class__} because it's excluded from it",
)
self.current_model = model
self._models = None
if self.suggest_options is None:
self.suggest_options = {}
def excluded(self, model):
return model in self.exclude or (self.include and model not in self.include)
@property
def models(self):
if not self._models:
self._models = self.introspect(
model=self.current_model,
exclude=tuple(self.model_label(m) for m in self.exclude),
)
return self._models
@classmethod
def model_label(cls, model):
return str(model._meta)
def introspect(self, model, exclude=()):
"""
Start with given model and recursively walk through its relationships.
Returns a dict with all model labels and their fields found.
"""
result = {}
open_set = deque([model])
closed_set = set(exclude)
while open_set:
model = open_set.popleft()
model_label = self.model_label(model)
if model_label in closed_set:
continue
model_fields = OrderedDict()
for field in self.get_fields(model):
field_instance = field
if not isinstance(field, AKQLField):
field_instance = self.get_field_instance(model, field)
if not field_instance:
continue
if isinstance(field_instance, RelationField):
open_set.append(field_instance.related_model)
model_fields[field_instance.name] = field_instance
result[model_label] = model_fields
closed_set.add(model_label)
return result
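`introspect` is a plain breadth-first walk with an open deque and a closed set; the same shape on a toy relation graph:

```python
from collections import deque

GRAPH = {"user": ["group", "session"], "group": ["user"]}

def reachable(graph: dict, start: str, exclude: tuple = ()) -> list[str]:
    # BFS as in introspect(): pop from the open set, skip anything already
    # closed (or excluded), enqueue related nodes.
    result, open_set, closed = [], deque([start]), set(exclude)
    while open_set:
        node = open_set.popleft()
        if node in closed:
            continue
        result.append(node)
        closed.add(node)
        open_set.extend(graph.get(node, ()))
    return result
```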
def get_fields(self, model):
"""
By default, returns all field names of a given model.
Override this method to limit field options. You can either return a
plain list of field names from it, like ['id', 'name'], or call
.super() and exclude unwanted fields from its result.
"""
return sorted(
[f.name for f in model._meta.get_fields() if f.name != "password"],
)
def get_field_instance(self, model, field_name):
field = model._meta.get_field(field_name)
field_kwargs = {"model": model, "name": field.name}
if field.is_relation:
if not field.related_model:
# GenericForeignKey
return
if self.excluded(field.related_model):
return
field_cls = RelationField
field_kwargs["related_model"] = field.related_model
else:
field_cls = self.get_field_cls(field)
if isinstance(field, ManyToOneRel | ManyToManyRel | ForeignObjectRel):
# Reverse relations are always treated as nullable
field_kwargs["nullable"] = True
else:
field_kwargs["nullable"] = field.null
field_kwargs["suggest_options"] = field.name in self.suggest_options.get(model, [])
return field_cls(**field_kwargs)
def get_field_cls(self, field):
str_fields = (
models.CharField,
models.TextField,
models.UUIDField,
models.BinaryField,
models.GenericIPAddressField,
)
if isinstance(field, str_fields):
return StrField
elif isinstance(field, models.AutoField | models.IntegerField):
return IntField
elif isinstance(field, models.BooleanField):  # NullBooleanField was removed in Django 4.0
return BoolField
elif isinstance(field, models.DecimalField | models.FloatField):
return FloatField
elif isinstance(field, models.DateTimeField):
return DateTimeField
elif isinstance(field, models.DateField):
return DateField
return AKQLField
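`get_field_cls` is first-match `isinstance` dispatch, which is why `DateTimeField` must be tested before its base class `DateField`. The toy classes below are hypothetical:

```python
class Date: ...
class DateTime(Date): ...  # subclass, like Django's DateTimeField(DateField)

RULES = [(DateTime, "datetime"), (Date, "date")]  # order matters: subclass first

def classify(field, rules=RULES, default="unknown") -> str:
    # First matching isinstance check wins, mirroring get_field_cls.
    for types, label in rules:
        if isinstance(field, types):
            return label
    return default
```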
def as_dict(self):
from akql.serializers import AKQLSchemaSerializer
warnings.warn(
"AKQLSchema.as_dict() is deprecated and will be removed in "
"future releases. Please use AKQLSchemaSerializer instead.",
stacklevel=2,
)
return AKQLSchemaSerializer().serialize(self)
def resolve_name(self, name):
assert isinstance(name, Name)
model = self.model_label(self.current_model)
root_field = name.parts[0]
field = self.models[model].get(root_field)
# If the query goes into a JSON field, return the root
# field as the JSON field will do the rest
if isinstance(field, JSONSearchField):
# This is a workaround; build_filter will remove the right-most
# entry in the path as that is intended to be the same as the field
# however for JSON that is not the case
if name.parts[-1] != root_field:
name.parts.append(root_field)
return field
for name_part in name.parts:
field = self.models[model].get(name_part)
if not field:
raise AKQLSchemaError(
"Unknown field: {}. Possible choices are: {}".format(
name_part,
", ".join(sorted(self.models[model].keys())),
),
)
if field.type == "relation":
model = field.relation
field = None
return field
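`resolve_name` walks the dotted name parts, hopping to the related model's field table on every relation and ending with `None` when the name stops at a relation itself. A minimal sketch over plain dicts (schema layout hypothetical; dicts stand in for `RelationField` entries):

```python
def resolve(models: dict, current_model: str, parts: list[str]):
    # dicts stand in for RelationField entries; plain values for leaf fields.
    model, field = current_model, None
    for part in parts:
        field = models[model].get(part)
        if field is None:
            raise KeyError(part)
        if isinstance(field, dict):  # relation: descend and keep walking
            model = field["relation"]
            field = None
    return field

SCHEMA = {
    "app.book": {"author": {"relation": "app.author"}},
    "app.author": {"name": "str"},
}
```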
def validate(self, node):
"""
Validate an AKQL AST tree against the current schema
"""
assert isinstance(node, Node)
if isinstance(node.operator, Logical):
self.validate(node.left)
self.validate(node.right)
return
assert isinstance(node.left, Name)
assert isinstance(node.operator, Comparison)
assert isinstance(node.right, Const | List | Variable)
# Check that field and value types are compatible
field = self.resolve_name(node.left)
value = node.right.value
if field is None:
if value is not None:
raise AKQLSchemaError(
f"Related model {'.'.join(node.left.parts)} can be compared to None only, but not to "
f"{type(value).__name__}",
)
else:
values = value if isinstance(node.right, List) else [value]
for v in values:
field.validate(v)

View File

@@ -0,0 +1,31 @@
from collections import OrderedDict
from akql.schema import JSONSearchField, RelationField
class AKQLSchemaSerializer:
def serialize(self, schema):
models = {}
for model_label, fields in schema.models.items():
models[model_label] = OrderedDict(
[(name, self.serialize_field(f)) for name, f in fields.items()],
)
return {
"current_model": schema.model_label(schema.current_model),
"models": models,
}
def serialize_field(self, field):
result = {
"type": field.type,
"nullable": field.nullable,
"options": self.serialize_field_options(field),
}
if isinstance(field, RelationField):
result["relation"] = field.relation
if isinstance(field, JSONSearchField):
result["relation"] = field.relation()
return result
def serialize_field_options(self, field):
return list(field.get_options("")) if field.suggest_options else None

View File

@@ -0,0 +1,16 @@
from django.test import TestCase
from akql.queryset import apply_search
from authentik.core.tests.utils import create_test_user
from authentik.events.models import Notification
class TestFilter(TestCase):
def test_filter(self):
user = create_test_user()
notif = Notification.objects.create(user=user)
qs = apply_search(
Notification.objects.all(), "user.id = $current_user", {"$current_user": user.pk}
)
self.assertEqual(qs.first(), notif)

View File

@@ -0,0 +1,18 @@
from django.test import TestCase
from akql.lexer import AKQLLexer
class TestLexer(TestCase):
def test_lexer_simple(self):
lexer = AKQLLexer().input('foo = "bar"')
tokens = list(str(t) for t in lexer)
self.assertEqual(
tokens,
[
"LexToken(NAME,'foo',1,0)",
"LexToken(EQUALS,'=',1,4)",
"LexToken(STRING_VALUE,'bar',1,6)",
],
)

View File

@@ -0,0 +1,41 @@
from django.test import TestCase
from akql.ast import Comparison, Const, Expression, Name, Variable
from akql.parser import AKQLParser
class TestParser(TestCase):
def test_parser_simple(self):
ast = AKQLParser().parse('foo = "bar"')
self.assertEqual(
ast,
Expression(
left=Name(parts=["foo"]),
operator=Comparison(operator="="),
right=Const(value="bar"),
),
)
def test_parser_not_startswith(self):
ast = AKQLParser().parse('foo not startswith "bar"')
self.assertEqual(
ast,
Expression(
left=Name(parts=["foo"]),
operator=Comparison(operator="not startswith"),
right=Const(value="bar"),
),
)
def test_parser_variable(self):
parser = AKQLParser()
ast = parser.parse("foo = $bar")
self.assertEqual(
ast,
Expression(
left=Name(parts=["foo"]),
operator=Comparison(operator="="),
right=Variable(name="$bar", parser=parser),
),
)

View File

@@ -0,0 +1,51 @@
[project]
name = "akql"
version = "3.2.0"
description = "Search query language for Django querysets (a DjangoQL fork)"
requires-python = ">=3.9,<3.14"
readme = "README.md"
license = { text = "MIT" }
authors = [
{ name = "Authentik Security Inc.", email = "hello@goauthentik.io" },
{ name = "Denis Stebunov", email = "support@ivelum.com" },
]
keywords = ["django", "query", "search", "queryset", "djangoql"]
classifiers = [
'Development Status :: 4 - Beta',
'Intended Audience :: Developers',
'Natural Language :: English',
'License :: OSI Approved :: MIT License',
'Programming Language :: Python',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Programming Language :: Python :: 3.13',
]
dependencies = [
"ply>=3.8",
]
[project.urls]
Homepage = "https://github.com/goauthentik/authentik/tree/main/packages/akql"
Documentation = "https://github.com/goauthentik/authentik/tree/main/packages/akql"
Repository = "https://github.com/goauthentik/authentik/tree/main/packages/akql"
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[tool.hatch.build.targets.wheel]
packages = [
"akql",
]

View File

@@ -529,3 +529,7 @@ class _PostgresConsumer(Consumer):
conn.close()
except DATABASE_ERRORS:
pass
try:
connections.close_all()
except DATABASE_ERRORS:
pass

View File

@@ -47,7 +47,7 @@ RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \
go build -o /go/proxy ./cmd/proxy
# Stage 3: Run
FROM ghcr.io/goauthentik/fips-debian:trixie-slim-fips@sha256:07f41ce3f15b2bb5eb5bcd4e6efc0cb42bb7e5609e7244f636da1a91166817ca
FROM ghcr.io/goauthentik/fips-debian:trixie-slim-fips@sha256:2f19fc114923ec0842329bf638cb155e597c4be9c8119a3db038ffc3fede9228
ARG VERSION
ARG GIT_BUILD_HASH

View File

@@ -6,6 +6,7 @@ authors = [{ name = "authentik Team", email = "hello@goauthentik.io" }]
requires-python = "==3.13.*"
dependencies = [
"ak-guardian==3.2.0",
"akql",
"argon2-cffi==25.1.0",
"channels==4.3.1",
"cryptography==45.0.5",
@@ -26,7 +27,6 @@ dependencies = [
"django-prometheus==2.4.1",
"django-storages[s3]==1.14.6",
"django-tenants==3.9.0",
"djangoql==0.18.1",
"djangorestframework==3.16.1",
"docker==7.1.0",
"drf-orjson-renderer==1.7.3",
@@ -121,6 +121,7 @@ no-binary-package = [
[tool.uv.sources]
ak-guardian = { workspace = true }
akql = { workspace = true }
django-channels-postgres = { workspace = true }
django-dramatiq-postgres = { workspace = true }
django-postgres-cache = { workspace = true }
@@ -129,6 +130,7 @@ opencontainers = { git = "https://github.com/vsoch/oci-python", rev = "ceb4fcc09
[tool.uv.workspace]
members = [
"packages/ak-guardian",
"packages/akql",
"packages/django-channels-postgres",
"packages/django-dramatiq-postgres",
"packages/django-postgres-cache",

View File

@@ -31,7 +31,7 @@ RUN --mount=type=cache,sharing=locked,target=/go/pkg/mod \
go build -o /go/radius ./cmd/radius
# Stage 2: Run
FROM ghcr.io/goauthentik/fips-debian:trixie-slim-fips@sha256:07f41ce3f15b2bb5eb5bcd4e6efc0cb42bb7e5609e7244f636da1a91166817ca
FROM ghcr.io/goauthentik/fips-debian:trixie-slim-fips@sha256:2f19fc114923ec0842329bf638cb155e597c4be9c8119a3db038ffc3fede9228
ARG VERSION
ARG GIT_BUILD_HASH

View File

@@ -4728,11 +4728,6 @@ paths:
schema:
type: boolean
description: Only return certificate-key pairs with keys
- in: query
name: include_details
schema:
type: boolean
default: true
- in: query
name: key_type
schema:
@@ -34992,25 +34987,25 @@ components:
type: string
fingerprint_sha256:
type: string
nullable: true
description: Get certificate Hash (SHA256)
readOnly: true
nullable: true
description: SHA256 fingerprint of the certificate
fingerprint_sha1:
type: string
nullable: true
description: Get certificate Hash (SHA1)
readOnly: true
nullable: true
description: SHA1 fingerprint of the certificate
cert_expiry:
type: string
format: date-time
nullable: true
description: Get certificate expiry
readOnly: true
nullable: true
description: Certificate expiry date
cert_subject:
type: string
nullable: true
description: Get certificate subject as full rfc4514
readOnly: true
nullable: true
description: Certificate subject as RFC4514 string
private_key_available:
type: boolean
description: Show if this keypair has a private key configured or not
@@ -35018,8 +35013,9 @@ components:
key_type:
allOf:
- $ref: '#/components/schemas/KeyTypeEnum'
nullable: true
readOnly: true
nullable: true
description: Key algorithm type detected from the certificate's public key
certificate_download_url:
type: string
description: Get URL to download certificate

View File

@@ -5,11 +5,11 @@ services:
restart: never
network_mode: none
volumes:
- ${LOCAL_PROJECT_DIR:-../../}:/local
- ../../:/local
gen:
image: docker.io/openapitools/openapi-generator-cli:v7.16.0
restart: never
network_mode: none
volumes:
- ${LOCAL_PROJECT_DIR:-../../}:/local
- ../../:/local

uv.lock generated
View File

@@ -5,6 +5,7 @@ requires-python = "==3.13.*"
[manifest]
members = [
"ak-guardian",
"akql",
"authentik",
"django-channels-postgres",
"django-dramatiq-postgres",
@@ -93,6 +94,17 @@ requires-dist = [
{ name = "typing-extensions", marker = "python_full_version < '3.15'", specifier = ">=4.12.0" },
]
[[package]]
name = "akql"
version = "3.2.0"
source = { editable = "packages/akql" }
dependencies = [
{ name = "ply" },
]
[package.metadata]
requires-dist = [{ name = "ply", specifier = ">=3.8" }]
[[package]]
name = "annotated-types"
version = "0.7.0"
@@ -189,6 +201,7 @@ version = "2026.2.0rc1"
source = { editable = "." }
dependencies = [
{ name = "ak-guardian" },
{ name = "akql" },
{ name = "argon2-cffi" },
{ name = "channels" },
{ name = "cryptography" },
@@ -209,7 +222,6 @@ dependencies = [
{ name = "django-prometheus" },
{ name = "django-storages", extra = ["s3"] },
{ name = "django-tenants" },
{ name = "djangoql" },
{ name = "djangorestframework" },
{ name = "docker" },
{ name = "drf-orjson-renderer" },
@@ -293,6 +305,7 @@ dev = [
[package.metadata]
requires-dist = [
{ name = "ak-guardian", editable = "packages/ak-guardian" },
{ name = "akql", editable = "packages/akql" },
{ name = "argon2-cffi", specifier = "==25.1.0" },
{ name = "channels", specifier = "==4.3.1" },
{ name = "cryptography", specifier = "==45.0.5" },
@@ -313,7 +326,6 @@ requires-dist = [
{ name = "django-prometheus", specifier = "==2.4.1" },
{ name = "django-storages", extras = ["s3"], specifier = "==1.14.6" },
{ name = "django-tenants", specifier = "==3.9.0" },
{ name = "djangoql", specifier = "==0.18.1" },
{ name = "djangorestframework", specifier = "==3.16.1" },
{ name = "docker", specifier = "==7.1.0" },
{ name = "drf-orjson-renderer", specifier = "==1.7.3" },
@@ -1252,17 +1264,6 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b7/57/918cfca627fcdc3441981dddc72a22be02e57abdb5391eb7339ea77a5ef4/django_tenants-3.9.0-py3-none-any.whl", hash = "sha256:14421088a4336444e2c4af54f21a6af2e57e53dcf95ba5d19b5fa17142cb460b", size = 215955, upload-time = "2025-09-06T21:46:05.939Z" },
]
[[package]]
name = "djangoql"
version = "0.18.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "ply" },
]
wheels = [
{ url = "https://files.pythonhosted.org/packages/9e/0a/83cdb7b9d3b854b98941363153945f6c051b3bc50cd61108a85677c98c3a/djangoql-0.18.1-py2.py3-none-any.whl", hash = "sha256:51b3085a805627ebb43cfd0aa861137cdf8f69cc3c9244699718fe04a6c8e26d", size = 218209, upload-time = "2024-01-08T14:10:47.915Z" },
]
[[package]]
name = "djangorestframework"
version = "3.16.1"

View File

@@ -67,24 +67,9 @@ export class AkCryptoCertificateSearch extends CustomListenerElement(AKElement)
@property({ type: Boolean, attribute: "singleton" })
public singleton = false;
/**
* Set to `true` to include certificate details (fingerprints, expiry, certificate subject, key type)
* in the API response.
* Each returned certificate's PEM data must be parsed using cryptography library,
* public keys extracted, and hashes computed. With large result sets, this can add a lot of time
* to responses.
* Only enable when you actually need the detailed fields displayed in the UI.
* For simple certificate selection dropdowns, leave this as `false` (default).
* @attr
*/
@property({ type: Boolean, attribute: "include-details" })
public includeDetails = false;
/**
* When allowedKeyTypes is set, only certificates or keypairs with matching
* key algorithms will be shown. Since certificates must be parsed to
* extract algorithm details, an instance with many certificates may experience
* long delays and server performance slowdowns. Avoid setting this field whenever possible.
* key algorithms will be shown.
* @attr
* @example [KeyTypeEnum.Rsa, KeyTypeEnum.Ec]
*/
@@ -123,7 +108,6 @@ export class AkCryptoCertificateSearch extends CustomListenerElement(AKElement)
const args: CryptoCertificatekeypairsListRequest = {
ordering: "name",
hasKey: !this.noKey,
includeDetails: this.includeDetails,
};
if (query !== undefined) {
args.search = query;

View File

@@ -45,9 +45,9 @@ export class CertificateKeyPairListPage extends TablePage<CertificateKeyPair> {
static styles: CSSResult[] = [...super.styles, PFDescriptionList];
async apiEndpoint(): Promise<PaginatedResponse<CertificateKeyPair>> {
return new CryptoApi(DEFAULT_CONFIG).cryptoCertificatekeypairsList(
await this.defaultEndpointConfig(),
);
return new CryptoApi(DEFAULT_CONFIG).cryptoCertificatekeypairsList({
...(await this.defaultEndpointConfig()),
});
}
protected columns: TableColumn[] = [

View File

@@ -394,6 +394,8 @@ When documenting errors, follow this structure:
- **Diagrams**:
- Use [Mermaid](https://mermaid.js.org/) for creating diagrams directly in markdown. Mermaid is our preferred tool for documentation diagrams as it allows for version control and easy updates.
- For more complex diagrams, you can use tools like [Draw.io](https://draw.io). Ensure high contrast and text descriptions.
- **authentik icons**:
- For authentik icons in integration guides, reference assets from the user's own self-hosted instance to avoid external calls, for example: `https://authentik.company/static/dist/assets/icons/icon.svg`
---

View File

@@ -1,148 +0,0 @@
---
title: Devcontainer development environment
sidebar_label: Devcontainer development
tags:
- development
- contributor
- devcontainer
- docker
---
If you prefer a containerized development environment with all dependencies pre-configured, you can use the devcontainer setup. This provides a fully isolated development environment that runs inside Docker. The devcontainer mounts your local workspace into the container, so changes to files are reflected immediately.
### Prerequisites
- [Docker](https://www.docker.com/) (Latest Community Edition or Docker Desktop)
- [Visual Studio Code](https://code.visualstudio.com/) with the [Dev Containers extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers)
Alternatively, you can use any IDE or editor that supports the [devcontainer specification](https://containers.dev/).
### Instructions
1. Clone the Git repo to your development machine and navigate to the authentik directory.
```shell
git clone https://github.com/goauthentik/authentik
cd authentik
```
2. Open the repository in Visual Studio Code.
```shell
code .
```
3. When prompted, click "Reopen in Container" or run the command "Dev Containers: Reopen in Container" from the command palette (Ctrl+Shift+P / Cmd+Shift+P).
4. VS Code will build the devcontainer image and start the container. This may take several minutes on the first run.
5. Once the container is running, all development tools and dependencies will be available inside the container environment.
### What's included
The devcontainer provides:
- Pre-configured development environment with all required dependencies
- Python, Go, and Node.js development tools
- PostgreSQL database
- All necessary system packages
### Running authentik
After the devcontainer starts, you can run authentik using the standard development commands:
Start the server:
```shell
make run-server
```
In a separate terminal, start the worker:
```shell
make run-worker
```
For frontend development:
```shell
make web-watch
```
authentik will be accessible at http://localhost:9000.
### Initial setup
To set a password for the default admin user (**akadmin**):
1. Navigate to http://localhost:9000/if/flow/initial-setup/ in your browser.
2. Follow the prompts to set up your admin account.
### Hot-reloading
When `AUTHENTIK_DEBUG` is set to `true` (the default for the development environment), the authentik server automatically reloads whenever changes are made to the code. However, due to instabilities in the reloading process of the worker, that behavior is turned off for the worker. You can enable code reloading in the worker by manually running `uv run ak worker --watch`.
## End-to-End (E2E) Setup
Start the E2E test services with the following command:
```shell
docker compose -f tests/e2e/docker-compose.yml up -d
```
You can then view the Selenium Chrome browser via http://localhost:7900/ using the password: `secret`.
Alternatively, you can connect directly via VNC on port `5900` using the password: `secret`.
:::info
When using Docker Desktop, host networking needs to be enabled via **Docker Settings** > **Resources** > **Network** > **Enable host networking**.
:::
## 6. Contributing code
### Before submitting a pull request
Ensure your code meets our quality standards by running:
1. **Code linting**:
```shell
make lint-fix
make lint
```
2. **Generate updated API documentation**:
```shell
make gen
```
3. **Format frontend code**:
```shell
make web
```
4. **Run tests**:
```shell
make test
```
You can run all these checks at once with:
```shell
make all
```
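To avoid forgetting these checks before pushing, you can run them from a Git pre-push hook. A sketch, assuming you are at the repository root; the hook simply chains the make targets listed above:

```shell
# Run from the repository root; creates .git/hooks/pre-push.
mkdir -p .git/hooks
cat > .git/hooks/pre-push <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
make lint-fix lint gen web test
EOF
chmod +x .git/hooks/pre-push
```

Git runs the hook before every `git push`; delete the file to disable it again.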
### Submitting your changes
After your code passes all checks, submit a pull request on [GitHub](https://github.com/goauthentik/authentik/pulls). Be sure to:
- Provide a clear description of your changes
- Reference any related issues
- Follow our code style guidelines
- Update any related documentation
- Include tests for your changes where appropriate
Thank you for contributing to authentik!


@@ -707,7 +707,6 @@ const items = [
id: "developer-docs/setup/index",
},
items: [
"developer-docs/setup/devcontainer",
"developer-docs/setup/full-dev-environment",
"developer-docs/setup/frontend-dev-environment",
"developer-docs/setup/debugging",


@@ -0,0 +1,56 @@
---
title: Integrate with Jellyseerr
sidebar_label: Jellyseerr
support_level: community
---
## What is Jellyseerr
> Jellyseerr is a free and open source application for managing requests in your media library. It integrates with media servers like Jellyfin, Plex, and Emby, and services such as Sonarr and Radarr.
>
> -- https://docs.seerr.dev/
## Preparation
- `jellyseerr.company` is the FQDN of the Jellyseerr installation.
- `authentik.company` is the FQDN of the authentik installation.
## authentik configuration
To support the integration of Jellyseerr with authentik, you need to create an application/provider pair in authentik.
1. Log in to authentik as an administrator and open the authentik Admin interface.
2. Navigate to **Applications** > **Applications** and click **Create with Provider** to create an application and provider pair. (Alternatively you can first create a provider separately, then create the application and connect it with the provider.)
- **Application**: provide a descriptive name, a slug, an optional group for the type of application, the policy engine mode, and optional UI settings. Take note of the **Slug** value as it will be required later.
- **Choose a Provider type**: OAuth2/OpenID
- **Configure the Provider**: provide a name (or accept the auto-provided name), the authorization flow to use for this provider, and any required configurations.
- Note the **Client ID** and **Client Secret** values because they will be required later.
- Set a `Strict` redirect URI to `https://jellyseerr.company/login?provider=authentik&callback=true`.
- Select any available signing key.
- **Configure Bindings** _(optional):_ you can create a [binding](https://docs.goauthentik.io/docs/add-secure-apps/flows-stages/bindings/) (policy, group, or user) to manage the listing and access to applications on a user's **My applications** page.
3. Click **Submit** to save the new application and provider.
## Jellyseerr configuration
:::info
Jellyseerr OpenID Connect support is currently in preview; make sure to use the `preview-OIDC` Docker tag.
:::
1. Log in to Jellyseerr with an administrator account.
2. Navigate to **Settings** > **Users**.
3. Toggle on **Enable OpenID Connect Sign-In** and click on the cogwheel icon next to it.
4. Click **Add OpenID Connect Provider** and configure the following settings:
- **Provider Name**: `authentik`
- **Logo**: `https://authentik.company/static/dist/assets/icons/icon.svg`
- **Issuer URL**: `https://authentik.company/application/o/jellyseerr/`
- **Client ID**: Client ID from authentik
- **Client Secret**: Client Secret from authentik
- Under **Advanced Settings**:
- **Scopes**: `openid profile email groups`
- **Allow New Users**: Enabled
5. Click **Save Changes**.
## Configuration verification
To verify that authentik is correctly set up with Jellyseerr, log out of Jellyseerr and try logging back in using the authentik button. You should be redirected to authentik, and once authenticated you will be signed in to Jellyseerr.
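As an additional check, the provider's OIDC discovery document should be reachable under the issuer URL configured above (per OIDC Discovery, `.well-known/openid-configuration` is appended to the issuer). A small helper that derives the discovery URL, with an example `curl` invocation:

```shell
# Derive the OIDC discovery URL from an issuer URL (trailing slash optional).
discovery_url() {
  local issuer="${1%/}"
  echo "${issuer}/.well-known/openid-configuration"
}

# Example (replace with your real issuer):
# curl -fsS "$(discovery_url https://authentik.company/application/o/jellyseerr/)"
```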


@@ -38,7 +38,7 @@ The steps to configure authentik include creating an application and provider pa
- Note the **Client ID**, **Client Secret**, and **slug** values because they will be required later.
- Set a `Strict` redirect URI to `https://beszel.company/api/oauth2-redirect`.
- Select any available signing key.
- **Configure Bindings** _(optional):_ you can create a [binding](https://docs.goauthentik.io/docs/add-secure-apps/flows-stages/bindings/) (policy, group, or user) to manage the listing and access to applications on a users \***\*My applications** \*_page_.\*
- **Configure Bindings** _(optional):_ you can create a [binding](https://docs.goauthentik.io/docs/add-secure-apps/flows-stages/bindings/) (policy, group, or user) to manage the listing and access to applications on a users **My applications** page.
3. Click **Submit** to save the new application and provider.