(all) refactor several areas before 0.1 release (#265)

We are going to release a 0.1 version soon, along with our first production deployment. From then on, migrations and a consistent developer experience will be officially supported. To make that easier, this large patch cleans up several areas:
 * Reset migrations one last time
 * Update models for storage efficiency (move TextChoices to IntegerChoices on high-volume tables)
 * Use Blobs for mail MIME data and draft bodies. Keeping them in a separate PG table is a first step; we will later offload them to object storage.
 * Add default ZSTD compression to blobs
 * Add per-domain DKIM keys
 * Add DNS check and provisioning, with a first Scaleway provider
 * Fix Keycloak user provisioning
 * Fix attachment storage: attachments are now stored individually only at the drafting stage; afterwards they are extracted from the main blob. This may be optimized later, but at least we only store each attachment once. For JMAP compatibility, this requires using fake IDs in the blob API route.
 * Add a management command and recurring task to retry unsent messages
 * Improve the local developer experience with new ports and make commands
 * Repackage MTA-in and MTA-out to be closer to the backend: Poetry, multi-stage Dockerfile, compose file and Makefile moved to the root
 * Migrate to OpenSearch
 * Improve overall documentation and add a self-hosting page
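The TextChoices → IntegerChoices move above can be sketched as follows. This is a minimal, Django-free sketch using the standard library; the real models use Django's `models.IntegerChoices`, and the enum and value names here are hypothetical:

```python
from enum import IntEnum


class MessageStatus(IntEnum):
    """Hypothetical status enum; Django's models.IntegerChoices
    behaves like IntEnum with human-readable labels added."""
    DRAFT = 1
    SENT = 2
    FAILED = 3


# Before: a TextChoices column stored strings like "draft" on every
# row of a high-volume table; after: a 2-byte smallint.
LEGACY_TO_INT = {
    "draft": MessageStatus.DRAFT,
    "sent": MessageStatus.SENT,
    "failed": MessageStatus.FAILED,
}


def migrate_value(old: str) -> int:
    """Map a legacy text value to its integer replacement."""
    return int(LEGACY_TO_INT[old])
```

A data migration would apply `migrate_value` row by row (or as a single `CASE` expression in SQL) before swapping the column type.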

Contributes to #177 and #185
Author: Sylvain Zimmer
Date: 2025-07-15 10:41:55 +02:00
Committed by: GitHub
Parent: e1940295dc
Commit: f1a89a5bdb
157 changed files with 8323 additions and 2399 deletions


@@ -37,7 +37,7 @@ jobs:
- name: Collect static files
run: make collectstatic
- name: Run backend tests
run: make test-back
run: make back-test
test-front:
runs-on: ubuntu-latest
@@ -47,9 +47,9 @@ jobs:
- name: Create env files
run: make create-env-files
- name: Install frontend dependencies
run: make frontend-install-frozen
run: make front-install-frozen
- name: Run frontend tests
run: make frontend-test
run: make front-test
lint-front:
runs-on: ubuntu-latest
@@ -59,9 +59,9 @@ jobs:
- name: Create env files
run: make create-env-files
- name: Install frontend dependencies
run: make frontend-install-frozen
run: make front-install-frozen
- name: Run frontend linting
run: make frontend-lint
run: make front-lint
build-front:
runs-on: ubuntu-latest
@@ -71,9 +71,9 @@ jobs:
- name: Create env files
run: make create-env-files
- name: Install frontend dependencies
run: make frontend-install-frozen
run: make front-install-frozen
- name: Build frontend
run: make frontend-build
run: make front-build
check-api-state:
runs-on: ubuntu-latest
@@ -99,7 +99,8 @@ jobs:
secrets: inherit
with:
image_name: "${{ github.repository }}-mta-in"
context: "src/mta-in/postfix"
context: "src/mta-in"
target: runtime-prod
docker-publish-mta-out:
uses: ./.github/workflows/docker-publish.yml
@@ -111,4 +112,5 @@ jobs:
secrets: inherit
with:
image_name: "${{ github.repository }}-mta-out"
context: "src/mta-out/postfix"
context: "src/mta-out"
target: runtime-prod

.gitignore

@@ -41,7 +41,7 @@ ENV/
env.bak/
venv.bak/
env.d/development/*
!env.d/development/*.dist
!env.d/development/*.defaults
env.d/terraform
# npm


@@ -62,7 +62,7 @@ It is nice to add information about the purpose of the pull request to help revi
- signoff your commits
- sign your commits with your key (SSH, GPG etc.)
- check your commits (see warnings above)
- check the linting: `make lint && make frontend-lint`
- check the linting: `make lint`
- check the tests: `make test`
Once all the required tests have passed, you can request a review from the project maintainers.

Makefile

@@ -43,13 +43,13 @@ COMPOSE_EXEC = $(COMPOSE) exec
COMPOSE_EXEC_APP = $(COMPOSE_EXEC) backend-dev
COMPOSE_RUN = $(COMPOSE) run --rm
COMPOSE_RUN_APP = $(COMPOSE_RUN) backend-dev
COMPOSE_RUN_APP_DB = $(COMPOSE_RUN) backend-db
COMPOSE_RUN_APP_TOOLS = $(COMPOSE_RUN) --no-deps backend-dev
COMPOSE_RUN_CROWDIN = $(COMPOSE_RUN) crowdin crowdin
COMPOSE_RUN_MTA_IN_TESTS = cd src/mta-in && $(COMPOSE_RUN) --build test
COMPOSE_RUN_MTA_OUT_TESTS = cd src/mta-out && $(COMPOSE_RUN) --build test
# -- Backend
MANAGE = $(COMPOSE_RUN_APP) python manage.py
MANAGE_DB = $(COMPOSE_RUN_APP_DB) python manage.py
# ==============================================================================
@@ -65,157 +65,194 @@ data/static:
# -- Project
create-env-files: ## Copy the dist env files to env files
create-env-files: ## Create empty .local env files for local development
create-env-files: \
env.d/development/common \
env.d/development/crowdin \
env.d/development/postgresql \
env.d/development/kc_postgresql \
env.d/development/backend \
env.d/development/mta-in \
env.d/development/mta-out
env.d/development/common.local \
env.d/development/crowdin.local \
env.d/development/postgresql.local \
env.d/development/keycloak.local \
env.d/development/backend.local \
env.d/development/frontend.local \
env.d/development/mta-in.local \
env.d/development/mta-out.local
.PHONY: create-env-files
bootstrap: ## Prepare Docker images for the project
bootstrap: \
data/media \
data/static \
create-env-files \
build \
migrate \
collectstatic \
frontend-install-frozen \
# back-i18n-compile
bootstrap: ## Prepare the project for local development
@echo "$(BOLD)"
@echo "╔══════════════════════════════════════════════════════════════════════════════╗"
@echo "║ ║"
@echo "║ 🚀 Welcome to Messages - Collaborative Inbox from La Suite! 🚀 ║"
@echo "║ ║"
@echo "║ This will set up your development environment with : ║"
@echo "║ • Docker containers for all services ║"
@echo "║ • Database migrations and static files ║"
@echo "║ • Frontend dependencies and build ║"
@echo "║ • Environment configuration files ║"
@echo "║ ║"
@echo "║ Services will be available at: ║"
@echo "║ • Frontend: http://localhost:8900 ║"
@echo "║ • API: http://localhost:8901 ║"
@echo "║ • Admin: http://localhost:8901/admin ║"
@echo "║ ║"
@echo "╚══════════════════════════════════════════════════════════════════════════════╝"
@echo "$(RESET)"
@echo "$(GREEN)Starting bootstrap process...$(RESET)"
@echo ""
@$(MAKE) update
@$(MAKE) superuser
@$(MAKE) start
@echo ""
@echo "$(GREEN)🎉 Bootstrap completed successfully!$(RESET)"
@echo ""
@echo "$(BOLD)Next steps:$(RESET)"
@echo " • Visit http://localhost:8900 to access the application"
@echo " • Run 'make help' to see all available commands"
@echo ""
.PHONY: bootstrap
update: ## Update the project with latest changes
@$(MAKE) data/media
@$(MAKE) data/static
@$(MAKE) create-env-files
@$(MAKE) build
@$(MAKE) collectstatic
@$(MAKE) migrate
@$(MAKE) front-install-frozen
# @$(MAKE) back-i18n-compile
.PHONY: update
# -- Docker/compose
build: ## build the project containers
@$(MAKE) build-backend
@$(MAKE) build-frontend-dev
@$(COMPOSE) build
.PHONY: build
build-backend: ## build the backend-dev container
@$(COMPOSE) build backend-dev
.PHONY: build-backend
build-frontend-dev: ## build the frontend container
@$(COMPOSE) build frontend-dev
.PHONY: build-frontend-dev
build-frontend: ## build the frontend container
@$(COMPOSE) build frontend
.PHONY: build-frontend
down: ## stop and remove containers, networks, images, and volumes
@$(COMPOSE) down
.PHONY: down
logs: ## display backend-dev logs (follow mode)
@$(COMPOSE) logs -f backend-dev
logs: ## display all services logs (follow mode)
@$(COMPOSE) logs -f
.PHONY: logs
start: ## start the wsgi (production) and development server
start: ## start all development services
@$(COMPOSE) up --force-recreate --build -d frontend-dev backend-dev celery-dev mta-in
.PHONY: start
run-with-frontend: ## Start all the containers needed (backend to frontend)
@$(MAKE) run
@$(COMPOSE) up --force-recreate -d frontend-dev
.PHONY: run-with-frontend
run-all-fg: ## Start backend containers and frontend in foreground
@$(COMPOSE) up --force-recreate --build frontend-dev backend-dev celery-dev mta-in
.PHONY: run-all-fg
start-minimal: ## start minimal services (backend, frontend, keycloak and DB)
@$(COMPOSE) up --force-recreate --build -d backend-db frontend-dev keycloak
.PHONY: start-minimal
status: ## an alias for "docker compose ps"
@$(COMPOSE) ps
.PHONY: status
stop: ## stop the development server using Docker
stop: ## stop all development services
@$(COMPOSE) stop
.PHONY: stop
# -- Backend
demo: ## flush db then create a demo for load testing purpose
@$(MAKE) resetdb
@$(MANAGE) create_demo
.PHONY: demo
# -- Linters
lint: ## run all linters
lint: \
lint-ruff-format \
lint-check
back-lint \
front-lint \
mta-in-lint \
mta-out-lint
.PHONY: lint
## Check-only version
lint-check: \
lint-ruff-check \
lint-back \
lint-mta-in \
lint-mta-out
.PHONY: lint-check
back-lint: ## run back-end linters
back-lint: \
back-ruff-format \
back-ruff-check \
back-pylint
.PHONY: back-lint
lint-ruff-format: ## format back-end python sources with ruff
@echo 'lint:ruff-format started…'
back-ruff-format: ## format back-end python sources with ruff
@$(COMPOSE_RUN_APP_TOOLS) ruff format .
.PHONY: lint-ruff-format
.PHONY: back-ruff-format
lint-ruff-check: ## lint back-end python sources with ruff
@echo 'lint:ruff-check started…'
back-ruff-check: ## lint back-end python sources with ruff
@$(COMPOSE_RUN_APP_TOOLS) ruff check . --fix
.PHONY: lint-ruff-check
.PHONY: back-ruff-check
lint-back: ## lint back-end python sources with pylint
@echo 'lint:pylint started…'
back-pylint: ## lint back-end python sources with pylint
@$(COMPOSE_RUN_APP_TOOLS) sh -c "pylint ."
.PHONY: lint-back
.PHONY: back-pylint
lint-mta-in: ## lint mta-in python sources with pylint
@echo 'lint:mta-in started…'
@$(COMPOSE_RUN_MTA_IN_TESTS) ruff format .
@$(COMPOSE_RUN_MTA_IN_TESTS) ruff check . --fix
# @$(COMPOSE_RUN_MTA_IN_TESTS) pylint .
.PHONY: lint-mta-in
front-ts-check: ## run the frontend type checker
@$(COMPOSE) run --rm frontend-tools npm run ts:check
.PHONY: front-ts-check
lint-mta-out: ## lint mta-out python sources with pylint
@echo 'lint:mta-out started…'
@$(COMPOSE_RUN_MTA_OUT_TESTS) ruff format .
@$(COMPOSE_RUN_MTA_OUT_TESTS) ruff check . --fix
.PHONY: lint-mta-out
front-lint: ## run the frontend linter
@$(COMPOSE) run --rm frontend-tools npm run lint
.PHONY: front-lint
test: ## run project tests
@$(MAKE) test-back-parallel
mta-in-lint: ## lint mta-in python sources with pylint
$(COMPOSE_RUN) --rm -e EXEC_CMD_ONLY=true mta-in-test ruff format .
#$(COMPOSE_RUN) --rm -e EXEC_CMD_ONLY=true mta-in-test ruff check . --fix
#$(COMPOSE_RUN) --rm -e EXEC_CMD_ONLY=true mta-in-test pylint .
.PHONY: mta-in-lint
mta-out-lint: ## lint mta-out python sources with pylint
$(COMPOSE_RUN) --rm -e EXEC_CMD_ONLY=true mta-out-test ruff format .
.PHONY: mta-out-lint
# -- Tests
test: ## run all tests
test: \
back-test \
front-test \
mta-in-test \
mta-out-test
.PHONY: test
test-back: ## run back-end tests
back-test: ## run back-end tests
@args="$(filter-out $@,$(MAKECMDGOALS))" && \
bin/pytest $${args:-${1}}
.PHONY: test-back
.PHONY: back-test
test-back-parallel: ## run all back-end tests in parallel
back-test-parallel: ## run all back-end tests in parallel
@args="$(filter-out $@,$(MAKECMDGOALS))" && \
bin/pytest -n auto $${args:-${1}}
.PHONY: test-back-parallel
.PHONY: back-test-parallel
makemigrations: ## run django makemigrations for the messages project.
front-test: ## run the frontend tests
@$(COMPOSE) run --rm frontend-tools npm run test
.PHONY: front-test
front-test-amd64: ## run the frontend tests in amd64
@$(COMPOSE) run --rm frontend-tools-amd64 npm run test
.PHONY: front-test
mta-in-test: ## run the mta-in tests
@$(COMPOSE) run --build --rm mta-in-test
.PHONY: mta-in-test
mta-out-test: ## run the mta-out tests
@$(COMPOSE) run --build --rm mta-out-test
.PHONY: mta-out-test
# -- Backend
migrations: ## run django makemigrations for the messages project.
@echo "$(BOLD)Running makemigrations$(RESET)"
@$(COMPOSE) up -d postgresql
@$(MANAGE) makemigrations
.PHONY: makemigrations
@$(MANAGE_DB) makemigrations
.PHONY: migrations
migrate: ## run django migrations for the messages project.
@echo "$(BOLD)Running migrations$(RESET)"
@$(COMPOSE) up -d postgresql
@$(MANAGE) migrate
@$(MANAGE_DB) migrate
.PHONY: migrate
showmigrations: ## show all migrations for the messages project.
@$(MANAGE) showmigrations
@$(MANAGE_DB) showmigrations
.PHONY: showmigrations
superuser: ## Create an admin superuser with password "admin"
@echo "$(BOLD)Creating a Django superuser$(RESET)"
@$(MANAGE) createsuperuser --email admin@example.com --password admin
@$(MANAGE_DB) createsuperuser --email admin@admin.local --password admin
.PHONY: superuser
back-i18n-compile: ## compile the gettext files
@@ -230,12 +267,17 @@ back-shell: ## open a shell in the backend container
@$(COMPOSE) run --rm --build backend-dev /bin/bash
.PHONY: back-shell
back-shell-no-deps: ## open a shell in the backend container without dependencies
@$(COMPOSE) run --rm --no-deps --build backend-dev /bin/bash
.PHONY: back-shell-no-deps
back-exec: ## open a shell in the running backend-dev container
@$(COMPOSE) exec backend-dev /bin/bash
.PHONY: back-exec
back-poetry-lock: ## lock the dependencies
@$(COMPOSE) run --rm --build backend-poetry poetry lock
make pip-audit
.PHONY: back-poetry-lock
back-poetry-check: ## check the dependencies
@@ -246,13 +288,17 @@ back-poetry-outdated: ## show outdated dependencies
@$(COMPOSE) run --rm --build backend-poetry poetry show --outdated
.PHONY: back-poetry-outdated
pip-audit: ## check the dependencies
@$(COMPOSE) run --rm --no-deps -e HOME=/tmp --build backend-dev pip-audit
.PHONY: pip-audit
collectstatic: ## collect static files
@$(MANAGE) collectstatic --noinput
@$(MANAGE_DB) collectstatic --noinput
.PHONY: collectstatic
shell: ## connect to django shell
@$(MANAGE) shell #_plus
.PHONY: dbshell
.PHONY: shell
keycloak-export: ## export all keycloak data to a JSON file
@$(COMPOSE) run -v `pwd`/src/keycloak:/tmp/keycloak-export --rm keycloak export --realm messages --file /tmp/keycloak-export/realm.json
@@ -260,45 +306,54 @@ keycloak-export: ## export all keycloak data to a JSON file
# -- Database
dbshell: ## connect to database shell
db-shell: ## connect to database shell
docker compose exec backend-dev python manage.py dbshell
.PHONY: dbshell
.PHONY: db-shell
resetdb: FLUSH_ARGS ?=
resetdb: ## flush database
db-reset: FLUSH_ARGS ?=
db-reset: ## flush database
@echo "$(BOLD)Flush database$(RESET)"
@$(MANAGE) flush $(FLUSH_ARGS)
.PHONY: resetdb
@$(MANAGE_DB) flush $(FLUSH_ARGS)
.PHONY: db-reset
fullresetdb: build ## flush database, including schema
db-reset-full: build ## flush database, including schema
@echo "$(BOLD)Flush database$(RESET)"
$(MANAGE) drop_all_tables
$(MANAGE) migrate
.PHONY: fullresetdb
$(MANAGE_DB) drop_all_tables
$(MANAGE_DB) migrate
.PHONY: db-reset-full
env.d/development/common:
cp -n env.d/development/common.dist env.d/development/common
env.d/development/%.local:
@echo "# Local development overrides for $(notdir $*)" > $@
@echo "# Add your local-specific environment variables below:" >> $@
@echo "# Example: DJANGO_DEBUG=True" >> $@
@echo "" >> $@
env.d/development/backend:
cp -n env.d/development/backend.dist env.d/development/backend
# env.d/development/common.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/common.local
env.d/development/mta-in:
cp -n env.d/development/mta-in.dist env.d/development/mta-in
# env.d/development/backend.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/backend.local
env.d/development/postgresql:
cp -n env.d/development/postgresql.dist env.d/development/postgresql
# env.d/development/frontend.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/frontend.local
env.d/development/kc_postgresql:
cp -n env.d/development/kc_postgresql.dist env.d/development/kc_postgresql
# env.d/development/mta-in.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/mta-in.local
env.d/development/mta-out:
cp -n env.d/development/mta-out.dist env.d/development/mta-out
# env.d/development/postgresql.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/postgresql.local
# env.d/development/keycloak.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/keycloak.local
# env.d/development/mta-out.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/mta-out.local
# env.d/development/crowdin.local:
# @echo "# Put your local-specific, gitignored env vars here" > env.d/development/crowdin.local
# -- Internationalization
env.d/development/crowdin:
cp -n env.d/development/crowdin.dist env.d/development/crowdin
crowdin-download: ## Download translated message from crowdin
@$(COMPOSE_RUN_CROWDIN) download -c crowdin/config.yml
.PHONY: crowdin-download
@@ -314,13 +369,13 @@ crowdin-upload: ## Upload source translations to crowdin
i18n-compile: ## compile all translations
i18n-compile: \
back-i18n-compile \
frontend-i18n-compile
front-i18n-compile
.PHONY: i18n-compile
i18n-generate: ## create the .pot files and extract frontend messages
i18n-generate: \
back-i18n-generate \
frontend-i18n-generate
front-i18n-generate
.PHONY: i18n-generate
i18n-download-and-compile: ## download all translated messages and compile them to be used by all applications
@@ -354,72 +409,64 @@ help:
@grep -E '^[a-zA-Z0-9_-]+:.*?## .*$$' $(firstword $(MAKEFILE_LIST)) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "$(GREEN)%-30s$(RESET) %s\n", $$1, $$2}'
.PHONY: help
frontend-shell: ## open a shell in the frontend container
front-shell: ## open a shell in the frontend container
@$(COMPOSE) run --rm frontend-tools /bin/sh
.PHONY: frontend-shell
.PHONY: front-shell
# Front
frontend-install: ## install the frontend locally
front-install: ## install the frontend locally
@$(COMPOSE) run --rm frontend-tools npm install
.PHONY: frontend-install
.PHONY: front-install
frontend-install-frozen: ## install the frontend locally, following the frozen lockfile
front-install-frozen: ## install the frontend locally, following the frozen lockfile
@echo "Installing frontend dependencies, this might take a few minutes..."
@$(COMPOSE) run --rm frontend-tools npm ci
.PHONY: frontend-install-frozen
.PHONY: front-install-frozen
frontend-install-frozen-amd64: ## install the frontend locally, following the frozen lockfile
front-install-frozen-amd64: ## install the frontend locally, following the frozen lockfile
@$(COMPOSE) run --rm frontend-tools-amd64 npm ci
.PHONY: frontend-install-frozen-amd64
.PHONY: front-install-frozen-amd64
frontend-build: ## build the frontend locally
front-build: ## build the frontend locally
@$(COMPOSE) run --rm frontend-tools npm run build
.PHONY: frontend-build
.PHONY: front-build
frontend-ts-check: ## build the frontend locally
@$(COMPOSE) run --rm frontend-tools npm run ts:check
.PHONY: frontend-ts-check
frontend-lint: ## run the frontend linter
@$(COMPOSE) run --rm frontend-tools npm run lint
.PHONY: frontend-lint
frontend-test: ## run the frontend tests
@$(COMPOSE) run --rm frontend-tools npm run test
.PHONY: frontend-test
frontend-test-amd64: ## run the frontend tests
@$(COMPOSE) run --rm frontend-tools-amd64 npm run test
.PHONY: frontend-test
frontend-i18n-extract: ## Extract the frontend translation inside a json to be used for crowdin
front-i18n-extract: ## Extract the frontend translation inside a json to be used for crowdin
@$(COMPOSE) run --rm frontend-tools npm run i18n:extract
.PHONY: frontend-i18n-extract
.PHONY: front-i18n-extract
frontend-i18n-generate: ## Generate the frontend json files used for crowdin
frontend-i18n-generate: \
front-i18n-generate: ## Generate the frontend json files used for crowdin
crowdin-download-sources \
frontend-i18n-extract
.PHONY: frontend-i18n-generate
front-i18n-extract
.PHONY: front-i18n-generate
frontend-i18n-compile: ## Format the crowin json files used deploy to the apps
front-i18n-compile: ## Format the crowin json files used deploy to the apps
@$(COMPOSE) run --rm frontend-tools npm run i18n:deploy
.PHONY: frontend-i18n-compile
.PHONY: front-i18n-compile
back-api-update: ## Update the OpenAPI schema
bin/update_openapi_schema
.PHONY: back-api-update
frontend-api-update: ## Update the frontend API client
front-api-update: ## Update the frontend API client
@$(COMPOSE) run --rm frontend-tools npm run api:update
.PHONY: frontend-api-update
.PHONY: front-api-update
api-update: ## Update the OpenAPI schema then frontend API client
api-update: \
back-api-update \
frontend-api-update
front-api-update
.PHONY: api-update
elasticsearch-index: ## Create and/or reindex elasticsearch data
search-index: ## Create and/or reindex opensearch data
@$(MANAGE) es_create_index
@$(MANAGE) es_reindex --all
.PHONY: elasticsearch-index
.PHONY: search-index
mta-in-poetry-lock: ## lock the dependencies
@$(COMPOSE) run --rm --build mta-in-poetry poetry lock
.PHONY: mta-in-poetry-lock
mta-out-poetry-lock: ## lock the dependencies
@$(COMPOSE) run --rm --build mta-out-poetry poetry lock
.PHONY: mta-out-poetry-lock


@@ -1,3 +1,3 @@
web: bin/scalingo_run_web
worker: celery -A messages.celery_app worker --task-events --beat -l DEBUG -c $CELERY_CONCURRENCY
worker: celery -A messages.celery_app worker --task-events --beat -l INFO -c $CELERY_CONCURRENCY
postdeploy: python manage.py migrate


@@ -92,13 +92,13 @@ $ make bootstrap
```
This command builds all required containers, installs dependencies, performs
database migrations and compiles translations. It's a good idea to use this
command each time you are pulling code from the project repository to avoid
database migrations and compiles translations. Later it's a good idea to run
`make update` each time you are pulling code from the project repository to avoid
dependency-related or migration-related issues.
Your Docker services should now be up and running 🎉
You can access the project by going to <http://localhost:3000>.
You can access the project by going to <http://localhost:8900>.
You will be prompted to log in. The default credentials are:
@@ -122,18 +122,23 @@ $ make start
$ make help
```
### Django admin
### Development Services
You can access the Django admin site at
[http://localhost:8071/admin](http://localhost:8071/admin).
When running the project, the following services are available:
You first need to create a superuser account:
```bash
$ make superuser
```
You can then login with email `admin@admin.local` and password `admin`.
| Service | URL / Port | Description | Credentials |
|---------|------------|-------------|------------|
| **Frontend** | [http://localhost:8900](http://localhost:8900) | Main Messages frontend | `user1@example.local` / `user1` |
| **Backend API** | [http://localhost:8901](http://localhost:8901) | Django [REST API](http://localhost:8901/api/v1.0/) and [Admin](http://localhost:8901/admin/) | `admin@admin.local` / `admin` |
| **Keycloak** | [http://localhost:8902](http://localhost:8902) | Identity provider admin | `admin` / `admin` |
| **Celery UI** | [http://localhost:8903](http://localhost:8903) | Task queue monitoring | No auth required |
| **Mailcatcher** | [http://localhost:8904](http://localhost:8904) | Email testing interface | No auth required |
| **MTA-in (SMTP)** | 8910 | Incoming email server | No auth required |
| **MTA-out (SMTP)** | 8911 | Outgoing email server | `user` / `pass` |
| **PostgreSQL** | 8912 | Database server | `user` / `pass` |
| **Redis** | 8913 | Cache and message broker | No auth required |
| **OpenSearch** | 8914 | Search engine | No auth required |
| **OpenSearch PA** | 8915 | Performance analyzer | No auth required |
### OpenAPI client
@@ -154,13 +159,13 @@ $ make api-update
You can also generate the schema only with:
```bash
$ make backend-api-update
$ make back-api-update
```
And the frontend API client only with:
```bash
$ make frontend-api-update
$ make front-api-update
```
### Sending test emails 📨
@@ -171,19 +176,21 @@ These examples use [swaks](https://www.jetmore.org/john/code/swaks/), a simple c
```
# First, make sure services are running
make run
# Send a test message to the MTA-out, which will then relay it to mailcatcher. Read it on http://localhost:1081/
swaks -tls --to=test@example.com --server 127.0.0.1:8587 --auth-user testuser --auth-password=testpass
make start
# Send a test message to the MTA-in, which will relay it to the Django MDA.
# The domain must be MESSAGES_TESTDOMAIN if you want the mailbox created automatically.
# You can then read it on the frontend on http://localhost:3000/ (login as user1/user1) and reply to it there.
# The replies will then be sent through the MTA-out to the mailcatcher on http://localhost:1081/
swaks --to=user1@example.local --server 127.0.0.1:8025
# The domain must be MESSAGES_TESTDOMAIN (default is example.local) if you want the mailbox created automatically.
# You can then read it on the frontend at http://localhost:8900/ (login as user1/user1) and reply to it there.
# The replies will then be sent through the MTA-out to the mailcatcher on http://localhost:8904/
swaks --to=user1@example.local --server localhost:8910
# Send a message manually to the MTA-out, which will then relay it to mailcatcher on http://localhost:8904/
swaks -tls --to=test@example.external --server localhost:8911 --auth-user user --auth-password=pass
```
> ⚠️ Most residential ISPs block the outgoing port 25, so you might not be able to send emails to outside
> servers from your localhost. This is why the mailcatcher is so useful locally.
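The same smoke test can be scripted with Python's standard library instead of swaks. This is a sketch; the host, port, and recipient come from the services table in this commit and may differ in your setup:

```python
import smtplib
from email.message import EmailMessage


def build_test_message(to_addr: str) -> EmailMessage:
    """Build a minimal test email, like `swaks --to=...` does."""
    msg = EmailMessage()
    msg["From"] = "tester@example.external"
    msg["To"] = to_addr
    msg["Subject"] = "swaks-style test message"
    msg.set_content("Hello from the local dev stack.")
    return msg


def send_via_mta_in(msg: EmailMessage) -> None:
    """Deliver through MTA-in, equivalent to:
    swaks --to=user1@example.local --server localhost:8910"""
    with smtplib.SMTP("localhost", 8910) as smtp:
        smtp.send_message(msg)


# Uncomment with the dev stack running (`make start`):
# send_via_mta_in(build_test_message("user1@example.local"))
```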
## Feedback 🙋‍♂️🙋‍♀️


@@ -11,3 +11,5 @@ mv src/frontend/out build/frontend-out
mv src/backend/* ./
mv src/nginx/* ./
echo "3.13" > .python-version


@@ -4,49 +4,51 @@ services:
postgresql:
image: postgres:16.6
ports:
- "6434:5432"
- "8912:5432"
healthcheck:
test: ["CMD-SHELL", "pg_isready"]
interval: 1s
timeout: 2s
retries: 300
env_file:
- env.d/development/postgresql
- env.d/development/postgresql.defaults
- env.d/development/postgresql.local
redis:
image: redis:5
ports:
- "8913:6379"
elasticsearch:
# Same version as Scalingo
image: docker.elastic.co/elasticsearch/elasticsearch:7.10.2
opensearch:
image: opensearchproject/opensearch:2.19.2
environment:
- discovery.type=single-node
- "ES_JAVA_OPTS=-Xms512m -Xmx512m"
- xpack.security.enabled=false
- http.cors.enabled=true
- "http.cors.allow-origin=/.*/"
- bootstrap.memory_lock=true
- "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
- "DISABLE_INSTALL_DEMO_CONFIG=true"
- "DISABLE_SECURITY_PLUGIN=true"
# - http.cors.enabled=true
# - "http.cors.allow-origin=/.*/"
ports:
- "9200:9200"
- "8914:9200" # REST API
- "8915:9600" # Performance Analyzer
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:9200"]
interval: 1s
timeout: 5s
retries: 60
elasticsearch-ui:
image: cars10/elasticvue:latest
ports:
- "8093:8080"
environment:
- "ELASTICVUE_CLUSTERS=[{\"name\": \"dev cluster\", \"uri\": \"http://localhost:9200\"}]"
depends_on:
elasticsearch:
condition: service_healthy
ulimits:
memlock:
soft: -1 # Set memlock to unlimited (no soft or hard limit)
hard: -1
nofile:
soft: 65536 # Maximum number of open files for the opensearch user - set to at least 65536
hard: 65536
mailcatcher:
image: maildev/maildev:2.2.1
ports:
- "1081:1080"
- "8904:1080"
# minio:
# user: ${DOCKER_USER:-1000}
@@ -87,16 +89,14 @@ services:
args:
DOCKER_USER: ${DOCKER_USER:-1000}
user: ${DOCKER_USER:-1000}
image: st-messages:backend-development
environment:
- PYLINTHOME=/app/.pylint.d
- DJANGO_CONFIGURATION=Development
env_file:
- env.d/development/common
- env.d/development/backend
- env.d/development/postgresql
- env.d/development/backend.defaults
- env.d/development/backend.local
ports:
- "8071:8000"
- "8901:8000"
volumes:
- ./src/backend:/app
- ./data/static:/data/static
@@ -108,7 +108,7 @@ services:
condition: service_started
mta-out:
condition: service_started
elasticsearch:
opensearch:
condition: service_healthy
keycloak:
condition: service_started
@@ -116,6 +116,30 @@ services:
# createbuckets:
# condition: service_started
backend-db:
profiles:
- tools
build:
context: src/backend
target: runtime-dev
args:
DOCKER_USER: ${DOCKER_USER:-1000}
user: ${DOCKER_USER:-1000}
environment:
- DJANGO_CONFIGURATION=DevelopmentMinimal
env_file:
- env.d/development/backend.defaults
- env.d/development/backend.local
ports:
- "8901:8000"
volumes:
- ./src/backend:/app
- ./data/static:/data/static
depends_on:
postgresql:
condition: service_healthy
restart: true
backend-poetry:
profiles:
- tools
@@ -126,7 +150,6 @@ services:
target: poetry
pull_policy: build
celery-dev:
build:
context: src/backend
@@ -134,14 +157,12 @@ services:
args:
DOCKER_USER: ${DOCKER_USER:-1000}
user: ${DOCKER_USER:-1000}
image: st-messages:backend-development
command: ["celery", "-A", "messages.celery_app", "worker", "-l", "DEBUG"]
environment:
- DJANGO_CONFIGURATION=Development
env_file:
- env.d/development/common
- env.d/development/backend
- env.d/development/postgresql
- env.d/development/backend.defaults
- env.d/development/backend.local
volumes:
- ./src/backend:/app
- ./data/static:/data/static
@@ -161,37 +182,13 @@ services:
- FLOWER_UNAUTHENTICATED_API=true
- DJANGO_CONFIGURATION=Development
env_file:
- env.d/development/common
- env.d/development/backend
- env.d/development/postgresql
- env.d/development/backend.defaults
- env.d/development/backend.local
volumes:
- ./src/backend:/app
ports:
- "5556:5556"
command: celery -A messages.celery_app flower --port=5556
# app:
# build:
# context: .
# target: backend-production
# args:
# DOCKER_USER: ${DOCKER_USER:-1000}
# user: ${DOCKER_USER:-1000}
# image: st-messages:backend-production
# environment:
# - DJANGO_CONFIGURATION=Production
# env_file:
# - env.d/development/common
# - env.d/development/backend
# - env.d/development/postgresql
# depends_on:
# postgresql:
# condition: service_healthy
# restart: true
# redis:
# condition: service_started
# #minio:
# # condition: service_started
- "8903:8803"
command: celery -A messages.celery_app flower --port=8803
# nginx:
# image: nginx:1.25
@@ -210,15 +207,14 @@ services:
build:
context: .
dockerfile: ./src/frontend/Dockerfile.dev
environment:
- NEXT_PUBLIC_API_ORIGIN=http://localhost:8071
- NEXT_PUBLIC_S3_DOMAIN_REPLACE=http://localhost:9000
image: st-messages:frontend-development
env_file:
- env.d/development/frontend.defaults
- env.d/development/frontend.local
command: ["npm", "run", "dev"]
volumes:
- ./src/frontend/:/home/frontend/
ports:
- "3000:3000"
- "8900:3000"
frontend-tools:
user: "${DOCKER_USER:-1000}"
@@ -241,61 +237,96 @@ services:
- ./src/backend/core/api/openapi.json:/home/backend/core/api/openapi.json
- ./src/frontend/:/home/frontend/
# frontend:
# user: "${DOCKER_USER:-1000}"
# build:
# context: .
# dockerfile: ./src/frontend/Dockerfile
# target: frontend-production
# args:
# API_ORIGIN: "http://localhost:8071"
# S3_DOMAIN_REPLACE: "http://localhost:9000"
# image: st-messages:frontend-production
# ports:
# - "3001:3000"
crowdin:
image: crowdin/cli:3.16.0
volumes:
- ".:/app"
env_file:
- env.d/development/crowdin
user: "${DOCKER_USER:-1000}"
working_dir: /app
# node:
# image: node:22
# user: "${DOCKER_USER:-1000}"
# environment:
# HOME: /tmp
# crowdin:
# image: crowdin/cli:3.16.0
# volumes:
# - ".:/app"
# env_file:
# - env.d/development/crowdin
# user: "${DOCKER_USER:-1000}"
# working_dir: /app
mta-in:
build:
context: src/mta-in/postfix
dockerfile: Dockerfile
context: src/mta-in
target: runtime-prod
env_file:
- env.d/development/common
- env.d/development/mta-in
- env.d/development/mta-in.defaults
- env.d/development/mta-in.local
ports:
- "8025:25"
- "8910:25"
depends_on:
- backend-dev
mta-out:
image: mta-out:latest
mta-in-test:
profiles:
- tools
build:
context: src/mta-out/postfix
dockerfile: Dockerfile
context: src/mta-in
target: runtime-dev
env_file:
- env.d/development/mta-out
- env.d/development/mta-in.defaults
- env.d/development/mta-in.local
environment:
- EXEC_CMD=true
- MDA_API_BASE_URL=http://localhost:8000/api/mail/
- MTA_HOST=localhost
command: pytest -vvs tests/
volumes:
- ./src/mta-in:/app
mta-in-poetry:
profiles:
- tools
volumes:
- ./src/mta-in:/app
build:
context: src/mta-in
target: poetry
pull_policy: build
mta-out:
build:
context: src/mta-out
target: runtime-prod
env_file:
- env.d/development/mta-out.defaults
- env.d/development/mta-out.local
ports:
- "8587:587"
- "8911:587"
depends_on:
mailcatcher:
condition: service_started
mta-out-test:
profiles:
- tools
build:
context: src/mta-out
target: runtime-dev
env_file:
- env.d/development/mta-out.defaults
- env.d/development/mta-out.local
environment:
- EXEC_CMD=true
- MTA_OUT_HOST=localhost:587
- MTA_OUT_SMTP_USERNAME=user
- MTA_OUT_SMTP_PASSWORD=pass
- SMTP_RELAY_HOST=localhost:2525
command: pytest -vvs tests/
volumes:
- ./src/mta-out:/app
mta-out-poetry:
profiles:
- tools
volumes:
- ./src/mta-out:/app
build:
context: src/mta-out
target: poetry
pull_policy: build
keycloak:
image: quay.io/keycloak/keycloak:26.2.5
volumes:
@@ -306,24 +337,13 @@ services:
- --features=preview
- --import-realm
- --proxy=edge
- --hostname=http://localhost:8083
- --hostname-admin=http://localhost:8083/
- --http-port=8083
environment:
KC_BOOTSTRAP_ADMIN_USERNAME: admin
KC_BOOTSTRAP_ADMIN_PASSWORD: admin
KC_DB: postgres
KC_DB_URL_HOST: postgresql
KC_DB_URL_DATABASE: messages
KC_DB_PASSWORD: pass
KC_DB_USERNAME: user
KC_DB_SCHEMA: public
KC_HOSTNAME_STRICT: false
KC_HOSTNAME_STRICT_HTTPS: false
KC_HTTP_ENABLED: true
KC_HEALTH_ENABLED: true
PROXY_ADDRESS_FORWARDING: "true"
- --hostname=http://localhost:8902
- --hostname-admin=http://localhost:8902/
- --http-port=8802
env_file:
- env.d/development/keycloak.defaults
- env.d/development/keycloak.local
ports:
- "8083:8083"
- "8902:8802"
depends_on:
- postgresql

View File

@@ -17,7 +17,7 @@
- **Django REST Framework**: Main API service handling business logic, including email processing
- **Celery Workers**: Asynchronous task processing for heavy operations
- **Search Service**: Elasticsearch integration for full-text search
- **Search Service**: OpenSearch integration for full-text search
### Mail Transfer Layer
@@ -29,7 +29,7 @@
- **PostgreSQL**: Primary relational database for all structured data
- **Redis**: Caching layer and Celery message broker
- **Elasticsearch**: Full-text search index for messages and threads
- **OpenSearch**: Full-text search index for messages and threads
- **S3-Compatible Storage**: File and attachment storage (In progress)
### Authentication & Authorization
@@ -44,7 +44,7 @@
1. External email arrives at **MTA-In** via SMTP
2. **MTA-In** validates recipients against Django backend
3. **MDA** parses and stores messages in PostgreSQL
4. **Celery** tasks index content in Elasticsearch
4. **Celery** tasks index content in OpenSearch
5. Users see new messages in real-time via frontend
### Outbound Email Processing
@@ -58,7 +58,7 @@
### Search Operations
1. User submits search query via frontend
2. Backend directly queries Elasticsearch for real-time results
2. Backend directly queries OpenSearch for real-time results
3. Results are ranked and filtered by permissions
4. Frontend displays paginated results
@@ -66,7 +66,7 @@
1. New messages/threads are saved to PostgreSQL
2. Backend queues indexing tasks to Celery
3. Celery workers asynchronously index content in Elasticsearch
3. Celery workers asynchronously index content in OpenSearch
4. Heavy operations (bulk imports, reindexing) are handled via Celery
## Key Features
@@ -82,7 +82,7 @@
- **Microservices Architecture**: Independent scaling of components
- **Async Processing**: Non-blocking operations via Celery
- **Caching Strategy**: Redis for session and query caching
- **Search Optimization**: Elasticsearch for fast full-text search
- **Search Optimization**: OpenSearch for fast full-text search
### Development Experience

View File

@@ -2,6 +2,26 @@
This document provides a comprehensive overview of all environment variables used in the Messages application. These variables are organized by service and functionality.
## Development Environment
### Environment Files Structure
The application uses a new environment file structure with `.defaults` and `.local` files:
- `*.defaults` - Committed default configurations
- `*.local` - Gitignored local overrides (created by `make bootstrap`)
#### Available Environment Files
- `backend.defaults` - Main Django application settings
- `common.defaults` - Shared settings across services
- `frontend.defaults` - Frontend configuration
- `postgresql.defaults` - PostgreSQL database configuration
- `keycloak.defaults` - Keycloak configuration
- `mta-in.defaults` - Inbound mail server settings
- `mta-out.defaults` - Outbound mail server settings
- `crowdin.defaults` - Translation service configuration
## Core Application Configuration
### Django Settings
@@ -22,16 +42,16 @@ This document provides a comprehensive overview of all environment variables use
|----------|---------|-------------|----------|
| `DATABASE_URL` | None | Complete database URL (overrides individual DB_* vars) | Optional |
| `DB_ENGINE` | `django.db.backends.postgresql_psycopg2` | Database engine | Optional |
| `DB_HOST` | `localhost` | Database hostname | Optional |
| `DB_HOST` | `postgresql` | Database hostname (container name) | Optional |
| `DB_NAME` | `messages` | Database name | Optional |
| `DB_USER` | `dbuser` | Database username | Optional |
| `DB_PASSWORD` | `dbpass` | Database password | Optional |
| `DB_USER` | `user` | Database username | Optional |
| `DB_PASSWORD` | `pass` | Database password | Optional |
| `DB_PORT` | `5432` | Database port | Optional |
#### PostgreSQL (Keycloak)
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `POSTGRES_DB` | `keycloak` | Keycloak database name | Dev |
| `POSTGRES_DB` | `messages` | Keycloak database name | Dev |
| `POSTGRES_USER` | `user` | Keycloak database user | Dev |
| `POSTGRES_PASSWORD` | `pass` | Keycloak database password | Dev |
@@ -39,17 +59,19 @@ This document provides a comprehensive overview of all environment variables use
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `REDIS_URL` | `redis://redis:6379` | Redis connection URL | Optional |
| `CELERY_BROKER_URL` | `redis://redis:6379` | Celery message broker URL | Optional |
| `REDIS_URL` | `redis://redis:6379` | Redis connection URL (internal) | Optional |
| `CELERY_BROKER_URL` | `redis://redis:6379` | Celery message broker URL (internal) | Optional |
| `CACHES_DEFAULT_TIMEOUT` | `30` | Default cache timeout in seconds | Optional |
### Elasticsearch Configuration
**Note**: For external Redis access, use `localhost:8913`. For internal container communication, use `redis:6379`.
### OpenSearch Configuration
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `ELASTICSEARCH_URL` | `["http://elasticsearch:9200"]` | Elasticsearch hosts list | Optional |
| `ELASTICSEARCH_TIMEOUT` | `20` | Elasticsearch query timeout | Optional |
| `ELASTICSEARCH_INDEX_THREADS` | `True` | Enable thread indexing | Optional |
| `OPENSEARCH_URL` | `["http://opensearch:9200"]` | OpenSearch hosts list | Optional |
| `OPENSEARCH_TIMEOUT` | `20` | OpenSearch query timeout | Optional |
| `OPENSEARCH_INDEX_THREADS` | `True` | Enable thread indexing | Optional |
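With the stack running, connectivity to OpenSearch can be spot-checked from inside the compose network. This sketch only assembles the cluster-health URL from the default internal host; the `backend-dev` service name is taken from the development compose file and may differ in your setup:

```shell
# Build the cluster-health URL from the default internal OpenSearch host.
OPENSEARCH_HOST="${OPENSEARCH_HOST:-http://opensearch:9200}"
HEALTH_URL="${OPENSEARCH_HOST}/_cluster/health"
echo "$HEALTH_URL"
# With the dev stack up, run the check from inside the compose network, e.g.:
#   docker compose exec backend-dev curl -s "$HEALTH_URL"
```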
## Mail Processing Configuration
@@ -57,26 +79,26 @@ This document provides a comprehensive overview of all environment variables use
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `MTA_OUT_HOST` | None | Outbound SMTP server host | Required |
| `MTA_OUT_SMTP_USERNAME` | None | Outbound SMTP username | Optional |
| `MTA_OUT_SMTP_PASSWORD` | None | Outbound SMTP password | Optional |
| `MTA_OUT_HOST` | `mta-out:587` | Outbound SMTP server host | Required |
| `MTA_OUT_SMTP_USERNAME` | `user` | Outbound SMTP username | Optional |
| `MTA_OUT_SMTP_PASSWORD` | `pass` | Outbound SMTP password | Optional |
| `MTA_OUT_SMTP_USE_TLS` | `True` | Use TLS for outbound SMTP | Optional |
| `MDA_API_SECRET` | `default-mda-api-secret` | Shared secret for MDA API | Required |
| `MDA_API_BASE_URL` | None | Base URL for MDA API | Dev |
| `MDA_API_SECRET` | `my-shared-secret-mda` | Shared secret for MDA API | Required |
| `MDA_API_BASE_URL` | `http://backend-dev:8000/api/v1.0/mta/` | Base URL for MDA API | Dev |
### MTA-Out Specific
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `SMTP_RELAY_HOST` | None | SMTP relay server | Dev |
| `SMTP_USERNAME` | None | SMTP authentication username | Dev |
| `SMTP_PASSWORD` | None | SMTP authentication password | Dev |
| `SMTP_RELAY_HOST` | `mailcatcher:1025` | SMTP relay server | Dev |
| `SMTP_USERNAME` | `user` | SMTP authentication username | Dev |
| `SMTP_PASSWORD` | `pass` | SMTP authentication password | Dev |
### Email Domain Configuration
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `MESSAGES_TESTDOMAIN` | `localhost` | Test domain for development | Dev |
| `MESSAGES_TESTDOMAIN_MAPPING_BASEDOMAIN` | `gouv.fr` | Base domain mapping | Dev |
| `MESSAGES_TESTDOMAIN` | `example.local` | Test domain for development | Dev |
| `MESSAGES_TESTDOMAIN_MAPPING_BASEDOMAIN` | `example.com` | Base domain mapping | Dev |
| `MESSAGES_ACCEPT_ALL_EMAILS` | `False` | Accept emails to any domain | Optional |
### DKIM Configuration
@@ -94,20 +116,20 @@ This document provides a comprehensive overview of all environment variables use
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `AWS_S3_ENDPOINT_URL` | None | S3 endpoint URL | Optional |
| `AWS_S3_ACCESS_KEY_ID` | None | S3 access key | Optional |
| `AWS_S3_SECRET_ACCESS_KEY` | None | S3 secret key | Optional |
| `AWS_S3_ENDPOINT_URL` | `http://minio:9000` | S3 endpoint URL | Optional |
| `AWS_S3_ACCESS_KEY_ID` | `messages` | S3 access key | Optional |
| `AWS_S3_SECRET_ACCESS_KEY` | `password` | S3 secret key | Optional |
| `AWS_S3_REGION_NAME` | None | S3 region | Optional |
| `AWS_STORAGE_BUCKET_NAME` | `st-messages-media-storage` | S3 bucket name | Optional |
| `AWS_S3_UPLOAD_POLICY_EXPIRATION` | `86400` | Upload policy expiration (24h) | Optional |
| `MEDIA_BASE_URL` | None | Base URL for media files | Optional |
| `MEDIA_BASE_URL` | `http://localhost:8902` | Base URL for media files | Optional |
| `ITEM_FILE_MAX_SIZE` | `5368709120` | Max file size (5GB) | Optional |
### Static Files
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `STORAGES_STATICFILES_BACKEND` | `whitenoise.storage.CompressedManifestStaticFilesStorage` | Static files storage backend | Optional |
| `STORAGES_STATICFILES_BACKEND` | `django.contrib.staticfiles.storage.StaticFilesStorage` | Static files storage backend | Optional |
## Authentication & Authorization
@@ -116,14 +138,14 @@ This document provides a comprehensive overview of all environment variables use
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `OIDC_CREATE_USER` | `False` | Automatically create users from OIDC | Optional |
| `OIDC_RP_CLIENT_ID` | `st_messages` | OIDC client ID | Required |
| `OIDC_RP_CLIENT_SECRET` | None | OIDC client secret | Required |
| `OIDC_RP_CLIENT_ID` | `messages` | OIDC client ID | Required |
| `OIDC_RP_CLIENT_SECRET` | `ThisIsAnExampleKeyForDevPurposeOnly` | OIDC client secret | Required |
| `OIDC_RP_SIGN_ALGO` | `RS256` | OIDC signing algorithm | Optional |
| `OIDC_RP_SCOPES` | `openid email` | OIDC scopes | Optional |
| `OIDC_OP_JWKS_ENDPOINT` | None | OIDC JWKS endpoint | Required |
| `OIDC_OP_AUTHORIZATION_ENDPOINT` | None | OIDC authorization endpoint | Required |
| `OIDC_OP_TOKEN_ENDPOINT` | None | OIDC token endpoint | Required |
| `OIDC_OP_USER_ENDPOINT` | None | OIDC user info endpoint | Required |
| `OIDC_OP_JWKS_ENDPOINT` | `http://keycloak:8000/realms/messages/protocol/openid-connect/certs` | OIDC JWKS endpoint | Required |
| `OIDC_OP_AUTHORIZATION_ENDPOINT` | `http://localhost:8902/realms/messages/protocol/openid-connect/auth` | OIDC authorization endpoint | Required |
| `OIDC_OP_TOKEN_ENDPOINT` | `http://keycloak:8000/realms/messages/protocol/openid-connect/token` | OIDC token endpoint | Required |
| `OIDC_OP_USER_ENDPOINT` | `http://keycloak:8000/realms/messages/protocol/openid-connect/userinfo` | OIDC user info endpoint | Required |
| `OIDC_OP_LOGOUT_ENDPOINT` | None | OIDC logout endpoint | Optional |
### OIDC Advanced Settings
@@ -132,19 +154,19 @@ This document provides a comprehensive overview of all environment variables use
|----------|---------|-------------|----------|
| `OIDC_USE_NONCE` | `True` | Use nonce in OIDC flow | Optional |
| `OIDC_REDIRECT_REQUIRE_HTTPS` | `False` | Require HTTPS for redirects | Optional |
| `OIDC_REDIRECT_ALLOWED_HOSTS` | `[]` | Allowed redirect hosts | Optional |
| `OIDC_REDIRECT_ALLOWED_HOSTS` | `["http://localhost:8902", "http://localhost:8900"]` | Allowed redirect hosts | Optional |
| `OIDC_STORE_ID_TOKEN` | `True` | Store ID token | Optional |
| `OIDC_FALLBACK_TO_EMAIL_FOR_IDENTIFICATION` | `True` | Use email as fallback identifier | Optional |
| `OIDC_ALLOW_DUPLICATE_EMAILS` | `False` | Allow duplicate emails (⚠️ Security risk) | Optional |
| `OIDC_AUTH_REQUEST_EXTRA_PARAMS` | `{}` | Extra parameters for auth requests | Optional |
| `OIDC_AUTH_REQUEST_EXTRA_PARAMS` | `{"acr_values": "eidas1"}` | Extra parameters for auth requests | Optional |
### Authentication URLs
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `LOGIN_REDIRECT_URL` | None | Post-login redirect URL | Optional |
| `LOGIN_REDIRECT_URL_FAILURE` | None | Login failure redirect URL | Optional |
| `LOGOUT_REDIRECT_URL` | None | Post-logout redirect URL | Optional |
| `LOGIN_REDIRECT_URL` | `http://localhost:8900` | Post-login redirect URL | Optional |
| `LOGIN_REDIRECT_URL_FAILURE` | `http://localhost:8900` | Login failure redirect URL | Optional |
| `LOGOUT_REDIRECT_URL` | `http://localhost:8900` | Post-logout redirect URL | Optional |
| `ALLOW_LOGOUT_GET_METHOD` | `True` | Allow GET method for logout | Optional |
### User Mapping
@@ -159,10 +181,10 @@ This document provides a comprehensive overview of all environment variables use
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `CORS_ALLOW_ALL_ORIGINS` | `False` | Allow all CORS origins | Optional |
| `CORS_ALLOW_ALL_ORIGINS` | `True` | Allow all CORS origins | Optional |
| `CORS_ALLOWED_ORIGINS` | `[]` | Specific allowed CORS origins | Optional |
| `CORS_ALLOWED_ORIGIN_REGEXES` | `[]` | Regex patterns for allowed origins | Optional |
| `CSRF_TRUSTED_ORIGINS` | `[]` | Trusted origins for CSRF | Optional |
| `CSRF_TRUSTED_ORIGINS` | `["http://localhost:8900", "http://localhost:8901"]` | Trusted origins for CSRF | Optional |
| `SERVER_TO_SERVER_API_TOKENS` | `[]` | API tokens for server-to-server auth | Optional |
## Monitoring & Observability
@@ -205,9 +227,9 @@ This document provides a comprehensive overview of all environment variables use
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `FRONTEND_THEME` | None | Frontend theme identifier | Optional |
| `NEXT_PUBLIC_API_ORIGIN` | None | Frontend API origin | Dev |
| `NEXT_PUBLIC_S3_DOMAIN_REPLACE` | None | S3 domain replacement for frontend | Dev |
| `FRONTEND_THEME` | `dsfr` | Frontend theme identifier | Optional |
| `NEXT_PUBLIC_API_ORIGIN` | `http://localhost:8901` | Frontend API origin | Dev |
| `NEXT_PUBLIC_S3_DOMAIN_REPLACE` | `http://localhost:9000` | S3 domain replacement for frontend | Dev |
## Development Tools
@@ -215,8 +237,8 @@ This document provides a comprehensive overview of all environment variables use
| Variable | Default | Description | Required |
|----------|---------|-------------|----------|
| `CROWDIN_PERSONAL_TOKEN` | None | Crowdin API token | Dev |
| `CROWDIN_PROJECT_ID` | None | Crowdin project ID | Dev |
| `CROWDIN_PERSONAL_TOKEN` | `Your-Personal-Token` | Crowdin API token | Dev |
| `CROWDIN_PROJECT_ID` | `Your-Project-Id` | Crowdin project ID | Dev |
| `CROWDIN_BASE_PATH` | `/app/src` | Base path for translations | Dev |
## Application Settings
@@ -244,19 +266,29 @@ This document provides a comprehensive overview of all environment variables use
The application uses environment files located in `env.d/development/` for different services:
- `backend.dist` - Main Django application settings
- `common.dist` - Shared settings across services
- `postgresql.dist` - PostgreSQL database configuration
- `kc_postgresql.dist` - Keycloak database configuration
- `mta-in.dist` - Inbound mail server settings
- `mta-out.dist` - Outbound mail server settings
- `crowdin.dist` - Translation service configuration
- `backend.defaults` - Main Django application settings
- `common.defaults` - Shared settings across services
- `frontend.defaults` - Frontend configuration
- `postgresql.defaults` - PostgreSQL database configuration
- `keycloak.defaults` - Keycloak configuration
- `mta-in.defaults` - Inbound mail server settings
- `mta-out.defaults` - Outbound mail server settings
- `crowdin.defaults` - Translation service configuration
### Local Overrides
The `make bootstrap` command creates empty `.local` files for each service with a comment header:
```
# Put your local-specific, gitignored env vars here
```
These files are gitignored and allow for local development customizations without affecting the repository.
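The override semantics can be sketched in plain shell: compose loads `*.defaults` first and `*.local` second, so later assignments win. The file contents below are illustrative, using the `DB_HOST`/`DB_NAME` values from the committed defaults:

```shell
# Emulate compose's env_file layering: defaults first, then local overrides.
mkdir -p /tmp/env-demo && cd /tmp/env-demo
cat > backend.defaults <<'EOF'
DB_HOST=postgresql
DB_NAME=messages
EOF
cat > backend.local <<'EOF'
# Put your local-specific, gitignored env vars here
DB_HOST=127.0.0.1
EOF
set -a                    # export everything we source
. ./backend.defaults
. ./backend.local         # sourced second, so its DB_HOST wins
set +a
echo "$DB_HOST"           # prints the local override, not the default
```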
## Security Notes
⚠️ **Important Security Considerations:**
1. **Never commit actual secrets** - Use `.dist` files as templates
1. **Never commit actual secrets** - Use `.local` files only
2. **OIDC_ALLOW_DUPLICATE_EMAILS** - Should remain `False` in production
3. **CORS_ALLOW_ALL_ORIGINS** - Should be `False` in production
4. **DJANGO_SECRET_KEY** - Must be unique and secret in production

259
docs/self-hosting.md Normal file
View File

@@ -0,0 +1,259 @@
# Self-Hosting Guide
This guide explains how to deploy Messages in production, focusing on Messages-specific configuration and architecture.
## Overview
Messages is designed to be self-hosted. See [Architecture](./architecture.md) for component details.
## Prerequisites
- **Domain name(s)** for your email service
- **SSL certificates** for your domains
- **Server resources**: Minimum 4GB RAM
## Deployment Options
Messages supports multiple deployment strategies depending on your infrastructure and expertise level:
### Docker Compose (Recommended for most users)
**Best for**: Small to medium deployments, single-server setups, quick prototyping
**Requirements**:
- Docker and Docker Compose installed
- Single server or VM with sufficient resources
- Basic Docker knowledge
**Process**:
1. Start from the `compose.yaml` in the repository
2. Create production environment files (`env.d/production/*.defaults`)
3. Deploy to any environment where Docker Compose runs
4. Configure DNS and SSL certificates
**Advantages**:
- Simplest setup and maintenance
- Easy to understand and modify
- Quick deployment and updates
- Good for development and testing
### Ansible Deployment
**Best for**: Multi-server deployments, infrastructure automation, production environments
**Requirements**:
- Ansible knowledge
- Target servers with Docker support
- Infrastructure automation experience
**Process**:
1. Use our [ST Ansible repository](https://github.com/suitenumerique/st-ansible) as a base
2. Customize playbooks for your infrastructure
3. Deploy across multiple servers with automation
4. Configure monitoring and backup strategies
**Advantages**:
- Infrastructure as code
- Automated deployment and updates
- Multi-server support
- Production-ready with monitoring
### Kubernetes Deployment
**Best for**: Large-scale deployments, cloud-native environments, enterprise setups
**Requirements**:
- Kubernetes cluster
- Helm knowledge (when charts become available)
- Container orchestration experience
**Process**:
1. Wait for Helm charts (coming in future releases)
2. Deploy to Kubernetes cluster
3. Configure ingress controllers and load balancers
4. Set up monitoring with Prometheus/Grafana
**Advantages**:
- High availability and scalability
- Advanced orchestration features
- Cloud-native deployment patterns
- Enterprise-grade monitoring and logging
**Note**: Kubernetes deployment might be supported in future releases with official Helm charts.
## Messages-Specific Configuration
### 1. Technical Domain Setup
Messages uses a dedicated technical domain for its DNS infrastructure:
```bash
# Set your technical domain
MESSAGES_TECHNICAL_DOMAIN=mail.yourdomain.com
```
**Technical Domain DNS Records:**
```
mx1.mail.yourdomain.com. A YOUR_SERVER_IP
mx2.mail.yourdomain.com. A YOUR_SERVER_IP
_spf.mail.yourdomain.com. TXT "v=spf1 ip4:YOUR_SERVER_IP -all"
```
**Customer Domain DNS Records:**
```
@ MX 10 mx1.mail.yourdomain.com.
@ MX 20 mx2.mail.yourdomain.com.
@ TXT "v=spf1 include:_spf.mail.yourdomain.com -all"
_dmarc TXT "v=DMARC1; p=reject; adkim=s; aspf=s;"
```
The DNS records for each customer domain are available either via the API at http://localhost:8901/api/v1.0/maildomains/{maildomain-uuid}/ or in the admin interface at http://localhost:8900/domains
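A quick way to pull those records during setup is sketched below; the UUID is a placeholder, the endpoint path comes from the line above, and authentication details depend on your deployment:

```shell
# Assemble the maildomain detail URL (UUID below is a placeholder).
MAILDOMAIN_UUID="00000000-0000-0000-0000-000000000000"
API_URL="http://localhost:8901/api/v1.0/maildomains/${MAILDOMAIN_UUID}/"
echo "$API_URL"
# With the stack running and valid credentials, fetch the expected records:
#   curl -s "$API_URL" | python -m json.tool
# Then verify what is actually published, e.g.:
#   dig +short MX customer-domain.com
```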
### 2. Environment Configuration
Messages uses environment variables as the primary configuration method:
**Environment File Structure:**
- `env.d/production/backend.defaults` - Main Django application settings
- `env.d/production/frontend.defaults` - Frontend configuration
- `env.d/production/mta-in.defaults` - Inbound mail server settings
- `env.d/production/mta-out.defaults` - Outbound mail server settings
- `env.d/production/postgresql.defaults` - Database configuration
- `env.d/production/keycloak.defaults` - Identity provider settings
**For detailed environment variable documentation, see [Environment Variables](./env.md).**
### 3. MTA Configuration
#### MTA-in (Inbound Email)
- Configured via `env.d/production/mta-in.defaults`
- Uses custom milter for synchronous delivery during SMTP sessions
- Validates recipients via REST API before accepting messages
#### MTA-out (Outbound Email)
- Configured via `env.d/production/mta-out.defaults`
- Supports relay configuration for external SMTP providers
- Requires TLS certificates for production
### 4. DNS Management
Messages includes automated DNS management:
```bash
# Check DNS records for a customer domain
python manage.py dns_check --domain example.com
# Provision DNS records automatically
python manage.py dns_provision --domain example.com --provider scaleway
# Simulate provisioning without making changes
python manage.py dns_provision --domain example.com --pretend
```
**Supported DNS Providers:**
- Scaleway DNS (full automation support)
### 5. Domain and Mailbox Management
#### Creating Mail Domains
```bash
# Via Django admin at /admin/
# Via API endpoints
# Via management commands
python manage.py shell
>>> from core.models import MailDomain
>>> MailDomain.objects.create(name='customer-domain.com')
```
#### Mailbox Creation
- Manual creation through admin interface
- Automatic creation via OIDC integration
- Programmatic creation via API
### 6. Identity Management
Messages uses OpenID Connect (OIDC) for user authentication. This is the only authentication method supported.
**OIDC Configuration Options:**
1. **Bundled Keycloak** (Recommended for most deployments)
- Keycloak is included in the default Docker Compose setup
- Pre-configured with Messages realm and users
- Suitable for organizations wanting a self-hosted identity provider
- Configure via `env.d/production/keycloak.defaults`
2. **External OIDC Provider**
- Use any OIDC-compliant identity provider
- Examples: Auth0, Okta, Azure AD, Google Workspace
- Configure via `env.d/production/backend.defaults`
- Requires proper OIDC endpoint configuration
**User Management:**
- Users are created automatically when they first log in via OIDC
- Mailboxes can be created automatically based on OIDC email addresses
### 7. Production Deployment
For production deployment, create your own Docker Compose configuration based on `compose.yaml`:
**Key Considerations:**
- Use production environment files (`env.d/production/*.defaults`)
- Configure SSL/TLS certificates
- Set up persistent volumes for databases
- Implement proper restart policies
- Configure reverse proxy (nginx) for SSL termination
## Security Considerations
### Messages-Specific Security
- **MDA API Secret**: Use a strong, unique `MDA_API_SECRET`
- **OIDC Configuration**: Properly configure Keycloak endpoints
- **Technical Domain**: Secure DNS records for your technical domain
- **Environment Files**: Never commit production secrets
### IP Reputation Management
**Monitoring:**
- Check your server's IP reputation at [MXToolbox](https://mxtoolbox.com/blacklists.aspx)
- Monitor key blacklists: Spamhaus, Barracuda, SORBS
**Recovery from Blacklisting:**
1. Stop all outgoing email immediately
2. Check server logs for abuse indicators
3. Follow blacklist's delisting procedure
4. Implement stricter authentication and rate limiting
## Troubleshooting
### Common Messages Issues
1. **MTA-in not receiving emails**
- Check firewall settings for port 25
- Verify DNS MX records point to your technical domain
- Check MTA-in logs for API connection issues
2. **MTA-out not sending emails**
- Verify SMTP credentials in environment files
- Check relay host configuration
- Review MTA-out logs for authentication errors
3. **DNS issues**
- Use `dns_check` command to verify records
- Ensure technical domain A records are correct
- Check DNS propagation with `dig`
4. **Authentication problems**
- Verify Keycloak configuration in environment files
- Check OIDC endpoint URLs
- Review backend logs for authentication errors
## Next Steps
After setting up your production environment:
1. **Test thoroughly** with a small group of users
2. **Monitor performance** and adjust resources as needed
3. **Set up automated backups** and monitoring
4. **Plan for scaling** as your user base grows
For additional help, join the [Matrix community](https://matrix.to/#/#messages-official:matrix.org)!

View File

@@ -1,3 +1,10 @@
# App database configuration
DB_HOST=postgresql
DB_NAME=messages
DB_USER=user
DB_PASSWORD=pass
DB_PORT=5432
# Django
DJANGO_ALLOWED_HOSTS=*
DJANGO_SECRET_KEY=ThisIsAnExampleKeyForDevPurposeOnly
@@ -18,7 +25,7 @@ PYTHONPATH=/app
# Mail
DJANGO_EMAIL_BRAND_NAME="La Suite territoriale"
DJANGO_EMAIL_HOST="mailcatcher"
DJANGO_EMAIL_LOGO_IMG="http://localhost:3000/assets/logo-suite-numerique.png"
DJANGO_EMAIL_LOGO_IMG="http://localhost:8900/assets/logo-suite-numerique.png"
DJANGO_EMAIL_PORT=1025
# Media
@@ -26,31 +33,33 @@ STORAGES_STATICFILES_BACKEND=django.contrib.staticfiles.storage.StaticFilesStora
AWS_S3_ENDPOINT_URL=http://minio:9000
AWS_S3_ACCESS_KEY_ID=messages
AWS_S3_SECRET_ACCESS_KEY=password
MEDIA_BASE_URL=http://localhost:8083
MEDIA_BASE_URL=http://localhost:8902
# OIDC
OIDC_OP_JWKS_ENDPOINT=http://keycloak:8083/realms/messages/protocol/openid-connect/certs
OIDC_OP_AUTHORIZATION_ENDPOINT=http://localhost:8083/realms/messages/protocol/openid-connect/auth
OIDC_OP_TOKEN_ENDPOINT=http://keycloak:8083/realms/messages/protocol/openid-connect/token
OIDC_OP_USER_ENDPOINT=http://keycloak:8083/realms/messages/protocol/openid-connect/userinfo
OIDC_OP_JWKS_ENDPOINT=http://keycloak:8802/realms/messages/protocol/openid-connect/certs
OIDC_OP_AUTHORIZATION_ENDPOINT=http://localhost:8902/realms/messages/protocol/openid-connect/auth
OIDC_OP_TOKEN_ENDPOINT=http://keycloak:8802/realms/messages/protocol/openid-connect/token
OIDC_OP_USER_ENDPOINT=http://keycloak:8802/realms/messages/protocol/openid-connect/userinfo
OIDC_RP_CLIENT_ID=messages
OIDC_RP_CLIENT_SECRET=ThisIsAnExampleKeyForDevPurposeOnly
OIDC_RP_SIGN_ALGO=RS256
OIDC_RP_SCOPES="openid email"
LOGIN_REDIRECT_URL=http://localhost:3000
LOGIN_REDIRECT_URL_FAILURE=http://localhost:3000
LOGOUT_REDIRECT_URL=http://localhost:3000
LOGIN_REDIRECT_URL=http://localhost:8900
LOGIN_REDIRECT_URL_FAILURE=http://localhost:8900
LOGOUT_REDIRECT_URL=http://localhost:8900
OIDC_REDIRECT_ALLOWED_HOSTS=["http://localhost:8083", "http://localhost:3000"]
OIDC_REDIRECT_ALLOWED_HOSTS=["http://localhost:8902", "http://localhost:8900"]
OIDC_AUTH_REQUEST_EXTRA_PARAMS={"acr_values": "eidas1"}
# Collaboration
COLLABORATION_API_URL=http://nginx:8083/collaboration/api/
COLLABORATION_SERVER_ORIGIN=http://localhost:3000
COLLABORATION_SERVER_SECRET=my-secret
COLLABORATION_WS_URL=ws://localhost:8083/collaboration/ws/
# keycloak
IDENTITY_PROVIDER=keycloak
KEYCLOAK_REALM=messages
KEYCLOAK_URL=http://keycloak:8802
KEYCLOAK_CLIENT_ID=rest-api
KEYCLOAK_CLIENT_SECRET=ServiceAccountClientSecretForDev
KEYCLOAK_GROUP_PATH_PREFIX=/maildomain-
# Frontend
FRONTEND_THEME=dsfr
@@ -59,5 +68,7 @@ FRONTEND_THEME=dsfr
MESSAGES_TESTDOMAIN=example.local
MESSAGES_TESTDOMAIN_MAPPING_BASEDOMAIN=example.com
MTA_OUT_HOST=mta-out:587
MTA_OUT_SMTP_USERNAME=testuser
MTA_OUT_SMTP_PASSWORD=testpass
MTA_OUT_SMTP_USERNAME=user
MTA_OUT_SMTP_PASSWORD=pass
MDA_API_SECRET=my-shared-secret-mda
SALT_KEY=ThisIsAnExampleSaltForDevPurposeOnly

View File

@@ -0,0 +1,2 @@
NEXT_PUBLIC_API_ORIGIN=http://localhost:8901
NEXT_PUBLIC_S3_DOMAIN_REPLACE=http://localhost:9000

View File

@@ -1,11 +0,0 @@
# Postgresql db container configuration
POSTGRES_DB=keycloak
POSTGRES_USER=user
POSTGRES_PASSWORD=pass
# App database configuration
DB_HOST=kc_postgresql
DB_NAME=keycloak
DB_USER=user
DB_PASSWORD=pass
DB_PORT=5433

View File

@@ -0,0 +1,13 @@
KC_BOOTSTRAP_ADMIN_USERNAME=admin
KC_BOOTSTRAP_ADMIN_PASSWORD=admin
KC_DB=postgres
KC_DB_URL_HOST=postgresql
KC_DB_URL_DATABASE=messages
KC_DB_PASSWORD=pass
KC_DB_USERNAME=user
KC_DB_SCHEMA=public
KC_HOSTNAME_STRICT=false
KC_HOSTNAME_STRICT_HTTPS=false
KC_HTTP_ENABLED=true
KC_HEALTH_ENABLED=true
PROXY_ADDRESS_FORWARDING=true

View File

@@ -1,3 +1,4 @@
MDA_API_BASE_URL=http://backend-dev:8000/api/v1.0/mta/
MDA_API_SECRET=my-shared-secret-mda
MDA_API_TIMEOUT=2
MESSAGE_SIZE_LIMIT=30000000

View File

@@ -0,0 +1,24 @@
# Set to a meaningful FQDN for HELO/EHLO (recommended for production)
MYHOSTNAME=mta-out-test.localhost
# Username & password for connecting to this server
SMTP_USERNAME=user
SMTP_PASSWORD=pass
# Optional: Message Size Limit
# MESSAGE_SIZE_LIMIT=10240000
# POSTFIX_DEBUG=1
# --- Optional: Relay Host Configuration ---
# If SMTP_RELAY_HOST is set, mail is sent via this host instead of direct delivery.
# Example for production relay: SMTP_RELAY_HOST=[smtp.yourprovider.com]:587
SMTP_RELAY_HOST=mailcatcher:1025
# Optional: Credentials for authenticating TO the SMTP_RELAY_HOST (if it requires auth)
# SMTP_RELAY_USERNAME=
# SMTP_RELAY_PASSWORD=
# TLS Configuration (WARNING: Mount real certs/keys in production!)
# TLS_CERT_PATH=/etc/ssl/certs/ssl-cert-snakeoil.pem
# TLS_KEY_PATH=/etc/ssl/private/ssl-cert-snakeoil.key

View File

@@ -1,3 +0,0 @@
SMTP_RELAY_HOST=mailcatcher:1025
SMTP_USERNAME=testuser
SMTP_PASSWORD=testpass

View File

@@ -0,0 +1,4 @@
# Postgresql db container configuration
POSTGRES_DB=messages
POSTGRES_USER=user
POSTGRES_PASSWORD=pass

View File

@@ -1,11 +0,0 @@
# Postgresql db container configuration
POSTGRES_DB=messages
POSTGRES_USER=user
POSTGRES_PASSWORD=pass
# App database configuration
DB_HOST=postgresql
DB_NAME=messages
DB_USER=user
DB_PASSWORD=pass
DB_PORT=5432

View File

@@ -1,8 +1,11 @@
# https://hub.docker.com/_/python
FROM python:3.13.3-slim-bookworm AS base
FROM python:3.13.5-slim-bookworm AS base
# Bump this to force an update of the apt repositories
ENV MIN_UPDATE_DATE="2025-05-09"
ENV MIN_UPDATE_DATE="2025-07-14"
RUN apt-get update && apt-get upgrade \
&& rm -rf /var/lib/apt/lists/*
ENV PYTHONUNBUFFERED=1

View File

@@ -1,7 +1,5 @@
"""Admin classes and registrations for core app."""
import hashlib
from django.contrib import admin, messages
from django.contrib.auth import admin as auth_admin
from django.shortcuts import redirect
@@ -41,7 +39,7 @@ def reset_keycloak_password_action(_, request, queryset):
success_count += 1
# pylint: disable=broad-except
except Exception as e: # noqa: BLE001
except Exception as e:
messages.error(request, f"Failed to reset password for {mailbox}: {str(e)}")
error_count += 1
@@ -78,7 +76,6 @@ class UserAdmin(auth_admin.UserAdmin):
"sub",
"email",
"full_name",
"short_name",
"language",
"timezone",
)
@@ -89,7 +86,6 @@ class UserAdmin(auth_admin.UserAdmin):
{
"fields": (
"is_active",
"is_device",
"is_staff",
"is_superuser",
"groups",
@@ -117,16 +113,14 @@ class UserAdmin(auth_admin.UserAdmin):
"is_active",
"is_staff",
"is_superuser",
"is_device",
"created_at",
"updated_at",
)
list_filter = ("is_staff", "is_superuser", "is_device", "is_active")
list_filter = ("is_staff", "is_superuser", "is_active")
ordering = (
"is_active",
"-is_superuser",
"-is_staff",
"-is_device",
"-updated_at",
"full_name",
)
@@ -135,7 +129,6 @@ class UserAdmin(auth_admin.UserAdmin):
"sub",
"email",
"full_name",
"short_name",
"created_at",
"updated_at",
)
@@ -221,6 +214,7 @@ class ThreadAdmin(admin.ModelAdmin):
"has_starred",
"has_sender",
"has_messages",
"has_attachments",
"is_spam",
"has_active",
),
@@ -241,6 +235,7 @@ class ThreadAdmin(admin.ModelAdmin):
"has_trashed",
"has_draft",
"has_starred",
"has_attachments",
"has_sender",
"has_messages",
"is_spam",
@@ -335,12 +330,9 @@ class MessageAdmin(admin.ModelAdmin):
# Create a Blob from the uploaded file
file_content = import_file.read()
blob = models.Blob.objects.create(
raw_content=file_content,
type=import_file.content_type,
size=import_file.size,
mailbox=recipient,
sha256=hashlib.sha256(file_content).hexdigest(),
blob = recipient.create_blob(
content=file_content,
content_type=import_file.content_type,
)
success, _response_data = ImportService.import_file(
@@ -465,9 +457,9 @@ class LabelAdmin(admin.ModelAdmin):
class BlobAdmin(admin.ModelAdmin):
"""Admin class for the Blob model"""
list_display = ("id", "mailbox", "type", "size", "created_at")
list_display = ("id", "mailbox", "content_type", "size", "created_at")
search_fields = ("mailbox__local_part", "mailbox__domain__name")
list_filter = ("mailbox", "type")
list_filter = ("mailbox", "content_type")
@admin.register(models.MailDomainAccess)
@@ -477,3 +469,44 @@ class MailDomainAccessAdmin(admin.ModelAdmin):
list_display = ("id", "maildomain", "user", "role")
search_fields = ("maildomain__name", "user__email")
list_filter = ("role",)
@admin.register(models.DKIMKey)
class DKIMKeyAdmin(admin.ModelAdmin):
"""Admin class for the DKIMKey model"""
list_display = (
"id",
"selector",
"domain",
"algorithm",
"key_size",
"is_active",
"created_at",
)
search_fields = ("selector", "domain__name")
list_filter = ("algorithm", "is_active", "domain")
readonly_fields = ("public_key", "created_at", "updated_at")
fieldsets = (
(
None,
{
"fields": (
"selector",
"domain",
"algorithm",
"key_size",
"is_active",
"created_at",
"updated_at",
)
},
),
(
_("Keys"),
{
"fields": ("public_key",),
"classes": ("collapse",),
},
),
)

View File

@@ -9,6 +9,69 @@ from rest_framework.exceptions import PermissionDenied
from core import models
class IntegerChoicesField(serializers.Field):
"""
Custom field to handle IntegerChoices that accepts string labels for input
and returns string labels for output.
Example usage:
role = IntegerChoicesField(MailboxRoleChoices)
This field will:
- Accept strings like "viewer", "editor", "admin" for input
- Store them as integers (1, 2, 4) in the database
- Return strings like "viewer", "editor", "admin" for output
- Provide helpful error messages for invalid choices
- Support backward compatibility with integer input
"""
def __init__(self, choices_class, **kwargs):
self.choices_class = choices_class
super().__init__(**kwargs)
def to_representation(self, value):
"""Convert integer value to string label for output."""
if value is None:
return None
enum_instance = self.choices_class(value)
return enum_instance.label
def to_internal_value(self, data):
"""Convert string label to integer value for storage."""
if data is None:
return None
# If it's already an integer (for backward compatibility), validate and return it
if isinstance(data, int):
try:
self.choices_class(data) # Validate it's a valid choice
return data
except ValueError:
self.fail("invalid_choice", input=data)
# Convert string label to integer value
if isinstance(data, str):
for choice_value, choice_label in self.choices_class.choices:
if choice_label == data:
return choice_value
self.fail("invalid_choice", input=data)
self.fail("invalid_choice", input=data)
return None
default_error_messages = {
"invalid_choice": "Invalid choice: {input}. Valid choices are: {choices}."
}
def fail(self, key, **kwargs):
"""Override to provide better error messages."""
if key == "invalid_choice":
valid_choices = [label for value, label in self.choices_class.choices]
kwargs["choices"] = ", ".join(valid_choices)
super().fail(key, **kwargs)
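The label/value round-trip this field implements can be sketched without DRF. The `MailboxRole` enum and module-level functions below are stand-ins (the docstring's viewer=1, editor=2, admin=4 example), not the real `core.models` classes:

```python
from enum import IntEnum

class MailboxRole(IntEnum):
    """Toy stand-in for a Django IntegerChoices class (values/labels from the docstring)."""
    VIEWER = 1
    EDITOR = 2
    ADMIN = 4

# Mirrors choices_class.choices: (integer value, string label) pairs
LABELS = {MailboxRole.VIEWER: "viewer", MailboxRole.EDITOR: "editor", MailboxRole.ADMIN: "admin"}

def to_representation(value):
    """Integer stored in the DB -> string label for API output."""
    return None if value is None else LABELS[MailboxRole(value)]

def to_internal_value(data):
    """String label (or legacy integer) from API input -> integer for storage."""
    if isinstance(data, int):
        return int(MailboxRole(data))  # raises ValueError on an invalid choice
    for value, label in LABELS.items():
        if label == data:
            return int(value)
    raise ValueError(f"Invalid choice: {data}")
```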
class AbilitiesModelSerializer(serializers.ModelSerializer):
"""
A ModelSerializer that takes an additional `exclude` argument that
@@ -37,8 +100,8 @@ class UserSerializer(AbilitiesModelSerializer):
class Meta:
model = models.User
fields = ["id", "email", "full_name", "short_name"]
read_only_fields = ["id", "email", "full_name", "short_name"]
fields = ["id", "email", "full_name"]
read_only_fields = ["id", "email", "full_name"]
class MailboxAvailableSerializer(serializers.ModelSerializer):
@@ -78,7 +141,10 @@ class MailboxSerializer(serializers.ModelSerializer):
"""Return the allowed actions of the logged-in user on the instance."""
request = self.context.get("request")
if request:
return instance.accesses.get(user=request.user).role
role_enum = models.MailboxRoleChoices(
instance.accesses.get(user=request.user).role
)
return role_enum.label
return None
def get_count_unread_messages(self, instance):
@@ -134,6 +200,12 @@ class BlobSerializer(serializers.ModelSerializer):
"""Serialize blobs."""
blobId = serializers.UUIDField(source="id", read_only=True)
type = serializers.CharField(source="content_type", read_only=True)
sha256 = serializers.SerializerMethodField()
def get_sha256(self, obj):
"""Convert binary SHA256 to hex string."""
return obj.sha256.hex() if obj.sha256 else None
class Meta:
model = models.Blob
@@ -152,6 +224,11 @@ class AttachmentSerializer(serializers.ModelSerializer):
blobId = serializers.UUIDField(source="blob.id", read_only=True)
type = serializers.CharField(source="content_type", read_only=True)
sha256 = serializers.SerializerMethodField()
def get_sha256(self, obj):
"""Convert binary SHA256 to hex string."""
return obj.sha256.hex() if obj.sha256 else None
class Meta:
model = models.Attachment
@@ -233,7 +310,7 @@ class ThreadAccessDetailSerializer(serializers.ModelSerializer):
"""Serializer for thread access details."""
mailbox = MailboxLightSerializer()
role = serializers.ChoiceField(choices=models.ThreadAccessRoleChoices.choices)
role = IntegerChoicesField(models.ThreadAccessRoleChoices, read_only=True)
class Meta:
model = models.ThreadAccess
@@ -273,7 +350,9 @@ class ThreadSerializer(serializers.ModelSerializer):
return None
if request and hasattr(request, "user") and request.user.is_authenticated:
try:
return instance.accesses.get(mailbox=mailbox).role
role_value = instance.accesses.get(mailbox=mailbox).role
role_enum = models.ThreadAccessRoleChoices(role_value)
return role_enum.label
except models.ThreadAccess.DoesNotExist:
return None
return None
@@ -306,6 +385,7 @@ class ThreadSerializer(serializers.ModelSerializer):
"has_trashed",
"has_draft",
"has_starred",
"has_attachments",
"has_sender",
"has_messages",
"is_spam",
@@ -362,21 +442,43 @@ class MessageSerializer(serializers.ModelSerializer):
@extend_schema_field(serializers.CharField())
def get_draftBody(self, instance): # pylint: disable=invalid-name
"""Return an arbitrary JSON object representing the draft body."""
return instance.draft_body
return (
instance.draft_blob.get_content().decode("utf-8")
if instance.draft_blob
else None
)
@extend_schema_field(AttachmentSerializer(many=True))
def get_attachments(self, instance):
"""Return the parsed email attachments or linked attachments for drafts."""
# If the message has no attachments, return an empty list
if not instance.has_attachments:
return []
# First check for directly linked attachments (for drafts)
if instance.attachments.exists():
if instance.is_draft:
return AttachmentSerializer(instance.attachments.all(), many=True).data
# Then get any parsed attachments from the email if available
parsed_attachments = instance.get_parsed_field("attachments") or []
# Convert parsed attachments to a format similar to AttachmentSerializer
# Remove the content field from the parsed attachments and create a
# reference to a virtual blob msg_[message_id]_[attachment_number]
# This is needed to map our storage schema with the JMAP spec.
if parsed_attachments:
return parsed_attachments
stripped_attachments = []
for index, attachment in enumerate(parsed_attachments):
stripped_attachments.append(
{
"blobId": f"msg_{instance.id}_{index}",
"name": attachment["name"],
"size": attachment["size"],
"type": attachment["type"],
}
)
return stripped_attachments
return []
@@ -451,6 +553,7 @@ class MessageSerializer(serializers.ModelSerializer):
"is_unread",
"is_starred",
"is_trashed",
"has_attachments",
]
read_only_fields = fields # Mark all as read-only
@@ -458,6 +561,8 @@ class MessageSerializer(serializers.ModelSerializer):
class ThreadAccessSerializer(serializers.ModelSerializer):
"""Serialize thread access information."""
role = IntegerChoicesField(models.ThreadAccessRoleChoices)
class Meta:
model = models.ThreadAccess
fields = ["id", "thread", "mailbox", "role", "created_at", "updated_at"]
@@ -470,6 +575,7 @@ class MailboxAccessReadSerializer(serializers.ModelSerializer):
"""
user_details = UserSerializer(source="user", read_only=True, exclude_abilities=True)
role = IntegerChoicesField(models.MailboxRoleChoices, read_only=True)
class Meta:
model = models.MailboxAccess
@@ -482,6 +588,8 @@ class MailboxAccessWriteSerializer(serializers.ModelSerializer):
Mailbox is set from the view based on URL parameters.
"""
role = IntegerChoicesField(models.MailboxRoleChoices)
class Meta:
model = models.MailboxAccess
fields = ["id", "user", "role", "created_at", "updated_at"]
@@ -503,9 +611,19 @@ class MailboxAccessWriteSerializer(serializers.ModelSerializer):
class MailDomainAdminSerializer(AbilitiesModelSerializer):
"""Serialize mail domains for admin view."""
expected_dns_records = serializers.SerializerMethodField(read_only=True)
def get_expected_dns_records(self, instance):
"""Return the expected DNS records for the mail domain, only in detail views."""
# Only include DNS records in detail views, not in list views
view = self.context.get("view")
if view and hasattr(view, "action") and view.action == "retrieve":
return instance.get_expected_dns_records()
return None
class Meta:
model = models.MailDomain
fields = ["id", "name", "created_at", "updated_at"]
fields = ["id", "name", "created_at", "updated_at", "expected_dns_records"]
read_only_fields = fields
@@ -516,6 +634,7 @@ class MailboxAccessNestedUserSerializer(serializers.ModelSerializer):
"""
user = UserSerializer(read_only=True, exclude_abilities=True)
role = IntegerChoicesField(models.MailboxRoleChoices, read_only=True)
class Meta:
model = models.MailboxAccess

View File

@@ -1,6 +1,5 @@
"""API ViewSet for handling binary data upload and download (JMAP-inspired implementation)."""
import hashlib
import logging
from django.http import HttpResponse
@@ -120,17 +119,13 @@ class BlobViewSet(ViewSet):
uploaded_file = request.FILES["file"]
content_type = uploaded_file.content_type or "application/octet-stream"
# Read file content and calculate SHA-256 hash
content = uploaded_file.read() # Read once, use for both hash and storage
sha256 = hashlib.sha256(content).hexdigest()
# Read file content
content = uploaded_file.read()
# Create the blob record
blob = models.Blob.objects.create(
sha256=sha256,
size=len(content),
type=content_type,
raw_content=content,
mailbox=mailbox,
# Create the blob record using mailbox method
blob = mailbox.create_blob(
content=content,
content_type=content_type,
)
# Return a response with the blob details
@@ -140,7 +135,7 @@ class BlobViewSet(ViewSet):
"blobId": str(blob.id),
"type": content_type,
"size": len(content),
"sha256": sha256,
"sha256": blob.sha256.hex(),
},
status=status.HTTP_201_CREATED,
)
@@ -165,28 +160,63 @@ class BlobViewSet(ViewSet):
by checking if the user has access to any mailbox that owns this blob.
"""
try:
# Get the blob
blob = models.Blob.objects.get(id=pk)
# Blob IDs in the form msg_[message_id]_[attachment_number] are looked up
# directly in the message's attachments.
if pk.startswith("msg_"):
message_id = pk.split("_")[1]
attachment_number = pk.split("_")[2]
message = models.Message.objects.get(id=message_id)
# Check if user has access to the mailbox that owns this blob
if not models.MailboxAccess.objects.filter(
mailbox=blob.mailbox, user=request.user
).exists():
return Response(
{"error": "You do not have permission to download this blob"},
status=status.HTTP_403_FORBIDDEN,
# Does the user have access to the message via its thread?
if not models.ThreadAccess.objects.filter(
thread=message.thread, mailbox__accesses__user=request.user
).exists():
raise models.Blob.DoesNotExist()
# Does the message have any attachments?
if not message.has_attachments:
raise models.Blob.DoesNotExist()
# Parse the raw mime message to get the attachment
parsed_email = message.get_parsed_data()
attachment = parsed_email.get("attachments", [])[int(attachment_number)]
# Create response with decompressed content
response = HttpResponse(
attachment["content"], content_type=attachment["type"]
)
# Get the first attachment name to use as filename (if available)
attachment = models.Attachment.objects.filter(blob=blob).first()
filename = attachment.name if attachment else f"blob-{blob.id}.bin"
# Add appropriate headers for download
response["Content-Disposition"] = (
f'attachment; filename="{attachment["name"]}"'
)
response["Content-Length"] = attachment["size"]
# Create response with raw_content
response = HttpResponse(blob.raw_content, content_type=blob.type)
else:
# Get the blob
blob = models.Blob.objects.get(id=pk)
# Add appropriate headers for download
response["Content-Disposition"] = f'attachment; filename="{filename}"'
response["Content-Length"] = blob.size
# Check if user has access to the mailbox that owns this blob
if not models.MailboxAccess.objects.filter(
mailbox=blob.mailbox, user=request.user
).exists():
return Response(
{"error": "You do not have permission to download this blob"},
status=status.HTTP_403_FORBIDDEN,
)
# Get the first attachment name to use as filename (if available)
attachment = models.Attachment.objects.filter(blob=blob).first()
filename = attachment.name if attachment else f"blob-{blob.id}.bin"
# Create response with decompressed content
response = HttpResponse(
blob.get_content(), content_type=blob.content_type
)
# Add appropriate headers for download
response["Content-Disposition"] = f'attachment; filename="{filename}"'
response["Content-Length"] = blob.size
return response
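The fake-ID scheme used above (`msg_[message_id]_[attachment_number]`, required for JMAP compatibility since extracted attachments no longer have their own blob rows) can be exercised in isolation. The helper names below are hypothetical, not part of the viewset:

```python
def make_virtual_blob_id(message_id: str, index: int) -> str:
    """Build the fake blob ID for attachment `index` parsed out of a message's raw MIME blob."""
    return f"msg_{message_id}_{index}"

def parse_virtual_blob_id(pk: str):
    """Return (message_id, attachment_number) for msg_* IDs, or None for real blob UUIDs."""
    if not pk.startswith("msg_"):
        return None
    # UUIDs contain hyphens but no underscores, so a plain split is safe here
    _, message_id, attachment_number = pk.split("_")
    return message_id, int(attachment_number)
```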

View File

@@ -216,13 +216,20 @@ class DraftMessageView(APIView):
thread_updated_fields.extend(["subject", "updated_at"])
# Update recipients if provided
recipient_type_mapping = {
"to": enums.MessageRecipientTypeChoices.TO,
"cc": enums.MessageRecipientTypeChoices.CC,
"bcc": enums.MessageRecipientTypeChoices.BCC,
}
recipient_types = ["to", "cc", "bcc"]
for recipient_type in recipient_types:
if recipient_type in request_data:
# Delete existing recipients of this type
# Ensure message has a pk before accessing m2m
if message.pk:
message.recipients.filter(type=recipient_type).delete()
message.recipients.filter(
type=recipient_type_mapping[recipient_type]
).delete()
# Create new recipients
emails = request_data.get(recipient_type) or []
@@ -240,14 +247,19 @@ class DraftMessageView(APIView):
models.MessageRecipient.objects.create(
message=message,
contact=contact,
type=recipient_type,
type=recipient_type_mapping[recipient_type],
)
# If message not saved yet (POST case), recipients will be added after save
# Update draft body if provided
if "draftBody" in request_data:
message.draft_body = request_data.get("draftBody", "")
updated_fields.append("draft_body")
if message.draft_blob:
message.draft_blob.delete()
message.draft_blob = self.mailbox.create_blob(
content=(request_data.get("draftBody") or "").encode("utf-8"),
content_type="application/json",
)
updated_fields.append("draft_blob")
# Update attachments if provided
if "attachments" in request_data:
@@ -328,6 +340,11 @@ class DraftMessageView(APIView):
set(to_add) - {a.id for a in valid_attachments},
)
has_attachments = message.attachments.exists()
if has_attachments != message.has_attachments:
message.has_attachments = has_attachments
updated_fields.append("has_attachments")
# Save message and thread if changes were made
if updated_fields and message.pk: # Only save if message exists
message.save(update_fields=updated_fields + ["updated_at"])
@@ -404,7 +421,10 @@ class DraftMessageView(APIView):
read_at=timezone.now(),
is_draft=True,
is_sender=True,
draft_body=request.data.get("draftBody", ""),
draft_blob=self.mailbox.create_blob(
content=(request.data.get("draftBody") or "").encode("utf-8"),
content_type="application/json",
),
)
message.save() # Save message before adding recipients

View File

@@ -193,7 +193,6 @@ class AdminMailDomainMailboxViewSet(
email=email,
defaults={
"full_name": f"{first_name} {last_name}",
"short_name": first_name,
"password": "?",
},
)
@@ -241,7 +240,7 @@ class AdminMailDomainUserViewSet(mixins.ListModelMixin, viewsets.GenericViewSet)
)
)
.distinct()
.order_by("full_name", "short_name", "email")
.order_by("full_name", "email")
)
@extend_schema(
@@ -266,7 +265,6 @@ class AdminMailDomainUserViewSet(mixins.ListModelMixin, viewsets.GenericViewSet)
queryset = queryset.filter(
Q(email__unaccent__icontains=query)
| Q(full_name__unaccent__icontains=query)
| Q(short_name__unaccent__icontains=query)
)
serializer = core_serializers.UserSerializer(queryset, many=True)

View File

@@ -49,10 +49,6 @@ class MessageViewSet(
else:
return queryset.none()
# For retrieve and list actions, prefetch attachments to optimize performance
if self.action in ["retrieve", "list"]:
queryset = queryset.prefetch_related("attachments")
return queryset
def destroy(self, request, *args, **kwargs):

View File

@@ -104,6 +104,11 @@ class SendMessageView(APIView):
if not sender_id:
raise drf_exceptions.ValidationError("senderId is required.")
try:
mailbox_sender = models.Mailbox.objects.get(id=sender_id)
except models.Mailbox.DoesNotExist as e:
raise drf_exceptions.NotFound("Sender mailbox not found.") from e
try:
message = (
models.Message.objects.select_related("sender")
@@ -111,7 +116,9 @@ class SendMessageView(APIView):
"thread__accesses", "recipients__contact", "attachments__blob"
)
.get(
id=message_id, is_draft=True, thread__accesses__mailbox_id=sender_id
id=message_id,
is_draft=True,
thread__accesses__mailbox=mailbox_sender,
)
)
except models.Message.DoesNotExist as e:
@@ -120,7 +127,10 @@ class SendMessageView(APIView):
) from e
prepared = prepare_outbound_message(
message, request.data.get("textBody"), request.data.get("htmlBody")
mailbox_sender,
message,
request.data.get("textBody"),
request.data.get("htmlBody"),
)
if not prepared:
raise drf_exceptions.APIException(

View File

@@ -79,6 +79,7 @@ class ThreadViewSet(
"has_sender": "has_sender",
"has_active": "has_active",
"has_messages": "has_messages",
"has_attachments": "has_attachments",
"is_spam": "is_spam",
}
@@ -132,6 +133,12 @@ class ThreadViewSet(
location=OpenApiParameter.QUERY,
description="Filter threads with starred messages (1=true, 0=false).",
),
OpenApiParameter(
name="has_attachments",
type=OpenApiTypes.INT,
location=OpenApiParameter.QUERY,
description="Filter threads with attachments (1=true, 0=false).",
),
OpenApiParameter(
name="has_sender",
type=OpenApiTypes.INT,
@@ -145,7 +152,7 @@ class ThreadViewSet(
required=True,
description="""Comma-separated list of fields to aggregate.
Special values: 'all' (count all threads), 'all_unread' (count all unread threads).
Boolean fields: has_trashed, has_draft, has_starred, has_sender, has_active, is_spam, has_messages.
Boolean fields: has_trashed, has_draft, has_starred, has_attachments, has_sender, has_active, is_spam, has_messages.
Unread variants ('_unread' suffix): count threads where the condition is true AND the thread is unread.
Examples: 'all,all_unread', 'has_starred,has_starred_unread', 'is_spam,is_spam_unread'""",
enum=list(enums.THREAD_STATS_FIELDS_MAP.keys()),
@@ -202,6 +209,7 @@ class ThreadViewSet(
"has_trashed",
"has_draft",
"has_starred",
"has_attachments",
"has_sender",
"has_active",
"is_spam",
@@ -310,6 +318,12 @@ class ThreadViewSet(
location=OpenApiParameter.QUERY,
description="Filter threads with starred messages (1=true, 0=false).",
),
OpenApiParameter(
name="has_attachments",
type=OpenApiTypes.INT,
location=OpenApiParameter.QUERY,
description="Filter threads with attachments (1=true, 0=false).",
),
OpenApiParameter(
name="has_sender",
type=OpenApiTypes.INT,
@@ -340,8 +354,8 @@ class ThreadViewSet(
"""List threads with optional search functionality."""
search_query = request.query_params.get("search", "").strip()
# If search is provided and Elasticsearch is available, use it
if search_query and hasattr(settings, "ELASTICSEARCH_HOSTS"):
# If search is provided and OpenSearch is available, use it
if search_query and len(settings.OPENSEARCH_HOSTS[0]) > 0:
# Get the mailbox_id for filtering
mailbox_id = request.query_params.get("mailbox_id")
@@ -359,7 +373,7 @@ class ThreadViewSet(
page = int(self.paginator.get_page_number(request, self))
page_size = int(self.paginator.get_page_size(request))
# Get search results from Elasticsearch
# Get search results from OpenSearch
results = search_threads(
query=search_query,
mailbox_ids=[mailbox_id] if mailbox_id else None,
@@ -393,7 +407,7 @@ class ThreadViewSet(
serializer = self.get_serializer(ordered_threads, many=True)
return drf.response.Response(serializer.data)
# Fall back to regular DB query if no search query or Elasticsearch not available
# Fall back to regular DB query if no search query or OpenSearch not available
return super().list(request, *args, **kwargs)
# @extend_schema(

View File

@@ -99,13 +99,8 @@ class OIDCAuthenticationBackend(MozillaOIDCAuthenticationBackend):
# Get user's full name from OIDC fields defined in settings
full_name = self.compute_full_name(user_info)
short_name = user_info.get(settings.USER_OIDC_FIELD_TO_SHORTNAME)
claims = {
"email": email,
"full_name": full_name,
"short_name": short_name,
}
claims = {"email": email, "full_name": full_name}
try:
user = User.objects.get_user_by_sub_or_email(sub, email)

View File

@@ -0,0 +1,94 @@
"""
DNS checking functionality for mail domains.
"""
from typing import Any, Dict, List
import dns.resolver
from core.models import MailDomain
def check_single_record(
maildomain: MailDomain, expected_record: Dict[str, Any]
) -> Dict[str, Any]:
"""
Check a single DNS record for a mail domain.
Args:
maildomain: The MailDomain instance
expected_record: The expected record to check
Returns:
Check result dictionary with status and details
"""
record_type = expected_record["type"]
target = expected_record["target"]
expected_value = expected_record["value"]
# Build the query name
if target:
query_name = f"{target}.{maildomain.name}"
else:
query_name = maildomain.name
try:
# Query DNS records
if record_type.upper() == "MX":
answers = dns.resolver.resolve(query_name, "MX")
found_values = [
f"{answer.preference} {answer.exchange}" for answer in answers
]
elif record_type.upper() == "TXT":
answers = dns.resolver.resolve(query_name, "TXT")
found_values = [answer.to_text().strip('"') for answer in answers]
else:
# For other record types, try to resolve them as-is
answers = dns.resolver.resolve(query_name, record_type)
found_values = [answer.to_text() for answer in answers]
# Check if expected value is in found values
if expected_value in found_values:
return {"status": "correct", "found": found_values}
return {"status": "incorrect", "found": found_values}
except dns.resolver.NXDOMAIN:
# Domain doesn't exist
return {"status": "missing", "error": "Domain not found"}
except dns.resolver.NoAnswer:
# No records found for this query
return {"status": "missing", "error": "No records found"}
except dns.resolver.NoNameservers:
# No nameservers found
return {"status": "missing", "error": "No nameservers found"}
except dns.resolver.Timeout:
# DNS query timed out
return {"status": "error", "error": "DNS query timeout"}
except dns.resolver.YXDOMAIN:
# Domain name is too long
return {"status": "error", "error": "Domain name too long"}
except Exception as e: # pylint: disable=broad-exception-caught
# Other DNS errors
return {"status": "error", "error": f"DNS query failed: {str(e)}"}
def check_dns_records(maildomain: MailDomain) -> List[Dict[str, any]]:
"""
Check DNS records for a mail domain against expected records.
Args:
maildomain: The MailDomain instance to check
Returns:
List of records with their check status
"""
expected_records = maildomain.get_expected_dns_records()
results = []
for expected_record in expected_records:
result_record = expected_record.copy()
result_record["_check"] = check_single_record(maildomain, expected_record)
results.append(result_record)
return results
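Once resolver answers (or an error string) are in hand, the status classification above reduces to two small pure functions. `build_query_name` and `classify_record` below are hypothetical names sketching that logic outside dnspython:

```python
def build_query_name(domain: str, target: str) -> str:
    """An empty target queries the domain apex, otherwise the label is prefixed."""
    return f"{target}.{domain}" if target else domain

def classify_record(expected_value, found_values=(), error=None):
    """Map resolver output to the same status dicts as check_single_record."""
    if error is not None:
        return {"status": "missing", "error": error}
    if expected_value in found_values:
        return {"status": "correct", "found": list(found_values)}
    return {"status": "incorrect", "found": list(found_values)}
```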

View File

@@ -0,0 +1,623 @@
"""
Scaleway DNS provider implementation.
"""
from typing import Any, Dict, List, Optional
from django.conf import settings
import requests
class ScalewayDNSProvider:
"""DNS provider for Scaleway Domains and DNS service."""
def __init__(self):
"""
Initialize the Scaleway DNS provider.
"""
self.api_token = settings.DNS_SCALEWAY_API_TOKEN
self.project_id = settings.DNS_SCALEWAY_PROJECT_ID
self.ttl = settings.DNS_SCALEWAY_TTL
self.base_url = "https://api.scaleway.com/domain/v2beta1"
self.headers = {
"X-Auth-Token": self.api_token,
"Content-Type": "application/json",
}
def is_configured(self) -> bool:
"""
Check if the Scaleway DNS provider is configured.
"""
return bool(self.api_token) and bool(self.project_id)
def _get_zone_name(self, domain: str) -> str:
"""
Get the zone name for API calls.
Args:
domain: Domain name
Returns:
Zone name for API calls
"""
# First, check if the exact domain exists as a zone
zone = self.get_zone(domain)
if zone:
return domain
# If not found, check if we need to use a parent zone
# This handles cases where the domain is a subdomain of an existing zone
parts = domain.split(".")
for i in range(1, len(parts)):
potential_parent = ".".join(parts[i:])
zone = self.get_zone(potential_parent)
if zone:
return potential_parent
# If no parent found, use the domain as is
return domain
def _validate_zone_exists(self, domain: str) -> bool:
"""
Validate that a zone exists for the given domain.
Args:
domain: Domain name
Returns:
True if zone exists, False otherwise
"""
zone = self.get_zone(domain)
return zone is not None
def _handle_api_error(self, response: requests.Response) -> None:
"""
Handle Scaleway API errors with proper error messages.
Args:
response: HTTP response object
Raises:
Exception: With detailed error message and context
"""
try:
error_data = response.json()
error_message = error_data.get("message", "Unknown error")
except (ValueError, KeyError):
error_message = "Unknown error"
if response.status_code == 404:
raise ValueError(f"Zone not found: {error_message}")
if response.status_code == 409:
raise ValueError(f"Zone already exists: {error_message}")
if response.status_code == 400:
raise ValueError(f"Invalid request: {error_message}")
if response.status_code == 401:
raise ValueError(f"Authentication failed: {error_message}")
# For any other status code
raise ValueError(f"API error ({response.status_code}): {error_message}")
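The status-code-to-message mapping in `_handle_api_error` boils down to a lookup; a minimal standalone sketch (hypothetical `describe_api_error` helper, same prefixes as above):

```python
def describe_api_error(status_code: int, message: str) -> str:
    """Prefix a Scaleway API error message the same way _handle_api_error does."""
    prefixes = {
        404: "Zone not found",
        409: "Zone already exists",
        400: "Invalid request",
        401: "Authentication failed",
    }
    prefix = prefixes.get(status_code, f"API error ({status_code})")
    return f"{prefix}: {message}"
```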
def _make_request(
self, method: str, endpoint: str, data: Optional[Dict] = None
) -> Dict[str, Any]:
"""
Make a request to the Scaleway API.
Args:
method: HTTP method
endpoint: API endpoint
data: Request data
Returns:
API response as dictionary
Raises:
Exception: If the request fails
"""
url = f"{self.base_url}/{endpoint}"
response = requests.request(
method=method, url=url, headers=self.headers, json=data, timeout=30
)
if not response.ok:
self._handle_api_error(response)
return response.json()
def get_zones(self) -> List[Dict[str, Any]]:
"""
Get all DNS zones.
Returns:
List of zone dictionaries
"""
response = self._make_request("GET", "dns-zones")
return response.get("dns_zones", [])
def get_zone(self, domain: str) -> Optional[Dict[str, Any]]:
"""
Get a specific DNS zone by domain name.
Args:
domain: Domain name
Returns:
Zone dictionary or None if not found
"""
zones = self.get_zones()
for zone in zones:
# Check if this zone matches our domain
zone_domain = zone.get("domain", "")
zone_subdomain = zone.get("subdomain", "")
if zone_subdomain:
# This is a subdomain zone
zone_full_name = f"{zone_subdomain}.{zone_domain}"
if zone_full_name == domain:
return zone
# This is a root domain zone
elif zone_domain == domain:
return zone
return None
def _resolve_zone_components(self, domain: str) -> tuple[str, str]:
"""
Resolve domain into parent domain and subdomain components.
This method implements a smart algorithm to determine the correct
parent domain and subdomain by checking existing zones:
- For x.tld: create new zone, no parent
- For a.b.c.d.tld: recursively check potential parent zones starting with b.c.d.tld
- If parent exists, create sub-zone; otherwise create new zone
Args:
domain: Domain name
Returns:
Tuple of (parent_domain, subdomain)
"""
if "." not in domain:
# Single level domain, no parent
return domain, ""
# Get existing zones to check for potential parents
existing_zones = self.get_zones()
# Split domain into parts
parts = domain.split(".")
# For domains like a.b.c.d.tld, check potential parents:
# - b.c.d.tld
# - c.d.tld
# - d.tld
# - tld (but this is unlikely to be a managed zone)
for i in range(1, len(parts)):
potential_parent = ".".join(parts[i:])
# Check if this potential parent exists as a zone
for zone in existing_zones:
zone_domain = zone.get("domain", "")
zone_subdomain = zone.get("subdomain", "")
if zone_subdomain:
# This is a subdomain zone
zone_full_name = f"{zone_subdomain}.{zone_domain}"
if zone_full_name == potential_parent:
# Found existing parent zone, create sub-zone
# The subdomain should be the remaining parts
subdomain = ".".join(parts[:i])
return zone_domain, subdomain
# This is a root domain zone
elif zone_domain == potential_parent:
# Found existing parent zone, create sub-zone
# The subdomain should be the remaining parts
subdomain = ".".join(parts[:i])
return zone_domain, subdomain
# No existing parent zone found, create new zone
# Use the last two parts as the parent domain (common pattern)
if len(parts) >= 2:
parent_domain = ".".join(parts[-2:])
subdomain = ".".join(parts[:-2])
else:
parent_domain = domain
subdomain = ""
return parent_domain, subdomain
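The parent-zone walk can be checked in isolation. This sketch is simplified: existing zones are passed in as a set of full names, and the matched parent is returned directly, whereas the provider method maps subdomain zones back to their root `domain`:

```python
def resolve_zone_components(domain: str, existing_zones: set) -> tuple:
    """Split `domain` into (parent_zone, subdomain) against known zone names."""
    if "." not in domain:
        return domain, ""  # single-label domain, no parent
    parts = domain.split(".")
    # Walk potential parents from most to least specific: b.c.d.tld, c.d.tld, ...
    for i in range(1, len(parts)):
        potential_parent = ".".join(parts[i:])
        if potential_parent in existing_zones:
            return potential_parent, ".".join(parts[:i])
    # No managed parent zone found: default to the last two labels as the parent
    if len(parts) >= 2:
        return ".".join(parts[-2:]), ".".join(parts[:-2])
    return domain, ""
```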
def create_zone(self, domain: str) -> Dict[str, Any]:
"""
Create a new DNS zone.
Args:
domain: Domain name to create
Returns:
Created zone dictionary
"""
parent_domain, subdomain = self._resolve_zone_components(domain)
data = {
"domain": parent_domain,
"subdomain": subdomain,
"project_id": self.project_id,
}
response = self._make_request("POST", "dns-zones", data)
return response.get("dns_zone", {})
def get_records(self, domain: str) -> List[Dict[str, Any]]:
"""
Get all DNS records for a zone.
Args:
domain: Domain name
Returns:
List of record dictionaries
"""
zone_name = self._get_zone_name(domain)
response = self._make_request("GET", f"dns-zones/{zone_name}/records")
return response.get("records", [])
def _format_record_name(self, name: str, domain: str) -> str:
"""
Format record name according to Scaleway API requirements.
Args:
name: Record name (can be FQDN or short name)
domain: Domain name
Returns:
Short format record name
"""
# If name is empty or None, return empty string
if not name:
return ""
# If name is exactly the domain, it's a root domain record
if name == domain:
return ""
# If name is FQDN ending with domain, extract short name
if name.endswith(f".{domain}"):
# Remove the domain suffix to get the short name
short_name = name[: -len(f".{domain}")]
# If the result is empty, it means this is a root domain record
return short_name if short_name else ""
# If name contains dots but doesn't end with domain, it might be a subdomain
# Return the first part as short name
if "." in name:
return name.split(".")[0]
# Otherwise, return as is (already short format)
return name
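The short-name rules are easiest to see by example. A standalone sketch mirroring the logic above (helper name is illustrative):

```python
def to_short_name(name: str, domain: str) -> str:
    """Convert an FQDN or bare name to Scaleway's short record-name format."""
    if not name or name == domain:
        return ""  # root-domain record
    if name.endswith(f".{domain}"):
        return name[: -len(f".{domain}")]  # strip the zone suffix
    if "." in name:
        return name.split(".")[0]  # dotted name outside the zone: keep first label
    return name  # already short
```

For instance `to_short_name("mail.example.com", "example.com")` yields `"mail"`, while the bare domain yields `""`.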
def create_record(
self,
domain: str,
name: str,
record_type: str,
data: str,
ttl: Optional[int] = None,
) -> Dict[str, Any]:
"""
Create a DNS record.
Args:
domain: Domain name
name: Record name
record_type: Record type (A, MX, TXT, etc.)
data: Record data
ttl: TTL in seconds (uses DNS_SCALEWAY_TTL if not specified)
Returns:
Created record dictionary
"""
if ttl is None:
ttl = self.ttl
zone_name = self._get_zone_name(domain)
# Validate that the zone exists
if not self._validate_zone_exists(zone_name):
raise ValueError(f"Zone not found: {zone_name}")
short_name = self._format_record_name(name, domain)
record_data = {
"return_all_records": False,
"changes": [
{
"add": {
"records": [
{
"name": short_name,
"type": record_type,
"data": data,
"ttl": ttl,
}
]
}
}
],
}
response = self._make_request(
"PATCH", f"dns-zones/{zone_name}/records", record_data
)
return response.get("records", [{}])[0] if response.get("records") else {}
def update_record(
self,
domain: str,
record_id: str, # pylint: disable=unused-argument
name: str,
record_type: str,
data: str,
ttl: Optional[int] = None,
) -> Dict[str, Any]:
"""
Update a DNS record.
Args:
domain: Domain name
record_id: Record ID (not used, kept for compatibility)
name: Record name
record_type: Record type
data: Record data
ttl: TTL in seconds (uses DNS_SCALEWAY_TTL if not specified)
Returns:
Updated record dictionary
"""
if ttl is None:
ttl = self.ttl
zone_name = self._get_zone_name(domain)
# Validate that the zone exists
if not self._validate_zone_exists(zone_name):
raise ValueError(f"Zone not found: {zone_name}")
short_name = self._format_record_name(name, domain)
record_data = {
"return_all_records": False,
"changes": [
{
"set": {
"id_fields": {
"name": short_name,
"type": record_type,
},
"records": [
{
"name": short_name,
"type": record_type,
"data": data,
"ttl": ttl,
}
],
}
}
],
}
response = self._make_request(
"PATCH", f"dns-zones/{zone_name}/records", record_data
)
return response.get("records", [{}])[0] if response.get("records") else {}
def delete_record(self, domain: str, record_id: str) -> None:
"""
Delete a DNS record.
Args:
domain: Domain name
record_id: Record ID (not used, kept for compatibility)
"""
# Scaleway identifies records by name and type rather than by ID, so this
# interface-compatible signature cannot perform the deletion
raise NotImplementedError(
"Delete record requires name and type parameters. Use delete_record_by_name_type instead."
)
def delete_record_by_name_type(
self, domain: str, name: str, record_type: str
) -> None:
"""
Delete a DNS record by name and type.
Args:
domain: Domain name
name: Record name
record_type: Record type
"""
zone_name = self._get_zone_name(domain)
# Validate that the zone exists
if not self._validate_zone_exists(zone_name):
raise ValueError(f"Zone not found: {zone_name}")
short_name = self._format_record_name(name, domain)
record_data = {
"return_all_records": False,
"changes": [
{
"delete": {
"id_fields": {
"name": short_name,
"type": record_type,
}
}
}
],
}
self._make_request("PATCH", f"dns-zones/{zone_name}/records", record_data)
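All three mutations go through the same PATCH endpoint; only the change verb differs (`add`, `set`, `delete`). A minimal sketch of the payload builders, using the same field names as the methods above:

```python
def add_change(record: dict) -> dict:
    # Wrap a new record in Scaleway's "add" change
    return {"add": {"records": [record]}}

def set_change(record: dict) -> dict:
    # "set" replaces all records matching id_fields (name + type)
    return {
        "set": {
            "id_fields": {"name": record["name"], "type": record["type"]},
            "records": [record],
        }
    }

def delete_change(name: str, record_type: str) -> dict:
    # "delete" removes records matching id_fields
    return {"delete": {"id_fields": {"name": name, "type": record_type}}}
```

One consequence of keying `set` and `delete` on name + type is that record IDs go unused, which is why `update_record` and `delete_record` carry an unused `record_id` parameter for provider-interface compatibility.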
def find_records(
self, domain: str, name: str, record_type: str
) -> List[Dict[str, Any]]:
"""
Find all DNS records of a specific type and name.
Args:
domain: Domain name
name: Record name
record_type: Record type
Returns:
List of record dictionaries
"""
records = self.get_records(domain)
found_records = []
short_name = self._format_record_name(name, domain)
for record in records:
if record.get("name") == short_name and record.get("type") == record_type:
found_records.append(record)
return found_records
def find_record(
self, domain: str, name: str, record_type: str
) -> Optional[Dict[str, Any]]:
"""
Find a specific DNS record.
Args:
domain: Domain name
name: Record name
record_type: Record type
Returns:
Record dictionary or None if not found
"""
records = self.find_records(domain, name, record_type)
return records[0] if records else None
def record_exists(
self, domain: str, name: str, record_type: str, expected_value: str
) -> bool:
"""
Check if a specific DNS record with the expected value exists.
Args:
domain: Domain name
name: Record name
record_type: Record type
expected_value: Expected record value
Returns:
True if the record exists with the expected value
"""
records = self.find_records(domain, name, record_type)
for record in records:
if record.get("data") == expected_value:
return True
return False
def provision_domain_records(
self, domain: str, expected_records: List[Dict[str, Any]], pretend: bool = False
) -> Dict[str, Any]:
"""
Provision DNS records for a domain.
Only creates records that don't already exist.
Args:
domain: Domain name
expected_records: List of expected DNS records
pretend: If True, simulate operations without making actual changes
Returns:
Dictionary with provisioning results
"""
# Get or create zone
zone = self.get_zone(domain)
if not zone:
if pretend:
# Simulate zone creation
zone = {"domain": domain}
else:
try:
zone = self.create_zone(domain)
except Exception as e: # pylint: disable=broad-exception-caught
return {
"success": False,
"error": f"Failed to create zone for {domain}: {e}",
"domain": domain,
}
# Use domain name directly for API calls
zone_name = self._get_zone_name(domain)
results = {
"success": True,
"domain": domain,
"zone_name": zone_name,
"created": [],
"updated": [],
"errors": [],
"pretend": pretend,
}
# Provision each expected record
for expected_record in expected_records:
record_type = expected_record["type"]
target = expected_record["target"]
expected_value = expected_record["value"]
# Build record name
if target:
record_name = f"{target}.{domain}"
else:
record_name = domain
try:
# Check if this specific record already exists
if self.record_exists(domain, record_name, record_type, expected_value):
# Record already exists, skip it
continue
if pretend:
# Simulate creating new record
results["created"].append(
{
"name": record_name,
"type": record_type,
"value": expected_value,
"pretend": True,
}
)
else:
# Create new record only if it doesn't exist
self.create_record(domain, record_name, record_type, expected_value)
results["created"].append(
{
"name": record_name,
"type": record_type,
"value": expected_value,
}
)
except Exception as e: # pylint: disable=broad-exception-caught
error_data = {
"name": record_name,
"type": record_type,
"value": expected_value,
"error": str(e),
}
if pretend:
error_data["pretend"] = True
results["errors"].append(error_data)
results["success"] = False
return results
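`provision_domain_records` consumes entries with `type`, `target`, and `value` keys (the shape returned by `MailDomain.get_expected_dns_records`). An illustrative, hypothetical record list and the name-building rule used in the loop above:

```python
# Hypothetical expected records; real values come from get_expected_dns_records()
expected_records = [
    {"type": "MX", "target": "", "value": "10 mx.example.com."},
    {"type": "TXT", "target": "", "value": "v=spf1 mx -all"},
    {"type": "TXT", "target": "dkim._domainkey", "value": "v=DKIM1; k=rsa; p=..."},
]

def build_record_name(target: str, domain: str) -> str:
    # An empty target means a record on the bare domain
    return f"{target}.{domain}" if target else domain
```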

View File

@@ -0,0 +1,170 @@
"""
DNS provisioning functionality for mail domains.
"""
import logging
from typing import Any, Dict, Optional
from django.conf import settings
import dns.resolver
from core.dns.check import check_dns_records
from core.dns.providers.scaleway import ScalewayDNSProvider
from core.models import MailDomain
logger = logging.getLogger(__name__)
def detect_dns_provider(domain: str) -> Optional[str]:
"""
Detect which DNS provider is being used for a domain.
Args:
domain: Domain name to check
Returns:
Provider name ('scaleway') or None if unknown
"""
try:
# Get nameservers for the domain
nameservers = dns.resolver.resolve(domain, "NS")
ns_names = [ns.target.to_text().rstrip(".") for ns in nameservers]
# Check for Scaleway nameservers
scaleway_ns = ["ns0.dom.scw.cloud", "ns1.dom.scw.cloud"]
if any(ns in ns_names for ns in scaleway_ns):
return "scaleway"
return None
except (
dns.resolver.NXDOMAIN,
dns.resolver.Timeout,
dns.resolver.NoNameservers,
dns.resolver.NoAnswer,
) as e:
# Expected DNS resolution failures: log and treat the provider as unknown
logger.warning("Could not resolve nameservers for %s: %s", domain, e)
return None
except Exception as e: # pylint: disable=broad-exception-caught
# Log unexpected errors but don't fail
logger.warning("Unexpected error detecting DNS provider for %s: %s", domain, e)
return None
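Detection reduces to matching the domain's NS set against known provider nameservers. The matching step, isolated as a pure function (a sketch; the real code resolves NS records via dnspython first):

```python
from typing import List, Optional

def match_ns_provider(ns_names: List[str]) -> Optional[str]:
    """Return 'scaleway' if any nameserver belongs to Scaleway, else None."""
    scaleway_ns = {"ns0.dom.scw.cloud", "ns1.dom.scw.cloud"}
    normalized = {n.rstrip(".").lower() for n in ns_names}
    return "scaleway" if scaleway_ns & normalized else None
```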
def get_dns_provider(provider_name: str, **kwargs) -> Optional[Any]:
"""
Get a DNS provider instance by name.
Args:
provider_name: Name of the provider
**kwargs: Provider-specific configuration
Returns:
Provider instance or None if not supported
"""
provider = None
if provider_name == "scaleway":
provider = ScalewayDNSProvider()
if provider is None or not provider.is_configured():
return None
return provider
def provision_domain_dns(
maildomain: MailDomain,
provider_name: Optional[str] = None,
pretend: bool = False,
**provider_kwargs,
) -> Dict[str, Any]:
"""
Provision DNS records for a mail domain.
Args:
maildomain: MailDomain instance to provision
provider_name: DNS provider name (if None, will auto-detect or use default)
pretend: If True, simulate operations without making actual changes
**provider_kwargs: Provider-specific configuration
Returns:
Dictionary with provisioning results
"""
domain = maildomain.name
# Auto-detect provider if not specified
if not provider_name:
provider_name = detect_dns_provider(domain)
if not provider_name:
# Use default provider from environment if no provider detected
provider_name = settings.DNS_DEFAULT_PROVIDER
if not provider_name:
return {
"success": False,
"error": f"Could not detect DNS provider for domain {domain} and no default provider configured",
"domain": domain,
}
# Get provider instance
provider = get_dns_provider(provider_name, **provider_kwargs)
if not provider:
return {
"success": False,
"error": f"DNS provider '{provider_name}' is not supported or not configured",
"domain": domain,
"provider": provider_name,
}
# Get expected DNS records
expected_records = maildomain.get_expected_dns_records()
# Provision records
try:
results = provider.provision_domain_records(
domain, expected_records, pretend=pretend
)
results["provider"] = provider_name
results["pretend"] = pretend
return results
except Exception as e: # pylint: disable=broad-exception-caught
return {
"success": False,
"error": f"Failed to provision DNS records: {e}",
"domain": domain,
"provider": provider_name,
"pretend": pretend,
}
def check_and_provision_domain(
maildomain: MailDomain, **provider_kwargs
) -> Dict[str, Any]:
"""
Check DNS records for a domain and provision missing ones.
Args:
maildomain: MailDomain instance to check/provision
**provider_kwargs: Provider-specific configuration
Returns:
Dictionary with check and provisioning results
"""
# Check current DNS records
check_results = check_dns_records(maildomain)
results = {"domain": maildomain.name, "check_results": check_results}
# Only provision if there are missing records
missing_records = [r for r in check_results if r["_check"]["status"] == "missing"]
if missing_records:
provisioning_results = provision_domain_dns(maildomain, **provider_kwargs)
results["provisioning_results"] = provisioning_results
# If provisioning was successful, check again
if provisioning_results.get("success"):
updated_check_results = check_dns_records(maildomain)
results["updated_check_results"] = updated_check_results
return results
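The provisioning trigger is simply the presence of `missing` checks; `incorrect` and `error` statuses are reported but not rewritten. That filter, isolated (sketch):

```python
def needs_provisioning(check_results: list) -> bool:
    # Only records whose check status is "missing" trigger provisioning
    return any(r["_check"]["status"] == "missing" for r in check_results)
```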

View File

@@ -4,51 +4,65 @@ Core application enums declaration
from django.conf import global_settings
from django.db import models
from django.utils.translation import gettext_lazy as _
# In Django's code base, `LANGUAGES` defaults to all supported languages.
# We use it for language choices, which should not be limited to the few
# languages active in the app.
# pylint: disable=no-member
ALL_LANGUAGES = {language: _(name) for language, name in global_settings.LANGUAGES}
ALL_LANGUAGES = dict(global_settings.LANGUAGES)
class MailboxRoleChoices(models.TextChoices):
class MailboxRoleChoices(models.IntegerChoices):
"""Defines the unique roles a user can have to access a mailbox."""
VIEWER = "viewer", _("Viewer")
EDITOR = "editor", _("Editor")
ADMIN = "admin", _("Admin")
VIEWER = 1, "viewer"
EDITOR = 2, "editor"
SENDER = 3, "sender"
ADMIN = 4, "admin"
class ThreadAccessRoleChoices(models.TextChoices):
class ThreadAccessRoleChoices(models.IntegerChoices):
"""Defines the possible roles a mailbox can have to access to a thread."""
VIEWER = "viewer", _("Viewer")
EDITOR = "editor", _("Editor")
VIEWER = 1, "viewer"
EDITOR = 2, "editor"
class MessageRecipientTypeChoices(models.TextChoices):
class MessageRecipientTypeChoices(models.IntegerChoices):
"""Defines the possible types of message recipients."""
TO = "to", _("To")
CC = "cc", _("Cc")
BCC = "bcc", _("Bcc")
TO = 1, "to"
CC = 2, "cc"
BCC = 3, "bcc"
class MessageDeliveryStatusChoices(models.TextChoices):
class MessageDeliveryStatusChoices(models.IntegerChoices):
"""Defines the possible statuses of a message delivery."""
INTERNAL = "internal", _("Internal")
SENT = "sent", _("Sent")
FAILED = "failed", _("Failed")
RETRY = "retry", _("Retry")
INTERNAL = 1, "internal"
SENT = 2, "sent"
FAILED = 3, "failed"
RETRY = 4, "retry"
class MailDomainAccessRoleChoices(models.TextChoices):
class MailDomainAccessRoleChoices(models.IntegerChoices):
"""Defines the unique roles a user can have to access a mail domain."""
ADMIN = "ADMIN", _("Admin")
ADMIN = 1, "admin"
class CompressionTypeChoices(models.IntegerChoices):
"""Defines the possible compression types."""
NONE = 0, "None"
ZSTD = 1, "Zstd"
class DKIMAlgorithmChoices(models.IntegerChoices):
"""Defines the possible DKIM signing algorithms."""
RSA = 1, "rsa"
ED25519 = 2, "ed25519"
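Because the stored values change type (text to integer), pre-existing rows would normally need a data migration mapping old strings to new integers; here the migrations were reset before 0.1 instead, so the mapping below is purely illustrative:

```python
# Illustrative mapping from the removed TextChoices values to the new
# IntegerChoices ones (not shipped; migrations were reset before release).
MAILBOX_ROLE_TEXT_TO_INT = {"viewer": 1, "editor": 2, "admin": 4}  # "sender" (3) is new

def migrate_role(old_value: str) -> int:
    return MAILBOX_ROLE_TEXT_TO_INT[old_value]
```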
THREAD_STATS_FIELDS_MAP = {

View File

@@ -25,7 +25,6 @@ class UserFactory(factory.django.DjangoModelFactory):
sub = factory.Sequence(lambda n: f"user{n!s}")
email = factory.Faker("email")
full_name = factory.Faker("name")
short_name = factory.Faker("first_name")
language = factory.fuzzy.FuzzyChoice([lang[0] for lang in settings.LANGUAGES])
password = make_password("password")
@@ -168,6 +167,22 @@ class MessageFactory(factory.django.DjangoModelFactory):
created_at = factory.LazyAttribute(lambda o: timezone.now())
mime_id = factory.Sequence(lambda n: f"message{n!s}")
@factory.post_generation
def raw_mime(self, create, extracted, **kwargs):
"""
Create a blob with raw MIME content when raw_mime is provided.
Usage: MessageFactory(raw_mime=b"raw email content")
"""
if not create or not extracted:
return
# Create a blob with the raw MIME content using the sender's mailbox
self.blob = self.sender.mailbox.create_blob( # pylint: disable=attribute-defined-outside-init
content=extracted,
content_type="message/rfc822",
)
self.save()
class MessageRecipientFactory(factory.django.DjangoModelFactory):
"""A factory to random message recipients for testing purposes."""

View File

@@ -38,7 +38,7 @@ def get_keycloak_admin_client():
return keycloak_admin
def sync_maildomain_to_keycloak_group(maildomain):
def sync_maildomain_to_keycloak_group(maildomain: MailDomain):
"""
Sync a MailDomain to Keycloak as a group.
Creates the group if it doesn't exist and updates its attributes.
@@ -52,16 +52,14 @@ def sync_maildomain_to_keycloak_group(maildomain):
try:
keycloak_admin = get_keycloak_admin_client()
group_name = f"maildomain-{maildomain.name}"
group_path = f"{settings.KEYCLOAK_GROUP_PATH_PREFIX}{maildomain.name}"
group_name = group_path.rsplit("/", maxsplit=1)[-1]
parent_path = group_path.rsplit("/", maxsplit=1)[0]
# Check if group exists
existing_groups = keycloak_admin.get_groups({"search": group_name})
group_id = None
for group in existing_groups:
if group.get("name") == group_name:
group_id = group["id"]
break
existing_group = keycloak_admin.get_group_by_path(group_path)
if existing_group and "error" in existing_group:
existing_group = None
# Prepare group attributes
group_attributes = {
@@ -69,20 +67,24 @@ def sync_maildomain_to_keycloak_group(maildomain):
"maildomain_name": [maildomain.name],
}
# Add custom metadata from identity_group_metadata
if maildomain.identity_group_metadata:
for key, value in maildomain.identity_group_metadata.items():
# Add custom attributes
if maildomain.custom_attributes:
for key, value in maildomain.custom_attributes.items():
# Ensure values are lists (Keycloak requirement)
if isinstance(value, list):
group_attributes[key] = value
else:
group_attributes[key] = [str(value)]
if group_id:
if existing_group:
# Update existing group
group_id = existing_group["id"]
keycloak_admin.update_group(
group_id=group_id,
payload={"name": group_name, "attributes": group_attributes},
payload={
"name": group_name,
"attributes": group_attributes,
},
)
logger.info(
"Updated Keycloak group %s for MailDomain %s",
@@ -91,8 +93,19 @@ def sync_maildomain_to_keycloak_group(maildomain):
)
else:
# Create new group
group_payload = {"name": group_name, "attributes": group_attributes}
group_id = keycloak_admin.create_group(payload=group_payload)
group_payload = {
"name": group_name,
"attributes": group_attributes,
}
parent_id = None
if parent_path:
parent_group = keycloak_admin.get_group_by_path(parent_path)
if parent_group and "error" not in parent_group:
parent_id = parent_group["id"]
group_id = keycloak_admin.create_group(
payload=group_payload, parent=parent_id
)
logger.info(
"Created Keycloak group %s for MailDomain %s",
group_name,
@@ -106,7 +119,7 @@ def sync_maildomain_to_keycloak_group(maildomain):
raise
def sync_mailbox_to_keycloak_user(mailbox):
def sync_mailbox_to_keycloak_user(mailbox: Mailbox):
"""
Sync a Mailbox to Keycloak as a user in its maildomain group.
Creates the user if it doesn't exist and adds them to the appropriate group.

View File

@@ -0,0 +1,75 @@
"""
Django management command to check DNS records for mail domains.
"""
from django.core.management.base import BaseCommand, CommandError
from core.dns.check import check_dns_records
from core.models import MailDomain
class Command(BaseCommand):
help = "Check DNS records for mail domains"
def add_arguments(self, parser):
parser.add_argument(
"--domain",
type=str,
help="Specific domain to check (if not provided, checks all domains)",
)
def handle(self, *args, **options):
domain_name = options["domain"]
if domain_name:
try:
maildomain = MailDomain.objects.get(name=domain_name)
domains = [maildomain]
except MailDomain.DoesNotExist:
raise CommandError(f"Domain '{domain_name}' not found") from None
else:
domains = MailDomain.objects.all()
self.stdout.write(f"Checking DNS records for {len(domains)} domain(s)...")
self.stdout.write("")
for maildomain in domains:
self.check_domain(maildomain)
def check_domain(self, maildomain):
"""Check DNS records for a specific domain."""
domain = maildomain.name
self.stdout.write(f"Domain: {domain}")
self.stdout.write("-" * (len(domain) + 8))
# Get DNS check results
check_results = check_dns_records(maildomain)
self.print_detailed_results(check_results)
self.stdout.write("")
def print_detailed_results(self, check_results):
"""Print a flat list of DNS check results with status emojis."""
status_emoji = {
"correct": "🟢",
"incorrect": "🟡",
"missing": "🔴",
"error": "⚠️",
}
for record in check_results:
status = record["_check"]["status"]
emoji = status_emoji.get(status, "")
target = record["target"] or "@"
line = f"{emoji} {record['type']} record for {target}"
if status == "correct":
line += f" — Value: {record['value']}"
elif status == "incorrect":
line += f" — Expected: {record['value']} | Found: {', '.join(record['_check'].get('found', []))}"
elif status == "missing":
line += f" — Expected: {record['value']} | Error: {record['_check'].get('error', '')}"
elif status == "error":
line += f" — Error: {record['_check'].get('error', '')}"
self.stdout.write(line)
self.stdout.write("")

View File

@@ -0,0 +1,159 @@
"""
Django management command to provision DNS records for mail domains.
"""
from django.conf import settings
from django.core.management.base import BaseCommand, CommandError
from core.dns.provisioning import (
detect_dns_provider,
provision_domain_dns,
)
from core.models import MailDomain
class Command(BaseCommand):
help = "Provision DNS records for mail domains"
def add_arguments(self, parser):
parser.add_argument("--domain", type=str, help="Domain name to provision")
parser.add_argument("--domainid", type=int, help="Domain ID to provision")
parser.add_argument(
"--provider",
type=str,
help="DNS provider to use (auto-detect if not specified)",
)
parser.add_argument(
"--pretend",
action="store_true",
help="Simulate provisioning without making actual changes",
)
def handle(self, *args, **options):
domain_name = options["domain"]
domain_id = options["domainid"]
provider_name = options["provider"]
pretend = options["pretend"]
if not domain_name and not domain_id:
raise CommandError("Either --domain or --domainid must be specified")
if domain_name and domain_id:
raise CommandError("Cannot specify both --domain and --domainid")
try:
if domain_name:
maildomain = MailDomain.objects.get(name=domain_name)
else:
maildomain = MailDomain.objects.get(id=domain_id)
except MailDomain.DoesNotExist:
if domain_name:
raise CommandError(f"Domain '{domain_name}' not found") from None
else:
raise CommandError(f"Domain with ID {domain_id} not found") from None
if pretend:
self.stdout.write(
self.style.WARNING("PRETEND MODE: No actual changes will be made")
)
self.stdout.write("")
self.process_domain(maildomain, provider_name, pretend)
def process_domain(self, maildomain, provider_name, pretend):
"""Process a single domain for DNS provisioning."""
domain = maildomain.name
self.stdout.write(f"Domain: {domain}")
self.stdout.write("-" * (len(domain) + 8))
# Show provider information
detected_provider = detect_dns_provider(domain)
if detected_provider:
self.stdout.write(f"Detected provider: {detected_provider}")
else:
self.stdout.write(self.style.WARNING("No provider detected"))
# Check if we can provision
can_provision = (
provider_name or detected_provider or settings.DNS_DEFAULT_PROVIDER
)
if not can_provision:
self.stdout.write(
self.style.ERROR("✗ Cannot provision DNS records for this domain")
)
self.stdout.write("")
return
self.provision_domain(maildomain, provider_name, pretend)
self.stdout.write("")
def provision_domain(self, maildomain, provider_name, pretend):
"""Provision DNS records for a domain."""
if pretend:
self.stdout.write("Simulating DNS record provisioning...")
else:
self.stdout.write("Provisioning DNS records...")
results = provision_domain_dns(
maildomain, provider_name=provider_name, pretend=pretend
)
if results["success"]:
if pretend:
self.stdout.write(
self.style.SUCCESS("✓ DNS provisioning simulation successful")
)
else:
self.stdout.write(self.style.SUCCESS("✓ DNS provisioning successful"))
# Show which provider was used
provider_used = results.get("provider", "unknown")
if provider_used:
self.stdout.write(f"Provider used: {provider_used}")
if results["created"]:
if pretend:
self.stdout.write(
f"Would create {len(results['created'])} records:"
)
else:
self.stdout.write(f"Created {len(results['created'])} records:")
for record in results["created"]:
self.stdout.write(
f" - {record['type']} record for {record['name']}: {record['value']}"
)
if results["updated"]:
if pretend:
self.stdout.write(
f"Would update {len(results['updated'])} records:"
)
else:
self.stdout.write(f"Updated {len(results['updated'])} records:")
for record in results["updated"]:
self.stdout.write(
f" - {record['type']} record for {record['name']}"
)
self.stdout.write(f" Old: {record['old_value']}")
self.stdout.write(f" New: {record['new_value']}")
if results["errors"]:
self.stdout.write(
self.style.WARNING(f"Errors ({len(results['errors'])}):")
)
for error in results["errors"]:
self.stdout.write(
f" - {error['type']} record for {error['name']}: {error['error']}"
)
elif pretend:
self.stdout.write(
self.style.ERROR(
f"✗ DNS provisioning simulation failed: {results['error']}"
)
)
else:
self.stdout.write(
self.style.ERROR(f"✗ DNS provisioning failed: {results['error']}")
)

View File

@@ -103,7 +103,7 @@ class Command(BaseCommand):
return user, session_key, data
# pylint: disable=broad-except
except Exception as e: # noqa: BLE001
except Exception as e:
logger.error(f"Failed to process session {redis_key}: {e}")
return None

View File

@@ -0,0 +1,159 @@
"""Management command to retry sending a message to failed/retry recipients."""
import logging
from django.core.management.base import BaseCommand, CommandError
from django.db import transaction
from core import models
from core.enums import MessageDeliveryStatusChoices
from core.tasks import retry_messages_task
logger = logging.getLogger(__name__)
class Command(BaseCommand):
"""Management command to retry sending message(s) to recipients with retry status."""
help = "Retry sending message(s) to recipients with retry status. Without --force, delegates to celery task (respects retry timing). With --force, processes immediately (ignores retry delays). Specify message_id for single message, or omit for bulk processing."
def add_arguments(self, parser):
"""Define optional argument for message ID."""
parser.add_argument(
"message_id",
nargs="?",
help="ID of the message to retry sending (if not provided, retry all retryable messages)",
)
parser.add_argument(
"--force-mta-out",
action="store_true",
help="Force sending through external MTA even for local recipients",
)
parser.add_argument(
"--batch-size",
type=int,
default=100,
help="Number of messages to process in each batch (default: 100)",
)
parser.add_argument(
"--force",
action="store_true",
help="Force immediate retry by resetting retry_at timestamps (ignores retry delays)",
)
def handle(self, *args, **options):
"""
Retry sending messages to recipients with retry status.
Without --force: Delegates to celery task (respects retry timing).
With --force: Processes immediately (resets retry_at timestamps).
"""
message_id = options.get("message_id")
force_mta_out = options.get("force_mta_out", False)
batch_size = options.get("batch_size", 100)
force = options.get("force", False)
if force:
# Handle force operations: reset timestamps and delegate to celery task
if message_id:
# Reset timestamps for single message and delegate
self._reset_and_delegate_single(message_id, force_mta_out)
else:
# Reset timestamps for all messages and delegate
self._reset_and_delegate_all(force_mta_out, batch_size)
else:
# Delegate to celery task for non-force operations
self._delegate_to_celery_task(message_id, force_mta_out, batch_size)
def _delegate_to_celery_task(self, message_id, force_mta_out, batch_size):
"""Delegate retry operations to celery task synchronously and print result."""
self.stdout.write("Running retry operations via celery task (synchronously)...")
result = retry_messages_task.apply(
args=(),
kwargs={
"message_id": message_id,
"force_mta_out": force_mta_out,
"batch_size": batch_size,
},
)
if result.successful():
self.stdout.write(
self.style.SUCCESS(f"Task completed successfully: {result.get()}")
)
else:
self.stdout.write(self.style.ERROR(f"Task failed: {result.result}"))
def _reset_and_delegate_single(self, message_id, force_mta_out):
"""Reset retry_at timestamps for single message and delegate to celery task."""
try:
message = models.Message.objects.get(id=message_id)
except models.Message.DoesNotExist:
raise CommandError(
f"Message with ID '{message_id}' does not exist."
) from None
# Check if message is a draft
if message.is_draft:
raise CommandError(
f"Message '{message_id}' is still a draft and cannot be sent."
)
# Get recipients with retry status
retry_recipients = message.recipients.filter(
delivery_status__in=[
MessageDeliveryStatusChoices.RETRY,
# MessageDeliveryStatusChoices.FAILED,
]
)
if not retry_recipients.exists():
self.stdout.write(
self.style.WARNING(
f"No recipients with retry status found for message '{message_id}'"
)
)
return
# Reset retry_at timestamps
with transaction.atomic():
updated_count = retry_recipients.update(retry_at=None)
self.stdout.write(
f"Reset retry_at timestamp for {updated_count} recipient(s) of message '{message_id}'"
)
# Delegate to celery task
self._delegate_to_celery_task(message_id, force_mta_out, 100)
def _reset_and_delegate_all(self, force_mta_out, batch_size):
"""Reset retry_at timestamps for all messages and delegate to celery task."""
# Find all messages with retryable recipients
messages_with_retries = models.Message.objects.filter(
is_draft=False,
recipients__delivery_status=MessageDeliveryStatusChoices.RETRY,
).distinct()
total_messages = messages_with_retries.count()
if total_messages == 0:
self.stdout.write(
self.style.WARNING("No messages with retryable recipients found")
)
return
self.stdout.write(
f"Found {total_messages} message(s) with retryable recipients"
)
# Reset retry_at timestamps for all recipients
with transaction.atomic():
updated_count = models.MessageRecipient.objects.filter(
delivery_status=MessageDeliveryStatusChoices.RETRY
).update(retry_at=None)
self.stdout.write(
f"Reset retry_at timestamp for {updated_count} recipient(s) across all messages"
)
# Delegate to celery task
self._delegate_to_celery_task(None, force_mta_out, batch_size)

View File

@@ -1,4 +1,4 @@
"""Management command to create Elasticsearch index."""
"""Management command to create OpenSearch index."""
import sys
@@ -8,19 +8,19 @@ from core.search import create_index_if_not_exists
class Command(BaseCommand):
"""Create Elasticsearch index if it doesn't exist."""
"""Create OpenSearch index if it doesn't exist."""
help = "Create Elasticsearch index if it doesn't exist"
help = "Create OpenSearch index if it doesn't exist"
def handle(self, *args, **options):
"""Execute the command."""
self.stdout.write("Creating Elasticsearch index...")
self.stdout.write("Creating OpenSearch index...")
result = create_index_if_not_exists()
if result:
self.stdout.write(
self.style.SUCCESS("Elasticsearch index created or already exists")
self.style.SUCCESS("OpenSearch index created or already exists")
)
else:
self.stdout.write(self.style.ERROR("Failed to create Elasticsearch index"))
self.stdout.write(self.style.ERROR("Failed to create OpenSearch index"))
sys.exit(1)

View File

@@ -1,4 +1,4 @@
"""Management command to delete Elasticsearch index."""
"""Management command to delete OpenSearch index."""
import sys
@@ -8,9 +8,9 @@ from core.search import delete_index
class Command(BaseCommand):
"""Delete Elasticsearch index."""
"""Delete OpenSearch index."""
help = "Delete Elasticsearch index"
help = "Delete OpenSearch index"
def add_arguments(self, parser):
"""Add command arguments."""
@@ -24,21 +24,21 @@ class Command(BaseCommand):
"""Execute the command."""
if not options["force"]:
confirm = input(
"Are you sure you want to delete the Elasticsearch index? This cannot be undone. [y/N] "
"Are you sure you want to delete the OpenSearch index? This cannot be undone. [y/N] "
)
if confirm.lower() != "y":
self.stdout.write(self.style.WARNING("Operation cancelled"))
return
self.stdout.write("Deleting Elasticsearch index...")
self.stdout.write("Deleting OpenSearch index...")
result = delete_index()
if result:
self.stdout.write(
self.style.SUCCESS("Elasticsearch index deleted successfully")
self.style.SUCCESS("OpenSearch index deleted successfully")
)
else:
self.stdout.write(
self.style.WARNING("Elasticsearch index not found or already deleted")
self.style.WARNING("OpenSearch index not found or already deleted")
)
sys.exit(1)

View File

@@ -1,4 +1,4 @@
"""Management command to reindex content in Elasticsearch."""
"""Management command to reindex content in OpenSearch."""
import uuid
@@ -15,9 +15,9 @@ from core.tasks import (
class Command(BaseCommand):
"""Reindex content in Elasticsearch."""
"""Reindex content in OpenSearch."""
help = "Reindex content in Elasticsearch"
help = "Reindex content in OpenSearch"
def add_arguments(self, parser):
"""Add command arguments."""
@@ -57,7 +57,7 @@ class Command(BaseCommand):
def handle(self, *args, **options):
"""Execute the command."""
# Ensure index exists
self.stdout.write("Ensuring Elasticsearch index exists...")
self.stdout.write("Ensuring OpenSearch index exists...")
create_index_if_not_exists()
# Handle reindexing based on scope

View File

@@ -83,44 +83,41 @@ def compute_labels_and_flags(
return labels_to_add, message_flags, thread_flags
def _process_attachments(
message: models.Message, attachment_data: List[Dict], mailbox: models.Mailbox
) -> None:
"""
Process attachments found during email parsing.
# def _process_attachments(
# message: models.Message, attachment_data: List[Dict], mailbox: models.Mailbox
# ) -> None:
# """
# Process attachments found during email parsing.
Creates Blob records for each attachment and links them to the message.
# Creates Blob records for each attachment and links them to the message.
Args:
message: The message object to link attachments to
attachment_data: List of attachment data dictionaries from parsing
mailbox: The mailbox that owns these attachments
"""
for attachment_info in attachment_data:
try:
# Check if we have content to store
if "content" in attachment_info and attachment_info["content"]:
# Create a blob for this attachment
content = attachment_info["content"]
blob = models.Blob.objects.create(
sha256=attachment_info["sha256"],
size=attachment_info["size"],
type=attachment_info["type"],
raw_content=content,
mailbox=mailbox,
)
# Args:
# message: The message object to link attachments to
# attachment_data: List of attachment data dictionaries from parsing
# mailbox: The mailbox that owns these attachments
# """
# for attachment_info in attachment_data:
# try:
# # Check if we have content to store
# if "content" in attachment_info and attachment_info["content"]:
# # Create a blob for this attachment using the mailbox method
# content = attachment_info["content"]
# blob = mailbox.create_blob(
# content=content,
# content_type=attachment_info["type"],
# )
# Create an attachment record linking to this blob
attachment = models.Attachment.objects.create(
name=attachment_info.get("name", "unnamed"),
blob=blob,
mailbox=mailbox,
)
# # Create an attachment record linking to this blob
# attachment = models.Attachment.objects.create(
# name=attachment_info.get("name", "unnamed"),
# blob=blob,
# mailbox=mailbox,
# )
# Link the attachment to the message
message.attachments.add(attachment)
except Exception as e:
logger.exception("Error processing attachment: %s", e)
# # Link the attachment to the message
# message.attachments.add(attachment)
# except Exception as e:
# logger.exception("Error processing attachment: %s", e)
def check_local_recipient(
@@ -438,11 +435,16 @@ def deliver_inbound_message( # pylint: disable=too-many-branches, too-many-stat
mime_id=parsed_email.get("in_reply_to"), thread=thread
).first()
blob = mailbox.create_blob(
content=raw_data,
content_type="message/rfc822",
)
message = models.Message.objects.create(
thread=thread,
sender=sender_contact,
subject=parsed_email.get("subject"),
raw_mime=raw_data,
blob=blob,
mime_id=parsed_email.get("messageId", parsed_email.get("message_id"))
or None,
parent=parent_message,
@@ -453,6 +455,7 @@ def deliver_inbound_message( # pylint: disable=too-many-branches, too-many-stat
is_starred=False,
is_trashed=False,
is_unread=True,
has_attachments=len(parsed_email.get("attachments", [])) > 0,
)
if is_import:
# We need to set the created_at field to the date of the message
@@ -548,8 +551,8 @@ def deliver_inbound_message( # pylint: disable=too-many-branches, too-many-stat
# Log and continue
# --- 7. Process Attachments if present --- #
if parsed_email.get("attachments"):
_process_attachments(message, parsed_email["attachments"], mailbox)
# if parsed_email.get("attachments"):
# _process_attachments(message, parsed_email["attachments"], mailbox)
# --- 8. Final Updates --- #
try:

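The delivery path above now stores raw MIME via `mailbox.create_blob(...)`, and the migration later in this patch gives `Blob` a `compression` field with choices `(0, 'None'), (1, 'Zstd')` plus a `sha256`/`size` pair that describe the *uncompressed* content. A minimal stand-in for that compress-on-write / decompress-on-read contract is sketched below. Note the hedges: `BlobSketch` and `CompressionChoices` are illustrative names, and stdlib `zlib` stands in for zstd so the sketch needs no third-party binding — the real code uses ZSTD as the commit message says.

```python
import hashlib
import zlib
from enum import IntEnum


class CompressionChoices(IntEnum):
    """Mirrors the migration's (0, 'None'), (1, 'Zstd') choices."""
    NONE = 0
    ZSTD = 1


class BlobSketch:
    """Illustrative stand-in for the Blob model's content handling.

    zlib stands in for zstd here to keep the sketch stdlib-only; the
    interface (raw_content compressed, get_content() transparent) is
    what matters.
    """

    def __init__(self, content: bytes, content_type: str):
        # sha256 and size always describe the uncompressed content,
        # matching the migration's help_text for those fields.
        self.sha256 = hashlib.sha256(content).digest()
        self.size = len(content)
        self.content_type = content_type
        self.compression = CompressionChoices.ZSTD
        self.raw_content = zlib.compress(content)

    def get_content(self) -> bytes:
        """Return the original bytes, whatever the storage encoding."""
        if self.compression == CompressionChoices.NONE:
            return bytes(self.raw_content)
        return zlib.decompress(self.raw_content)


blob = BlobSketch(b"Subject: hi\r\n\r\nhello" * 50, "message/rfc822")
```

Callers such as `send_message` then only ever touch `blob.get_content()`, which is why the diff can swap `message.raw_mime` for `message.blob.get_content()` without the SMTP path knowing about compression.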
View File

@@ -39,7 +39,10 @@ RETRY_INTERVALS = [
def prepare_outbound_message(
message: models.Message, text_body: str, html_body: str
mailbox_sender: models.Mailbox,
message: models.Message,
text_body: str,
html_body: str,
) -> bool:
"""Compose and sign an existing draft Message object before sending via SMTP.
@@ -101,6 +104,7 @@ def prepare_outbound_message(
"email": message.sender.email,
}
],
"date": timezone.now().strftime("%a, %d %b %Y %H:%M:%S %z"),
"to": recipients_by_type.get(models.MessageRecipientTypeChoices.TO, []),
"cc": recipients_by_type.get(models.MessageRecipientTypeChoices.CC, []),
# BCC is not included in headers
@@ -120,8 +124,8 @@ def prepare_outbound_message(
# Add the attachment to the MIME data
attachments.append(
{
"content": blob.raw_content, # Binary content
"type": blob.type, # MIME type
"content": blob.get_content(), # Decompressed binary content
"type": blob.content_type, # MIME type
"name": attachment.name, # Original filename
"disposition": "attachment", # Default to attachment disposition
"size": blob.size, # Size in bytes
@@ -139,13 +143,13 @@ def prepare_outbound_message(
in_reply_to=message.parent.mime_id if message.parent else None,
# TODO: Add References header logic
)
except Exception as e: # noqa: BLE001
except Exception as e:
logger.error("Failed to compose MIME for message %s: %s", message.id, e)
return False
# Sign the message with DKIM
dkim_signature_header: Optional[bytes] = sign_message_dkim(
raw_mime_message=raw_mime, sender_email=message.sender.email
raw_mime_message=raw_mime, maildomain=mailbox_sender.domain
)
raw_mime_signed = raw_mime
@@ -153,23 +157,39 @@ def prepare_outbound_message(
# Prepend the signature header
raw_mime_signed = dkim_signature_header + b"\r\n" + raw_mime
message.raw_mime = raw_mime_signed
# Create a blob to store the raw MIME content
blob = mailbox_sender.create_blob(
content=raw_mime_signed,
content_type="message/rfc822",
)
draft_blob = message.draft_blob
message.blob = blob
message.is_draft = False
message.draft_body = None
message.draft_blob = None
message.created_at = timezone.now()
message.updated_at = timezone.now()
message.save(
update_fields=[
"updated_at",
"raw_mime",
"blob",
"mime_id",
"is_draft",
"draft_body",
"draft_blob",
"created_at",
]
)
message.thread.update_stats()
# Clean up the draft blob and the attachment blobs
if draft_blob:
draft_blob.delete()
for attachment in message.attachments.all():
if attachment.blob:
attachment.blob.delete()
attachment.delete()
return True
@@ -182,7 +202,7 @@ def send_message(message: models.Message, force_mta_out: bool = False):
message.sent_at = timezone.now()
message.save(update_fields=["sent_at"])
mime_data = parse_email_message(message.raw_mime)
mime_data = parse_email_message(message.blob.get_content())
# Include all recipients in the envelope that have not been delivered yet, including BCC
envelope_to = {
@@ -193,7 +213,7 @@ def send_message(message: models.Message, force_mta_out: bool = False):
None,
MessageDeliveryStatusChoices.RETRY,
}
and (recipient.retry_at is None or recipient.retry_at < timezone.now())
and (recipient.retry_at is None or recipient.retry_at <= timezone.now())
}
def _mark_delivered(
@@ -249,10 +269,10 @@ def send_message(message: models.Message, force_mta_out: bool = False):
):
try:
delivered = deliver_inbound_message(
recipient_email, mime_data, message.raw_mime
recipient_email, mime_data, message.blob.get_content()
)
_mark_delivered(recipient_email, delivered, True)
except Exception as e: # noqa: BLE001
except Exception as e:
logger.error(
"Failed to deliver internal message to %s: %s", recipient_email, e
)
@@ -302,7 +322,7 @@ def send_outbound_message(
statuses = {}
try:
with smtplib.SMTP(smtp_host, smtp_port, timeout=10) as client:
with smtplib.SMTP(smtp_host, smtp_port, timeout=60) as client:
client.ehlo()
if settings.MTA_OUT_SMTP_USE_TLS:
client.starttls()
@@ -316,7 +336,7 @@ def send_outbound_message(
)
smtp_response = client.sendmail(
envelope_from, recipient_emails, message.raw_mime
envelope_from, recipient_emails, message.blob.get_content()
)
logger.info(
"Sent message %s via SMTP. Response: %s",

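The send path above builds its envelope from recipients whose `retry_at` is unset or `retry_at <= timezone.now()` (the diff tightens `<` to `<=` so exactly-due recipients are picked up). Paired with the module-level `RETRY_INTERVALS` table, the scheduling logic can be sketched as below. The interval values are hypothetical — the real `RETRY_INTERVALS` contents are elided from this hunk — and the helper names are illustrative, not the module's.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical backoff schedule; the real RETRY_INTERVALS values live in
# the module and are not reproduced here.
RETRY_INTERVALS = [
    timedelta(minutes=1),
    timedelta(minutes=10),
    timedelta(hours=1),
]


def next_retry_at(retry_count: int, now: datetime) -> Optional[datetime]:
    """Return when a recipient should next be retried, or None to give up."""
    if retry_count >= len(RETRY_INTERVALS):
        return None  # schedule exhausted, mark as permanently failed
    return now + RETRY_INTERVALS[retry_count]


def is_due(retry_at: Optional[datetime], now: datetime) -> bool:
    """Mirror the send loop's check: due when unset or retry_at <= now."""
    return retry_at is None or retry_at <= now


now = datetime.now(timezone.utc)
```

The `<=` matters for the new "retry unsent messages" management command: a recurring task that fires at exactly `retry_at` would otherwise skip the recipient until the following run.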
View File

@@ -4,67 +4,86 @@ import base64
import logging
from typing import Optional
from django.conf import settings
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from dkim import sign as dkim_sign
from core.enums import DKIMAlgorithmChoices
logger = logging.getLogger(__name__)
def sign_message_dkim(raw_mime_message: bytes, sender_email: str) -> Optional[bytes]:
def generate_dkim_key(
algorithm: DKIMAlgorithmChoices = DKIMAlgorithmChoices.RSA, key_size: int = 2048
) -> tuple[str, str]:
"""Generate a new DKIM key pair.
Args:
algorithm: The signing algorithm (DKIMAlgorithmChoices)
key_size: The key size in bits (e.g., 2048, 4096 for RSA)
Returns:
Tuple of (private_key_pem, public_key_base64)
Raises:
ValueError: If the algorithm is not supported
"""
if algorithm != DKIMAlgorithmChoices.RSA:
raise ValueError(
f"Unsupported algorithm: {algorithm}. Only RSA is currently supported."
)
# Generate RSA private key
private_key = rsa.generate_private_key(public_exponent=65537, key_size=key_size)
# Convert private key to PEM format
private_key_pem = private_key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.PKCS8,
encryption_algorithm=serialization.NoEncryption(),
).decode("utf-8")
# Extract public key for DNS records
public_key_der = private_key.public_key().public_bytes(
encoding=serialization.Encoding.DER,
format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
public_key_b64 = base64.b64encode(public_key_der).decode("ascii")
return private_key_pem, public_key_b64
def sign_message_dkim(raw_mime_message: bytes, maildomain) -> Optional[bytes]:
"""Sign a raw MIME message with DKIM.
Uses the private key and selector defined in Django settings.
Only signs domains listed in settings.MESSAGES_DKIM_DOMAINS.
Uses the most recent active DKIM key for the domain.
Only signs if the domain has an active DKIM key configured.
Args:
raw_mime_message: The raw bytes of the MIME message.
sender_email: The email address of the sender (e.g., "user@example.com").
maildomain: The MailDomain object with DKIM key.
Returns:
The DKIM-Signature header bytes if signed, otherwise None.
"""
domain = maildomain.name
dkim_private_key = None
if settings.MESSAGES_DKIM_PRIVATE_KEY_FILE:
try:
with open(settings.MESSAGES_DKIM_PRIVATE_KEY_FILE, "rb") as f:
dkim_private_key = f.read()
except FileNotFoundError:
logger.error(
"DKIM private key file not found: %s",
settings.MESSAGES_DKIM_PRIVATE_KEY_FILE,
)
return None
elif settings.MESSAGES_DKIM_PRIVATE_KEY_B64:
try:
dkim_private_key = base64.b64decode(settings.MESSAGES_DKIM_PRIVATE_KEY_B64)
except (TypeError, ValueError):
logger.error("Failed to decode MESSAGES_DKIM_PRIVATE_KEY_B64.")
return None
# Find the most recent active DKIM key for this domain
dkim_key = maildomain.get_active_dkim_key()
if not dkim_private_key:
if not dkim_key:
logger.warning(
"MESSAGES_DKIM_PRIVATE_KEY_B64/FILE is not set, skipping DKIM signing"
"Domain %s has no active DKIM key configured, skipping DKIM signing", domain
)
return None
try:
domain = sender_email.split("@")[1]
except IndexError:
logger.error("Invalid sender email format for DKIM signing: %s", sender_email)
return None
dkim_private_key = dkim_key.get_private_key_bytes()
if domain not in settings.MESSAGES_DKIM_DOMAINS:
logger.warning(
"Domain %s is not in MESSAGES_DKIM_DOMAINS, skipping DKIM signing", domain
)
return None
try:
signature = dkim_sign(
message=raw_mime_message,
selector=settings.MESSAGES_DKIM_SELECTOR.encode("ascii"),
selector=dkim_key.selector.encode("ascii"),
domain=domain.encode("ascii"),
privkey=dkim_private_key,
include_headers=[
@@ -83,9 +102,14 @@ def sign_message_dkim(raw_mime_message: bytes, sender_email: str) -> Optional[by
# dkim_sign returns the full message including the signature header,
# we only want the header itself.
signature_header = (
signature.split(b"\\r\\n\\r\\n", 1)[0].split(b"DKIM-Signature:")[1].strip()
signature.split(b"\r\n\r\n", 1)[0].split(b"DKIM-Signature:")[1].strip()
)
logger.info(
"Successfully signed message for domain %s with selector %s",
domain,
dkim_key.selector,
)
return b"DKIM-Signature: " + signature_header
except Exception as e: # noqa: BLE001 pylint: disable=broad-exception-caught
except Exception as e: # pylint: disable=broad-exception-caught
logger.error("Error during DKIM signing for domain %s: %s", domain, e)
return None

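The per-domain DKIM flow above generates an RSA key pair and keeps the base64 public key "for DNS records". A sketch of the round trip is below: the key generation mirrors the diff's `generate_dkim_key`, while `dkim_dns_record` is an illustrative helper (the commit's DNS provisioning code may shape the record differently); `"default"` and `"example.com"` are placeholder values.

```python
import base64

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


def generate_rsa_dkim_pair(key_size: int = 2048) -> tuple[str, str]:
    """Generate (private_key_pem, public_key_b64), as in the diff's helper."""
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=key_size)
    # PEM/PKCS8 for storage in the encrypted model field
    private_key_pem = private_key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    ).decode("utf-8")
    # DER SubjectPublicKeyInfo, base64-encoded for the DNS TXT record
    public_key_der = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    return private_key_pem, base64.b64encode(public_key_der).decode("ascii")


def dkim_dns_record(selector: str, domain: str, public_key_b64: str) -> tuple[str, str]:
    """Build the TXT record name and value advertising a DKIM public key."""
    name = f"{selector}._domainkey.{domain}"
    value = f"v=DKIM1; k=rsa; p={public_key_b64}"
    return name, value


priv, pub = generate_rsa_dkim_pair()
record_name, record_value = dkim_dns_record("default", "example.com", pub)
```

Publishing that TXT record is what lets receivers verify signatures produced by `sign_message_dkim` with the matching `dkim_key.selector`.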
View File

@@ -1,12 +1,14 @@
# Generated by Django 5.1.8 on 2025-04-29 14:13
# Generated by Django 5.1.8 on 2025-07-13 20:48
import core.models
import django.core.validators
import django.db.models.deletion
import encrypted_fields.fields
import timezone_field.fields
import uuid
from django.conf import settings
from django.db import migrations, models
from django.contrib.postgres.operations import UnaccentExtension
class Migration(migrations.Migration):
@@ -18,33 +20,46 @@ class Migration(migrations.Migration):
]
operations = [
UnaccentExtension(),
migrations.CreateModel(
name='Mailbox',
name='Contact',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('local_part', models.CharField(max_length=255, verbose_name='local part')),
('name', models.CharField(blank=True, max_length=255, null=True, verbose_name='name')),
('email', models.EmailField(max_length=254, verbose_name='email')),
],
options={
'verbose_name': 'mailbox',
'verbose_name_plural': 'mailboxes',
'db_table': 'messages_mailbox',
'ordering': ['-created_at'],
'verbose_name': 'contact',
'verbose_name_plural': 'contacts',
'db_table': 'messages_contact',
},
),
migrations.CreateModel(
name='MailDomain',
name='Thread',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('name', models.CharField(max_length=255, verbose_name='name')),
('subject', models.CharField(blank=True, max_length=255, null=True, verbose_name='subject')),
('snippet', models.TextField(blank=True, verbose_name='snippet')),
('has_unread', models.BooleanField(default=False, verbose_name='has unread')),
('has_trashed', models.BooleanField(default=False, verbose_name='has trashed')),
('has_draft', models.BooleanField(default=False, verbose_name='has draft')),
('has_starred', models.BooleanField(default=False, verbose_name='has starred')),
('has_sender', models.BooleanField(default=False, verbose_name='has sender')),
('has_messages', models.BooleanField(default=True, verbose_name='has messages')),
('has_attachments', models.BooleanField(default=False, verbose_name='has attachments')),
('is_spam', models.BooleanField(default=False, verbose_name='is spam')),
('has_active', models.BooleanField(default=False, verbose_name='has active')),
('messaged_at', models.DateTimeField(blank=True, null=True, verbose_name='messaged at')),
('sender_names', models.JSONField(blank=True, null=True, verbose_name='sender names')),
],
options={
'verbose_name': 'mail domain',
'verbose_name_plural': 'mail domains',
'db_table': 'messages_maildomain',
'verbose_name': 'thread',
'verbose_name_plural': 'threads',
'db_table': 'messages_thread',
},
),
migrations.CreateModel(
@@ -57,15 +72,14 @@ class Migration(migrations.Migration):
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('sub', models.CharField(blank=True, help_text='Required. 255 characters or fewer. Letters, numbers, and @/./+/-/_/: characters only.', max_length=255, null=True, unique=True, validators=[django.core.validators.RegexValidator(message='Enter a valid sub. This value may contain only letters, numbers, and @/./+/-/_/: characters.', regex='^[\\w.@+-:]+\\Z')], verbose_name='sub')),
('full_name', models.CharField(blank=True, max_length=100, null=True, verbose_name='full name')),
('short_name', models.CharField(blank=True, max_length=20, null=True, verbose_name='short name')),
('full_name', models.CharField(blank=True, max_length=255, null=True, verbose_name='full name')),
('email', models.EmailField(blank=True, max_length=254, null=True, verbose_name='identity email address')),
('admin_email', models.EmailField(blank=True, max_length=254, null=True, unique=True, verbose_name='admin email address')),
('language', models.CharField(choices=[('en-us', 'English'), ('fr-fr', 'French'), ('de-de', 'German')], default='en-us', help_text='The language in which the user wants to see the interface.', max_length=10, verbose_name='language')),
('timezone', timezone_field.fields.TimeZoneField(choices_display='WITH_GMT_OFFSET', default='UTC', help_text='The timezone in which the user wants to see times.', use_pytz=False)),
('is_device', models.BooleanField(default=False, help_text='Whether the user is a device or a real user.', verbose_name='device')),
('is_staff', models.BooleanField(default=False, help_text='Whether the user can log into this admin site.', verbose_name='staff status')),
('is_active', models.BooleanField(default=True, help_text='Whether this user should be treated as active. Unselect this instead of deleting accounts.', verbose_name='active')),
('custom_attributes', models.JSONField(blank=True, default=None, help_text='Metadata to sync to the user in the identity provider.', null=True, verbose_name='Custom attributes')),
('groups', models.ManyToManyField(blank=True, help_text='The groups this user belongs to. A user will get all permissions granted to each of their groups.', related_name='user_set', related_query_name='user', to='auth.group', verbose_name='groups')),
('user_permissions', models.ManyToManyField(blank=True, help_text='Specific permissions for this user.', related_name='user_set', related_query_name='user', to='auth.permission', verbose_name='user permissions')),
],
@@ -79,36 +93,63 @@ class Migration(migrations.Migration):
],
),
migrations.CreateModel(
name='Contact',
name='Mailbox',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('name', models.CharField(blank=True, max_length=255, null=True, verbose_name='name')),
('email', models.EmailField(max_length=254, verbose_name='email')),
('mailbox', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='contacts', to='core.mailbox')),
('local_part', models.CharField(max_length=64, validators=[django.core.validators.RegexValidator(regex='^[a-zA-Z0-9_.-]+$')], verbose_name='local part')),
('alias_of', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='core.mailbox')),
('contact', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='mailboxes', to='core.contact')),
],
options={
'verbose_name': 'contact',
'verbose_name_plural': 'contacts',
'db_table': 'messages_contact',
'unique_together': {('email', 'mailbox')},
'verbose_name': 'mailbox',
'verbose_name_plural': 'mailboxes',
'db_table': 'messages_mailbox',
'ordering': ['-created_at'],
},
),
migrations.AddField(
model_name='contact',
name='mailbox',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='contacts', to='core.mailbox'),
),
migrations.CreateModel(
name='Blob',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('sha256', models.BinaryField(db_index=True, help_text='SHA-256 hash of the uncompressed blob content', max_length=32, verbose_name='sha256 hash')),
('size', models.PositiveIntegerField(help_text='Size of the blob in bytes', verbose_name='file size')),
('content_type', models.CharField(help_text='MIME type of the blob', max_length=127, verbose_name='content type')),
('compression', models.SmallIntegerField(choices=[(0, 'None'), (1, 'Zstd')], default=0, verbose_name='compression')),
('raw_content', models.BinaryField(help_text='Compressed binary content of the blob', verbose_name='raw content')),
('mailbox', models.ForeignKey(help_text='Mailbox that owns this blob', on_delete=django.db.models.deletion.CASCADE, related_name='blobs', to='core.mailbox')),
],
options={
'verbose_name': 'blob',
'verbose_name_plural': 'blobs',
'db_table': 'messages_blob',
'ordering': ['-created_at'],
},
),
migrations.CreateModel(
name='MailboxAccess',
name='MailDomain',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('permission', models.CharField(choices=[('read', 'Read'), ('edit', 'Edit'), ('send', 'Send'), ('delete', 'Delete'), ('admin', 'Admin')], default='read', max_length=20, verbose_name='permission')),
('mailbox', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.mailbox')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='mailbox_accesses', to=settings.AUTH_USER_MODEL)),
('name', models.CharField(max_length=253, unique=True, validators=[django.core.validators.RegexValidator(message='Enter a valid domain name. This value may contain only lowercase letters, numbers, dots and - characters.', regex='^[a-z0-9][a-z0-9.-]*[a-z0-9]$')], verbose_name='name')),
('oidc_autojoin', models.BooleanField(default=False, help_text='Create mailboxes automatically based on OIDC emails.', verbose_name='oidc autojoin')),
('identity_sync', models.BooleanField(default=False, help_text='Sync mailboxes to an identity provider.', verbose_name='Identity sync')),
('custom_attributes', models.JSONField(blank=True, default=None, help_text='Metadata to sync to the maildomain group in the identity provider.', null=True, verbose_name='Custom attributes')),
('alias_of', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='core.maildomain')),
],
options={
'verbose_name': 'mailbox access',
'verbose_name_plural': 'mailbox accesses',
'db_table': 'messages_mailboxaccess',
'verbose_name': 'mail domain',
'verbose_name_plural': 'mail domains',
'db_table': 'messages_maildomain',
},
),
migrations.AddField(
@@ -117,20 +158,24 @@ class Migration(migrations.Migration):
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.maildomain'),
),
migrations.CreateModel(
name='Thread',
name='DKIMKey',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('subject', models.CharField(max_length=255, verbose_name='subject')),
('snippet', models.TextField(blank=True, verbose_name='snippet')),
('is_read', models.BooleanField(default=False, verbose_name='is read')),
('mailbox', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='threads', to='core.mailbox')),
('selector', models.CharField(help_text="DKIM selector (e.g., 'default', 'mail')", max_length=255, verbose_name='selector')),
('private_key', encrypted_fields.fields.EncryptedTextField(help_text='DKIM private key in PEM format (encrypted)', verbose_name='private key')),
('public_key', models.TextField(help_text='DKIM public key for DNS record generation', verbose_name='public key')),
('algorithm', models.SmallIntegerField(choices=[(1, 'rsa'), (2, 'ed25519')], default=1, help_text='DKIM signing algorithm', verbose_name='algorithm')),
('key_size', models.PositiveIntegerField(help_text='Key size in bits (e.g., 2048, 4096 for RSA)', verbose_name='key size')),
('is_active', models.BooleanField(default=True, help_text='Whether this DKIM key is active and should be used for signing', verbose_name='is active')),
('domain', models.ForeignKey(help_text='Domain that owns this DKIM key', on_delete=django.db.models.deletion.CASCADE, related_name='dkim_keys', to='core.maildomain')),
],
options={
'verbose_name': 'thread',
'verbose_name_plural': 'threads',
'db_table': 'messages_thread',
'verbose_name': 'DKIM key',
'verbose_name_plural': 'DKIM keys',
'db_table': 'messages_dkimkey',
'ordering': ['-created_at'],
},
),
migrations.CreateModel(
@@ -139,19 +184,22 @@ class Migration(migrations.Migration):
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('subject', models.CharField(max_length=255, verbose_name='subject')),
('subject', models.CharField(blank=True, max_length=255, null=True, verbose_name='subject')),
('is_draft', models.BooleanField(default=False, verbose_name='is draft')),
('is_sender', models.BooleanField(default=False, verbose_name='is sender')),
('is_starred', models.BooleanField(default=False, verbose_name='is starred')),
('is_trashed', models.BooleanField(default=False, verbose_name='is trashed')),
('is_read', models.BooleanField(default=False, verbose_name='is read')),
('is_unread', models.BooleanField(default=False, verbose_name='is unread')),
('is_spam', models.BooleanField(default=False, verbose_name='is spam')),
('is_archived', models.BooleanField(default=False, verbose_name='is archived')),
('has_attachments', models.BooleanField(default=False, verbose_name='has attachments')),
('trashed_at', models.DateTimeField(blank=True, null=True, verbose_name='trashed at')),
('sent_at', models.DateTimeField(blank=True, null=True, verbose_name='sent at')),
('read_at', models.DateTimeField(blank=True, null=True, verbose_name='read at')),
('mta_sent', models.BooleanField(default=False, verbose_name='mta sent')),
('archived_at', models.DateTimeField(blank=True, null=True, verbose_name='archived at')),
('mime_id', models.CharField(blank=True, max_length=998, null=True, verbose_name='mime id')),
('raw_mime', models.BinaryField(blank=True, default=b'')),
('draft_body', models.TextField(blank=True, null=True, verbose_name='draft body')),
('blob', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='messages', to='core.blob')),
('draft_blob', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='drafts', to='core.blob')),
('parent', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='core.message')),
('sender', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='core.contact')),
('thread', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='messages', to='core.thread')),
@@ -163,17 +211,78 @@ class Migration(migrations.Migration):
'ordering': ['-created_at'],
},
),
migrations.CreateModel(
name='Attachment',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('name', models.CharField(help_text='Original filename of the attachment', max_length=255, verbose_name='file name')),
('blob', models.ForeignKey(help_text='Reference to the blob containing the attachment data', on_delete=django.db.models.deletion.CASCADE, related_name='attachments', to='core.blob')),
('mailbox', models.ForeignKey(help_text='Mailbox that owns this attachment', on_delete=django.db.models.deletion.CASCADE, related_name='attachments', to='core.mailbox')),
('messages', models.ManyToManyField(help_text='Messages that use this attachment', related_name='attachments', to='core.message')),
],
options={
'verbose_name': 'attachment',
'verbose_name_plural': 'attachments',
'db_table': 'messages_attachment',
'ordering': ['-created_at'],
},
),
migrations.AlterUniqueTogether(
name='contact',
unique_together={('email', 'mailbox')},
),
migrations.CreateModel(
name='MailboxAccess',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('role', models.SmallIntegerField(choices=[(1, 'viewer'), (2, 'editor'), (3, 'sender'), (4, 'admin')], default=1, verbose_name='role')),
('mailbox', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.mailbox')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='mailbox_accesses', to=settings.AUTH_USER_MODEL)),
],
options={
'verbose_name': 'mailbox access',
'verbose_name_plural': 'mailbox accesses',
'db_table': 'messages_mailboxaccess',
'unique_together': {('mailbox', 'user')},
},
),
migrations.AlterUniqueTogether(
name='mailbox',
unique_together={('local_part', 'domain')},
),
migrations.CreateModel(
name='MailDomainAccess',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('role', models.SmallIntegerField(choices=[(1, 'admin')], default=1, verbose_name='role')),
('maildomain', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.maildomain')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='maildomain_accesses', to=settings.AUTH_USER_MODEL)),
],
options={
'verbose_name': 'mail domain access',
'verbose_name_plural': 'mail domain accesses',
'db_table': 'messages_maildomainaccess',
'unique_together': {('maildomain', 'user')},
},
),
migrations.CreateModel(
name='MessageRecipient',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('type', models.SmallIntegerField(choices=[(1, 'to'), (2, 'cc'), (3, 'bcc')], default=1, verbose_name='type')),
('delivered_at', models.DateTimeField(blank=True, null=True, verbose_name='delivered at')),
('delivery_status', models.SmallIntegerField(blank=True, choices=[(1, 'internal'), (2, 'sent'), (3, 'failed'), (4, 'retry')], null=True, verbose_name='delivery status')),
('delivery_message', models.TextField(blank=True, null=True, verbose_name='delivery message')),
('retry_count', models.IntegerField(default=0, verbose_name='retry count')),
('retry_at', models.DateTimeField(blank=True, null=True, verbose_name='retry at')),
('contact', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='messages', to='core.contact')),
('message', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='recipients', to='core.message')),
],
@@ -184,4 +293,41 @@ class Migration(migrations.Migration):
'unique_together': {('message', 'contact', 'type')},
},
),
migrations.CreateModel(
name='Label',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('name', models.CharField(help_text="Name of the label/folder (can use slashes for hierarchy, e.g. 'Work/Projects')", max_length=255, verbose_name='name')),
('slug', models.SlugField(help_text='URL-friendly version of the name', max_length=255, verbose_name='slug')),
('color', models.CharField(default='#E3E3FD', help_text='Color of the label in hex format (e.g. #FF0000)', max_length=7, verbose_name='color')),
('mailbox', models.ForeignKey(help_text='Mailbox that owns this label', on_delete=django.db.models.deletion.CASCADE, related_name='labels', to='core.mailbox')),
('threads', models.ManyToManyField(blank=True, help_text='Threads that have this label', related_name='labels', to='core.thread')),
],
options={
'verbose_name': 'label',
'verbose_name_plural': 'labels',
'db_table': 'messages_label',
'ordering': ['slug'],
'unique_together': {('slug', 'mailbox')},
},
),
migrations.CreateModel(
name='ThreadAccess',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('role', models.SmallIntegerField(choices=[(1, 'viewer'), (2, 'editor')], default=1, verbose_name='role')),
('mailbox', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='thread_accesses', to='core.mailbox')),
('thread', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.thread')),
],
options={
'verbose_name': 'thread access',
'verbose_name_plural': 'thread accesses',
'db_table': 'messages_threadaccess',
'unique_together': {('thread', 'mailbox')},
},
),
]
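The `retry_count` / `retry_at` pair on `MessageRecipient` gives the new retry task (mentioned in the commit message) everything it needs to schedule redelivery. A minimal backoff sketch, where the doubling formula and the one-day cap are illustrative assumptions rather than the project's actual policy:

```python
from datetime import datetime, timedelta, timezone

def next_retry_at(retry_count: int, now: datetime,
                  base_minutes: int = 2, cap_minutes: int = 1440) -> datetime:
    """Exponential backoff: 2, 4, 8, ... minutes, capped at one day."""
    delay = min(base_minutes * (2 ** retry_count), cap_minutes)
    return now + timedelta(minutes=delay)

now = datetime(2025, 7, 15, tzinfo=timezone.utc)
print(next_retry_at(0, now))   # first retry 2 minutes out
print(next_retry_at(5, now))   # 64 minutes out
```

A recurring task would then pick up recipients whose `retry_at` is in the past and whose `delivery_status` is `retry`.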


@@ -1,51 +0,0 @@
# Generated by Django 5.1.8 on 2025-04-29 17:13
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0001_initial'),
]
operations = [
migrations.RemoveField(
model_name='message',
name='is_read',
),
migrations.RemoveField(
model_name='thread',
name='is_read',
),
migrations.AddField(
model_name='message',
name='is_unread',
field=models.BooleanField(default=False, verbose_name='is unread'),
),
migrations.AddField(
model_name='thread',
name='count_draft',
field=models.IntegerField(default=0, verbose_name='count draft'),
),
migrations.AddField(
model_name='thread',
name='count_sender',
field=models.IntegerField(default=0, verbose_name='count sender'),
),
migrations.AddField(
model_name='thread',
name='count_starred',
field=models.IntegerField(default=0, verbose_name='count starred'),
),
migrations.AddField(
model_name='thread',
name='count_trashed',
field=models.IntegerField(default=0, verbose_name='count trashed'),
),
migrations.AddField(
model_name='thread',
name='count_unread',
field=models.IntegerField(default=0, verbose_name='count unread'),
),
]


@@ -1,23 +0,0 @@
# Generated by Django 5.1.8 on 2025-04-29 19:58
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0002_remove_message_is_read_remove_thread_is_read_and_more'),
]
operations = [
migrations.AddField(
model_name='thread',
name='count_messages',
field=models.IntegerField(default=1, verbose_name='count messages'),
),
migrations.AddField(
model_name='thread',
name='messaged_at',
field=models.DateTimeField(blank=True, null=True, verbose_name='messaged at'),
),
]


@@ -1,18 +0,0 @@
# Generated by Django 5.1.8 on 2025-04-30 09:26
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0003_thread_count_messages_thread_messaged_at'),
]
operations = [
migrations.AddField(
model_name='thread',
name='sender_names',
field=models.JSONField(default=list, verbose_name='sender names'),
),
]


@@ -1,18 +0,0 @@
# Generated by Django 5.1.8 on 2025-04-30 09:30
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0004_thread_sender_names'),
]
operations = [
migrations.AlterField(
model_name='thread',
name='sender_names',
field=models.JSONField(blank=True, null=True, verbose_name='sender names'),
),
]


@@ -1,36 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-02 14:05
import django.db.models.deletion
import uuid
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0005_alter_thread_sender_names'),
]
operations = [
migrations.RemoveField(
model_name='thread',
name='mailbox',
),
migrations.CreateModel(
name='ThreadAccess',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('role', models.CharField(choices=[('viewer', 'Viewer'), ('editor', 'Editor')], default='viewer', max_length=20, verbose_name='role')),
('mailbox', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='thread_accesses', to='core.mailbox')),
('thread', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.thread')),
],
options={
'verbose_name': 'thread access',
'verbose_name_plural': 'thread accesses',
'db_table': 'messages_threadaccess',
'unique_together': {('thread', 'mailbox')},
},
),
]


@@ -1,26 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-12 13:13
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0006_remove_thread_mailbox_threadaccess'),
]
operations = [
migrations.AlterUniqueTogether(
name='mailboxaccess',
unique_together={('mailbox', 'user')},
),
migrations.AddField(
model_name='mailboxaccess',
name='role',
field=models.CharField(choices=[('viewer', 'Viewer'), ('editor', 'Editor'), ('admin', 'Admin')], default='viewer', max_length=20, verbose_name='role'),
),
migrations.RemoveField(
model_name='mailboxaccess',
name='permission',
),
]


@@ -1,42 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-05 16:36
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0007_alter_mailboxaccess_unique_together_and_more'),
]
operations = [
migrations.RemoveField(
model_name='message',
name='mta_sent',
),
migrations.AddField(
model_name='messagerecipient',
name='delivered_at',
field=models.DateTimeField(blank=True, null=True, verbose_name='delivered at'),
),
migrations.AddField(
model_name='messagerecipient',
name='delivery_message',
field=models.TextField(blank=True, null=True, verbose_name='delivery message'),
),
migrations.AddField(
model_name='messagerecipient',
name='delivery_status',
field=models.CharField(blank=True, choices=[('draft', 'Draft'), ('internal', 'Internal'), ('sent', 'Sent'), ('failed', 'Failed'), ('retry', 'Retry')], max_length=20, null=True, verbose_name='delivery status'),
),
migrations.AddField(
model_name='messagerecipient',
name='retry_at',
field=models.DateTimeField(blank=True, null=True, verbose_name='retry at'),
),
migrations.AddField(
model_name='messagerecipient',
name='retry_count',
field=models.IntegerField(default=0, verbose_name='retry count'),
),
]


@@ -1,39 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-09 14:08
import django.db.models.deletion
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0008_remove_message_mta_sent_and_more'),
]
operations = [
migrations.AddField(
model_name='mailbox',
name='alias_of',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='core.mailbox'),
),
migrations.AddField(
model_name='maildomain',
name='alias_of',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='core.maildomain'),
),
migrations.AddField(
model_name='maildomain',
name='oidc_autojoin',
field=models.BooleanField(default=False, help_text='Create mailboxes automatically based on OIDC emails.', verbose_name='oidc autojoin'),
),
migrations.AlterField(
model_name='maildomain',
name='name',
field=models.CharField(max_length=255, unique=True, verbose_name='name'),
),
migrations.AlterField(
model_name='messagerecipient',
name='delivery_status',
field=models.CharField(blank=True, choices=[('internal', 'Internal'), ('sent', 'Sent'), ('failed', 'Failed'), ('retry', 'Retry')], max_length=20, null=True, verbose_name='delivery status'),
),
]


@@ -1,36 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-13 12:04
import django.db.models.deletion
import uuid
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0009_mailbox_alias_of_maildomain_alias_of_and_more'),
]
operations = [
migrations.CreateModel(
name='Attachment',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('sha256', models.CharField(db_index=True, help_text='SHA-256 hash of the attachment content', max_length=64, verbose_name='sha256 hash')),
('size', models.PositiveIntegerField(help_text='Size of the attachment in bytes', verbose_name='file size')),
('name', models.CharField(help_text='Original filename of the attachment', max_length=255, verbose_name='file name')),
('content_type', models.CharField(help_text='MIME type of the attachment', max_length=255, verbose_name='content type')),
('raw_content', models.BinaryField(help_text='Binary content of the attachment, will be offloaded to object storage in the future', verbose_name='raw content')),
('mailbox', models.ForeignKey(help_text='Mailbox that owns this attachment', on_delete=django.db.models.deletion.CASCADE, related_name='attachments', to='core.mailbox')),
('messages', models.ManyToManyField(help_text='Messages that use this attachment', related_name='attachments', to='core.message')),
],
options={
'verbose_name': 'attachment',
'verbose_name_plural': 'attachments',
'db_table': 'messages_attachment',
'ordering': ['-created_at'],
},
),
]


@@ -1,56 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-13 13:56
import django.db.models.deletion
import uuid
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0010_attachment'),
]
operations = [
migrations.RemoveField(
model_name='attachment',
name='content_type',
),
migrations.RemoveField(
model_name='attachment',
name='raw_content',
),
migrations.RemoveField(
model_name='attachment',
name='sha256',
),
migrations.RemoveField(
model_name='attachment',
name='size',
),
migrations.CreateModel(
name='Blob',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('sha256', models.CharField(db_index=True, help_text='SHA-256 hash of the blob content', max_length=64, verbose_name='sha256 hash')),
('size', models.PositiveIntegerField(help_text='Size of the blob in bytes', verbose_name='file size')),
('type', models.CharField(help_text='MIME type of the blob', max_length=255, verbose_name='content type')),
('raw_content', models.BinaryField(help_text='Binary content of the blob, will be offloaded to object storage in the future', verbose_name='raw content')),
('mailbox', models.ForeignKey(help_text='Mailbox that owns this blob', on_delete=django.db.models.deletion.CASCADE, related_name='blobs', to='core.mailbox')),
],
options={
'verbose_name': 'blob',
'verbose_name_plural': 'blobs',
'db_table': 'messages_blob',
'ordering': ['-created_at'],
},
),
migrations.AddField(
model_name='attachment',
name='blob',
field=models.ForeignKey(default='', help_text='Reference to the blob containing the attachment data', on_delete=django.db.models.deletion.CASCADE, related_name='attachments', to='core.blob'),
preserve_default=False,
),
]


@@ -1,36 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-19 15:16
import django.db.models.deletion
from django.db import migrations, models
def migrate_mailbox_contacts(apps, schema_editor):
Mailbox = apps.get_model('core', 'Mailbox')
Contact = apps.get_model('core', 'Contact')
for mailbox in Mailbox.objects.filter(contact__isnull=True):
mailbox.contact = Contact.objects.create(email=str(mailbox), mailbox=mailbox)
mailbox.save()
def reverse_migrate_mailbox_contacts(apps, schema_editor):
Mailbox = apps.get_model('core', 'Mailbox')
for mailbox in Mailbox.objects.filter(contact__isnull=False):
mailbox.contact = None
mailbox.save()
class Migration(migrations.Migration):
dependencies = [
('core', '0011_remove_attachment_content_type_and_more'),
]
operations = [
migrations.RunSQL('CREATE EXTENSION IF NOT EXISTS unaccent;', 'DROP EXTENSION IF EXISTS unaccent;'),
migrations.AddField(
model_name='mailbox',
name='contact',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='mailboxes', to='core.contact'),
),
migrations.RunPython(
code=migrate_mailbox_contacts,
reverse_code=reverse_migrate_mailbox_contacts,
),
]


@@ -1,33 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-18 11:56
import django.db.models.deletion
import uuid
from django.conf import settings
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0012_mailbox_contact'),
]
operations = [
migrations.CreateModel(
name='MailDomainAccess',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('role', models.CharField(choices=[('ADMIN', 'Admin')], default='ADMIN', max_length=20, verbose_name='role')),
('maildomain', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='accesses', to='core.maildomain')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='maildomain_accesses', to=settings.AUTH_USER_MODEL)),
],
options={
'verbose_name': 'mail domain access',
'verbose_name_plural': 'mail domain accesses',
'db_table': 'messages_maildomainaccess',
'unique_together': {('maildomain', 'user')},
},
),
]


@@ -1,19 +0,0 @@
# Generated by Django 5.1.8 on 2025-05-21 20:06
import django.core.validators
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0013_maildomainaccess'),
]
operations = [
migrations.AlterField(
model_name='mailbox',
name='local_part',
field=models.CharField(max_length=255, validators=[django.core.validators.RegexValidator(regex='^[a-zA-Z0-9_.-]+$')], verbose_name='local part'),
),
]


@@ -1,35 +0,0 @@
# Generated by Django 5.1.8 on 2025-06-10 08:55
import django.db.models.deletion
import uuid
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0014_alter_mailbox_local_part'),
]
operations = [
migrations.CreateModel(
name='Label',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, help_text='primary key for the record as UUID', primary_key=True, serialize=False, verbose_name='id')),
('created_at', models.DateTimeField(auto_now_add=True, help_text='date and time at which a record was created', verbose_name='created on')),
('updated_at', models.DateTimeField(auto_now=True, help_text='date and time at which a record was last updated', verbose_name='updated on')),
('name', models.CharField(help_text="Name of the label/folder (can use slashes for hierarchy, e.g. 'Work/Projects')", max_length=255, verbose_name='name')),
('slug', models.SlugField(help_text='URL-friendly version of the name', max_length=255, verbose_name='slug')),
('color', models.CharField(default='#E3E3FD', help_text='Color of the label in hex format (e.g. #FF0000)', max_length=7, verbose_name='color')),
('mailbox', models.ForeignKey(help_text='Mailbox that owns this label', on_delete=django.db.models.deletion.CASCADE, related_name='labels', to='core.mailbox')),
('threads', models.ManyToManyField(blank=True, help_text='Threads that have this label', related_name='labels', to='core.thread')),
],
options={
'verbose_name': 'label',
'verbose_name_plural': 'labels',
'db_table': 'messages_label',
'ordering': ['name'],
'unique_together': {('slug', 'mailbox')},
},
),
]


@@ -1,23 +0,0 @@
# Generated by Django 5.1.8 on 2025-06-16 07:36
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0015_label'),
]
operations = [
migrations.AddField(
model_name='maildomain',
name='identity_group_metadata',
field=models.JSONField(blank=True, default=None, help_text='Metadata to sync to the maildomain group in the identity provider.', null=True, verbose_name='Metadata to sync to the maildomain group in the identity provider'),
),
migrations.AddField(
model_name='maildomain',
name='identity_sync',
field=models.BooleanField(default=False, help_text='Sync mailboxes to identity provider.', verbose_name='Identity sync'),
),
]


@@ -1,92 +0,0 @@
# Generated by Django 5.1.8 on 2025-06-16 14:35
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0016_maildomain_identity_group_metadata_and_more'),
]
operations = [
migrations.RemoveField(
model_name='thread',
name='count_draft',
),
migrations.RemoveField(
model_name='thread',
name='count_messages',
),
migrations.RemoveField(
model_name='thread',
name='count_sender',
),
migrations.RemoveField(
model_name='thread',
name='count_starred',
),
migrations.RemoveField(
model_name='thread',
name='count_trashed',
),
migrations.RemoveField(
model_name='thread',
name='count_unread',
),
migrations.AddField(
model_name='message',
name='archived_at',
field=models.DateTimeField(blank=True, null=True, verbose_name='archived at'),
),
migrations.AddField(
model_name='message',
name='is_archived',
field=models.BooleanField(default=False, verbose_name='is archived'),
),
migrations.AddField(
model_name='message',
name='is_spam',
field=models.BooleanField(default=False, verbose_name='is spam'),
),
migrations.AddField(
model_name='thread',
name='has_active',
field=models.BooleanField(default=False, verbose_name='has active'),
),
migrations.AddField(
model_name='thread',
name='has_draft',
field=models.BooleanField(default=False, verbose_name='has draft'),
),
migrations.AddField(
model_name='thread',
name='has_messages',
field=models.BooleanField(default=True, verbose_name='has messages'),
),
migrations.AddField(
model_name='thread',
name='has_sender',
field=models.BooleanField(default=False, verbose_name='has sender'),
),
migrations.AddField(
model_name='thread',
name='has_starred',
field=models.BooleanField(default=False, verbose_name='has starred'),
),
migrations.AddField(
model_name='thread',
name='has_trashed',
field=models.BooleanField(default=False, verbose_name='has trashed'),
),
migrations.AddField(
model_name='thread',
name='has_unread',
field=models.BooleanField(default=False, verbose_name='has unread'),
),
migrations.AddField(
model_name='thread',
name='is_spam',
field=models.BooleanField(default=False, verbose_name='is spam'),
),
]


@@ -1,27 +0,0 @@
# Generated by Django 5.1.8 on 2025-07-01 09:27
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0017_remove_thread_count_draft_and_more'),
]
operations = [
migrations.AlterModelOptions(
name='label',
options={'ordering': ['slug'], 'verbose_name': 'label', 'verbose_name_plural': 'labels'},
),
migrations.AlterField(
model_name='message',
name='subject',
field=models.CharField(blank=True, null=True, max_length=255, verbose_name='subject'),
),
migrations.AlterField(
model_name='thread',
name='subject',
field=models.CharField(blank=True, null=True, max_length=255, verbose_name='subject'),
),
]


@@ -4,6 +4,7 @@ Declare and configure the models for the messages core application
# pylint: disable=too-many-lines,too-many-instance-attributes
import base64
import hashlib
import uuid
from logging import getLogger
from typing import Any, Dict, List, Optional
@@ -16,9 +17,13 @@ from django.db import models
from django.utils.text import slugify
from django.utils.translation import gettext_lazy as _
import pyzstd
from encrypted_fields.fields import EncryptedTextField
from timezone_field import TimeZoneField
from core.enums import (
CompressionTypeChoices,
DKIMAlgorithmChoices,
MailboxRoleChoices,
MailDomainAccessRoleChoices,
MessageDeliveryStatusChoices,
@@ -26,6 +31,7 @@ from core.enums import (
ThreadAccessRoleChoices,
)
from core.mda.rfc5322 import parse_email_message
from core.mda.signing import generate_dkim_key as _generate_dkim_key
logger = getLogger(__name__)
@@ -129,8 +135,7 @@ class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
null=True,
)
full_name = models.CharField(_("full name"), max_length=255, null=True, blank=True)
email = models.EmailField(_("identity email address"), blank=True, null=True)
@@ -153,11 +158,6 @@ class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
default=settings.TIME_ZONE,
help_text=_("The timezone in which the user wants to see times."),
)
is_device = models.BooleanField(
_("device"),
default=False,
help_text=_("Whether the user is a device or a real user."),
)
is_staff = models.BooleanField(
_("staff status"),
default=False,
@@ -172,6 +172,14 @@ class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
),
)
custom_attributes = models.JSONField(
_("Custom attributes"),
default=None,
null=True,
blank=True,
help_text=_("Metadata to sync to the user in the identity provider."),
)
objects = UserManager()
USERNAME_FIELD = "admin_email"
@@ -199,7 +207,17 @@ class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
class MailDomain(BaseModel):
"""Mail domain model to store mail domain information."""
name_validator = validators.RegexValidator(
regex=r"^[a-z0-9][a-z0-9.-]*[a-z0-9]$",
message=_(
"Enter a valid domain name. This value may contain only lowercase "
"letters, numbers, dots and - characters."
),
)
name = models.CharField(
_("name"), max_length=253, unique=True, validators=[name_validator]
)
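Note that the validator regex only constrains the first and last characters; anything from `[a-z0-9.-]` is allowed in between, so strings like `a..b` still pass. The pattern can be exercised on its own:

```python
import re

# Same pattern as MailDomain.name_validator above.
DOMAIN_RE = re.compile(r"^[a-z0-9][a-z0-9.-]*[a-z0-9]$")

for candidate in ["example.com", "sub.domain-name.org", "Example.com", "-bad.com", "a..b"]:
    print(candidate, bool(DOMAIN_RE.fullmatch(candidate)))
```

Uppercase letters and a leading hyphen are rejected, but consecutive dots are not; a stricter per-label check would need a more elaborate pattern.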
alias_of = models.ForeignKey(
"self", on_delete=models.SET_NULL, null=True, blank=True
@@ -214,12 +232,11 @@ class MailDomain(BaseModel):
identity_sync = models.BooleanField(
_("Identity sync"),
default=False,
help_text=_("Sync mailboxes to an identity provider."),
)
custom_attributes = models.JSONField(
_("Custom attributes"),
default=None,
null=True,
blank=True,
@@ -238,24 +255,35 @@ class MailDomain(BaseModel):
def get_expected_dns_records(self) -> List[str]:
"""Get the list of DNS records we expect to be present for this domain."""
technical_domain = settings.MESSAGES_TECHNICAL_DOMAIN
records = [
{"target": "", "type": "mx", "value": f"10 mx1.{technical_domain}."},
{"target": "", "type": "mx", "value": f"20 mx2.{technical_domain}."},
{
"target": "",
"type": "txt",
"value": f"v=spf1 include:_spf.{technical_domain} -all",
},
{
"target": "_dmarc",
"type": "txt",
"value": "v=DMARC1; p=reject; adkim=s; aspf=s;",
},
]
# Add DKIM record if we have an active DKIM key
dkim_key = self.get_active_dkim_key()
if dkim_key:
records.append(
{
"target": f"{dkim_key.selector}._domainkey",
"type": "txt",
"value": dkim_key.get_dns_record_value(),
}
)
return records
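The DKIM TXT value appended above comes from `DKIMKey.get_dns_record_value()`, which is not shown in this hunk. Assuming it follows the standard RFC 6376 shape (`v=DKIM1; k=...; p=<base64 public key>`), the formatting can be sketched as:

```python
import base64

def dkim_txt_value(public_key_der: bytes, algorithm: str = "rsa") -> str:
    """Build an RFC 6376 style TXT record value from a DER-encoded public key."""
    p = base64.b64encode(public_key_der).decode("ascii")
    return f"v=DKIM1; k={algorithm}; p={p}"

# Illustrative bytes only; a real RSA SubjectPublicKeyInfo is ~270 bytes.
print(dkim_txt_value(b"\x30\x82\x01\x0a"))  # v=DKIM1; k=rsa; p=MIIBCg==
```

The record is then published at `{selector}._domainkey.{domain}` as built in `get_expected_dns_records`.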
def get_abilities(self, user):
@@ -296,13 +324,50 @@ class MailDomain(BaseModel):
"manage_mailboxes": is_admin,
}
def generate_dkim_key(
self,
selector: str = settings.MESSAGES_DKIM_DEFAULT_SELECTOR,
algorithm: DKIMAlgorithmChoices = DKIMAlgorithmChoices.RSA,
key_size: int = 2048,
) -> "DKIMKey":
"""
Generate and create a new DKIM key for this domain.
Args:
selector: The DKIM selector (e.g., 'default', 'mail')
algorithm: The signing algorithm
key_size: The key size in bits (e.g., 2048, 4096 for RSA)
Returns:
The created DKIMKey instance
"""
# Generate private and public keys
private_key, public_key = _generate_dkim_key(algorithm, key_size=key_size)
return DKIMKey.objects.create(
selector=selector,
private_key=private_key,
public_key=public_key,
algorithm=algorithm,
key_size=key_size,
is_active=True,
domain=self,
)
def get_active_dkim_key(self):
"""Get the most recent active DKIM key for this domain."""
return (
DKIMKey.objects.filter(
domain=self, is_active=True
).first() # Most recent due to ordering in model
)
class Mailbox(BaseModel):
"""Mailbox model to store mailbox information."""
local_part = models.CharField(
_("local part"),
max_length=64,
validators=[validators.RegexValidator(regex=r"^[a-zA-Z0-9_.-]+$")],
)
domain = models.ForeignKey("MailDomain", on_delete=models.CASCADE)
@@ -343,6 +408,72 @@ class Mailbox(BaseModel):
accesses__role=ThreadAccessRoleChoices.EDITOR,
)
def create_blob(
self,
content: bytes,
content_type: str,
compression: Optional[CompressionTypeChoices] = CompressionTypeChoices.ZSTD,
) -> "Blob":
"""
Create a new blob with automatic SHA256 calculation and compression.
Args:
content: Raw binary content to store
content_type: MIME type of the content
compression: Compression type to use (defaults to ZSTD)
Returns:
The created Blob instance
Raises:
ValueError: If content is empty
"""
if not content:
raise ValueError("Content cannot be empty")
# Calculate SHA256 hash of the original content
sha256_hash = hashlib.sha256(content).digest()
# Store the original size
original_size = len(content)
# Apply compression if requested
compressed_content = content
if compression == CompressionTypeChoices.ZSTD:
compressed_content = pyzstd.compress(
content, level_or_option=settings.MESSAGES_BLOB_ZSTD_LEVEL
)
logger.debug(
"Compressed blob from %d bytes to %d bytes (%.1f%% reduction)",
original_size,
len(compressed_content),
(1 - len(compressed_content) / original_size) * 100,
)
elif compression == CompressionTypeChoices.NONE:
compressed_content = content
else:
raise ValueError(f"Unsupported compression type: {compression}")
# Create the blob
blob = Blob.objects.create(
sha256=sha256_hash,
size=original_size,
content_type=content_type,
compression=compression,
raw_content=compressed_content,
mailbox=self,
)
logger.info(
"Created blob %s: %d bytes, %s compression, %s content type",
blob.id,
original_size,
compression.label,
content_type,
)
return blob
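The read path has to undo `create_blob`: decompress when the blob was stored with ZSTD, then verify the stored SHA-256. A sketch of that counterpart (`decompress_blob` is a hypothetical helper, not part of the model shown here, and `zlib` stands in for `pyzstd` so the example is dependency-free):

```python
import hashlib
import zlib  # stand-in for pyzstd, purely so this sketch runs without extra deps

def decompress_blob(raw_content: bytes, compressed: bool, expected_sha256: bytes) -> bytes:
    """Reverse of create_blob: decompress, then verify against the stored hash."""
    content = zlib.decompress(raw_content) if compressed else raw_content
    if hashlib.sha256(content).digest() != expected_sha256:
        raise ValueError("blob content does not match its sha256 hash")
    return content

original = b"hello " * 1000
stored = zlib.compress(original, 6)
assert decompress_blob(stored, True, hashlib.sha256(original).digest()) == original
```

Verifying the hash on read catches silent corruption, since `sha256` is computed over the uncompressed content at write time.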
class MailboxAccess(BaseModel):
"""Mailbox access model to store mailbox access information."""
@@ -353,9 +484,8 @@ class MailboxAccess(BaseModel):
user = models.ForeignKey(
"User", on_delete=models.CASCADE, related_name="mailbox_accesses"
)
role = models.SmallIntegerField(
_("role"),
choices=MailboxRoleChoices.choices,
default=MailboxRoleChoices.VIEWER,
)
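Switching `role` from `CharField` to `SmallIntegerField` stores a two-byte integer instead of a short string on these high-volume tables. Outside Django, the mapping implied by the migration's choices can be modeled with a plain `IntEnum` (a sketch; the real `MailboxRoleChoices` lives in `core.enums`):

```python
from enum import IntEnum

class MailboxRole(IntEnum):
    # Values match the migration's choices: [(1, 'viewer'), (2, 'editor'),
    # (3, 'sender'), (4, 'admin')].
    VIEWER = 1
    EDITOR = 2
    SENDER = 3
    ADMIN = 4

# The integer is what gets stored; the label stays in application code.
print(MailboxRole(3).name.lower())  # sender
```

The trade-off is that raw rows are no longer self-describing; reading the database directly requires the enum to interpret the values.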
@@ -381,6 +511,7 @@ class Thread(BaseModel):
has_starred = models.BooleanField(_("has starred"), default=False)
has_sender = models.BooleanField(_("has sender"), default=False)
has_messages = models.BooleanField(_("has messages"), default=True)
has_attachments = models.BooleanField(_("has attachments"), default=False)
is_spam = models.BooleanField(_("is spam"), default=False)
has_active = models.BooleanField(_("has active"), default=False)
messaged_at = models.DateTimeField(_("messaged at"), null=True, blank=True)
@@ -407,6 +538,7 @@ class Thread(BaseModel):
"is_sender",
"is_spam",
"is_archived",
"has_attachments",
"created_at",
"sender__name",
)
@@ -421,6 +553,7 @@ class Thread(BaseModel):
self.has_starred = False
self.has_sender = False
self.has_messages = False
self.has_attachments = False
self.is_spam = False
self.has_active = False
self.messaged_at = None
@@ -441,6 +574,9 @@ class Thread(BaseModel):
msg["is_sender"] and not msg["is_trashed"] and not msg["is_draft"]
for msg in message_data
)
self.has_attachments = any(
msg["has_attachments"] and not msg["is_trashed"] for msg in message_data
)
# Check if we have any non-trashed, non-spam messages
active_messages = [
@@ -506,6 +642,7 @@ class Thread(BaseModel):
"has_starred",
"has_sender",
"has_messages",
"has_attachments",
"is_spam",
"has_active",
"messaged_at",
@@ -654,9 +791,8 @@ class ThreadAccess(BaseModel):
mailbox = models.ForeignKey(
"Mailbox", on_delete=models.CASCADE, related_name="thread_accesses"
)
role = models.CharField(
role = models.SmallIntegerField(
_("role"),
max_length=20,
choices=ThreadAccessRoleChoices.choices,
default=ThreadAccessRoleChoices.VIEWER,
)
@@ -706,17 +842,15 @@ class MessageRecipient(BaseModel):
contact = models.ForeignKey(
"Contact", on_delete=models.CASCADE, related_name="messages"
)
type = models.CharField(
type = models.SmallIntegerField(
_("type"),
max_length=20,
choices=MessageRecipientTypeChoices.choices,
default=MessageRecipientTypeChoices.TO,
)
delivered_at = models.DateTimeField(_("delivered at"), null=True, blank=True)
delivery_status = models.CharField(
delivery_status = models.SmallIntegerField(
_("delivery status"),
max_length=20,
null=True,
blank=True,
choices=MessageDeliveryStatusChoices.choices,
@@ -755,6 +889,7 @@ class Message(BaseModel):
is_unread = models.BooleanField(_("is unread"), default=False)
is_spam = models.BooleanField(_("is spam"), default=False)
is_archived = models.BooleanField(_("is archived"), default=False)
has_attachments = models.BooleanField(_("has attachments"), default=False)
trashed_at = models.DateTimeField(_("trashed at"), null=True, blank=True)
sent_at = models.DateTimeField(_("sent at"), null=True, blank=True)
@@ -763,13 +898,22 @@ class Message(BaseModel):
mime_id = models.CharField(_("mime id"), max_length=998, null=True, blank=True)
# Stores the raw MIME message. This will be optimized and offloaded
# to object storage in the future.
raw_mime = models.BinaryField(blank=True, default=b"")
# Stores the raw MIME message.
blob = models.ForeignKey(
"Blob",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="messages",
)
# Store the draft body as arbitrary JSON text. Might be offloaded
# somewhere else as well.
draft_body = models.TextField(_("draft body"), blank=True, null=True)
draft_blob = models.ForeignKey(
"Blob",
on_delete=models.SET_NULL,
null=True,
blank=True,
related_name="drafts",
)
# Internal cache for parsed data
_parsed_email_cache: Optional[Dict[str, Any]] = None
@@ -784,12 +928,12 @@ class Message(BaseModel):
return str(self.subject) if self.subject else "(no subject)"
def get_parsed_data(self) -> Dict[str, Any]:
"""Parse raw_mime using parser and cache the result."""
"""Parse raw mime message using parser and cache the result."""
if self._parsed_email_cache is not None:
return self._parsed_email_cache
if self.raw_mime:
self._parsed_email_cache = parse_email_message(self.raw_mime)
if self.blob:
self._parsed_email_cache = parse_email_message(self.blob.get_content())
else:
self._parsed_email_cache = {}
return self._parsed_email_cache
@@ -819,28 +963,34 @@ class Blob(BaseModel):
This model follows the JMAP blob design, storing raw content that can
be referenced by multiple attachments.
This will be offloaded to object storage in the future.
"""
sha256 = models.CharField(
sha256 = models.BinaryField(
_("sha256 hash"),
max_length=64,
max_length=32,
db_index=True,
help_text=_("SHA-256 hash of the blob content"),
help_text=_("SHA-256 hash of the uncompressed blob content"),
)
size = models.PositiveIntegerField(
_("file size"), help_text=_("Size of the blob in bytes")
)
type = models.CharField(
_("content type"), max_length=255, help_text=_("MIME type of the blob")
content_type = models.CharField(
_("content type"), max_length=127, help_text=_("MIME type of the blob")
)
compression = models.SmallIntegerField(
_("compression"),
choices=CompressionTypeChoices.choices,
default=CompressionTypeChoices.NONE,
)
raw_content = models.BinaryField(
_("raw content"),
help_text=_(
"Binary content of the blob, will be offloaded to object storage in the future"
),
help_text=_("Compressed binary content of the blob"),
)
mailbox = models.ForeignKey(
@@ -859,6 +1009,22 @@ class Blob(BaseModel):
def __str__(self):
return f"Blob {self.id} ({self.size} bytes)"
def get_content(self) -> bytes:
"""
Get the decompressed content of this blob.
Returns:
The decompressed content
Raises:
ValueError: If the blob compression type is not supported
"""
if self.compression == CompressionTypeChoices.NONE:
return self.raw_content
if self.compression == CompressionTypeChoices.ZSTD:
return pyzstd.decompress(self.raw_content)
raise ValueError(f"Unsupported compression type: {self.compression}")
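The compress-on-write / decompress-on-read pattern above can be sketched outside Django. This standalone sketch substitutes the stdlib `zlib` for pyzstd (an assumption made purely for portability) but preserves the key invariant of the new `Blob` model: `sha256` and `size` always describe the *uncompressed* content, while `raw_content` holds the compressed bytes.

```python
import hashlib
import zlib


def make_blob(content: bytes) -> dict:
    """Store compressed bytes, but hash and size the uncompressed content."""
    return {
        "sha256": hashlib.sha256(content).digest(),  # 32 raw bytes, matching the BinaryField
        "size": len(content),
        "raw_content": zlib.compress(content),
    }


def get_content(blob: dict) -> bytes:
    """Decompress on read, mirroring Blob.get_content()."""
    return zlib.decompress(blob["raw_content"])


blob = make_blob(b"hello " * 100)
assert get_content(blob) == b"hello " * 100
assert blob["size"] == 600  # uncompressed size, not the stored size
```

Hashing the uncompressed content means deduplication and integrity checks stay valid even if the compression codec changes later.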
class Attachment(BaseModel):
"""Attachment model to link messages with blobs."""
@@ -901,7 +1067,7 @@ class Attachment(BaseModel):
@property
def content_type(self):
"""Return the content type of the associated blob."""
return self.blob.type
return self.blob.content_type
@property
def size(self) -> int:
@@ -923,9 +1089,8 @@ class MailDomainAccess(BaseModel):
user = models.ForeignKey(
"User", on_delete=models.CASCADE, related_name="maildomain_accesses"
)
role = models.CharField(
role = models.SmallIntegerField(
_("role"),
max_length=20,
choices=MailDomainAccessRoleChoices.choices,
default=MailDomainAccessRoleChoices.ADMIN,
)
@@ -938,3 +1103,66 @@ class MailDomainAccess(BaseModel):
def __str__(self):
return f"Access to {self.maildomain} for {self.user} with {self.role} role"
class DKIMKey(BaseModel):
"""DKIM Key model to store DKIM signing keys with encrypted private key storage."""
selector = models.CharField(
_("selector"),
max_length=255,
help_text=_("DKIM selector (e.g., 'default', 'mail')"),
)
private_key = EncryptedTextField(
_("private key"),
help_text=_("DKIM private key in PEM format (encrypted)"),
)
public_key = models.TextField(
_("public key"),
help_text=_("DKIM public key for DNS record generation"),
)
algorithm = models.SmallIntegerField(
_("algorithm"),
choices=DKIMAlgorithmChoices.choices,
default=DKIMAlgorithmChoices.RSA,
help_text=_("DKIM signing algorithm"),
)
key_size = models.PositiveIntegerField(
_("key size"),
help_text=_("Key size in bits (e.g., 2048, 4096 for RSA)"),
)
is_active = models.BooleanField(
_("is active"),
default=True,
help_text=_("Whether this DKIM key is active and should be used for signing"),
)
domain = models.ForeignKey(
"MailDomain",
on_delete=models.CASCADE,
related_name="dkim_keys",
help_text=_("Domain that owns this DKIM key"),
)
class Meta:
db_table = "messages_dkimkey"
verbose_name = _("DKIM key")
verbose_name_plural = _("DKIM keys")
ordering = ["-created_at"] # Most recent first for picking latest active key
def __str__(self):
return f"DKIM Key {self.selector} ({self.algorithm}) - {self.domain}"
def get_private_key_bytes(self) -> bytes:
"""Get the private key as bytes."""
return self.private_key.encode("utf-8")
def get_dns_record_value(self) -> str:
"""Get the DNS TXT record value for this DKIM key."""
algorithm_enum = DKIMAlgorithmChoices(self.algorithm)
return f"v=DKIM1; k={algorithm_enum.label}; p={self.public_key}"

View File

@@ -1,9 +1,9 @@
"""Elasticsearch search functionality for messages."""
"""OpenSearch search functionality for messages."""
from core.search.index import (
create_index_if_not_exists,
delete_index,
get_es_client,
get_opensearch_client,
index_message,
index_thread,
reindex_all,
@@ -19,7 +19,7 @@ __all__ = [
"MESSAGE_INDEX",
"MESSAGE_MAPPING",
# Client & Index management
"get_es_client",
"get_opensearch_client",
"create_index_if_not_exists",
"delete_index",
# Indexing

View File

@@ -1,12 +1,12 @@
"""Elasticsearch client and indexing functionality."""
"""OpenSearch client and indexing functionality."""
# pylint: disable=unexpected-keyword-arg
import logging
from django.conf import settings
from elasticsearch import Elasticsearch
from elasticsearch.exceptions import NotFoundError
from opensearchpy import OpenSearch
from opensearchpy.exceptions import NotFoundError
from core import enums, models
from core.mda.rfc5322 import parse_email_message
@@ -15,32 +15,35 @@ from core.search.mapping import MESSAGE_INDEX, MESSAGE_MAPPING
logger = logging.getLogger(__name__)
# Elasticsearch client instantiation
def get_es_client():
"""Get Elasticsearch client instance."""
if not hasattr(get_es_client, "cached_client"):
get_es_client.cached_client = Elasticsearch(hosts=settings.ELASTICSEARCH_HOSTS)
return get_es_client.cached_client
# OpenSearch client instantiation
def get_opensearch_client():
"""Get OpenSearch client instance."""
if not hasattr(get_opensearch_client, "cached_client"):
kwargs = {"hosts": settings.OPENSEARCH_HOSTS}
if settings.OPENSEARCH_CA_CERTS:
kwargs["ca_certs"] = settings.OPENSEARCH_CA_CERTS
get_opensearch_client.cached_client = OpenSearch(**kwargs)
return get_opensearch_client.cached_client
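The function-attribute memoization used by `get_opensearch_client` is a small pattern worth isolating: the client is built once and cached on the function object itself, so every caller shares one connection pool. This sketch swaps in a dummy factory (an assumption, so it runs without an OpenSearch server):

```python
def get_client(factory=lambda: object()):
    """Build the client once and cache it on the function object."""
    if not hasattr(get_client, "cached_client"):
        get_client.cached_client = factory()
    return get_client.cached_client


a = get_client()
b = get_client()
assert a is b  # the factory ran only once
```

Compared to a module-level global, this keeps the cache next to the code that owns it, and tests can reset it with `del get_client.cached_client`.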
def create_index_if_not_exists():
"""Create ES indices if they don't exist."""
es = get_es_client()
es = get_opensearch_client()
# Check if the index exists
if not es.indices.exists(index=MESSAGE_INDEX):
# Create the index with our mapping
es.indices.create(index=MESSAGE_INDEX, **MESSAGE_MAPPING)
logger.info("Created Elasticsearch index: %s", MESSAGE_INDEX)
es.indices.create(index=MESSAGE_INDEX, body=MESSAGE_MAPPING)
logger.info("Created OpenSearch index: %s", MESSAGE_INDEX)
return True
def delete_index():
"""Delete the messages index."""
es = get_es_client()
es = get_opensearch_client()
try:
es.indices.delete(index=MESSAGE_INDEX)
logger.info("Deleted Elasticsearch index: %s", MESSAGE_INDEX)
logger.info("Deleted OpenSearch index: %s", MESSAGE_INDEX)
return True
except NotFoundError:
logger.warning("Index %s not found, nothing to delete", MESSAGE_INDEX)
@@ -49,16 +52,16 @@ def delete_index():
def index_message(message: models.Message) -> bool:
"""Index a single message."""
es = get_es_client()
es = get_opensearch_client()
# Parse message content if it has raw MIME
# Parse message content if it has a blob
parsed_data = {}
if message.raw_mime:
if message.blob:
try:
parsed_data = parse_email_message(message.raw_mime)
parsed_data = parse_email_message(message.blob.get_content())
# pylint: disable=broad-exception-caught
except Exception as e: # noqa: BLE001
logger.error("Error parsing raw MIME for message %s: %s", message.id, e)
except Exception as e:
logger.error("Error parsing blob content for message %s: %s", message.id, e)
return False
# Extract text content from parsed data
@@ -138,19 +141,19 @@ def index_message(message: models.Message) -> bool:
index=MESSAGE_INDEX,
id=str(message.id),
routing=str(message.thread_id), # Ensure parent-child routing
document=doc,
body=doc,
)
logger.debug("Indexed message %s", message.id)
return True
# pylint: disable=broad-exception-caught
except Exception as e: # noqa: BLE001
except Exception as e:
logger.error("Error indexing message %s: %s", message.id, e)
return False
def index_thread(thread: models.Thread) -> bool:
"""Index a thread and all its messages."""
es = get_es_client()
es = get_opensearch_client()
# Get mailbox IDs that have access to this thread
mailbox_ids = list(thread.accesses.values_list("mailbox__id", flat=True))
@@ -166,7 +169,7 @@ def index_thread(thread: models.Thread) -> bool:
try:
# Index thread as parent document
# pylint: disable=no-value-for-parameter
es.index(index=MESSAGE_INDEX, id=str(thread.id), document=thread_doc)
es.index(index=MESSAGE_INDEX, id=str(thread.id), body=thread_doc)
# Index all messages in the thread
messages = thread.messages.all()
@@ -177,7 +180,7 @@ def index_thread(thread: models.Thread) -> bool:
return success
# pylint: disable=broad-exception-caught
except Exception as e: # noqa: BLE001
except Exception as e:
logger.error("Error indexing thread %s: %s", thread.id, e)
return False
@@ -233,7 +236,7 @@ def reindex_mailbox(mailbox_id: str):
return {"status": "error", "mailbox": mailbox_id, "error": "Mailbox not found"}
# pylint: disable=broad-exception-caught
except Exception as e: # noqa: BLE001
except Exception as e:
logger.error("Error reindexing mailbox %s: %s", mailbox_id, e)
return {"status": "error", "mailbox": mailbox_id, "error": str(e)}
@@ -253,5 +256,5 @@ def reindex_thread(thread_id: str):
except models.Thread.DoesNotExist:
return {"status": "error", "thread": thread_id, "error": "Thread not found"}
# pylint: disable=broad-exception-caught
except Exception as e: # noqa: BLE001
except Exception as e:
return {"status": "error", "thread": thread_id, "error": str(e)}

View File

@@ -1,4 +1,4 @@
"""Elasticsearch index and mapping configuration."""
"""OpenSearch index and mapping configuration."""
# Index name constants
MESSAGE_INDEX = "messages"

View File

@@ -7,7 +7,7 @@ from typing import Any, Dict, Optional
from django.conf import settings
from core.search.index import get_es_client
from core.search.index import get_opensearch_client
from core.search.mapping import MESSAGE_INDEX
from core.search.parse import parse_search_query
@@ -36,13 +36,13 @@ def search_threads(
Returns:
Dictionary with thread search results: {"threads": [...], "total": int, "from": int, "size": int}
"""
# Check if Elasticsearch is enabled
if not getattr(settings, "ELASTICSEARCH_INDEX_THREADS", True):
logger.debug("Elasticsearch search is disabled, returning empty results")
# Check if OpenSearch is enabled
if not getattr(settings, "OPENSEARCH_INDEX_THREADS", True):
logger.debug("OpenSearch search is disabled, returning empty results")
return {"threads": [], "total": 0, "from": from_offset, "size": size}
try:
es = get_es_client()
es = get_opensearch_client()
# Parse the query for modifiers
parsed_query = parse_search_query(query)
@@ -50,7 +50,7 @@ def search_threads(
# Build the search query
search_body = {
"query": {"bool": {"must": [], "should": [], "filter": []}},
"from_": from_offset,
"from": from_offset,
"size": size,
"sort": [{"created_at": {"order": "desc"}}],
}
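The `from_` → `from` rename above matters because the pagination offset now travels inside the request body, where OpenSearch expects a plain `from` key (the trailing underscore was only ever a workaround for passing it as a Python keyword argument). A minimal body builder, as a sketch:

```python
def build_search_body(from_offset: int = 0, size: int = 20) -> dict:
    """Pagination lives under 'from'/'size' inside the body for OpenSearch."""
    return {
        "query": {"bool": {"must": [], "should": [], "filter": []}},
        "from": from_offset,  # plain 'from' in the body; no kwarg escaping needed
        "size": size,
        "sort": [{"created_at": {"order": "desc"}}],
    }


body = build_search_body(40, 20)
assert body["from"] == 40 and body["size"] == 20
```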
@@ -189,7 +189,7 @@ def search_threads(
# Execute search
# pylint: disable=unexpected-keyword-arg
results = es.search(index=MESSAGE_INDEX, **search_body)
results = es.search(index=MESSAGE_INDEX, body=search_body)
if profile:
logger.debug("Search body: %s", json.dumps(search_body, indent=2))
@@ -209,7 +209,7 @@ def search_threads(
):
total = hits["total"]["value"]
elif "total" in hits and isinstance(hits["total"], int):
# Handle older Elasticsearch versions
# Handle older OpenSearch versions
total = hits["total"]
thread_ids = set()
@@ -232,7 +232,7 @@ def search_threads(
}
# pylint: disable=broad-exception-caught
except Exception as e: # noqa: BLE001
except Exception as e:
logger.error("Error searching threads: %s", e)
return {
"threads": [],

View File

@@ -42,8 +42,8 @@ class ImportService:
return False, {"detail": "You do not have access to this mailbox."}
try:
file_content = file.raw_content
content_type = file.type
file_content = file.get_content()
content_type = file.content_type
# Check MIME type for MBOX
if content_type in [

View File

@@ -12,12 +12,19 @@ from core.identity.keycloak import (
sync_mailbox_to_keycloak_user,
sync_maildomain_to_keycloak_group,
)
from core.search import MESSAGE_INDEX, get_es_client
from core.search import MESSAGE_INDEX, get_opensearch_client
from core.tasks import index_message_task, reindex_thread_task
logger = logging.getLogger(__name__)
@receiver(post_save, sender=models.MailDomain)
def create_dkim_key(sender, instance, created, **kwargs):
"""Create a DKIM key for a new MailDomain."""
if created:
instance.generate_dkim_key()
@receiver(post_save, sender=models.MailDomain)
def sync_maildomain_to_keycloak(sender, instance, created, **kwargs):
"""Sync MailDomain to Keycloak as a group when saved."""
@@ -41,7 +48,7 @@ def sync_mailbox_to_keycloak(sender, instance, created, **kwargs):
@receiver(post_save, sender=models.Message)
def index_message_post_save(sender, instance, created, **kwargs):
"""Index a message after it's saved."""
if not getattr(settings, "ELASTICSEARCH_INDEX_THREADS", False):
if not getattr(settings, "OPENSEARCH_INDEX_THREADS", False):
return
try:
@@ -61,7 +68,7 @@ def index_message_post_save(sender, instance, created, **kwargs):
@receiver(post_save, sender=models.MessageRecipient)
def index_message_recipient_post_save(sender, instance, created, **kwargs):
"""Index a message recipient after it's saved."""
if not getattr(settings, "ELASTICSEARCH_INDEX_THREADS", False):
if not getattr(settings, "OPENSEARCH_INDEX_THREADS", False):
return
try:
@@ -81,7 +88,7 @@ def index_message_recipient_post_save(sender, instance, created, **kwargs):
@receiver(post_save, sender=models.Thread)
def index_thread_post_save(sender, instance, created, **kwargs):
"""Index a thread after it's saved."""
if not getattr(settings, "ELASTICSEARCH_INDEX_THREADS", False):
if not getattr(settings, "OPENSEARCH_INDEX_THREADS", False):
return
try:
@@ -100,11 +107,11 @@ def index_thread_post_save(sender, instance, created, **kwargs):
@receiver(post_delete, sender=models.Message)
def delete_message_from_index(sender, instance, **kwargs):
"""Remove a message from the index after it's deleted."""
if not getattr(settings, "ELASTICSEARCH_INDEX_THREADS", False):
if not getattr(settings, "OPENSEARCH_INDEX_THREADS", False):
return
try:
es = get_es_client()
es = get_opensearch_client()
# pylint: disable=unexpected-keyword-arg
es.delete(
index=MESSAGE_INDEX,
@@ -124,11 +131,11 @@ def delete_message_from_index(sender, instance, **kwargs):
@receiver(post_delete, sender=models.Thread)
def delete_thread_from_index(sender, instance, **kwargs):
"""Remove a thread and its messages from the index after it's deleted."""
if not getattr(settings, "ELASTICSEARCH_INDEX_THREADS", False):
if not getattr(settings, "OPENSEARCH_INDEX_THREADS", False):
return
try:
es = get_es_client()
es = get_opensearch_client()
# Delete the thread document
# pylint: disable=unexpected-keyword-arg

View File

@@ -5,10 +5,13 @@ import imaplib
from typing import Any, Dict, List
from django.conf import settings
from django.db.models import Q
from django.utils import timezone
from celery.utils.log import get_task_logger
from core import models
from core.enums import MessageDeliveryStatusChoices
from core.mda.inbound import deliver_inbound_message
from core.mda.outbound import send_message
from core.mda.rfc5322 import parse_email_message
@@ -70,51 +73,165 @@ def send_message_task(self, message_id, force_mta_out=False):
raise
@celery_app.task(bind=True)
def retry_messages_task(self, message_id=None, force_mta_out=False, batch_size=100):
"""Retry sending messages with retryable recipients (respects retry timing).
Args:
message_id: Optional specific message ID to retry
force_mta_out: Whether to force sending via MTA
batch_size: Number of messages to process in each batch
Returns:
dict: A dictionary with task status and results
"""
# Get messages to process
if message_id:
# Single message mode
try:
message = models.Message.objects.get(id=message_id)
except models.Message.DoesNotExist:
error_msg = f"Message with ID '{message_id}' does not exist"
return {"success": False, "error": error_msg}
if message.is_draft:
error_msg = f"Message '{message_id}' is still a draft and cannot be sent"
return {"success": False, "error": error_msg}
messages_to_process = [message]
total_messages = 1
else:
# Bulk mode - find all messages with retryable recipients that are ready for retry
message_filter_q = Q(
is_draft=False,
recipients__delivery_status=MessageDeliveryStatusChoices.RETRY,
) & (
Q(recipients__retry_at__isnull=True)
| Q(recipients__retry_at__lte=timezone.now())
)
messages_to_process = list(
models.Message.objects.filter(message_filter_q).distinct()
)
total_messages = len(messages_to_process)
if total_messages == 0:
return {
"success": True,
"total_messages": 0,
"processed_messages": 0,
"success_count": 0,
"error_count": 0,
"message": "No messages ready for retry",
}
# Process messages in batches
processed_count = 0
success_count = 0
error_count = 0
for batch_start in range(0, total_messages, batch_size):
batch_messages = messages_to_process[batch_start : batch_start + batch_size]
# Update progress for bulk operations
if not message_id:
self.update_state(
state="PROGRESS",
meta={
"current_batch": batch_start // batch_size + 1,
"total_batches": (total_messages + batch_size - 1) // batch_size,
"processed_messages": processed_count,
"total_messages": total_messages,
"success_count": success_count,
"error_count": error_count,
},
)
for message in batch_messages:
try:
# Get recipients with retry status that are ready for retry
retry_filter_q = Q(
delivery_status=MessageDeliveryStatusChoices.RETRY
) & (Q(retry_at__isnull=True) | Q(retry_at__lte=timezone.now()))
retry_recipients = message.recipients.filter(retry_filter_q)
if retry_recipients.exists():
# Process this message
send_message(message, force_mta_out=force_mta_out)
success_count += 1
logger.info(
"Successfully retried message %s (%d recipients)",
message.id,
retry_recipients.count(),
)
processed_count += 1
except Exception as e:
error_count += 1
logger.exception("Failed to retry message %s: %s", message.id, e)
# Return appropriate result format
if message_id:
return {
"success": True,
"message_id": str(message_id),
"recipients_processed": success_count,
"processed_messages": processed_count,
"success_count": success_count,
"error_count": error_count,
}
return {
"success": True,
"total_messages": total_messages,
"processed_messages": processed_count,
"success_count": success_count,
"error_count": error_count,
}
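The retry filter used twice above — delivery status is RETRY, and `retry_at` is either unset or already in the past — reduces to a small predicate. Sketched here in plain Python rather than Django `Q` objects, with `now` passed in to keep it testable:

```python
from datetime import datetime, timedelta

RETRY = "retry"  # stand-in for MessageDeliveryStatusChoices.RETRY


def is_ready_for_retry(delivery_status, retry_at, now: datetime) -> bool:
    """Matches Q(delivery_status=RETRY) & (Q(retry_at__isnull=True) | Q(retry_at__lte=now))."""
    return delivery_status == RETRY and (retry_at is None or retry_at <= now)


now = datetime(2025, 7, 15, 12, 0)
assert is_ready_for_retry(RETRY, None, now)                        # no backoff set
assert is_ready_for_retry(RETRY, now - timedelta(minutes=5), now)  # backoff elapsed
assert not is_ready_for_retry(RETRY, now + timedelta(minutes=5), now)
assert not is_ready_for_retry("sent", None, now)
```

Treating a null `retry_at` as "ready now" means newly failed recipients are picked up on the next run of the recurring task without needing an explicit schedule.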
def _reindex_all_base(update_progress=None):
"""Base function for reindexing all threads and messages.
Args:
update_progress: Optional callback function to update progress
"""
if not settings.ELASTICSEARCH_INDEX_THREADS:
logger.info("Elasticsearch thread indexing is disabled.")
if not settings.OPENSEARCH_INDEX_THREADS:
logger.info("OpenSearch thread indexing is disabled.")
return {"success": False, "reason": "disabled"}
try:
# Ensure index exists first
create_index_if_not_exists()
# Ensure index exists first
create_index_if_not_exists()
# Get all threads and index them
threads = models.Thread.objects.all()
total = threads.count()
success_count = 0
failure_count = 0
# Get all threads and index them
threads = models.Thread.objects.all()
total = threads.count()
success_count = 0
failure_count = 0
for i, thread in enumerate(threads):
try:
if index_thread(thread):
success_count += 1
else:
failure_count += 1
# pylint: disable=broad-exception-caught
except Exception as e:
for i, thread in enumerate(threads):
try:
if index_thread(thread):
success_count += 1
else:
failure_count += 1
logger.exception("Error indexing thread %s: %s", thread.id, e)
# pylint: disable=broad-exception-caught
except Exception as e:
failure_count += 1
logger.exception("Error indexing thread %s: %s", thread.id, e)
# Update progress if callback provided
if update_progress and i % 100 == 0:
update_progress(i, total, success_count, failure_count)
# Update progress if callback provided
if update_progress and i % 100 == 0:
update_progress(i, total, success_count, failure_count)
return {
"success": True,
"total": total,
"success_count": success_count,
"failure_count": failure_count,
}
# pylint: disable=broad-exception-caught
except Exception as e:
logger.exception("Error in reindex_all task: %s", e)
raise
return {
"success": True,
"total": total,
"success_count": success_count,
"failure_count": failure_count,
}
@celery_app.task(bind=True)
@@ -139,8 +256,8 @@ def reindex_all(self):
@celery_app.task(bind=True)
def reindex_thread_task(self, thread_id):
"""Reindex a specific thread and all its messages."""
if not settings.ELASTICSEARCH_INDEX_THREADS:
logger.info("Elasticsearch thread indexing is disabled.")
if not settings.OPENSEARCH_INDEX_THREADS:
logger.info("OpenSearch thread indexing is disabled.")
return {"success": False, "reason": "disabled"}
try:
@@ -172,62 +289,56 @@ def reindex_thread_task(self, thread_id):
@celery_app.task(bind=True)
def reindex_mailbox_task(self, mailbox_id):
"""Reindex all threads and messages in a specific mailbox."""
if not settings.ELASTICSEARCH_INDEX_THREADS:
logger.info("Elasticsearch thread indexing is disabled.")
if not settings.OPENSEARCH_INDEX_THREADS:
logger.info("OpenSearch thread indexing is disabled.")
return {"success": False, "reason": "disabled"}
try:
# Ensure index exists first
create_index_if_not_exists()
# Ensure index exists first
create_index_if_not_exists()
# Get all threads in the mailbox
threads = models.Mailbox.objects.get(id=mailbox_id).threads_viewer
total = threads.count()
success_count = 0
failure_count = 0
# Get all threads in the mailbox
threads = models.Mailbox.objects.get(id=mailbox_id).threads_viewer
total = threads.count()
success_count = 0
failure_count = 0
for i, thread in enumerate(threads):
try:
if index_thread(thread):
success_count += 1
else:
failure_count += 1
# pylint: disable=broad-exception-caught
except Exception as e:
for i, thread in enumerate(threads):
try:
if index_thread(thread):
success_count += 1
else:
failure_count += 1
logger.exception("Error indexing thread %s: %s", thread.id, e)
# pylint: disable=broad-exception-caught
except Exception as e:
failure_count += 1
logger.exception("Error indexing thread %s: %s", thread.id, e)
# Update progress every 50 threads
if i % 50 == 0:
self.update_state(
state="PROGRESS",
meta={
"current": i,
"total": total,
"success_count": success_count,
"failure_count": failure_count,
},
)
# Update progress every 50 threads
if i % 50 == 0:
self.update_state(
state="PROGRESS",
meta={
"current": i,
"total": total,
"success_count": success_count,
"failure_count": failure_count,
},
)
return {
"mailbox_id": str(mailbox_id),
"success": True,
"total": total,
"success_count": success_count,
"failure_count": failure_count,
}
except Exception as e:
logger.exception(
"Error in reindex_mailbox_task for mailbox %s: %s", mailbox_id, e
)
raise
return {
"mailbox_id": str(mailbox_id),
"success": True,
"total": total,
"success_count": success_count,
"failure_count": failure_count,
}
@celery_app.task(bind=True)
def index_message_task(self, message_id):
"""Index a single message."""
if not settings.ELASTICSEARCH_INDEX_THREADS:
logger.info("Elasticsearch message indexing is disabled.")
if not settings.OPENSEARCH_INDEX_THREADS:
logger.info("OpenSearch message indexing is disabled.")
return {"success": False, "reason": "disabled"}
try:
@@ -264,15 +375,12 @@ def index_message_task(self, message_id):
@celery_app.task(bind=True)
def reset_elasticsearch_index(self):
"""Delete and recreate the Elasticsearch index."""
try:
delete_index()
create_index_if_not_exists()
return {"success": True}
except Exception as e:
logger.exception("Error resetting Elasticsearch index: %s", e)
raise
def reset_search_index(self):
"""Delete and recreate the OpenSearch index."""
delete_index()
create_index_if_not_exists()
return {"success": True}
# @celery_app.task(bind=True)

View File

@@ -286,24 +286,82 @@ class TestAdminMailDomainViewSet:
assert response.status_code == status.HTTP_200_OK
assert response.data["count"] == 10
def test_maildomain_retrieve_query_optimization(
def test_maildomain_expected_dns_records_in_response(
self,
api_client,
domain_admin_user,
domain_admin_access1,
mail_domain1,
django_assert_num_queries,
):
"""Test that maildomain retrieve endpoint is optimized for queries."""
"""Test that expected_dns_records field is only included in detail views, not list views."""
api_client.force_authenticate(user=domain_admin_user)
with django_assert_num_queries(
1
): # 1 query to retrieve maildomain with annotation
response = api_client.get(f"{self.LIST_DOMAINS_URL}{mail_domain1.id}/")
# Test list endpoint - should NOT include DNS records
response = api_client.get(self.LIST_DOMAINS_URL)
assert response.status_code == status.HTTP_200_OK
assert response.data["id"] == str(mail_domain1.id)
assert response.data["count"] == 1
domain_data = response.data["results"][0]
assert "expected_dns_records" in domain_data
assert (
domain_data["expected_dns_records"] is None
) # Should be None in list view
# Test detail endpoint - should include DNS records
detail_response = api_client.get(f"{self.LIST_DOMAINS_URL}{mail_domain1.id}/")
assert detail_response.status_code == status.HTTP_200_OK
assert "expected_dns_records" in detail_response.data
dns_records = detail_response.data["expected_dns_records"]
assert dns_records is not None # Should have DNS records in detail view
# Verify DNS records structure
assert isinstance(dns_records, list)
assert (
len(dns_records) >= 4
) # At least MX (2), SPF, DMARC, and potentially DKIM
# Verify MX records (should have 2)
mx_records = [record for record in dns_records if record["type"] == "mx"]
assert len(mx_records) == 2
assert any("mx1." in record["value"] for record in mx_records)
assert any("mx2." in record["value"] for record in mx_records)
# Verify SPF record
spf_records = [
record
for record in dns_records
if record["type"] == "txt" and "spf1" in record["value"]
]
assert len(spf_records) == 1
assert spf_records[0]["target"] == ""
assert "v=spf1" in spf_records[0]["value"]
# Verify DMARC record
dmarc_records = [
record
for record in dns_records
if record["type"] == "txt" and "DMARC1" in record["value"]
]
assert len(dmarc_records) == 1
assert dmarc_records[0]["target"] == "_dmarc"
assert "v=DMARC1" in dmarc_records[0]["value"]
# Verify DKIM record (should be present since we auto-generate DKIM keys)
dkim_records = [
record
for record in dns_records
if record["type"] == "txt" and "DKIM1" in record["value"]
]
assert len(dkim_records) == 1
assert dkim_records[0]["target"].endswith("._domainkey")
assert "v=DKIM1" in dkim_records[0]["value"]
# Test detail endpoint
detail_response = api_client.get(f"{self.LIST_DOMAINS_URL}{mail_domain1.id}/")
assert detail_response.status_code == status.HTTP_200_OK
assert "expected_dns_records" in detail_response.data
assert detail_response.data["expected_dns_records"] == dns_records
class TestMailDomainAbilitiesAPI:

View File

@@ -169,7 +169,7 @@ class TestAdminMailDomainMailboxViewSet:
None,
)
assert user1_access_data is not None
assert user1_access_data["role"] == MailboxRoleChoices.EDITOR.value
assert user1_access_data["role"] == "editor"
assert user1_access_data["user"]["email"] == user_for_access1.email
user2_access_data = next(
@@ -181,7 +181,7 @@ class TestAdminMailDomainMailboxViewSet:
None,
)
assert user2_access_data is not None
assert user2_access_data["role"] == MailboxRoleChoices.VIEWER.value
assert user2_access_data["role"] == "viewer"
# Check that mailbox2_domain1 is also present
mb2_data = next(
@@ -372,7 +372,6 @@ class TestAdminMailDomainMailboxViewSet:
# Verify user was created with correct details
user = models.User.objects.get(email="newuser@admin-domain1.com")
assert user.full_name == "John Doe"
assert user.short_name == "John"
assert user.password == "?"
# Verify mailbox access was created
@@ -392,7 +391,6 @@ class TestAdminMailDomainMailboxViewSet:
factories.UserFactory(
email="existinguser@admin-domain1.com",
full_name="Existing User",
short_name="Existing",
password="existing-password",
)
@@ -411,7 +409,6 @@ class TestAdminMailDomainMailboxViewSet:
# Verify user details were not updated
user = models.User.objects.get(email="existinguser@admin-domain1.com")
assert user.full_name == "Existing User" # Should not be updated
assert user.short_name == "Existing" # Should not be updated
assert user.password == "existing-password" # Should not be updated
# Verify mailbox access was created
@@ -465,7 +462,6 @@ class TestAdminMailDomainMailboxViewSet:
assert "id" in user_data
assert "email" in user_data
assert "full_name" in user_data
assert "short_name" in user_data
# Also check mailbox2_domain1 (should have 0 accesses)
mb2_data = next(
@@ -509,7 +505,6 @@ class TestAdminMailDomainMailboxViewSet:
assert "id" in user_data
assert "email" in user_data
assert "full_name" in user_data
assert "short_name" in user_data
def test_admin_maildomains_mailbox_excludes_abilities_with_superuser(
self,

View File

@@ -225,36 +225,6 @@ class TestAdminMaildomainsUserList:
assert len(response.data) == 1
assert response.data[0]["full_name"] == user1.full_name
def test_admin_maildomains_user_list_search_by_short_name(self, api_client):
"""Test searching users by short name."""
domain = factories.MailDomainFactory(name="search.local")
admin_user = factories.UserFactory(email="admin@search.local")
user1 = factories.UserFactory(
email="alice@search.local", full_name="Alice Smith", short_name="Alice"
)
user2 = factories.UserFactory(
email="bob@search.local", full_name="Bob Jones", short_name="Bob"
)
factories.MailDomainAccessFactory(
maildomain=domain,
user=admin_user,
role=enums.MailDomainAccessRoleChoices.ADMIN,
)
factories.MailboxAccessFactory(mailbox__domain=domain, user=user1)
factories.MailboxAccessFactory(mailbox__domain=domain, user=user2)
url = reverse(
"admin-maildomains-user-list", kwargs={"maildomain_pk": domain.id}
)
api_client.force_authenticate(user=admin_user)
# Search for "Alice"
response = api_client.get(url, {"q": "Alice"})
assert response.status_code == status.HTTP_200_OK
assert len(response.data) == 1
assert response.data[0]["short_name"] == user1.short_name
def test_admin_maildomains_user_list_search_case_insensitive(self, api_client):
"""Test that search is case insensitive."""
domain = factories.MailDomainFactory(name="search.local")
@@ -348,19 +318,17 @@ class TestAdminMaildomainsUserList:
# ============================================================================
def test_admin_maildomains_user_list_ordering(self, api_client):
"""Test that users are ordered correctly (full_name, short_name, email)."""
"""Test that users are ordered correctly (full_name, email)."""
domain = factories.MailDomainFactory(name="order.local")
admin_user = factories.UserFactory(
email="admin@order.local", full_name="Admin User", short_name="Admin"
email="admin@order.local", full_name="Admin User"
)
user1 = factories.UserFactory(
email="alice@order.local", full_name="Alice Smith", short_name="Alice"
)
user2 = factories.UserFactory(
email="bob@order.local", full_name="Bob Jones", short_name="Bob"
email="alice@order.local", full_name="Alice Smith"
)
user2 = factories.UserFactory(email="bob@order.local", full_name="Bob Jones")
user3 = factories.UserFactory(
email="charlie@order.local", full_name="Charlie Brown", short_name="Charlie"
email="charlie@order.local", full_name="Charlie Brown"
)
factories.MailDomainAccessFactory(
@@ -392,14 +360,10 @@ class TestAdminMaildomainsUserList:
"""Test ordering when some users have null names."""
domain = factories.MailDomainFactory(name="order.local")
admin_user = factories.UserFactory(
email="fritz@order.local", full_name="Admin User", short_name="Admin"
)
user1 = factories.UserFactory(
email="bob@order.local", full_name=None, short_name=None
)
user2 = factories.UserFactory(
email="alice@order.local", full_name=None, short_name=None
email="fritz@order.local", full_name="Admin User"
)
user1 = factories.UserFactory(email="bob@order.local", full_name=None)
user2 = factories.UserFactory(email="alice@order.local", full_name=None)
factories.MailDomainAccessFactory(
maildomain=domain,
@@ -430,10 +394,10 @@ class TestAdminMaildomainsUserList:
"""Test that the serializer returns the correct fields."""
domain = factories.MailDomainFactory(name="serializer.local")
admin_user = factories.UserFactory(
email="admin@serializer.local", full_name="Admin User", short_name="Admin"
email="admin@serializer.local", full_name="Admin User"
)
user1 = factories.UserFactory(
email="alice@serializer.local", full_name="Alice Smith", short_name="Alice"
email="alice@serializer.local", full_name="Alice Smith"
)
factories.MailDomainAccessFactory(
@@ -454,19 +418,12 @@ class TestAdminMaildomainsUserList:
# Check that all expected fields are present
for user_data in response.data:
assert "id" in user_data
assert "email" in user_data
assert "full_name" in user_data
assert "short_name" in user_data
assert "abilities" not in user_data
assert len(user_data.keys()) == 4
assert set(user_data.keys()) == {"id", "email", "full_name"}
def test_admin_maildomains_user_list_serializer_null_fields(self, api_client):
"""Test that null fields are handled correctly in the serializer."""
domain = factories.MailDomainFactory(name="null.local")
admin_user = factories.UserFactory(
email="admin@null.local", full_name=None, short_name=None
)
admin_user = factories.UserFactory(email="admin@null.local", full_name=None)
factories.MailDomainAccessFactory(
maildomain=domain,
@@ -487,7 +444,6 @@ class TestAdminMaildomainsUserList:
assert user_data["id"] == str(admin_user.id)
assert user_data["email"] == admin_user.email
assert user_data["full_name"] is None
assert user_data["short_name"] is None
def test_admin_maildomains_user_list_user_without_email(self, api_client):
"""Test handling of users without email addresses."""

View File

@@ -86,8 +86,8 @@ class TestBlobAPI:
# Verify the blob was created in the database
blob_id = uuid.UUID(response.data["blobId"])
blob = models.Blob.objects.get(id=blob_id)
assert blob.type == "text/plain"
assert blob.sha256 == expected_hash
assert blob.content_type == "text/plain"
assert blob.sha256.hex() == expected_hash
assert blob.size == len(file_content)
assert blob.mailbox == user_mailbox
@@ -160,12 +160,9 @@ class TestDraftWithAttachments:
def blob(self, user_mailbox):
"""Create a test blob."""
test_content = b"Test attachment content %i" % random.randint(0, 10000000)
return models.Blob.objects.create(
sha256=hashlib.sha256(test_content).hexdigest(),
size=len(test_content),
type="text/plain",
raw_content=test_content,
mailbox=user_mailbox,
return user_mailbox.create_blob(
content=test_content,
content_type="text/plain",
)
@pytest.fixture
@@ -241,6 +238,10 @@ class TestDraftWithAttachments:
thread=thread, sender=sender, is_draft=True, subject="Existing draft"
)
# attachment blob should already be created
assert models.Blob.objects.count() == 1
assert models.Blob.objects.first().content_type == "text/plain"
text_body = (
f"This is a test draft with an attachment {random.randint(0, 10000000)}"
)
@@ -266,6 +267,9 @@ class TestDraftWithAttachments:
# Check response
assert response.status_code == status.HTTP_200_OK
# still a single blob
assert models.Blob.objects.count() == 1
# Verify an attachment was created and linked to the draft
draft.refresh_from_db()
assert draft.attachments.count() == 1
@@ -292,10 +296,13 @@ class TestDraftWithAttachments:
draft.refresh_from_db()
assert draft.is_draft is False
assert draft.attachments.count() == 1
assert draft.attachments.first().blob == blob
assert draft.attachments.first().mailbox == user_mailbox
parsed_email = email.message_from_bytes(draft.raw_mime)
assert draft.attachments.count() == 0
# Original attachment blob should be deleted.
assert models.Blob.objects.count() == 1
assert models.Blob.objects.first().content_type == "message/rfc822"
parsed_email = email.message_from_bytes(draft.blob.get_content())
# Check that the email is multipart
assert parsed_email.is_multipart()
@@ -313,6 +320,6 @@ class TestDraftWithAttachments:
"text/plain",
]
assert parts[4].get_payload(decode=True).decode() == blob.raw_content.decode()
assert parts[4].get_payload(decode=True).decode() == blob.get_content().decode()
assert parts[4].get_content_disposition() == "attachment"
assert parts[4].get_filename() == "test_attachment.txt"
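The recurring refactor in these tests replaces manual `Blob.objects.create(...)` calls with a `Mailbox.create_blob()` helper that derives the hash and size itself (note the assertions on `blob.sha256.hex()` and `blob.size` above). A minimal, framework-free sketch of what such a helper might do — the class and field names are stand-ins mirroring the tests, not the project's actual Django models:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Blob:
    # Hypothetical stand-in for the Blob model; fields mirror the test assertions.
    content: bytes
    content_type: str
    sha256: bytes
    size: int

    def get_content(self) -> bytes:
        return self.content

@dataclass
class Mailbox:
    blobs: list = field(default_factory=list)

    def create_blob(self, content: bytes, content_type: str) -> Blob:
        # Compute sha256 and size from the content so callers no longer
        # have to pass them in (and cannot get them wrong).
        blob = Blob(
            content=content,
            content_type=content_type,
            sha256=hashlib.sha256(content).digest(),
            size=len(content),
        )
        self.blobs.append(blob)
        return blob

mailbox = Mailbox()
blob = mailbox.create_blob(b"hello", "text/plain")
assert blob.sha256.hex() == hashlib.sha256(b"hello").hexdigest()
assert blob.size == 5
```

Centralizing the bookkeeping in one factory method is what lets the tests above shrink from five keyword arguments to two.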

View File

@@ -227,7 +227,7 @@ class TestMailboxAccessViewSet:
data = { # No 'mailbox' field in data, it comes from URL
"user": str(user_beta.pk),
"role": MailboxRoleChoices.EDITOR.value,
"role": "editor",
}
response = api_client.post(
self.list_create_url(mailbox_id=mailbox1_domain1.pk), data
@@ -237,7 +237,7 @@ class TestMailboxAccessViewSet:
# Serializer might return mailbox PK if not read_only=True, or nested details.
# For now, check what's guaranteed by create.
assert response.data["user"] == user_beta.pk
assert response.data["role"] == MailboxRoleChoices.EDITOR.value
assert response.data["role"] == "editor"
assert models.MailboxAccess.objects.filter(
mailbox=mailbox1_domain1, user=user_beta, role=MailboxRoleChoices.EDITOR
).exists()
@@ -253,7 +253,7 @@ class TestMailboxAccessViewSet:
):
"""Mailbox admin should not be able to create accesses for unmanaged mailboxes."""
api_client.force_authenticate(user=mailbox1_admin_user)
data = {"user": str(user_beta.pk), "role": MailboxRoleChoices.EDITOR.value}
data = {"user": str(user_beta.pk), "role": "editor"}
response = api_client.post(
self.list_create_url(mailbox_id=mailbox1_domain2.pk), data
) # Attempt on mailbox1_domain2
@@ -264,7 +264,7 @@ class TestMailboxAccessViewSet:
):
"""Domain admin should not be able to create accesses for mailboxes in unmanaged domains."""
api_client.force_authenticate(user=domain_admin_user) # Admin for domain1
data = {"user": str(user_beta.pk), "role": MailboxRoleChoices.EDITOR.value}
data = {"user": str(user_beta.pk), "role": "editor"}
response = api_client.post(
self.list_create_url(mailbox_id=mailbox1_domain2.pk), data
) # mailbox1_domain2 is in domain2
@@ -325,14 +325,14 @@ class TestMailboxAccessViewSet:
domain_admin_user if admin_type == "domain_admin" else mailbox1_admin_user
)
api_client.force_authenticate(user=user_performing_action)
data = {"role": MailboxRoleChoices.ADMIN.value}
data = {"role": "admin"}
response = api_client.patch(
self.detail_url(mailbox_id=mailbox1_domain1.pk, pk=access_m1d1_alpha.pk),
data,
)
assert response.status_code == status.HTTP_200_OK
access_m1d1_alpha.refresh_from_db()
assert access_m1d1_alpha.role == MailboxRoleChoices.ADMIN
assert access_m1d1_alpha.role == MailboxRoleChoices.ADMIN.value
data = {"role": "invalid"}
response = api_client.patch(
@@ -341,7 +341,7 @@ class TestMailboxAccessViewSet:
)
assert response.status_code == status.HTTP_400_BAD_REQUEST
data = {"role": MailboxRoleChoices.ADMIN.value, "user": str(user_beta.pk)}
data = {"role": "admin", "user": str(user_beta.pk)}
response = api_client.patch(
self.detail_url(mailbox_id=mailbox1_domain1.pk, pk=access_m1d1_alpha.pk),
data,
@@ -395,7 +395,6 @@ class TestMailboxAccessViewSet:
assert "id" in user_details
assert "email" in user_details
assert "full_name" in user_details
assert "short_name" in user_details
def test_retrieve_mailbox_access_excludes_abilities_from_nested_user(
self,
@@ -419,7 +418,6 @@ class TestMailboxAccessViewSet:
assert "id" in user_details
assert "email" in user_details
assert "full_name" in user_details
assert "short_name" in user_details
def test_mailbox_access_excludes_abilities_with_superuser(
self,

View File

@@ -82,14 +82,14 @@ class TestMailboxViewSet:
{
"id": str(user_mailbox2.id),
"email": str(user_mailbox2),
"role": str(models.MailboxRoleChoices.EDITOR),
"role": "editor",
"count_unread_messages": 1,
"count_messages": 2,
},
{
"id": str(user_mailbox1.id),
"email": str(user_mailbox1),
"role": str(models.MailboxRoleChoices.VIEWER),
"role": "viewer",
"count_unread_messages": 1,
"count_messages": 1,
},

View File

@@ -99,8 +99,9 @@ class TestApiDraftAndSendMessage:
assert draft_message.is_unread is False
assert draft_message.is_trashed is False
assert draft_message.is_starred is False
assert draft_message.has_attachments is False
assert draft_message.sent_at is None
assert draft_message.draft_body == draft_content
assert draft_message.draft_blob.get_content().decode("utf-8") == draft_content
assert all(
recipient.delivery_status is None
@@ -116,6 +117,7 @@ class TestApiDraftAndSendMessage:
assert draft_message.thread.has_unread is False
assert draft_message.thread.has_trashed is False
assert draft_message.thread.has_starred is False
assert draft_message.thread.has_attachments is False
assert draft_message.thread.has_draft is True
assert draft_message.thread.sender_names == [draft_message.sender.name]
@@ -167,8 +169,8 @@ class TestApiDraftAndSendMessage:
mock_send_outbound_message.assert_called()
sent_message = models.Message.objects.get(id=draft_message_id)
assert sent_message.raw_mime
assert subject in sent_message.raw_mime.decode("utf-8")
assert sent_message.blob
assert subject in sent_message.blob.get_content().decode("utf-8")
assert sent_message.is_draft is False
assert sent_message.is_sender is True
@@ -264,13 +266,15 @@ class TestApiDraftAndSendMessage:
assert draft_message.is_unread is False
assert draft_message.is_trashed is False
assert draft_message.is_starred is False
assert draft_message.draft_body == draft_content
assert draft_message.has_attachments is False
assert draft_message.draft_blob.get_content().decode("utf-8") == draft_content
assert draft_message.thread.has_messages is True
assert draft_message.thread.has_sender is False
assert draft_message.thread.has_unread is False
assert draft_message.thread.has_trashed is False
assert draft_message.thread.has_starred is False
assert draft_message.thread.has_attachments is False
assert draft_message.thread.has_draft is True
assert draft_message.thread.sender_names == [
message.sender.name,
@@ -299,8 +303,8 @@ class TestApiDraftAndSendMessage:
mock_send_outbound_message.assert_called()
sent_message = models.Message.objects.get(id=draft_message_id)
assert sent_message.raw_mime
assert subject in sent_message.raw_mime.decode("utf-8")
assert sent_message.blob
assert subject in sent_message.blob.get_content().decode("utf-8")
assert sent_message.is_draft is False
assert sent_message.is_sender is True
@@ -788,7 +792,7 @@ class TestApiDraftAndSendReply:
assert (
b"In-Reply-To: <" + message.mime_id.encode("utf-8") + b">\r\n"
in sent_message.raw_mime
in sent_message.blob.get_content()
)
@pytest.mark.parametrize(
@@ -972,7 +976,10 @@ class TestApiDraftAndSendReply:
updated_message = models.Message.objects.get(id=draft_message.id)
assert updated_message.is_draft is True
assert updated_message.subject == updated_subject
assert updated_message.draft_body == "updated content"
assert (
updated_message.draft_blob.get_content().decode("utf-8")
== "updated content"
)
# Assert recipients are updated
assert updated_message.recipients.count() == 3

View File

@@ -2,7 +2,6 @@
# pylint: disable=redefined-outer-name, unused-argument, no-value-for-parameter
import datetime
import hashlib
from unittest.mock import patch
from django.core.files.uploadedfile import SimpleUploadedFile
@@ -12,7 +11,7 @@ from rest_framework.test import APIClient
from core import factories
from core.enums import MailboxRoleChoices
from core.models import Blob, Mailbox, MailDomain, Message, Thread
from core.models import Mailbox, MailDomain, Message, Thread
from core.tasks import process_eml_file_task, process_mbox_file_task
pytestmark = pytest.mark.django_db
@@ -80,13 +79,9 @@ def blob_mbox(mbox_file, mailbox):
"""Create a blob from a file."""
# Read the file content once
file_content = mbox_file.read()
expected_hash = hashlib.sha256(file_content).hexdigest()
return Blob.objects.create(
raw_content=file_content,
type=mbox_file.content_type,
size=mbox_file.size,
mailbox=mailbox,
sha256=expected_hash,
return mailbox.create_blob(
content=file_content,
content_type=mbox_file.content_type,
)
@@ -95,13 +90,9 @@ def blob_eml(eml_file, mailbox):
"""Create a blob from a file."""
# Read the file content once
file_content = eml_file.read()
expected_hash = hashlib.sha256(file_content).hexdigest()
return Blob.objects.create(
raw_content=file_content,
type=eml_file.content_type,
size=eml_file.size,
mailbox=mailbox,
sha256=expected_hash,
return mailbox.create_blob(
content=file_content,
content_type=eml_file.content_type,
)
@@ -121,7 +112,7 @@ def test_import_eml_file(api_client, user, mailbox, blob_eml):
assert Message.objects.count() == 1
message = Message.objects.first()
assert message.subject == "Mon mail avec joli pj"
assert message.attachments.count() == 1
assert message.has_attachments is True
assert message.sender.email == "sender@example.com"
assert message.recipients.get().contact.email == "recipient@example.com"
assert message.sent_at == message.thread.messaged_at
@@ -161,7 +152,7 @@ def test_import_mbox_file(api_client, user, mailbox, blob_mbox):
# Check messages
assert messages[0].subject == "Mon mail avec joli pj"
assert messages[0].attachments.count() == 1
assert messages[0].has_attachments is True
assert messages[1].subject == "Je t'envoie encore un message..."
body1 = messages[1].get_parsed_field("textBody")[0]["content"]
@@ -210,7 +201,7 @@ def test_import_text_plain_mime_type(api_client, user, mailbox, blob_mbox):
mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)
# Create a file with text/plain MIME type
blob_mbox.type = "text/plain"
blob_mbox.content_type = "text/plain"
blob_mbox.save()
with patch("core.tasks.process_mbox_file_task.delay") as mock_task:
@@ -334,16 +325,13 @@ def test_import_duplicate_eml_file(api_client, user, mailbox, blob_eml):
mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)
# create a copy of the blob because the blob is deleted after the import
blob_eml2 = Blob.objects.create(
raw_content=blob_eml.raw_content,
type=blob_eml.type,
size=blob_eml.size,
mailbox=blob_eml.mailbox,
sha256=blob_eml.sha256,
blob_eml2 = blob_eml.mailbox.create_blob(
content=blob_eml.get_content(),
content_type=blob_eml.content_type,
)
# Get file content from blob
file_content = blob_eml.raw_content
file_content = blob_eml.get_content()
assert Message.objects.count() == 0
assert Thread.objects.count() == 0
@@ -404,16 +392,13 @@ def test_import_duplicate_mbox_file(api_client, user, mailbox, blob_mbox):
mailbox.accesses.create(user=user, role=MailboxRoleChoices.ADMIN)
# create a copy of the blob because the blob is deleted after the import
blob_mbox2 = Blob.objects.create(
raw_content=blob_mbox.raw_content,
type=blob_mbox.type,
size=blob_mbox.size,
mailbox=blob_mbox.mailbox,
sha256=blob_mbox.sha256,
blob_mbox2 = blob_mbox.mailbox.create_blob(
content=blob_mbox.get_content(),
content_type=blob_mbox.content_type,
)
# Get file content from blob
file_content = blob_mbox.raw_content
file_content = blob_mbox.get_content()
assert Message.objects.count() == 0
assert Thread.objects.count() == 0
@@ -486,19 +471,13 @@ def test_import_eml_same_message_different_mailboxes(api_client, user, eml_file_
file_content = f.read()
# Create blobs for each mailbox
blob1 = Blob.objects.create(
raw_content=file_content,
type="message/rfc822",
size=len(file_content),
mailbox=mailbox1,
sha256=hashlib.sha256(file_content).hexdigest(),
blob1 = mailbox1.create_blob(
content=file_content,
content_type="message/rfc822",
)
blob2 = Blob.objects.create(
raw_content=file_content,
type="message/rfc822",
size=len(file_content),
mailbox=mailbox2,
sha256=hashlib.sha256(file_content).hexdigest(),
blob2 = mailbox2.create_blob(
content=file_content,
content_type="message/rfc822",
)
assert Message.objects.count() == 0
@@ -575,19 +554,13 @@ def test_import_mbox_same_message_different_mailboxes(api_client, user, mbox_fil
file_content = f.read()
# Create blobs for each mailbox
blob1 = Blob.objects.create(
raw_content=file_content,
type="application/mbox",
size=len(file_content),
mailbox=mailbox1,
sha256=hashlib.sha256(file_content).hexdigest(),
blob1 = mailbox1.create_blob(
content=file_content,
content_type="application/mbox",
)
blob2 = Blob.objects.create(
raw_content=file_content,
type="application/mbox",
size=len(file_content),
mailbox=mailbox2,
sha256=hashlib.sha256(file_content).hexdigest(),
blob2 = mailbox2.create_blob(
content=file_content,
content_type="application/mbox",
)
assert Message.objects.count() == 0

View File

@@ -2,15 +2,12 @@
# pylint: disable=redefined-outer-name,R0801
# TODO: fix R0801 by refactoring the tests and merging them into one file, test_messages_import_labels.py

import hashlib
import pytest
from rest_framework import status
from rest_framework.test import APIClient
from core import models
from core.factories import MailboxFactory, UserFactory
from core.models import Blob
IMPORT_FILE_URL = "/api/v1.0/import/file/"
@@ -53,12 +50,9 @@ def upload_mbox_file(client, mbox_file_path, mailbox):
with open(mbox_file_path, "rb") as f:
mbox_content = f.read()
blob = Blob.objects.create(
raw_content=mbox_content,
type="application/mbox",
size=len(mbox_content),
mailbox=mailbox,
sha256=hashlib.sha256(mbox_content).hexdigest(),
blob = mailbox.create_blob(
content=mbox_content,
content_type="application/mbox",
)
response = client.post(
@@ -250,12 +244,9 @@ def test_api_authentication_required(api_client, mbox_file_path, mailbox):
with open(mbox_file_path, "rb") as f:
mbox_content = f.read()
blob = Blob.objects.create(
raw_content=mbox_content,
type="application/mbox",
size=len(mbox_content),
mailbox=mailbox,
sha256=hashlib.sha256(mbox_content).hexdigest(),
blob = mailbox.create_blob(
content=mbox_content,
content_type="application/mbox",
)
response = api_client.post(
@@ -277,12 +268,9 @@ def test_mailbox_access_required(api_client, mbox_file_path, mailbox):
with open(mbox_file_path, "rb") as f:
mbox_content = f.read()
blob = Blob.objects.create(
raw_content=mbox_content,
type="application/mbox",
size=len(mbox_content),
mailbox=mailbox,
sha256=hashlib.sha256(mbox_content).hexdigest(),
blob = mailbox.create_blob(
content=mbox_content,
content_type="application/mbox",
)
response = api_client.post(

View File

@@ -2,15 +2,12 @@
# pylint: disable=redefined-outer-name,R0801
import hashlib
import pytest
from rest_framework import status
from rest_framework.test import APIClient
from core import models
from core.factories import MailboxFactory, UserFactory
from core.models import Blob
IMPORT_FILE_URL = "/api/v1.0/import/file/"
@@ -55,12 +52,9 @@ def upload_mbox_file(client, mbox_file_path, mailbox):
with open(mbox_file_path, "rb") as f:
mbox_content = f.read()
blob = Blob.objects.create(
raw_content=mbox_content,
type="application/mbox",
size=len(mbox_content),
mailbox=mailbox,
sha256=hashlib.sha256(mbox_content).hexdigest(),
blob = mailbox.create_blob(
content=mbox_content,
content_type="application/mbox",
)
response = client.post(
@@ -272,12 +266,9 @@ def test_french_api_authentication_required(api_client, mbox_file_path, mailbox)
with open(mbox_file_path, "rb") as f:
mbox_content = f.read()
blob = Blob.objects.create(
raw_content=mbox_content,
type="application/mbox",
size=len(mbox_content),
mailbox=mailbox,
sha256=hashlib.sha256(mbox_content).hexdigest(),
blob = mailbox.create_blob(
content=mbox_content,
content_type="application/mbox",
)
response = api_client.post(
@@ -299,12 +290,9 @@ def test_french_mailbox_access_required(api_client, mbox_file_path, mailbox):
with open(mbox_file_path, "rb") as f:
mbox_content = f.read()
blob = Blob.objects.create(
raw_content=mbox_content,
type="application/mbox",
size=len(mbox_content),
mailbox=mailbox,
sha256=hashlib.sha256(mbox_content).hexdigest(),
blob = mailbox.create_blob(
content=mbox_content,
content_type="application/mbox",
)
response = api_client.post(

View File

@@ -460,12 +460,18 @@ class TestMTAInboundEmailThreading:
subject=subject,
sender=sender_contact,
mime_id=mime_id,
raw_mime=b"From: sender@example.com\r\nTo: testuser@threadtest.com\r\nSubject: "
)
# Create a blob for the message
blob = self.mailbox.create_blob(
content=b"From: sender@example.com\r\nTo: testuser@threadtest.com\r\nSubject: "
+ subject.encode("utf-8")
+ b"\r\nMessage-ID: <"
+ mime_id.encode("utf-8")
+ b">\r\n\r\nInitial body.",
content_type="message/rfc822",
)
message.blob = blob
message.save()
# Create recipients for the initial message
recipient_contact = factories.ContactFactory(
mailbox=self.mailbox, email=self.recipient_email
@@ -705,11 +711,17 @@ class TestMTAInboundEmailThreading:
subject="Other Mailbox Subject",
sender=other_sender,
mime_id=other_mime_id,
raw_mime=b"From: other@sender.com\r\nTo: otheruser@otherdomain.com"
)
# Create a blob for the message
blob = other_mailbox.create_blob(
content=b"From: other@sender.com\r\nTo: otheruser@otherdomain.com"
+ b"\r\nSubject: Other Mailbox Subject\r\nMessage-ID: <"
+ other_mime_id.encode("utf-8")
+ b">\r\n\r\nBody.",
content_type="message/rfc822",
)
other_message.blob = blob
other_message.save()
# Add recipient for other message
other_recipient_contact = factories.ContactFactory(
mailbox=other_mailbox,

View File

@@ -200,14 +200,14 @@ class TestThreadAccessCreate:
delegated_mailbox = factories.MailboxFactory()
data = {
"mailbox": str(delegated_mailbox.id),
"role": enums.ThreadAccessRoleChoices.VIEWER,
"role": "viewer",
}
response = api_client.post(get_thread_access_url(thread.id), data)
assert response.status_code == status.HTTP_201_CREATED
assert response.data["thread"] == thread.id
assert response.data["mailbox"] == delegated_mailbox.id
assert response.data["role"] == enums.ThreadAccessRoleChoices.VIEWER
assert response.data["role"] == "viewer"
def test_create_thread_access_duplicate(
self, api_client, thread_with_editor_access
@@ -218,7 +218,7 @@ class TestThreadAccessCreate:
data = {
"mailbox": str(mailbox.id),
"role": enums.ThreadAccessRoleChoices.EDITOR,
"role": "editor",
}
response = api_client.post(get_thread_access_url(thread.id), data)
@@ -255,7 +255,7 @@ class TestThreadAccessCreate:
delegated_mailbox = factories.MailboxFactory()
data = {
"mailbox": str(delegated_mailbox.id),
"role": enums.ThreadAccessRoleChoices.VIEWER,
"role": "viewer",
}
response = api_client.post(get_thread_access_url(thread.id), data)
@@ -318,11 +318,11 @@ class TestThreadAccessUpdate:
)
url = get_thread_access_url(thread.id, thread_access.id)
data = {"role": enums.ThreadAccessRoleChoices.EDITOR}
data = {"role": "editor"}
response = api_client.patch(url, data)
assert response.status_code == status.HTTP_200_OK
assert response.data["role"] == enums.ThreadAccessRoleChoices.EDITOR
assert response.data["role"] == "editor"
@pytest.mark.parametrize(
"thread_access_role, mailbox_access_role",
@@ -353,7 +353,7 @@ class TestThreadAccessUpdate:
role=thread_access_role,
)
url = get_thread_access_url(thread.id, thread_access.id)
data = {"role": enums.ThreadAccessRoleChoices.EDITOR}
data = {"role": "editor"}
response = api_client.patch(url, data)
assert response.status_code == status.HTTP_403_FORBIDDEN
@@ -363,7 +363,7 @@ class TestThreadAccessUpdate:
thread_access = factories.ThreadAccessFactory()
url = get_thread_access_url(thread.id, thread_access.id)
data = {"role": enums.ThreadAccessRoleChoices.EDITOR}
data = {"role": "editor"}
response = api_client.patch(url, data)
assert response.status_code == status.HTTP_404_NOT_FOUND
@@ -375,7 +375,7 @@ class TestThreadAccessUpdate:
thread = factories.ThreadFactory()
url = get_thread_access_url(thread.id, uuid.uuid4())
data = {"role": enums.ThreadAccessRoleChoices.EDITOR}
data = {"role": "editor"}
response = api_client.patch(url, data)
assert response.status_code == status.HTTP_404_NOT_FOUND
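Throughout these tests, role assertions change from enum members to plain string labels ("viewer", "editor", "admin") because high-volume tables now store roles as integers while the API keeps serializing human-readable labels. A hedged sketch of one way that mapping could work — the enum name and helper functions are illustrative, not the project's actual serializer code:

```python
from enum import IntEnum

class ThreadAccessRole(IntEnum):
    # Integer values are what gets stored; the lowercase member
    # names double as the string labels exposed by the API.
    VIEWER = 1
    EDITOR = 2

def role_to_label(value: int) -> str:
    # Serialize the stored integer back to its API label.
    return ThreadAccessRole(value).name.lower()

def label_to_role(label: str) -> int:
    # Deserialize an API label, rejecting unknown values
    # (mirroring the HTTP 400 case for {"role": "invalid"}).
    try:
        return ThreadAccessRole[label.upper()].value
    except KeyError as exc:
        raise ValueError(f"invalid role: {label!r}") from exc

assert role_to_label(2) == "editor"
assert label_to_role("viewer") == 1
```

Keeping labels at the API boundary means clients are unaffected by the TextChoices-to-IntegerChoices storage migration.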

View File

@@ -734,10 +734,7 @@ class TestThreadListAPI:
assert str(thread1.id) not in thread_ids
assert str(thread2.id) in thread_ids
assert str(thread3.id) in thread_ids
assert (
response.data["results"][0]["user_role"]
== enums.ThreadAccessRoleChoices.VIEWER
)
assert response.data["results"][0]["user_role"] == "viewer"
# check that the accesses are returned
assert len(response.data["results"][0]["accesses"]) == 1
access = response.data["results"][1]["accesses"][0]
@@ -745,11 +742,11 @@ class TestThreadListAPI:
assert access["mailbox"]["id"] == str(access2.mailbox.id)
assert access["mailbox"]["email"] == str(access2.mailbox)
assert access["mailbox"]["name"] == access2.mailbox.contact.name
assert access["role"] == access2.role
assert access["role"] == enums.ThreadAccessRoleChoices(access2.role).label
assert access["mailbox"]["id"] == str(access2.mailbox.id)
assert access["mailbox"]["email"] == str(access2.mailbox)
assert access["mailbox"]["name"] == access2.mailbox.contact.name
assert access["role"] == access2.role
assert access["role"] == enums.ThreadAccessRoleChoices(access2.role).label
def test_list_threads_unauthorized(self, api_client, url):
"""Test listing threads without authentication."""

View File

@@ -39,7 +39,6 @@ def test_api_users_retrieve_me_authenticated():
"id": str(user.id),
"email": user.email,
"full_name": user.full_name,
"short_name": user.short_name,
"abilities": {
"create_maildomains": False,
"view_maildomains": False,
@@ -175,4 +174,3 @@ def test_users_me_endpoint_includes_abilities_by_default():
assert "id" in data
assert "email" in data
assert "full_name" in data
assert "short_name" in data

View File

@@ -92,7 +92,7 @@ def test_authentication_getter_existing_user_with_email(monkeypatch):
When the user's info contains an email and targets an existing user,
"""
klass = OIDCAuthenticationBackend()
user = UserFactory(full_name="John Doe", short_name="John")
user = UserFactory(full_name="John Doe")
def get_userinfo_mocked(*args):
return {
@@ -129,9 +129,7 @@ def test_authentication_getter_existing_user_change_fields_sub(
and the user was identified by its "sub".
"""
klass = OIDCAuthenticationBackend()
user = UserFactory(
full_name="John Doe", short_name="John", email="john.doe@example.com"
)
user = UserFactory(full_name="John Doe", email="john.doe@example.com")
def get_userinfo_mocked(*args):
return {
@@ -151,7 +149,6 @@ def test_authentication_getter_existing_user_change_fields_sub(
user.refresh_from_db()
assert user.email == email
assert user.full_name == f"{first_name:s} {last_name:s}"
assert user.short_name == first_name
@override_settings(MESSAGES_TESTDOMAIN=None)
@@ -170,9 +167,7 @@ def test_authentication_getter_existing_user_change_fields_email(
and the user was identified by its "email" as fallback.
"""
klass = OIDCAuthenticationBackend()
user = UserFactory(
full_name="John Doe", short_name="John", email="john.doe@example.com"
)
user = UserFactory(full_name="John Doe", email="john.doe@example.com")
def get_userinfo_mocked(*args):
return {
@@ -192,7 +187,6 @@ def test_authentication_getter_existing_user_change_fields_email(
user.refresh_from_db()
assert user.email == email
assert user.full_name == f"{first_name:s} {last_name:s}"
assert user.short_name == first_name
def test_authentication_getter_new_user_no_email(monkeypatch):
@@ -236,7 +230,6 @@ def test_authentication_getter_new_user_with_email(monkeypatch):
assert user.sub == "123"
assert user.email == email
assert user.full_name == "John Doe"
assert user.short_name == "John"
assert user.password == "!"
assert models.User.objects.count() == 1
@@ -457,7 +450,6 @@ def test_authentication_verify_claims_success(monkeypatch):
assert user.sub == "123"
assert user.full_name == "Doe"
assert user.short_name is None
assert user.email == "john.doe@example.com"
@@ -491,7 +483,6 @@ def test_authentication_getter_new_user_with_testdomain(monkeypatch):
assert user.sub == "123"
assert user.full_name == "Doe"
assert user.short_name is None
assert user.email == "john.doe@sub.gouv.fr"
maildomain = models.MailDomain.objects.get(name="testdomain.bzh")

View File

View File

@@ -0,0 +1,303 @@
"""
Tests for DNS checking functionality.
"""
from unittest.mock import MagicMock, patch
import pytest
from dns.resolver import NXDOMAIN, YXDOMAIN, NoAnswer, NoNameservers, Timeout
from core.dns.check import check_dns_records, check_single_record
from core.models import MailDomain
@pytest.mark.django_db
class TestDNSChecking:
"""Test DNS checking functionality."""
def test_check_single_record_mx_correct(self, maildomain_factory):
"""Test checking a correct MX record."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock correct MX record
mock_answer = MagicMock()
mock_answer.preference = 10
mock_answer.exchange = "mx1.example.com"
mock_resolve.return_value = [mock_answer]
result = check_single_record(maildomain, expected_record)
assert result["status"] == "correct"
assert result["found"] == ["10 mx1.example.com"]
def test_check_single_record_mx_incorrect(self, maildomain_factory):
"""Test checking an incorrect MX record."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock incorrect MX record
mock_answer = MagicMock()
mock_answer.preference = 20
mock_answer.exchange = "mx2.example.com"
mock_resolve.return_value = [mock_answer]
result = check_single_record(maildomain, expected_record)
assert result["status"] == "incorrect"
assert result["found"] == ["20 mx2.example.com"]
def test_check_single_record_txt_correct(self, maildomain_factory):
"""Test checking a correct TXT record."""
maildomain = maildomain_factory(name="example.com")
expected_record = {
"type": "TXT",
"target": "@",
"value": "v=spf1 include:_spf.example.com -all",
}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock correct TXT record
mock_answer = MagicMock()
mock_answer.to_text.return_value = '"v=spf1 include:_spf.example.com -all"'
mock_resolve.return_value = [mock_answer]
result = check_single_record(maildomain, expected_record)
assert result["status"] == "correct"
assert result["found"] == ["v=spf1 include:_spf.example.com -all"]
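The MX and TXT assertions above imply how resolver answers are rendered into the `found` list: an MX answer becomes `"<preference> <exchange>"`, and a TXT answer's quoted `to_text()` output is stripped of its surrounding quotes. A minimal sketch of that rendering (the helper name `render_answer` is illustrative, not the actual `core.dns.check` code):

```python
from types import SimpleNamespace

def render_answer(record_type, answer):
    """Render a dnspython-style answer into the string the tests assert on."""
    if record_type == "MX":
        # MX answers combine preference and exchange host
        return f"{answer.preference} {answer.exchange}"
    if record_type == "TXT":
        # TXT answers come back quoted from to_text(); strip the quotes
        return answer.to_text().strip('"')
    return answer.to_text()

mx = SimpleNamespace(preference=10, exchange="mx1.example.com")
print(render_answer("MX", mx))  # matches the "10 mx1.example.com" assertion
```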
def test_check_single_record_missing(self, maildomain_factory):
"""Test checking a missing record."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock missing record
mock_resolve.side_effect = Exception("No records found")
result = check_single_record(maildomain, expected_record)
assert result["status"] == "error"
assert "No records found" in result["error"]
def test_check_single_record_nxdomain(self, maildomain_factory):
"""Test checking a record when domain doesn't exist."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock NXDOMAIN
mock_resolve.side_effect = NXDOMAIN()
result = check_single_record(maildomain, expected_record)
assert result["status"] == "missing"
assert result["error"] == "Domain not found"
def test_check_single_record_no_answer(self, maildomain_factory):
"""Test checking a record when no answer is returned."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock NoAnswer
mock_resolve.side_effect = NoAnswer()
result = check_single_record(maildomain, expected_record)
assert result["status"] == "missing"
assert result["error"] == "No records found"
def test_check_single_record_no_nameservers(self, maildomain_factory):
"""Test checking a record when no nameservers are found."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock NoNameservers
mock_resolve.side_effect = NoNameservers()
result = check_single_record(maildomain, expected_record)
assert result["status"] == "missing"
assert result["error"] == "No nameservers found"
def test_check_single_record_timeout(self, maildomain_factory):
"""Test checking a record when DNS query times out."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock Timeout
mock_resolve.side_effect = Timeout()
result = check_single_record(maildomain, expected_record)
assert result["status"] == "error"
assert result["error"] == "DNS query timeout"
def test_check_single_record_yxdomain(self, maildomain_factory):
"""Test checking a record when domain name is too long."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock YXDOMAIN
mock_resolve.side_effect = YXDOMAIN()
result = check_single_record(maildomain, expected_record)
assert result["status"] == "error"
assert result["error"] == "Domain name too long"
def test_check_single_record_generic_exception(self, maildomain_factory):
"""Test checking a record when a generic exception occurs."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock generic exception
mock_resolve.side_effect = Exception("Network error")
result = check_single_record(maildomain, expected_record)
assert result["status"] == "error"
assert "DNS query failed: Network error" in result["error"]
def test_check_single_record_mx_correct_format(self, maildomain_factory):
"""Test that MX records are formatted correctly in results."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock correct MX record
mock_answer = MagicMock()
mock_answer.preference = 10
mock_answer.exchange = "mx1.example.com"
mock_resolve.return_value = [mock_answer]
result = check_single_record(maildomain, expected_record)
assert result["status"] == "correct"
assert result["found"] == ["10 mx1.example.com"]
def test_check_single_record_mx_incorrect_format(self, maildomain_factory):
"""Test that MX records with wrong format are detected as incorrect."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "MX", "target": "@", "value": "10 mx1.example.com"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock MX record with different preference
mock_answer = MagicMock()
mock_answer.preference = 20
mock_answer.exchange = "mx1.example.com"
mock_resolve.return_value = [mock_answer]
result = check_single_record(maildomain, expected_record)
assert result["status"] == "incorrect"
assert result["found"] == ["20 mx1.example.com"]
def test_check_dns_records_multiple_records(self, maildomain_factory):
"""Test checking multiple DNS records."""
maildomain = maildomain_factory(name="example.com")
with patch.object(maildomain, "get_expected_dns_records") as mock_get_records:
mock_get_records.return_value = [
{"type": "MX", "target": "@", "value": "10 mx1.example.com"},
{
"type": "TXT",
"target": "@",
"value": "v=spf1 include:_spf.example.com -all",
},
]
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock responses for both records
mock_mx_answer = MagicMock()
mock_mx_answer.preference = 10
mock_mx_answer.exchange = "mx1.example.com"
mock_txt_answer = MagicMock()
mock_txt_answer.to_text.return_value = (
'"v=spf1 include:_spf.example.com -all"'
)
mock_resolve.side_effect = [
[mock_mx_answer], # MX record response
[mock_txt_answer], # TXT record response
]
results = check_dns_records(maildomain)
assert len(results) == 2
assert results[0]["type"] == "MX"
assert results[0]["_check"]["status"] == "correct"
assert results[1]["type"] == "TXT"
assert results[1]["_check"]["status"] == "correct"
def test_check_dns_records_mixed_status(self, maildomain_factory):
"""Test checking DNS records with mixed status (correct, incorrect, missing)."""
maildomain = maildomain_factory(name="example.com")
with patch.object(maildomain, "get_expected_dns_records") as mock_get_records:
mock_get_records.return_value = [
{"type": "MX", "target": "@", "value": "10 mx1.example.com"},
{
"type": "TXT",
"target": "@",
"value": "v=spf1 include:_spf.example.com -all",
},
{"type": "A", "target": "@", "value": "192.168.1.1"},
]
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock responses: correct MX, incorrect TXT, missing A
mock_mx_answer = MagicMock()
mock_mx_answer.preference = 10
mock_mx_answer.exchange = "mx1.example.com"
mock_resolve.side_effect = [
[mock_mx_answer], # Correct MX
[], # Incorrect TXT (empty response)
NoAnswer(), # Missing A record
]
results = check_dns_records(maildomain)
assert len(results) == 3
assert results[0]["_check"]["status"] == "correct"
assert results[1]["_check"]["status"] == "incorrect"
assert results[2]["_check"]["status"] == "missing"
def test_check_single_record_with_subdomain(self, maildomain_factory):
"""Test checking a record for a subdomain."""
maildomain = maildomain_factory(name="example.com")
expected_record = {"type": "A", "target": "www", "value": "192.168.1.1"}
with patch("core.dns.check.dns.resolver.resolve") as mock_resolve:
# Mock correct A record for subdomain
mock_answer = MagicMock()
mock_answer.to_text.return_value = "192.168.1.1"
mock_resolve.return_value = [mock_answer]
result = check_single_record(maildomain, expected_record)
assert result["status"] == "correct"
assert result["found"] == ["192.168.1.1"]
# Verify the query was made for the subdomain
mock_resolve.assert_called_once_with("www.example.com", "A")
@pytest.fixture(name="maildomain_factory")
def fixture_maildomain_factory():
"""Factory for creating test mail domains."""
def _create_maildomain(name="test.com"):
return MailDomain.objects.create(name=name)
return _create_maildomain
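Taken together, the exception-handling tests above pin down a status mapping for resolver failures. A self-contained sketch of that mapping (the exception classes below are local stand-ins for the `dns.resolver` ones, and `classify_dns_failure` is a hypothetical helper, not the actual implementation):

```python
# Stand-ins for dns.resolver's exception classes, so the sketch runs standalone.
class NXDOMAIN(Exception): pass
class NoAnswer(Exception): pass
class NoNameservers(Exception): pass
class Timeout(Exception): pass
class YXDOMAIN(Exception): pass

def classify_dns_failure(exc):
    """Map a resolver exception to the (status, error) pair the tests assert."""
    if isinstance(exc, NXDOMAIN):
        return "missing", "Domain not found"
    if isinstance(exc, NoAnswer):
        return "missing", "No records found"
    if isinstance(exc, NoNameservers):
        return "missing", "No nameservers found"
    if isinstance(exc, Timeout):
        return "error", "DNS query timeout"
    if isinstance(exc, YXDOMAIN):
        return "error", "Domain name too long"
    # Anything else is wrapped as a generic query failure
    return "error", f"DNS query failed: {exc}"
```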


@@ -0,0 +1,714 @@
"""
Tests for Scaleway DNS provider functionality.
"""
from unittest.mock import patch
from django.test.utils import override_settings
import pytest
from core.dns.providers.scaleway import ScalewayDNSProvider
@pytest.mark.django_db
# pylint: disable=protected-access,too-many-public-methods
class TestScalewayDNSProvider:
"""Test Scaleway DNS provider functionality."""
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_is_configured(self):
"""Test that is_configured returns True when properly configured."""
provider = ScalewayDNSProvider()
assert provider.is_configured() is True
@override_settings(
DNS_SCALEWAY_API_TOKEN="",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_is_not_configured_missing_token(self):
"""Test that is_configured returns False when API token is missing."""
provider = ScalewayDNSProvider()
assert provider.is_configured() is False
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_is_not_configured_missing_project(self):
"""Test that is_configured returns False when project ID is missing."""
provider = ScalewayDNSProvider()
assert provider.is_configured() is False
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_ttl_setting(self):
"""Test that Scaleway provider uses the TTL setting correctly."""
provider = ScalewayDNSProvider()
assert provider.ttl == 600
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_resolve_zone_components_root_domain(self):
"""Test that _resolve_zone_components handles root domains correctly."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zones") as mock_get_zones:
mock_get_zones.return_value = []
parent_domain, subdomain = provider._resolve_zone_components("example.com")
assert parent_domain == "example.com"
assert subdomain == ""
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_resolve_zone_components_subdomain_no_parent(self):
"""Test that _resolve_zone_components handles subdomains when no parent exists."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zones") as mock_get_zones:
mock_get_zones.return_value = []
parent_domain, subdomain = provider._resolve_zone_components(
"mail.example.com"
)
assert parent_domain == "example.com"
assert subdomain == "mail"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_resolve_zone_components_subdomain_with_parent(self):
"""Test that _resolve_zone_components finds existing parent zone."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zones") as mock_get_zones:
# Mock existing zones - example.com exists as a root domain
mock_get_zones.return_value = [
{"domain": "example.com", "subdomain": ""},
]
parent_domain, subdomain = provider._resolve_zone_components(
"mail.example.com"
)
assert parent_domain == "example.com"
assert subdomain == "mail"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_resolve_zone_components_nested_subdomain_with_parent(
self,
):
"""Test that _resolve_zone_components finds existing parent zone for nested subdomain."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zones") as mock_get_zones:
# Mock existing zones - mail.example.com exists as a subdomain
mock_get_zones.return_value = [
{"domain": "example.com", "subdomain": "mail"},
]
parent_domain, subdomain = provider._resolve_zone_components(
"smtp.mail.example.com"
)
assert parent_domain == "example.com"
assert subdomain == "smtp"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_resolve_zone_components_deep_nested_subdomain(self):
"""Test that _resolve_zone_components handles deeply nested subdomains."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zones") as mock_get_zones:
# Mock existing zones - example.com exists as a root domain
mock_get_zones.return_value = [
{"domain": "example.com", "subdomain": ""},
]
parent_domain, subdomain = provider._resolve_zone_components(
"smtp.mail.example.com"
)
assert parent_domain == "example.com"
assert subdomain == "smtp.mail"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_resolve_zone_components_multiple_potential_parents(self):
"""Test that _resolve_zone_components finds the closest existing parent."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zones") as mock_get_zones:
# Mock existing zones - both example.com and mail.example.com exist
mock_get_zones.return_value = [
{"domain": "example.com", "subdomain": ""},
{"domain": "example.com", "subdomain": "mail"},
]
# Should find mail.example.com as the closest parent
parent_domain, subdomain = provider._resolve_zone_components(
"smtp.mail.example.com"
)
assert parent_domain == "example.com"
assert subdomain == "smtp"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_resolve_zone_components_no_parent_found(self):
"""Test that _resolve_zone_components creates new zone when no parent exists."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zones") as mock_get_zones:
# Mock existing zones - no relevant parent exists
mock_get_zones.return_value = [
{"domain": "other.com", "subdomain": ""},
]
parent_domain, subdomain = provider._resolve_zone_components(
"mail.example.com"
)
assert parent_domain == "example.com"
assert subdomain == "mail"
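The `_resolve_zone_components` tests above encode a resolution rule: prefer the closest (longest) existing zone that is a suffix of the domain, and otherwise fall back to treating the last two labels as the root. A standalone sketch under those assumptions (hypothetical helper, not the provider's actual private method):

```python
def resolve_zone_components(fqdn, existing_zones):
    """Return (root_domain, subdomain) for fqdn given Scaleway-style zone dicts."""
    def zone_name(zone):
        # A zone dict like {"domain": "example.com", "subdomain": "mail"}
        # names the zone "mail.example.com"; apex zones have subdomain "".
        return f"{zone['subdomain']}.{zone['domain']}" if zone["subdomain"] else zone["domain"]

    # Closest (longest) existing parent zone wins.
    parents = [
        zone_name(z)
        for z in existing_zones
        if fqdn == zone_name(z) or fqdn.endswith("." + zone_name(z))
    ]
    if parents:
        best = max(parents, key=len)
        root = next(z["domain"] for z in existing_zones if zone_name(z) == best)
        sub = fqdn[: -(len(best) + 1)] if fqdn != best else ""
        return root, sub

    # No existing parent: assume the registrable root is the last two labels.
    labels = fqdn.split(".")
    return ".".join(labels[-2:]), ".".join(labels[:-2])
```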
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_create_zone_root_domain(self):
"""Test that create_zone handles root domains correctly."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "get_zones") as mock_get_zones,
):
mock_request.return_value = {"dns_zone": {"domain": "example.com"}}
mock_get_zones.return_value = []
provider.create_zone("example.com")
# Verify correct parameters for root domain
call_args = mock_request.call_args
assert call_args[0][0] == "POST" # HTTP method
assert call_args[0][1] == "dns-zones" # endpoint
assert call_args[0][2]["domain"] == "example.com"
assert call_args[0][2]["subdomain"] == ""
assert call_args[0][2]["project_id"] == "test-project"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_create_zone_subdomain(self):
"""Test that create_zone handles subdomains correctly."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "get_zones") as mock_get_zones,
):
mock_request.return_value = {"dns_zone": {"domain": "example.com"}}
mock_get_zones.return_value = []
provider.create_zone("mail.example.com")
# Verify correct parameters for subdomain
call_args = mock_request.call_args
assert call_args[0][0] == "POST" # HTTP method
assert call_args[0][1] == "dns-zones" # endpoint
assert call_args[0][2]["domain"] == "example.com"
assert call_args[0][2]["subdomain"] == "mail"
assert call_args[0][2]["project_id"] == "test-project"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_create_zone_subdomain_with_existing_parent(self):
"""Test that create_zone creates sub-zone when parent exists."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "get_zones") as mock_get_zones,
):
mock_request.return_value = {"dns_zone": {"domain": "example.com"}}
# Mock existing parent zone
mock_get_zones.return_value = [
{"domain": "example.com", "subdomain": ""},
]
provider.create_zone("mail.example.com")
# Should still create sub-zone under existing parent
call_args = mock_request.call_args
assert call_args[0][0] == "POST" # HTTP method
assert call_args[0][1] == "dns-zones" # endpoint
assert call_args[0][2]["domain"] == "example.com"
assert call_args[0][2]["subdomain"] == "mail"
assert call_args[0][2]["project_id"] == "test-project"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_get_zone_name(self):
"""Test that _get_zone_name returns the correct zone name for API calls."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zone") as mock_get_zone:
mock_get_zone.return_value = None
# Should return the full domain name for API calls
assert provider._get_zone_name("example.com") == "example.com"
assert provider._get_zone_name("mail.example.com") == "mail.example.com"
assert (
provider._get_zone_name("smtp.mail.example.com")
== "smtp.mail.example.com"
)
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_format_record_name(self):
"""Test that _format_record_name formats record names correctly."""
provider = ScalewayDNSProvider()
# Test root domain record
assert provider._format_record_name("example.com", "example.com") == ""
# Test subdomain record
assert provider._format_record_name("test.example.com", "example.com") == "test"
# Test short name
assert provider._format_record_name("test", "example.com") == "test"
# Test empty name
assert provider._format_record_name("", "example.com") == ""
assert provider._format_record_name(None, "example.com") == ""
# Test edge cases
assert provider._format_record_name("www.example.com", "example.com") == "www"
assert provider._format_record_name("mail.example.com", "example.com") == "mail"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_create_record_with_default_ttl(self):
"""Test that create_record uses default TTL when not specified."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "_validate_zone_exists") as mock_validate,
):
mock_request.return_value = {"records": [{"id": "test-record"}]}
mock_validate.return_value = True
provider.create_record(
"example.com", "test.example.com", "A", "192.168.1.1"
)
# Verify TTL was passed correctly and record name is formatted
call_args = mock_request.call_args
assert call_args[0][0] == "PATCH" # HTTP method
assert call_args[0][1] == "dns-zones/example.com/records" # endpoint
assert call_args[0][2]["return_all_records"] is False
assert call_args[0][2]["changes"][0]["add"]["records"][0]["name"] == "test"
assert call_args[0][2]["changes"][0]["add"]["records"][0]["ttl"] == 600
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_create_record_with_custom_ttl(self):
"""Test that create_record uses custom TTL when specified."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "_validate_zone_exists") as mock_validate,
):
mock_request.return_value = {"records": [{"id": "test-record"}]}
mock_validate.return_value = True
provider.create_record(
"example.com", "test.example.com", "A", "192.168.1.1", ttl=300
)
# Verify custom TTL was passed correctly
call_args = mock_request.call_args
assert call_args[0][0] == "PATCH" # HTTP method
assert call_args[0][1] == "dns-zones/example.com/records" # endpoint
assert call_args[0][2]["return_all_records"] is False
assert call_args[0][2]["changes"][0]["add"]["records"][0]["name"] == "test"
assert call_args[0][2]["changes"][0]["add"]["records"][0]["ttl"] == 300
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_create_record_root_domain(self):
"""Test that create_record handles root domain records correctly."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "_validate_zone_exists") as mock_validate,
):
mock_request.return_value = {"records": [{"id": "test-record"}]}
mock_validate.return_value = True
provider.create_record("example.com", "example.com", "A", "192.168.1.1")
# Verify root domain record has empty name
call_args = mock_request.call_args
assert call_args[0][2]["changes"][0]["add"]["records"][0]["name"] == ""
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_update_record(self):
"""Test that update_record uses the correct API structure."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "_validate_zone_exists") as mock_validate,
):
mock_request.return_value = {"records": [{"id": "updated-record"}]}
mock_validate.return_value = True
provider.update_record(
"example.com",
"record-id",
"test.example.com",
"A",
"192.168.1.2",
ttl=300,
)
# Verify the correct API structure with formatted record name
call_args = mock_request.call_args
assert call_args[0][0] == "PATCH" # HTTP method
assert call_args[0][1] == "dns-zones/example.com/records" # endpoint
assert call_args[0][2]["return_all_records"] is False
assert call_args[0][2]["changes"][0]["set"]["id_fields"]["name"] == "test"
assert call_args[0][2]["changes"][0]["set"]["id_fields"]["type"] == "A"
assert call_args[0][2]["changes"][0]["set"]["records"][0]["name"] == "test"
assert call_args[0][2]["changes"][0]["set"]["records"][0]["ttl"] == 300
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_delete_record_by_name_type(self):
"""Test that delete_record_by_name_type uses the correct API structure."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_make_request") as mock_request,
patch.object(provider, "_validate_zone_exists") as mock_validate,
):
mock_request.return_value = {}
mock_validate.return_value = True
provider.delete_record_by_name_type("example.com", "test.example.com", "A")
# Verify the correct API structure with formatted record name
call_args = mock_request.call_args
assert call_args[0][0] == "PATCH" # HTTP method
assert call_args[0][1] == "dns-zones/example.com/records" # endpoint
assert call_args[0][2]["return_all_records"] is False
assert (
call_args[0][2]["changes"][0]["delete"]["id_fields"]["name"] == "test"
)
assert call_args[0][2]["changes"][0]["delete"]["id_fields"]["type"] == "A"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_find_records(self):
"""Test that find_records uses formatted record names."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_records") as mock_get_records:
mock_get_records.return_value = [
{"name": "test", "type": "A", "data": "192.168.1.1"},
{"name": "other", "type": "A", "data": "192.168.1.2"},
]
records = provider.find_records("example.com", "test.example.com", "A")
# Should find the record with formatted name
assert len(records) == 1
assert records[0]["name"] == "test"
assert records[0]["type"] == "A"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_record_exists(self):
"""Test that record_exists uses formatted record names."""
provider = ScalewayDNSProvider()
with patch.object(provider, "find_records") as mock_find_records:
mock_find_records.return_value = [
{"name": "test", "type": "A", "data": "192.168.1.1"},
]
exists = provider.record_exists(
"example.com", "test.example.com", "A", "192.168.1.1"
)
assert exists is True
# Verify that find_records was called with formatted name
mock_find_records.assert_called_once_with(
"example.com", "test.example.com", "A"
)
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_get_records(self):
"""Test that get_records uses correct zone name."""
provider = ScalewayDNSProvider()
with patch.object(provider, "_make_request") as mock_request:
mock_request.return_value = {"records": [{"name": "test", "type": "A"}]}
provider.get_records("example.com")
# Verify correct zone name is used
call_args = mock_request.call_args
assert call_args[0][0] == "GET" # HTTP method
assert call_args[0][1] == "dns-zones/example.com/records" # endpoint
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_get_records_subdomain(self):
"""Test that get_records works with subdomains."""
provider = ScalewayDNSProvider()
with patch.object(provider, "_make_request") as mock_request:
mock_request.return_value = {"records": [{"name": "test", "type": "A"}]}
provider.get_records("mail.example.com")
# Verify correct zone name is used for subdomain
call_args = mock_request.call_args
assert call_args[0][0] == "GET" # HTTP method
assert call_args[0][1] == "dns-zones/mail.example.com/records" # endpoint
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_get_zone_name_with_existing_zone(self):
"""Test that _get_zone_name returns correct zone name when zone exists."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zone") as mock_get_zone:
# Mock that the zone exists
mock_get_zone.return_value = {"domain": "example.com", "subdomain": ""}
zone_name = provider._get_zone_name("example.com")
assert zone_name == "example.com"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_get_zone_name_with_parent_zone(self):
"""Test that _get_zone_name finds parent zone when exact zone doesn't exist."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zone") as mock_get_zone:
# Mock that exact zone doesn't exist but parent does
mock_get_zone.side_effect = (
lambda domain: {"domain": "example.com", "subdomain": ""}
if domain == "example.com"
else None
)
zone_name = provider._get_zone_name("mail.example.com")
assert zone_name == "example.com"
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_validate_zone_exists(self):
"""Test that _validate_zone_exists works correctly."""
provider = ScalewayDNSProvider()
with patch.object(provider, "get_zone") as mock_get_zone:
# Test existing zone
mock_get_zone.return_value = {"domain": "example.com", "subdomain": ""}
assert provider._validate_zone_exists("example.com") is True
# Test non-existing zone
mock_get_zone.return_value = None
assert provider._validate_zone_exists("nonexistent.com") is False
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_create_record_zone_not_found(self):
"""Test that create_record raises error when zone doesn't exist."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_validate_zone_exists") as mock_validate,
patch.object(provider, "_get_zone_name") as mock_get_zone_name,
):
mock_validate.return_value = False
mock_get_zone_name.return_value = "nonexistent.com"
with pytest.raises(Exception, match="Zone not found"):
provider.create_record("nonexistent.com", "test", "A", "192.168.1.1")
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_update_record_zone_not_found(self):
"""Test that update_record raises error when zone doesn't exist."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_validate_zone_exists") as mock_validate,
patch.object(provider, "_get_zone_name") as mock_get_zone_name,
):
mock_validate.return_value = False
mock_get_zone_name.return_value = "nonexistent.com"
with pytest.raises(Exception, match="Zone not found"):
provider.update_record(
"nonexistent.com", "id", "test", "A", "192.168.1.1"
)
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_delete_record_zone_not_found(self):
"""Test that delete_record_by_name_type raises error when zone doesn't exist."""
provider = ScalewayDNSProvider()
with (
patch.object(provider, "_validate_zone_exists") as mock_validate,
patch.object(provider, "_get_zone_name") as mock_get_zone_name,
):
mock_validate.return_value = False
mock_get_zone_name.return_value = "nonexistent.com"
with pytest.raises(Exception, match="Zone not found"):
provider.delete_record_by_name_type("nonexistent.com", "test", "A")
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_handle_api_error_404(self):
"""Test that _handle_api_error handles 404 errors correctly."""
provider = ScalewayDNSProvider()
# Create a mock response
mock_response = type(
"MockResponse",
(),
{
"status_code": 404,
"json": lambda self: {"message": "Zone not found", "code": "not_found"},
"raise_for_status": lambda self: None,
},
)()
with pytest.raises(Exception, match="Zone not found"):
provider._handle_api_error(mock_response)
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_scaleway_provider_handle_api_error_409(self):
"""Test that _handle_api_error handles 409 errors correctly."""
provider = ScalewayDNSProvider()
# Create a mock response
mock_response = type(
"MockResponse",
(),
{
"status_code": 409,
"json": lambda self: {
"message": "Zone already exists",
"code": "conflict",
},
"raise_for_status": lambda self: None,
},
)()
with pytest.raises(Exception, match="Zone already exists"):
provider._handle_api_error(mock_response)
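The `_format_record_name` assertions earlier in this file fully determine the helper's behavior: Scaleway's record API takes names relative to the zone, with the empty string for the apex. A minimal sketch consistent with those assertions (hypothetical equivalent, not the provider's actual code):

```python
def format_record_name(name, zone):
    """Format a record name relative to its zone ("" means the zone apex)."""
    if not name or name == zone:
        # Empty, None, or the zone itself all map to the apex record name.
        return ""
    suffix = "." + zone
    if name.endswith(suffix):
        # Fully-qualified name inside the zone: strip the zone suffix.
        return name[: -len(suffix)]
    # Already a short relative name; pass it through unchanged.
    return name
```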


@@ -0,0 +1,375 @@
"""
Tests for DNS provisioning functionality.
"""
from unittest.mock import MagicMock, patch
from django.test.utils import override_settings
import pytest
from dns.resolver import NXDOMAIN, NoNameservers, Timeout
from core.dns.provisioning import (
check_and_provision_domain,
detect_dns_provider,
get_dns_provider,
provision_domain_dns,
)
from core.models import MailDomain
@pytest.mark.django_db
class TestDNSProvisioning:
"""Test DNS provisioning functionality."""
def test_detect_dns_provider_scaleway(self):
"""Test detection of Scaleway DNS provider."""
with patch("core.dns.provisioning.dns.resolver.resolve") as mock_resolve:
# Mock nameservers for Scaleway
mock_ns1 = MagicMock()
mock_ns1.target.to_text.return_value = "ns0.dom.scw.cloud."
mock_ns2 = MagicMock()
mock_ns2.target.to_text.return_value = "ns1.dom.scw.cloud."
mock_resolve.return_value = [mock_ns1, mock_ns2]
provider = detect_dns_provider("example.com")
assert provider == "scaleway"
def test_detect_dns_provider_unknown(self):
"""Test detection of unknown DNS provider."""
with patch("core.dns.provisioning.dns.resolver.resolve") as mock_resolve:
# Mock unknown nameservers
mock_ns1 = MagicMock()
mock_ns1.target.to_text.return_value = "ns1.unknown.com."
mock_ns2 = MagicMock()
mock_ns2.target.to_text.return_value = "ns2.unknown.com."
mock_resolve.return_value = [mock_ns1, mock_ns2]
provider = detect_dns_provider("example.com")
assert provider is None
def test_detect_dns_provider_exception(self):
"""Test DNS provider detection with exception."""
with patch("core.dns.provisioning.dns.resolver.resolve") as mock_resolve:
mock_resolve.side_effect = Exception("DNS error")
provider = detect_dns_provider("example.com")
assert provider is None
def test_detect_dns_provider_nxdomain(self):
"""Test DNS provider detection when domain doesn't exist."""
with patch("core.dns.provisioning.dns.resolver.resolve") as mock_resolve:
mock_resolve.side_effect = NXDOMAIN()
provider = detect_dns_provider("example.com")
assert provider is None
def test_detect_dns_provider_no_nameservers(self):
"""Test DNS provider detection when no nameservers are found."""
with patch("core.dns.provisioning.dns.resolver.resolve") as mock_resolve:
mock_resolve.side_effect = NoNameservers()
provider = detect_dns_provider("example.com")
assert provider is None
def test_detect_dns_provider_timeout(self):
"""Test DNS provider detection when query times out."""
with patch("core.dns.provisioning.dns.resolver.resolve") as mock_resolve:
mock_resolve.side_effect = Timeout()
provider = detect_dns_provider("example.com")
assert provider is None
@override_settings(
DNS_SCALEWAY_API_TOKEN="test-token",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_get_dns_provider_scaleway(self):
"""Test getting Scaleway DNS provider."""
provider = get_dns_provider("scaleway")
assert provider
@override_settings(
DNS_SCALEWAY_API_TOKEN="",
DNS_SCALEWAY_PROJECT_ID="test-project",
DNS_SCALEWAY_TTL=600,
)
def test_get_dns_provider_scaleway_not_configured(self):
"""Test getting Scaleway DNS provider when not configured."""
provider = get_dns_provider("scaleway")
assert provider is None
def test_get_dns_provider_unsupported(self):
"""Test getting unsupported DNS provider."""
provider = get_dns_provider("unsupported-provider")
assert provider is None
def test_provision_domain_dns_auto_detect(self, maildomain_factory):
"""Test DNS provisioning with auto-detection."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.detect_dns_provider") as mock_detect:
with patch("core.dns.provisioning.get_dns_provider") as mock_get_provider:
mock_detect.return_value = "scaleway"
mock_provider = MagicMock()
mock_provider.provision_domain_records.return_value = {
"success": True,
"created": [],
"updated": [],
"errors": [],
}
mock_get_provider.return_value = mock_provider
results = provision_domain_dns(maildomain)
assert results["success"] is True
assert results["provider"] == "scaleway"
# Verify the provider was called with domain and expected_records
mock_provider.provision_domain_records.assert_called_once()
call_args = mock_provider.provision_domain_records.call_args
assert call_args[0][0] == "example.com" # domain
assert isinstance(call_args[0][1], list) # expected_records
def test_provision_domain_dns_specific_provider(self, maildomain_factory):
"""Test DNS provisioning with specific provider."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.get_dns_provider") as mock_get_provider:
mock_provider = MagicMock()
mock_provider.provision_domain_records.return_value = {
"success": True,
"created": [{"type": "MX", "name": "@", "value": "10 mx1.example.com"}],
"updated": [],
"errors": [],
}
mock_get_provider.return_value = mock_provider
results = provision_domain_dns(maildomain, provider_name="scaleway")
assert results["success"] is True
assert results["provider"] == "scaleway"
assert len(results["created"]) == 1
def test_provision_domain_dns_no_provider_detected(self, maildomain_factory):
"""Test DNS provisioning when no provider is detected."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.detect_dns_provider") as mock_detect:
mock_detect.return_value = None
results = provision_domain_dns(maildomain)
assert results["success"] is False
assert "Could not detect DNS provider" in results["error"]
def test_provision_domain_dns_unsupported_provider(self, maildomain_factory):
"""Test DNS provisioning with unsupported provider."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.get_dns_provider") as mock_get_provider:
mock_get_provider.return_value = None
results = provision_domain_dns(maildomain, provider_name="unsupported")
assert results["success"] is False
assert "not supported" in results["error"]
def test_provision_domain_dns_provider_error(self, maildomain_factory):
"""Test DNS provisioning when provider raises an error."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.get_dns_provider") as mock_get_provider:
mock_provider = MagicMock()
mock_provider.provision_domain_records.side_effect = Exception(
"Provider error"
)
mock_get_provider.return_value = mock_provider
results = provision_domain_dns(maildomain, provider_name="scaleway")
assert results["success"] is False
assert "Failed to provision DNS records" in results["error"]
@override_settings(DNS_DEFAULT_PROVIDER="scaleway")
def test_provision_domain_dns_with_default_provider(self, maildomain_factory):
"""Test DNS provisioning using default provider from environment."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.detect_dns_provider") as mock_detect:
with patch("core.dns.provisioning.get_dns_provider") as mock_get_provider:
# No provider detected
mock_detect.return_value = None
mock_provider = MagicMock()
mock_provider.provision_domain_records.return_value = {
"success": True,
"created": [],
"updated": [],
"errors": [],
}
mock_get_provider.return_value = mock_provider
results = provision_domain_dns(maildomain)
assert results["success"] is True
assert results["provider"] == "scaleway"
# Verify the provider was called with default provider
mock_get_provider.assert_called_once_with("scaleway")
@override_settings(DNS_DEFAULT_PROVIDER=None)
def test_provision_domain_dns_no_provider_and_no_default(self, maildomain_factory):
"""Test DNS provisioning when no provider is detected and no default is configured."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.detect_dns_provider") as mock_detect:
# No provider detected
mock_detect.return_value = None
results = provision_domain_dns(maildomain)
assert results["success"] is False
assert (
"Could not detect DNS provider for domain example.com and no default provider configured"
in results["error"]
)
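The fallback behaviour these tests pin down — prefer the auto-detected provider, fall back to `DNS_DEFAULT_PROVIDER`, error out when neither exists — reduces to a small pure function. Names here are illustrative, not the module's actual API:

```python
def resolve_provider_name(domain, detected, default):
    """Pick a provider name: auto-detected first, then the configured
    default, else an error message matching the one the tests assert on."""
    if detected:
        return detected, None
    if default:
        return default, None
    return None, (
        f"Could not detect DNS provider for domain {domain} "
        "and no default provider configured"
    )
```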
def test_provision_domain_dns_pretend_mode(self, maildomain_factory):
"""Test DNS provisioning in pretend mode."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.get_dns_provider") as mock_get_provider:
mock_provider = MagicMock()
mock_provider.provision_domain_records.return_value = {
"success": True,
"created": [{"type": "MX", "name": "@", "value": "10 mx1.example.com"}],
"updated": [],
"errors": [],
"pretend": True,
}
mock_get_provider.return_value = mock_provider
results = provision_domain_dns(
maildomain, provider_name="scaleway", pretend=True
)
assert results["success"] is True
assert results["pretend"] is True
assert results["provider"] == "scaleway"
# Verify the provider was called with pretend=True
mock_provider.provision_domain_records.assert_called_once()
call_args = mock_provider.provision_domain_records.call_args
assert call_args[1]["pretend"] is True
def test_check_and_provision_domain_no_missing_records(self, maildomain_factory):
"""Test check_and_provision_domain when no records are missing."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.check_dns_records") as mock_check:
# Mock all records as correct
mock_check.return_value = [
{
"type": "MX",
"target": "@",
"value": "10 mx1.example.com",
"_check": {"status": "correct"},
},
{
"type": "TXT",
"target": "@",
"value": "v=spf1 include:_spf.example.com -all",
"_check": {"status": "correct"},
},
]
results = check_and_provision_domain(maildomain)
assert results["domain"] == "example.com"
assert results.get("provisioning_results") is None
assert len(results["check_results"]) == 2
assert "updated_check_results" not in results
def test_check_and_provision_domain_with_missing_records(self, maildomain_factory):
"""Test check_and_provision_domain when some records are missing."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.check_dns_records") as mock_check:
with patch("core.dns.provisioning.provision_domain_dns") as mock_provision:
# Mock mixed results: one correct, one missing
mock_check.return_value = [
{
"type": "MX",
"target": "@",
"value": "10 mx1.example.com",
"_check": {"status": "correct"},
},
{
"type": "TXT",
"target": "@",
"value": "v=spf1 include:_spf.example.com -all",
"_check": {"status": "missing"},
},
]
# Mock successful provisioning
mock_provision.return_value = {
"success": True,
"created": [
{
"type": "TXT",
"name": "@",
"value": "v=spf1 include:_spf.example.com -all",
}
],
"updated": [],
"errors": [],
}
results = check_and_provision_domain(maildomain)
assert results["domain"] == "example.com"
assert results["provisioning_results"] is not None
assert results["provisioning_results"]["success"] is True
assert len(results["check_results"]) == 2
# Should have updated check results after successful provisioning
assert "updated_check_results" in results
def test_check_and_provision_domain_provisioning_failure(self, maildomain_factory):
"""Test check_and_provision_domain when provisioning fails."""
maildomain = maildomain_factory(name="example.com")
with patch("core.dns.provisioning.check_dns_records") as mock_check:
with patch("core.dns.provisioning.provision_domain_dns") as mock_provision:
# Mock missing records
mock_check.return_value = [
{
"type": "MX",
"target": "@",
"value": "10 mx1.example.com",
"_check": {"status": "missing"},
}
]
# Mock failed provisioning
mock_provision.return_value = {"success": False, "error": "API error"}
results = check_and_provision_domain(maildomain)
assert results["domain"] == "example.com"
assert results["provisioning_results"] is not None
assert results["provisioning_results"]["success"] is False
assert "updated_check_results" not in results
@pytest.fixture(name="maildomain_factory")
def fixture_maildomain_factory():
"""Create a maildomain factory for testing."""
def _create_maildomain(name="test.com"):
return MailDomain.objects.create(name=name)
return _create_maildomain
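The factory-as-fixture pattern used here can be sketched without pytest or Django: the fixture body returns a callable, so each test creates exactly the objects it needs with per-test defaults. A dict stands in for the model row:

```python
def maildomain_factory_sketch():
    """Factory-as-fixture pattern, minus pytest/Django. In the real file
    the inner function calls MailDomain.objects.create(name=name) and the
    outer one is decorated with @pytest.fixture."""
    def _create(name="test.com"):
        return {"name": name}  # stand-in for a MailDomain row
    return _create
```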

View File

@@ -152,7 +152,7 @@ def test_import_eml_file(admin_client, eml_file, mailbox):
assert Message.objects.count() == 1
message = Message.objects.first()
assert message.subject == "Mon mail avec joli pj"
assert message.attachments.count() == 1
assert message.has_attachments is True
assert message.sender.email == "sender@example.com"
assert message.recipients.get().contact.email == "recipient@example.com"
assert message.sent_at == message.thread.messaged_at
@@ -234,7 +234,7 @@ def test_process_mbox_file_task(mailbox, mbox_file):
# Check messages
assert messages[0].subject == "Mon mail avec joli pj"
assert messages[0].attachments.count() == 1
assert messages[0].has_attachments is True
assert messages[1].subject == "Je t'envoie encore un message..."
body1 = messages[1].get_parsed_field("textBody")[0]["content"]

View File

@@ -184,6 +184,7 @@ Content-Disposition: attachment; filename="{attachment_data["filename"]}"
# Verify message content via API
assert message_data["sender"]["email"] == "sender@example.com"
assert message_data["has_attachments"] is True
assert len(message_data["textBody"]) >= 1
assert len(message_data["htmlBody"]) >= 1
@@ -224,6 +225,9 @@ Content-Disposition: attachment; filename="{attachment_data["filename"]}"
assert response.status_code == status.HTTP_200_OK
assert response.content == attachment_data["content"]
# TODO?
# assert response["Content-Type"] == attachment_data["content_type"]
# TODO: should we get a dynamic content type from the attachment?
assert response["Content-Type"] == "application/octet-stream"
assert (
response["Content-Disposition"]
== f'attachment; filename="{attachment_data["filename"]}"'
)

View File

@@ -1,7 +1,6 @@
"""End-to-End tests for message sending and receiving flow."""
# pylint: disable=too-many-positional-arguments
import base64
import json
import random
import time
@@ -11,13 +10,12 @@ from django.urls import reverse
import pytest
import requests
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from dkim import verify as dkim_verify
from rest_framework import status
from rest_framework.test import APIClient
from core import enums, factories, models
from core.mda.signing import generate_dkim_key
# --- Fixtures (copied/adapted from test_messages_create.py) --- #
@@ -67,17 +65,8 @@ def fixture_sender_contact(mailbox, authenticated_user):
return contact
# Test private key for DKIM
private_key_for_tests = rsa.generate_private_key(public_exponent=65537, key_size=1024)
public_key_der = private_key_for_tests.public_key().public_bytes(
encoding=serialization.Encoding.DER,
format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
private_key_pem = private_key_for_tests.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.PKCS8,
encryption_algorithm=serialization.NoEncryption(),
)
# Test private key for DKIM - using centralized generation
test_private_key, test_public_key = generate_dkim_key(key_size=1024)
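The mocked DNS answer further down in this test (`v=DKIM1; k=rsa; p=...`) follows the RFC 6376 TXT record layout. As a standalone helper, assuming the stored public key is already a base64-encoded DER SubjectPublicKeyInfo string, which is what `DKIMKey.public_key` appears to hold here:

```python
def dkim_txt_record(public_key_b64: str) -> str:
    """Build the RFC 6376 DNS TXT value for an RSA DKIM public key.

    Assumes public_key_b64 is the base64 DER SubjectPublicKeyInfo,
    matching what these tests store on the DKIMKey model.
    """
    return f"v=DKIM1; k=rsa; p={public_key_b64}"
```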
# --- E2E Test Class --- #
@@ -87,11 +76,6 @@ class TestE2EMessageOutboundFlow:
"""Test the outbound flow: API -> MDA -> Mailcatcher -> Verification."""
@override_settings(
# Ensure DKIM settings are configured for the test domain
MESSAGES_DKIM_DOMAINS=["example.com"], # Match the mailbox domain
MESSAGES_DKIM_SELECTOR="testselector",
MESSAGES_DKIM_PRIVATE_KEY_B64=base64.b64encode(private_key_pem).decode("utf-8"),
MESSAGES_DKIM_PRIVATE_KEY_FILE=None,
# Ensure MTA-OUT is configured to point to Mailcatcher
MTA_OUT_HOST="mailcatcher:1025",
MTA_OUT_SMTP_USERNAME=None,
@@ -103,6 +87,16 @@ class TestE2EMessageOutboundFlow:
):
"""Test creating a draft, sending it, receiving via mailcatcher, and verifying content/DKIM."""
# --- Setup --- #
# Create and configure DKIM key for the domain
dkim_key = models.DKIMKey.objects.create(
selector="testselector",
private_key=test_private_key,
public_key=test_public_key,
key_size=1024,
is_active=True,
domain=mailbox.domain,
)
# Grant necessary permissions
factories.MailboxAccessFactory(
mailbox=mailbox,
@@ -181,7 +175,7 @@ class TestE2EMessageOutboundFlow:
sent_message = models.Message.objects.get(id=draft_message_id)
assert not sent_message.is_draft
assert sent_message.sent_at is not None
assert len(sent_message.raw_mime) > 0 # Ensure raw_mime was generated
assert sent_message.blob # Ensure blob was generated
# --- Step 3: Wait for and Fetch from Mailcatcher --- #
# Increased wait time for E2E test involving network/docker
@@ -240,8 +234,8 @@ class TestE2EMessageOutboundFlow:
def get_dns_txt(fqdn, **kwargs):
# Mock DNS lookup for the public key
if fqdn == b"testselector._domainkey.example.com.":
# Format according to RFC 6376 TXT record format
return b"v=DKIM1; k=rsa; p=" + base64.b64encode(public_key_der)
# Use the public key stored in the DKIM key model
return f"v=DKIM1; k=rsa; p={dkim_key.public_key}".encode()
return None
# Ensure the DKIM-Signature header is present
@@ -263,7 +257,7 @@ class TestE2EMessageOutboundFlow:
assert local_mailbox_messages.count() == 1
local_message = local_mailbox_messages.first()
assert local_message.subject == subject
assert local_message.raw_mime == sent_message.raw_mime
assert local_message.blob.get_content() == sent_message.blob.get_content()
assert local_message.sender.email == sender_contact.email
assert local_message.parent is None
assert local_message.is_draft is False

View File

@@ -284,7 +284,7 @@ class TestDeliverInboundMessage:
assert message.sender.email == "sender@test.com"
assert message.sender.name == "Test Sender"
assert message.sender.mailbox == target_mailbox
assert message.raw_mime == raw_email_data
assert message.blob.get_content() == raw_email_data
assert message.mime_id == sample_parsed_email["message_id"]
assert message.read_at is None

View File

@@ -30,8 +30,14 @@ class TestSendOutboundMessage:
sender=sender_contact,
is_draft=False,
subject="Test Outbound",
raw_mime=b"From: sender@sendtest.com\nTo: to@example.com\nSubject: Test Outbound\n\nTest body",
)
# Create a blob with the raw MIME content
blob = mailbox.create_blob(
content=b"From: sender@sendtest.com\nTo: to@example.com\nSubject: Test Outbound\n\nTest body",
content_type="message/rfc822",
)
message.blob = blob
message.save()
# Add recipients
to_contact = factories.ContactFactory(mailbox=mailbox, email="to@example.com")
cc_contact = factories.ContactFactory(mailbox=mailbox, email="cc@example.com")
@@ -71,7 +77,7 @@ class TestSendOutboundMessage:
outbound.send_message(draft_message, force_mta_out=True)
# Check SMTP calls
mock_smtp.assert_called_once_with("smtp.test", 1025, timeout=10)
mock_smtp.assert_called_once_with("smtp.test", 1025, timeout=60)
mock_smtp_instance.ehlo.assert_called()
# Assume no TLS/auth configured in this test override
mock_smtp_instance.starttls.assert_not_called()
@@ -86,7 +92,7 @@ class TestSendOutboundMessage:
"bcc@example.com",
} # envelope_to
# Check that the signed message was sent
assert call_args[2].endswith(draft_message.raw_mime)
assert call_args[2].endswith(draft_message.blob.get_content())
# Check message object updated
draft_message.refresh_from_db()

Some files were not shown because too many files have changed in this diff.