Compare commits

...

15 Commits

Author SHA1 Message Date
Manuel Raynaud
0f0989d5d8 ♻️(ci) use arm64 runner to build images for arm64 architecture
We have trouble building Docker images for the arm64 architecture when we
use QEMU emulation. We want to try building them using a native arm64
runner.
2026-05-07 16:37:33 +02:00
Manuel Raynaud
fa3beca494 💚(docker) ignore .venv with compilemessages command
The compilemessages management command was also compiling messages for
all the libraries present in the .venv folder. We have to ignore it; on
arm64 this management command otherwise takes too much time.
2026-05-07 14:21:38 +00:00
Anthony LC
d340c8f1f1 🧐(frontend) dispatch the app version to posthog
We add the app version in Posthog events to be
able to track which versions are being used and
identify potential issues related to specific
versions.
2026-05-07 14:20:05 +02:00
Anthony LC
67773ef2d9 🩺(project) reload app if front and back unsync
We observe some cases where the frontend and
backend versions can get out of sync, which can
cause issues.
To mitigate this, we want to implement a mechanism
that detects when the frontend and backend
versions are mismatched and triggers a
reload of the application to ensure they are in sync.
2026-05-07 14:20:05 +02:00
Manuel Raynaud
1268bbe5ea 🔧(actions) migrate from pip to uv
Migrate usage of pip to uv in GitHub Actions. How Python is set up is
also changed.
2026-05-07 13:09:43 +02:00
Manuel Raynaud
8fc13d75dc ♻️(backend) migrate from setuptool to uv_build as build backend
We already migrated from pip to uv to manage our dependencies. We can also
migrate the build backend from setuptools to uv_build.
In the pyproject file, the readme property has been removed because
uv_build tries to read it, but the readme is at the root of the project
and not copied by the Dockerfile instructions. This readme can be used
when the package is published on PyPI, but that is not the case for Docs.
2026-05-07 13:09:43 +02:00
Manuel Raynaud
aea6fbef9b 🏗️(core) migrate from pip to uv
We want to remove the usage of pip in order to use uv as the Python
package manager.
2026-05-07 13:09:42 +02:00
Manuel Raynaud
a47c35195d 🐛(backend) replace document creation table locks with retry strategy
We have situations where the number of locks in the database can increase
dangerously, creating deadlocks. To avoid this, we decided to change the
strategy used to manage document creation concurrency.
We now use a retry strategy, trying to create the document
multiple times until a usable path is found. To avoid an
infinite loop, we use a max_attempts counter configurable via the
setting TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS
2026-05-07 11:52:48 +02:00
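The retry strategy described in this commit can be sketched as follows. This is a minimal sketch: the function name matches the one introduced in the diff further down, but the `PathConflict` exception is a stand-in for the database integrity error the real code would handle.

```python
class PathConflict(Exception):
    """Stand-in for the integrity error raised on a materialized-path collision."""


def create_tree_node_with_retry(create, max_attempts=10):
    """Call ``create`` until it succeeds or ``max_attempts`` is exhausted.

    Treebeard computes a materialized path on insert; under concurrency two
    inserts can compute the same path, so we retry until a usable path is
    found instead of locking the whole table.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return create()
        except PathConflict:
            if attempt == max_attempts:
                raise


# Simulated usage: the first two attempts collide, the third succeeds.
attempts = {"count": 0}


def flaky_add_root():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise PathConflict("path already taken")
    return "document"
```

As the diff shows, the real implementation wraps `Document.add_root` / `add_child` calls in a lambda, with the attempt limit read from the `TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS` setting.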
Manuel Raynaud
8f67b37d70 ♻️(backend) split core/utils.py module
We need to split core/utils.py into multiple submodules created in
core/utils/*.py. We need to do this to avoid circular imports between
this module and the models module.
2026-05-07 11:45:57 +02:00
Manuel Raynaud
0b20d9f435 🐛(backend) manage race condition between GET and PATCH content
When a PATCH and a GET on the content endpoint are made at the same time
for different users, a race condition can happen: the metadata returned
by the S3 head_object can be outdated by the time the object is fetched,
leading to an error because the Content-Length header does not match the
size of the response body. To avoid this, we no longer use head_object
followed by get_object; we manage everything in a single get_object call.
get_object also accepts an ETag or Last-Modified value as a parameter and
returns a 304 if the content has not changed, so we can use this to avoid
returning the entire body when it is unchanged.
2026-05-07 09:43:20 +00:00
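The single-call approach can be sketched like this. It is a hedged sketch: the helper names are illustrative, while the `IfNoneMatch`/`IfModifiedSince` parameters and the error codes handled are the ones visible in the diff further down.

```python
def build_get_object_kwargs(bucket, key, if_none_match=None, if_modified_since=None):
    """Build kwargs for a single conditional S3 get_object call."""
    kwargs = {"Bucket": bucket, "Key": key}
    if if_none_match:
        kwargs["IfNoneMatch"] = if_none_match
    if if_modified_since:
        kwargs["IfModifiedSince"] = if_modified_since
    return kwargs


def classify_conditional_error(code):
    """Map a conditional get_object error code to the HTTP status to return."""
    if code in ("304", "NotModified", "PreconditionFailed"):
        return 304  # content unchanged: skip the body entirely
    if code in ("404", "NoSuchKey"):
        return 200  # missing object: previous behaviour returns an empty body
    return None  # unexpected error: re-raise
```

The key point of the design is that the ETag, Last-Modified, and Body now come from the same `get_object` response, so they can no longer disagree with each other.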
Anthony LC
a166716a2f ⚡️(frontend) close websocket connection when user change tab
When a user changes to another tab, after a delay of "inactivity"
we disconnect the user from the collaboration server.
When the user comes back we reconnect to the server.
This reduces connections to the collaboration
server and reduces reconnection bursts during
an ingress nginx restart.
2026-05-06 16:19:40 +02:00
Manuel Raynaud
6fe0221596 (backend) new setting COLLABORATION_WS_INACTIVITY_TIMEOUT
We want to configure the timeout, in seconds, after which a user is
considered inactive. After this inactivity period we close the websocket
connection.
2026-05-06 16:19:40 +02:00
Anthony LC
bd662d72bf 🐛(frontend) fix loading comments transaction
When we load the comments we have to notify the
subscribers of the DocsThreadStore. This generates
a Yjs transaction that is currently treated as a
user-initiated content change that triggers
a patch request when the doc tries to save.
We now update the transaction origin when we notify
the subscribers so that we can reliably identify
and ignore those transactions in the useSaveDoc
hook.
2026-05-06 15:55:47 +02:00
Anthony LC
3701fe5a45 🐛(frontend) interlinking are exported correctly in print mode
Interlinking links are now exported correctly in print
mode. Previously an interlink was not rendered as a
link in print mode; now it is.
2026-05-06 15:12:24 +02:00
Anthony LC
0f527a789a 🔒️(frontend) sanitize color during collaboration
To improve security we sanitize the color used
for collaboration presence to ensure it's a valid
hex color. If the color is not valid, we generate
a random color instead. This prevents potential
issues with invalid color values being used in the UI.
2026-05-05 15:21:16 +02:00
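The sanitization described here can be sketched as follows. This is a minimal sketch in Python for illustration (the real frontend code is TypeScript), and the exact regex and random-color fallback are assumptions based on the commit message.

```python
import random
import re

# Accepts 3- or 6-digit hex colors such as "#fff" or "#1a2B3c".
HEX_COLOR_RE = re.compile(r"^#(?:[0-9A-Fa-f]{3}|[0-9A-Fa-f]{6})$")


def sanitize_presence_color(color):
    """Return ``color`` if it is a valid hex color, otherwise a random one.

    Rejecting anything that is not a plain hex color keeps untrusted
    collaboration payloads from injecting arbitrary strings into the UI.
    """
    if isinstance(color, str) and HEX_COLOR_RE.match(color):
        return color
    return f"#{random.randrange(0x1000000):06x}"
```

Falling back to a random valid color (rather than rejecting the user) keeps presence rendering working even when a peer sends a malformed value.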
57 changed files with 3838 additions and 750 deletions

View File

@@ -41,23 +41,17 @@ permissions:
contents: read
jobs:
build-and-push:
meta:
runs-on: ubuntu-latest
permissions:
contents: read
outputs:
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
amd64: ${{ steps.platform-tags.outputs.amd64 }}
arm64: ${{ steps.platform-tags.outputs.arm64 }}
amd64_first: ${{ steps.platform-tags.outputs.amd64_first }}
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
if: ${{ inputs.should_push }}
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_HUB_USER }}
password: ${{ secrets.DOCKER_HUB_PASSWORD }}
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
@@ -78,13 +72,23 @@ jobs:
echo "EOF"
echo "amd64_first=$FIRST_AMD64_TAG"
} >> "$GITHUB_OUTPUT"
# - name: Run trivy scan
# if: ${{ vars.TRIVY_SCAN_ENABLED }} == 'true'
# uses: numerique-gouv/action-trivy-cache@main
# with:
# docker-build-args: "--target ${{ inputs.target }} -f ${{ inputs.file }}"
# docker-image-name: "docker.io/${{ inputs.image_name }}:${{ github.sha }}"
# trivyignores: ./.github/.trivyignore
build-amd64:
needs: meta
runs-on: ubuntu-latest
permissions:
contents: read
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
if: ${{ inputs.should_push }}
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_HUB_USER }}
password: ${{ secrets.DOCKER_HUB_PASSWORD }}
- name: Build and push (amd64)
if: ${{ inputs.should_push }}||${{ vars.TRIVY_SCAN_ENABLED }} != 'true'
uses: docker/build-push-action@v6
@@ -98,10 +102,33 @@ jobs:
PUBLISH_AS_MIT=false
push: ${{ inputs.should_push }}
provenance: false
tags: ${{ steps.platform-tags.outputs.amd64 }}
labels: ${{ steps.meta.outputs.labels }}
tags: ${{ needs.meta.outputs.amd64 }}
labels: ${{ needs.meta.outputs.labels }}
- name: Cleanup Docker after build
if: always()
run: |
docker system prune -af
docker volume prune -f
build-arm64:
needs:
- meta
- build-amd64
if: ${{ inputs.should_push }}
runs-on: ubuntu-24.04-arm
permissions:
contents: read
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_HUB_USER }}
password: ${{ secrets.DOCKER_HUB_PASSWORD }}
- name: Build and push (arm64)
if: ${{ inputs.should_push }}
uses: docker/build-push-action@v6
with:
context: ${{ inputs.context }}
@@ -111,17 +138,38 @@ jobs:
build-args: |
DOCKER_USER=${{ inputs.docker_user }}
PUBLISH_AS_MIT=false
${{ inputs.arm64_reuse_amd64_build_arg && format('{0}={1}', inputs.arm64_reuse_amd64_build_arg, steps.platform-tags.outputs.amd64_first) || '' }}
push: ${{ inputs.should_push }}
${{ inputs.arm64_reuse_amd64_build_arg && format('{0}={1}', inputs.arm64_reuse_amd64_build_arg, needs.meta.outputs.amd64_first) || '' }}
push: true
provenance: false
tags: ${{ steps.platform-tags.outputs.arm64 }}
labels: ${{ steps.meta.outputs.labels }}
tags: ${{ needs.meta.outputs.arm64 }}
labels: ${{ needs.meta.outputs.labels }}
- name: Cleanup Docker after build
if: always()
run: |
docker system prune -af
docker volume prune -f
manifest:
needs:
- meta
- build-amd64
- build-arm64
if: ${{ inputs.should_push }}
runs-on: ubuntu-latest
permissions:
contents: read
outputs:
digest: ${{ steps.create-manifest.outputs.digest }}
steps:
- name: Login to DockerHub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_HUB_USER }}
password: ${{ secrets.DOCKER_HUB_PASSWORD }}
- name: Create multi-arch manifests
if: ${{ inputs.should_push }}
id: create-manifest
run: |
IMAGE="${{ inputs.image_name }}"
readarray -t TAGS <<< "${{ steps.meta.outputs.tags }}"
readarray -t TAGS <<< "${{ needs.meta.outputs.tags }}"
FIRST_TAG=""
for tag in "${TAGS[@]}"; do
[ -z "$tag" ] && continue
@@ -138,8 +186,3 @@ jobs:
DIGEST="sha256:$(docker buildx imagetools inspect "$FIRST_TAG" --raw | sha256sum | awk '{print $1}')"
echo "digest=$DIGEST" >> "$GITHUB_OUTPUT"
fi
- name: Cleanup Docker after build
if: always()
run: |
docker system prune -af
docker volume prune -f

View File

@@ -96,21 +96,20 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v6
- name: Install Python
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.13.3"
cache: "pip"
- name: Upgrade pip and setuptools
run: pip install --upgrade pip setuptools
- name: Install development dependencies
run: pip install --user .[dev]
python-version-file: "src/backend/pyproject.toml"
- name: Install uv
uses: astral-sh/setup-uv@v6
- name: Install the project
run: uv sync --locked --all-extras
- name: Check code formatting with ruff
run: ~/.local/bin/ruff format . --diff
run: uv run ruff format . --diff
- name: Lint code with ruff
run: ~/.local/bin/ruff check .
run: uv run ruff check .
- name: Lint code with pylint
run: ~/.local/bin/pylint .
run: uv run pylint .
test-back:
runs-on: ubuntu-latest
@@ -192,14 +191,14 @@ jobs:
mc mb impress/impress-media-storage && \
mc version enable impress/impress-media-storage"
- name: Install Python
- name: Set up Python
uses: actions/setup-python@v6
with:
python-version: "3.13.3"
cache: "pip"
- name: Install development dependencies
run: pip install --user .[dev]
python-version-file: "src/backend/pyproject.toml"
- name: Install uv
uses: astral-sh/setup-uv@v6
- name: Install the project
run: uv sync --locked --all-extras
- name: Install gettext (required to compile messages) and MIME support
run: |
@@ -208,7 +207,7 @@ jobs:
sudo wget https://raw.githubusercontent.com/suitenumerique/django-lasuite/refs/heads/main/assets/conf/mime.types -O /etc/mime.types
- name: Generate a MO file from strings extracted from the project
run: python manage.py compilemessages
run: uv run python manage.py compilemessages
- name: Run tests
run: ~/.local/bin/pytest -n 2
run: uv run pytest -n 2

View File

@@ -9,14 +9,29 @@ and this project adheres to
### Added
- ⚡️(frontend) add skeleton on content loading #2254
- ⚡️(frontend) close websocket connection when user change tab #2264
### Changed
- 🏗️(core) migrate from pip to uv
### Fixed
- 🩺(project) reload app if front and back unsync #2276
- 🐛(frontend) fix patch and comments #2273
- 🐛(frontend) interlinking are exported correctly in print mode #2269
- 💬(frontend) add missing link in onboarding description #2233
- 🐛(frontend) sanitize pasted and dropped content in document title #2210
- 🐛(frontend) Emoji menu doesn't display above comment box #2229
- 🐛(frontend) Block menu doesn't stay open on 1st line #2229
- 🐛(frontend) The "+" on the first line of a new doc doesn't work #2229
- 🐛(backend) manage race condition between GET and PATCH content #2271
- 🐛(backend) replace document creation table locks with retry strategy #2274
### Security
- 🔒️(frontend) sanitize color during collaboration #2270
## [v5.0.0] - 2026-04-08

View File

@@ -1,24 +1,37 @@
# Django impress
# ---- base image to inherit from ----
FROM python:3.13.3-alpine AS base
# Upgrade pip to its latest release to speed up dependencies installation
RUN python -m pip install --upgrade pip
FROM python:3.13.13-alpine AS base
# Upgrade system packages to install security updates
RUN apk update && apk upgrade --no-cache
# We must do that to avoid having an outdated pip version with security issues
RUN python -m pip install --upgrade pip
# ---- Back-end builder image ----
FROM base AS back-builder
WORKDIR /builder
ENV UV_COMPILE_BYTECODE=1
ENV UV_LINK_MODE=copy
# Copy required python dependencies
COPY ./src/backend /builder
# Disable Python downloads, because we want to use the system interpreter
# across both images. If using a managed Python version, it needs to be
# copied from the build image into the final image;
ENV UV_PYTHON_DOWNLOADS=0
RUN mkdir /install && \
pip install --prefix=/install .
# install uv
COPY --from=ghcr.io/astral-sh/uv:0.11.10 /uv /uvx /bin/
WORKDIR /app
RUN --mount=type=cache,target=/root/.cache/uv \
--mount=type=bind,source=src/backend/uv.lock,target=uv.lock \
--mount=type=bind,source=src/backend/pyproject.toml,target=pyproject.toml \
uv sync --locked --no-install-project --no-dev
COPY src/backend /app
RUN --mount=type=cache,target=/root/.cache/uv \
uv sync --locked --no-dev
# ---- mails ----
@@ -41,14 +54,13 @@ RUN apk add --no-cache \
pango \
rdfind
# Copy installed python dependencies
COPY --from=back-builder /install /usr/local
# Copy impress application (see .dockerignore)
COPY ./src/backend /app/
# Copy the application from the builder
COPY --from=back-builder /app /app
WORKDIR /app
ENV PATH="/app/.venv/bin:$PATH"
# collectstatic
RUN DJANGO_CONFIGURATION=Build \
python manage.py collectstatic --noinput
@@ -84,8 +96,12 @@ COPY ./docker/files/usr/local/bin/entrypoint /usr/local/bin/entrypoint
# docker user (see entrypoint).
RUN chmod g=u /etc/passwd
# Copy installed python dependencies
COPY --from=back-builder /install /usr/local
# Copy the application from the builder
COPY --from=back-builder /app /app
WORKDIR /app
ENV PATH="/app/.venv/bin:$PATH"
# Link certifi certificate from a static path /cert/cacert.pem to avoid issues
# when python is upgraded and the path to the certificate changes.
@@ -95,14 +111,9 @@ RUN mkdir /cert && \
mv $path /cert/ && \
ln -s /cert/cacert.pem $path
# Copy impress application (see .dockerignore)
COPY ./src/backend /app/
WORKDIR /app
# Generate compiled translation messages
RUN DJANGO_CONFIGURATION=Build \
python manage.py compilemessages
python manage.py compilemessages --ignore=".venv/**/*"
# We wrap commands run in this container by the following entrypoint that
@@ -119,10 +130,9 @@ USER root:root
# Install psql
RUN apk add --no-cache postgresql-client
# Uninstall impress and re-install it in editable mode along with development
# dependencies
RUN pip uninstall -y impress
RUN pip install -e .[dev]
# Install development dependencies
RUN --mount=from=ghcr.io/astral-sh/uv:0.11.10,source=/uv,target=/bin/uv \
uv sync --all-extras --locked
# Restore the un-privileged user running the application
ARG DOCKER_USER

View File

@@ -72,7 +72,7 @@ data/static:
# -- Project
create-env-local-files: ## create env.local files in env.d/development
create-env-local-files:
create-env-local-files:
@touch env.d/development/crowdin.local
@touch env.d/development/common.local
@touch env.d/development/postgresql.local
@@ -141,7 +141,7 @@ else
@echo "$(RESET)"
@echo "$(GREEN)Starting bootstrap process...$(RESET)"
endif
@echo ""
@echo ""
.PHONY: pre-beautiful-bootstrap
post-beautiful-bootstrap: ## Display a success message after bootstrap
@@ -235,7 +235,7 @@ run-backend: ## Start only the backend application and all needed services
.PHONY: run-backend
run: ## start the wsgi (production) and development server
run:
run:
@$(MAKE) run-backend
@$(COMPOSE) up --force-recreate -d frontend-development
.PHONY: run
@@ -322,7 +322,7 @@ superuser: ## Create an admin superuser with password "admin"
.PHONY: superuser
back-i18n-compile: ## compile the gettext files
@$(MANAGE) compilemessages --ignore="venv/**/*"
@$(MANAGE) compilemessages --ignore=".venv/**/*"
.PHONY: back-i18n-compile
back-i18n-generate: ## create the .pot files used for i18n

View File

@@ -80,6 +80,7 @@ services:
volumes:
- ./src/backend:/app
- ./data/static:/data/static
- /app/.venv
depends_on:
postgresql:
condition: service_healthy
@@ -108,6 +109,7 @@ services:
volumes:
- ./src/backend:/app
- ./data/static:/data/static
- /app/.venv
depends_on:
- app-dev

View File

@@ -34,6 +34,7 @@ These are the environment variables you can set for the `impress-backend` contai
| CACHES_DEFAULT_KEY_PREFIX | The prefix used to every cache keys. | docs |
| COLLABORATION_API_URL | Collaboration api host | |
| COLLABORATION_SERVER_SECRET | Collaboration api secret | |
| COLLABORATION_WS_INACTIVITY_TIMEOUT | Timeout (in seconds) after which the user is considered inactive when there is no activity. The WebSocket is closed after this inactivity period. `None` means disabled. | None |
| COLLABORATION_WS_NOT_CONNECTED_READY_ONLY | Users not connected to the collaboration server cannot edit | false |
| COLLABORATION_WS_URL | Collaboration websocket url | |
| CONVERSION_API_CONTENT_FIELD | Conversion api content field | content |
@@ -134,6 +135,7 @@ These are the environment variables you can set for the `impress-backend` contai
| THEME_CUSTOMIZATION_CACHE_TIMEOUT | Cache duration for the customization settings | 86400 |
| THEME_CUSTOMIZATION_FILE_PATH | Full path to the file customizing the theme. An example is provided in src/backend/impress/configuration/theme/default.json | BASE_DIR/impress/configuration/theme/default.json |
| TRASHBIN_CUTOFF_DAYS | Trashbin cutoff | 30 |
| TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS | Number of attempts to create a document before failing. | 10 |
| USER_OIDC_ESSENTIAL_CLAIMS | Essential claims in OIDC token | [] |
| USER_ONBOARDING_DOCUMENTS | A list of document IDs for which a read-only access will be created for new users | [] |
| USER_ONBOARDING_SANDBOX_DOCUMENT | ID of a template sandbox document that will be duplicated for new users | |

View File

@@ -78,6 +78,7 @@ COLLABORATION_SERVER_ORIGIN=http://localhost:3000
COLLABORATION_SERVER_SECRET=my-secret
COLLABORATION_WS_NOT_CONNECTED_READY_ONLY=true
COLLABORATION_WS_URL=ws://localhost:4444/collaboration/ws/
COLLABORATION_WS_INACTIVITY_TIMEOUT=15 # Seconds
DJANGO_SERVER_TO_SERVER_API_TOKENS=server-api-token
Y_PROVIDER_API_BASE_URL=http://y-provider-development:4444/api/

View File

@@ -7,7 +7,6 @@ from base64 import b64decode
from os.path import splitext
from django.conf import settings
from django.db import connection, transaction
from django.db.models import Q
from django.utils.functional import lazy
from django.utils.text import slugify
@@ -24,6 +23,7 @@ from core.services.converter_services import (
ConversionError,
Converter,
)
from core.utils.treebeard import create_tree_node_with_retry
class UserSerializer(serializers.ModelSerializer):
@@ -467,18 +467,12 @@ class ServerCreateDocumentSerializer(serializers.Serializer):
{"content": ["Could not convert content"]}
) from err
with transaction.atomic():
# locks the table to ensure safe concurrent access
with connection.cursor() as cursor:
cursor.execute(
f'LOCK TABLE "{models.Document._meta.db_table}" ' # noqa: SLF001
"IN SHARE ROW EXCLUSIVE MODE;"
)
document = models.Document.add_root(
document = create_tree_node_with_retry(
lambda: models.Document.add_root(
title=validated_data["title"],
creator=user,
)
)
if user:
# Associate the document with the pre-existing user

View File

@@ -1,5 +1,6 @@
"""Util to generate S3 authorization headers for object storage access control"""
import datetime as dt
import time
from abc import ABC, abstractmethod
@@ -199,3 +200,31 @@ class AIUserRateThrottle(AIBaseRateThrottle):
def get_content_metadata_cache_key(document_id):
"""Return the cache key used to store content metadata."""
return f"docs:content-metadata:{document_id!s}"
def parse_http_conditional_headers(request):
"""Extract and normalize `If-None-Match` and `If-Modified-Since`.
The `W/` weak prefix is stripped from the ETag because reverse proxies
(e.g. nginx with gzip) rewrite strong ETags into weak ones, which would
otherwise break a strict equality check in production.
"""
if_none_match = request.META.get("HTTP_IF_NONE_MATCH")
if if_none_match and if_none_match.startswith("W/"):
if_none_match = if_none_match.removeprefix("W/")
if_modified_since_dt = None
if not (if_modified_since := request.META.get("HTTP_IF_MODIFIED_SINCE")):
return if_none_match, if_modified_since_dt
try:
if_modified_since_dt = dt.datetime.strptime(
if_modified_since, "%a, %d %b %Y %H:%M:%S %Z"
)
except ValueError:
if_modified_since_dt = None
else:
if not if_modified_since_dt.tzinfo:
if_modified_since_dt = if_modified_since_dt.replace(tzinfo=dt.timezone.utc)
return if_none_match, if_modified_since_dt

View File

@@ -67,11 +67,10 @@ from core.services.search_indexers import (
get_visited_document_ids_of,
)
from core.tasks.mail import send_ask_for_access_mail
from core.utils import (
extract_attachments,
filter_descendants,
users_sharing_documents_with,
)
from core.utils.paths import filter_descendants
from core.utils.treebeard import create_tree_node_with_retry
from core.utils.users import users_sharing_documents_with
from core.utils.yjs import extract_attachments
from ..enums import FeatureFlag, SearchType
from . import permissions, serializers, utils
@@ -708,18 +707,12 @@ class DocumentViewSet(
{"file": ["Could not convert file content"]}
) from err
with transaction.atomic():
# locks the table to ensure safe concurrent access
with connection.cursor() as cursor:
cursor.execute(
f'LOCK TABLE "{models.Document._meta.db_table}" ' # noqa: SLF001
"IN SHARE ROW EXCLUSIVE MODE;"
)
obj = models.Document.add_root(
obj = create_tree_node_with_retry(
lambda: models.Document.add_root(
creator=self.request.user,
**serializer.validated_data,
)
)
serializer.instance = obj
models.DocumentAccess.objects.create(
document=obj,
@@ -1023,16 +1016,12 @@ class DocumentViewSet(
)
serializer.is_valid(raise_exception=True)
with transaction.atomic():
# "select_for_update" locks the table to ensure safe concurrent access
locked_parent = models.Document.objects.select_for_update().get(
pk=document.pk
)
child_document = locked_parent.add_child(
child_document = create_tree_node_with_retry(
lambda: document.add_child(
creator=request.user,
**serializer.validated_data,
)
)
# Set the created instance to the serializer
serializer.instance = child_document
@@ -1941,11 +1930,12 @@ class DocumentViewSet(
Retrieve the raw content file from s3 and stream it.
We implement a HTTP cache based on the ETag and LastModified headers.
We retrieve the ETag and LastModified from the S3 head operation, save them in cache to
reuse them in future requests.
The ETag and LastModified are retrieved in the S3 get_object operation to be consistent with
the content Body retrieved at the same time. These metadata are saved in cache for
future requests.
We check in the request if the ETag is present in the If-None-Match header and if it's the
same as the one from the S3 head operation, we return a 304 response.
If the ETag is not present or not the same, we do the same check based on the LastModifed
same as the one from the S3 get_object, we return a 304 response.
If the ETag is not present or not the same, we do the same check based on the LastModified
value if present in the If-Modified-Since header.
"""
document = self.get_object()
@@ -1955,73 +1945,69 @@ class DocumentViewSet(
# the web-socket re-connection burst.
connection.close()
if not (
content_metadata := cache.get(
utils.get_content_metadata_cache_key(document.id)
if_none_match, if_modified_since_dt = utils.parse_http_conditional_headers(
request
)
# First check if a cache is existing to return earlier a 304 without reaching s3
# if etag or last_modified have not changed.
cache_key = utils.get_content_metadata_cache_key(document.id)
if content_metadata := cache.get(cache_key):
if (if_none_match and if_none_match == content_metadata.get("etag")) or (
if_modified_since_dt
and dt.datetime.fromisoformat(content_metadata.get("last_modified"))
<= if_modified_since_dt
):
return drf_response.Response(status=status.HTTP_304_NOT_MODIFIED)
# Prepare get_object S3 operation. The get_object manages ETag and last_modified
# headers will raise a 304 client error if one of them matches the value existing in
# S3.
get_object_kwargs = {
"Bucket": default_storage.bucket_name,
"Key": document.file_key,
}
if if_none_match:
get_object_kwargs["IfNoneMatch"] = if_none_match
if if_modified_since_dt:
get_object_kwargs["IfModifiedSince"] = if_modified_since_dt
try:
s3_response = default_storage.connection.meta.client.get_object(
**get_object_kwargs
)
):
try:
file_metadata = default_storage.connection.meta.client.head_object(
Bucket=default_storage.bucket_name, Key=document.file_key
)
except ClientError:
return StreamingHttpResponse(
b"", content_type="text/plain", status=status.HTTP_200_OK
)
last_modified = file_metadata["LastModified"]
etag = file_metadata["ETag"]
size = file_metadata["ContentLength"]
cache.set(
utils.get_content_metadata_cache_key(document.id),
{
"last_modified": last_modified.isoformat(),
"etag": etag,
"size": size,
},
settings.CONTENT_METADATA_CACHE_TIMEOUT,
)
else:
last_modified = dt.datetime.fromisoformat(
content_metadata.get("last_modified")
)
etag = content_metadata.get("etag")
size = content_metadata.get("size")
# --- Check conditional headers from any client ---
if_none_match = request.META.get("HTTP_IF_NONE_MATCH") # contains ETag
if_modified_since = request.META.get("HTTP_IF_MODIFIED_SINCE")
# Strip the W/ weak prefix. Proxies (e.g. nginx with gzip) convert strong
# ETags to weak ones, so a strict equality check would fail on production
# even when unchanged.
if if_none_match and if_none_match.startswith("W/"):
if_none_match = if_none_match.removeprefix("W/")
if if_none_match and if_none_match == etag:
return drf_response.Response(status=status.HTTP_304_NOT_MODIFIED)
if if_modified_since:
try:
since = dt.datetime.strptime(
if_modified_since, "%a, %d %b %Y %H:%M:%S %Z"
)
except ValueError:
pass
else:
if not since.tzinfo:
since = since.replace(tzinfo=dt.timezone.utc)
if last_modified <= since:
except ClientError as exc:
code = exc.response["Error"]["Code"]
match code:
case "304" | "PreconditionFailed" | "NotModified":
return drf_response.Response(status=status.HTTP_304_NOT_MODIFIED)
case "NoSuchKey" | "404":
return StreamingHttpResponse(
b"", content_type="text/plain", status=200
)
case _:
raise
def _stream(file_key):
with default_storage.open(file_key, "rb") as f:
while chunk := f.read(8192):
yield chunk
last_modified = s3_response["LastModified"]
etag = s3_response["ETag"]
size = s3_response["ContentLength"]
# Refresh the metadata cache
cache.set(
cache_key,
{
"last_modified": last_modified.isoformat(),
"etag": etag,
},
settings.CONTENT_METADATA_CACHE_TIMEOUT,
)
def _stream(body):
yield from body.iter_chunks()
body.close()
response = StreamingHttpResponse(
streaming_content=_stream(document.file_key),
streaming_content=_stream(s3_response["Body"]),
content_type="text/plain",
status=status.HTTP_200_OK,
)
@@ -2831,6 +2817,7 @@ class ConfigView(drf.views.APIView):
"API_USERS_SEARCH_QUERY_MIN_LENGTH",
"COLLABORATION_WS_URL",
"COLLABORATION_WS_NOT_CONNECTED_READY_ONLY",
"COLLABORATION_WS_INACTIVITY_TIMEOUT",
"CONVERSION_FILE_EXTENSIONS_ALLOWED",
"CONVERSION_FILE_MAX_SIZE",
"CONVERSION_UPLOAD_ENABLED",
@@ -2854,6 +2841,7 @@ class ConfigView(drf.views.APIView):
dict_settings[setting] = getattr(settings, setting)
dict_settings["theme_customization"] = self._load_theme_customization()
dict_settings["RELEASE_VERSION"] = settings.RELEASE
return drf.response.Response(dict_settings)

View File

@@ -9,7 +9,7 @@ from django.db import migrations, models
from botocore.exceptions import ClientError
import core.models
from core.utils import extract_attachments
from core.utils.yjs import extract_attachments
def populate_attachments_on_all_documents(apps, schema_editor):

View File

@@ -19,7 +19,7 @@ from django.core.cache import cache
from django.core.files.base import ContentFile
from django.core.files.storage import default_storage
from django.core.mail import send_mail
from django.db import connection, models, transaction
from django.db import models, transaction
from django.db.models.functions import Left, Length
from django.template.loader import render_to_string
from django.utils import timezone
@@ -39,6 +39,7 @@ from core.choices import (
RoleChoices,
get_equivalent_link_definition,
)
from core.utils.treebeard import create_tree_node_with_retry
from core.validators import sub_validator
logger = getLogger(__name__)
@@ -265,8 +266,6 @@ class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
duplicate the sandbox document for the user
"""
if settings.USER_ONBOARDING_SANDBOX_DOCUMENT:
# transaction.atomic is used in a context manager to avoid a transaction if
# the settings USER_ONBOARDING_SANDBOX_DOCUMENT is unused
sandbox_id = settings.USER_ONBOARDING_SANDBOX_DOCUMENT
try:
template_document = Document.objects.get(id=sandbox_id)
@@ -276,20 +275,15 @@ class User(AbstractBaseUser, BaseModel, auth_models.PermissionsMixin):
sandbox_id,
)
return
with transaction.atomic():
# locks the table to ensure safe concurrent access
with connection.cursor() as cursor:
cursor.execute(
f'LOCK TABLE "{Document._meta.db_table}" ' # noqa: SLF001
"IN SHARE ROW EXCLUSIVE MODE;"
sandbox_document = create_tree_node_with_retry(
lambda: Document.add_root(
title=template_document.title,
content=template_document.content,
attachments=template_document.attachments,
duplicated_from=template_document,
creator=self,
)
sandbox_document = Document.add_root(
title=template_document.title,
content=template_document.content,
attachments=template_document.attachments,
duplicated_from=template_document,
creator=self,
)
DocumentAccess.objects.create(

View File

@@ -12,8 +12,11 @@ from django.utils.module_loading import import_string
import requests
from core import models, utils
from core import models
from core.enums import SearchType
from core.utils.dicts import get_value_by_pattern
from core.utils.paths import get_ancestor_to_descendants_map
from core.utils.yjs import base64_yjs_to_text
logger = logging.getLogger(__name__)
@@ -44,7 +47,7 @@ def get_batch_accesses_by_users_and_teams(paths):
Get accesses related to a list of document paths,
grouped by users and teams, including all ancestor paths.
"""
ancestor_map = utils.get_ancestor_to_descendants_map(
ancestor_map = get_ancestor_to_descendants_map(
paths, steplen=models.Document.steplen
)
ancestor_paths = list(ancestor_map.keys())
@@ -297,7 +300,7 @@ class FindDocumentIndexer(BaseDocumentIndexer):
>>> get_title({"id": 1})
""
"""
titles = utils.get_value_by_pattern(source, r"^title\.")
titles = get_value_by_pattern(source, r"^title\.")
for title in titles:
if title:
return title
@@ -318,7 +321,7 @@ class FindDocumentIndexer(BaseDocumentIndexer):
"""
doc_path = document.path
doc_content = document.content
text_content = utils.base64_yjs_to_text(doc_content) if doc_content else ""
text_content = base64_yjs_to_text(doc_content) if doc_content else ""
return {
"id": str(document.id),

View File

@@ -11,7 +11,7 @@ from django.dispatch import receiver
from core import models
from core.tasks.search import trigger_batch_document_indexer
from core.utils import get_users_sharing_documents_with_cache_key
from core.utils.users import get_users_sharing_documents_with_cache_key
@receiver(signals.post_save, sender=models.Document)

View File

@@ -26,6 +26,7 @@ pytestmark = pytest.mark.django_db
API_USERS_SEARCH_QUERY_MIN_LENGTH=6,
COLLABORATION_WS_URL="http://testcollab/",
COLLABORATION_WS_NOT_CONNECTED_READY_ONLY=True,
COLLABORATION_WS_INACTIVITY_TIMEOUT=300,
CONVERSION_UPLOAD_ENABLED=False,
CRISP_WEBSITE_ID="123",
FRONTEND_CSS_URL="http://testcss/",
@@ -33,6 +34,7 @@ pytestmark = pytest.mark.django_db
FRONTEND_THEME="test-theme",
MEDIA_BASE_URL="http://testserver/",
POSTHOG_KEY={"id": "132456", "host": "https://eu.i.posthog-test.com"},
RELEASE="1.0.0",
SENTRY_DSN="https://sentry.test/123",
THEME_CUSTOMIZATION_FILE_PATH="",
)
@@ -55,6 +57,7 @@ def test_api_config(is_authenticated):
"API_USERS_SEARCH_QUERY_MIN_LENGTH": 6,
"COLLABORATION_WS_URL": "http://testcollab/",
"COLLABORATION_WS_NOT_CONNECTED_READY_ONLY": True,
"COLLABORATION_WS_INACTIVITY_TIMEOUT": 300,
"CONVERSION_FILE_EXTENSIONS_ALLOWED": [".docx", ".md"],
"CONVERSION_FILE_MAX_SIZE": 20971520,
"CONVERSION_UPLOAD_ENABLED": False,
@@ -75,6 +78,7 @@ def test_api_config(is_authenticated):
"LANGUAGE_CODE": "en-us",
"MEDIA_BASE_URL": "http://testserver/",
"POSTHOG_KEY": {"id": "132456", "host": "https://eu.i.posthog-test.com"},
"RELEASE_VERSION": "1.0.0",
"SENTRY_DSN": "https://sentry.test/123",
"TRASHBIN_CUTOFF_DAYS": 30,
"theme_customization": {},

View File

@@ -0,0 +1,52 @@
"""
Unit tests for the parse_http_conditional_headers utility function.
"""
import datetime as dt
import pytest
from rest_framework.test import APIRequestFactory
from core.api.utils import parse_http_conditional_headers
@pytest.fixture(name="prepare_request")
def fixture_prepare_request(request):
"""
Fixture returning a request with headers configured from the indirect parametrize parameters.
"""
return APIRequestFactory().get("/", headers=request.param)
@pytest.mark.parametrize(
"prepare_request, expected_if_none_match, expected_if_modified_since",
[
({}, None, None),
({"if-none-match": '"abc123"'}, '"abc123"', None),
({"if-none-match": 'W/"abc123"'}, '"abc123"', None),
(
{"if-modified-since": "Wed, 21 Oct 2015 07:28:00 GMT"},
None,
dt.datetime(2015, 10, 21, 7, 28, 0, tzinfo=dt.timezone.utc),
),
({"if-modified-since": "not-a-date"}, None, None),
(
{
"if-none-match": 'W/"deadbeef"',
"if-modified-since": "Wed, 21 Oct 2015 07:28:00 GMT",
},
'"deadbeef"',
dt.datetime(2015, 10, 21, 7, 28, 0, tzinfo=dt.timezone.utc),
),
],
indirect=["prepare_request"],
)
def test_api_utils_parse_http_conditional_headers(
prepare_request, expected_if_none_match, expected_if_modified_since
):
"""Test parse_http_conditional_headers utils."""
if_none_match, if_modified_since_dt = parse_http_conditional_headers(
prepare_request
)
assert if_none_match == expected_if_none_match
assert if_modified_since_dt == expected_if_modified_since
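The implementation under test lives in `core.api.utils` and is not shown in this diff; a framework-free sketch of the same parsing, assuming the helper strips the weak-validator prefix (`W/`) from `If-None-Match` and returns `None` for an unparseable `If-Modified-Since` date (the function name here is hypothetical):

```python
from email.utils import parsedate_to_datetime


def parse_conditional_headers(headers):
    """Return (if_none_match, if_modified_since) from a header mapping.

    Illustrative re-implementation: weak validators ("W/" prefix) are
    stripped, and a date that cannot be parsed yields None.
    """
    if_none_match = headers.get("if-none-match")
    if if_none_match and if_none_match.startswith("W/"):
        if_none_match = if_none_match[2:]

    if_modified_since = None
    raw_date = headers.get("if-modified-since")
    if raw_date:
        try:
            if_modified_since = parsedate_to_datetime(raw_date)
        except (TypeError, ValueError):
            # parsedate_to_datetime raises on malformed dates (3.10+)
            if_modified_since = None
    return if_none_match, if_modified_since


etag, since = parse_conditional_headers(
    {
        "if-none-match": 'W/"abc123"',
        "if-modified-since": "Wed, 21 Oct 2015 07:28:00 GMT",
    }
)
# etag is the validator with "W/" stripped; since is an aware UTC datetime
```

This mirrors the parametrized cases above: a bare validator passes through, a weak one is unwrapped, and `"not-a-date"` maps to `None`.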

View File

@@ -12,13 +12,14 @@ import pytest
import responses
from requests import HTTPError
from core import factories, models, utils
from core import factories, models
from core.services.search_indexers import (
BaseDocumentIndexer,
FindDocumentIndexer,
get_document_indexer,
get_visited_document_ids_of,
)
from core.utils.yjs import base64_yjs_to_text
pytestmark = pytest.mark.django_db
@@ -199,7 +200,7 @@ def test_services_search_indexers_serialize_document_returns_expected_json():
"depth": 1,
"path": document.path,
"numchild": 1,
"content": utils.base64_yjs_to_text(document.content),
"content": base64_yjs_to_text(document.content),
"created_at": document.created_at.isoformat(),
"updated_at": document.updated_at.isoformat(),
"reach": document.link_reach,

View File

@@ -8,7 +8,18 @@ from django.core.cache import cache
import pycrdt
import pytest
from core import factories, utils
from core import factories
from core.utils.dicts import get_value_by_pattern
from core.utils.paths import get_ancestor_to_descendants_map
from core.utils.users import (
get_users_sharing_documents_with_cache_key,
users_sharing_documents_with,
)
from core.utils.yjs import (
base64_yjs_to_text,
base64_yjs_to_xml,
extract_attachments,
)
pytestmark = pytest.mark.django_db
@@ -34,12 +45,12 @@ TEST_BASE64_STRING = (
def test_utils_base64_yjs_to_text():
"""Test extract text from saved yjs document"""
assert utils.base64_yjs_to_text(TEST_BASE64_STRING) == "Hello w or ld"
assert base64_yjs_to_text(TEST_BASE64_STRING) == "Hello w or ld"
def test_utils_base64_yjs_to_xml():
"""Test extract xml from saved yjs document"""
content = utils.base64_yjs_to_xml(TEST_BASE64_STRING)
content = base64_yjs_to_xml(TEST_BASE64_STRING)
assert (
'<heading textAlignment="left" level="1"><italic>Hello</italic></heading>'
in content
@@ -79,13 +90,13 @@ def test_utils_extract_attachments():
update = ydoc.get_update()
base64_string = base64.b64encode(update).decode("utf-8")
# image_key2 is missing the "/media/" part and shouldn't get extracted
assert utils.extract_attachments(base64_string) == [image_key1, image_key3]
assert extract_attachments(base64_string) == [image_key1, image_key3]
def test_utils_get_ancestor_to_descendants_map_single_path():
"""Test ancestor mapping of a single path."""
paths = ["000100020005"]
result = utils.get_ancestor_to_descendants_map(paths, steplen=4)
result = get_ancestor_to_descendants_map(paths, steplen=4)
assert result == {
"0001": {"000100020005"},
@@ -97,7 +108,7 @@ def test_utils_get_ancestor_to_descendants_map_single_path():
def test_utils_get_ancestor_to_descendants_map_multiple_paths():
"""Test ancestor mapping of multiple paths with shared prefixes."""
paths = ["000100020005", "00010003"]
result = utils.get_ancestor_to_descendants_map(paths, steplen=4)
result = get_ancestor_to_descendants_map(paths, steplen=4)
assert result == {
"0001": {"000100020005", "00010003"},
@@ -119,10 +130,10 @@ def test_utils_users_sharing_documents_with_cache_miss():
factories.UserDocumentAccessFactory(user=user2, document=doc1)
factories.UserDocumentAccessFactory(user=user3, document=doc2)
cache_key = utils.get_users_sharing_documents_with_cache_key(user1)
cache_key = get_users_sharing_documents_with_cache_key(user1)
cache.delete(cache_key)
result = utils.users_sharing_documents_with(user1)
result = users_sharing_documents_with(user1)
assert user2.id in result
@@ -139,12 +150,12 @@ def test_utils_users_sharing_documents_with_cache_hit():
factories.UserDocumentAccessFactory(user=user1, document=doc1)
factories.UserDocumentAccessFactory(user=user2, document=doc1)
cache_key = utils.get_users_sharing_documents_with_cache_key(user1)
cache_key = get_users_sharing_documents_with_cache_key(user1)
test_cached_data = {user2.id: "2025-02-10"}
cache.set(cache_key, test_cached_data, 86400)
result = utils.users_sharing_documents_with(user1)
result = users_sharing_documents_with(user1)
assert result == test_cached_data
@@ -156,7 +167,7 @@ def test_utils_users_sharing_documents_with_cache_invalidation_on_create():
doc1 = factories.DocumentFactory()
# Pre-populate cache
cache_key = utils.get_users_sharing_documents_with_cache_key(user1)
cache_key = get_users_sharing_documents_with_cache_key(user1)
cache.set(cache_key, {}, 86400)
# Verify cache exists
@@ -182,7 +193,7 @@ def test_utils_users_sharing_documents_with_cache_invalidation_on_delete():
doc_access = factories.UserDocumentAccessFactory(user=user1, document=doc1)
cache_key = utils.get_users_sharing_documents_with_cache_key(user1)
cache_key = get_users_sharing_documents_with_cache_key(user1)
cache.set(cache_key, {user2.id: "2025-02-10"}, 86400)
assert cache.get(cache_key) is not None
@@ -196,10 +207,10 @@ def test_utils_users_sharing_documents_with_empty_result():
"""Test when user is not sharing any documents."""
user1 = factories.UserFactory()
cache_key = utils.get_users_sharing_documents_with_cache_key(user1)
cache_key = get_users_sharing_documents_with_cache_key(user1)
cache.delete(cache_key)
result = utils.users_sharing_documents_with(user1)
result = users_sharing_documents_with(user1)
assert result == {}
@@ -210,7 +221,7 @@ def test_utils_users_sharing_documents_with_empty_result():
def test_utils_get_value_by_pattern_matching_key():
"""Test extracting value from a dictionary with a matching key pattern."""
data = {"title.extension": "Bonjour", "id": 1, "content": "test"}
result = utils.get_value_by_pattern(data, r"^title\.")
result = get_value_by_pattern(data, r"^title\.")
assert set(result) == {"Bonjour"}
@@ -218,7 +229,7 @@ def test_utils_get_value_by_pattern_matching_key():
def test_utils_get_value_by_pattern_multiple_matches():
"""Test that all matching keys are returned."""
data = {"title.extension_1": "Bonjour", "title.extension_2": "Hello", "id": 1}
result = utils.get_value_by_pattern(data, r"^title\.")
result = get_value_by_pattern(data, r"^title\.")
assert set(result) == {
"Bonjour",
@@ -229,7 +240,7 @@ def test_utils_get_value_by_pattern_multiple_matches():
def test_utils_get_value_by_pattern_multiple_extensions():
"""Test that all matching keys are returned."""
data = {"title.extension_1.extension_2": "Bonjour", "id": 1}
result = utils.get_value_by_pattern(data, r"^title\.")
result = get_value_by_pattern(data, r"^title\.")
assert set(result) == {"Bonjour"}
@@ -237,6 +248,6 @@ def test_utils_get_value_by_pattern_multiple_extensions():
def test_utils_get_value_by_pattern_no_match():
"""Test that empty list is returned when no key matches the pattern."""
data = {"name": "Test", "id": 1}
result = utils.get_value_by_pattern(data, r"^title\.")
result = get_value_by_pattern(data, r"^title\.")
assert result == []

View File

@@ -0,0 +1,89 @@
"""Tests for the create_tree_node_with_retry utils."""
from unittest import mock
from django.core.exceptions import ValidationError as DjangoValidationError
from django.db import IntegrityError
import pytest
from core.factories import UserFactory
from core.models import Document
from core.utils.treebeard import _is_tree_path_collision, create_tree_node_with_retry
pytestmark = pytest.mark.django_db
@pytest.mark.parametrize(
"exc",
[
DjangoValidationError({"path": "not unique"}),
IntegrityError("impress_document_path_key"),
],
)
def test_utils_create_tree_node_with_retry_exceed_max_attempts(settings, exc):
"""Test exceeding the max attempts should reraise the exception."""
settings.TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS = 2
create_fn = mock.MagicMock()
create_fn.side_effect = exc
with (
pytest.raises(exc.__class__),
mock.patch(
"core.utils.treebeard._is_tree_path_collision"
) as mock__is_tree_path_collision,
):
mock__is_tree_path_collision.side_effect = _is_tree_path_collision
create_tree_node_with_retry(create_fn)
mock__is_tree_path_collision.assert_called()
assert mock__is_tree_path_collision.call_count == 2
assert create_fn.call_count == 2
@pytest.mark.parametrize(
"exc",
[
DjangoValidationError({"foo": "bar"}),
IntegrityError("not handled"),
],
)
def test_utils_create_tree_node_with_retry_exceed_exception_not_handled(settings, exc):
"""Test with an exception not handled should return reraise it immediately."""
settings.TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS = 2
create_fn = mock.MagicMock()
create_fn.side_effect = exc
with (
pytest.raises(exc.__class__),
mock.patch(
"core.utils.treebeard._is_tree_path_collision"
) as mock__is_tree_path_collision,
):
mock__is_tree_path_collision.side_effect = _is_tree_path_collision
create_tree_node_with_retry(create_fn)
mock__is_tree_path_collision.assert_called()
assert mock__is_tree_path_collision.call_count == 1
assert create_fn.call_count == 1
def test_utils_create_tree_node_with_retry_success():
"""Test executing successfully the create_fn callback."""
user = UserFactory()
document = create_tree_node_with_retry(
lambda: Document.add_root(
creator=user,
title="success",
)
)
assert isinstance(document, Document)
assert document.title == "success"
assert document.path is not None

View File

@@ -2,7 +2,7 @@
Unit tests for the filter_root_paths utility function.
"""
from core.utils import filter_descendants
from core.utils.paths import filter_descendants
def test_utils_filter_descendants_success():

View File

@@ -4,7 +4,8 @@ from django.utils import timezone
import pytest
from core import factories, utils
from core import factories
from core.utils.users import users_sharing_documents_with
pytestmark = pytest.mark.django_db
@@ -54,7 +55,7 @@ def test_utils_users_sharing_documents_with():
doc_3_pierre_2.created_at = yesterday
doc_3_pierre_2.save()
shared_map = utils.users_sharing_documents_with(user)
shared_map = users_sharing_documents_with(user)
assert shared_map == {
pierre_1.id: last_week,

View File

@@ -1,170 +0,0 @@
"""Utils for the core app."""
import base64
import logging
import re
import time
from collections import defaultdict
from django.core.cache import cache
from django.db import models as db
from django.db.models import Subquery
import pycrdt
from bs4 import BeautifulSoup
from core import enums, models
logger = logging.getLogger(__name__)
def get_value_by_pattern(data, pattern):
"""
Get all values from keys matching a regex pattern in a dictionary.
Args:
data (dict): Source dictionary to search
pattern (str): Regex pattern to match against keys
Returns:
list: List of values for all matching keys, empty list if no matches
Example:
>>> get_value_by_pattern({"title.fr": "Bonjour", "id": 1}, r"^title\\.")
["Bonjour"]
>>> get_value_by_pattern({"title.fr": "Bonjour", "title.en": "Hello"}, r"^title\\.")
["Bonjour", "Hello"]
"""
regex = re.compile(pattern)
return [value for key, value in data.items() if regex.match(key)]
def get_ancestor_to_descendants_map(paths, steplen):
"""
Given a list of document paths, return a mapping of ancestor_path -> set of descendant_paths.
Each path is assumed to use materialized path format with fixed-length segments.
Args:
paths (list of str): List of full document paths.
steplen (int): Length of each path segment.
Returns:
dict[str, set[str]]: Mapping from ancestor path to its descendant paths (including itself).
"""
ancestor_map = defaultdict(set)
for path in paths:
for i in range(steplen, len(path) + 1, steplen):
ancestor = path[:i]
ancestor_map[ancestor].add(path)
return ancestor_map
def filter_descendants(paths, root_paths, skip_sorting=False):
"""
Filters paths to keep only those that are descendants of any path in root_paths.
A path is considered a descendant of a root path if it starts with the root path.
If `skip_sorting` is not set to True, the function will sort both lists before
processing because both `paths` and `root_paths` need to be in lexicographic order
before going through the algorithm.
Args:
paths (iterable of str): List of paths to be filtered.
root_paths (iterable of str): List of paths to check as potential prefixes.
skip_sorting (bool): If True, assumes both `paths` and `root_paths` are already sorted.
Returns:
list of str: A list of sorted paths that are descendants of any path in `root_paths`.
"""
results = []
i = 0
n = len(root_paths)
if not skip_sorting:
paths.sort()
root_paths.sort()
for path in paths:
# Try to find a matching prefix in the sorted accessible paths
while i < n:
if path.startswith(root_paths[i]):
results.append(path)
break
if root_paths[i] < path:
i += 1
else:
# If paths[i] > path, no need to keep searching
break
return results
def base64_yjs_to_xml(base64_string):
"""Extract xml from base64 yjs document."""
decoded_bytes = base64.b64decode(base64_string)
# uint8_array = bytearray(decoded_bytes)
doc = pycrdt.Doc()
doc.apply_update(decoded_bytes)
return str(doc.get("document-store", type=pycrdt.XmlFragment))
def base64_yjs_to_text(base64_string):
"""Extract text from base64 yjs document."""
blocknote_structure = base64_yjs_to_xml(base64_string)
soup = BeautifulSoup(blocknote_structure, "lxml-xml")
return soup.get_text(separator=" ", strip=True)
def extract_attachments(content):
"""Helper method to extract media paths from a document's content."""
if not content:
return []
xml_content = base64_yjs_to_xml(content)
return re.findall(enums.MEDIA_STORAGE_URL_EXTRACT, xml_content)
def get_users_sharing_documents_with_cache_key(user):
"""Generate a unique cache key for each user."""
return f"users_sharing_documents_with_{user.id}"
def users_sharing_documents_with(user):
"""
Returns a map of users sharing documents with the given user,
sorted by last shared date.
"""
start_time = time.time()
cache_key = get_users_sharing_documents_with_cache_key(user)
cached_result = cache.get(cache_key)
if cached_result is not None:
elapsed = time.time() - start_time
logger.info(
"users_sharing_documents_with cache hit for user %s (took %.3fs)",
user.id,
elapsed,
)
return cached_result
user_docs_qs = models.DocumentAccess.objects.filter(user=user).values_list(
"document_id", flat=True
)
shared_qs = (
models.DocumentAccess.objects.filter(document_id__in=Subquery(user_docs_qs))
.exclude(user=user)
.values("user")
.annotate(last_shared=db.Max("created_at"))
)
result = {item["user"]: item["last_shared"] for item in shared_qs}
cache.set(cache_key, result, 86400) # Cache for 1 day
elapsed = time.time() - start_time
logger.info(
"users_sharing_documents_with cache miss for user %s (took %.3fs)",
user.id,
elapsed,
)
return result

View File

@@ -0,0 +1 @@
"""Core utilities package."""

View File

@@ -0,0 +1,24 @@
"""Dictionary utility functions."""
import re
def get_value_by_pattern(data, pattern):
"""
Get all values from keys matching a regex pattern in a dictionary.
Args:
data (dict): Source dictionary to search
pattern (str): Regex pattern to match against keys
Returns:
list: List of values for all matching keys, empty list if no matches
Example:
>>> get_value_by_pattern({"title.fr": "Bonjour", "id": 1}, r"^title\\.")
["Bonjour"]
>>> get_value_by_pattern({"title.fr": "Bonjour", "title.en": "Hello"}, r"^title\\.")
["Bonjour", "Hello"]
"""
regex = re.compile(pattern)
return [value for key, value in data.items() if regex.match(key)]

View File

@@ -0,0 +1,63 @@
"""Path and tree structure utilities."""
from collections import defaultdict
def get_ancestor_to_descendants_map(paths, steplen):
"""
Given a list of document paths, return a mapping of ancestor_path -> set of descendant_paths.
Each path is assumed to use materialized path format with fixed-length segments.
Args:
paths (list of str): List of full document paths.
steplen (int): Length of each path segment.
Returns:
dict[str, set[str]]: Mapping from ancestor path to its descendant paths (including itself).
"""
ancestor_map = defaultdict(set)
for path in paths:
for i in range(steplen, len(path) + 1, steplen):
ancestor = path[:i]
ancestor_map[ancestor].add(path)
return ancestor_map
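For a quick feel of the mapping, the helper above restated standalone (same logic, no project imports) with a worked call:

```python
from collections import defaultdict


def get_ancestor_to_descendants_map(paths, steplen):
    # Each fixed-length prefix of a materialized path is one of its
    # ancestors (the path itself included).
    ancestor_map = defaultdict(set)
    for path in paths:
        for i in range(steplen, len(path) + 1, steplen):
            ancestor_map[path[:i]].add(path)
    return ancestor_map


result = get_ancestor_to_descendants_map(["000100020005", "00010003"], steplen=4)
# "0001" is an ancestor of both paths; deeper prefixes each map to one path.
```

With `steplen=4`, the 12-character path contributes the prefixes `0001`, `00010002`, and itself, so shared prefixes accumulate descendants from several paths.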
def filter_descendants(paths, root_paths, skip_sorting=False):
"""
Filters paths to keep only those that are descendants of any path in root_paths.
A path is considered a descendant of a root path if it starts with the root path.
If `skip_sorting` is not set to True, the function will sort both lists before
processing because both `paths` and `root_paths` need to be in lexicographic order
before going through the algorithm.
Args:
paths (iterable of str): List of paths to be filtered.
root_paths (iterable of str): List of paths to check as potential prefixes.
skip_sorting (bool): If True, assumes both `paths` and `root_paths` are already sorted.
Returns:
list of str: A list of sorted paths that are descendants of any path in `root_paths`.
"""
results = []
i = 0
n = len(root_paths)
if not skip_sorting:
paths.sort()
root_paths.sort()
for path in paths:
# Try to find a matching prefix in the sorted accessible paths
while i < n:
if path.startswith(root_paths[i]):
results.append(path)
break
if root_paths[i] < path:
i += 1
else:
# If paths[i] > path, no need to keep searching
break
return results
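The scan above is a two-pointer walk that only works because both lists are in lexicographic order, which lets the root index advance monotonically. Restated standalone (same logic as the function above) with a worked call:

```python
def filter_descendants(paths, root_paths, skip_sorting=False):
    # Keep only paths that start with some root path; both lists must be
    # sorted so the single index `i` never needs to move backwards.
    results = []
    i = 0
    n = len(root_paths)
    if not skip_sorting:
        paths.sort()
        root_paths.sort()
    for path in paths:
        while i < n:
            if path.startswith(root_paths[i]):
                results.append(path)
                break
            if root_paths[i] < path:
                i += 1
            else:
                # root_paths[i] > path: no root can prefix this path
                break
    return results


accessible = filter_descendants(["00030001", "0001", "0002", "00010002"], ["0001"])
# → ["0001", "00010002"]: only paths under (or equal to) the root "0001"
```

Note that a root path matches itself, since every string starts with itself.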

View File

@@ -0,0 +1,62 @@
"""Treebeard path collision handling utilities."""
import logging
import time
from django.conf import settings
from django.core.exceptions import ValidationError as DjangoValidationError
from django.db import IntegrityError, transaction
logger = logging.getLogger(__name__)
def _is_tree_path_collision(exc):
"""Return True when `exc` is caused by a Document.path uniqueness conflict.
Treebeard computes the materialized path by reading the current siblings;
under concurrency two callers may compute the same value. Depending on
timing this surfaces either as:
- `django.core.exceptions.ValidationError` raised by `full_clean()` /
`validate_unique()` before the INSERT (BaseModel.save calls full_clean),
with this message `{'path': ['Document with this Path already exists.']}`
- or `IntegrityError` from the database unique index when the validate
step misses the conflict. With this message:
duplicate key value violates unique constraint "impress_document_path_key"
DETAIL: Key (path)=(0000001) already exists.
"""
if isinstance(exc, DjangoValidationError):
message_dict = getattr(exc, "message_dict", None)
if message_dict is not None:
return "path" in message_dict
return "path" in str(exc).lower()
# search in the IntegrityError exception
return "impress_document_path_key" in str(exc).lower()
def create_tree_node_with_retry(create_fn):
"""Run `create_fn` in a fresh atomic block, retrying on path collisions.
The Document.path field carries a unique constraint, which is the source of
truth that prevents duplicate paths. On collision we let the failed
transaction roll back, and call `create_fn` again so treebeard recomputes
the path from the latest state.
"""
max_attempts = settings.TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS
for attempt in range(max_attempts):
try:
with transaction.atomic():
return create_fn()
except (IntegrityError, DjangoValidationError) as exc:
if not _is_tree_path_collision(exc) or attempt == max_attempts - 1:
raise
logger.info(
"tree path collision on attempt %d/%d, retrying",
attempt + 1,
max_attempts,
)
time.sleep(attempt * 0.1)
raise RuntimeError("create_tree_node_with_retry exited without result")
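The retry loop above leans on Django's transaction machinery; a Django-free sketch of the same optimistic pattern, with a stubbed `create_fn` that collides twice before succeeding (the exception class and function names here are illustrative, not project APIs):

```python
import time


class PathCollision(Exception):
    """Stand-in for the path-uniqueness IntegrityError/ValidationError."""


def create_with_retry(create_fn, max_attempts=10):
    # Retry only on collisions; any other error, or the final attempt,
    # re-raises to the caller.
    for attempt in range(max_attempts):
        try:
            return create_fn()
        except PathCollision:
            if attempt == max_attempts - 1:
                raise
            time.sleep(attempt * 0.1)  # brief back-off before recomputing
    raise RuntimeError("create_with_retry exited without result")


attempts = {"count": 0}


def flaky_create():
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise PathCollision("duplicate path")
    return "node"


node = create_with_retry(flaky_create)  # succeeds on the third attempt
```

The unique constraint stays the source of truth; the retry merely re-runs the path computation against fresh state instead of serializing writers with a table lock.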

View File

@@ -0,0 +1,55 @@
"""User sharing cache utilities."""
import logging
import time
from django.core.cache import cache
from django.db import models as db
from django.db.models import Subquery
from core import models
logger = logging.getLogger(__name__)
def get_users_sharing_documents_with_cache_key(user):
"""Generate a unique cache key for each user."""
return f"users_sharing_documents_with_{user.id}"
def users_sharing_documents_with(user):
"""
Returns a map of users sharing documents with the given user,
sorted by last shared date.
"""
start_time = time.time()
cache_key = get_users_sharing_documents_with_cache_key(user)
cached_result = cache.get(cache_key)
if cached_result is not None:
elapsed = time.time() - start_time
logger.info(
"users_sharing_documents_with cache hit for user %s (took %.3fs)",
user.id,
elapsed,
)
return cached_result
user_docs_qs = models.DocumentAccess.objects.filter(user=user).values_list(
"document_id", flat=True
)
shared_qs = (
models.DocumentAccess.objects.filter(document_id__in=Subquery(user_docs_qs))
.exclude(user=user)
.values("user")
.annotate(last_shared=db.Max("created_at"))
)
result = {item["user"]: item["last_shared"] for item in shared_qs}
cache.set(cache_key, result, 86400) # Cache for 1 day
elapsed = time.time() - start_time
logger.info(
"users_sharing_documents_with cache miss for user %s (took %.3fs)",
user.id,
elapsed,
)
return result
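The function above follows a plain cache-aside pattern: read the cache, fall back to the aggregation query on a miss, and write the result back with a TTL. A framework-free sketch with a dict standing in for Django's cache and a stub for the query (no TTL handling, names illustrative):

```python
_cache = {}  # stand-in for django.core.cache.cache


def users_sharing_documents_with(user_id, query_fn):
    """Cache-aside: return the cached map if present, else compute and store it."""
    cache_key = f"users_sharing_documents_with_{user_id}"
    cached = _cache.get(cache_key)
    if cached is not None:
        return cached
    result = query_fn(user_id)  # stands in for the DocumentAccess aggregation
    _cache[cache_key] = result
    return result


calls = []


def fake_query(user_id):
    calls.append(user_id)
    return {42: "2025-02-10"}


first = users_sharing_documents_with(1, fake_query)
second = users_sharing_documents_with(1, fake_query)
# the second call is served from the cache, so fake_query ran only once
```

The invalidation side (deleting the key when a `DocumentAccess` is created or removed) is what the signal handlers and the cache-invalidation tests above exercise.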

View File

@@ -0,0 +1,36 @@
"""Yjs document conversion utilities."""
import base64
import re
import pycrdt
from bs4 import BeautifulSoup
from core import enums
def base64_yjs_to_xml(base64_string):
"""Extract xml from base64 yjs document."""
decoded_bytes = base64.b64decode(base64_string)
doc = pycrdt.Doc()
doc.apply_update(decoded_bytes)
return str(doc.get("document-store", type=pycrdt.XmlFragment))
def base64_yjs_to_text(base64_string):
"""Extract text from base64 yjs document."""
blocknote_structure = base64_yjs_to_xml(base64_string)
soup = BeautifulSoup(blocknote_structure, "lxml-xml")
return soup.get_text(separator=" ", strip=True)
def extract_attachments(content):
"""Helper method to extract media paths from a document's content."""
if not content:
return []
xml_content = base64_yjs_to_xml(content)
return re.findall(enums.MEDIA_STORAGE_URL_EXTRACT, xml_content)

View File

@@ -507,6 +507,11 @@ class Base(Configuration):
environ_name="COLLABORATION_WS_NOT_CONNECTED_READY_ONLY",
environ_prefix=None,
)
COLLABORATION_WS_INACTIVITY_TIMEOUT = values.IntegerValue(
None,
environ_name="COLLABORATION_WS_INACTIVITY_TIMEOUT",
environ_prefix=None,
)
# Frontend
FRONTEND_THEME = values.Value(
@@ -1081,6 +1086,12 @@ class Base(Configuration):
60 * 60 * 24, environ_name="CONTENT_METADATA_CACHE_TIMEOUT", environ_prefix=None
)
TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS = values.IntegerValue(
10,
environ_name="TREEBEARD_PATH_COMPUTE_RETRY_MAX_ATTEMPTS",
environ_prefix=None,
)
# pylint: disable=invalid-name
@property
def ENVIRONMENT(self):

View File

@@ -2,8 +2,8 @@
# impress package
#
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
requires = ["uv_build>=0.11.9,<0.12"]
build-backend = "uv_build"
[project]
name = "impress"
@@ -21,9 +21,8 @@ classifiers = [
]
description = "Docs is a collaborative text editor designed to address common challenges in knowledge building and sharing."
keywords = ["Django", "Contacts", "Templates", "RBAC"]
license = { file = "LICENSE" }
readme = "README.md"
requires-python = ">=3.12"
license = "MIT"
requires-python = "~=3.13.0"
dependencies = [
"beautifulsoup4==4.14.3",
"boto3==1.42.93",
@@ -97,12 +96,13 @@ dev = [
"types-requests==2.33.0.20260408",
]
[tool.setuptools]
packages = { find = { where = ["."], exclude = ["tests"] } }
zip-safe = true
[tool.distutils.bdist_wheel]
universal = true
[tool.uv.build-backend]
module-root = ""
source-exclude = [
"**/tests/**",
"**/test_*.py",
"**/tests.py",
]
[tool.ruff]
exclude = [

View File

@@ -1,7 +0,0 @@
#!/usr/bin/env python
"""Setup file for the impress module. All configuration stands in the setup.cfg file."""
# coding: utf-8
from setuptools import setup
setup()

src/backend/uv.lock (generated new file, 2440 lines)

File diff suppressed because it is too large

View File

@@ -2,7 +2,6 @@ PORT=3000
BASE_URL=http://localhost:3000
BASE_API_URL=http://localhost:8071/api/v1.0
COLLABORATION_WS_URL=ws://localhost:4444/collaboration/ws/
COLLABORATION_WS_NOT_CONNECTED_READY_ONLY=true
MEDIA_BASE_URL=http://localhost:8083
CUSTOM_SIGN_IN=false
IS_INSTANCE=false

View File

@@ -2,7 +2,6 @@ PORT=3000
BASE_URL=http://localhost:3000
BASE_API_URL=http://localhost:8071/api/v1.0
COLLABORATION_WS_URL=ws://localhost:4444/collaboration/ws/
COLLABORATION_WS_NOT_CONNECTED_READY_ONLY=true
MEDIA_BASE_URL=http://localhost:8083
IS_INSTANCE=false
CUSTOM_SIGN_IN=false

File diff suppressed because one or more lines are too long

View File

@@ -192,10 +192,10 @@ endobj
(react-pdf)
endobj
55 0 obj
(D:20260403132357Z)
(D:20260505110445Z)
endobj
56 0 obj
(chromium-8651-0-doc-export-override-content)
(chromium-4903-0-doc-export-override-content)
endobj
52 0 obj
<<
@@ -216,7 +216,7 @@ endobj
58 0 obj
<<
/Type /FontDescriptor
/FontName /VIBRRZ+Inter18pt-Regular
/FontName /HRJUFI+Inter18pt-Regular
/Flags 4
/FontBBox [-742.1875 -323.242187 2579.589844 1109.375]
/ItalicAngle 0
@@ -232,7 +232,7 @@ endobj
<<
/Type /Font
/Subtype /CIDFontType2
/BaseFont /VIBRRZ+Inter18pt-Regular
/BaseFont /HRJUFI+Inter18pt-Regular
/CIDSystemInfo <<
/Registry (Adobe)
/Ordering (Identity)
@@ -247,7 +247,7 @@ endobj
<<
/Type /Font
/Subtype /Type0
/BaseFont /VIBRRZ+Inter18pt-Regular
/BaseFont /HRJUFI+Inter18pt-Regular
/Encoding /Identity-H
/DescendantFonts [59 0 R]
/ToUnicode 60 0 R
@@ -256,7 +256,7 @@ endobj
62 0 obj
<<
/Type /FontDescriptor
/FontName /TDKMKH+Inter18pt-Bold
/FontName /XKLDZR+Inter18pt-Bold
/Flags 4
/FontBBox [-790.527344 -334.472656 2580.566406 1114.746094]
/ItalicAngle 0
@@ -272,7 +272,7 @@ endobj
<<
/Type /Font
/Subtype /CIDFontType2
/BaseFont /TDKMKH+Inter18pt-Bold
/BaseFont /XKLDZR+Inter18pt-Bold
/CIDSystemInfo <<
/Registry (Adobe)
/Ordering (Identity)
@@ -287,7 +287,7 @@ endobj
<<
/Type /Font
/Subtype /Type0
/BaseFont /TDKMKH+Inter18pt-Bold
/BaseFont /XKLDZR+Inter18pt-Bold
/Encoding /Identity-H
/DescendantFonts [63 0 R]
/ToUnicode 64 0 R
@@ -296,7 +296,7 @@ endobj
66 0 obj
<<
/Type /FontDescriptor
/FontName /JYBWBW+Inter18pt-Italic
/FontName /QHBJWW+Inter18pt-Italic
/Flags 68
/FontBBox [-747.558594 -323.242187 2595.703125 1109.375]
/ItalicAngle -9.398804
@@ -312,7 +312,7 @@ endobj
<<
/Type /Font
/Subtype /CIDFontType2
/BaseFont /JYBWBW+Inter18pt-Italic
/BaseFont /QHBJWW+Inter18pt-Italic
/CIDSystemInfo <<
/Registry (Adobe)
/Ordering (Identity)
@@ -327,7 +327,7 @@ endobj
<<
/Type /Font
/Subtype /Type0
/BaseFont /JYBWBW+Inter18pt-Italic
/BaseFont /QHBJWW+Inter18pt-Italic
/Encoding /Identity-H
/DescendantFonts [67 0 R]
/ToUnicode 68 0 R
@@ -336,7 +336,7 @@ endobj
70 0 obj
<<
/Type /FontDescriptor
/FontName /DLRHPN+GeistMono-Regular
/FontName /NBHLIK+GeistMono-Regular
/Flags 5
/FontBBox [-1738 -247 654 1012]
/ItalicAngle 0
@@ -352,7 +352,7 @@ endobj
<<
/Type /Font
/Subtype /CIDFontType2
/BaseFont /DLRHPN+GeistMono-Regular
/BaseFont /NBHLIK+GeistMono-Regular
/CIDSystemInfo <<
/Registry (Adobe)
/Ordering (Identity)
@@ -367,7 +367,7 @@ endobj
<<
/Type /Font
/Subtype /Type0
/BaseFont /DLRHPN+GeistMono-Regular
/BaseFont /NBHLIK+GeistMono-Regular
/Encoding /Identity-H
/DescendantFonts [71 0 R]
/ToUnicode 72 0 R
@@ -376,7 +376,7 @@ endobj
74 0 obj
<<
/Type /FontDescriptor
/FontName /LHWXUO+Inter18pt-BoldItalic
/FontName /VMRKYJ+Inter18pt-BoldItalic
/Flags 68
/FontBBox [-795.898437 -334.472656 2596.191406 1114.746094]
/ItalicAngle -9.398804
@@ -392,7 +392,7 @@ endobj
<<
/Type /Font
/Subtype /CIDFontType2
/BaseFont /LHWXUO+Inter18pt-BoldItalic
/BaseFont /VMRKYJ+Inter18pt-BoldItalic
/CIDSystemInfo <<
/Registry (Adobe)
/Ordering (Identity)
@@ -407,7 +407,7 @@ endobj
<<
/Type /Font
/Subtype /Type0
/BaseFont /LHWXUO+Inter18pt-BoldItalic
/BaseFont /VMRKYJ+Inter18pt-BoldItalic
/Encoding /Identity-H
/DescendantFonts [75 0 R]
/ToUnicode 76 0 R
@@ -713,30 +713,21 @@ endobj
/Filter /FlateDecode
>>
stream
[FlateDecode stream data omitted: compressed binary content, not representable as text]
endstream
endobj
77 0 obj
@@ -1405,7 +1396,7 @@ trailer
/Size 87
/Root 3 0 R
/Info 52 0 R
/ID [<7800bd1e70bdb9114e48fc6d480ec696> <7800bd1e70bdb9114e48fc6d480ec696>]
/ID [<f859d5aa137a1b00be187261c0c2b77d> <f859d5aa137a1b00be187261c0c2b77d>]
>>
startxref
101711

View File

@@ -0,0 +1,352 @@
import path from 'path';
import { expect, test } from '@playwright/test';
import { createDoc, overrideConfig, verifyDocName } from './utils-common';
import { writeInEditor } from './utils-editor';
import { connectOtherUserToDoc, updateShareLink } from './utils-share';
import { createRootSubPage } from './utils-sub-pages';
test.beforeEach(async ({ page }) => {
await page.goto('/');
});
test.describe('Doc Collaboration', () => {
/**
* We check:
* - connection to the collaborative server
* - the backend's signal to the collaborative server (the connection should close)
* - reconnection to the collaborative server
*/
test('checks the connection with collaborative server', async ({ page }) => {
let webSocketPromise = page.waitForEvent('websocket', (webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
});
await page
.getByRole('button', {
name: 'New doc',
})
.click();
let webSocket = await webSocketPromise;
expect(webSocket.url()).toContain(
`${process.env.COLLABORATION_WS_URL}?room=`,
);
// Is connected
let framesentPromise = webSocket.waitForEvent('framesent');
await writeInEditor({ page, text: 'Hello World' });
let framesent = await framesentPromise;
expect(framesent.payload).not.toBeNull();
await page.getByRole('button', { name: 'Share' }).click();
const selectVisibility = page.getByTestId('doc-visibility');
// When the visibility is changed, the ws should close the connection (backend signal)
const wsClosePromise = webSocket.waitForEvent('close');
await selectVisibility.click();
await page.getByRole('menuitemradio', { name: 'Connected' }).click();
// Assert that the previous ws connection is closed
const wsClose = await wsClosePromise;
expect(wsClose.isClosed()).toBeTruthy();
// Check the ws is connected again
webSocket = await page.waitForEvent('websocket', (webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
});
framesentPromise = webSocket.waitForEvent('framesent');
framesent = await framesentPromise;
expect(framesent.payload).not.toBeNull();
});
test('it cannot edit if viewer but see and can get resources', async ({
page,
browserName,
}) => {
const [docTitle] = await createDoc(page, 'doc-viewer', browserName, 1);
await verifyDocName(page, docTitle);
await writeInEditor({ page, text: 'Hello World' });
await page.getByRole('button', { name: 'Share' }).click();
await updateShareLink(page, 'Public', 'Reading');
// Close the modal
await page.getByRole('button', { name: 'close' }).first().click();
const { otherPage, cleanup } = await connectOtherUserToDoc({
browserName,
docUrl: page.url(),
withoutSignIn: true,
docTitle,
});
await expect(
otherPage.getByLabel('It is the card information').getByText('Reader'),
).toBeVisible();
// Cannot edit
const editor = otherPage.locator('.ProseMirror');
await expect(editor).toHaveAttribute('contenteditable', 'false');
// Owner adds an image
const fileChooserPromise = page.waitForEvent('filechooser');
await page.locator('.bn-block-outer').last().fill('/');
await page.getByText('Resizable image with caption').click();
await page.getByText('Upload image').click();
const fileChooser = await fileChooserPromise;
await fileChooser.setFiles(
path.join(__dirname, 'assets/logo-suite-numerique.png'),
);
// Owner sees the image
await expect(
page.locator('.--docs--editor-container img.bn-visual-media').first(),
).toBeVisible();
// Viewer sees the image
const viewerImg = otherPage
.locator('.--docs--editor-container img.bn-visual-media')
.first();
await expect(viewerImg).toBeVisible({
timeout: 10000,
});
// Viewer can download the image
await viewerImg.click();
const downloadPromise = otherPage.waitForEvent('download');
await otherPage.getByRole('button', { name: 'Download image' }).click();
const download = await downloadPromise;
expect(download.suggestedFilename()).toBe('logo-suite-numerique.png');
await cleanup();
});
test('it checks block editing when not connected to collab server', async ({
page,
browserName,
}) => {
test.slow();
/**
* The real collaborative server listens on port 4444, but here we want to
* simulate an unreachable collaborative server.
* So we point the client at a port where nothing is listening: the
* connection to the collaborative server will always fail.
*/
await overrideConfig(page, {
COLLABORATION_WS_URL: 'ws://localhost:5555/collaboration/ws/',
COLLABORATION_WS_NOT_CONNECTED_READY_ONLY: true,
});
await page.goto('/');
const [parentTitle] = await createDoc(
page,
'editing-blocking',
browserName,
1,
);
const card = page.getByLabel('It is the card information');
await expect(
card.getByText('Others are editing. Your network prevent changes.'),
).toBeHidden();
const editor = page.locator('.ProseMirror');
await expect(editor).toHaveAttribute('contenteditable', 'true');
let responseCanEditPromise = page.waitForResponse(
(response) =>
response.url().includes(`/can-edit/`) && response.status() === 200,
);
await page.getByRole('button', { name: 'Share' }).click();
await updateShareLink(page, 'Public', 'Editing');
// Close the modal
await page.getByRole('button', { name: 'close' }).first().click();
const urlParentDoc = page.url();
const { name: childTitle } = await createRootSubPage(
page,
browserName,
'editing-blocking - child',
);
let responseCanEdit = await responseCanEditPromise;
expect(responseCanEdit.ok()).toBeTruthy();
let jsonCanEdit = (await responseCanEdit.json()) as { can_edit: boolean };
expect(jsonCanEdit.can_edit).toBeTruthy();
const urlChildDoc = page.url();
/**
* We open another browser that will connect to the collaborative server
* and will block the current browser from editing the doc.
*/
const { otherPage, cleanup } = await connectOtherUserToDoc({
browserName,
docUrl: urlChildDoc,
docTitle: childTitle,
withoutSignIn: true,
});
const webSocketPromise = otherPage.waitForEvent(
'websocket',
(webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
},
);
await otherPage.goto(urlChildDoc);
const webSocket = await webSocketPromise;
expect(webSocket.url()).toContain(
`${process.env.COLLABORATION_WS_URL}?room=`,
);
await verifyDocName(otherPage, childTitle);
await page.reload();
responseCanEdit = await page.waitForResponse(
(response) =>
response.url().includes(`/can-edit/`) && response.status() === 200,
);
expect(responseCanEdit.ok()).toBeTruthy();
jsonCanEdit = (await responseCanEdit.json()) as { can_edit: boolean };
expect(jsonCanEdit.can_edit).toBeFalsy();
await expect(
card.getByText('Others are editing. Your network prevent changes.'),
).toBeVisible({
timeout: 10000,
});
await expect(editor).toHaveAttribute('contenteditable', 'false');
await expect(
page.getByRole('textbox', { name: 'Document title' }),
).toBeHidden();
await expect(page.getByRole('heading', { name: childTitle })).toBeVisible();
await page.goto(urlParentDoc);
await verifyDocName(page, parentTitle);
await page.getByRole('button', { name: 'Share' }).click();
await page.getByTestId('doc-access-mode').click();
await page.getByRole('menuitemradio', { name: 'Reading' }).click();
// Close the modal
await page.getByRole('button', { name: 'close' }).first().click();
await page.goto(urlChildDoc);
await expect(editor).toHaveAttribute('contenteditable', 'true');
await expect(
page.getByRole('textbox', { name: 'Document title' }),
).toContainText(childTitle);
await expect(page.getByRole('heading', { name: childTitle })).toBeHidden();
await expect(
card.getByText('Others are editing. Your network prevent changes.'),
).toBeHidden();
await cleanup();
});
test('checks disconnection and reconnection when changing tab visibility', async ({
page,
}) => {
await overrideConfig(page, {
COLLABORATION_WS_INACTIVITY_TIMEOUT: 2, // 2 seconds for the test to be faster
});
await page.goto('/');
let webSocketPromise = page.waitForEvent('websocket', (webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
});
await page
.getByRole('button', {
name: 'New doc',
})
.click();
let webSocket = await webSocketPromise;
expect(webSocket.url()).toContain(
`${process.env.COLLABORATION_WS_URL}?room=`,
);
// Is connected
let framesentPromise = webSocket.waitForEvent('framesent');
await writeInEditor({ page, text: 'Hello World' });
let framesent = await framesentPromise;
expect(framesent.payload).not.toBeNull();
// When the visibility is changed, the ws should close the connection
const wsClosePromise = webSocket.waitForEvent('close');
// Simulate the tab being hidden
await page.evaluate(() => {
Object.defineProperty(document, 'hidden', {
value: true,
writable: true,
configurable: true,
});
document.dispatchEvent(new Event('visibilitychange'));
});
// Assert the ws connection is closed after inactivity timeout
const wsClose = await wsClosePromise;
expect(wsClose.isClosed()).toBeTruthy();
// Check the ws is connected again
webSocketPromise = page.waitForEvent('websocket', (webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
});
// Simulate the tab becoming visible again
await page.evaluate(() => {
Object.defineProperty(document, 'hidden', {
value: false,
writable: true,
configurable: true,
});
document.dispatchEvent(new Event('visibilitychange'));
});
webSocket = await webSocketPromise;
framesentPromise = webSocket.waitForEvent('framesent');
framesent = await framesentPromise;
// Assert the ws connection is working again
expect(framesent.payload).not.toBeNull();
});
});
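The last test above simulates a hidden tab and expects the websocket to close after `COLLABORATION_WS_INACTIVITY_TIMEOUT` seconds, then reconnect when the tab becomes visible again. A minimal sketch of that client-side logic, assuming injected callbacks so it can run outside a browser; the `InactivityGuard` name and its shape are hypothetical, not the app's actual implementation:

```typescript
// Hypothetical sketch of the visibility-driven inactivity disconnect.
// All names here are assumptions; only the timeout setting comes from the diff.
class InactivityGuard {
  private timer: ReturnType<typeof setTimeout> | null = null;
  connected = true;

  constructor(
    private timeoutSeconds: number,
    private disconnect: () => void,
    private reconnect: () => void,
  ) {}

  onVisibilityChange(hidden: boolean) {
    if (hidden) {
      // Schedule the disconnect; switching back quickly cancels it.
      this.timer = setTimeout(() => {
        this.connected = false;
        this.disconnect();
      }, this.timeoutSeconds * 1000);
    } else {
      if (this.timer) {
        clearTimeout(this.timer);
        this.timer = null;
      }
      if (!this.connected) {
        this.connected = true;
        this.reconnect();
      }
    }
  }
}
```

A guard like this would be driven by a `visibilitychange` listener reading `document.hidden`, which is exactly what the test fakes with `Object.defineProperty`.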

View File

@@ -3,14 +3,9 @@ import path from 'path';
import { expect, test } from '@playwright/test';
import cs from 'convert-stream';
import {
createDoc,
goToGridDoc,
overrideConfig,
verifyDocName,
} from './utils-common';
import { createDoc, goToGridDoc, verifyDocName } from './utils-common';
import { getEditor, openSuggestionMenu, writeInEditor } from './utils-editor';
import { connectOtherUserToDoc, updateShareLink } from './utils-share';
import { updateShareLink } from './utils-share';
import {
createRootSubPage,
getTreeRow,
@@ -111,63 +106,6 @@ test.describe('Doc Editor', () => {
).toBeVisible();
});
/**
* We check:
* - connection to the collaborative server
* - the backend's signal to the collaborative server (the connection should close)
* - reconnection to the collaborative server
*/
test('checks the connection with collaborative server', async ({ page }) => {
let webSocketPromise = page.waitForEvent('websocket', (webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
});
await page
.getByRole('button', {
name: 'New doc',
})
.click();
let webSocket = await webSocketPromise;
expect(webSocket.url()).toContain(
`${process.env.COLLABORATION_WS_URL}?room=`,
);
// Is connected
let framesentPromise = webSocket.waitForEvent('framesent');
await writeInEditor({ page, text: 'Hello World' });
let framesent = await framesentPromise;
expect(framesent.payload).not.toBeNull();
await page.getByRole('button', { name: 'Share' }).click();
const selectVisibility = page.getByTestId('doc-visibility');
// When the visibility is changed, the ws should close the connection (backend signal)
const wsClosePromise = webSocket.waitForEvent('close');
await selectVisibility.click();
await page.getByRole('menuitemradio', { name: 'Connected' }).click();
// Assert that the previous ws connection is closed
const wsClose = await wsClosePromise;
expect(wsClose.isClosed()).toBeTruthy();
// Check the ws is connected again
webSocket = await page.waitForEvent('websocket', (webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
});
framesentPromise = webSocket.waitForEvent('framesent');
framesent = await framesentPromise;
expect(framesent.payload).not.toBeNull();
});
test('markdown button converts from markdown to the editor syntax json', async ({
page,
browserName,
@@ -285,70 +223,6 @@ test.describe('Doc Editor', () => {
await expect(editor.getByText('Hello World Doc persisted 2')).toBeVisible();
});
test('it cannot edit if viewer but see and can get resources', async ({
page,
browserName,
}) => {
const [docTitle] = await createDoc(page, 'doc-viewer', browserName, 1);
await verifyDocName(page, docTitle);
await writeInEditor({ page, text: 'Hello World' });
await page.getByRole('button', { name: 'Share' }).click();
await updateShareLink(page, 'Public', 'Reading');
// Close the modal
await page.getByRole('button', { name: 'close' }).first().click();
const { otherPage, cleanup } = await connectOtherUserToDoc({
browserName,
docUrl: page.url(),
withoutSignIn: true,
docTitle,
});
await expect(
otherPage.getByLabel('It is the card information').getByText('Reader'),
).toBeVisible();
// Cannot edit
const editor = otherPage.locator('.ProseMirror');
await expect(editor).toHaveAttribute('contenteditable', 'false');
// Owner adds an image
const fileChooserPromise = page.waitForEvent('filechooser');
await page.locator('.bn-block-outer').last().fill('/');
await page.getByText('Resizable image with caption').click();
await page.getByText('Upload image').click();
const fileChooser = await fileChooserPromise;
await fileChooser.setFiles(
path.join(__dirname, 'assets/logo-suite-numerique.png'),
);
// Owner sees the image
await expect(
page.locator('.--docs--editor-container img.bn-visual-media').first(),
).toBeVisible();
// Viewer sees the image
const viewerImg = otherPage
.locator('.--docs--editor-container img.bn-visual-media')
.first();
await expect(viewerImg).toBeVisible({
timeout: 10000,
});
// Viewer can download the image
await viewerImg.click();
const downloadPromise = otherPage.waitForEvent('download');
await otherPage.getByRole('button', { name: 'Download image' }).click();
const download = await downloadPromise;
expect(download.suggestedFilename()).toBe('logo-suite-numerique.png');
await cleanup();
});
test('it adds an image to the doc editor', async ({ page, browserName }) => {
await createDoc(page, 'doc-image', browserName, 1);
@@ -493,151 +367,6 @@ test.describe('Doc Editor', () => {
await expect(editor.getByText('Analyzing file...')).toBeHidden();
});
if (process.env.COLLABORATION_WS_NOT_CONNECTED_READY_ONLY === 'true') {
test('it checks block editing when not connected to collab server', async ({
page,
browserName,
}) => {
test.slow();
/**
* The real collaborative server listens on port 4444, but here we want to
* simulate an unreachable collaborative server.
* So we point the client at a port where nothing is listening: the
* connection to the collaborative server will always fail.
*/
await overrideConfig(page, {
COLLABORATION_WS_URL: 'ws://localhost:5555/collaboration/ws/',
});
await page.goto('/');
const [parentTitle] = await createDoc(
page,
'editing-blocking',
browserName,
1,
);
const card = page.getByLabel('It is the card information');
await expect(
card.getByText('Others are editing. Your network prevent changes.'),
).toBeHidden();
const editor = page.locator('.ProseMirror');
await expect(editor).toHaveAttribute('contenteditable', 'true');
let responseCanEditPromise = page.waitForResponse(
(response) =>
response.url().includes(`/can-edit/`) && response.status() === 200,
);
await page.getByRole('button', { name: 'Share' }).click();
await updateShareLink(page, 'Public', 'Editing');
// Close the modal
await page.getByRole('button', { name: 'close' }).first().click();
const urlParentDoc = page.url();
const { name: childTitle } = await createRootSubPage(
page,
browserName,
'editing-blocking - child',
);
let responseCanEdit = await responseCanEditPromise;
expect(responseCanEdit.ok()).toBeTruthy();
let jsonCanEdit = (await responseCanEdit.json()) as { can_edit: boolean };
expect(jsonCanEdit.can_edit).toBeTruthy();
const urlChildDoc = page.url();
/**
* We open another browser that will connect to the collaborative server
* and will block the current browser from editing the doc.
*/
const { otherPage } = await connectOtherUserToDoc({
browserName,
docUrl: urlChildDoc,
docTitle: childTitle,
withoutSignIn: true,
});
const webSocketPromise = otherPage.waitForEvent(
'websocket',
(webSocket) => {
return webSocket
.url()
.includes(`${process.env.COLLABORATION_WS_URL}?room=`);
},
);
await otherPage.goto(urlChildDoc);
const webSocket = await webSocketPromise;
expect(webSocket.url()).toContain(
`${process.env.COLLABORATION_WS_URL}?room=`,
);
await verifyDocName(otherPage, childTitle);
await page.reload();
responseCanEdit = await page.waitForResponse(
(response) =>
response.url().includes(`/can-edit/`) && response.status() === 200,
);
expect(responseCanEdit.ok()).toBeTruthy();
jsonCanEdit = (await responseCanEdit.json()) as { can_edit: boolean };
expect(jsonCanEdit.can_edit).toBeFalsy();
await expect(
card.getByText('Others are editing. Your network prevent changes.'),
).toBeVisible({
timeout: 10000,
});
await expect(editor).toHaveAttribute('contenteditable', 'false');
await expect(
page.getByRole('textbox', { name: 'Document title' }),
).toBeHidden();
await expect(
page.getByRole('heading', { name: childTitle }),
).toBeVisible();
await page.goto(urlParentDoc);
await verifyDocName(page, parentTitle);
await page.getByRole('button', { name: 'Share' }).click();
await page.getByTestId('doc-access-mode').click();
await page.getByRole('menuitemradio', { name: 'Reading' }).click();
// Close the modal
await page.getByRole('button', { name: 'close' }).first().click();
await page.goto(urlChildDoc);
await expect(editor).toHaveAttribute('contenteditable', 'true');
await expect(
page.getByRole('textbox', { name: 'Document title' }),
).toContainText(childTitle);
await expect(
page.getByRole('heading', { name: childTitle }),
).toBeHidden();
await expect(
card.getByText('Others are editing. Your network prevent changes.'),
).toBeHidden();
});
}
test('it checks if callout custom block', async ({ page, browserName }) => {
await createDoc(page, 'doc-toolbar', browserName, 1);

View File

@@ -1,6 +1,11 @@
import { expect, test } from '@playwright/test';
import { createDoc, getCurrentConfig, verifyDocName } from './utils-common';
import {
createDoc,
getCurrentConfig,
overrideConfig,
verifyDocName,
} from './utils-common';
import { writeInEditor } from './utils-editor';
import { SignIn, expectLoginPage } from './utils-signin';
import { createRootSubPage } from './utils-sub-pages';
@@ -145,6 +150,29 @@ test.describe('Doc Routing', () => {
);
await expect(page).toHaveTitle(/401 Unauthorized - Docs/);
});
test('checks redirect if unsync version', async ({ page }) => {
await overrideConfig(page, {
RELEASE_VERSION: '0.0.0',
});
let counterReload = 0;
await page.route(/.*\/users\/me\/$/, async (route) => {
counterReload += 1;
await route.continue();
});
await page.waitForTimeout(1000);
// The sessionStorage guard should be set to the mismatched backend version.
const reloadVersion = await page.evaluate(() =>
sessionStorage.getItem('reload-version'),
);
expect(reloadVersion).toBe('0.0.0');
// The page should have reloaded once
expect(counterReload).toBe(2);
});
});
test.describe('Doc Routing: Not logged', () => {

View File

@@ -4,6 +4,7 @@ import path from 'path';
import { Locator, Page, TestInfo, expect } from '@playwright/test';
import theme_customization from '../../../../../backend/impress/configuration/theme/default.json';
import { version as packageJsonVersion } from '../../package.json';
export type BrowserName = 'chromium' | 'firefox' | 'webkit';
export const BROWSERS: BrowserName[] = ['chromium', 'webkit', 'firefox'];
@@ -18,6 +19,7 @@ export const CONFIG = {
AI_FEATURE_LEGACY_ENABLED: true,
API_USERS_SEARCH_QUERY_MIN_LENGTH: 3,
CRISP_WEBSITE_ID: null,
COLLABORATION_WS_INACTIVITY_TIMEOUT: 15,
COLLABORATION_WS_URL: process.env.COLLABORATION_WS_URL,
COLLABORATION_WS_NOT_CONNECTED_READY_ONLY: true,
CONVERSION_UPLOAD_ENABLED: true,
@@ -39,6 +41,7 @@ export const CONFIG = {
],
LANGUAGE_CODE: 'en-us',
POSTHOG_KEY: {},
RELEASE_VERSION: packageJsonVersion,
SENTRY_DSN: null,
TRASHBIN_CUTOFF_DAYS: 30,
theme_customization,

View File

@@ -2,6 +2,8 @@ const crypto = require('crypto');
const { InjectManifest } = require('workbox-webpack-plugin');
const { version } = require('./package.json');
const buildId = crypto.randomBytes(256).toString('hex').slice(0, 8);
/** @type {import('next').NextConfig} */
@@ -25,6 +27,7 @@ const nextConfig = {
generateBuildId: () => buildId,
env: {
NEXT_PUBLIC_BUILD_ID: buildId,
NEXT_PUBLIC_APP_VERSION: version,
},
/**
* In dev mode, Next.js doesn't use Webpack, but Turbopack.

View File

@@ -85,6 +85,32 @@ export const ConfigProvider = ({ children }: PropsWithChildren) => {
});
}, [conf?.CRISP_WEBSITE_ID]);
useEffect(() => {
const frontendVersion = process.env.NEXT_PUBLIC_APP_VERSION;
if (
!conf?.RELEASE_VERSION ||
!frontendVersion ||
conf.RELEASE_VERSION === frontendVersion
) {
return;
}
// Avoid infinite reload loops: only reload once per backend version
const RELOAD_VERSION_KEY = 'reload-version';
try {
const reloadedForVersion = sessionStorage.getItem(RELOAD_VERSION_KEY);
if (reloadedForVersion === conf.RELEASE_VERSION) {
return;
}
sessionStorage.setItem(RELOAD_VERSION_KEY, conf.RELEASE_VERSION);
window.location.reload();
} catch {
console.warn('Failed to access sessionStorage for version reload logic');
}
}, [conf?.RELEASE_VERSION]);
if (!conf) {
return (
<Box $height="100vh" $width="100vw" $align="center" $justify="center">
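The reload effect added to `ConfigProvider` in this hunk boils down to a pure decision: reload only when the backend and frontend versions differ and we have not already reloaded for this backend version. A minimal sketch of that guard, assuming an injected storage so the logic runs outside a browser; `shouldReload` and `VersionStore` are hypothetical names, not part of the codebase:

```typescript
// Hypothetical standalone sketch of the reload-once-per-version guard used
// in the ConfigProvider effect. The storage interface is an assumption made
// so the logic is testable without window.sessionStorage.
type VersionStore = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

const RELOAD_VERSION_KEY = 'reload-version';

function shouldReload(
  backendVersion: string | undefined,
  frontendVersion: string | undefined,
  storage: VersionStore,
): boolean {
  // Versions missing or already in sync: nothing to do.
  if (!backendVersion || !frontendVersion || backendVersion === frontendVersion) {
    return false;
  }
  // Already reloaded once for this backend version: avoid an infinite
  // reload loop when the mismatch persists (e.g. a stale deployment).
  if (storage.getItem(RELOAD_VERSION_KEY) === backendVersion) {
    return false;
  }
  storage.setItem(RELOAD_VERSION_KEY, backendVersion);
  return true;
}
```

This is also the behavior the routing test exercises: one reload on mismatch, then the `reload-version` key pins the guard to the backend version.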

View File

@@ -42,6 +42,7 @@ export interface ConfigResponse {
API_USERS_SEARCH_QUERY_MIN_LENGTH?: number;
COLLABORATION_WS_URL?: string;
COLLABORATION_WS_NOT_CONNECTED_READY_ONLY?: boolean;
COLLABORATION_WS_INACTIVITY_TIMEOUT?: number | null;
CONVERSION_FILE_EXTENSIONS_ALLOWED: string[];
CONVERSION_FILE_MAX_SIZE: number;
CONVERSION_UPLOAD_ENABLED?: boolean;
@@ -56,6 +57,7 @@ export interface ConfigResponse {
LANGUAGE_CODE: string;
MEDIA_BASE_URL?: string;
POSTHOG_KEY?: PostHogConf;
RELEASE_VERSION: string;
SENTRY_DSN?: string;
TRASHBIN_CUTOFF_DAYS?: number;
theme_customization?: ThemeCustomization;
@@ -93,13 +95,13 @@ export const KEY_CONFIG = 'config';
export function useConfig() {
const cachedData = getCachedTranslation();
const oneHour = 1000 * 60 * 60;
const staleTime = 1000 * 60 * 5;
return useQuery<ConfigResponse, APIError, ConfigResponse>({
queryKey: [KEY_CONFIG],
queryFn: () => getConfig(),
initialData: cachedData,
staleTime: oneHour,
initialDataUpdatedAt: Date.now() - oneHour, // Force initial data to be considered stale
staleTime,
initialDataUpdatedAt: Date.now() - staleTime, // Force initial data to be considered stale
});
}

View File

@@ -0,0 +1,33 @@
import { sanitizeColor } from '../utils';
const HEX_COLOR_RE = /^#[0-9a-fA-F]{6}$/;
describe('sanitizeColor', () => {
it('accepts valid 6-digit hex colors', () => {
expect(sanitizeColor('#1a2b3c')).toBe('#1a2b3c');
expect(sanitizeColor('#AABBCC')).toBe('#AABBCC');
expect(sanitizeColor('#000000')).toBe('#000000');
expect(sanitizeColor('#ffffff')).toBe('#ffffff');
});
it('rejects 3-digit hex colors and returns a valid random hex color', () => {
expect(sanitizeColor('#abc')).toMatch(HEX_COLOR_RE);
});
it('rejects named colors and returns a valid random hex color', () => {
expect(sanitizeColor('red')).toMatch(HEX_COLOR_RE);
expect(sanitizeColor('blue')).toMatch(HEX_COLOR_RE);
});
it('rejects CSS injection attempts and returns a valid random hex color', () => {
expect(sanitizeColor('red; behavior: expression(alert(1))')).toMatch(
HEX_COLOR_RE,
);
expect(sanitizeColor('#fff; color: red')).toMatch(HEX_COLOR_RE);
expect(sanitizeColor('javascript:alert(1)')).toMatch(HEX_COLOR_RE);
});
it('rejects empty string and returns a valid random hex color', () => {
expect(sanitizeColor('')).toMatch(HEX_COLOR_RE);
});
});
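A `sanitizeColor` consistent with the tests above could look like the following. This is a sketch inferred only from the test expectations, not the actual implementation in `../utils`; the random fallback mirrors what `randomColor` presumably does, which is an assumption:

```typescript
// Hypothetical sanitizeColor: pass through strict 6-digit hex colors,
// otherwise fall back to a random valid hex color.
const SAFE_HEX_COLOR_RE = /^#[0-9a-fA-F]{6}$/;

function sanitizeColor(color: string): string {
  if (SAFE_HEX_COLOR_RE.test(color)) {
    return color;
  }
  // Named colors, 3-digit hex, and CSS injection payloads all end up here.
  const random = Math.floor(Math.random() * 0x1000000)
    .toString(16)
    .padStart(6, '0');
  return `#${random}`;
}
```

The strict whitelist is what makes the editor change below safe: any value interpolated into `background-color: ${safeColor}` is guaranteed to be a bare hex color, so payloads like `red; behavior: expression(alert(1))` cannot smuggle extra CSS declarations into the cursor style.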

View File

@@ -36,7 +36,7 @@ import {
import { useEditorStore } from '../stores';
import { DocsEditorStyle } from '../styles';
import { DocsBlockNoteEditor } from '../types';
import { randomColor } from '../utils';
import { randomColor, sanitizeColor } from '../utils';
import BlockNoteAI from './AI';
import { BlockNoteSuggestionMenu } from './BlockNoteSuggestionMenu';
@@ -152,12 +152,13 @@ export const BlockNoteEditor = ({ doc, provider }: BlockNoteEditorProps) => {
*/
renderCursor: (user: { color: string; name: string }) => {
const cursorElement = document.createElement('span');
const safeColor = sanitizeColor(user.color);
cursorElement.classList.add('collaboration-cursor-custom__base');
const caretElement = document.createElement('span');
caretElement.classList.add('collaboration-cursor-custom__caret');
caretElement.setAttribute('spellcheck', `false`);
caretElement.setAttribute('style', `background-color: ${user.color}`);
caretElement.setAttribute('style', `background-color: ${safeColor}`);
if (showCursorLabels === 'always') {
cursorElement.setAttribute('data-active', '');
@@ -169,7 +170,7 @@ export const BlockNoteEditor = ({ doc, provider }: BlockNoteEditorProps) => {
labelElement.setAttribute('spellcheck', `false`);
labelElement.setAttribute(
'style',
`background-color: ${user.color};border: 1px solid ${user.color};`,
`background-color: ${safeColor};border: 1px solid ${safeColor};`,
);
labelElement.insertBefore(document.createTextNode(user.name), null);

View File

@@ -144,7 +144,6 @@ interface DocCoreEditorProps {
}
export const DocCoreEditor = ({ doc, readOnly }: DocCoreEditorProps) => {
useCollaboration(doc.id);
const { provider, isReady } = useProviderStore();
const isProviderReady = isReady && provider;
const showContent = !!(

View File

@@ -1,5 +1,6 @@
import { CommentBody, ThreadStore } from '@blocknote/core/comments';
import type { Awareness } from 'y-protocols/awareness';
import * as Y from 'yjs';
import { APIError, errorCauses, fetchAPI } from '@/api';
import { Doc } from '@/features/docs/doc-management';
@@ -17,6 +18,13 @@ import {
type ServerThreadListResponse = ServerThread[];
/**
* notifySubscribers generates a transaction. To distinguish
* the origin of the update, we use the specific origin "commentMarkUpdate"
* for updates coming from comment mark changes.
*/
export const COMMENT_UPDATE_ORIGIN = 'commentMarkUpdate';
export class DocsThreadStore extends ThreadStore {
protected static COMMENTS_PING = 'commentsPing';
protected threads: Map<string, ClientThreadData> = new Map();
@@ -24,6 +32,7 @@ export class DocsThreadStore extends ThreadStore {
(threads: Map<string, ClientThreadData>) => void
>();
private awareness?: Awareness;
private yDoc?: Y.Doc;
private lastPingAt = 0;
private pingTimer?: ReturnType<typeof setTimeout>;
@@ -31,11 +40,13 @@ export class DocsThreadStore extends ThreadStore {
protected docId: Doc['id'],
awareness: Awareness | undefined,
protected docAuth: DocsThreadStoreAuth,
yDoc?: Y.Doc,
) {
super(docAuth);
if (docAuth.canSee) {
this.awareness = awareness;
this.yDoc = yDoc;
this.awareness?.on('update', this.onAwarenessUpdate);
this.refreshThreads();
@@ -134,18 +145,30 @@ export class DocsThreadStore extends ThreadStore {
}
/**
* Notifies all subscribers about the current thread state
* Notifies all subscribers about the current thread state.
* We wrap the notification in a transaction with a specific origin so we
* can flag that the update comes from a comment change.
* The inner ydoc.transact calls from y-prosemirror will see there is
* already an active transaction and reuse it.
*/
private notifySubscribers() {
// Always emit a new Map reference to help consumers detect changes
const threads = new Map(this.threads);
this.subscribers.forEach((cb) => {
try {
cb(threads);
} catch (e) {
console.warn('DocsThreadStore subscriber threw', e);
}
});
const notify = () => {
this.subscribers.forEach((cb) => {
try {
cb(threads);
} catch (e) {
console.warn('DocsThreadStore subscriber threw', e);
}
});
};
if (this.yDoc) {
this.yDoc.transact(notify, COMMENT_UPDATE_ORIGIN);
} else {
notify();
}
}
private upsertClientThreadData(thread: ClientThreadData) {

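The store above relies on Yjs transaction origins: notifySubscribers runs inside a transaction tagged `commentMarkUpdate`, and nested `ydoc.transact` calls reuse the active transaction instead of opening new ones. A dependency-free sketch of that mechanism, mimicking `Y.Doc.transact` (the `MiniDoc` class is an illustration, not yjs itself):

```typescript
// Mimic of Y.Doc's transact/origin mechanism: update handlers receive
// the transaction's origin and can skip updates they recognise.
const COMMENT_UPDATE_ORIGIN = 'commentMarkUpdate';

type UpdateHandler = (origin: unknown) => void;

class MiniDoc {
  private static NONE = Symbol('no-transaction');
  private handlers: UpdateHandler[] = [];
  private activeOrigin: unknown = MiniDoc.NONE;

  on(handler: UpdateHandler) {
    this.handlers.push(handler);
  }

  // Like Y.Doc.transact: runs fn under an origin; nested calls reuse
  // the already-active transaction instead of opening a new one.
  transact(fn: () => void, origin: unknown = null) {
    if (this.activeOrigin !== MiniDoc.NONE) {
      fn();
      return;
    }
    this.activeOrigin = origin;
    try {
      fn();
      this.handlers.forEach((h) => h(origin));
    } finally {
      this.activeOrigin = MiniDoc.NONE;
    }
  }
}

// A handler that ignores comment-driven transactions, as useSaveDoc
// does to avoid marking comment pings as local edits.
const seenOrigins: unknown[] = [];
const doc = new MiniDoc();
doc.on((origin) => {
  if (origin === COMMENT_UPDATE_ORIGIN) return; // skip comment updates
  seenOrigins.push(origin);
});

doc.transact(() => {/* notify subscribers */}, COMMENT_UPDATE_ORIGIN);
doc.transact(() => {/* a real edit */}, 'local-edit');
console.log(seenOrigins); // ['local-edit']
```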
View File

@@ -27,8 +27,15 @@ export function useComments(
encodeURIComponent(user?.full_name || ''),
canComment,
),
provider?.document,
);
}, [docId, canComment, provider?.awareness, user?.full_name]);
}, [
docId,
canComment,
provider?.awareness,
provider?.document,
user?.full_name,
]);
useEffect(() => {
if (canComment) {

View File

@@ -66,6 +66,7 @@ export const LinkSelected = ({
<BoxButton
as="span"
className="--docs--interlinking-link-inline-content"
data-href={href}
onClick={handleClick}
onAuxClick={handleAuxClick}
draggable="false"

View File

@@ -1,7 +1,7 @@
import { useQueryClient } from '@tanstack/react-query';
import { useEffect } from 'react';
import { useCollaborationUrl } from '@/core/config';
import { useCollaborationUrl, useConfig } from '@/core/config';
import { KEY_DOC } from '@/docs/doc-management/api/useDoc';
import {
KEY_DOC_CONTENT,
@@ -15,6 +15,7 @@ export const useCollaboration = (room: string) => {
const collaborationUrl = useCollaborationUrl(room);
const { addTask } = useBroadcastStore();
const queryClient = useQueryClient();
const { data: config } = useConfig();
const {
setBroadcastProvider,
cleanupBroadcast,
@@ -28,6 +29,8 @@ export const useCollaboration = (room: string) => {
isReady,
hasLostConnection,
resetLostConnection,
pauseForInactivity,
resumeFromInactivity,
} = useProviderStore();
const isOffline = useIsOffline((state) => state.isOffline);
const { data: docContent } = useDocContent(
@@ -109,4 +112,43 @@ export const useCollaboration = (room: string) => {
}
};
}, [destroyProvider, room, cleanupBroadcast]);
useEffect(() => {
if (!provider || !config?.COLLABORATION_WS_INACTIVITY_TIMEOUT) {
return;
}
const timeoutMs = config.COLLABORATION_WS_INACTIVITY_TIMEOUT * 1000;
let inactivityTimeout: ReturnType<typeof setTimeout> | undefined;
const startInactivityTimer = () => {
clearTimeout(inactivityTimeout);
inactivityTimeout = setTimeout(pauseForInactivity, timeoutMs);
};
if (document.hidden) {
startInactivityTimer();
}
const visibilityChangeHandler = () => {
if (document.hidden) {
startInactivityTimer();
} else {
clearTimeout(inactivityTimeout);
resumeFromInactivity();
}
};
document.addEventListener('visibilitychange', visibilityChangeHandler);
return () => {
document.removeEventListener('visibilitychange', visibilityChangeHandler);
clearTimeout(inactivityTimeout);
};
}, [
pauseForInactivity,
provider,
resumeFromInactivity,
config?.COLLABORATION_WS_INACTIVITY_TIMEOUT,
]);
};

View File

@@ -2,6 +2,7 @@ import { useRouter } from 'next/router';
import { useCallback, useEffect, useRef, useState } from 'react';
import * as Y from 'yjs';
import { COMMENT_UPDATE_ORIGIN } from '@/docs/doc-editor/components/comments/DocsThreadStore';
import { useDocContentUpdate } from '@/docs/doc-management/api/useDocContentUpdate';
import { useProviderStore } from '@/docs/doc-management/stores/useProviderStore';
import { KEY_LIST_DOC_VERSIONS } from '@/docs/doc-versioning/api/useDocVersions';
@@ -65,6 +66,16 @@ export const useSaveDoc = (docId: string, yDoc: Y.Doc) => {
const isAIChange =
!transaction.local && transactionOrigin !== PROVIDER_ORIGIN_CONSTRUCTOR;
/**
* notifySubscribers generates a transaction that can be
* interpreted as a local change.
* We intercept updates with this origin to
* avoid marking the change as local.
*/
if (transaction.origin === COMMENT_UPDATE_ORIGIN) {
return;
}
setIsLocalChange(transaction.local || isAIChange);
};

View File

@@ -1,3 +1,9 @@
const HEX_COLOR_REGEX = /^#[0-9a-fA-F]{6}$/;
export const sanitizeColor = (color: string): string => {
return HEX_COLOR_REGEX.test(color) ? color : randomColor();
};
export const randomColor = () => {
const randomInt = (min: number, max: number) => {
return Math.floor(Math.random() * (max - min + 1)) + min;

View File

@@ -28,7 +28,8 @@ const PRINT_ONLY_CONTENT_CSS = `
[role="contentinfo"],
div[data-is-empty-and-focused="true"],
div[data-floating-ui-focusable],
.collaboration-cursor-custom__base
.collaboration-cursor-custom__base,
.c__toast__container
{
display: none !important;
}
@@ -243,6 +244,50 @@ function wrapMediaWithLink() {
};
}
/**
* Wraps interlink inline content with anchor tags for printing,
* so they appear as clickable links in the printed PDF.
*/
function wrapInterlinksWithAnchor() {
const wrappedElements: Array<{
el: Element;
anchor: HTMLAnchorElement;
parent: Node;
}> = [];
document
.querySelectorAll('.--docs--interlinking-link-inline-content[data-href]')
.forEach((el) => {
const href = el.getAttribute('data-href');
if (!href || !isSafeUrl(href)) {
return;
}
const parent = el.parentNode;
if (!parent) {
return;
}
const anchor = document.createElement('a');
anchor.href = href;
anchor.target = '_blank';
anchor.rel = 'noopener noreferrer';
anchor.setAttribute('data-print-link', 'true');
parent.insertBefore(anchor, el);
anchor.appendChild(el);
wrappedElements.push({ el, anchor, parent });
});
return () => {
wrappedElements.forEach(({ el, anchor, parent }) => {
parent.insertBefore(el, anchor);
anchor.remove();
});
};
}
export function printDocumentWithStyles() {
if (typeof window === 'undefined') {
return;
@@ -253,7 +298,9 @@ export function printDocumentWithStyles() {
// Small delay to ensure styles are applied
setTimeout(() => {
const cleanupLinks = wrapMediaWithLink();
const cleanupInterlinks = wrapInterlinksWithAnchor();
const cleanup = () => {
cleanupInterlinks();
cleanupLinks();
cleanupPrintStyles();
};
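Both `wrapMediaWithLink` and `wrapInterlinksWithAnchor` follow the same record-then-restore pattern: perform reversible mutations, remember each one, and return a cleanup closure that undoes them. The pattern, sketched without the DOM (the data structures are illustrative):

```typescript
// Record-then-restore: mutate only matching items, remember what was
// touched, and return a cleanup closure that reverses the changes.
type Item = { value: string; wrapped: boolean };

function wrapItems(items: Item[], predicate: (i: Item) => boolean): () => void {
  const touched: Item[] = [];
  for (const item of items) {
    if (!predicate(item)) continue; // skip items we must not touch
    item.wrapped = true;
    touched.push(item);
  }
  // The cleanup closure restores only what we changed.
  return () => {
    for (const item of touched) item.wrapped = false;
  };
}

const items: Item[] = [
  { value: 'interlink', wrapped: false },
  { value: 'plain', wrapped: false },
];
const cleanup = wrapItems(items, (i) => i.value === 'interlink');
// items[0].wrapped === true, items[1].wrapped === false
cleanup();
// both back to wrapped === false
```

In the print helpers the "mutation" is inserting an anchor around the element and the cleanup moves the element back and removes the anchor, so the live document is untouched after printing.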

View File

@@ -13,11 +13,14 @@ export interface UseCollaborationStore {
) => HocuspocusProvider;
destroyProvider: () => void;
setReady: (value: boolean) => void;
pauseForInactivity: () => void;
resumeFromInactivity: () => void;
provider: HocuspocusProvider | undefined;
isConnected: boolean;
isReady: boolean;
isSynced: boolean;
hasLostConnection: boolean;
isPausedForInactivity: boolean;
resetLostConnection: () => void;
}
@@ -27,6 +30,7 @@ const defaultValues = {
isReady: false,
isSynced: false,
hasLostConnection: false,
isPausedForInactivity: false,
};
type ExtendedCloseEvent = CloseEvent & { wasClean: boolean };
@@ -59,6 +63,12 @@ export const useProviderStore = create<UseCollaborationStore>((set, get) => ({
name: storeId,
document: doc,
onDisconnect(data) {
// Skip reconnect when the disconnect was triggered by inactivity:
// reconnection only happens once the user becomes active again.
if (get().isPausedForInactivity) {
return;
}
// Attempt to reconnect if the disconnection was clean (initiated by the client or server)
if ((data.event as ExtendedCloseEvent).wasClean) {
if (data.event.reason === 'No cookies' && data.event.code === 4001) {
@@ -99,7 +109,7 @@ export const useProviderStore = create<UseCollaborationStore>((set, get) => ({
}
// If we were previously connected and now we're not,
// we might have lost the connection
else if (wasConnected) {
else if (wasConnected && !get().isPausedForInactivity) {
clearTimeout(lostConnectionTimeout);
// Jitter spreading for reconnection attempts
// Math.random() generates a random delay to avoid all clients
@@ -163,5 +173,22 @@ export const useProviderStore = create<UseCollaborationStore>((set, get) => ({
set(defaultValues);
},
setReady: (value: boolean) => set({ isReady: value }),
pauseForInactivity: () => {
if (get().isPausedForInactivity) {
return;
}
clearTimeout(reconnectTimeout);
clearTimeout(lostConnectionTimeout);
set({ isPausedForInactivity: true, hasLostConnection: false });
get().provider?.disconnect();
},
resumeFromInactivity: () => {
if (!get().isPausedForInactivity) {
return;
}
clearTimeout(lostConnectionTimeout);
set({ isPausedForInactivity: false });
void get().provider?.connect();
},
resetLostConnection: () => set({ hasLostConnection: false }),
}));
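`pauseForInactivity` and `resumeFromInactivity` both guard on the current flag, so repeated calls are no-ops and `disconnect`/`connect` fire only on a real state change. A store-free sketch of that state machine (zustand and the Hocuspocus provider are stubbed out as plain callbacks):

```typescript
// Minimal pause/resume state machine mirroring the store: each
// transition is guarded so calling it twice has no extra effect.
function createPauseController(actions: {
  disconnect: () => void;
  connect: () => void;
}) {
  let pausedForInactivity = false;
  return {
    get paused() { return pausedForInactivity; },
    pause() {
      if (pausedForInactivity) return; // already paused: no-op
      pausedForInactivity = true;
      actions.disconnect();
    },
    resume() {
      if (!pausedForInactivity) return; // not paused: no-op
      pausedForInactivity = false;
      actions.connect();
    },
  };
}

let disconnects = 0;
let connects = 0;
const ctrl = createPauseController({
  disconnect: () => { disconnects += 1; },
  connect: () => { connects += 1; },
});
ctrl.pause();
ctrl.pause();  // guarded: still only one disconnect
ctrl.resume();
ctrl.resume(); // guarded: still only one connect
console.log(disconnects, connects); // 1 1
```

The same guard is what lets `onDisconnect` distinguish an inactivity pause from a genuine connection loss: while paused, the store skips the reconnect and lost-connection paths entirely.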

View File

@@ -60,6 +60,12 @@ export function PostHogProvider({
if (process.env.NODE_ENV === 'development') {
posthogInstance.debug();
}
if (process.env.NEXT_PUBLIC_APP_VERSION) {
posthogInstance.register({
app_version: process.env.NEXT_PUBLIC_APP_VERSION,
});
}
},
capture_pageview: false,
capture_pageleave: true,